
CN111083162A - Multimedia stream pause detection method and device - Google Patents

Tue Apr 28 2020

Disclosure of Invention

The embodiment of the disclosure provides a multimedia stream stuck detection method and device, which can reflect whether a multimedia stream stalls during actual viewing. The technical scheme is as follows:

in one aspect, an embodiment of the present disclosure provides a multimedia stream stuck detection method, where the method includes:

acquiring a data frame transmitted by a multimedia stream and storing a time stamp of the data frame;

acquiring a time stamp of the data frame according to a multimedia playing rule, wherein the multimedia playing rule is used for indicating the number of the data frames played under the non-stuck condition;

determining a stuck condition of the multimedia stream based on the acquired timestamps of the data frames.

Optionally, the acquiring a data frame of a multimedia streaming transmission includes:

creating a data acquisition process;

and periodically and circularly acquiring the data frames transmitted by the multimedia stream from the server by adopting the data acquisition process.

Optionally, the data frames comprise audio frames and video frames;

saving a timestamp for the data frame, comprising:

and respectively buffering the time stamp of the audio frame and the time stamp of the video frame into two queues.

Optionally, the obtaining the timestamp of the data frame according to the multimedia playing rule includes:

determining a first audio frame number which should be played in a periodic time based on the multimedia playing rule;

and acquiring the time stamp of the audio frame of the first audio frame number from the queue for storing the time stamp of the audio frame in the period time.

Optionally, the multimedia playing rule includes an audio parameter, where the audio parameter is used to indicate the amount of data played in a unit time;

the determining a first audio frame number to be played in a periodic time based on the multimedia playing rule includes:

calculating a first data amount based on the audio parameter and the cycle time;

calculating a frame number based on the first data amount and the length of each audio frame;

and rounding up the frame number to obtain a first audio frame number.

Optionally, the calculating the first data amount based on the audio parameter and the cycle time includes:

calculating a second data amount from the amount of data played per unit time and the cycle time;

acquiring the residual data amount of the previous cycle, wherein the residual data amount is the amount by which the data covered by the previous cycle's rounded-up first audio frame number exceeds the previous cycle's first data amount;

and adding the second data quantity to the residual data quantity to obtain the first data quantity.

Optionally, the determining a stuck condition of the multimedia stream based on the acquired timestamp of the data frame includes:

and determining that the audio is stuck in response to the number of the timestamps of the acquired audio frames being less than the first audio frame number.

Optionally, the method further comprises:

the timestamps of the acquired audio frames are removed from the queue in which the timestamps of the audio frames are stored.

Optionally, the obtaining the timestamp of the data frame according to the multimedia playing rule includes:

determining the time stamp of the last audio frame acquired from the time stamp queue of the audio frame at each cycle time;

and acquiring the timestamp of the video frame with the timestamp not exceeding the timestamp of the last audio frame and the largest timestamp from the queue for storing the timestamps of the video frames.

Optionally, the determining a stuck condition of the multimedia stream based on the acquired timestamp of the data frame includes:

and in response to the time interval of the time stamps of the video frames acquired twice in succession being larger than a threshold value, determining that the video is stuck.

In another aspect, an embodiment of the present disclosure provides a multimedia stream stuck detection apparatus, where the apparatus includes:

a first acquisition module configured to acquire a data frame transmitted by a multimedia stream and save a time stamp of the data frame;

the second acquisition module is configured to acquire the timestamp of the data frame according to a multimedia playing rule, wherein the multimedia playing rule is used for indicating the number of the data frames played under the non-stuck condition;

a determination module configured to determine a stuck condition of the multimedia stream based on the acquired timestamps of the data frames.

On the other hand, the embodiment of the present disclosure further provides a terminal device, where the terminal device includes: a processor; a memory configured to store processor-executable instructions; wherein the processor is configured to perform the multimedia stream stuck detection method as described in any of the previous items.

In another aspect, the disclosed embodiments also provide a computer-readable storage medium, where instructions, when executed by a processor of a terminal device, enable the terminal device to perform the multimedia stream stuck detection method as described in any one of the preceding claims.

In the embodiments of the present disclosure, the data frames transmitted by the multimedia stream are acquired and their time stamps are stored, and the time stamps of the data frames are then acquired locally according to the multimedia playing rule. Since the multimedia playing rule indicates the number of data frames played in the non-stuck condition, the stuck condition of the multimedia stream can be determined from whether the time stamps of the number of data frames defined by the rule are actually acquired, from the time interval between the time stamps of the data frames, and the like. Because the process of acquiring the data frames simulates the actual playing scenario, the stuck condition determined in this way shows whether the multimedia stream stalls during actual viewing, so the content distribution network can be optimized based on the stuck condition and the quality of the multimedia stream improved.

Detailed Description

To make the objects, technical solutions and advantages of the present disclosure more apparent, embodiments of the present disclosure will be described in detail with reference to the accompanying drawings.

Fig. 1 is a flowchart of a multimedia stream stuck detection method according to an embodiment of the present disclosure. Referring to fig. 1, the method includes:

step 101: and acquiring a data frame transmitted by the multimedia stream and storing a time stamp of the data frame.

In the disclosed embodiment, the multimedia stream may be a video stream (e.g., an online video stream, a live video stream, etc.), and the data frames of the video stream include audio frames and video frames. In other embodiments, the multimedia stream may also be an audio stream, and in this case, the data frame is only an audio frame.

Step 102: and acquiring the time stamp of the data frame according to a multimedia playing rule, wherein the multimedia playing rule is used for indicating the number of the data frames played under the non-blocking condition.

In the embodiment of the present disclosure, data such as the number of data frames and the time intervals between them are determined by acquiring the data frames, and whether there is a stuck condition can be determined based on these data.

The multimedia playing rule usually includes the number of audio frames that should be obtained within a period of time, so the data-frame playing process of the multimedia player can be simulated based on the rule: although the playing action is not executed, the stuck condition that would occur during playback can be inferred from the frames actually obtained.

Step 103: determining a stuck condition of the multimedia stream based on the acquired timestamps of the data frames.

For a video stream, the stuck condition typically includes whether the video is stuck, whether the audio is stuck, the stuck rate when a stall occurs, and the like.

Here, video stuck means that the time interval between two acquired video frames exceeds a predetermined value, usually 200 to 300 ms, the length of time at which the human eye perceives a stall. Audio is considered stuck when a frame is missing; during playback, such a gap is accompanied by a crackling noise.

In the embodiments of the present disclosure, the data frames transmitted by the multimedia stream are acquired and their time stamps are stored, and the time stamps of the data frames are then acquired locally according to the multimedia playing rule. Since the multimedia playing rule indicates the number of data frames played in the non-stuck condition, the stuck condition of the multimedia stream can be determined from whether the time stamps of the number of data frames defined by the rule are actually acquired, from the time interval between the time stamps of the data frames, and the like. Because the process of acquiring the data frames simulates the actual playing scenario, the stuck condition determined in this way shows whether the multimedia stream stalls during actual viewing, so the content distribution network can be optimized based on the stuck condition and the quality of the multimedia stream improved.

Fig. 2 is a flowchart of a multimedia stream stuck detection method according to an embodiment of the disclosure. Referring to fig. 2, the method includes:

step 201: a data acquisition process is created.

In the embodiment of the present disclosure, there are two ways to create a data acquisition process:

First, a process is initiated to emulate the data acquisition process in the player and acquire the data. In this case, the method may be executed by any client, as long as the data of the multimedia stream can be acquired from the server.

Second, the (video/live) player is started, a data acquisition process of the player is created, and the data is acquired by that process. In this case, the method is executed by a client on which the player is installed, so that the data of the multimedia stream can be acquired from the server.

It should be noted that the player includes a buffering process besides the data acquisition process; that is, the player may buffer a period of data (for example, 10 s) in advance to avoid the influence of network conditions on viewing. The existence of the buffering process influences how the user perceives the quality of the multimedia stream during playing. Therefore, to ensure the objectivity of the evaluation, the stuck condition is evaluated only on the data acquired by the player's data acquisition process.

Step 202: and periodically and circularly acquiring the data frames transmitted by the multimedia stream from the server by adopting the data acquisition process.

In the disclosed embodiments, the data frames may include audio frames and video frames.

In the present application, data acquisition and the subsequent steps are performed periodically: on one hand, the process created in step 201 is freed between cycles to perform other actions of the terminal device; on the other hand, performing the steps periodically does not affect the stuck-condition determination result.

Here, the duration of each cycle may be 5 to 50 milliseconds, for example 10 milliseconds. Within each 10-millisecond cycle, the data acquisition process may acquire data in the earlier part of the cycle and rest in the later part, which avoids continuous work that would place a large processing load on the terminal device.

In this embodiment of the present disclosure, since the data acquisition process acquires data periodically and cyclically, the method may further include: recording the time at which each cycle of data acquisition begins. For example, the start time of the first acquisition of data is denoted as time0, the start time of the second acquisition of data is denoted as time1, and so on.
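The periodic acquisition of steps 201-202, together with the recording of each cycle's start time described above, can be sketched as follows (an illustrative sketch only; `fetch_frames` is a hypothetical stand-in for the player's data acquisition call, and the period length is an example value):

```python
import time

def acquisition_loop(fetch_frames, period_s=0.01, cycles=3):
    """Periodically pull frames from the server and record each cycle's
    start time (time0, time1, ...), as in steps 201-202."""
    start_times = []
    for _ in range(cycles):
        start_times.append(time.monotonic())  # time0, time1, ...
        frames = fetch_frames()
        # ... push the frames' timestamps into the queues (step 203) ...
        time.sleep(period_s)  # rest for the remainder of the cycle
    return start_times
```

The recorded start times later supply the cycle time used in formula (2).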

Step 203: and respectively buffering the time stamp of the audio frame and the time stamp of the video frame into two queues.

In the embodiment of the present disclosure, the timestamps of the audio frames are stored in one queue, the timestamps of the video frames are stored in another queue, and the timestamps in each queue are arranged in chronological order.
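As an illustration, the two timestamp queues of step 203 can be kept as simple FIFO deques (a minimal sketch; the names `audio_ts`, `video_ts`, and `save_timestamp` are assumptions, not from the patent):

```python
from collections import deque

audio_ts = deque()  # audio-frame timestamps, in arrival order
video_ts = deque()  # video-frame timestamps, in arrival order

def save_timestamp(kind, ts_ms):
    """Step 203: buffer each frame's timestamp in the queue for its type."""
    (audio_ts if kind == "audio" else video_ts).append(ts_ms)

save_timestamp("audio", 0)
save_timestamp("audio", 23)
save_timestamp("video", 0)
```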

Step 204: and determining a first audio frame number which should be played in the periodic time based on the multimedia playing rule.

When the player plays a multimedia stream and the audio parameters are unchanged, the size of the audio data decoded from each audio frame is fixed, that is, the number of bytes per frame (BytesPerFrame) is fixed, and the size of the audio data to be played per unit time is also fixed. For example, suppose the audio parameters are as follows: the sampling frequency (samplerate) is 44100, the number of channels (channels) is two, and the sample depth of the audio stream is 16 bits. Under these parameters, 44100 samples are played per second, each sample has data for two channels, and each data sample is 16 bits, i.e. 2 bytes.
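The per-second data amount implied by these example parameters can be checked directly:

```python
samplerate = 44100          # samples per second
channels = 2                # stereo
bytes_per_sample = 16 // 8  # 16-bit samples -> 2 bytes

# amount of audio data that must be played per second under these parameters
bytes_per_second = samplerate * channels * bytes_per_sample
print(bytes_per_second)  # 176400
```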

In the embodiment of the present disclosure, the audio parameters are described in the multimedia playing rules. Thus, this step may comprise:

calculating a first data amount based on the audio parameter and the cycle time;

calculating a frame number based on the first data amount and the length of each audio frame;

and rounding up the frame number to obtain a first audio frame number.

Here, the cycle time is calculated based on the time when two cycles of acquiring data start.

For example, the cycle time is obtained as the difference between time2, when data acquisition for the current cycle starts, and time1, when data acquisition for the previous cycle starts. The first data amount may then be calculated based on the audio parameter and the cycle time.

Based on the length of each audio frame and the first data amount within the cycle time, the frame number AudioPackets is calculated by formula (1):

AudioPackets=AudioBytes/BytesPerFrame (1)

Here, AudioPackets may be rounded up to obtain AudioPackets' as the first audio frame number.

In an embodiment of the present disclosure, calculating the first data amount based on the audio parameter and the cycle time may include:

calculating a second data amount from the amount of data played per unit time and the cycle time;

acquiring the residual data amount of the previous cycle, wherein the residual data amount is the amount by which the data covered by the previous cycle's rounded-up first audio frame number exceeds the previous cycle's first data amount;

and adding the second data quantity to the residual data quantity to obtain the first data quantity.

In the embodiment of the present disclosure, the second data amount AudioBytes' in the cycle time is calculated according to the following formula (2):

AudioBytes' = (time1 - time0) * samplerate * channels * 2Byte (2)

Here, samplerate * channels * 2Byte is the amount of data played per unit time.

In the embodiment of the present disclosure, the first data amount may be calculated by using formula (3):

AudioBytes = AudioBytes' + Bytes (3)

Here, Bytes is the residual data amount from the previous cycle.

In each cycle, when AudioPackets is rounded up to obtain AudioPackets' as the first audio frame number, the excess data amount Bytes introduced by rounding up may be calculated as Bytes = AudioPackets' * BytesPerFrame - AudioBytes, for use in the next cycle's calculation.
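Formulas (1)-(3) and the carried-over residual can be sketched together as follows (an illustrative sketch assuming 16-bit samples; the 4096-byte frame length, e.g. a 1024-sample stereo frame, is an assumed example, not a value from the patent):

```python
import math

def first_audio_frame_number(cycle_ms, samplerate, channels, bytes_per_frame,
                             residual_bytes=0):
    """Expected (first) audio frame number for one cycle, following
    formulas (1)-(3); 16-bit samples (2 bytes each) are assumed.

    Returns (frame_count, residual), where `residual` is the excess data
    amount introduced by rounding up, carried into the next cycle.
    """
    # formula (2): second data amount from the per-unit-time data rate
    audio_bytes_2 = cycle_ms * samplerate * channels * 2 // 1000
    # formula (3): first data amount = second data amount + previous residual
    audio_bytes = audio_bytes_2 + residual_bytes
    # formula (1), then round up to whole frames
    frames = math.ceil(audio_bytes / bytes_per_frame)
    residual = frames * bytes_per_frame - audio_bytes
    return frames, residual

# 10 ms cycle, 44100 Hz stereo, 4096-byte frames
n, r = first_audio_frame_number(10, 44100, 2, 4096)
print(n, r)  # 1 2332
```

Feeding the returned residual into the next call shows how the rounding excess is absorbed over cycles.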

Step 205: and acquiring the time stamp of the audio frame of the first audio frame number from the queue for storing the time stamp of the audio frame in the period time. And when the number of the acquired audio frames is less than the first audio frame number, determining that the audio is stuck, and executing step 206.

And if the audio frame with the first audio frame number is acquired, no audio jam exists currently.

Step 206: and calculating the audio stuck rate based on the interval between the time stamp of the last acquired audio frame and the time stamp of the first audio frame acquired next time.

In the embodiment of the present disclosure, the following method may be adopted to calculate the audio stuck rate: determining the audio stuck duration within a set time period, and dividing the audio stuck duration by the length of the set time period to obtain the stuck rate. The set time period may be set as needed, for example, 10 seconds.

In step 205, the audio stuck duration may be obtained as follows: when the number of acquired audio frames is less than the first audio frame number, the timestamp of the last acquired audio frame is taken as the start of the audio stuck period; when timestamps of audio frames are acquired again in a later cycle, the timestamp of the first such audio frame marks the end of the audio stuck period, and the time from the start to the end is the stuck time. Multiple stuck segments may exist within each set time period; in that case, the segments are added together to obtain the audio stuck duration within the set time period.
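The stuck-rate calculation described above can be sketched as follows (a minimal illustration; each interval pair is a hypothetical (stuck start, stuck end) timestamp in milliseconds):

```python
def audio_stuck_rate(stuck_intervals_ms, window_ms=10_000):
    """Stuck rate over a set time period: total stuck time / period length.

    Each interval is (start_ts, end_ts): the timestamp of the last audio
    frame obtained before the shortfall and the timestamp of the first
    audio frame obtained once acquisition resumes.
    """
    total = sum(end - start for start, end in stuck_intervals_ms)
    return total / window_ms

# two stuck segments, 400 ms and 600 ms, within a 10-second window
print(audio_stuck_rate([(1_000, 1_400), (5_000, 5_600)]))  # 0.1
```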

Step 207: determining a time stamp of a last audio frame retrieved from the queue of time stamps of the audio frames at each cycle time.

Step 208: and acquiring the timestamp of the video frame with the timestamp not exceeding the timestamp of the last audio frame and the largest timestamp from the queue for storing the timestamps of the video frames.

In step 207, the timestamp of the last audio frame in the current cycle is obtained. The timestamp of the video frame with the largest timestamp not exceeding the timestamp of that last audio frame is then obtained from the queue, and the obtained timestamp is used for the stuck judgment. Obtaining the video frame timestamp in this manner retrieves the timestamp of the video frame within the cycle time; that is, the timestamp of the obtained video frame is aligned with the timestamps of the audio frames obtained in the previous step. Whether the audio and the video are stuck in each cycle can therefore be judged synchronously.

When the time interval between the timestamps of two consecutively acquired video frames is greater than a threshold, it is determined that the video is stuck. Otherwise, there is currently no video stuck.

Here, the threshold may be 200-300ms, such as 200 ms.
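Steps 207-208 and the threshold comparison can be sketched as follows (an illustrative sketch; the function names are assumptions):

```python
def pick_video_timestamp(video_ts_queue, last_audio_ts):
    """Step 208: the largest video timestamp not exceeding the timestamp
    of the last audio frame in the current cycle."""
    eligible = [ts for ts in video_ts_queue if ts <= last_audio_ts]
    return max(eligible) if eligible else None

def video_stuck(prev_ts, curr_ts, threshold_ms=200):
    """Video is stuck when two consecutively picked timestamps are more
    than the threshold apart."""
    return (prev_ts is not None and curr_ts is not None
            and curr_ts - prev_ts > threshold_ms)

print(pick_video_timestamp([0, 40, 80, 120], 100))  # 80
print(video_stuck(80, 120))  # False
print(video_stuck(80, 320))  # True
```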

It should be noted that, since the time interval between video frames is longer, calculating the video stuck duration ratio (i.e., the video stuck rate) is of limited value; if needed, the video stuck rate may be calculated in the same manner as the audio stuck rate, which is not described again here.

Of course, the number of stuck events within a set time period may instead be counted for the video in order to evaluate the quality of the content distribution network.

Here, step 206 and steps 207-208 need not occur in a fixed order and may be performed simultaneously.

Fig. 3 is a block diagram of a multimedia stream stuck detection apparatus according to an embodiment of the disclosure. Referring to fig. 3, the apparatus includes a first obtaining module 301, a second obtaining module 302, and a determining module 303.

The first obtaining module 301 is configured to obtain a data frame of multimedia streaming transmission and save a timestamp of the data frame;

a second obtaining module 302, configured to obtain the timestamp of the data frame according to a multimedia playing rule, where the multimedia playing rule is used to indicate the number of data frames played in a non-stuck situation;

a determining module 303 configured to determine a stuck condition of the multimedia stream based on the acquired time stamp of the data frame.

In an implementation manner of the embodiment of the present disclosure, the first obtaining module 301 includes:

an acquisition submodule 311 configured to create a data acquisition process, and to periodically and cyclically acquire the data frames transmitted by the multimedia stream from the server by using the data acquisition process.

In one implementation of the disclosed embodiment, the data frames include audio frames and video frames;

the first obtaining module 301 further includes:

a storage sub-module 312 configured to buffer the time stamp of the audio frame and the time stamp of the video frame into two queues respectively.

In an implementation manner of the embodiment of the present disclosure, the second obtaining module 302 includes:

a first timestamp obtaining sub-module 321, configured to determine, based on the multimedia playing rule, a first audio frame number that should be played within a period time; and acquiring the time stamp of the audio frame of the first audio frame number from the queue for storing the time stamp of the audio frame in the period time.

In one implementation manner of the embodiment of the present disclosure, the multimedia playing rule includes an audio parameter, where the audio parameter is used to indicate a data amount played in a unit time;

a first timestamp obtaining sub-module 321, configured to calculate a first data amount based on the audio parameter and the cycle time; calculating a frame number based on the first data amount and the length of each audio frame; and rounding up the frame number to obtain a first audio frame number.

In an implementation manner of the embodiment of the present disclosure, the first timestamp obtaining sub-module 321 is configured to calculate a second data amount from the amount of data played per unit time and the cycle time; acquire the residual data amount of the previous cycle, where the residual data amount is the amount by which the data covered by the previous cycle's rounded-up first audio frame number exceeds the previous cycle's first data amount; and add the second data amount to the residual data amount to obtain the first data amount.

In an implementation manner of the embodiment of the present disclosure, the determining module 303 is configured to determine that the audio is stuck in response to the number of timestamps of the acquired audio frames being less than the first audio frame number.

In an implementation manner of the embodiment of the present disclosure, the apparatus further includes:

a stuck rate calculating module 304 configured to calculate an audio stuck rate based on the interval between the timestamp of the last acquired audio frame and the timestamp of the first audio frame acquired next time.

In an implementation manner of the embodiment of the present disclosure, the second obtaining module 302 further includes:

a second timestamp obtaining sub-module 322, configured to determine a timestamp of a last audio frame obtained from the queue of timestamps of the audio frames at each cycle time;

and acquiring the timestamp of the video frame with the timestamp not exceeding the timestamp of the last audio frame and the largest timestamp from the queue for storing the timestamps of the video frames.

In an implementation manner of the embodiment of the present disclosure, the determining module 303 is configured to determine that the video is stuck in response to a time interval between timestamps of two consecutively acquired video frames being greater than a threshold.

Fig. 4 shows a block diagram of a terminal device 400 according to an exemplary embodiment of the present disclosure. The terminal device 400 may be a smartphone, a tablet, a laptop, or a desktop computer. The terminal device 400 may also be referred to by other names such as user equipment, portable terminal, laptop terminal, or desktop terminal.

In general, the terminal device 400 includes a processor 401 and a memory 402.

Processor 401 may include one or more processing cores, such as a 4-core or 8-core processor. The processor 401 may be implemented in at least one hardware form of a DSP (Digital Signal Processor), an FPGA (Field-Programmable Gate Array), or a PLA (Programmable Logic Array). The processor 401 may also include a main processor and a coprocessor: the main processor processes data in the awake state and is also called a Central Processing Unit (CPU); the coprocessor is a low-power processor for processing data in a standby state. In some embodiments, the processor 401 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content to be displayed on the display screen. In some embodiments, the processor 401 may further include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.

Memory 402 may include one or more computer-readable storage media, which may be non-transitory. Memory 402 may also include high-speed random access memory as well as non-volatile memory, such as one or more magnetic disk storage devices or flash memory storage devices. In some embodiments, a non-transitory computer-readable storage medium in memory 402 is used to store at least one instruction for execution by processor 401 to implement the multimedia stream stuck detection method provided by the method embodiments herein.

In some embodiments, the terminal device 400 may further include a peripheral interface 403 and at least one peripheral. The processor 401, memory 402, and peripheral interface 403 may be connected by buses or signal lines. Each peripheral may be connected to the peripheral interface 403 via a bus, signal line, or circuit board. Specifically, the peripherals include at least one of: radio frequency circuitry 404, touch screen display 405, camera assembly 406, audio circuitry 407, positioning component 408, and power supply 409.

The peripheral interface 403 may be used to connect at least one I/O (Input/Output)-related peripheral to the processor 401 and the memory 402. In some embodiments, the processor 401, memory 402, and peripheral interface 403 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 401, the memory 402, and the peripheral interface 403 may be implemented on a separate chip or circuit board, which is not limited by this embodiment.

The radio frequency circuit 404 is used for receiving and transmitting RF (Radio Frequency) signals, also called electromagnetic signals. The radio frequency circuitry 404 communicates with communication networks and other communication devices via electromagnetic signals. The RF circuit 404 converts an electrical signal into an electromagnetic signal for transmission, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 404 includes an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuitry 404 may communicate with other terminals via at least one wireless communication protocol. The wireless communication protocols include, but are not limited to: metropolitan area networks, various generations of mobile communication networks (2G, 3G, 4G, and 5G), wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the RF circuit 404 may further include NFC (Near Field Communication)-related circuits, which is not limited in this application.

The display screen 405 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 405 is a touch display screen, it also has the ability to capture touch signals on or over its surface. The touch signal may be input to the processor 401 as a control signal for processing. At this point, the display screen 405 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, there may be one display screen 405, providing the front panel of the terminal device 400; in other embodiments, there may be at least two display screens 405, respectively disposed on different surfaces of the terminal device 400 or in a folding design; in still other embodiments, the display 405 may be a flexible display disposed on a curved or folded surface of the terminal device 400. The display screen 405 may even be arranged in a non-rectangular irregular pattern, i.e., a shaped screen. The display screen 405 may be made of LCD (Liquid Crystal Display), OLED (Organic Light-Emitting Diode), or other materials.

The camera assembly 406 is used to capture images or video. Optionally, camera assembly 406 includes a front camera and a rear camera. Generally, the front camera is disposed on the front panel of the terminal, and the rear camera is disposed on the rear surface of the terminal. In some embodiments, there are at least two rear cameras, each being any one of a main camera, a depth-of-field camera, a wide-angle camera, and a telephoto camera, so that the main camera and the depth-of-field camera can be fused to realize a background blurring function, and the main camera and the wide-angle camera can be fused to realize panoramic shooting and VR (Virtual Reality) shooting functions or other fused shooting functions. In some embodiments, camera assembly 406 may also include a flash. The flash may be a single-color-temperature flash or a dual-color-temperature flash. A dual-color-temperature flash is a combination of a warm-light flash and a cold-light flash, and can be used for light compensation at different color temperatures.

The audio circuit 407 may include a microphone and a speaker. The microphone is used to collect sound waves from the user and the environment, convert them into electrical signals, and input them to the processor 401 for processing, or to the radio frequency circuit 404 to realize voice communication. For stereo sound collection or noise reduction, multiple microphones may be provided at different positions of the terminal device 400. The microphone may also be an array microphone or an omnidirectional pickup microphone. The speaker is used to convert electrical signals from the processor 401 or the radio frequency circuit 404 into sound waves. The speaker may be a traditional diaphragm speaker or a piezoelectric ceramic speaker. A piezoelectric ceramic speaker can convert an electrical signal into sound waves audible to humans, or into sound waves inaudible to humans for purposes such as distance measurement. In some embodiments, audio circuitry 407 may also include a headphone jack.

The positioning component 408 is used to locate the current geographic position of the terminal device 400 for navigation or LBS (Location Based Service). The positioning component 408 may be a positioning component based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, the GLONASS system of Russia, or the Galileo system of the European Union.

The power supply 409 is used to supply power to the various components in the terminal device 400. The power supply 409 may be an alternating current source, a direct current source, a disposable battery, or a rechargeable battery. When the power supply 409 includes a rechargeable battery, the rechargeable battery may support wired or wireless charging. The rechargeable battery may also support fast-charging technology.

In some embodiments, the terminal device 400 further includes one or more sensors 410. The one or more sensors 410 include, but are not limited to: an acceleration sensor 411, a gyro sensor 412, a pressure sensor 413, a fingerprint sensor 414, an optical sensor 415, and a proximity sensor 416.

The acceleration sensor 411 may detect the magnitude of acceleration along the three coordinate axes of a coordinate system established with respect to the terminal device 400. For example, the acceleration sensor 411 may be used to detect the components of gravitational acceleration along the three coordinate axes. The processor 401 may control the touch display screen 405 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 411. The acceleration sensor 411 may also be used to collect motion data for games or for the user.
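The landscape/portrait decision described above can be illustrated with a minimal sketch (not part of the disclosure): given the gravity components along the device's x axis (short edge) and y axis (long edge), the dominant axis and its sign determine the view orientation. The function name, axis convention, and sign convention here are assumptions chosen for illustration.

```python
def orientation_from_gravity(gx: float, gy: float) -> str:
    """Classify device orientation from the gravitational acceleration
    components (m/s^2) along the device's x (short edge) and y (long
    edge) axes. Assumed convention: gy is negative when the device is
    held upright, gx is positive when tilted onto its left edge."""
    if abs(gy) >= abs(gx):
        # Gravity mostly along the long edge: portrait family.
        return "portrait" if gy <= 0 else "portrait_upside_down"
    # Gravity mostly along the short edge: landscape family.
    return "landscape_left" if gx > 0 else "landscape_right"
```

A real implementation would additionally low-pass filter the raw accelerometer signal and apply hysteresis so the UI does not flip when the device is held near 45 degrees.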

The gyro sensor 412 may detect the body orientation and rotation angle of the terminal device 400, and may cooperate with the acceleration sensor 411 to capture the user's 3D motion on the terminal device 400. Based on the data collected by the gyro sensor 412, the processor 401 may implement functions such as motion sensing (for example, changing the UI according to a tilting operation by the user), image stabilization during shooting, game control, and inertial navigation.

The pressure sensor 413 may be disposed on a side bezel of the terminal device 400 and/or on a lower layer of the touch display screen 405. When the pressure sensor 413 is disposed on the side bezel of the terminal device 400, it can detect the user's grip signal on the terminal device 400, and the processor 401 performs left/right-hand recognition or shortcut operations according to the grip signal collected by the pressure sensor 413. When the pressure sensor 413 is disposed on the lower layer of the touch display screen 405, the processor 401 controls an operability control on the UI according to the user's pressure operation on the touch display screen 405. The operability control includes at least one of a button control, a scroll-bar control, an icon control, and a menu control.

The fingerprint sensor 414 is used to collect the user's fingerprint, and the processor 401 identifies the user according to the fingerprint collected by the fingerprint sensor 414, or the fingerprint sensor 414 itself identifies the user according to the collected fingerprint. When the user's identity is recognized as a trusted identity, the processor 401 authorizes the user to perform sensitive operations, including unlocking the screen, viewing encrypted information, downloading software, making payments, changing settings, and the like. The fingerprint sensor 414 may be provided on the front, back, or side of the terminal device 400. When a physical key or vendor logo is provided on the terminal device 400, the fingerprint sensor 414 may be integrated with the physical key or vendor logo.

The optical sensor 415 is used to collect the ambient light intensity. In one embodiment, the processor 401 may control the display brightness of the touch display screen 405 based on the ambient light intensity collected by the optical sensor 415. Specifically, when the ambient light intensity is high, the display brightness of the touch display screen 405 is increased; when the ambient light intensity is low, the display brightness of the touch display screen 405 is decreased. In another embodiment, the processor 401 may also dynamically adjust the shooting parameters of the camera assembly 406 according to the ambient light intensity collected by the optical sensor 415.
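The brightness adjustment described in the first embodiment can be sketched as a mapping from measured lux to a display brightness level. This is purely illustrative (not part of the disclosure): the logarithmic curve, the lux range, and the brightness scale of 10-255 are all assumptions; real devices use vendor-tuned curves.

```python
import math

def brightness_from_lux(lux: float, min_level: int = 10,
                        max_level: int = 255) -> int:
    """Map ambient light intensity (lux) to a display brightness level.
    A logarithmic curve is assumed because perceived brightness is
    roughly logarithmic in luminance; the result is clamped to
    [min_level, max_level]."""
    if lux <= 1.0:
        return min_level
    # Assumed range: 1 lux -> min_level, ~10,000 lux (daylight) -> max_level.
    frac = min(math.log10(lux) / 4.0, 1.0)
    return round(min_level + frac * (max_level - min_level))
```

Because the mapping is monotonic, brighter surroundings always yield a brighter screen, matching the behavior described above.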

The proximity sensor 416, also called a distance sensor, is generally provided on the front panel of the terminal device 400. The proximity sensor 416 is used to collect the distance between the user and the front surface of the terminal device 400. In one embodiment, when the proximity sensor 416 detects that the distance between the user and the front surface of the terminal device 400 gradually decreases, the processor 401 controls the touch display screen 405 to switch from the bright-screen state to the off-screen state; when the proximity sensor 416 detects that the distance between the user and the front surface of the terminal device 400 gradually increases, the processor 401 controls the touch display screen 405 to switch from the off-screen state to the bright-screen state.
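The proximity-driven screen switching described above can be sketched as a small state decision with two thresholds. This is an illustrative sketch only (not part of the disclosure): the function name and the 3 cm / 5 cm thresholds are assumptions; the gap between them (hysteresis) keeps the screen from flickering when the measured distance hovers near a single threshold.

```python
def screen_state(distance_cm: float, currently_on: bool,
                 near_cm: float = 3.0, far_cm: float = 5.0) -> bool:
    """Return True if the screen should be on, given the measured
    user-to-screen distance and the current screen state. Assumes
    near_cm < far_cm so the two transitions do not overlap."""
    if currently_on and distance_cm <= near_cm:
        return False   # user moved close (e.g. phone at ear): screen off
    if not currently_on and distance_cm >= far_cm:
        return True    # user moved away: screen back on
    return currently_on  # in the hysteresis band: keep current state
```

With this shape, a distance of 4 cm changes nothing in either direction, which is exactly the debouncing behavior a production sensor pipeline needs.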

Those skilled in the art will appreciate that the configuration shown in fig. 4 does not constitute a limitation of the terminal device 400, which may include more or fewer components than shown, combine certain components, or employ a different arrangement of components.

The disclosed embodiments also provide a computer-readable storage medium; when the instructions in the computer-readable storage medium are executed by a processor of a terminal device, the terminal device is enabled to execute the multimedia stream stuck detection method described above.

The above description is intended to be exemplary only and not to limit the present disclosure, and any modification, equivalent replacement, or improvement made without departing from the spirit and scope of the present disclosure is to be considered as the same as the present disclosure.