CN115086737A - Data processing method and device, electronic equipment and storage medium - Google Patents
Detailed Description
In order to make the technical solutions of the present disclosure better understood by those of ordinary skill in the art, the technical solutions in the embodiments of the present disclosure will be clearly and completely described below with reference to the accompanying drawings.
It should be noted that the terms "first," "second," and the like in the description and claims of the present disclosure and in the above-described drawings are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the disclosure described herein are capable of operation in sequences other than those illustrated or otherwise described herein. The implementations described in the exemplary embodiments below do not represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
The user information related to the embodiments of the present disclosure may be information authorized by the user or sufficiently authorized by each party. It should be noted that the information (including but not limited to account information, etc.), data (including but not limited to data for analysis, stored data, displayed data, etc.) and signals involved in the embodiments of the present disclosure are authorized by the user or fully authorized by various parties, and the collection, use and processing of the relevant data need to comply with relevant laws and regulations and standards in relevant countries and regions.
The data processing method provided by the embodiments of the disclosure is executed by an electronic device, and in some embodiments the electronic device is provided as a terminal. FIG. 1 is a schematic diagram of an implementation environment of a data processing method according to an example embodiment. Referring to FIG. 1, the implementation environment includes: a first terminal 101, a second terminal 102, and a server 103. Network connections are established between the first terminal 101 and the server 103 and between the second terminal 102 and the server 103, so that the first terminal 101 and the second terminal 102 can interact with the server 103 through these network connections.
In some embodiments, the terminal is at least one of a smartphone, a smartwatch, a desktop computer, a laptop computer, an MP3 player, an MP4 player, and the like. In some embodiments, the server is a single server, a cluster of multiple servers, a cloud computing platform, or a virtualization center.
The first terminal 101 interacts with the server 103 based on a logged-in first user account, and the server 103 creates a virtual space for the first user account, so that the first terminal 101 can publish multimedia data in the virtual space based on the first user account in order to interact with second user accounts accessing the virtual space. The second terminal 102 interacts with the server 103 based on a logged-in second user account and can access the virtual space created by the server 103 for the first user account, so as to play the multimedia data in the virtual space and let the second user view the multimedia data played by the second terminal 102. The embodiment is only illustrated with one first terminal 101 and one second terminal 102; those skilled in the art will appreciate that there may be more first terminals 101 and second terminals 102, and the number of terminals and the types of devices are not limited in the embodiments of the disclosure.
In some embodiments, an application may be installed and run on the first terminal 101 and the second terminal 102, and a user may log in to the application through the first terminal 101 or the second terminal 102 to obtain a service provided by the application, while the server 103 is configured to provide that service for the application. For example, the application is a live streaming application, the virtual space is a live broadcast room, the multimedia data is live broadcast data, the first user account is an anchor account, and the second user account is a viewer account. The first terminal 101 logs in to the live streaming application based on the anchor account to carry out a live broadcast, and the second terminal 102 logs in to the live streaming application based on the viewer account to access the live broadcast room of the anchor account, so that the viewer can watch the live broadcast data of the live broadcast room through the second terminal 102.
Fig. 2 is a flowchart illustrating a data processing method according to an exemplary embodiment. The method is performed by an electronic device; taking the electronic device as a terminal for example, as shown in fig. 2, the method includes the following steps:
In step 201, an interactive control is displayed in a virtual space, where the interactive control is used to send a virtual gift to a first user account to which the virtual space belongs after being triggered, and the interactive control is used to prompt second user accounts accessing the virtual space to send a target number of virtual gifts to the first user account.
In the embodiment of the disclosure, the terminal can access, based on the logged-in second user account, a virtual space corresponding to any first user account, and then display the virtual space for the second user to watch. While the terminal displays the virtual space, the second user can send a virtual gift through the terminal to the first user account corresponding to the virtual space, thereby interacting with the first user account. While one or more terminals access the virtual space, an interactive control is displayed in the virtual space, so that each second user accessing the virtual space can see the interactive control and the content it prompts; the second user accounts in the virtual space are thus prompted to send virtual gifts to the first user account by triggering the interactive control. When the total number of virtual gifts sent by the second user accounts in the virtual space by triggering the interactive control reaches a target number, a corresponding interactive special effect is displayed, thereby achieving the effect of multi-person joint interaction in the virtual space.
For each terminal accessing the virtual space, the terminal accesses the virtual space based on a second user account, the interactive controls displayed in the virtual space are the same, and the second user can send the virtual gift to the first user account by triggering the interactive control. The first user account to which the virtual space belongs indicates that the first user account has the right to publish multimedia data in the virtual space, that is, the virtual space is created for the first user account. The first user is the user indicated by the first user account, and the second user is the user indicated by the second user account. For example, in a live broadcast scenario, the virtual space is a live broadcast room, the first user account is an anchor account, the live broadcast room corresponds to the anchor account, the first user is the anchor, and the second user is a viewer. The interactive control can be presented in any form; for example, it is displayed as a button or as a triggerable special effect. The target number is any number, for example, 100 or 150.
In step 202, when the total number of the virtual gifts sent by the second user accounts in the virtual space by triggering the interaction control reaches the target number, an interactive special effect corresponding to the interaction control is displayed in the virtual space.
In the embodiment of the disclosure, each second user account accessing the virtual space can send a virtual gift to the first user account by triggering the interaction control, and one second user account can send one or more virtual gifts in this way. The target number equals the number of virtual gifts required to trigger the interactive special effect corresponding to the interaction control, and the total number is the number of virtual gifts sent by the one or more second user accounts in the virtual space by triggering the interaction control. When the total number reaches the target number, the interactive special effect corresponding to the interaction control is displayed in the virtual space, so that the second user accounts learn from the displayed special effect that the interaction has succeeded. In other words, a scheme in which the second user accounts in the virtual space jointly send virtual gifts to the first user account is realized, achieving the effect of multi-person joint interaction in the virtual space. The interactive special effect is an effect in any form, for example, a wearing effect or an animation effect.
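To make the counting logic of step 202 concrete, the following is a minimal Python sketch, not taken from the patent itself; the class and method names (GiftCounter, record_gift, on_effect) are hypothetical. It tallies the virtual gifts sent by any second user account through the interactive control and fires the interactive special effect once the total reaches the target number.

```python
class GiftCounter:
    """Hypothetical tally of virtual gifts sent through one interactive control."""

    def __init__(self, target_number, on_effect):
        self.target_number = target_number   # gifts required to trigger the special effect
        self.total = 0                       # gifts sent so far by all second user accounts
        self.on_effect = on_effect           # callback that displays the interactive special effect
        self.effect_shown = False

    def record_gift(self, second_user_account, count=1):
        """Called whenever a second user account sends gifts by triggering the interactive control."""
        if self.effect_shown:
            return
        self.total += count
        if self.total >= self.target_number:
            self.effect_shown = True
            self.on_effect()                 # e.g. render a wearing effect or animation in the virtual space


# Usage: three viewer accounts jointly reach a target of 100 gifts.
counter = GiftCounter(target_number=100, on_effect=lambda: print("show interactive special effect"))
counter.record_gift("viewer_a", 40)
counter.record_gift("viewer_b", 35)
counter.record_gift("viewer_c", 25)   # total reaches 100 -> special effect displayed
```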
The embodiment of the disclosure provides a scheme of joint interaction, which displays an interaction control in a virtual space to prompt second user accounts in the virtual space to send virtual gifts to a first user account by triggering the interaction control, so that each second user account in the virtual space can send a virtual gift to the first user account by triggering the interaction control, and the second user accounts in the virtual space are prompted to jointly send the virtual gifts to the first user account to realize interaction with the first user account.
In some embodiments, displaying an interactive control in a virtual space includes:
and responding to the target object in the multimedia data of the virtual space to execute the target action, and displaying the interactive control in the virtual space.
In the embodiment of the disclosure, while the terminal plays the multimedia data of the virtual space, the target object in the multimedia data performing the target action indicates that the segment currently being played is a highlight moment. The interaction control is therefore displayed in the virtual space to draw in the second users and encourage them to participate in the interaction, so that the second user accounts in the virtual space subsequently send virtual gifts to the first user account by triggering the interaction control, which in turn boosts the enthusiasm of the first user.
In some embodiments, displaying an interactive control in a virtual space in response to a target object performing a target action in multimedia data of the virtual space, comprises:
and responding to the target object to execute the target action, and displaying the interactive control at the target position, wherein the target position and the target object are in a preset relative position relation.
In the embodiment of the disclosure, when the target object performs the target action in the multimedia data of the virtual space, the interaction control is displayed at a target position that is in a preset relative positional relationship with the target object, so that this relative positional relationship is maintained between the interaction control and the target object and the interaction control points at the target object; the second user then sees the interaction control while watching the target object, which further improves the display effect of the interaction control.
In some embodiments, interactive status information is displayed in the interactive control, and the interactive status information includes: the total number of virtual gifts sent by triggering the interactive control, and the target number.
Displaying the interaction state information in the interactive control presents both the total number of virtual gifts already sent by triggering the interactive control in the current virtual space and the target number, so that the second user can clearly see the completion progress of the interaction; this increases the amount of displayed information and encourages the second user to participate in the interaction.
In some embodiments, displaying an interactive control in a virtual space includes:
responding to a target object in the multimedia data of the virtual space executing a target action, and displaying an interactive control containing interactive preview information, wherein the interactive preview information is used for prompting that the interaction is about to start;
and responding to any second user account in the virtual space, sending a first virtual gift by triggering the interactive control, and switching the interactive preview information displayed in the interactive control into interactive state information.
In the embodiment of the present disclosure, when the target object executes the target action, the displayed interaction control includes interaction advance notice information to prompt that the interaction is about to start, so that the second user account in the virtual space can send the virtual gift to the first user account by triggering the interaction control, thereby initiating the interaction. Once any second user account in the virtual space sends a first virtual gift to the first user account by triggering the interaction control, the interaction preview information displayed in the interaction control is switched to interaction state information, so that the second user account accessing the virtual space can check the progress condition of sending the virtual gift, and further the second user in the virtual space is prompted to participate in interaction by triggering the interaction control.
In some embodiments, displaying an interactive control in a virtual space includes:
displaying a first interactive control in a multimedia picture display area in a virtual space, wherein the first interactive control is used for prompting a second user account in the virtual space to send virtual gifts with a target quantity to the first user account;
displaying a second interactive control in a comment information display area in the virtual space, wherein the second interactive control is used for sending a target virtual gift to the first user account after being triggered;
under the condition that the total number of the virtual gifts sent by the second user accounts in the virtual space through triggering the interactive controls reaches the target number, the displaying of the interactive special effect corresponding to the interactive controls in the virtual space includes:
and displaying the interactive special effect in the virtual space under the condition that the total number of the virtual gifts sent by the second user account in the virtual space through triggering the first interactive control and the second interactive control reaches the target number.
Displaying different interactive controls in different display areas of the virtual space achieves a strong reminder effect, prompting the second user accounts in the virtual space to send virtual gifts to the first user account by triggering either interactive control; this enriches the ways of sending virtual gifts while also ensuring interaction efficiency. The second interactive control is used to send the target virtual gift to the first user account after being triggered; once it is triggered, the target virtual gift is sent to the first user account, which simplifies the operation of sending a virtual gift and improves sending efficiency.
In some embodiments, displaying an interactive special effect corresponding to the interactive control in the virtual space when the total number of the virtual gifts sent by the second user account in the virtual space by triggering the interactive control reaches the target number includes:
and displaying the interactive special effect in the virtual space under the condition that the total number of the interactive control in the effective interactive time period reaches the target number.
Setting an effective interaction time period for the interaction control means that the second user accounts in the virtual space interact with the first user account through the interaction control within that period, realizing a scheme of interacting with the first user account during a specific time period. This enriches the interaction modes in the virtual space, prompts the second users to participate as soon as possible, heightens their sense of urgency, and further improves the interaction effect.
In some embodiments, displaying an interactive control in a virtual space includes:
displaying the interactive control in the virtual space under the condition that the multimedia data in the virtual space belongs to the target type;
and the interactive special effect corresponding to the interactive control is matched with the target type.
In the embodiment of the present disclosure, when the total number of the virtual gifts sent by the interaction control is triggered by the second user account in the virtual space reaches the target number, the interaction effect displayed in the virtual space is matched with the target type, so that the style of the displayed interaction effect is consistent with that of the virtual space, and the display effect of the interaction effect is ensured.
In some embodiments, displaying an interactive control in a virtual space includes:
displaying the interactive control in the virtual space under the condition that the frequency of displaying the interactive control in the virtual space in the current period is less than a frequency threshold value; or,
and displaying the interactive control in the virtual space under the condition that the interval duration between the current moment and the moment of displaying the interactive control in the virtual space last time is greater than the target duration.
In the embodiment of the disclosure, limiting the number of times the interactive control is displayed in each period ensures that the control appears only a limited number of times, avoids a situation in which the first user repeatedly induces second users to send virtual gifts, maintains the normal order of the virtual space, and also preserves the appeal of the joint interaction realized through the interactive control.
Whether the interval since the interactive control was last displayed in the virtual space is greater than the target duration is determined, and the interactive control is displayed in the virtual space only when that interval exceeds the target duration, which prevents the second user from becoming averse to the interactive control because it appears too frequently.
In some embodiments, after displaying the interactive control in the virtual space, the method further comprises:
responding to the triggering operation of the interactive control, and displaying the virtual gift to be sent;
and sending the virtual gift to the first user account in response to the triggering operation of any displayed virtual gift.
In the embodiment of the present disclosure, the interactive control is used to call up the virtual gifts to be sent. By triggering the interactive control, the second user account calls up and displays the virtual gifts to be sent, so that the second user can select any one of them to send; the interactive control thus serves as an entry point for sending virtual gifts, simplifying the operation and improving the efficiency of sending virtual gifts.
Fig. 2 shows only the basic flow of an embodiment of the present disclosure; the scheme provided by the embodiment of the present disclosure is further explained below based on a specific implementation. Fig. 3 is a flowchart illustrating another data processing method according to an exemplary embodiment. The method is executed by an electronic device; taking the electronic device as a terminal for example, referring to fig. 3, the method includes:
In step 301, in response to a target object in multimedia data in a virtual space performing a target action, an interaction control is displayed in the virtual space. The interaction control is used to send a virtual gift to a first user account to which the virtual space belongs after being triggered, and is used to prompt second user accounts accessing the virtual space to send a target number of virtual gifts to the first user account.
In the embodiment of the present disclosure, a first user account can publish multimedia data in a virtual space belonging to the first user account, so that a second user account accessing the virtual space can view the multimedia data. And if the target object executes the target action, displaying the interaction control in the virtual space so as to realize the joint interaction between the second user account in the virtual space and the first user account by triggering the interaction control in the subsequent step.
The multimedia data of the virtual space is the multimedia data published in the virtual space by the first user account to which the virtual space belongs. The target object is any object in the multimedia data; for example, it is a person or an animal in the multimedia data. The target object is related to the type of the multimedia data. In some embodiments, in a live broadcast scene, the virtual space is a live broadcast room, the multimedia data is live broadcast data, the first user account is an anchor account, and the second user account is a viewer account; when the multimedia data belongs to a dancing live type or a singing live type, the target object is the anchor or another person in the multimedia data, and when the multimedia data belongs to a pet live type, the target object is a pet in the multimedia data. The target action is any action, for example, a twisting action or a head-shaking action. The virtual space is a playing interface displayed by the terminal, the multimedia data is played in that playing interface, and the terminal displays the interactive control in the playing interface in response to the target object performing the target action.
In the embodiment of the disclosure, while the terminal plays the multimedia data of the virtual space, the target object in the multimedia data performing the target action indicates that the segment currently being played is a highlight moment. The interaction control is therefore displayed in the virtual space to draw in the second users and encourage them to participate in the interaction, so that the second user accounts in the virtual space subsequently send virtual gifts to the first user account by triggering the interaction control, which in turn boosts the enthusiasm of the first user.
For example, taking the target object as a first user indicated by the first user account and the target action as a dancing action as an example, in the process of playing the multimedia data, in response to the first user executing the dancing action, the first user is shown dancing at the moment, and if the first user is considered as a wonderful segment of the multimedia data at the moment, the interactive control is displayed, so that the second user account in the subsequent virtual space sends the virtual gift to the first user account by triggering the interactive control.
In some embodiments, the interaction control displayed in the virtual space corresponds to an interaction task, where the interaction task is a target number of virtual gifts sent by a second user account in the virtual space by triggering the interaction control, when a target object executes a target action, which is equivalent to triggering the interaction task, a plurality of terminals logging in the second user account in the virtual space display the interaction control, so that each second user account can participate in the interaction task in a manner of triggering the interaction control to send a virtual gift to the first user account, and then a corresponding interaction special effect is triggered after the interaction task is completed, so that the second user account participating in the interaction task can be fed back.
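As an illustration of the interaction task flow, here is a hedged sketch of how a viewer-side terminal might react when it is told to display the interaction control; the notification fields, the StubUI class, and the function name are assumptions made for this example, not the patent's actual protocol.

```python
class StubUI:
    """Stand-in for the terminal's rendering layer (an assumption for this sketch)."""
    def show_interactive_control(self, position, text):
        print(f"interactive control at {position}: {text}")


def handle_control_display_notification(notification, ui):
    """Hypothetical handler run by each terminal that has the virtual space open."""
    target_number = notification["target_number"]           # gifts needed to complete the interaction task
    preview_text = notification.get("preview_text")         # interaction advance notice, e.g. "charging is about to start"
    ui.show_interactive_control(
        position=notification.get("position", "default"),   # e.g. anchored near the target object
        text=preview_text or f"0/{target_number}",
    )


handle_control_display_notification(
    {"target_number": 100, "preview_text": "charging is about to start"}, StubUI()
)
```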
In some embodiments, step 301 comprises: in response to the target object performing the target action, displaying the interactive control at a target position in the virtual space, where the target position is in a preset relative positional relationship with the target object.
The target position and the target object are in a preset relative position relationship, which means that the target position and the target object keep a fixed relative position relationship, and the target position moves along with the movement of the target object, so that the relative position relationship between the target position and the target object is kept unchanged. The target object executes the target action in the multimedia data of the virtual space, the interactive control is displayed at the target position which is in the preset relative position relation with the target object, the preset relative position relation is kept between the interactive control and the target object, the target object is indicated by the interactive control, the second user can view the interactive control when watching the target object, and the display effect of the interactive control is further improved.
In a possible implementation of the above embodiment, the target position is in a preset relative position relationship with a target portion of the target object, the target portion being any portion in the target object, for example, the target portion is a head, a hand, or the like.
For example, taking the target portion as the head of the target object, the target position is a position above and in contact with the head, and the interactive control is displayed at that position. When the head of the target object shakes, or the target object lowers its head, the interactive control moves along with the head, presenting the effect that the interactive control follows the movement of the head and also the effect that the target object is wearing the interactive control.
In one possible implementation of the above embodiment, the interactive control is displayed in the form of a triggerable wearing special effect. When the interactive control is displayed in a triggerable wearing special effect mode, the effect that the target object wears the interactive control is presented, and the display effect of the interactive control is enhanced.
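One plausible way to keep the interactive control at a fixed offset from the target object's head, so that it follows the head's movement frame by frame, is sketched below; the bounding-box format and the offset value are assumptions for illustration and are not specified by the patent.

```python
def control_position(head_box, offset_y=-20):
    """Place the interactive control just above the head bounding box.

    `head_box` is assumed to be (x, y, width, height) in screen pixels for the
    target object's head in the current multimedia frame; `offset_y` is a
    hypothetical gap so the control appears to be worn on top of the head.
    """
    x, y, w, h = head_box
    return (x + w / 2, y + offset_y)   # horizontally centred, slightly above the head


# Re-evaluated every frame: as the head moves, the control moves with it.
for frame_head_box in [(100, 200, 80, 80), (110, 195, 80, 80), (120, 190, 80, 80)]:
    print(control_position(frame_head_box))
```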
In some embodiments, the interactive control has interactive status information displayed therein, and the interactive status information includes: the second user account in the virtual space sends the total number and the target number of the virtual gifts by triggering the interactive control.
In the embodiment of the present disclosure, the total number displayed in the interactive control changes in real time; that is, after a second user account in the virtual space triggers the interactive control to send a virtual gift to the first user account, the total number displayed in the interactive control changes accordingly. The total number and the target number in the interaction status information can be displayed in any form, for example, as a number or as a progress bar. Displaying the interaction state information in the interactive control presents both the total number of virtual gifts already sent by triggering the interactive control in the current virtual space and the target number, so that the second user can clearly see the completion progress of the interaction; this increases the amount of displayed information and encourages the second user to participate in the interaction.
In a possible implementation manner of the foregoing embodiment, the interaction state information further includes an effective interaction time corresponding to the interaction control. For example, the effective interaction time includes a display duration of the interaction control and a first preset duration, where the first preset duration is an effective interaction duration corresponding to the interaction control. For another example, the effective interaction time is displayed in a countdown manner, and the presented time is the remaining duration of the effective time period corresponding to the interaction control.
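A small sketch of how the interaction state information might be rendered as text, combining the running total, the target number, and a countdown of the remaining effective time; the exact formatting is an assumption for illustration only.

```python
def format_interaction_status(total, target, remaining_seconds):
    """Hypothetical status line shown inside the interactive control."""
    progress = min(total / target, 1.0)
    filled = int(progress * 10)
    bar = "#" * filled + "-" * (10 - filled)   # simple ten-segment progress bar
    return f"[{bar}] {total}/{target}  {remaining_seconds}s left"


print(format_interaction_status(total=50, target=100, remaining_seconds=10))
# [#####-----] 50/100  10s left
```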
In a possible implementation manner of the foregoing embodiment, step 301 includes: in response to any second user account in the virtual space sending a first virtual gift by triggering the interactive control, switching the interactive preview information displayed in the interactive control into interactive state information.
The interactive preview information is used to prompt that the interaction is about to start, for example, the interactive preview information is "charging is about to start". The first virtual gift is a first one of the virtual gifts sent by the second user account of the virtual space to the first user account through the interactive control, and the second user account sending the first virtual gift may be a second user account logged in by the local device or a second user account logged in by another device.
In the embodiment of the present disclosure, when the target object executes the target action, the displayed interaction control includes interaction advance notice information to prompt that the interaction is about to start, so that the second user account in the virtual space can send the virtual gift to the first user account by triggering the interaction control, thereby initiating the interaction. Once any second user account in the virtual space sends a first virtual gift to the first user account by triggering the interaction control, the interaction preview information displayed in the interaction control is switched to interaction state information, so that the second user account accessing the virtual space can check the progress condition of sending the virtual gift, and further the second user in the virtual space is prompted to participate in interaction by triggering the interaction control.
In some embodiments, the interactive control includes a display area and a touch area, the display area displays the interactive preview information or the interactive status information, the touch area displays the special effect identifier, the touch area can be triggered, and the second user account can send the virtual gift to the first user account by triggering the touch area. The effect identifier indicates an interactive effect corresponding to the interactive control, for example, the effect identifier is represented by a pattern identifier of the interactive effect or by an effect name of the interactive effect. By displaying the special effect identification in the interactive control, the second user account can know the interactive special effect corresponding to the special effect identification in advance, and the second user account is prompted to trigger the interactive control to send the virtual gift, so that the special effect corresponding to the special effect identification is triggered.
It should be noted that in the embodiment of the present disclosure, when any second user account in the virtual space sends the first virtual gift by triggering the interactive control, the interactive preview information displayed in the interactive control is switched to the interactive state information, and in another embodiment, the interactive preview information displayed in the interactive control can also be switched to the interactive state information at other times. In some embodiments, the interactive preview information displayed in the interactive control is switched to the interactive state information when the display duration of the interactive preview information reaches a first preset duration.
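The switch from interaction preview information to interaction state information can be pictured as a tiny state machine that flips either when the first virtual gift arrives or when the preview has been shown for the first preset duration. The sketch below, with hypothetical class and method names, is only one way to express that rule.

```python
import time


class ControlText:
    """Hypothetical text state of the interactive control."""

    def __init__(self, preview_text, preview_duration):
        self.preview_text = preview_text          # interaction preview (advance notice) information
        self.preview_duration = preview_duration  # first preset duration, in seconds
        self.shown_at = time.monotonic()
        self.in_preview = True

    def on_first_gift(self, total, target):
        """First virtual gift sent in the virtual space: switch to interaction state information."""
        self.in_preview = False
        return f"{total}/{target}"

    def current_text(self, total, target):
        """Also switch once the preview has been displayed for the first preset duration."""
        if self.in_preview and time.monotonic() - self.shown_at < self.preview_duration:
            return self.preview_text
        self.in_preview = False
        return f"{total}/{target}"
```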
In step 302, in an effective interaction time period corresponding to the interaction control, when the total number of the virtual gifts sent by the second user accounts in the virtual space by triggering the interaction control reaches the target number, an interactive special effect corresponding to the interaction control is displayed in the virtual space.
The interactive control has an effective interaction time period; the second user accounts can send virtual gifts to the first user account by triggering the interactive control within this period, and only the virtual gifts sent by triggering the interactive control within the effective interaction time period are counted as effective virtual gifts, with the interactive special effect displayed according to whether the total number obtained from this counting reaches the target number. In the embodiment of the disclosure, setting the effective interaction time period for the interaction control means that the second user accounts in the virtual space interact with the first user account through the interaction control within that period, realizing a scheme of interacting with the first user account during a specific time period; this enriches the interaction modes in the virtual space, prompts the second users to participate as soon as possible, heightens their sense of urgency, and further improves the interaction effect.
For example, taking the interaction effect as an example of a wearing effect, when the total number reaches a target number, the wearing effect is displayed in a virtual space, and taking the wearing effect as virtual glasses and virtual earrings as examples, the effect that the target object wears the virtual glasses and the virtual earrings is displayed in the virtual space, so that the display effect of the virtual space is enriched, and a second user account in the virtual space can obtain feedback of sending a virtual gift, so that the success of the joint interaction is known.
In some embodiments, the effective interaction time period is a time period corresponding to any duration after the interactive control display time. For example, 30 seconds after the interactive control display time is the effective interactive time period.
In some embodiments, the interactive control can be triggered during the active interaction time period, and the interactive control cannot be triggered outside the active interaction time period.
It should be noted that, in the embodiment of the present disclosure, the interactive special effect is displayed in the virtual space when the total number reaches the target number within the effective interaction time period; in another embodiment, step 302 need not be executed in this way, and instead the interactive special effect corresponding to the interactive control is displayed in the virtual space whenever the total number of virtual gifts sent by the second user accounts in the virtual space by triggering the interactive control reaches the target number.
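One way to realize the effective interaction time period of step 302 is to timestamp the control when it is displayed and count only the gifts whose sending time falls inside the window; the sketch below uses assumed names and the 30-second window from the example above.

```python
import time


class TimedGiftCounter:
    """Hypothetical counter that only accepts gifts inside the effective interaction time period."""

    def __init__(self, target_number, window_seconds=30):
        self.target_number = target_number
        self.window_seconds = window_seconds     # e.g. 30 s after the control is displayed
        self.displayed_at = time.monotonic()     # moment the interactive control was displayed
        self.total = 0

    def record_gift(self, count=1):
        """Returns True when the interactive special effect should be displayed."""
        if time.monotonic() - self.displayed_at > self.window_seconds:
            return False                         # outside the effective period: gift not counted
        self.total += count
        return self.total >= self.target_number
```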
In some embodiments, while the interactive control is displayed, the second user account can send the virtual gift to the first user account through the interactive control, and the sending of the virtual gift includes: the terminal, in response to a trigger operation on the interactive control, displays the virtual gifts to be sent; and, in response to a trigger operation on any displayed virtual gift, sends that virtual gift to the first user account.
In the embodiment of the present disclosure, the interactive control is used to call up the virtual gifts to be sent. By triggering the interactive control, the second user account calls up and displays the virtual gifts to be sent, so that the second user can select any one of them to send; the interactive control thus serves as an entry point for sending virtual gifts, simplifying the operation and improving the efficiency of sending virtual gifts.
In one possible implementation manner of the foregoing embodiment, the virtual gifts to be sent are displayed in the form of a gift list, that is, the process of displaying the virtual gifts includes: and the terminal responds to the triggering operation of the interactive control and displays a virtual gift list, wherein the virtual gift list comprises the virtual gift to be sent.
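The gift-sending entry point just described can be sketched as two steps: triggering the interactive control displays a list of candidate gifts, and triggering one of the displayed gifts sends it to the first user account. The gift names and function names below are placeholders, not values from the patent.

```python
def on_control_triggered(available_gifts):
    """Step 1: triggering the interactive control displays the virtual gifts to be sent."""
    for index, gift in enumerate(available_gifts):
        print(f"{index}: {gift}")
    return available_gifts


def on_gift_selected(available_gifts, index, first_user_account):
    """Step 2: triggering a displayed virtual gift sends it to the first user account."""
    gift = available_gifts[index]
    print(f"send {gift} to {first_user_account}")
    return gift


gifts = on_control_triggered(["rose", "rocket", "light stick"])   # hypothetical gift list
on_gift_selected(gifts, 1, first_user_account="anchor_account")
```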
The embodiment of the disclosure provides a scheme of joint interaction, which displays an interaction control in a virtual space to prompt second user accounts in the virtual space to send virtual gifts to a first user account by triggering the interaction control, so that each second user account in the virtual space can send a virtual gift to the first user account by triggering the interaction control, and the second user accounts in the virtual space are prompted to jointly send the virtual gifts to the first user account to realize interaction with the first user account.
Also, while the terminal plays the multimedia data of the virtual space, the target object in the multimedia data performing the target action indicates that the segment being played is a highlight segment, that is, a highlight moment; the interaction control is therefore displayed in the virtual space to draw in the second users and encourage them to participate in the interaction, so that the second user accounts in the virtual space subsequently send virtual gifts to the first user account by triggering the interaction control, which in turn boosts the enthusiasm of the first user.
Also, the interaction state information displayed in the interaction control presents the total number of virtual gifts already sent by triggering the interaction control in the current virtual space together with the target number, so that the second user can clearly see the completion progress of the interaction; this increases the amount of displayed information and encourages the second user to participate in the interaction.
In addition, an effective interaction time period is set for the interaction control, so that the second user account in the virtual space interacts with the first user account through the interaction control in the effective interaction time period, a scheme that a plurality of second user accounts jointly interact with the first user account in a specific time period is realized, an interaction mode in the virtual space is enriched, an interaction effect and interaction experience of the second user are improved, and entertainment is also improved.
On the basis of the embodiment shown in fig. 3, the embodiment of the present disclosure may further limit the number of times the interactive control is displayed in the virtual space corresponding to the first user account or the time interval between the multiple displayed interactive controls. The process of displaying the interactive control further includes the following two ways:
the first mode is as follows: and displaying the interactive control in the virtual space under the condition that the target object in the multimedia data of the virtual space executes the target action and the frequency of displaying the interactive control in the virtual space in the current period is less than the frequency threshold.
The number threshold is an arbitrary number, and for example, the number threshold is 3 or 5. The number of times that the virtual space corresponding to each first user account displays the interactive control in one period cannot exceed a threshold number of times, where the period is any duration, for example, the period is one day or one week.
In this embodiment of the disclosure, in each period, the number of times that the virtual space corresponding to the first user account displays the interactive control cannot exceed the number threshold, and therefore, each time the virtual space corresponding to the first user account needs to display the interactive control, it is necessary to first determine whether the number of times that the interactive control is displayed in the virtual space corresponding to the first user account exceeds the number threshold, and only when the number of times that the interactive control is displayed is less than the number threshold, the interactive control is displayed in the virtual space.
The second mode is as follows: and displaying the interactive control in the virtual space under the condition that the target object in the multimedia data of the virtual space executes the target action and the interval duration between the current moment and the moment when the interactive control is displayed last time in the virtual space is greater than the target duration.
The target time period is an arbitrary time period, for example, the target time period is 3 hours or 1 hour. In the embodiment of the disclosure, each time the virtual space corresponding to the first user account needs to display the interactive control, it needs to first determine whether an interval duration between the current time and a time when the interactive control is displayed last time in the virtual space is greater than a target duration, and only when the interval duration is greater than the target duration, the interactive control is displayed in the virtual space, so that a situation that a second user feels uncomfortable to the interactive control due to frequent occurrence of the interactive control can be avoided.
It should be noted that the above two manners can be combined, that is, the interactive control is displayed in the virtual space under the condition that the target object in the multimedia data in the virtual space executes the target action, the number of times the interactive control is displayed in the virtual space in the current period is less than the number threshold, and the interval duration between the current time and the last time the interactive control is displayed in the virtual space is greater than the target duration.
It should be noted that, in the embodiment of fig. 3, building on the foregoing embodiment, the interactive control is displayed in response to the target object in the multimedia data of the virtual space performing the target action only when the number of displays, or the interval between the current time and the time the interactive control was last displayed, meets the requirement. In another embodiment, the number of displays or the interval is checked at other times rather than being tied to the target object performing the target action, and the interactive control is displayed in the virtual space when the number of displays or the interval is determined to meet the requirement.
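The two display limits described above, a per-period count threshold and a minimum interval since the last display, can be combined in a single check. The sketch below is a plausible formulation; the parameter values (three displays per day, a one-hour interval) are only examples consistent with the text, which states that the threshold and the durations can be any values.

```python
import time


class ControlDisplayLimiter:
    """Hypothetical limiter deciding whether the interactive control may be shown again."""

    def __init__(self, max_per_period=3, period_seconds=24 * 3600, min_interval_seconds=3600):
        self.max_per_period = max_per_period               # e.g. at most 3 displays per day
        self.period_seconds = period_seconds
        self.min_interval_seconds = min_interval_seconds   # e.g. at least 1 hour between displays
        self.display_times = []

    def may_display(self, now=None):
        now = time.monotonic() if now is None else now
        recent = [t for t in self.display_times if now - t < self.period_seconds]
        if len(recent) >= self.max_per_period:
            return False                                   # count threshold reached for this period
        if recent and now - recent[-1] < self.min_interval_seconds:
            return False                                   # too soon since the last display
        return True

    def record_display(self, now=None):
        self.display_times.append(time.monotonic() if now is None else now)
```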
Fig. 4 is a flowchart illustrating a data processing method according to another exemplary embodiment. The method is executed by an electronic device; taking the electronic device as a terminal for example, referring to fig. 4, the method includes:
In step 401, in case that the multimedia data of the virtual space belongs to the target type, an interactive control is displayed in the virtual space.
The target type is any type, in the embodiment of the present disclosure, each multimedia data belongs to one type, and the target type is any one or more of multiple types. For example, in a live scene, the multimedia data is live data, each live data belongs to a live type, and the target type is any one or more of a plurality of live types, such as a dancing live type or a singing live type.
When the multimedia data of the virtual space belongs to the target type, it indicates that the multimedia data of the virtual space is highlight content, so an interaction control is displayed in the virtual space to draw in the second users and encourage them to participate in the interaction; the second user accounts in the virtual space subsequently send virtual gifts to the first user account by triggering the interaction control, which in turn boosts the enthusiasm of the first user.
In some embodiments, the type of the multimedia data of the virtual space is set by the first user, or determined by detecting the multimedia data of the virtual space.
For example, in a live broadcast scene, if an anchor sets the dance type for a live broadcast room, the terminal logged in with the anchor account acquires live broadcast data belonging to the dance type and, by interacting with the server, has the server publish the live broadcast data belonging to the dance type in the live broadcast room.
For another example, in the live broadcast process, the local device or the server extracts a live broadcast image from live broadcast data, performs type recognition on the live broadcast image, and determines a live broadcast type to which the live broadcast data belongs.
In step 402, when the total number of the virtual gifts sent by the second user accounts in the virtual space by triggering the interaction control reaches the target number, an interactive special effect corresponding to the interaction control is displayed in the virtual space, and the interactive special effect matches the target type.
Each target type corresponds to an interactive special effect. For example, if the target type is the dancing live type, the matched interactive special effect is an animation showing a virtual character dancing; if the target type is the singing live type, the matched interactive special effect is an animation showing a microphone and a headset interacting with each other.
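Matching the interactive special effect to the target type can be as simple as a lookup table, as in this illustrative sketch; the type keys follow the live-broadcast examples in the text, while the effect identifiers are made up.

```python
# Hypothetical mapping from live-broadcast type to the matched interactive special effect.
EFFECT_BY_TYPE = {
    "dance_live": "dancing_character_animation",
    "singing_live": "microphone_headset_animation",
}


def effect_for(live_type, default_effect="generic_celebration"):
    """Pick the special effect whose style matches the target type of the multimedia data."""
    return EFFECT_BY_TYPE.get(live_type, default_effect)


print(effect_for("dance_live"))    # dancing_character_animation
print(effect_for("singing_live"))  # microphone_headset_animation
```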
The embodiment of the disclosure provides a scheme of joint interaction, which includes displaying an interaction control in a virtual space to prompt a second user account in the virtual space to send a virtual gift to a first user account by triggering the interaction control, so that each second user account in the virtual space can send the virtual gift to the first user account by triggering the interaction control, and the second user accounts in the virtual space are prompted to jointly send the virtual gift to the first user account to realize interaction with the first user account.
When the multimedia data belongs to the target type, this indicates that the multimedia data of the virtual space is highlight content, so the interaction control is displayed in the virtual space to prompt the second user accounts in the virtual space to send virtual gifts to the first user account by triggering the interaction control; this boosts the enthusiasm of the first user and further improves the interaction effect.
And under the condition that the total number of the virtual gifts sent by the interaction control is triggered by the second user account in the virtual space reaches the target number, the interaction special effect displayed in the virtual space is matched with the target type, so that the style of the displayed interaction special effect is consistent with that of the virtual space, and the display effect of the interaction special effect is ensured.
On the basis of the embodiment shown in fig. 2, the interactive control displayed in the virtual space includes a first interactive control and a second interactive control, and the second user account can send the virtual gift to the first user account by triggering both the first interactive control and the second interactive control. Correspondingly, as shown in fig. 5, the method is executed by an electronic device, and taking the electronic device as a terminal as an example, the data processing method includes:
In step 501, a first interactive control is displayed in the multimedia picture display area in the virtual space, where the first interactive control is used to prompt second user accounts accessing the virtual space to send a target number of virtual gifts to the first user account to which the virtual space belongs.
In the embodiment of the disclosure, the two interactive controls are displayed in different areas of the virtual space, so that an effect of strong interactive reminding can be achieved, the second user account in the virtual space is reminded to send the virtual gift to the first user account by triggering any interactive control, and a virtual gift sending mode is enriched.
The multimedia picture display area is used to display the pictures of the multimedia data of the virtual space; that is, the pictures included in the multimedia data of the virtual space are displayed in this area. Displaying the first interactive control in the multimedia picture display area enhances its display effect so that the second user can see it. The first interactive control is used to prompt the number of virtual gifts that need to be sent to the first user account in the current interaction, namely the target number, thereby prompting the second user accounts to participate in the interaction and to send virtual gifts to the first user account by triggering the first interactive control.
It should be noted that the first interactive control displayed in step 501 is similar to the interactive control displayed in step 301, and is not described herein again.
In step 502, a second interactive control is displayed in the comment information display area in the virtual space, and the second interactive control is used for sending the target virtual gift to the first user account after being triggered.
The comment information display area is used to display comment information in the virtual space. The second interactive control is bound to a target virtual gift, where the target virtual gift is any one of the virtual gifts to be sent. In the embodiment of the disclosure, once the second interactive control is triggered, the target virtual gift is sent to the first user account, which simplifies the operation of sending a virtual gift and improves sending efficiency.
In some embodiments, a prompt message is displayed in the second interactive control, and the prompt message is used for prompting that the second interactive control is triggered to send the target virtual gift to the first user account.
In some embodiments, the second interactive control is fixedly displayed in the comment information display area. In the embodiment of the disclosure, the comment information sent by the second user accounts in the virtual space is displayed in the comment information display area, and the displayed comment information is continuously updated as new comments arrive; the second interactive control, however, is fixedly displayed in the comment information display area with its position unchanged, so that it can still serve as a reminder to the second user and is not pushed out of view, and thus deprived of its reminding effect, by newly displayed comment information.
In some embodiments, the second interactive control displayed in the comment information display area in the virtual space can be dismissed. For example, in response to any second user account in the virtual space sending a first virtual gift to the first user account through the first interactive control or the second interactive control, the second interactive control is no longer displayed; or, when the display duration of the second interactive control reaches a second preset duration, the second interactive control is no longer displayed. The second preset duration is any duration. As shown in the second diagram in fig. 6, a first interactive control 601 is displayed in the multimedia picture display area in the virtual space, and a second interactive control 602 is displayed in the comment information display area in the virtual space; as shown in the third diagram in fig. 6, the second interactive control displayed in the comment information display area is no longer displayed.
In step 503, when the total number of the virtual gifts sent by the second user accounts in the virtual space by triggering the first interactive control and the second interactive control reaches the target number, an interactive special effect corresponding to the interactive controls is displayed in the virtual space.
In the embodiment of the disclosure, when the first interactive control and the second interactive control are displayed in the virtual space, a second user account in the virtual space can send a virtual gift to the first user account based on either interactive control, and the virtual gifts sent by triggering the first interactive control or the second interactive control are all counted as virtual gifts sent by second user accounts participating in the interaction. The total number of virtual gifts sent by the second user accounts in the virtual space by triggering the first interactive control and the second interactive control is therefore counted, and once the total number reaches the target number, the interactive special effect corresponding to the interactive controls is displayed. Step 503 is similar to step 302 and will not be described herein again.
The embodiment of the disclosure provides a scheme of joint interaction, which displays an interaction control in a virtual space to prompt second user accounts in the virtual space to send virtual gifts to a first user account by triggering the interaction control, so that each second user account in the virtual space can send a virtual gift to the first user account by triggering the interaction control, and the second user accounts in the virtual space are prompted to jointly send the virtual gifts to the first user account to realize interaction with the first user account.
In addition, different interaction controls are displayed in different display areas in the virtual space, so that the effect of strong interaction reminding can be achieved, a second user account in the virtual space is reminded to send a virtual gift to the first user account by triggering any one of the interaction controls, the virtual gift sending mode is enriched, and the interaction efficiency is also guaranteed.
And the second interactive control is used for sending the target virtual gift to the first user account after being triggered, and once the second interactive control is triggered, the target virtual gift is sent to the first user account, so that the operation of sending the virtual gift is simplified, and the sending efficiency of the virtual gift is improved.
It should be noted that the embodiments shown in fig. 2 to fig. 5 can be combined arbitrarily. As shown in fig. 6, the data processing method includes: when the multimedia data of the virtual space belongs to the target type, in response to the target object in the multimedia data performing the target action, displaying a first interaction control 601 in the multimedia picture display area in the virtual space and a second interaction control 602 in the comment information display area in the virtual space; and, within the effective interaction time period corresponding to the interaction controls, when the total number of virtual gifts sent by the second user accounts in the virtual space by triggering the interaction controls reaches the target number, displaying an interactive special effect 603 corresponding to the interaction controls in the virtual space.
In some embodiments, as shown in fig. 6, the first interactive control 601 includes a display area and a touch area. The display area displays the interactive preview information or the interactive status information; in fig. 6, the interactive preview information is "charging is about to start", and the interactive status information includes "current total is 50, target number is 100, and remaining effective duration is 10 seconds". The touch area displays a special effect identifier, and the second user account can send the virtual gift to the first user account by triggering the touch area.
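As a purely illustrative sketch of the two kinds of text shown in the display area of fig. 6 (the preview text before any gift arrives and the status text afterwards), the following Python fragment may help; the field names and helper are assumptions:

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class FirstControlState:
    target_number: int
    total: int = 0
    remaining_seconds: Optional[int] = None
    interaction_started: bool = False

    def display_text(self) -> str:
        if not self.interaction_started:
            return "charging is about to start"            # interactive preview information
        return (f"current total {self.total}, target {self.target_number}, "
                f"remaining {self.remaining_seconds}s")     # interactive status information


state = FirstControlState(target_number=100)
print(state.display_text())   # preview text
state.interaction_started, state.total, state.remaining_seconds = True, 50, 10
print(state.display_text())   # status text, matching fig. 6
```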
It should be noted that, on the basis of the embodiments shown in fig. 2 to fig. 5, the terminal implements data processing by interacting with the server. As shown in fig. 7, taking the data processing method being executed by the server and the first terminal as an example, the method includes:
in step 701, when the multimedia data in the virtual space belongs to the target type, the server, in response to the target object in the multimedia data performing the target action, sends a control display notification to each terminal accessing the virtual space, where the control display notification carries the interactive preview information.
In the embodiment of the disclosure, the server creates a virtual space for the first user account, and, upon receiving multimedia data sent by the terminal logged in to the first user account, publishes the multimedia data in the virtual space so that the terminals accessing the virtual space can receive and play it. The server also determines the type of the multimedia data; if the multimedia data in any virtual space belongs to the target type, the server performs action detection on the target object in the multimedia data to determine whether the target object performs the target action.
In some embodiments, the server performs type recognition on the multimedia data and determines the type to which it belongs. In the embodiment of the present disclosure, the type to which the multimedia data belongs is the viewpoint tag corresponding to the multimedia data; by identifying the type of the multimedia data in each virtual space, the corresponding viewpoint tag is determined, and it can then be judged whether that viewpoint tag is the target viewpoint tag (i.e., the target type).
In some embodiments, the server extracts a multimedia image from the multimedia data at intervals of a third preset duration, performs motion recognition on the extracted multimedia image, and obtains a recognition result indicating whether the target object in the multimedia image performs the target action.
The third preset duration is any duration, for example 10 seconds or 8 seconds. In the embodiment of the disclosure, the server extracts a multimedia image from the multimedia data every third preset duration in a slicing manner and performs action recognition to determine whether the target object in the multimedia data performs the target action. For example, in a live streaming scenario, the multimedia data is live data, and the multimedia image extracted from the multimedia data is a live image.
In one possible implementation of the above embodiment, the server performs motion recognition on the multimedia image through an MMU (an image recognition tool).
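A minimal sketch of this sampling loop, with the MMU call replaced by a hypothetical recognize_action() placeholder, could look as follows (the interval, callbacks and function names are assumptions):

```python
import time
from typing import Callable


def recognize_action(image: bytes) -> bool:
    """Hypothetical stand-in for the MMU recognition step: returns True when
    the target object in the image performs the target action."""
    return False


def sample_and_detect(extract_frame: Callable[[], bytes],
                      on_target_action: Callable[[], None],
                      interval_seconds: float = 10.0) -> None:
    """Extract one multimedia image every third preset duration (e.g. 10 s)
    and fire the callback when the target action is detected."""
    while True:
        if recognize_action(extract_frame()):
            on_target_action()   # e.g. notify terminals to display the controls
        time.sleep(interval_seconds)
```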
In step 702, the first terminal receives the control display notification sent by the server and, in response to the notification, displays a first interactive control in the multimedia picture display area of the virtual space, where the first interactive control displays the interactive preview information, and displays a second interactive control in the comment information display area of the virtual space.
The first terminal is any one of a plurality of terminals accessing the virtual space, and each terminal accesses the virtual space based on the logged-in second user account. In the disclosed embodiment, each terminal accessing the virtual space is capable of displaying a first interactive control and a second interactive control in the virtual space in response to the control display notification.
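For illustration, the terminal-side handling of the control display notification might be sketched as follows; the UI object and its methods are hypothetical stubs rather than the disclosure's API:

```python
class StubUI:
    def show_in_picture_area(self, control_id: str, text: str = "") -> None:
        print(f"picture area: {control_id} ({text})")

    def show_in_comment_area(self, control_id: str) -> None:
        print(f"comment area: {control_id}")


def on_control_display_notification(notification: dict, ui: StubUI) -> None:
    # The first control goes into the multimedia picture display area with the preview
    # text; the second control goes into the comment information display area.
    preview = notification.get("interactive_preview_info", "charging is about to start")
    ui.show_in_picture_area("first_interactive_control", text=preview)
    ui.show_in_comment_area("second_interactive_control")


on_control_display_notification({"interactive_preview_info": "charging is about to start"}, StubUI())
```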
In step 703, in response to any second user account in the virtual space sending a first virtual gift to the first user account by triggering the first interactive control or the second interactive control, the server sends interaction state information to each terminal accessing the virtual space.
In the embodiment of the disclosure, for each terminal accessing the virtual space, when the terminal displays the first interactive control and the second interactive control, a virtual gift can be sent to the first user account by triggering either control; the terminal notifies the server of the sent virtual gift, and the server forwards the virtual gift to the first user account.
In some embodiments, the process of sending the virtual gift to the first user account includes: the terminal displays at least one virtual gift to be sent in response to a triggering operation on the first interactive control, and sends a first gift sending request to the server in response to a triggering operation on any virtual gift, where the first gift sending request carries the virtual gift and the control identifier corresponding to the first interactive control; the server, in response to the first gift sending request, sends the virtual gift to the terminal logged in to the first user account and updates the total number in the interaction state information based on the control identifier.
In the embodiment of the disclosure, the first gift sending request sent by the terminal to the server carries the control identifier corresponding to the first interactive control, so that the server can recognize, based on the control identifier, that the virtual gift was sent via the interactive control, which ensures the accuracy of the updated total number in the interaction state information.
In some embodiments, the process of sending the virtual gift to the first user account includes: the terminal sends a second gift sending request to the server in response to a triggering operation on the second interactive control, where the second gift sending request carries the target virtual gift and the control identifier corresponding to the second interactive control; the server, in response to the second gift sending request, sends the target virtual gift to the terminal logged in to the first user account and updates the total number in the interaction state information based on the control identifier.
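The following Python sketch illustrates, under assumed field names, how the server could distinguish the two gift sending requests by their control identifier before updating the counted total; it is not the disclosure's actual protocol:

```python
FIRST_CONTROL_ID = "first_interactive_control"
SECOND_CONTROL_ID = "second_interactive_control"


class GiftServer:
    def __init__(self) -> None:
        self.total = 0   # total reported in the interaction state information

    def handle_gift_request(self, request: dict) -> None:
        if request.get("control_id") in (FIRST_CONTROL_ID, SECOND_CONTROL_ID):
            # The gift was sent by triggering an interactive control, so it counts.
            self.total += request.get("count", 1)
        # Forwarding the gift to the first user account's terminal and broadcasting
        # the updated interaction state information are omitted here.


server = GiftServer()
# First gift sending request: a gift chosen after triggering the first control.
server.handle_gift_request({"control_id": FIRST_CONTROL_ID, "gift": "rocket", "count": 1})
# Second gift sending request: triggering the second control sends the target gift directly.
server.handle_gift_request({"control_id": SECOND_CONTROL_ID, "gift": "target_gift", "count": 1})
print(server.total)   # 2
```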
In step 704, the first terminal receives the interaction state information sent by the server, and switches the interaction preview information displayed by the first interaction control into the interaction state information.
In step 705, based on the virtual gifts sent by the second user accounts in the virtual space to the first user account by triggering the first interactive control or the second interactive control, the server sends updated interaction state information to each terminal accessing the virtual space every fourth preset duration, so that the total number in the interaction state information displayed in the interactive control is updated.
The fourth preset time period is any time period, for example, the fourth preset time period is 1 second. In the embodiment of the present disclosure, the server interacts with each terminal accessing the virtual space to ensure that the total number displayed by each terminal is updated in real time, thereby ensuring the accuracy of the displayed interaction state information.
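A minimal sketch of this periodic push, assuming a one-second fourth preset duration and hypothetical get_state()/broadcast() callbacks, is:

```python
import threading


def broadcast_interaction_state(get_state, broadcast, interval_seconds: float = 1.0) -> threading.Timer:
    """Send the current interaction state to all terminals, then reschedule itself."""
    broadcast(get_state())
    timer = threading.Timer(interval_seconds, broadcast_interaction_state,
                            args=(get_state, broadcast, interval_seconds))
    timer.daemon = True
    timer.start()
    return timer
```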
It should be noted that, in the embodiment of the present disclosure, the server interacts with the terminals accessing the virtual space every fourth preset duration so that the total number in the interaction state information displayed in the interactive control is updated; in another embodiment, step 705 need not be executed, and the terminal updates the total number in the interaction state information in real time in other ways.
In step 706, the server sends an interactive special effect display notification to each terminal accessing the virtual space when the total number of virtual gifts sent by the second user account in the virtual space to the first user account by triggering the first interactive control and the second interactive control reaches a target number.
Wherein the special effect display notification carries an interactive special effect or carries a special effect identifier.
In step 707, the first terminal receives the interactive special effect display notification sent by the server, and displays the interactive special effect corresponding to the interactive control in the virtual space.
It should be noted that, in the embodiment of the present disclosure, the first terminal interacts with the server to display the interactive controls and then display the interactive special effect. In another embodiment, the above steps 701 to 706 need not be executed; instead, the first terminal displays the interactive controls in the virtual space, switches the interactive preview information displayed by the interactive control into the interaction state information in response to any second user account in the virtual space sending a first virtual gift to the first user account by triggering the first interactive control or the second interactive control, and displays the interactive special effect corresponding to the interactive control in the virtual space when the total number of virtual gifts sent by the second user accounts in the virtual space to the first user account by triggering the first interactive control and the second interactive control reaches the target number.
In the scheme provided by the embodiment of the disclosure, the server interacts with the terminals based on the multimedia data in the virtual space and the virtual gifts sent by the second user accounts accessing the virtual space, so that the second user accounts in the virtual space jointly send virtual gifts to the first user account, realizing joint interaction of multiple users in the virtual space.
Based on the embodiment shown in fig. 7, take as an example that the virtual space is a live broadcast room, the multimedia data is live broadcast data, the first user account is an anchor account, the second user accounts are viewer accounts, and the server includes a live broadcast sub-server and a processing sub-server; the terminal interacts with the live broadcast sub-server and the processing sub-server to implement data processing. As shown in fig. 8, the live broadcast sub-server includes an Application Programming Interface (API), a Consumer (a consumption service), Redis (a storage service), Hive (a data warehouse tool), and Task (a task service), and the processing sub-server includes an MMU and Kafka (a publish-subscribe message service). The data processing scheme includes:
in step 801, the live broadcast sub-server uses Hive to define a target anchor account among the account information of anchor accounts stored in Redis, so that the live broadcast room corresponding to the target anchor account can create a charging interaction task.
The charging interaction task is a task of realizing interaction between a plurality of viewer accounts and the anchor account through the displayed interactive controls.
In step 802, the first terminal invokes the API of the live broadcast sub-server based on the account information corresponding to the viewer account, logs in to the live broadcast sub-server, searches for the target anchor account in the information stored in Redis through the API, and then accesses the live broadcast room corresponding to the target anchor account.
In step 803, the live broadcast sub-server sends the live broadcast data of the live broadcast room corresponding to the target anchor account to the processing sub-server. The processing sub-server extracts live broadcast images from the live broadcast data through the MMU and performs action recognition, detecting the action performed by the target object in the live broadcast data to determine whether the live broadcast is at a highlight moment; if it is determined that the target object performs the target action, that is, the live broadcast data is currently at a highlight moment, a Kafka event is created.
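Purely as an illustration of this hand-off, and assuming the kafka-python client, the processing sub-server's event publication might look roughly like the fragment below; the topic name, broker address and event fields are assumptions:

```python
import json
from kafka import KafkaProducer   # assumes: pip install kafka-python

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda event: json.dumps(event).encode("utf-8"),
)


def publish_highlight_event(live_room_id: str, anchor_account: str) -> None:
    """Publish a 'highlight moment' event once the MMU detects the target action."""
    event = {"live_room_id": live_room_id,
             "anchor_account": anchor_account,
             "type": "highlight"}
    producer.send("charging-interaction-events", value=event)
    producer.flush()
```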
In step 804, the live broadcast sub-server obtains the Kafka event through the Consumer and performs a frequency control check on the live broadcast room. When the number of times the interactive control has been displayed in the live broadcast room in the current period is less than a number threshold and the interval between the current time and the time the interactive control was last displayed in the live broadcast room is greater than a target duration, the live broadcast sub-server sends a control display notification to each terminal accessing the live broadcast room based on special effect signaling, so that each terminal accessing the live broadcast room merges the live broadcast video stream with the special effect and displays the merged special effect; that is, the live broadcast picture is displayed in the live broadcast room, a first interactive control showing "charging is about to start" is displayed in the live broadcast picture display area, and a second interactive control is displayed in the comment information display area of the live broadcast room. The special effect signaling is LiveCommoneEffectInfo.
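The frequency control check in step 804 amounts to two conditions; a minimal sketch, with illustrative threshold values, is:

```python
import time
from dataclasses import dataclass
from typing import Optional


@dataclass
class RoomDisplayState:
    display_count_in_period: int = 0
    last_display_time: float = 0.0   # epoch seconds; 0 means never shown


def may_display_control(state: RoomDisplayState,
                        count_threshold: int = 3,
                        min_interval_seconds: float = 300.0,
                        now: Optional[float] = None) -> bool:
    now = time.time() if now is None else now
    within_count = state.display_count_in_period < count_threshold
    interval_ok = (now - state.last_display_time) > min_interval_seconds
    return within_count and interval_ok


state = RoomDisplayState()
if may_display_control(state):
    state.display_count_in_period += 1
    state.last_display_time = time.time()
    # send the control display notification to each terminal in the live broadcast room
```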
In step 805, the live broadcast sub-server receives, through the Consumer, the gift sending requests sent by the terminals accessing the live broadcast room, and checks whether each gift sending request carries the control identifier; if it does, the virtual gift carried by the request is a charging virtual gift. The charging virtual gifts are counted, and when the counted charging virtual gifts reach the target number within the effective interaction time period, the interactive special effect is sent to the terminals accessing the live broadcast room, so that each terminal receives the interactive special effect, merges it with the live broadcast video stream and displays it. For example, the control identifier is KSUGIFtPACKAGE_GiftBoxSourceType (a type).
In step 806, the live broadcast sub-server calls the information stored in Redis through the Task service, determines the defined target anchor account, determines whether the charging interaction task corresponding to the target anchor account is completed, and sends a reminder notification to the viewer accounts accessing the live broadcast room corresponding to the target anchor account, so that the terminals participate in the charging interaction task through the displayed first interactive control or second interactive control.
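Reading the reminder as applying when the task is not yet complete (an assumption), the Task-side check could be sketched as follows, with storage access reduced to plain arguments and remind() as a hypothetical callback:

```python
from typing import Callable, Iterable


def check_charging_task(counted_gifts: int, target_number: int,
                        viewer_accounts: Iterable[str],
                        remind: Callable[[str], None]) -> bool:
    """Return True when the charging interaction task is complete; otherwise remind viewers."""
    if counted_gifts >= target_number:
        return True
    for account in viewer_accounts:
        remind(account)
    return False


check_charging_task(40, 100, ["viewer_a", "viewer_b"],
                    remind=lambda acc: print(f"remind {acc} to trigger a control"))
```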
It should be noted that, all the above optional technical solutions may be combined arbitrarily to form optional embodiments of the embodiment of the present disclosure, and are not described in detail herein.
FIG. 9 is a block diagram illustrating a data processing apparatus according to an example embodiment. Referring to fig. 9, the apparatus includes:
a display unit 901, configured to display an interactive control in a virtual space, where the interactive control is used to send a virtual gift to a first user account to which the virtual space belongs after being triggered, and the interactive control is used to prompt a second user account accessing the virtual space to send a target number of virtual gifts to the first user account;
the display unit 901 is further configured to display an interactive special effect corresponding to the interactive control in the virtual space when the total number of virtual gifts sent by the second user accounts in the virtual space by triggering the interactive control reaches the target number.
In some embodiments, the display unit 901 is configured to display the interactive control in the virtual space in response to the target object in the multimedia data of the virtual space performing the target action.
In some embodiments, the display unit 901 is configured to display the interactive control at a target position in response to the target object performing the target action, where the target position has a preset relative position relationship with the target object.
In some embodiments, the interactive control displays interactive status information, and the interactive status information includes the total number of virtual gifts sent by triggering the interactive control and the target number.
In some embodiments, the display unit 901 is configured to display an interactive control containing interactive preview information in response to the target object in the multimedia data of the virtual space performing the target action, where the interactive preview information is used to prompt that the interaction is about to start; and to switch the interactive preview information displayed in the interactive control into the interactive status information in response to any second user account in the virtual space sending a first virtual gift by triggering the interactive control.
In some embodiments, the display unit 901 is configured to display a first interactive control in the multimedia picture display area of the virtual space, where the first interactive control is used to prompt the second user accounts in the virtual space to send a target number of virtual gifts to the first user account; to display a second interactive control in the comment information display area of the virtual space, where the second interactive control is used to send a target virtual gift to the first user account after being triggered; and to display the interactive special effect in the virtual space when the total number of virtual gifts sent by the second user accounts in the virtual space by triggering the first interactive control and the second interactive control reaches the target number.
In some embodiments, the display unit 901 is configured to display the interactive special effect in the virtual space when the total number reaches the target number within the effective interaction time period corresponding to the interactive control.
In some embodiments, the display unit 901 is configured to display the interactive control in the virtual space if the multimedia data in the virtual space belongs to the target type, and the interactive special effect corresponding to the interactive control matches the target type.
In some embodiments, the display unit 901 is configured to display the interactive control in the virtual space if the number of times the interactive control has been displayed in the virtual space in the current period is less than a number threshold; or to display the interactive control in the virtual space if the interval between the current moment and the moment the interactive control was last displayed in the virtual space is greater than a target duration.
In some embodiments, as shown in fig. 10, the apparatus further comprises:
the display unit 901 is further configured to display virtual gifts to be sent in response to a triggering operation on the interactive control;
a sending unit 902, configured to send the virtual gift to the first user account in response to a triggering operation on any of the displayed virtual gifts.
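For orientation only, the apparatus of fig. 9 and fig. 10 can be pictured as two cooperating units; the method names below paraphrase the unit descriptions and are not the disclosure's API:

```python
class DisplayUnit:
    """Counterpart of display unit 901: displays controls and special effects."""

    def display_interactive_control(self, virtual_space) -> None:
        ...

    def display_interactive_special_effect(self, virtual_space) -> None:
        ...


class SendingUnit:
    """Counterpart of sending unit 902: sends the chosen virtual gift."""

    def send_virtual_gift(self, gift, first_user_account) -> None:
        ...


class DataProcessingApparatus:
    def __init__(self) -> None:
        self.display_unit = DisplayUnit()
        self.sending_unit = SendingUnit()
```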
It should be noted that the apparatus provided in the foregoing embodiment is illustrated only by the division of the above functional units; in practical applications, the above functions may be allocated to different functional units as needed, that is, the internal structure of the electronic device may be divided into different functional units to perform all or part of the functions described above. In addition, the data processing apparatus and the data processing method provided by the above embodiments belong to the same concept; their specific implementation processes are detailed in the method embodiments and are not described herein again.
An embodiment of the present disclosure provides an electronic device, including:
one or more processors;
a memory for storing the processor executable program code;
wherein the processor is configured to execute the program code to implement the data processing method as described above.
In some embodiments, the electronic device is provided as a terminal. Fig. 11 is a block diagram illustrating a terminal 1100 according to an exemplary embodiment of the disclosure. The terminal 1100 may be: a smart phone, a tablet computer, an MP3 player (Moving Picture Experts Group Audio Layer III), an MP4 player (Moving Picture Experts Group Audio Layer IV), a notebook computer, or a desktop computer. The terminal 1100 may also be referred to by other names such as user equipment, portable terminal, laptop terminal, desktop terminal, and so forth.
In general, the terminal 1100 includes: a processor 1101 and a memory 1102.
The processor 1101 may include one or more processing cores, such as a 4-core processor or an 8-core processor. The processor 1101 may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). The processor 1101 may also include a main processor and a coprocessor, where the main processor is a processor for processing data in an awake state, also called a CPU (Central Processing Unit), and the coprocessor is a low-power processor for processing data in a standby state. In some embodiments, the processor 1101 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content that the display screen needs to display. In some embodiments, the processor 1101 may further include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.
The memory 1102 may include one or more computer-readable storage media, which may be non-transitory. The memory 1102 may also include high-speed random access memory and non-volatile memory, such as one or more magnetic disk storage devices or flash memory storage devices. In some embodiments, a non-transitory computer-readable storage medium in the memory 1102 is used to store at least one program code, which is executed by the processor 1101 to implement the data processing methods provided by the method embodiments of the present disclosure.
In some embodiments, the terminal 1100 may further include: a peripheral interface 1103 and at least one peripheral. The processor 1101, the memory 1102, and the peripheral interface 1103 may be connected by buses or signal lines. Each peripheral may be connected to the peripheral interface 1103 by a bus, a signal line, or a circuit board. Specifically, the peripherals include: at least one of a radio frequency circuit 1104, a display screen 1105, a camera assembly 1106, an audio circuit 1107, a positioning assembly 1108, and a power supply 1109.
The peripheral interface 1103 may be used to connect at least one I/O (Input/Output)-related peripheral to the processor 1101 and the memory 1102. In some embodiments, the processor 1101, the memory 1102, and the peripheral interface 1103 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 1101, the memory 1102, and the peripheral interface 1103 may be implemented on a separate chip or circuit board, which is not limited by this embodiment.
The radio frequency circuit 1104 is used to receive and transmit RF (Radio Frequency) signals, also called electromagnetic signals. The radio frequency circuit 1104 communicates with communication networks and other communication devices via electromagnetic signals. The radio frequency circuit 1104 converts an electrical signal into an electromagnetic signal for transmission, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 1104 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuit 1104 may communicate with other terminals via at least one wireless communication protocol. The wireless communication protocols include, but are not limited to: metropolitan area networks, mobile communication networks of various generations (2G, 3G, 4G, and 5G), wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 1104 may further include NFC (Near Field Communication)-related circuits, which is not limited by the embodiments of the present disclosure.
The display screen 1105 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 1105 is a touch display screen, the display screen 1105 also has the ability to capture touch signals on or over its surface. The touch signal may be input to the processor 1101 as a control signal for processing. At this point, the display screen 1105 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, there may be one display screen 1105 providing the front panel of the terminal 1100; in other embodiments, there may be at least two display screens 1105, respectively disposed on different surfaces of the terminal 1100 or in a folded design; in still other embodiments, the display screen 1105 may be a flexible display disposed on a curved surface or a folded surface of the terminal 1100. The display screen 1105 may even be arranged in a non-rectangular irregular pattern, i.e., a shaped screen. The display screen 1105 may be made of materials such as an LCD (Liquid Crystal Display) or an OLED (Organic Light-Emitting Diode).
The camera assembly 1106 is used to capture images or video. Optionally, the camera assembly 1106 includes a front camera and a rear camera. Generally, the front camera is disposed on the front panel of the terminal and the rear camera is disposed on the rear surface of the terminal. In some embodiments, there are at least two rear cameras, each being any one of a main camera, a depth-of-field camera, a wide-angle camera, and a telephoto camera, so that the main camera and the depth-of-field camera are fused to realize a background blurring function, and the main camera and the wide-angle camera are fused to realize panoramic shooting and VR (Virtual Reality) shooting functions or other fused shooting functions. In some embodiments, the camera assembly 1106 may also include a flash. The flash may be a single-color-temperature flash or a dual-color-temperature flash. The dual-color-temperature flash is a combination of a warm-light flash and a cold-light flash, and can be used for light compensation at different color temperatures.
The audio circuit 1107 may include a microphone and a speaker. The microphone is used to collect sound waves of the user and the environment, convert the sound waves into electrical signals, and input them to the processor 1101 for processing or to the radio frequency circuit 1104 to achieve voice communication. For stereo capture or noise reduction purposes, multiple microphones may be provided, each at a different location of the terminal 1100. The microphone may also be an array microphone or an omnidirectional pickup microphone. The speaker is used to convert electrical signals from the processor 1101 or the radio frequency circuit 1104 into sound waves. The speaker may be a traditional film speaker or a piezoelectric ceramic speaker. When the speaker is a piezoelectric ceramic speaker, it can convert an electrical signal into a sound wave audible to humans, or into a sound wave inaudible to humans for purposes such as distance measurement. In some embodiments, the audio circuit 1107 may also include a headphone jack.
The positioning component 1108 is used to locate the current geographic position of the terminal 1100 for navigation or LBS (Location Based Service). The positioning component 1108 may be a positioning component based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, the GLONASS system of Russia, or the Galileo system of the European Union.
The power supply 1109 is configured to supply power to the various components in the terminal 1100. The power supply 1109 may be alternating current, direct current, a disposable battery, or a rechargeable battery. When the power supply 1109 includes a rechargeable battery, the rechargeable battery may support wired or wireless charging. The rechargeable battery may also be used to support fast charging technology.
In some embodiments, the terminal 1100 may also include one or more sensors 1110. The one or more sensors 1110 include, but are not limited to: an acceleration sensor 1111, a gyroscope sensor 1112, a pressure sensor 1113, a fingerprint sensor 1114, an optical sensor 1115, and a proximity sensor 1116.
The acceleration sensor 1111 may detect the magnitude of acceleration along the three coordinate axes of a coordinate system established with the terminal 1100. For example, the acceleration sensor 1111 may be configured to detect the components of the gravitational acceleration along the three coordinate axes. The processor 1101 may control the display screen 1105 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 1111. The acceleration sensor 1111 may also be used to collect motion data of a game or a user.
The gyroscope sensor 1112 may detect the body orientation and the rotation angle of the terminal 1100, and may cooperate with the acceleration sensor 1111 to capture the user's 3D motion with respect to the terminal 1100. Based on the data collected by the gyroscope sensor 1112, the processor 1101 may implement the following functions: motion sensing (such as changing the UI according to the user's tilting operation), image stabilization during shooting, game control, and inertial navigation.
The pressure sensor 1113 may be disposed on a side frame of the terminal 1100 and/or on a lower layer of the display screen 1105. When the pressure sensor 1113 is disposed on the side frame of the terminal 1100, the user's holding signal on the terminal 1100 can be detected, and the processor 1101 performs left-hand/right-hand recognition or shortcut operations according to the holding signal collected by the pressure sensor 1113. When the pressure sensor 1113 is disposed on the lower layer of the display screen 1105, the processor 1101 controls the operability controls on the UI according to the user's pressure operation on the display screen 1105. The operability controls include at least one of a button control, a scroll bar control, an icon control, and a menu control.
The fingerprint sensor 1114 is used to collect the user's fingerprint, and the processor 1101 identifies the user according to the fingerprint collected by the fingerprint sensor 1114, or the fingerprint sensor 1114 identifies the user according to the collected fingerprint. Upon identifying that the user's identity is a trusted identity, the processor 1101 authorizes the user to perform relevant sensitive operations, including unlocking the screen, viewing encrypted information, downloading software, making payments, and changing settings. The fingerprint sensor 1114 may be disposed on the front, back, or side of the terminal 1100. When a physical button or a manufacturer's Logo is provided on the terminal 1100, the fingerprint sensor 1114 may be integrated with the physical button or the manufacturer's Logo.
The optical sensor 1115 is used to collect the ambient light intensity. In one embodiment, the processor 1101 may control the display brightness of the display screen 1105 based on the ambient light intensity collected by the optical sensor 1115. Specifically, when the ambient light intensity is high, the display brightness of the display screen 1105 is increased; when the ambient light intensity is low, the display brightness of the display screen 1105 is reduced. In another embodiment, the processor 1101 may also dynamically adjust the shooting parameters of the camera assembly 1106 according to the ambient light intensity collected by the optical sensor 1115.
The proximity sensor 1116, also referred to as a distance sensor, is typically disposed on the front panel of the terminal 1100. The proximity sensor 1116 is used to capture the distance between the user and the front face of the terminal 1100. In one embodiment, when the proximity sensor 1116 detects that the distance between the user and the front face of the terminal 1100 gradually decreases, the processor 1101 controls the display screen 1105 to switch from the screen-on state to the screen-off state; when the proximity sensor 1116 detects that the distance between the user and the front face of the terminal 1100 gradually increases, the processor 1101 controls the display screen 1105 to switch from the screen-off state to the screen-on state.
Those skilled in the art will appreciate that the configuration shown in fig. 11 is not limiting of terminal 1100, and may include more or fewer components than shown, or some components may be combined, or a different arrangement of components may be used.
In an exemplary embodiment, there is also provided a computer-readable storage medium. When instructions in the storage medium are executed by the processor of the electronic device, the electronic device is enabled to perform the above data processing method. For example, the memory of the electronic device may include instructions executable by the processor of the electronic device to perform the data processing method. Alternatively, the computer-readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
A computer program product comprising a computer program/instructions which, when executed by a processor, implement the data processing method described above.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This disclosure is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.