CN109121194B - Method and apparatus for state transition of electronic device - Google Patents
Tue Dec 10 2019
Detailed Description
The present application will be described in further detail with reference to the following drawings and embodiments. It is to be understood that the specific embodiments described herein are merely illustrative of the relevant invention and not restrictive of the invention. It should be noted that, for convenience of description, only the portions related to the relevant invention are shown in the drawings.
It should be noted that the embodiments and the features of the embodiments in the present application may be combined with each other without conflict. The present application will be described in detail below with reference to the embodiments and the attached drawings.
Fig. 1 illustrates an exemplary system architecture 100 to which the method for state transition of an electronic device or the apparatus for state transition of an electronic device of the embodiments of the present application may be applied.
As shown in fig. 1, the system architecture 100 may include a controller 101, a network 102, and an electronic device 103. The network 102 serves as a medium for communication links between the controller 101 and the electronic device 103, and may include various connection types, such as wired or wireless communication links.
The electronic device 103 may interact with the controller 101 through the network 102 to receive or send messages. The electronic device 103 may be equipped with a camera (not shown in the figure), which can capture images.
It should be noted that the method for state transition of an electronic device provided in the embodiments of the present application is generally executed by the controller 101; accordingly, the controller 101 may or may not be disposed on the electronic device 103.
The controller may be hardware or software. When the controller is hardware, it may be implemented as a distributed device cluster formed by multiple devices, or may be implemented as a single device. When the controller is software, it may be implemented as a plurality of software or software modules (for example, to provide distributed services), or may be implemented as a single software or software module, and is not particularly limited herein.
It should be understood that the numbers of controllers, electronic devices, and networks in fig. 1 are merely illustrative. There may be any number of controllers, electronic devices, and networks, as desired for an implementation.
With continued reference to FIG. 2, a flow 200 of one embodiment of a method for state transition of an electronic device in accordance with the present application is shown. The method for state transition of an electronic device includes the following steps:
Step 201, acquiring an image sequence of a target user from a camera of an electronic device.
In the present embodiment, an execution body of the method for state transition of an electronic device (such as the controller 101 shown in fig. 1) can acquire an image sequence of a target user from a camera of the electronic device. The target user may be a user currently present in the area photographed by the camera. The camera may capture images of the target user and store them in a memory, which is a storage device for holding information. The execution body may acquire the image sequence of the target user in various ways. As an example, the execution body may directly acquire the image sequence of the target user from the memory. For example, the camera captures 24 images of the target user per second and stores them in the memory, and the execution body acquires the 24 images from the memory. As another example, the execution body may sequentially acquire the images in the image sequence within a preset time period. Optionally, after the execution body acquires the image sequence within a preset time period, it may continue to acquire the image sequence within the next preset time period.
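As a minimal sketch (not part of the patent), the acquisition in step 201 could look like the following Python, assuming OpenCV is available on the controller; the device index and the 24-frame count are illustrative, the latter taken from the example above.

```python
import cv2  # OpenCV; assumed to be available on the controller

def acquire_image_sequence(device_index=0, num_frames=24):
    """Read a fixed-length image sequence of the target user from the camera.

    num_frames=24 mirrors the "24 images per second" example above; the
    device index and the frame count are illustrative assumptions.
    """
    capture = cv2.VideoCapture(device_index)
    frames = []
    try:
        while len(frames) < num_frames:
            ok, frame = capture.read()
            if not ok:
                break  # camera unavailable or stream ended
            frames.append(frame)
    finally:
        capture.release()
    return frames
```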
Step 202, extracting, for each image in the image sequence, the face feature point information of the image.
In the present embodiment, the execution body may extract, for each image in the above-described image sequence, the face feature point information of the image. The facial features include, but are not limited to, eyebrows, eyes, corners of the mouth, and the like, and the face feature point information may be position information of the face feature points, for example, coordinate information of the eyebrows, eyes, and corners of the mouth in the image. Generally, the execution body may select the images in the image sequence in time order and extract the face feature point information of each image.
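The patent does not name a particular feature point extractor. As one possibility, a dlib 68-point landmark model could supply the coordinate information of eyebrows, eyes and mouth corners mentioned above; the choice of library and the model file path are assumptions of this sketch.

```python
import cv2
import dlib  # one possible landmark extractor; not mandated by the method

# The pre-trained 68-point model file is an assumption of this sketch.
_detector = dlib.get_frontal_face_detector()
_predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

def extract_face_feature_points(image):
    """Return a list of (x, y) coordinates of face feature points in the image,
    or None if no face is detected."""
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    faces = _detector(gray)
    if not faces:
        return None
    landmarks = _predictor(gray, faces[0])  # use the first detected face
    return [(landmarks.part(i).x, landmarks.part(i).y)
            for i in range(landmarks.num_parts)]
```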
Step 203, determining the size of the face region in the image according to the extracted face feature point information.
In this embodiment, the execution body may determine the size of the face region in the image according to the extracted face feature point information. The size of the face region may be expressed by the number of pixels in the horizontal direction and the number of pixels in the vertical direction, or by the total number of pixels. As an example, the number of pixels spanned by the two face feature points farthest apart in the horizontal direction is taken as the number of pixels in the horizontal direction, and the number of pixels spanned by the two face feature points farthest apart in the vertical direction is taken as the number of pixels in the vertical direction. Alternatively, the number of pixels covered by the region enclosed by the face feature points is taken as the total number of pixels.
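A sketch of the horizontal/vertical pixel-count interpretation described above, taking the spans between the farthest-apart feature points; the total-pixel-count variant would instead count the pixels inside the region enclosed by the feature points.

```python
def face_region_size(feature_points):
    """Size of the face region as (horizontal pixels, vertical pixels): the
    spans between the two feature points farthest apart in each direction."""
    xs = [x for x, _ in feature_points]
    ys = [y for _, y in feature_points]
    width = max(xs) - min(xs)   # leftmost to rightmost feature point
    height = max(ys) - min(ys)  # topmost to bottommost feature point
    return width, height
```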
Step 204, in response to determining that the size of the face region in the image is greater than or equal to a preset size, determining the offset of the face corresponding to the image and storing the offset into an offset set.
In this embodiment, after determining that the size of the face region in the image is greater than or equal to the preset size, the execution body may determine an offset of the face corresponding to the image and store the offset into the offset set. The preset size may be expressed by the number of pixels in the horizontal direction and the number of pixels in the vertical direction, or by the total number of pixels (the number of pixels in the horizontal direction × the number of pixels in the vertical direction). Generally, the size of the face region being greater than or equal to the preset size may mean that the number of pixels in the horizontal direction of the face region (for example, 100) is greater than a preset number of pixels in the horizontal direction (for example, 80), and the number of pixels in the vertical direction of the face region (for example, 100) is greater than a preset number of pixels in the vertical direction (for example, 80). The offset is used to represent the horizontal offset of the face relative to the camera, and may be expressed as an angle, a distance, or the like. The execution body may determine the offset of the face corresponding to the image by means of face recognition techniques. It should be noted that determining the offset of a face in an image using face recognition techniques is a well-known technology that is widely studied and applied at present, and is not described herein again.
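The patent leaves the offset computation to existing face recognition techniques. As a stand-in only, the sketch below checks the preset size and approximates the horizontal offset in degrees from the displacement of the face centre across the image, assuming a pinhole camera with a known horizontal field of view; the 80 × 80 preset and the 60-degree field of view are illustrative assumptions.

```python
import math

def face_meets_preset_size(size, preset_size=(80, 80)):
    """True when both pixel counts of the face region reach the preset size
    (the 80 x 80 default mirrors the example values above and is an assumption)."""
    width, height = size
    preset_w, preset_h = preset_size
    return width >= preset_w and height >= preset_h

def horizontal_offset_degrees(feature_points, image_width, horizontal_fov_deg=60.0):
    """Approximate the horizontal offset of the face relative to the camera axis,
    in degrees, from the face centre's displacement from the image centre,
    assuming a pinhole camera with the given horizontal field of view."""
    xs = [x for x, _ in feature_points]
    face_center_x = sum(xs) / len(xs)
    # Fraction of the half-image width separating the face centre from the axis.
    normalized = (face_center_x - image_width / 2.0) / (image_width / 2.0)
    half_fov = math.radians(horizontal_fov_deg / 2.0)
    return abs(math.degrees(math.atan(normalized * math.tan(half_fov))))
```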
Step 205, determining the distance between the target user and the camera and storing the distance into a distance set.
In this embodiment, the execution body may determine the distance between the target user and the camera in various ways. For example, the execution body may determine the distance between the target user and the camera through a distance sensor. It should be noted that distance measurement by a distance sensor is a well-known technology that is widely researched and applied at present, and is not described herein again.
In some optional implementations of this embodiment, the execution body may determine the distance between the target user and a binocular camera according to the basic principle of binocular ranging. Referring to FIG. 3, P is a point on the target user, O_R and O_T are the two optical centers of the binocular camera, and the imaging points of P on the two photoreceptors are P and P'. X_R denotes the distance from the left end point of the photoreceptor on which imaging point P lies to P, X_T denotes the distance from the left end point of the photoreceptor on which imaging point P' lies to P', (X_R - X_T) denotes the parallax (disparity), f is the focal length of the binocular camera, B is the center distance between the two cameras, and Z is the distance to be determined between the target user and the binocular camera. Let dis be the distance from imaging point P to imaging point P', then:
dis = B - (X_R - X_T)
According to the similar triangle principle:
(B - (X_R - X_T)) / B = (Z - f) / Z
The following can be obtained:
Z = f · B / (X_R - X_T)
Therefore, the distance Z between the target user and the binocular camera can be calculated as long as the value of the disparity (X_R - X_T) is obtained.
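The formula above translates directly into code; the usual stereo convention (an assumption here) is to express the focal length f in pixels so that it cancels with the pixel disparity (X_R - X_T), leaving Z in the units of the baseline B.

```python
def binocular_distance(disparity_pixels, focal_length_pixels, baseline_m):
    """Distance between the target user and the binocular camera:
    Z = f * B / (X_R - X_T).

    disparity_pixels    : X_R - X_T, disparity of point P (pixels)
    focal_length_pixels : focal length f expressed in pixels
    baseline_m          : center distance B of the two cameras (metres)
    """
    if disparity_pixels <= 0:
        raise ValueError("disparity must be positive for a point in front of the cameras")
    return focal_length_pixels * baseline_m / disparity_pixels
```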
In some optional implementations of this embodiment, the execution body may also determine the distance between the target user and the camera through a human body proximity sensor or the like. The method for determining the distance between the target user and the camera is not limited in the present application.
Step 206, performing conversion processing on the state of the electronic device based on the offset set and the distance set.
In this embodiment, based on the offset set and the distance set, the execution body may send a state transition instruction to the electronic device to perform state transition processing. The state may be a standby state or a working state (e.g., a state of broadcasting a welcome message). The transition processing may be a process of changing the standby state to the working state, or a process of maintaining the standby state.
Optionally, in response to determining that the offset in the offset set is smaller than a preset offset and that there is a distance in the distance set smaller than a preset distance, the execution body may convert the electronic device from the standby state to the working state through an instruction. It should be noted that the distances in the distance set that are smaller than the preset distance may be arranged from large to small (since the images are processed in the order described in step 201), which reflects a state in which the target user is dynamically approaching the electronic device.
Optionally, in response to determining that the offset in the offset set is smaller than the preset offset and that there is no distance in the distance set smaller than the preset distance, the execution body may keep the electronic device in the standby state.
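Putting the two optional branches together, the conversion of step 206 reduces to a small decision rule. The sketch below reads "the offset in the offset set is smaller than a preset offset" as requiring every stored offset to be below the threshold; that reading, and the threshold values, are assumptions.

```python
def decide_state(offsets, distances, preset_offset_deg=15.0, preset_distance_m=1.5):
    """Return the target state of the electronic device from the offset set and
    the distance set collected over one image sequence.

    The threshold values are illustrative; the patent leaves them open.
    """
    if not offsets or not distances:
        return "standby"  # no usable face observations in this sequence
    facing_camera = all(offset < preset_offset_deg for offset in offsets)
    close_enough = any(distance < preset_distance_m for distance in distances)
    if facing_camera and close_enough:
        return "working"  # e.g. broadcast a welcome message
    return "standby"
```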
With continued reference to fig. 4, fig. 4 is a schematic diagram 400 of an application scenario of the method for state transition of an electronic device according to the present embodiment. In the application scenario of fig. 4, a controller (not shown in the figure) acquires an image sequence 403 (e.g., 24 images) of the target user 401 from the camera 402. Generally, the camera 402 can capture the target user 401 and store the captured image sequence 403 in a memory (not shown in the figure). For each image in the image sequence 403, the face feature point information 404 is extracted, and the size of the face region (which may be represented by a number of pixels 405) is determined from the face feature point information 404. In response to determining that the size of the face region in the image is greater than or equal to a preset size (i.e., in response to determining that the number of pixels 405 of the face region in the image is greater than a preset number of pixels 406), the controller determines an offset of the face corresponding to the image and stores it into an offset set 407, and determines the distance between the target user 401 and the camera 402 and stores it into a distance set 408. After determining that the offset in the offset set 407 is smaller than the preset offset and that there is a distance in the distance set 408 smaller than the preset distance, the electronic device 409 is switched from the standby state to the working state.
In the method provided by the above embodiment of the present application, first, an image sequence of a target user is obtained from a camera of an electronic device; then, for each image in the image sequence, the following determination steps are performed: extracting the face feature point information of the image; determining the size of the face region in the image according to the extracted face feature point information; in response to determining that the size of the face region in the image is greater than or equal to the preset size, determining the offset of the face corresponding to the image and storing the offset into an offset set; and determining the distance between the target user and the camera and storing the distance into a distance set. Finally, the state of the electronic device is converted based on the offset set and the distance set. The camera tracks the face of the target user in real time and the face image sequence of the target user is processed, thereby improving the flexibility of converting the state of the electronic device.
With further reference to FIG. 5, a flow 500 of yet another embodiment of a method for state transition of an electronic device is shown. The process 500 of the method for state transition of an electronic device includes the following steps:
Step 501, acquiring an image sequence of a target user from a camera of an electronic device.
Step 502, extracting the face feature point information of each image in the image sequence.
Step 503, determining the size of the face region in the image according to the extracted face feature point information.
In this embodiment, steps 501-503 are substantially the same as steps 201-203 in the embodiment corresponding to fig. 2, and are not described herein again.
Step 504, acquiring an image of a user at the imaging distance captured by the camera as a target image.
In this embodiment, the execution body may acquire an image of a user at the imaging distance captured by the above-described camera as the target image. Generally, the imaging distance refers to the distance at which the camera can form a clear image after focusing, and is one of the parameters of the camera. The target image captured by the camera may be stored in a memory, and the execution body may acquire the target image from the memory.
Step 505, determining a preset size based on the target image.
In this embodiment, the execution body may determine the preset size based on the target image. The execution body may extract the face feature point information of the target image and determine the size of the face region in the target image as the preset size. It should be noted that the determination of the size of the face region according to the face feature point information has already been explained in step 203 of the embodiment corresponding to fig. 2, and is not described herein again.
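A sketch of this calibration step, reusing the hypothetical helpers from the earlier sketches: the preset size is simply the face region size measured in the target image taken at the imaging distance.

```python
def calibrate_preset_size(target_image):
    """Derive the preset size from an image of a user captured at the camera's
    imaging distance, reusing the helpers sketched for steps 202 and 203."""
    feature_points = extract_face_feature_points(target_image)
    if feature_points is None:
        raise ValueError("no face found in the target image")
    return face_region_size(feature_points)  # (horizontal pixels, vertical pixels)
```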
Step 506, in response to determining that the size of the face region in the image is greater than or equal to the preset size, determining an offset of the face corresponding to the image and storing the offset into an offset set.
Step 507, determining the distance between the target user and the camera and storing the distance into a distance set.
Step 508, performing conversion processing on the state of the electronic device based on the offset set and the distance set.
In the present embodiment, steps 506-508 are substantially the same as steps 204-206 in the embodiment corresponding to fig. 2, and are not described herein again.
As can be seen from fig. 5, compared with the embodiment corresponding to fig. 2, the flow 500 of the method for state transition of an electronic device in the present embodiment adds a step of acquiring an image of a user at the imaging distance captured by the camera as a target image, and a step of determining the preset size based on the target image. Therefore, the scheme described in this embodiment refines the process of determining the preset size, and can perform the conversion processing on the state of the electronic device more flexibly.
With further reference to fig. 6, as an implementation of the method shown in the above figures, the present application provides an embodiment of an apparatus for state transition of an electronic device, which corresponds to the embodiment of the method shown in fig. 2.
As shown in fig. 6, the apparatus 600 for state transition of an electronic device according to the present embodiment includes: an acquisition unit 601 configured to acquire an image sequence of a target user from a camera of the electronic device; a determining unit 602 configured to perform, for each image in the image sequence, the following determination steps: extracting the face feature point information of the image; determining the size of the face region in the image according to the extracted face feature point information; in response to determining that the size of the face region in the image is greater than or equal to a preset size, determining the offset of the face corresponding to the image and storing the offset into an offset set, wherein the offset is used to represent the horizontal offset of the face relative to the camera; and determining the distance between the target user and the camera and storing the distance into a distance set; and a conversion unit 603 configured to perform conversion processing on the state of the electronic device based on the offset set and the distance set.
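As a rough illustration of how the three units of FIG. 6 could be wired together; the device's measure_distance and set_state methods, and the helper functions, are assumptions carried over from the earlier sketches, not part of the patent.

```python
class StateTransitionApparatus:
    """Rough mapping of the acquisition, determination and conversion units."""

    def __init__(self, preset_size, preset_offset_deg, preset_distance_m):
        self.preset_size = preset_size
        self.preset_offset_deg = preset_offset_deg
        self.preset_distance_m = preset_distance_m

    def run_once(self, device):
        # Acquisition unit: obtain an image sequence of the target user.
        images = acquire_image_sequence()

        # Determination unit: build the offset set and the distance set.
        offsets, distances = [], []
        for image in images:
            points = extract_face_feature_points(image)
            if points is None:
                continue
            if face_meets_preset_size(face_region_size(points), self.preset_size):
                offsets.append(horizontal_offset_degrees(points, image.shape[1]))
                distances.append(device.measure_distance())  # hypothetical sensor call

        # Conversion unit: switch the state of the electronic device.
        device.set_state(decide_state(offsets, distances,
                                      self.preset_offset_deg,
                                      self.preset_distance_m))
```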
In some optional implementations of this embodiment, the determining unit further includes: an acquisition module configured to acquire an image of a user at the imaging distance captured by the camera as a target image; and a determining module configured to determine the preset size based on the target image.
In some optional implementations of this embodiment, the determining module is further configured to: extracting face characteristic point information of the target image; and determining the size of the face region in the target image as the preset size according to the extracted face feature point information of the target image.
In some optional implementations of this embodiment, the converting unit 603 may be further configured to: in response to determining that the offset in the offset set is smaller than the preset offset and that there is a distance in the distance set smaller than the preset distance, convert the electronic device from the standby state to the working state.
In some optional implementations of this embodiment, the camera may be a binocular camera.
In the apparatus provided in the foregoing embodiment of the present application, the acquisition unit 601 acquires the image sequence of the target user from the camera of the electronic device. After that, the determining unit 602 performs the following determination steps for each image in the above-described image sequence: extracting the face feature point information of the image; determining the size of the face region in the image according to the extracted face feature point information; in response to determining that the size of the face region in the image is greater than or equal to a preset size, determining the offset of the face corresponding to the image and storing the offset into an offset set, wherein the offset is used to represent the horizontal offset of the face relative to the camera; and determining the distance between the target user and the camera and storing the distance into a distance set. Finally, based on the offset set and the distance set, the conversion unit 603 performs conversion processing on the state of the electronic device. This improves the sensitivity of state transitions of the electronic device.
Referring now to FIG. 7, a block diagram of a computer system 700 suitable for implementing the electronic device of an embodiment of the present application is shown. The electronic device shown in fig. 7 is only an example, and should not impose any limitation on the functions and the scope of use of the embodiments of the present application.
As shown in fig. 7, the computer system 700 includes a Central Processing Unit (CPU)701, which can perform various appropriate actions and processes in accordance with a program stored in a Read Only Memory (ROM)702 or a program loaded from a storage section 708 into a Random Access Memory (RAM) 703. In the RAM 703, various programs and data necessary for the operation of the system 700 are also stored. The CPU 701, the ROM 702, and the RAM 703 are connected to each other via a bus 704. An input/output (I/O) interface 705 is also connected to bus 704.
The following components are connected to the I/O interface 705: an input portion 706 including a camera, a sensor, and the like; an output section 707 including a Liquid Crystal Display (LCD), a speaker, and the like; a storage section 708 including a hard disk and the like; and a communication section 709 including a network interface card such as a LAN card, a modem, or the like. The communication section 709 performs communication processing via a network such as the internet. A drive 710 is also connected to the I/O interface 705 as needed. A removable medium 711 such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like is mounted on the drive 710 as necessary, so that a computer program read out therefrom is mounted into the storage section 708 as necessary.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program can be downloaded and installed from a network through the communication section 709, and/or installed from the removable medium 711. The computer program, when executed by a Central Processing Unit (CPU)701, performs the above-described functions defined in the method of the present application. It should be noted that the computer readable medium mentioned above in the present application may be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present application, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In this application, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present application may be implemented by software or hardware. The described units may also be provided in a processor, and may be described as: a processor includes an acquisition unit, a determination unit, and a conversion unit. The names of these units do not in some cases constitute a limitation on the units themselves, and for example, the acquiring unit may also be described as a "unit that acquires an image sequence of a target user from a camera of the above-described electronic apparatus".
As another aspect, the present application also provides a computer-readable medium, which may be contained in the electronic device described in the above embodiments, or may exist separately without being assembled into the electronic device. The computer-readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: acquire an image sequence of a target user from a camera of the electronic device; for each image in the image sequence, perform the following determination steps: extracting the face feature point information of the image; determining the size of the face region in the image according to the extracted face feature point information; in response to determining that the size of the face region in the image is greater than or equal to the preset size, determining the offset of the face corresponding to the image and storing the offset into an offset set, wherein the offset is used to represent the horizontal offset of the face relative to the camera; and determining the distance between the target user and the camera and storing the distance into a distance set; and perform conversion processing on the state of the electronic device based on the offset set and the distance set.
The above description is only a preferred embodiment of the present application and is illustrative of the principles of the technology employed. It will be appreciated by those skilled in the art that the scope of the invention disclosed herein is not limited to the particular combination of features described above, but also encompasses other solutions formed by any combination of the above features or their equivalents without departing from the spirit of the invention. For example, the above features may be replaced with (but not limited to) features having similar functions disclosed in the present application.