
CN115514883B - Cross-equipment collaborative shooting method, related device and system - Google Patents

Fri May 12 2023

Detailed Description

The terminology used in the following embodiments of the application is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in the specification and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used in this application refers to and encompasses any and all possible combinations of one or more of the listed items.

The term "User Interface (UI)" in the description and claims of the present application and in the drawings is a media interface for interaction and information exchange between an application program or an operating system and a user, which enables conversion between an internal form of information and a form acceptable to the user. The user interface of the application program is source code written in a specific computer language such as java, extensible markup language (extensible markup language, XML) and the like, the interface source code is analyzed and rendered on the terminal equipment, and finally the interface source code is presented as content which can be identified by a user, such as a picture, characters, buttons and the like. Controls (controls), also known as parts (widgets), are basic elements of a user interface, typical controls being toolbars (toolbars), menu bars (menu bars), text boxes (text boxes), buttons (buttons), scroll bars (scrollbars), pictures and text. The properties and content of the controls in the interface are defined by labels or nodes, such as XML specifies the controls contained in the interface by nodes of < Textview >, < ImgView >, < VideoView >, etc. One node corresponds to a control or attribute in the interface, and the node is rendered into visual content for a user after being analyzed and rendered. In addition, many applications, such as the interface of a hybrid application (hybrid application), typically include web pages. A web page, also referred to as a page, is understood to be a special control embedded in an application program interface, and is source code written in a specific computer language, such as hypertext markup language (hyper text markup language, GTML), cascading style sheets (cascading style sheets, CSS), java script (JavaScript, JS), etc., and the web page source code may be loaded and displayed as user-recognizable content by a browser or web page display component similar to the browser function. The specific content contained in a web page is also defined by tags or nodes in the web page source code, such as GTML defines elements and attributes of the web page by < p >, < img >, < video >, < canvas >.

A commonly used presentation form of the user interface is a graphical user interface (graphic user interface, GUI), which refers to a user interface related to computer operations that is displayed in a graphical manner. It may be an interface element such as an icon, a window, a control, etc. displayed in a display screen of the electronic device, where the control may include a visual interface element such as an icon, a button, a menu, a tab, a text box, a dialog box, a status bar, a navigation bar, a Widget, etc.

Fig. 1A illustrates a process of cross-device collaborative shooting.

As shown in fig. 1A, cross-device collaborative shooting involves a master device and a slave device. First, the master device may discover the slave device, and the slave device may then register at the master device. After registration, the master device may send a command to the slave device over the wireless network, such as a command to turn on the camera; in response to the command, the slave device may start its camera to capture a picture, compress the captured picture, and send it to the master device. Finally, the master device can display the picture collected by the slave device and the picture collected by its own camera on the preview interface, thereby realizing cross-device collaborative shooting.
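The flow above can be summarized in a short sketch. This is illustrative only, written against hypothetical MasterDevice/SlaveDevice interfaces (the patent does not define these names); the actual discovery, registration, and transport mechanisms are described later.

```java
// Illustrative sketch of the fig. 1A flow; MasterDevice/SlaveDevice are hypothetical.
import java.util.List;

interface SlaveDevice {
    void register(MasterDevice master);        // slave registers at the master
    byte[] captureAndCompress();               // start camera, capture a picture, compress it
}

interface MasterDevice {
    List<SlaveDevice> discoverSlaves();        // device discovery
    void sendOpenCameraCommand(SlaveDevice s); // command sent over the wireless network
    void showPreview(byte[] localPicture, byte[] remotePicture);
}

final class CollaborativeShootingFlow {
    static void run(MasterDevice master, byte[] localPicture) {
        for (SlaveDevice slave : master.discoverSlaves()) {
            slave.register(master);
            master.sendOpenCameraCommand(slave);
            byte[] remotePicture = slave.captureAndCompress();
            // the master displays its own picture and the slave's picture on the preview interface
            master.showPreview(localPicture, remotePicture);
        }
    }
}
```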

However, in the manner shown in fig. 1A, the master device cannot acquire the capability information of the slave device, and therefore cannot control or adjust the shooting picture of the slave device; for example, it cannot issue to the slave device a control command for adjusting the shooting effect, such as a zoom or flash command.

Fig. 1B shows a cross-device collaborative shooting scenario involved in an instant messaging process.

As shown in fig. 1B, devices a and B may perform video call through an instant messaging server, for example

Figure BDA0003797099790000061

And providing video call. Specifically, the device a may collect an image through a camera, and then send the image to the device B through an instant messaging server. The device B can directly display the image sent by the device A on the video call interface, and also can display the self adoption of the device BImages of the collection, thereby enabling collaborative photographing across devices.

The approach shown in fig. 1B is similar to that in fig. 1A: device A cannot acquire the capability information of device B, and therefore cannot control or adjust the shooting picture of device B; for example, it cannot issue to device B a control command for adjusting the shooting effect, such as a zoom or flash command. In addition, because device A and device B communicate through the network, a large time delay can arise in the cross-device collaborative shooting process, which affects the user experience.

In order to solve the problem that, in cross-device collaborative shooting, the shooting effects of other devices cannot be controlled, the following embodiments of the present application provide a cross-device collaborative shooting method, a related device, and a system. The cross-device collaborative shooting method involves a master device and a slave device. In the method, during collaborative shooting by the master device and the slave device, the master device can receive a user's control operation directed at the slave device; the slave device may adjust its shooting effect in response to the control operation and then transmit the image acquired after the adjustment to the master device. Thereafter, the master device may display the image returned by the slave device. In some cases, the master device may also simultaneously display the images acquired and processed by itself. In addition, the master device can also, in response to user operations, take photos of, record video of, or forward the displayed images, and so on.

By implementing the cross-device collaborative shooting method, during collaborative shooting the master device can not only provide the user with a multi-view shooting experience, but also control the shooting effect of the slave device, thereby meeting the user's need to control a remote shooting effect. In addition, the master device can also preview, photograph, record, forward, and clip the respective pictures of the slave device and the master device.

By implementing the cross-device collaborative shooting method, the control requirements of various cross-device collaborative shooting scenarios can be met, such as a mobile phone controlling the shooting effect of a television, a watch controlling the shooting effect of a mobile phone, a mobile phone controlling the shooting effect of a tablet computer, and so on.

In the following embodiments of the present application, the number of master devices is one. The number of slave devices is not limited and may be one or more. Thus, the image that the master device ultimately presents may include images from multiple devices; for example, the preview screen finally displayed by the master device includes an image of the master device and images returned by a plurality of slave devices.

The collaborative shooting referred to in the following embodiments of the present application means that a master device and a slave device establish a communication connection, the master device and the slave device both use a camera to capture and process images, the slave device transmits its captured image to the master device based on the communication connection, and the master device displays the image captured by the slave device. In some cases, the master device may also simultaneously display the images acquired and processed by itself during the collaborative shooting process.

The communication connection between the master device and the slave device may be a wired connection or a wireless connection. The wireless connection may be a close-range connection such as a wireless fidelity (Wi-Fi) connection, a Bluetooth connection, an infrared connection, an NFC connection, or a ZigBee connection, or a long-range connection (a long-range connection includes, but is not limited to, a mobile network supporting 2G, 3G, 4G, 5G, and subsequent standard protocols). For example, the master device and the slave device may log into the same user account (e.g., a Huawei account) and then make a remote connection through a server (e.g., a Huawei-provided multi-device collaborative shooting server).

The adjustment of the shooting effect in the following embodiments of the present application refers to adjustment of the shooting parameters of an electronic device. The shooting parameters of the electronic device include hardware parameters involved when the camera captures an image and/or software parameters involved when the image is processed. The shooting parameters also include combinations of hardware parameters and software parameters, such as a hybrid zoom range, night mode, portrait mode, time-lapse shooting, slow motion, panorama mode, HDR, and so on.

The hardware parameters include one or more of the following: the number of cameras, the type of camera, the optical zoom value, whether optical image stabilization is turned on, the aperture size, whether the flash is turned on, whether the fill light is turned on, the shutter time, the ISO sensitivity value, the pixel count, the video frame rate, and so on. The types of cameras may include, but are not limited to, a common camera, a wide-angle camera, and an ultra-wide-angle camera; the optical zoom value may be 1x zoom, 2x zoom, or 5x zoom; the aperture size may be f/1.8, f/1.9, f/3.4, and so on; the shutter time may be 1/40, 1/60, 1/200, and so on.

The software parameters include one or more of the following: the digital zoom value, the image cropping size, the color temperature calibration mode of the image, whether image noise reduction is enabled, the beautification/body-shaping type, the filter type, the sticker option, whether self-timer mirroring is enabled, and so on. The digital zoom value may be 10x zoom or 15x zoom; the image cropping size may be 3:3, 3:4, or 9:16; the color temperature calibration mode may be daylight, fluorescent, incandescent, shadow, or cloudy calibration; the beautification/body-shaping type may be face thinning, slimming, skin smoothing, whitening, eye enlarging, blemish removal, and so on; the filter type may be, for example, Japanese style, texture, brightness, soft light, cyberpunk, and so on; the stickers may be stickers of expressions, animals, scenery, and picture inserts.

When the electronic device responds to a particular shooting mode or enables a particular algorithm, it adjusts the shooting parameters accordingly. For example, when the camera uses the beauty mode function, the electronic device may operate by reducing the focal length among the shooting parameters, increasing the aperture, turning on the fill light, and simultaneously using the default beauty algorithm, and so on.
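As a concrete illustration of adjusting such parameters, the following sketch shows how an Android application using the Camera2 API could set a few of the software and hardware parameters mentioned above (zoom, fill light, color temperature calibration, noise reduction) on a capture request. The patent does not prescribe these exact keys; the camera and session setup are assumed to exist elsewhere.

```java
// Illustrative only: adjusting shooting parameters with the Android Camera2 API.
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CameraMetadata;
import android.hardware.camera2.CaptureRequest;
import android.view.Surface;

final class ShootingEffectAdjuster {
    static CaptureRequest buildAdjustedRequest(CameraDevice camera, Surface previewSurface)
            throws CameraAccessException {
        CaptureRequest.Builder builder =
                camera.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
        builder.addTarget(previewSurface);
        builder.set(CaptureRequest.CONTROL_ZOOM_RATIO, 2.0f);                    // zoom (API 30+)
        builder.set(CaptureRequest.FLASH_MODE, CameraMetadata.FLASH_MODE_TORCH); // fill light
        builder.set(CaptureRequest.CONTROL_AWB_MODE,
                CameraMetadata.CONTROL_AWB_MODE_FLUORESCENT);                    // color temperature calibration
        builder.set(CaptureRequest.NOISE_REDUCTION_MODE,
                CameraMetadata.NOISE_REDUCTION_MODE_HIGH_QUALITY);               // image noise reduction
        return builder.build();
    }
}
```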

When the electronic device shoots an image, the shooting parameters and the processing parameters used can also refer to shooting capabilities of the cameras or the camera groups recorded in the subsequent embodiments. The parameter ranges of the photographing parameters and the processing parameters may be determined according to the photographing capability.

The default shooting parameters are the parameters that the master device and the slave device use when the camera is enabled. The default shooting parameters may be parameters preset at the factory for the camera, or the parameters used the last time the user used the camera. These parameters include a plurality of hardware parameters used when the camera captures an image and a plurality of software parameters used when the image processing module processes the image.

The system provided in the embodiments of the present application is first described below. Fig. 2A illustrates the structure of system 10.

As shown, the system 10 includes: a master device 100 and a slave device 200. The number of slave devices 200 may be one or more; one slave device 200 is illustrated in fig. 2A.

The master device 100 and the slave device 200 are each electronic devices configured with cameras. The number of cameras that the master device 100 and the slave device 200 have is not limited in the embodiments of the present application. For example, the slave device 200 may be configured with five cameras (2 front cameras and 3 rear cameras).

Electronic devices include, but are not limited to, smartphones, tablet computers, personal digital assistants (PDAs), wearable electronic devices with wireless communication capabilities (e.g., smart watches, smart glasses), augmented reality (AR) devices, virtual reality (VR) devices, and the like. Exemplary embodiments of electronic devices include, but are not limited to, devices running Linux or another operating system. The electronic device may also be another portable electronic device, such as a laptop computer. It should also be appreciated that in other embodiments, the electronic device may be a desktop computer or the like instead of a portable electronic device.

A communication connection is established between the master device 100 and the slave device 200, which may be a wired connection or a wireless connection.

In some embodiments, the wireless connection may be a wireless fidelity (Wi-Fi) connection, a Bluetooth connection, an infrared connection, an NFC connection, a ZigBee connection, or the like. The master device 100 may directly transmit a control command for adjusting the shooting effect to the slave device 200 through the close-range connection. The slave device 200 may respond to the control command issued by the master device 100 and transmit the adjusted image back to the master device 100. Thereafter, the master device 100 may display the image returned by the slave device 200. In addition, the master device 100 may also complete tasks such as recording, photographing, and forwarding using the images. Here, specific implementations of the master device 100 sending a control command for adjusting the shooting effect to the slave device 200, the slave device 200 adjusting the image according to the control command, and the like may refer to the detailed description of the subsequent method embodiments, and are not described here in detail.

In other embodiments, the wireless connection may also be a long-range connection, including, but not limited to, a mobile network supporting 2G, 3G, 4G, 5G, and subsequent standard protocols.

Optionally, the system 10 shown in fig. 2A may further include a server 300. The master device and the slave device may log into the same user account (e.g., a Huawei account) and then connect remotely through the server 300 (e.g., a Huawei multi-device collaborative shooting server). The server 300 may be used for data transmission between the master device 100 and the slave device 200. That is, the master device 100 may send control commands to the slave device 200 via the server 300. Likewise, the slave device 200 may transmit images to the master device 100 through the server 300.

Fig. 2B is a schematic structural diagram of an electronic device 400 according to an embodiment of the present application. The electronic device 400 may be the master device 100 or the slave device 200 in the system 10 shown in fig. 2A.

The electronic device 400 may include: a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headset interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identity module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.

It should be understood that the structure illustrated in the embodiment of the present invention does not constitute a specific limitation on the electronic device 400. In other embodiments of the present application, the electronic device 400 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or the components may be arranged differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.

The processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. The different processing units may be separate devices or may be integrated in one or more processors.

The controller can generate operation control signals according to command operation codes and timing signals, to control command fetching and command execution.

A memory may also be provided in the processor 110 for storing commands and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold commands or data that the processor 110 has just used or used repeatedly. If the processor 110 needs to use the command or data again, it can be called directly from this memory. Repeated accesses are avoided and the waiting time of the processor 110 is reduced, thereby improving the efficiency of the system.

The wireless communication function of the electronic device 400 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, the baseband processor, and the like.

The antenna 1 and the antenna 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 400 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve antenna utilization. For example, the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, an antenna may be used in conjunction with a tuning switch. The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G and the like applied to the electronic device 400. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA), and the like. The mobile communication module 150 may receive electromagnetic waves through the antenna 1, perform processing such as filtering and amplification on the received electromagnetic waves, and transmit the processed signal to the modem processor for demodulation. The mobile communication module 150 can also amplify the signal modulated by the modem processor and convert it into electromagnetic waves for radiation through the antenna 1. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.

The modem processor may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low-frequency baseband signal to the baseband processor for processing. The low-frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio output device (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be independent of the processor 110 and provided in the same device as the mobile communication module 150 or another functional module.

The wireless communication module 160 may provide solutions for wireless communication applied to the electronic device 400, including wireless local area network (WLAN) (e.g., a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, frequency-modulates and filters the electromagnetic wave signal, and transmits the processed signal to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency-modulate it, amplify it, and convert it into electromagnetic waves for radiation via the antenna 2. Illustratively, the wireless communication module 160 may include a Bluetooth module, a Wi-Fi module, or the like.

In some embodiments, the antenna 1 and the mobile communication module 150 of the electronic device 400 are coupled, and the antenna 2 and the wireless communication module 160 are coupled, so that the electronic device 400 may communicate with a network and other devices through wireless communication technologies. The wireless communication technologies may include the global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), 5G and subsequent standard protocols, BT, GNSS, WLAN, NFC, FM, and/or IR technologies, and the like. The GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS) and/or a satellite based augmentation system (SBAS).

The electronic device 400 implements display functions through the GPU, the display screen 194, the application processor, and the like. The GPU is a microprocessor for image processing and is connected to the display screen 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program commands to generate or change display information.

The display screen 194 is used to display images, videos, and the like. The display 194 includes a display panel. In some embodiments, the electronic device 400 may include 1 or N display screens 194, N being a positive integer greater than 1.

The electronic device 400 may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.

The ISP is used to process data fed back by the camera 193. For example, when photographing, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electric signal, and the camera photosensitive element transmits the electric signal to the ISP for processing and is converted into an image visible to naked eyes. ISP can also optimize the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in the camera 193.

The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format.

In some embodiments, electronic device 400 may include 1 or N cameras 193, N being a positive integer greater than 1.

The digital signal processor is used for processing digital signals, and can process other digital signals besides digital image signals. For example, when the electronic device 400 is selecting a frequency bin, the digital signal processor is used to fourier transform the frequency bin energy, or the like.

Video codecs are used to compress or decompress digital video. The electronic device 400 may support one or more video codecs. Thus, the electronic device 400 may play or record video in a variety of encoding formats, such as: dynamic picture experts group (moving picture experts group, MPEG) 1, MPEG2, MPEG3, MPEG4, etc.

The touch sensor 180K, also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is for detecting a touch operation acting thereon or thereabout. The touch sensor may communicate the detected touch operation to the application processor to determine the touch event type. Visual output related to touch operations may be provided through the display 194.

When the electronic device 400 shown in fig. 2B is the master device 100 in fig. 2A, the mobile communication module 150 and the wireless communication module 160 may be used to provide communication services for the master device 100. Specifically, in embodiments of the present application, the master device 100 may establish a communication connection with other electronic devices having cameras 193 (i.e., slave devices 200) through the mobile communication module 150 or the wireless communication module 160. Further, through this connection, the master device 100 may transmit control commands to the slave device 200 and receive images transmitted back by the slave device 200.

The ISP, the camera 193, the video codec, the GPU, the display screen 194, the application processor, and the like provide the master device 100 with the ability to capture and display images. When the master device 100 turns on the camera 193, the master device 100 may obtain the optical image collected by the camera 193 and convert the optical signal into an electrical signal through the ISP. In the embodiments of the present application for controlling the shooting effect, the ISP can also adjust shooting parameters such as the exposure and color temperature of the shooting scene, and optimize image processing parameters for the noise, brightness, and skin color of the image.

In cross-device collaborative shooting, the video codec may be used for digital video compression or decompression. The master device 100 may encode the files shot collaboratively across devices into video files of various formats through the video codec.

With the GPU, the display screen 194, the application processor, and the like, the master device 100 may implement display functions. Specifically, the electronic device may display images acquired by the camera 193 on the display screen 194. In addition, the image received by the master device 100 and transmitted by the slave device 200 may be displayed through the display screen 194 and the like. In some embodiments, the display screen 194 may also display only the images sent by the slave device 200. At the same time, the master device 100 may respond to user operations acting on user interface controls via the touch sensor 180K, i.e., the "touch panel".

When the electronic device 400 shown in fig. 2B is the slave device 200 in fig. 2A, the mobile communication module 150 and the wireless communication module 160 may be used to provide communication services for the slave device 200. Specifically, the slave device 200 may establish a communication connection with the master device 100 through the mobile communication module 150 or the wireless communication module 160. Through this connection, the slave device 200 receives the control command for controlling the shooting effect transmitted by the master device 100, may shoot images in response to the control command, and may transmit the images acquired and processed by the camera 193 to the master device 100.

As with the master device 100, the ISP, the camera 193, and the video codec may provide the slave device 200 with the functions of capturing and transmitting images. In cross-device collaborative shooting, the video codec may be used to compress or decompress digital video when the slave device 200 sends images to the master device 100. The slave device 200 may encode the files shot collaboratively across devices into an image stream through the video codec and then transmit it to the master device 100.

In some embodiments, the slave device 200 may also display images captured by its own camera 193. In this case, display functions may be implemented for the slave device 200 by the GPU, the display screen 194, the application processor, and the like. Likewise, the slave device 200 may respond to user operations acting on user interface controls via the touch sensor 180K.

The software systems of the master device 100 and the slave device 200 may each employ a layered architecture, an event-driven architecture, a micro-kernel architecture, a micro-service architecture, or a cloud architecture. Taking an Android system with a layered architecture as an example, the embodiments of the present invention illustrate the software structures of the master device 100 and the slave device 200. Of course, on other operating systems (e.g., HarmonyOS, Linux, etc.), the solution of the present application can also be implemented as long as the functions implemented by the respective functional modules are similar to those in the embodiments of the present application.

Fig. 3 is a software structure block diagram of the master device 100 according to an embodiment of the present invention.

As shown in fig. 3, the software architecture of the master device 100 may include an application layer, a framework layer, a service layer, and a hardware abstraction layer (HAL). The framework layer may further include a device virtualization kit (DVKit) and a device virtualization platform (distributed mobile sensing development platform, DMSDP).

DVKit is a software development kit (SDK). DVKit may provide a capability interface to the application layer. Through this interface, the application layer may invoke the services and capabilities provided in DVKit, for example, discovery of slave devices. DMSDP is a framework layer service. When DVKit initiates a connection to a slave device, DVKit may pull up the DMSDP service, and DMSDP may then implement the control session, the data session transmission, and the like during the connection to the slave device. The application layer may include a series of application packages, which may include, for example, camera, gallery, calendar, call, map, navigation, WLAN, Bluetooth, music, video, and short message applications. In the embodiments of the present application, the application layer includes various applications that use the camera, such as a camera application, a live broadcast application, and a video call application. A video call application refers to an application having both voice call and video call functions, such as an instant messaging application (not shown in fig. 3), and so forth. The camera applications may include a native camera application and third-party camera applications. The application layer may request the shooting capabilities of the camera from the framework layer.

The framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The framework layer includes some predefined functions.

As shown in fig. 3, the framework layer may include a Camera kit (Camera kit) and a Camera interface (Camera API).

The camera kit (Camera kit) may include a mode management module, which may be used to adjust the shooting mode used when the master device 100 runs various types of applications that use the camera. The shooting modes may include, but are not limited to, a preview mode, a photo mode, a video mode, a cross-device mode, and so forth. After the master device 100 enters the cross-device mode, the device virtualization kit (DVKit) may be used to discover the slave device 200. These shooting modes can be realized by calling the camera interface (Camera API).

The camera interface (Camera API) may include two parts: camera management (camera manager) and camera device (camera device).

Camera management (camera manager) may be used to manage the shooting capabilities of the master device 100 and the shooting capabilities of the slave device 200 connected to the master device 100.

The shooting capabilities of a device may include the hardware capabilities of the camera and the software capabilities of image processing components such as the ISP/GPU. Hardware capabilities refer to capabilities of the camera that can be adjusted. Software capabilities refer to the capabilities of image processing modules such as the ISP and the GPU to process the electrical signal image. Shooting capabilities also include capabilities that combine hardware and software capabilities, such as a hybrid zoom range, night mode, portrait mode, time-lapse shooting, slow motion, panorama mode, and so on. Taking the beauty mode as an example: the master device 100 (or the slave device 200) may adjust the camera focal length while adding a beauty algorithm, and so on.

The hardware capabilities include one or more of the following: the number of cameras, the type of camera, the optical zoom range, optical image stabilization, the aperture adjustment range, the flash, the fill light, the shutter time, the ISO sensitivity, the pixel count, the video frame rate, and so on. The types of cameras may include, but are not limited to, a common camera, a wide-angle camera, an ultra-wide-angle camera, and the like. The optical zoom range may be 1x-5x zoom; the aperture size may be f/1.8-f/17; the shutter time may be 1/40, 1/60, 1/200, and so on.

The software capabilities include one or more of the following: the digital zoom range, the supported image cropping specifications, the supported image color temperature calibration modes, supported image noise reduction, the supported beautification/body-shaping types, the supported filter types, the supported stickers, and supported self-timer mirroring. The digital zoom range may be 10x-15x zoom; the image cropping size may be 3:3, 3:4, or 9:16; the color temperature calibration mode may be daylight, fluorescent, incandescent, shadow, or cloudy calibration; the beautification algorithms may include face thinning, slimming, skin smoothing, whitening, eye enlarging, blemish removal, and the like, as well as body-shaping algorithms; the filter algorithms may include, for example, Japanese style, texture, brightness, soft light, cyberpunk, and the like; and the stickers may include expressions, animals, scenery, picture inserts, and the like.

Table 1 exemplarily shows the respective shooting capabilities of the master device 100 and the slave device 200.


TABLE 1

As shown in table 1, camera management (camera manager) may record the numbers of the cameras of the master device 100 and of the cameras of the slave device 200. For example, the three cameras of the master device 100 may be numbered 1, 2, and 3, respectively, and the three cameras of the slave device 200 may be numbered 1001, 1002, and 1003, respectively. The number is used to uniquely identify a camera. In particular, the numbering of the cameras of the slave device 200 may be done by the virtual camera HAL of the master device 100. The specific numbering rules may refer to the subsequent description of the virtual camera HAL and are not described in detail here.

It will be appreciated that the hardware capabilities and software capabilities described above may also include other capabilities, respectively, and are not limited to those mentioned above, which are not limiting to the embodiments of the present application.

A camera device (camera device) may be used to forward stream control commands to the service layer for further processing, in response to individual applications in the application layer. A stream in the embodiments of the present application refers to a sequence of data that arrives sequentially, in large volume, rapidly, and continuously. In general, a data stream can be regarded as a dynamic data set that grows indefinitely over time. A stream in the present application is an image stream composed of frame-by-frame images. The stream control commands are created by the individual applications of the application layer and issued to the modules below the application layer.

The camera device (camera device) may also cooperate with the device virtualization kit (DVKit) and the device virtualization platform (DMSDP) to establish a communication connection between the master device 100 and the slave device 200, and to bind the slave device 200 and the master device 100.

Specifically, the device virtualization kit (DVKit) may be used to discover the slave device 200, and the device virtualization platform (DMSDP) may be used to establish a session channel with the discovered slave device 200. The DMSDP may include a control session (control session) and a data session (data session). The control session is used to transmit control commands between the master device 100 and the slave device 200 (e.g., a request to use a camera of the slave device 200, a photographing command, a video recording command, a control command to adjust the shooting effect, etc.). The data session is used to transmit the streams returned by the slave device 200, such as a preview stream, a photo stream, a video stream, and so forth.

The service layer may include modules such as camera service (camera service), CameraDeviceClient, Camera3Device, CameraProviderManager, device co-management, dynamic pipeline, and stream processing.

The camera service (camera service) provides the services that implement the interface functions of the camera interface (Camera API).

CameraDeviceClient is an instantiation of a camera; one CameraDeviceClient corresponds to one camera. The camera service may include a function that creates a CameraDeviceClient, such as connectDevice(). When responding to an application layer request to create a camera instance, the camera service calls this function, thereby creating the CameraDeviceClient instance corresponding to the camera.
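For context, the sketch below shows the application-side call that ultimately causes the camera service to create the per-camera client: CameraManager.openCamera() delivers a CameraDevice through a StateCallback. The internal creation of the CameraDeviceClient happens inside the camera service and is not visible in application code.

```java
// Application-side view: opening a camera through the framework (illustrative).
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CameraManager;
import android.os.Handler;

final class CameraOpener {
    static void open(CameraManager manager, String cameraId, Handler handler)
            throws CameraAccessException {
        manager.openCamera(cameraId, new CameraDevice.StateCallback() {
            @Override public void onOpened(CameraDevice device) {
                // the camera service now holds the client instance for this camera;
                // capture sessions and requests are created from here
            }
            @Override public void onDisconnected(CameraDevice device) { device.close(); }
            @Override public void onError(CameraDevice device, int error) { device.close(); }
        }, handler);
    }
}
```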

Camera3Device may be used to manage the lifecycle of various types of streams, including but not limited to create, stop, clear, destroy stream information, and so forth.

The CameraProviderManager may be used to obtain virtual camera HAL information, including the shooting capabilities of the slave device 200. For a detailed description of the shooting capabilities of the slave device 200, refer to table 1.

Device co-management may be used to control the time delay between the respective pictures of the master device 100 and the slave device 200. When the master device 100 and the slave device 200 perform cross-device collaborative shooting, because the slave device 200 needs a certain amount of time to transmit the shot image to the master device 100, the preview pictures of the two parties displayed by the master device 100 may not correspond to the same moment, i.e., there is a time delay. The device co-management module may cache a portion of the frames acquired by the master device 100 by adding a buffer, so that the preview pictures of the two parties that are displayed together were acquired at moments that differ only slightly. In this way, the time delay between the two preview pictures can be controlled within the range of the user's visual perception, without affecting the user experience.
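A minimal sketch of this buffering idea follows, assuming hypothetical Frame objects that carry capture timestamps: recently acquired local frames are kept in a small buffer, and the one captured closest to a received remote frame is chosen for display alongside it.

```java
// Sketch of aligning local and remote preview frames via a small local buffer.
import java.util.ArrayDeque;
import java.util.Deque;

final class PreviewAligner {
    static final class Frame {
        final long captureTimeMs;
        final byte[] data;
        Frame(long captureTimeMs, byte[] data) { this.captureTimeMs = captureTimeMs; this.data = data; }
    }

    private final Deque<Frame> localBuffer = new ArrayDeque<>();

    void onLocalFrame(Frame f) { localBuffer.addLast(f); }   // keep recent local frames

    // Pick the buffered local frame captured closest to the remote frame's timestamp.
    Frame matchLocal(Frame remote) {
        Frame best = null;
        long bestDelta = Long.MAX_VALUE;
        for (Frame f : localBuffer) {
            long delta = Math.abs(f.captureTimeMs - remote.captureTimeMs);
            if (delta < bestDelta) { bestDelta = delta; best = f; }
        }
        return best;
    }
}
```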

The dynamic pipeline may be used to generate a pending command queue in response to a stream creation command issued by the camera device (camera device). In particular, the dynamic pipeline may generate one or more pending command queues depending on the type of stream creation command and the device it acts on. The type of stream creation command may include, for example, a preview command, a video command, a photographing command, and the like. The devices on which the stream creation command acts may include the master device 100 and the slave device 200, and may even be further refined to the specific cameras of the devices. The detailed workflow of the dynamic pipeline will be described in the following method embodiments and is not detailed here.

The dynamic pipeline may add, to each command of the generated pending command queue, an identification (ID) or tag (tag) of the device that the command acts on. The dynamic pipeline may also add, to each command of the generated pending command queue, a single-frame or continuous-frame request tag for indicating the type of the command: a photographing command is a single-frame command, while a preview command or a video command is a continuous-frame command.
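The sketch below illustrates hypothetical data shapes for such a pending command queue, with each command tagged by the camera it acts on (e.g. a local camera ID or a virtualized slave camera ID) and by a single-frame or continuous-frame tag; the types and names are assumptions, not the patent's definitions.

```java
// Hypothetical shapes for the pending command queue produced by the dynamic pipeline.
import java.util.ArrayDeque;
import java.util.Queue;

final class DynamicPipelineSketch {
    enum FrameTag { SINGLE_FRAME, CONTINUOUS_FRAMES }

    static final class PendingCommand {
        final String targetCameraId;   // e.g. "1" for a local camera, "1001" for a slave camera
        final String type;             // "preview", "record", "photo", ...
        final FrameTag frameTag;
        PendingCommand(String targetCameraId, String type, FrameTag frameTag) {
            this.targetCameraId = targetCameraId; this.type = type; this.frameTag = frameTag;
        }
    }

    static Queue<PendingCommand> buildQueue() {
        Queue<PendingCommand> queue = new ArrayDeque<>();
        queue.add(new PendingCommand("1", "preview", FrameTag.CONTINUOUS_FRAMES));   // continuous frames
        queue.add(new PendingCommand("1001", "photo", FrameTag.SINGLE_FRAME));       // single frame
        return queue;
    }
}
```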

The dynamic pipeline may also be used to distribute the commands in the pending command queue that act on the slave device 200 to the stream processing module, and to distribute the commands that act on the master device 100 to the local camera HAL module of the HAL layer, which will be described in detail with reference to fig. 13.

When the user controls the shooting effect through an application program of the application layer, the dynamic pipeline may also refresh or add, in the commands of the pending command queue, various parameters for controlling the shooting effect, such as a zoom value, beauty algorithm parameters (e.g., skin smoothing level, whitening level, etc.), a filter, a color temperature, an exposure, and so on.

Stream processing may include: a life cycle management module, a preprocessing module, a post-processing module, a frame synchronization module and the like.

The lifecycle management module may be used to monitor the entire lifecycle of a stream. When the camera device (camera device) transmits a stream creation command to stream processing, lifecycle management can record information of the stream, such as the timestamp at which creation of the stream was requested and whether the slave device 200 responded by creating the stream. When the master device 100 turns off the camera or stops running the application, the lifecycle of the corresponding stream ends, and the lifecycle management module can record the end time of the stream.

The preprocessing module is used for processing each command issued by the dynamic pipeline and comprises modules of multi-stream configuration, multiplexing control and the like.

The multi-stream configuration may be used to configure the type and number of streams required according to the stream type of the stream creation command issued by the camera device (camera device). Different types of control commands require different types and numbers of streams. The types of streams may include, but are not limited to, a preview stream, a photo stream, a video stream, an analysis stream, and the like. For example, the camera device (camera device) may issue a control command "create a photo stream", and the multi-stream configuration may configure four streams for the control command: a preview stream, an analysis stream, a photo stream, and a video stream.

The method by which the multi-stream configuration configures streams for a stream creation command may refer to table 2:


TABLE 2

It will be appreciated that in some embodiments, there may be other configuration methods for multi-stream configuration, and embodiments of the present application are not limited in this regard.

Multiplexing control may be used to multiplex multiple streams requested by a camera device (camera device), i.e., to streamline multiple streams configured by a multi-stream configuration module. For example, a preview stream with a requested picture quality of 1080P and an analysis stream with a requested picture quality of 720P may be multiplexed into a preview stream of 1080P.
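A sketch of this multiplexing idea follows, under the assumption that a stream request can be represented by its type and resolution: requests that can be satisfied by a larger stream already being requested are dropped, and their frames are derived later by post-processing.

```java
// Sketch of multiplexing control: collapse requested streams into the minimal set
// actually requested from the slave device. StreamRequest is hypothetical.
import java.util.ArrayList;
import java.util.List;

final class MultiplexControlSketch {
    static final class StreamRequest {
        final String type;   // "preview", "analysis", ...
        final int width, height;
        StreamRequest(String type, int width, int height) {
            this.type = type; this.width = width; this.height = height;
        }
    }

    // e.g. a 1920x1080 preview request and a 1280x720 analysis request collapse into
    // the single 1080P request; the 720P analysis frames are derived later.
    static List<StreamRequest> multiplex(List<StreamRequest> requested) {
        List<StreamRequest> sorted = new ArrayList<>(requested);
        sorted.sort((a, b) -> Integer.compare(b.width * b.height, a.width * a.height));
        List<StreamRequest> merged = new ArrayList<>();
        for (StreamRequest r : sorted) {
            boolean covered = merged.stream()
                    .anyMatch(m -> m.width >= r.width && m.height >= r.height);
            if (!covered) merged.add(r);
        }
        return merged;
    }
}
```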

Specific implementations of the multi-stream configuration module and the multiplexing control module will be described in detail in the following method embodiments and are not described here.

The post-processing module is used to process the image stream returned by the slave device 200. The post-processing module may include an intelligent splitting module and a multi-stream output module. Intelligent splitting may be used to expand the stream returned by the slave device 200 into streams consistent with the type and number of streams requested by the camera device (camera device). For example, the camera device (camera device) requests a 1080P preview stream and a 720P analysis stream. The control command requesting these two streams is multiplexed by multiplexing control into a control command requesting one 1080P preview stream. By executing this control command, the slave device 200 transmits a 1080P preview stream back to the master device 100. When stream processing receives the 1080P preview stream, it may restore it, through the intelligent splitting module, into a 1080P preview stream and a 720P analysis stream.

The multi-stream output may be used to output the streams actually required after intelligent splitting and transmit the output streams to the camera device (camera device).
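The following sketch illustrates the inverse step performed by intelligent splitting: the single stream actually returned (e.g. 1080P) is fanned out into the originally requested streams, here by downscaling each frame to produce the 720P analysis copy. The pixel-array Frame type and nearest-neighbour scaling are illustrative assumptions.

```java
// Sketch of intelligent splitting: derive the smaller requested stream from the returned one.
final class SmartSplitSketch {
    static final class Frame {
        final int width, height;
        final int[] argb;            // width * height packed pixels
        Frame(int width, int height, int[] argb) { this.width = width; this.height = height; this.argb = argb; }
    }

    // Nearest-neighbour downscale of a single frame.
    static Frame downscale(Frame src, int dstW, int dstH) {
        int[] out = new int[dstW * dstH];
        for (int y = 0; y < dstH; y++) {
            int srcY = y * src.height / dstH;
            for (int x = 0; x < dstW; x++) {
                out[y * dstW + x] = src.argb[srcY * src.width + x * src.width / dstW];
            }
        }
        return new Frame(dstW, dstH, out);
    }

    // The 1080P preview frame comes back from the slave; derive the 720P analysis frame locally.
    static Frame[] split(Frame returned1080p) {
        return new Frame[] { returned1080p, downscale(returned1080p, 1280, 720) };
    }
}
```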

The post-processing module may also include processing of images for mirroring, rotation, etc., without limitation.

The frame synchronization module can be used to perform frame synchronization processing on image frames during photographing. Specifically, cross-device transmission may cause a delay before an instruction sent by the master device 100 reaches the slave device 200; that is, the slave device 200 receives the same instruction later than the master device 100 issues it. Therefore, the execution result obtained when the slave device 200 executes the instruction may also differ from the result expected by the master device 100. For example, the master device 100 may issue a control command at a first time, and the slave device 200 may not return the image stream in response to the control command until a second time, whereas the master device 100 expects to receive the image stream of the slave device 200 corresponding to the first time. Frame synchronization may therefore shift the result returned by the slave device 200 at the second time forward, to obtain a result closer to the first time (the result expected by the user), thereby reducing the impact of network latency.
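A compact sketch of this frame-synchronization idea for the photo path: among the frames returned by the slave device, pick the one whose capture timestamp is closest to the moment the master issued the photographing command, instead of simply taking the latest frame. TimedFrame is a hypothetical holder type.

```java
// Sketch of frame synchronization by capture-timestamp matching.
import java.util.List;

final class FrameSyncSketch {
    static final class TimedFrame {
        final long captureTimeMs;
        final byte[] jpeg;
        TimedFrame(long captureTimeMs, byte[] jpeg) { this.captureTimeMs = captureTimeMs; this.jpeg = jpeg; }
    }

    // Pick the slave frame captured closest to the time the master issued the photo command.
    static TimedFrame selectForCommandTime(List<TimedFrame> slaveFrames, long commandTimeMs) {
        TimedFrame best = null;
        for (TimedFrame f : slaveFrames) {
            if (best == null
                    || Math.abs(f.captureTimeMs - commandTimeMs) < Math.abs(best.captureTimeMs - commandTimeMs)) {
                best = f;
            }
        }
        return best;
    }
}
```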

The Hardware Abstraction Layer (HAL) may include a local camera HAL and a virtual camera HAL.

The local camera HAL may include a camera session (camera session) and a camera provider module. The camera session (camera session) may be used by the master device 100 to issue control commands to the hardware. The camera provider module may be used to manage the shooting capabilities of the camera of the master device 100. The shooting capabilities of the master device 100 can be referred to in table 1 above. Here, the camera provider manages only the shooting capabilities of the local camera of the master device 100.

The virtual camera HAL also includes a camera session (camera session) and a camera provider module. The camera session (camera session) may be used to locally register the slave device 200 that has established a communication connection with the master device 100, to feed back the connection state of the slave device 200 to DVKit, and to transmit control commands issued by the master device 100 to the slave device 200. The camera provider module is responsible for managing the shooting capabilities of the slave device 200. Likewise, the managed shooting capabilities of the slave device 200 may refer to table 1 and are not described here.

In addition, the virtual camera HAL also provides the function of numbering the cameras of the registered slave device 200. When the virtual camera HAL of the master device acquires the shooting capabilities of the slave device, it may acquire the number of cameras that the slave device has and establish an ID for each camera. The ID may be used by the master device 100 to distinguish between the multiple cameras of the slave device 200. When numbering the cameras, the virtual camera HAL may use a numbering method different from that used by the master device 100 for its own cameras. For example, when the cameras of the master device 100 itself are numbered from 1, the cameras of the slave device 200 may be numbered from 1000.
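One possible ID-assignment rule matching this example is sketched below: local cameras keep their small numeric IDs, while cameras of each registered slave device are mapped into a separate range starting around 1000 (table 1 shows 1001-1003). The map-based registry is hypothetical, not the virtual camera HAL's actual data structure.

```java
// Sketch of a camera ID allocation rule for local and virtualized slave cameras.
import java.util.LinkedHashMap;
import java.util.Map;

final class VirtualCameraIdAllocator {
    private final Map<Integer, String> idToCamera = new LinkedHashMap<>();
    private int nextSlaveId = 1001;  // slave cameras use a dedicated range, e.g. 1001, 1002, 1003

    void registerLocalCamera(int localId) {
        idToCamera.put(localId, "local:" + localId);              // e.g. 1, 2, 3
    }

    int registerSlaveCamera(String slaveDeviceId, int slaveLocalId) {
        int id = nextSlaveId++;
        idToCamera.put(id, slaveDeviceId + ":" + slaveLocalId);   // maps virtual ID to the slave's camera
        return id;
    }

    Map<Integer, String> snapshot() { return new LinkedHashMap<>(idToCamera); }
}
```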

It will be appreciated that the structure of the master device 100 illustrated in fig. 3 does not constitute a specific limitation on the master device 100. In other embodiments of the present application, the master device 100 may include more or fewer modules than shown, or certain modules may be combined, or certain modules may be split, or the modules may be arranged differently.

Fig. 4 illustrates a system framework diagram of the slave device 200.

As shown in fig. 4, the slave device 200 may include an application layer, a framework layer, a service layer, and a hardware abstraction layer (HAL).

The application layer may include a series of application packages, which may include, for example, a camera proxy service. The camera proxy service may include modules for pipeline control, multi-stream adaptation, multi-Operating System (OS) adaptation, and the like.

Pipeline control may be used to establish communication connections with the master device 100 (including establishing control sessions and data sessions), and to transfer control commands and image streams.

Multi-stream adaptation may be used by the camera proxy service to return stream data according to the stream configuration information sent by the master device 100. With reference to the description of the foregoing embodiments, the master device 100 may configure the required streams for a camera device session in response to a request to create streams. Performing this configuration procedure generates corresponding stream configuration information. The camera proxy service may generate a corresponding stream creation command according to the configuration information. Thus, in response to this stream creation control command, the underlying services of the slave device 200 may create streams that match the command.

Multi-operating-system (OS) adaptation can be used to solve compatibility problems across different operating systems, such as the Android system and HarmonyOS.

The framework layer includes camera management (camera manager) and camera device (camera device); the service layer includes camera service (camera service), CameraDeviceClient, Camera3Device, and CameraProviderManager modules.

Camera management (camera manager), camera device (camera device), camera service (camera service), CameraDeviceClient, Camera3Device, and CameraProviderManager have the same functions as the corresponding modules in the master device 100; refer to the description of fig. 3 above.

The local camera HAL of the slave device 200 may refer to the local camera HAL of the master device 100, and is not described again in the embodiments of the present application. The camera HAL layer of the slave device 200 may also include a camera session (camera session) and a camera provider module. The camera session (camera session) may be used for communication between control commands and the hardware. The camera provider module may be used to manage the shooting capabilities of the slave device 200. The shooting capabilities of the slave device 200 may refer to table 1 above; likewise, the camera provider of the slave device 200 manages only the shooting capabilities of the local camera of the slave device 200.

It will be appreciated that the structure of the slave device 200 illustrated in fig. 4 does not constitute a specific limitation on the slave device 200. In other embodiments of the present application, the slave device 200 may include more or fewer modules than shown, or certain modules may be combined, or certain modules may be split, or the modules may be arranged differently.

Based on the foregoing software and hardware structures of the system 10, the master device 100, and the slave device 200, the cross-device collaborative shooting method provided in the embodiments of the present application is described in detail below.

The cross-device collaborative shooting method provided by the embodiment of the application can be applied to various scenes, including but not limited to:

(1) Live scene

In a live broadcast scene, the master device 100 may be connected with the slave device 200; the master device 100 and the slave device 200 may shoot the same object or different objects from different angles; the slave device 200 may send the shot images to the master device 100; and the master device 100 may upload the images of both parties to a live broadcast server, which distributes them to more users for viewing. In this process, the master device 100 may control the shooting effect of the slave device 200; for example, parameters such as the camera focal length of the slave device 200 or the filter used may be adjusted through the master device 100.

By using the cross-device collaborative shooting method provided by the embodiments of the present application in the live scene, the host who initiates the live broadcast can conveniently control the shooting effect of the slave device 200 on the master device 100, and users who watch the live broadcast can see multiple images of the same object displayed from different angles.

(2) Camera application scene

After the master device 100 starts the camera application, it may connect with the slave device 200 and take images at different angles, and the slave device 200 may send the taken images to the master device 100. The master device 100 may control the photographing effect of the slave device 200, may display the photographed images of both sides at the same time, and may perform processing such as preview, photographing, and video recording on the photographed images. The camera application may be a native camera application or a third-party camera application.

In this way, functions such as cross-device dual-view video and multi-view video can be realized, more and freer viewing angles can be provided for users, and users can conveniently control the shooting effect of the slave device 200 on the master device 100, making shooting more interesting.

(3) Video call scene

The master device 100 may be a cell phone and the slave device 200 may be a large-screen television. The mobile phone can conduct a video call with another device. In the process, the mobile phone can connect with the large-screen television and control the camera of the large-screen television to shoot, the large-screen television can send the shot image to the mobile phone, and the mobile phone then sends the image to the device at the other end of the video call. In this way, the mobile phone can carry out the video call through the large-screen television, the user does not need to hold the mobile phone to shoot images at a specific angle, and a more convenient video call experience can be provided for the user.

(4) Scene in which a wearable device controls an intelligent electronic device to take photographs

A wearable device such as a smart watch can be connected with an intelligent electronic device such as a mobile phone and control the camera of the mobile phone to shoot. The mobile phone can transmit the captured image to the smart watch, so that the user can directly view, on the smart watch, the picture captured and processed by the mobile phone. In this process, the user can also control the shooting effect of the mobile phone through the smart watch.

Therefore, the user can adjust the picture shot by the mobile phone through the smart watch, and group photos and scenery can be shot conveniently and quickly without the help of other people.

It can be understood that the above scenarios are only examples, and the cross-device collaborative shooting method provided in the embodiments of the present application may also be applied to other scenarios, which are not limited herein.

Taking a live scene as an example, a cross-device collaborative shooting method is described below in connection with a UI in the live scene.

Fig. 5 illustrates a live scene provided in an embodiment of the present application.

As shown in fig. 5, the live scene may include a master device 100, a slave device 200, an object A, and an object B.

The master device 100 and the slave device 200 establish a communication connection, and the master device 100 and the slave device 200 may be in different positions or at different angles. The master device 100 photographs the object A, and the slave device 200 photographs the object B. Then, the slave device 200 may display the photographed image and, after processing it, transmit the image to the master device 100. The master device 100 may simultaneously display the image transmitted from the slave device 200 and the image it obtains by photographing the object A itself.

During the live broadcast, the host device 100 may also upload the two displayed images to the live broadcast server. Further, the server may distribute the two images to the devices of other users entering the live room.

Based on the live scenario described above, some user interfaces (UIs) on the master device 100 and the slave device 200 provided in the embodiments of the present application are described below. Fig. 6A-6D, 7A-7B, 8A-8B, 9A-9D, and 10A-10C illustrate some user interfaces implemented on the master device 100 and the slave device 200 in a live scenario.

Fig. 6A-6C illustrate one manner in which the master device 100 and the slave device 200 establish a communication connection. Fig. 6A-6C are user interfaces implemented on the master device 100, and fig. 6D is a user interface implemented on the slave device 200.

FIG. 6A illustrates an exemplary user interface 60 on the host device 100 for displaying installed applications. The user interface 60 displays: a status bar, a calendar indicator, a weather indicator, a tray with commonly used application icons, a navigation bar, an icon 601 for a live-class application, an icon 602 for a camera application, icons for other applications, and so forth. The status bar may include: one or more signal strength indicators of mobile communication signals (which may also be referred to as cellular signals), a carrier name (e.g., "China Mobile"), one or more signal strength indicators of Wi-Fi signals, a battery status indicator, a time indicator, etc. The navigation bar may include system navigation keys such as a return key, a home screen key, a multi-tasking key, and the like. In some embodiments, the user interface 60 exemplarily shown in fig. 6A may be a home screen.

As shown in fig. 6A, the host device 100 may detect a user operation acting on the icon 601 of the live-class application, and display the user interface 61 shown in fig. 6B in response to the user operation.

The user interface 61 may be a main interface provided by the live-class application. The user interface 61 may include: a region 611, an interactive message window 612, a preview box 613, an add control 614, and a settings control 615.

Region 611 may be used to present some information about the host, such as the head portrait, live duration, number of viewers, and live account number.

The interactive message window 612 may be used to display messages sent by the anchor or viewers during the live broadcast, or system messages generated by interactive operations such as "like".

The preview box 613 may be used to display images captured and processed in real time by the camera of the host device 100. The host device 100 may refresh the displayed content in real time so that the user previews, in real time, the images captured and processed by the camera of the host device 100. The camera may be a rear camera or a front camera of the host device 100.

The settings control 615 may be used to adjust the photographing effect of the host device 100. When a user operation (e.g., a click operation, a touch operation, etc.) acting on the settings control 615 is detected, the master device 100 may display options for adjusting the photographing parameters and/or image processing parameters of the host device 100. These options are described in the following user interfaces and are not detailed here.

The add control 614 may be used to discover the slave device 200. When detecting a user operation acting on the add control 614, the master device 100 may discover other nearby electronic devices using the aforementioned short-range communication technologies such as Bluetooth, Wi-Fi, and NFC, or may discover other remote electronic devices using the aforementioned long-range communication technologies, and query whether the discovered electronic devices have cameras.

Upon receiving the responses from other electronic devices, the host device 100 may display the discovered camera-equipped electronic devices on the user interface 62. For example, referring to fig. 6C, the host device 100 may display a window 622; the window 622 may include information on two electronic devices, including, for each device: its icon, name, distance, location, and so on.

Icon 623 may be used to show the type of the electronic device. For example, the first slave device displayed by the master device 100 may be a tablet computer. Through the icon 623, the user can quickly and easily make a preliminary judgment on whether the slave device is the device to which he or she wants to connect.

Name 624 may be used to display the name of the slave device. In some embodiments, the name 624 may be the model number of the slave device. In other embodiments, the name may also be a user-defined name for the slave device. The name may also be a combination of the device model number and a user-defined name. The present application is not limited in this regard. It is understood that the names listed in the user interface 62, such as "Pad C1" and "Phone P40-LouS", are exemplary.

As shown in fig. 6C, the master device 100 may detect a user operation acting on the icon 623 of an electronic device, and in response to the user operation, the master device 100 transmits a request to establish a communication connection to the electronic device (slave device 200) corresponding to the icon 623.

Referring to fig. 6D, fig. 6D shows the user interface 63 displayed by the electronic device (slave device 200) after the slave device 200 receives the request to establish a communication connection sent by the master device 100. As shown in fig. 6D, the user interface 63 includes: device information 631 of the master device 100, a confirmation control 632, and a cancel control 633.

The device information 631 may be used to present the identity information of the host device 100, i.e., through the device information 631, the user can determine the information of the master device 100 that issued the above request. When the user is able to determine the information of the host device 100 through the device information 631 and trusts the host device, the user may agree, through the confirmation control 632, to let the host device 100 use the camera of the electronic device. The electronic device may detect an operation acting on the confirmation control 632, in response to which the slave device 200 may agree to let the master device 100 use its camera, i.e., the master device 100 may establish a communication connection with the electronic device.

The communication connection may be the wired connection or the wireless connection described above. For example, the master device and the slave device may log in to the same user account (e.g., a Huawei account) and then make a remote connection through a server (e.g., a multi-device collaborative photographing server provided by Huawei). The embodiments of the present application are not limited in this regard. It should be understood that the electronic device here acts as a slave device of the master device 100.

The user interface 63 also includes a cancel control 633. When the user cannot determine the identity of the host device 100 through the device information 631, or does not trust the host device 100, the user can use the cancel control 633 to reject the request sent by the host device 100 to use the camera of the electronic device. The electronic device may detect an operation acting on the cancel control 633, and in response to the user operation, the electronic device may refuse to let the host device 100 use its camera, i.e., the electronic device does not agree to establish a communication connection with the host device 100.

Fig. 7A-7B illustrate another manner in which the master device 100 and the slave device 200 establish a communication connection. Fig. 7A is a user interface 71 implemented on the slave device 200, and fig. 7B is a user interface 72 implemented on the master device 100.

When the master device 100 detects an operation acting on the icon 623 in the user interface 62, in response to the user operation, the master device 100 transmits a request to establish a communication connection to the electronic device (slave device 200) corresponding to the icon 623.

As shown in fig. 7A, the user interface 71 may include a verification code 712 and a cancel control 714.

The verification code 712 may be used for connection confirmation between the master device 100 and the slave device 200. In response to the above request sent by the master device 100, the slave device 200 may generate the verification code 712. In some embodiments, the verification code 712 may also be generated by the server 300 and then transmitted to the slave device 200 over the wireless network. The slave device 200 may then display the verification code on the user interface 71.

The cancel control 714 may be used to reject the request sent by the master device 100 to use the camera of the slave device 200. The slave device 200 may detect a user operation acting on the cancel control 714, in response to which the slave device 200 may close the dialog 711.

Fig. 7B illustrates the user interface 72 on which the host device 100 enters the verification code. While the slave device 200 displays the user interface 71, the master device 100 may display the user interface 72. The user interface 72 may display a dialog 721.

Dialog 721 may include a verification code 7211 and a confirmation control 7212. The verification code 7211 represents the verification code entered by the user on the host device 100. The master device 100 may detect an operation acting on the confirmation control 7212. In response to the user operation, the master device 100 may transmit the verification code 7211 to the slave device 200. The user operation is, for example, a click operation, a long-press operation, or the like.

The slave device 200 may check whether the received verification code 7211 is identical to the verification code 712 displayed by itself. If the two verification codes are the same, the slave device 200 agrees to let the master device 100 use its camera. Further, the slave device 200 may turn on its own camera and transmit the image collected by the camera and processed by the ISP to the master device 100. Otherwise, the slave device 200 may reject the request of the master device 100 to use the camera of the slave device 200.

In some embodiments, when the master device 100 enters a verification code 7211 that is different from the verification code 712 displayed by the slave device 200, the slave device 200 may keep displaying the verification code 712 and wait for the master device 100 to enter a new verification code. When the new verification code received by the slave device 200 matches the verification code 712, the slave device 200 may likewise agree to let the master device 100 use its camera.

In other embodiments, when the verification code 7211 transmitted by the master device 100 is different from the verification code 712 displayed by the slave device 200, the slave device 200 may regenerate another verification code M, and the master device 100 may acquire a verification code again. When the verification code N acquired by the master device 100 matches the verification code M, the slave device 200 may likewise agree to let the master device 100 use its camera.
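The verification logic described in the preceding paragraphs can be illustrated with the following sketch of a slave-side check that either accepts a matching code or, in the second variant, regenerates a new one; the class and method names are hypothetical, and the 6-digit format is an assumption.

```java
import java.security.SecureRandom;

// Hypothetical sketch of the slave-side verification code check described above.
public class VerificationCodeChecker {
    private String displayedCode;
    private final SecureRandom random = new SecureRandom();

    public VerificationCodeChecker() {
        this.displayedCode = newCode();
    }

    // Generate a 6-digit code such as the verification code 712.
    private String newCode() {
        return String.format("%06d", random.nextInt(1_000_000));
    }

    public String currentCode() {
        return displayedCode;
    }

    // Returns true if the code sent by the master matches; on mismatch, either keep
    // the current code (first variant) or regenerate a new one (second variant).
    public boolean check(String receivedCode, boolean regenerateOnMismatch) {
        if (displayedCode.equals(receivedCode)) {
            return true;                    // grant the master device use of the camera
        }
        if (regenerateOnMismatch) {
            displayedCode = newCode();      // corresponds to verification code M
        }
        return false;
    }
}
```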

The communication connection may also be established in other manners, not limited to the two manners of establishing a communication connection shown in fig. 6D to fig. 7B. For example, using near field communication (NFC) technology, the master device 100 and the slave device 200 may perform authentication through a user operation such as a tap. The authentication manner is likewise not limited in the present application.

After the connection is established, the master device 100 and the slave device 200 may each display prompt information. The prompt may inform the user that the master device 100 and the slave device 200 have established a communication connection. As shown in fig. 8A, the user interface 81 is a user interface on which the slave device 200 displays prompt information.

After the authorization shown in fig. 6C (or the authorization shown in fig. 7A-7B) is completed, the slave device 200 may display the user interface 81. The user interface 81 may include a prompt box 811 and a preview box 812.

Preview box 812 may be used to display images captured by the camera of the slave device 200.

Prompt box 811 may be used to display prompt information, such as "the camera of this device is being used by the master device 100".

When the slave device 200 grants the master device 100 the use of the camera of the slave device 200, the slave device 200 may turn on its own camera. The slave device 200 may then display the pictures acquired and processed by its own camera in the preview box 812. The slave device 200 may display the prompt box 811 above the display layer of the preview box 812.

In some embodiments, the slave device 200 may also display the images acquired and processed by its own camera through a floating window. Specifically, the slave device 200 may display a floating window in the upper right corner of the user interface 60 shown in FIG. 6A. The floating window may display images acquired and processed by the camera of the slave device 200.

When the slave device 200 displays the user interface 81, the master device 100 may display the user interface 82 shown in fig. 8B. The user interface 82 may include a prompt window 821, a window 822, and a window 823.

The prompt window 821 may be used to display prompt information, such as "the master device 100 has connected to the slave device 200".

Window 822 may be used to display images acquired and processed by the camera of the slave device 200. The window 823 may display images captured and processed by the camera of the host device 100.

When the slave device 200 agrees to let the master device 100 use the camera of the slave device 200, the master device 100 may obtain, from the slave device 200, the images acquired and processed by the camera of the slave device 200. The master device 100 may then display these images in the window 822. Meanwhile, the host device 100 may also display the prompt window 821. Through the prompt content displayed in the prompt window 821, the user can understand that the master device 100 is connected to the slave device 200.

In addition, the user interface 82 may also include a setup control 824. The setup control may be used to display the shooting capability options of the slave device 200. For a specific description, refer to the following embodiments; it is omitted here for brevity.

In some embodiments, the host device 100 may also swap the content displayed by the window 823 and the window 822. Specifically, the master device 100 may detect a user operation acting on the window 822, and in response to the user operation, the master device 100 may display, in the window 823, the images acquired and processed by the camera of the slave device 200. At the same time, the host device 100 may display, in the window 822, the images captured and processed by the camera of the host device 100. The user operation may be a click operation, a left-slide operation, or the like.

In some embodiments, the host device may also divide the window 823 into two separate portions: one for displaying the images captured and processed by the camera of the master device 100, and the other for displaying the images captured and processed by the camera of the slave device 200. The present application does not limit the arrangement of the preview images of the master device 100 and the slave device 200 displayed on the master device 100.

Fig. 6A-8B illustrate a set of user interfaces in which the master device 100 establishes a communication connection with the slave device 200 and displays the images captured and processed by the camera of the slave device 200. After that, the master device 100 may acquire the capability of the slave device 200 to control the photographing effect, and may transmit commands for controlling the photographing effect to the slave device 200.

Fig. 9A to 9D exemplarily show a set of user interfaces in which the master device 100 controls the photographing effect of the slave device 200. Fig. 9A-9C are user interfaces on the master device 100, and fig. 9D is a user interface on the slave device 200.

After the prompt window 821 shown in fig. 8B is closed, the host device 100 may display the user interface 91 shown in fig. 9A. The user interface 91 may include a window 911, a window 912, a delete control 913, a settings control 915, and a settings control 916.

Window 911 may display images captured and processed by the camera of the host device 100.

Window 912 may display images acquired and processed by the camera of the slave device 200.

Delete control 913 may be used to close the window 912. The host device 100 may detect a user operation acting on the delete control 913, in response to which the host device 100 may close the window 912.

The settings control 915 may be used to display the capability options with which the host device 100 controls its own photographing effect. The settings control 916 may be used to display the capability options for controlling the photographing effect of the slave device 200. A prompt, such as "click to adjust the remote screen", may also be displayed alongside the settings control 916. The master device 100 may detect a user operation acting on the settings control 916, and in response to the user operation, the master device 100 may display the capability options of the slave device 200 for controlling the photographing effect; refer to fig. 9B.

In some embodiments, the user interface 91 may also include a delete control 913 and an add control 914. The delete control 913 may be used to close one or more windows in the user interface 91, e.g., the window 911 and the window 912. The add control 914 may be used to discover and connect to other slave devices. After the host device 100 detects a user operation acting on the add control 914, the host device 100 may display the query results shown in the window 622 in FIG. 6C.

When the master device 100 detects an operation acting on another slave device displayed in the window 622, the master device 100 may send a request to that slave device to use its camera. Similarly, the other slave device may agree to the request sent by the master device 100; the slave device may then enable its own camera to collect and process images according to the default shooting parameters, and send the processed images to the master device. Meanwhile, the main device 100 may add a window to display these images.

By implementing the above method, the master device 100 can display images sent by a plurality of slave devices, thereby providing a richer shooting experience for the user.

In some embodiments, the master device 100 may also multiplex the settings control 915 and the settings control 916; that is, the user interface 91 may display one general settings control. When the images captured and processed by the host device 100 are displayed in the window 911, the settings control may display the capability options of the host device 100 for controlling the photographing effect. When the images captured and processed by the slave device 200 are displayed in the window 911, the settings control may display the capability options of the slave device 200 for controlling the photographing effect.

Fig. 9B illustrates the user interface 92 on which the master device 100 displays the capability options of the slave device 200 for controlling the shooting effect. The user interface 92 may include a photographing effects window 921 of the slave device 200. The window 921 may display the various capability options that the slave device 200 has for controlling photographing effects, such as aperture, flash, intelligent follow, white balance 922, ISO sensitivity, zoom range, beauty, filters, and the like.

Taking the adjustment of the white balance 922 as an example, the present application specifically describes the user interfaces in which the master device 100 transmits a control command to the slave device 200 and the slave device 200 executes the control command. White balance can be used to calibrate the color temperature bias of the camera.

White balance 922 may include a daylight mode, an incandescent lamp mode 923, a fluorescent mode, a cloudy mode, and a shadow mode. The master device 100 may detect a user operation acting on any of the modes described above. When the master device 100 detects a user operation acting on the incandescent lamp mode 923, the master device 100 may, in response to the user operation, issue a control command to the slave device 200 to change the white balance mode to the incandescent lamp mode. Upon receiving the above command, the slave device 200 may change the white balance 922 to the incandescent lamp mode 923. The master device 100 may then receive and display the image sent by the slave device 200 after the change to the incandescent lamp mode 923; refer to fig. 9C. Meanwhile, the image displayed in the viewfinder of the slave device 200 may also be adjusted to the image after the white balance mode is changed; refer to fig. 9D.

In some embodiments, the master device 100 may also use a dedicated page to display the capability options of the slave device 200 for controlling the shooting effect; that is, the host device 100 may display the capability options of the window 921 in a separate page. The embodiments of the present application are not limited in this regard.

As shown in fig. 9C, the user interface 93 may include a window 931.

Window 931 may be used to display images acquired and processed by the camera of the slave device 200. After the master device 100 issues the control command to the slave device 200 to change the white balance mode to the incandescent lamp mode, the master device 100 may receive, from the slave device 200, the image captured after the white balance mode is changed to the incandescent lamp mode. The window 931 may display this image.

While the user interface 93 displays the image after the white balance adjustment, the slave device 200 may also display the user interface 94 after the white balance adjustment; refer to fig. 9D. The user interface 94 is a user interface displayed on the slave device 200. The user interface 94 may include a preview window 941.

The preview window 941 may be used to display images captured and processed by the camera of the slave device 200. Upon receiving the control command issued by the master device 100 to change the white balance mode to the incandescent lamp mode, the slave device 200 may change the white balance mode to the incandescent lamp mode. The preview window 941 may then display the image captured and processed by the camera of the slave device 200 in the incandescent lamp mode. Meanwhile, the slave device 200 may transmit this image to the master device 100. The master device 100 may display the image, as shown in fig. 9C.

In some embodiments, the slave device 200 may have some or all of the capabilities in the above list, and may also have other capabilities for controlling the shooting effect that are not mentioned in the window 921. The present application is not limited in this regard.

The master device 100 may also acquire its own capabilities for controlling the photographing effect and issue control instructions to itself. Fig. 10A to 10C show a set of user interfaces in which the host apparatus 100 controls its own photographing effect. Fig. 10A illustrates the user interface 101 in which the master device 100 acquires its own capability to control the photographing effect. The user interface 101 may include a setup control 1011.

The settings control 1011 may be used to display the capability options of the host device 100 for controlling its own shooting effect. A prompt, such as "click to adjust the home screen", may also be displayed next to the settings control 1011. The host device 100 may detect a user operation acting on the settings control 1011, and in response to the user operation, the host device 100 may display the list of its own shooting effect capabilities, as in the user interface 102 shown in fig. 10B.

The user interface 102 may include a window 1021. The window may display the capability options of the host device 100 for controlling the shooting effect, such as the aperture 1022, flash, intelligent follow, beauty, filters, etc. Taking the aperture as an example, the present embodiment describes how the host apparatus 100 transmits a control command for adjusting the photographing effect to itself.

The aperture 1022 may be adjusted in size through a dial 1024. The master device 100 may detect a user operation acting on the dial 1024, and in response to the user operation, the master device 100 may issue a control command to itself to adjust the aperture.

Specifically, the dial 1024 may initially rest at the "f/8" scale. The user can slide the float on the dial 1024 to the "f/1.7" aperture scale by a right-slide operation. The host device 100 may detect this user operation, and in response, the camera of the host device 100 may replace the "f/8" aperture with "f/1.7". Changing the aperture to "f/1.7" may result in a shallower depth of field; accordingly, the window displaying the image captured by the host device 100 may display an image with a shallower depth of field captured by the host device 100, as shown by the window 1031 in fig. 10C.

Fig. 10C illustrates the user interface 103 after the depth of field of the preview window of the main device 100 is reduced. The user interface 103 may include a preview window 1031. The preview window 1031 may be used to display images captured and processed by the camera of the host device 100.

Upon receiving the control command issued by the host device 100 to replace the "f/8" aperture with "f/1.7", the host device 100 may replace the "f/8" aperture with "f/1.7". The preview window 1031 may then display the image captured and processed by the master device 100 with an aperture size of "f/1.7".

Fig. 6A-6D, fig. 7A-7B, fig. 8A-8B, fig. 9A-9D, and fig. 10A-10C depict a series of user interfaces in which the master device 100 establishes a communication connection with the slave device 200 and controls the image capturing effect of the slave device 200 in a live scene. The above-described method of controlling the photographing effect of the slave device 200 may also be used in photographing scenes. A series of user interfaces in which the master device 100 establishes a communication connection with the slave device 200 and controls the photographing effect of the slave device 200 in the camera application scenario is described below.

Fig. 11A illustrates the user interface 111 on which the master device 100 adds a slave device. The user interface 111 may include an add control 1112 and a dialog 1113.

The master device 100 may detect a user operation acting on the add control 1112, in response to which the master device 100 may query for electronic devices having cameras. Upon receiving the messages returned by the electronic devices having cameras, the host device 100 may display a dialog box 1113.

Dialog 1113 may be used to present the information of the electronic devices having cameras. For example, dialog 1113 shows the information of two electronic devices with cameras (electronic device 1114, electronic device 1115). Similarly, this information includes the names, locations, and the like of the slave devices.

Taking the electronic device 1115 as an example, the master device 100 may detect a user operation acting on the electronic device 1115, and in response to the user operation, the master device 100 may send a request to use the camera of the electronic device 1115. The electronic device 1115 may detect an operation in which the user agrees to let the host device 100 use its camera, and in response to the operation, the electronic device 1115 may agree to let the host device 100 use the camera of the electronic device 1115. For the user interfaces of the above process in which the electronic device 1115 grants the usage authority to the host device 100, refer to the user interfaces in the live scene, as shown in fig. 6C-6D or fig. 7A-7B. This is not repeated here.

After the user of the slave device 200 agrees to let the master device 100 use the camera of the slave device 200, the slave device 200 may display the user interface 112, referring to fig. 11B, while the host device 100 may display the user interface 113, as shown in fig. 11C.

The user interface 112 is a user interface on the slave device 200, and exemplarily shows a user interface on which the slave device 200 displays prompt information. The user interface 112 may include a prompt window 1121 and a preview window 1122.

When the master device 100 is granted the use of the camera of the slave device 200, the slave device 200 may turn on its own camera; further, the preview window 1122 may display the images captured and processed by the camera of the slave device 200. The user interface 112 may also display a prompt window 1121. The prompt window 1121 prompts the user that the camera of this device is being used by another device, for example, "the current picture is being used by LISA".

As shown in fig. 11C, the user interface 113 is a user interface on the host device 100, and illustrates a user interface on which the host device 100 displays prompt information.

When the slave device 200 displays the user interface 112, the master device 100 may display the user interface 113. The user interface 113 may include a window 1131, a window 1132, and a prompt window 1134.

The window 1131 may be used to display images captured and processed by the camera of the host device 100.

Window 1132 may display images acquired and processed by the camera of the slave device 200. Upon receiving an image transmitted from the slave device 200, the window 1132 may display that image. The prompt window 1134 may be used to display prompt information, such as "connected camera: Phone P40-LouS". This prompt information may be used to remind the user that the master device 100 has connected to the slave device 200.

In some embodiments, the window 1131 and the window 1132 may also exchange display content; that is, the window 1131 may display the images acquired and processed by the camera of the slave device 200, and the window 1132 may be used to display the images captured and processed by the camera of the host device 100. In particular, when the window 1131 displays the images collected and processed by the camera of the slave device 200, the master device 100 may provide the delete key 1133 in the window 1131. In response to a user operation acting on the delete key 1133, the host device 100 may close the window 1131.

In some embodiments, the window 1132 may be displayed over the window 1131 in the form of a floating window in the user interface 113. In other embodiments, the window 1132 may also be tiled side by side with the window 1131; refer to fig. 11D.

Fig. 11D illustrates another user interface 114 on which the host device 100 displays prompt information. The user interface 114 may include a window 1141, a window 1142, a prompt window 1143, a settings control 1147, and a settings control 1148.

Window 1141 may be used to display images captured and processed by the camera of the host device 100.

Window 1142 may be used to display images captured and processed by the camera of the slave device 200. Likewise, the window 1141 and the window 1142 may also exchange display content; that is, the window 1141 may display the images captured and processed by the camera of the slave device 200, and the window 1142 may be used to display the images captured and processed by the camera of the host device 100. The present application is not limited in this regard. The prompt window 1143 is used to display prompt information.

The settings control 1147 may be used to display the shooting capability options of the camera of the host device 100. The settings control 1148 may be used to display the shooting capability options of the camera of the slave device 200.

The master device 100 may detect a user operation acting on the settings control 1148, in response to which the master device 100 may display the shooting capability options of the slave device 200. For these shooting capability options, refer to the shooting capability options displayed in the dialog box 921 in the user interface 92 shown in fig. 9B. This is not repeated here.

Further, the master device 100 may detect a user operation acting on a certain shooting capability option, and in response to the user operation, the master device 100 may transmit a control command to the slave device 200 to adjust the shooting effect.

Also taking the user interface 92 shown in fig. 9B as an example, in response to a user operation acting on the incandescent lamp 923 in the above-described dialog 921, the master device 100 may transmit, to the slave device 200, a control command to adjust the white balance mode to the incandescent lamp mode. In response to the control command, the slave device 200 may perform color temperature calibration in the incandescent lamp mode on the images captured by its camera. The slave device 200 may then transmit the images after the incandescent-lamp-mode color temperature calibration to the master device 100.

Upon receiving the image with the adjusted photographing parameters transmitted by the slave device 200, the master device 100 may display the image.

In some embodiments, the user interface 114 may also include a delete control 1144, an add control 1145, and a shoot control 1146.

Delete control 1144 may be used to close the window 1142. The add control 1145 may be used by the host device 100 to discover other electronic devices with cameras.

The host device 100 can detect a user operation acting on the shoot control 1146. In response to the user operation, the host device 100 may store the contents displayed in the windows 1131 and 1132 as pictures or videos. In a live broadcast or video call scenario, the live application/video call application may also obtain the above pictures or videos and send them to the server providing the live broadcast/video call.

In some embodiments, the number of slave devices is not limited to one. For example, the host device 100 may detect a user operation acting on an add control, such as the add control 914 of the user interface 91 or the add control 1145 of the user interface 114. In response to the user operation, the host device 100 may establish connections with other electronic devices having cameras. Further, the master device 100 may establish connections with a plurality of slave devices and use the images transmitted from the plurality of slave devices.

In some embodiments, the master device 100 may also display only the images sent by the slave device. For example, after the master device 100 establishes a communication connection with the slave device 200, the master device 100 may turn off its own camera and display only the images transmitted from the slave device 200.

The following describes a detailed flow of the cross-device collaborative shooting method provided in the embodiment of the present application. Fig. 12 shows a detailed flow of the cross-device collaborative shooting method. As shown in fig. 12, the method may include the steps of:

S101: the master device 100 establishes a communication connection with the slave device 200.

In a specific implementation, the communication connection established between the master device 100 and the slave device 200 may be a wired connection or a wireless connection as described above.

In some embodiments, the host device 100 may first discover other electronic devices with cameras in response to a received user operation (e.g., a user operation acting on the add control 614 shown in fig. 6B), and then send a connection request to the electronic device selected by the user. After the electronic device grants the request in response to a user operation (e.g., the user operation acting on the confirmation control 632 shown in fig. 6D), the host device 100 and the electronic device successfully establish a communication connection.

In other embodiments, the host device 100 may scan a two-dimensional code of the electronic device 200 and establish a connection with the electronic device 200. Specifically, the electronic device can display a two-dimensional code for using its camera. The master device 100 may scan and acquire the two-dimensional code, and in response to this operation, the master device 100 may transmit a use request to the electronic device and obtain the approval of the electronic device.

Without being limited to the above methods, the master device 100 may also establish the communication connection in other ways, such as a tap operation based on NFC technology. The embodiments of the present application are not limited in this regard.

Specifically, the master device 100 may search for other electronic devices with cameras through a device virtualization kit (DVKit), send a request for establishing a communication connection to a discovered electronic device, and establish the communication connection with the electronic device after the DVKit receives the message, fed back by that electronic device, agreeing to establish the communication connection. Further, the DVKit establishes the communication connection with the electronic device through a distributed device virtualization platform (DMSDP), and the DMSDP is specifically configured to establish sessions with the electronic device. The sessions include a control session and a data session.

At this time, the electronic device that has established the sessions with the master device 100 may be referred to as the slave device 200 of the master device 100.
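The discovery and session setup of S101 can be summarized as in the following sketch. DVKit and DMSDP are named by the embodiments, but the interfaces shown here (discoverCameraDevices, requestConnection, createControlSession, createDataSession) are hypothetical placeholders rather than the actual APIs.

```java
import java.util.List;

// Hypothetical sketch of S101: discover camera-equipped devices via DVKit,
// then let DMSDP establish a control session and a data session with the
// device the user selected; that device then acts as the slave device 200.
public class ConnectionSetup {

    interface DeviceVirtualizationKit {
        List<String> discoverCameraDevices();          // nearby/remote devices with cameras
        boolean requestConnection(String deviceId);    // returns true if the peer agrees
    }

    interface DistributedPlatform {                    // stands in for DMSDP
        void createControlSession(String deviceId);    // carries control commands
        void createDataSession(String deviceId);       // carries image streams
    }

    public static String connect(DeviceVirtualizationKit dvKit,
                                 DistributedPlatform dmsdp,
                                 String selectedDeviceId) {
        if (!dvKit.discoverCameraDevices().contains(selectedDeviceId)) {
            throw new IllegalArgumentException("device not discovered: " + selectedDeviceId);
        }
        if (!dvKit.requestConnection(selectedDeviceId)) {
            return null;                               // the peer rejected the request
        }
        dmsdp.createControlSession(selectedDeviceId);
        dmsdp.createDataSession(selectedDeviceId);
        return selectedDeviceId;                       // now referred to as slave device 200
    }
}
```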

S102: the master device 100 acquires the shooting capability information of the slave device 200 through the communication connection.

The DMSDP may register the slave device with the virtual camera HAL based on the session channels established between the master device 100 and the slave device 200. Meanwhile, the DMSDP may request the shooting capability of the slave device 200 from the slave device 200.

Specifically, the slave device 200 may acquire its shooting capability from its own camera service (camera service) module. The shooting capability includes the hardware capability of the camera, the software capability of image processing modules such as the ISP and the GPU, and some shooting capabilities combining the hardware capability and the software capability; for a specific description, refer to table 1 in fig. 3.

The slave device 200 may then transmit the above shooting capability to the master device 100. Upon receiving the shooting capability transmitted by the slave device 200, the DMSDP may transmit the shooting capability information to the virtual camera HAL module of the HAL layer. Further, the virtual camera HAL may also send the above shooting capability to the camera management (camera manager) module.

Table 1 shows the shooting capabilities that the slave device 200 may have, covering hardware capabilities and capabilities combining hardware and software, such as the number of cameras, camera IDs, pixels, aperture sizes, zoom ranges, filters, beauty, and the various shooting modes of the slave device 200. In particular, shooting modes such as the night view mode and the portrait mode involve not only the hardware capability of the camera of the slave device 200 but also the image processing capability of the slave device 200.
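The shooting capability information exchanged in S102 can be pictured as a simple descriptor such as the following sketch; the field names are illustrative assumptions based on the items listed for table 1, not a defined data format.

```java
import java.util.List;
import java.util.Map;

// Hypothetical descriptor for the shooting capability reported by the slave device 200.
public record ShootingCapability(
        int cameraCount,
        List<Integer> cameraIds,          // e.g. 1001, 1002 for slave-side cameras
        long pixels,                      // e.g. 8192 x 6144
        double minAperture,
        double maxAperture,
        double maxZoom,
        List<String> filters,
        List<String> shootingModes,       // night view, portrait, HDR, ...
        Map<String, String> extras) {

    public boolean supportsMode(String mode) {
        return shootingModes.contains(mode);
    }
}
```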

The device collaborative management module and the dynamic pipeline may learn that the slave device 200 is currently registered in the virtual camera HAL.

When the slave device 200 is registered in the virtual camera HAL, the device collaborative management module needs to perform collaborative management on the slave device 200. Specifically, when the master device 100 displays images from the master device 100 and the slave device 200 at the same time, the device collaborative management module can distinguish, according to the camera IDs, which camera of which device an image comes from, for example, camera 1 of the master device or camera 1001 of the slave device. The device collaborative management module may then keep the delay between displaying the images from the master device 100 and the slave device 200 within a range acceptable to the human eye by repeating or buffering the images of the master device 100.
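The synchronization behavior described above could be approximated as in the following sketch, in which frames are keyed by camera ID and the master's frame is repeated while the slave's frame lags; this is an illustrative assumption, not the actual algorithm of the embodiments.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of device collaborative management: pair the latest frames of the
// master (e.g. camera 1) and the slave (e.g. camera 1001) and keep repeating the master's
// previously displayed frame while the slave's frame is still too far behind.
public class FrameSynchronizer {

    record Frame(int cameraId, long timestampMs, byte[] data) { }

    private final Map<Integer, Frame> latest = new HashMap<>();
    private static final long MAX_SKEW_MS = 100;   // assumed delay acceptable to the eye

    public void onFrame(Frame frame) {
        latest.put(frame.cameraId(), frame);
    }

    // Returns the pair to display, or null while the frames are too far apart
    // (in which case the previously displayed master frame is simply repeated).
    public Frame[] framesToDisplay(int masterCameraId, int slaveCameraId) {
        Frame master = latest.get(masterCameraId);
        Frame slave = latest.get(slaveCameraId);
        if (master == null || slave == null) {
            return null;
        }
        long skew = Math.abs(master.timestampMs() - slave.timestampMs());
        return skew <= MAX_SKEW_MS ? new Frame[] { master, slave } : null;
    }
}
```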

Similarly, the dynamic pipeline may distinguish, according to information such as the camera ID, whether a control command flows to the master device 100 or the slave device 200, and send the control commands destined for the slave device 200 to the virtual camera HAL.

After the registration is completed, the virtual camera HAL may send a notification to the DVKit to change the connection state. In particular, the virtual camera HAL may inform the DVKit to change the unconnected state to the connected state. The unconnected state means that the host device 100 has not established a connection for using the camera of another electronic device; correspondingly, the connected state means that the host device 100 has established a connection for using the camera of another electronic device.

S103: the slave device 200 collects and processes images and then transmits the collected and processed images to the master device 100.

After the slave device 200 and the master device 100 successfully establish the communication connection, the slave device 200 may automatically start capturing and processing images.

Reference is made to the user interfaces shown in fig. 6D and fig. 8A. When the slave device 200 agrees to connect with the master device 100, the slave device 200 may open its own camera application; the user interface of the camera application is shown in fig. 8A. At the same time, the slave device 200 may display the images captured and processed by its own camera; see the preview box 812 in fig. 8A.

The slave device 200 may then transmit the images to the master device 100, and the master device 100 may display them. As shown in fig. 8B, the host device 100 may add a window 822 in the user interface 82. This preview window may display the images captured and processed by the slave device 200. Thus, the master device 100 can realize cross-device collaborative shooting and simultaneously display, on its display screen, the images acquired and processed by multiple cameras.

In some embodiments, the master device 100 may also only control and use the camera of the slave device 200, i.e., the master device 100 displays only the images of the slave device 200 and not the images captured and processed by the master device 100 itself.

In other embodiments, after the master device 100 and the slave device 200 establish a communication connection, the master device 100 may, in response to a user operation of turning on the camera of the slave device 200, transmit a control command to turn on the camera to the slave device 200. The slave device 200 may turn on its camera in response to the above command.

For example, after the user interface 63 shown in fig. 6D, the host device 100 may also display a prompt window. The prompt window may ask the user whether to turn on the camera of the slave device 200. When the master device 100 detects a user operation for turning on the camera of the slave device 200, the master device 100 may transmit a control command for turning on the camera to the slave device 200. In response to the control command, the slave device 200 may turn on its camera and obtain the images acquired and processed by the camera.

Further, the slave device 200 may transmit these images to the master device 100. The host device 100 may display the images, for example in the window 822 shown in fig. 8B. Meanwhile, the slave device 200 may display the images on its own display screen, for example in the user interface 81 shown in fig. 8A.

In some embodiments, the slave device 200 may not display the images acquired and processed by its own camera.

In the above two schemes, the shooting parameters used by the camera of the slave device 200 may be the default values. For example, the slave device 200 may by default use the rear normal camera for acquisition at a 2x focal length, with color temperature calibration in the default daylight mode, an aperture size of f/1.6, optical image stabilization on, flash off, a shutter time of 1/60, an ISO sensitivity of 400, 8192 x 6144 pixels, a 3:4 crop box, face/body beautification algorithms off, no filter, no sticker, and so on.
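These defaults might be represented as in the following sketch; the values mirror the example defaults listed in the preceding paragraph, and the class itself is a hypothetical illustration.

```java
import java.util.Map;

// Hypothetical snapshot of the default shooting parameters used by the slave device 200
// before the master device 100 issues any adjustment command.
public final class DefaultShootingParameters {

    public static Map<String, Object> defaults() {
        return Map.<String, Object>ofEntries(
                Map.entry("cameraId", 1002),          // rear normal camera (see table 1)
                Map.entry("zoom", 2.0),               // 2x focal length
                Map.entry("whiteBalance", "daylight"),
                Map.entry("aperture", "f/1.6"),
                Map.entry("opticalStabilization", true),
                Map.entry("flash", false),
                Map.entry("shutterTime", "1/60"),
                Map.entry("iso", 400),
                Map.entry("resolution", "8192x6144"),
                Map.entry("cropRatio", "3:4"),
                Map.entry("beautify", false),
                Map.entry("filter", "none"),
                Map.entry("sticker", "none"));
    }

    private DefaultShootingParameters() { }
}
```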

Then, the master device 100 may transmit to the slave device 200 a series of control commands for collaborative photographing, such as photographing or video recording commands, or control commands for adjusting the photographing effect, such as changing the filter. The slave device 200 may adjust its acquisition and processing of images according to the above commands.

S104: the master device 100 transmits a command for controlling the photographing effect to the slave device 200 in response to a received user operation.

The control command includes the following information: the shooting parameters adjusted in response to specific user operations, and the type of stream creation command (i.e., a preview command, a video command, or a photo command). The multi-stream configuration module configures different numbers and types of streams according to different stream creation commands.

Depending on the user operation received by the host device 100, the user-adjusted shooting parameters may include, but are not limited to: hardware parameters of the camera involved in capturing the image and/or software parameters involved in processing the image. The shooting parameters also include some combinations of hardware and software parameters, such as the hybrid zoom range, night mode, portrait mode, time-lapse shooting, slow motion, panoramic mode, HDR, etc.

The hardware parameters include one or more of the following: the camera ID, the optical zoom range, whether optical image stabilization is on, the aperture size adjustment range, whether the flash is on, whether the fill light is on, the shutter time, the ISO sensitivity value, the pixel count, the video frame rate, and so on.

The software parameters include one or more of the following: the digital zoom value, the image cropping size, the color temperature calibration mode of the image, whether noise reduction is applied to the image, the beauty/body-shaping type, the filter type, the sticker option, whether the self-timer mirror image is enabled, etc.

In some embodiments, the control command may further include default values of other shooting parameters. The above-mentioned other photographing parameters refer to photographing parameters other than the parameters adjusted by the user operation.
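Putting the pieces of S104 together, the control command could be modeled as in the following sketch: the stream-creation type, the parameters the user just adjusted, and the defaults for the remaining parameters. All names are hypothetical.

```java
import java.util.Map;

// Hypothetical model of the control command described in S104: the stream-creation
// type, the parameters the user just adjusted, and defaults for the other parameters.
public class ControlCommandModel {

    public enum StreamCommandType { PREVIEW, VIDEO, PHOTO }

    public record ControlCommand(
            StreamCommandType type,
            Map<String, Object> adjustedParameters,   // e.g. {"whiteBalance": "incandescent"}
            Map<String, Object> defaultParameters) {  // defaults for untouched parameters
    }
}
```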

The control command carrying the photographing parameters may be transmitted to the slave device 200 through the communication connection established between the master device 100 and the slave device 200. In particular, the virtual camera HAL may send the created control command to the distributed device virtualization platform (DMSDP), which may include a data session channel and a control session channel between the master device 100 and the slave device 200. The control command may be transmitted to the slave device 200 through the control session channel.

For example, the "set white balance mode of the

slave device

200 to incandescent lamp mode" control command may include: the stream creation command may be a preview command, with the shooting parameters modified (white balance=incandescent lamp mode). The control commands may also include default shooting parameters such as one-time focus, no filter, off beauty, etc.

As shown by the dialog box 921 in the user interface 92, according to the shooting capability information of the slave device 200 acquired in S102, the master device 100 may display, on its display screen, shooting capability options corresponding to the shooting capability. The host device 100 may detect a user operation acting on a certain shooting capability option. In response to the user operation, the master device 100 may transmit a command for controlling the photographing effect to the slave device 200.

Taking the control command "set the white balance mode of the slave device 200 to the incandescent lamp mode" as an example, the master device 100 may detect a user operation acting on the incandescent lamp mode, for example, a user operation acting on the incandescent lamp 923 in the user interface 92, and in response to the user operation, the master device 100 may transmit the above control command to the slave device 200 to control the slave device 200 to set the white balance mode to the incandescent lamp mode.
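For the white balance example, the master-side sequence could look like the following self-contained sketch; the ControlSession interface and the parameter names are assumptions used only for illustration.

```java
import java.util.Map;

// Hypothetical sketch: build the "white balance = incandescent" command and send it
// to the slave device 200 over the control session channel established in S101.
public class WhiteBalanceCommandExample {

    interface ControlSession {
        void send(int targetCameraId, String streamCommandType, Map<String, Object> parameters);
    }

    public static void onIncandescentSelected(ControlSession session) {
        Map<String, Object> parameters = Map.of(
                "whiteBalance", "incandescent",   // the parameter adjusted by the user
                "focus", "single",                // example defaults carried along
                "filter", "none",
                "beautify", false);
        // 1002 is assumed to be the rear normal camera of the slave device 200 (table 1).
        session.send(1002, "preview", parameters);
    }
}
```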

In creating and transmitting the control command, the master device 100 may also perform various processing on the command, such as dynamic pipelining, multi-stream configuration, and multiplexing control.

The dynamic pipeline process may refer to fig. 13.

When the application program generates the control command to set the white balance mode of the slave device 200 to the incandescent lamp mode, the camera application 131 may generate a control command carrying the preview control 1311 and the photographing effect adjustment 1312. The control command corresponds to a repeated frame control 1314. The repeated frame control may indicate that the control command acts on multiple frames. The repeated frame control may include the fields Cmd and Surfaces. The Cmd field may be used to represent the control command; in some embodiments, the Cmd field may also include the number of the control command. Surfaces may be used to receive the view of the rendered frames and send the result to SurfaceFlinger for image composition and display on the screen.

The dynamic pipeline may tag the above control command. The tags may include an on-demand tag, a flow-direction tag, and a repeated frame tag. The camera service 132 may include a pending command queue 1321. When the above repeated frame control command reaches the pending command queue 1321, it may replace a basic command 1322 (for example, "Cmd + StreamIds + Buffer") in the original pending command queue 1321, where the basic command further includes other default parameters for controlling the shooting effect, for example, the default white balance mode (daylight mode) and the default filter (no filter).

After replacement, an on-demand repeated frame control command 1324 (e.g., "Cmd + StreamIds + Buffer + incandescent lamp mode + IsNew + Repeat") is added to the pending command queue 1321. The repeated frame control command 1324 adds two fields, IsNew and Repeat. IsNew may be used to indicate that the command is an on-demand control issued by an application. Repeat may be used to indicate that the command is a repeated frame control. Meanwhile, the repeated frame control command 1324 may replace the original default white balance mode (e.g., the above-described daylight mode) with the incandescent lamp mode (i.e., white balance = incandescent lamp mode).
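
The following Java sketch illustrates one possible shape of such a tagged command and of the replacement step in the pending command queue; the class and field names are illustrative assumptions, not taken from the embodiment itself.

    import java.util.ArrayDeque;
    import java.util.Deque;

    // Hypothetical model of a control command carrying the tags described above.
    class TaggedCaptureCommand {
        String cmd;              // e.g. a white-balance adjustment command
        int[] streamIds;         // streams the command applies to
        byte[] buffer;           // payload carrying the shooting parameters
        String whiteBalance;     // "INCANDESCENT" replaces the default "DAYLIGHT"
        boolean isNew;           // on-demand tag: issued by the application
        boolean repeat;          // repeated frame tag: acts on every subsequent frame

        TaggedCaptureCommand(String cmd, int[] streamIds, byte[] buffer,
                             String whiteBalance, boolean isNew, boolean repeat) {
            this.cmd = cmd; this.streamIds = streamIds; this.buffer = buffer;
            this.whiteBalance = whiteBalance; this.isNew = isNew; this.repeat = repeat;
        }
    }

    // Sketch of the pending command queue: the basic (default) command is replaced
    // by the on-demand repeated frame command, as described for command 1324.
    class PendingCommandQueue {
        private final Deque<TaggedCaptureCommand> queue = new ArrayDeque<>();

        void add(TaggedCaptureCommand basic) {
            queue.addLast(basic);
        }

        void replaceBasicCommand(TaggedCaptureCommand onDemand) {
            queue.pollFirst();          // drop the basic command (e.g. white balance = DAYLIGHT)
            queue.addFirst(onDemand);   // enqueue the command with IsNew and Repeat set
        }
    }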

In addition, the dynamic pipeline may also add a flow direction tag indicating that the control command is to be sent to the slave device 200. For example, the dynamic pipeline may add a flow direction tag Device to the control command. Device may represent, by a camera number (ID), the object on which the control command acts. Referring to table 1, when Device=1, the control command is a control command for the front lens of the master device 100, and the local camera HAL of the master device 100 may receive the control command. When Device=1002, the control command is a control command for the rear-mounted normal lens of the slave device 200, and the virtual camera HAL of the master device 100 may receive the control command. Accordingly, the Device field of the control command that sets the white balance mode of the slave device 200 to the incandescent lamp mode may be set to 1002.
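
A minimal sketch of this routing step is shown below; the two camera IDs follow the examples just given for table 1, while the class structure and method names are assumptions for illustration.

    // Routes a control command according to its Device (camera ID) flow direction tag:
    // ID 1 is the front lens of the master device 100, ID 1002 is the rear normal
    // lens of the slave device 200 (per the example of table 1).
    class ControlCommandRouter {

        static final int FRONT_LENS_OF_MASTER = 1;
        static final int REAR_LENS_OF_SLAVE   = 1002;

        void dispatch(int deviceId, byte[] command) {
            if (deviceId == REAR_LENS_OF_SLAVE) {
                // Remote camera: the virtual camera HAL receives the command and
                // forwards it to the slave device 200 over the control session channel.
                sendToVirtualCameraHal(command);
            } else if (deviceId == FRONT_LENS_OF_MASTER) {
                // Local camera: the local camera HAL of the master device 100 receives it.
                sendToLocalCameraHal(command);
            }
        }

        void sendToVirtualCameraHal(byte[] command) { /* forward via DMSDP */ }
        void sendToLocalCameraHal(byte[] command)  { /* deliver to the local HAL */ }
    }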

The multi-stream configuration in the stream processing module may then add stream configuration information for the control command described above. The control command includes the preview control 1311 and the shooting effect 1312 of "set white balance mode to incandescent lamp mode". That is, the multi-stream configuration may configure one preview stream (1080P) and one analysis stream (720P) for the preview control 1311. The specific configuration rules of the multi-stream configuration may refer to the description of table 2 in fig. 3 and are not repeated here.

The multiplexing control may multiplex the plurality of streams configured by the multi-stream configuration module. The number of streams transmitted back from the slave device 200 to the master device 100 can be reduced through multiplexing control, thereby reducing the network load and improving transmission efficiency. In particular, multiplexing control may cover a low-quality stream with a high-quality stream. For example, a 720P analysis stream may be multiplexed onto a 1080P preview stream. Thus, for a control command requiring a preview stream (1080P) and an analysis stream (720P), the slave device 200 may transmit back only one 1080P preview stream.
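
The covering rule can be sketched as follows, assuming for illustration that an analysis stream can be reused from a preview stream and a photographed image from a video stream; the grouping and all names are assumptions of this sketch, not a definitive implementation.

    import java.util.ArrayList;
    import java.util.LinkedHashMap;
    import java.util.List;
    import java.util.Map;

    // Multiplexing sketch: within each group of mutually reusable streams, keep only
    // the highest-quality one, so that fewer streams travel back over the network.
    class MultiplexingControl {

        record Stream(String type, int height) { }   // e.g. ("ANALYSIS", 720)

        // Illustrative reuse groups (assumption): analysis can be derived from preview,
        // a photographed image can be derived from video.
        static String reuseGroup(Stream s) {
            switch (s.type()) {
                case "PREVIEW": case "ANALYSIS": return "preview-like";
                case "VIDEO":   case "PHOTO":    return "video-like";
                default:                          return s.type();
            }
        }

        static List<Stream> multiplex(List<Stream> configured) {
            Map<String, Stream> best = new LinkedHashMap<>();
            for (Stream s : configured) {
                best.merge(reuseGroup(s), s, (a, b) -> a.height() >= b.height() ? a : b);
            }
            return new ArrayList<>(best.values());
        }

        public static void main(String[] args) {
            // Preview control: {1080P preview, 720P analysis} -> {1080P preview}
            System.out.println(multiplex(List.of(
                    new Stream("PREVIEW", 1080), new Stream("ANALYSIS", 720))));
            // A photographing command configured with four streams -> {1080P preview, 4K video}
            System.out.println(multiplex(List.of(
                    new Stream("PREVIEW", 1080), new Stream("ANALYSIS", 720),
                    new Stream("VIDEO", 2160), new Stream("PHOTO", 2160))));
        }
    }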

Further, according to the flow direction tag Device=1002, the control command may be sent to the virtual camera HAL module of the HAL layer. As shown in fig. 13, the virtual camera HAL may filter the commands to pick out those issued on demand, such as the control command 1324. The filtered on-demand control commands may be stored in a send command queue 1331.

The master device 100 may send the control commands in the send command queue 1331 to the slave device 200.

In some embodiments, when the control command is a command to start photographing or video recording, the multi-stream configuration may further configure a photographing stream and a recording stream. The multiplexing control then changes accordingly with the multi-stream configuration.

Fig. 14 exemplarily shows the multiplexing and splitting of different streams when the control command is a photographing command. Fig. 14 may include a preprocessing part and a post-processing part. The preprocessing part may be implemented by the preprocessing module of the stream processing, and the post-processing part may be implemented by the post-processing module of the stream processing, both of which may refer to the description of fig. 3.

When the camera device session 141 sends a photographing command, the multi-stream configuration 142 module may configure a preview stream (1080P), an analysis stream (720P), a video stream (4K), and a photographing stream (4K) for the command. Then, the multiplexing control 143 module may multiplex the configuration of these four streams into a configuration of two streams: one preview stream (1080P) and one video stream (4K). Finally, the master device 100 transmits the photographing control command, after multi-stream configuration and multiplexing, to the slave device 200.

In the process in which the master device 100 transmits a command for controlling the shooting effect to the slave device 200 in response to a received user operation, the master device 100 may multiplex the image streams (streams) required in the control command. In response to the multiplexed control command, the slave device 200 may transmit only the multiplexed image stream, thereby reducing the number of image streams transmitted over the network, reducing the network load, and further improving transmission efficiency.

S105: The slave device 200 receives a command for controlling the shooting effect, and in response to the command, the slave device 200 may acquire and process an image.

The camera proxy service of the slave device 200 may receive a control command for adjusting the shooting effect transmitted from the master device 100. In particular, DMSDP may have established a data session channel and a control session channel between the master device 100 and the slave device 200. The camera proxy service module of the slave device 200 may receive the control command transmitted over the control session channel. In response to the control command, the slave device 200 can acquire and process an image according to the shooting parameters carried in the control command.

Specifically, the slave device 200 may perform a corresponding operation according to the shooting parameters carried in the control command. When the control command carries a hardware parameter, the slave device 200 may collect an image according to the hardware parameter, which may include but is not limited to: capturing images using the front or rear camera, the optical zoom range, optical image stabilization on, flash on, fill light on, a frame rate of 30 fps, etc., as indicated in the control command. When the control command carries a software parameter, the slave device 200 may process the acquired image according to the software parameter, which may include but is not limited to: cropping, color temperature calibration, noise reduction, adding filter effects, and the like.
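
The split between the two kinds of parameters might look roughly like the sketch below. The class names and the RGB gains used for the incandescent color-temperature calibration are illustrative assumptions only.

    // Sketch: hardware parameters steer image acquisition, software parameters steer
    // post-processing of the acquired frame.
    class ShootingParameterHandler {

        // Hardware-side settings mentioned in the text (values taken from the command).
        static class CaptureSettings {
            boolean useRearCamera, opticalStabilization, flashOn, fillLightOn;
            double opticalZoom = 1.0;
            int frameRate = 30;
        }

        static void applyHardwareParameters(CaptureSettings settings) {
            // In a real implementation these would be forwarded to the camera driver
            // before the image is collected.
        }

        // Software parameter example: incandescent color-temperature calibration,
        // approximated here as per-channel white-balance gains (illustrative values).
        static int[] applyIncandescentWhiteBalance(int r, int g, int b) {
            double rGain = 0.80, gGain = 1.00, bGain = 1.45;
            return new int[] {
                (int) Math.min(255, Math.round(r * rGain)),
                (int) Math.min(255, Math.round(g * gGain)),
                (int) Math.min(255, Math.round(b * bGain)),
            };
        }
    }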

Also taking the control command of "setting the white balance mode of the

slave device

200 to the incandescent lamp mode" as an example, after receiving the command, the slave device may adapt and parse the command, and then send the command to the local camera HAL of the

slave device

200, and finally obtain the processing result.

Part of fig. 13 illustrates the procedure by which the slave device 200 processes the above control command.

The camera proxy service 134 may include a receive command queue 1341, a Surface mapping table 1344, and a repeated frame control 1342. The receive command queue 1341 may receive the control command sent by the master device 100 to set the white balance mode of the slave device 200 to the incandescent lamp mode. The control command may refer to command 1343 (i.e., "Cmd + StreamIds + Repeating + incandescent lamp mode + 1002, one 1080P preview stream"). The incandescent lamp mode may represent setting the white balance mode of the slave device 200 to the incandescent lamp mode, and 1002 may identify the object to which the command is sent as the camera 1002 of the slave device 200.

First, the camera proxy service 134 can convert the StreamIds of a control command in the receive command queue into the Surfaces corresponding to the slave device 200 according to the Surface mapping table 1344. In the camera service, there is a one-to-one correspondence between streamId and surfaceId, i.e., one streamId corresponds to one surfaceId. A surfaceId may be used to identify a Surface. The Surfaces described above may act as carriers to display the images of the streams generated by the slave device 200. Referring to the description of the foregoing embodiments, a streamId may indicate a particular stream and a Surface may indicate the carrier on which a particular image is rendered for display. The slave device 200 needs to generate the corresponding streams according to the request sent by the master device 100; therefore, after receiving the control command, the slave device 200 needs to convert the StreamIds into their corresponding Surfaces according to the mapping table 1344.
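
A simple sketch of this conversion, assuming a plain lookup table; the Surface type below is a stand-in used for illustration, not the actual platform class.

    import java.util.HashMap;
    import java.util.Map;

    // Converts the StreamIds carried in a received control command into the Surfaces
    // that will carry the corresponding image streams on the slave device.
    class SurfaceMappingTable {

        static final class Surface {
            final int surfaceId;
            Surface(int surfaceId) { this.surfaceId = surfaceId; }
        }

        // One streamId corresponds to exactly one surfaceId.
        private final Map<Integer, Surface> table = new HashMap<>();

        void register(int streamId, Surface surface) {
            table.put(streamId, surface);
        }

        Surface[] toSurfaces(int[] streamIds) {
            Surface[] surfaces = new Surface[streamIds.length];
            for (int i = 0; i < streamIds.length; i++) {
                surfaces[i] = table.get(streamIds[i]);   // assumed to be registered beforehand
            }
            return surfaces;
        }
    }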

The pending command queue 1351 may then replace the basic command in the original queue with the repeated frame control described above, resulting in a pending command 1352. The slave device 200 may send the command to the command processing queue 1361 of the camera 1002 according to Device=1002. In response to the command, the ISP of the camera 1002 may set the white balance mode to the incandescent lamp mode, so that the ISP of the camera 1002 performs incandescent color-temperature calibration on the images collected by the camera 1002. The ISP may be an image video processor (IVP), a neural-network processing unit (NPU), a digital signal processor (DSP), or the like, which is not limited in this application.

After the white balance mode is adjusted, the slave device 200 can obtain a new image.

Referring to the user interface 91 shown in fig. 9A, before responding to the above control command to set the white balance mode of the slave device 200 to the incandescent lamp mode, the image acquired and processed by the camera 1002 of the slave device 200 may be displayed as shown in the window 912. As shown in the window 912, under the default white balance option (e.g., daylight mode), the image captured and processed by the camera 1002 may be less bright and the picture may appear grayish. After responding to the control command to set the white balance mode of the slave device 200 to the incandescent lamp mode, the image acquired and processed by the camera 1002 may be shown as in the window 931 in the user interface 93. At this time, the brightness of the image is higher, and the overall tone of the picture is closer to the colors observed by human eyes.

It is to be understood that the above description of the image effects before and after the slave device 200 responds to the control command for adjusting the shooting effect is merely an example, and should not be taken as limiting the embodiments of the present application.

S106: the

slave device

200 displays the processed image.

This step is optional. As described in S103, in some embodiments, while its camera is being used by the master device 100, the slave device 200 may also use the camera itself and display the image acquired and processed by its own camera on its own display screen; refer to the user interface 81 shown in fig. 8A. Accordingly, after responding to a control command for adjusting the shooting effect, the slave device 200 may also display the updated image, such as the user interface 94 shown in fig. 9D.

In other embodiments, the slave device 200 may not display the images acquired and processed by its own camera when the camera is turned on. Therefore, after the slave device 200 responds to the control command for adjusting the shooting effect transmitted from the master device 100, the slave device 200 may not display the adjusted image. That is, the slave device only transmits the acquired images to the master device 100 for use, and the display screen of the slave device does not display the images acquired and processed by its own camera.

S107: the

slave device

200 transmits the processed image to the

master device

100.

In response to a control command for adjusting the shooting effect transmitted from the master device 100, the slave device 200 can obtain a new image. According to the type and number of streams required by the master device 100, the slave device 200 may generate the required streams accordingly. In some embodiments, the slave device 200 may capture one set of image streams. This set of image streams is the one with the highest image quality among the types and numbers of streams required by the master device 100; the highest image quality may be, for example, the highest resolution. The slave device 200 may then copy the captured set of image streams into a plurality of streams, and compress, adjust, etc. the copies according to the type and number of streams required by the master device 100, for example, deriving a set of 720P analysis streams from the captured set of 1080P streams.
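
A sketch of this capture-then-derive step is given below, using resolution as the only quality measure; the types and method names are assumptions for illustration.

    import java.util.ArrayList;
    import java.util.List;

    // The slave device captures only the highest-quality stream that the master device
    // requires, then copies and down-converts it into the remaining required streams.
    class StreamGenerator {

        record RequiredStream(String type, int height) { }   // e.g. ("ANALYSIS", 720)
        record Frame(String type, int height) { }            // stand-in for real image data

        static List<Frame> generate(List<RequiredStream> required) {
            // Pick the required stream with the highest image quality (here: resolution).
            RequiredStream best = required.get(0);
            for (RequiredStream r : required) {
                if (r.height() > best.height()) best = r;
            }
            Frame captured = new Frame(best.type(), best.height());   // single capture

            // Copy the captured stream and adjust each copy (compress, scale, ...) so
            // that it matches one required stream, e.g. derive a 720P analysis stream
            // from a captured 1080P stream.
            List<Frame> generated = new ArrayList<>();
            for (RequiredStream r : required) {
                generated.add(new Frame(r.type(), Math.min(r.height(), captured.height())));
            }
            return generated;
        }
    }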

Also taking the control command of "set white balance mode of

slave device

200 to incandescent lamp mode" as an example, in the foregoing description, the multiplexing control module may multiplex two streams configured by the multi-stream configuration module into one stream. Thus, the

master device

100 finally transmits a control command requiring one 1080P preview stream to the

slave device

200. Reference is made specifically to the description of S103.

In response to the control command sent by the master device 100, the local camera HAL of the slave device 200 may generate a 1080P preview stream. This preview stream has been processed with the white balance mode set to the incandescent lamp mode. The local camera HAL may then send the stream to the camera proxy service of the slave device 200, which may transmit the stream back to the master device 100.

In some embodiments, more than one stream is returned from the slave device 200 to the master device 100, such as in the multi-stream scenario illustrated in fig. 14. When the master device 100 transmits the photographing control command to the slave device 200, the multi-stream configuration and multiplexing control finally configure one preview stream (1080P) and one video stream (4K) for the photographing control command; reference may be made to the description of S103.

In response to the photographing control command described above, the slave device 200 may generate one 1080P preview stream (stream 1), one 4K video stream (stream 2), and one photographed image stream (stream 3). In an alternative embodiment, the slave device 200 may capture one set of image streams according to stream 2. From stream 2, the slave device 200 may replicate two further copies (stream 1' and stream 2'), and then process stream 1' and stream 2' according to the requirements of the master device 100, e.g., compress the 4K stream 1' to obtain the 1080P stream 1, obtain the photographed image from the 4K stream 2', and so on. The camera proxy service of the slave device 200 may then transmit the three streams to the master device 100.

S108: the

master device

100 displays an image transmitted from the

slave device

200.

The master device 100 may restore the streams returned from the slave device 200 to the number and type of streams actually required by the master device 100. The analysis stream may be used for image processing, the preview stream may be used for display, and so on. The master device 100 may send each stream to the corresponding module for processing and use according to the type of the restored stream. Using the preview stream transmitted by the slave device 200, the master device 100 can display the image.

The post-processing module of the master device 100 may perform intelligent splitting and multi-stream output on the streams returned from the slave device 200.

First, the intelligent splitting module can record the number and types of streams configured by the multi-stream configuration module. Upon receipt of the multiplexed stream returned from the slave device 200, the intelligent splitting module may replicate the received stream into a plurality of streams according to the above records. The replicated streams may have the same or slightly lower quality than the received stream, so as to obtain streams of the same number and types as those configured by the multi-stream configuration module. The split streams may then be sent to the corresponding modules for processing and use.
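
The splitting step can be sketched as the inverse of the multiplexing step, again using only resolution to stand in for quality; names are illustrative assumptions.

    import java.util.ArrayList;
    import java.util.List;

    // The intelligent splitting module records the streams originally configured by the
    // multi-stream configuration module and, when a multiplexed stream comes back from
    // the slave device, replicates it into that original set so that each stream can be
    // handed to its consumer (display, analysis, and so on).
    class IntelligentSplitter {

        record StreamSpec(String type, int height) { }

        private final List<StreamSpec> configured = new ArrayList<>();

        void recordConfiguredStreams(List<StreamSpec> specs) {
            configured.addAll(specs);
        }

        // Each copy keeps the received quality or is converted down, e.g. a returned
        // 1080P preview stream is replicated into a 1080P preview stream and a 720P
        // analysis stream.
        List<StreamSpec> split(StreamSpec received) {
            List<StreamSpec> out = new ArrayList<>();
            for (StreamSpec spec : configured) {
                out.add(new StreamSpec(spec.type(), Math.min(spec.height(), received.height())));
            }
            return out;
        }
    }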

For example, in the example in which the white balance mode of the slave device 200 is set to the incandescent lamp mode, the master device 100 may receive the 1080P preview stream returned from the slave device 200. At this time, the intelligent splitting module may restore this preview stream to the one preview stream (1080P) and one analysis stream (720P) configured by the multi-stream configuration module.

Fig. 14 also shows another example of intelligent splitting. The figure shows the splitting process when the master device 100 receives multiple streams back from the slave device 200.

As shown, the master device 100 may receive a 1080P stream (stream 1), a 4K stream (stream 2), and a picture stream (stream 3). The intelligent splitting module may split stream 1 into a 1080P preview stream and a 720P analysis stream. Stream 2 may be split into a 1080P stream and a 4K video stream, and that 1080P stream may be split further with reference to the splitting of stream 1. Stream 3 may be split into a 4K photo stream.

The splitting process restores the multiplexed streams to the streams originally required by the master device 100. In this way, the reduced network load and improved transmission efficiency brought by multiplexing are retained while the original requirements of the application program are still met, so that normal use of the application program is not affected.

The multiple streams obtained after splitting can be respectively sent to the corresponding modules of the application layer for the application programs to use. Based on this, the application may display the preview stream. In some embodiments, the master device 100 may display only the image returned from the slave device 200. In other embodiments, the master device 100 may display both the image returned from the slave device 200 and the image captured and processed by its own camera, referring to the user interface 93 shown in fig. 9C.

As shown in fig. 9C, the master device 100 may display the image returned from the slave device 200 in the form of a floating window; refer to the window 931. In some embodiments, the master device 100 may detect a user operation on the window 931, and in response to the operation the master device 100 may move the floating window to any position of the user interface 93. This operation is, for example, a long-press drag operation. In some embodiments, the master device 100 may detect another user operation on the window 931, and in response to the operation the master device 100 may exchange the content displayed in the floating window and in the preview window. This operation is, for example, a click operation. In addition, the master device 100 may also adjust the size of the floating window, etc., in response to other operations of the user.

In other embodiments, when displaying both the image returned from the slave device 200 and the image captured and processed by its own camera, the master device 100 may also display them in two tiled windows, such as in the user interface 114.

It will be appreciated that when the master device 100 is connected to a plurality of slave devices, the master device 100 may display the images returned by the plurality of slave devices. Likewise, the master device 100 may display them in floating windows or in tiled windows, which is not limited in this embodiment.

In some embodiments, when the control command is a photographing command with adjusted parameters, the frame synchronization module may synchronize the streams returned from the slave device 200 before the post-processing splits them. Frame synchronization can reduce delay errors caused by network transmission. Particularly for the photographing stream, the photographing result obtained with frame synchronization can be closer to what the user intended.

Referring to fig. 15, fig. 15 is a schematic diagram of a frame synchronization method.

As shown in fig. 15, the frame synchronization may include three parts. The first portion 151 and the third portion 153 may represent processes occurring on the master device 100. The second portion 152 may represent a process occurring on the slave device 200. Taking a stream of 7 frames and a photographing command as an example, the master device 100 performs frame synchronization on the photographing command.

At the second frame 1511, the master device 100 may create a photographing command 1512. The master device 100 may then send the command to the slave device 200 via a wired or wireless connection. The slave device 200 receives the command at the third frame 1521. Thus, the slave device 200 takes the image of the third frame 1521 as the result of executing the photographing command 1512. Having processed the photographing command, the slave device 200 may transmit the processing result back to the master device 100. The processing result received by the master device 100 is the image of the third frame 1521. At this time, the master device 100 may perform synchronization processing on this result.

In a photographing scene, frame synchronization may move the processing result to an earlier frame. For example, the processing result received by the master device 100 is the image of the third frame 1521. The master device 100 may advance the result by one frame and take the second frame 1511 as the synchronized processing result. It will be appreciated that the slave device 200 receiving the command one frame later is only an example of network delay. That is, in some embodiments, the time at which the slave device 200 receives the command may also be the fourth frame, the fifth frame, and so on. The delay with which the slave device 200 receives the command varies with the actual network communication quality. Likewise, the master device 100 advancing the result by one frame as the synchronization result is also merely an example.

In some embodiments, the master device 100 may not take the second frame 1511 as the synchronized processing result. For example, the master device 100 may create a photographing command at the second frame, and the slave device 200 may receive the command at the fourth frame. Accordingly, the slave device 200 takes the image of the fourth frame as the result of executing the photographing command. The master device 100 performs forward synchronization processing on the received fourth-frame image to obtain the third-frame image.
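
The synchronization itself reduces to shifting the returned result earlier by the estimated network delay, as in the following sketch; the one-frame delay mirrors the example above, and real delays vary with network quality.

    // Frame synchronization sketch for a photographing command: the result returned by
    // the slave device corresponds to the frame at which the slave received the command,
    // so the master device moves it earlier by the estimated delay (in frames) to line
    // it up with the frame at which the command was created.
    class FrameSynchronizer {

        static int synchronize(int frameOfReturnedResult, int estimatedDelayInFrames) {
            return Math.max(0, frameOfReturnedResult - estimatedDelayInFrames);
        }

        public static void main(String[] args) {
            System.out.println(synchronize(3, 1));   // -> 2, as in the fig. 15 walkthrough
            System.out.println(synchronize(4, 1));   // -> 3, as in the alternative example above
        }
    }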

When the control command is a command for adjusting only parameters or a video recording command, for example, the above control command of "set the white balance mode to the incandescent lamp mode" for controlling the shooting effect, the master device 100 may not perform frame synchronization on the received images.

S109: the

main apparatus

100 performs processing such as photographing, video recording, and forwarding on the displayed image.

When the application of the master device 100 obtains the image transmitted from the slave device 200, the master device 100 may further utilize the image.

As shown in fig. 9C, in a live-broadcast application scenario, the live application of the master device 100 may forward the obtained image. Specifically, the master device 100 may forward the image to a live-broadcast server. The server may distribute the image to the user devices viewing the live broadcast.

As shown in fig. 11D, in a photographing or video recording application scenario, the master device 100 may detect a user operation acting on the shooting control 1146, and in response to the operation, the master device 100 may save the image acquired from the camera of the slave device 200 as a photograph or a video recording.

In the embodiments of the present application:

In the above-described method S103, before receiving the control command for adjusting the shooting effect transmitted by the master device, the image acquired and processed by the slave device may be referred to as a first image, for example, the image shown in the window 912 in the user interface 91.

In the above-described method S108, after the master device transmits a control command for adjusting the shooting effect to the slave device, the image collected and processed by the slave device according to the control command may be referred to as a second image, such as the image shown in the window 931 in the user interface 93.

First shooting parameters: the shooting parameters used by the slave device to acquire and process the first image may be referred to as first shooting parameters. The first shooting parameters may be default parameters, or the shooting parameters carried in a control command previously received from the master device.

Second shooting parameters: the shooting parameters used by the slave device to acquire and process the second image may be referred to as second shooting parameters, that is, shooting parameters carried in a control command for adjusting the shooting effect sent by the master device to the slave device.

The image acquired and processed by the master device according to the default shooting parameters of its own camera may be referred to as a third image, for example, the image displayed in the window 911 in the user interface 91.

The user can adjust the shooting effect of the master device, and in response to the adjustment operation, the master device may control its own camera to adjust the shooting parameters. The new image acquired and processed by the master device according to the adjusted shooting parameters may be referred to as a fourth image, such as the image displayed in the window 1031 in the user interface 103.

In the above method S104, the number of streams configured by the master device according to the requirements of the application layer is a first number, and the number of streams sent to the slave device after the master device multiplexes the reusable streams is a second number. Likewise, the type of stream configured by the master device according to the requirements of the application layer is a first type, and the type of stream sent to the slave device after multiplexing is a second type.

Another possible software structure of the master device 100 is described below.

The difference from the software architecture shown in fig. 3 is that the software framework provided in this embodiment places the management of the entire stream lifecycle in the HAL layer. That is, in this embodiment, the stream processing module of the service layer in fig. 3 is moved to the HAL layer, and the other modules remain unchanged. For the other parts of this software structure, reference may be made to the description of fig. 3, which is not repeated here.

Implementing the above-mentioned stream preprocessing and post-processing operations, such as multi-stream configuration and multiplexing control, in the HAL layer can achieve higher processing efficiency.

In cross-device collaborative shooting, a master device can be connected with one or more slave devices, so that multi-view shooting experience can be provided for a user, and shooting effects of the slave devices, such as focusing, exposure, zooming and the like, can be controlled, and therefore requirements of the user for controlling far-end shooting effects are met. Furthermore, by implementing the cross-device collaborative shooting method, the master device can acquire all preview pictures, shooting results and video recording results of the slave device. For example, the master device may store the picture of the slave device by photographing or video recording, or in a live application scenario, forward the picture of the slave device to a third party server, etc.

By implementing the cross-device collaborative shooting method, distributed control among devices equipped with operating systems can be realized, the functions of an electronic device can be extended to other common hardware shooting devices, and the available lenses can be flexibly expanded. For example, the method can allow a mobile phone to obtain the data streams of multiple devices and realize collaborative recording between the mobile phone and a large screen, a watch, an in-vehicle device, and the like.

In addition, the multiplexing and splitting of streams in cross-device collaborative shooting can also reduce the network transmission load, thereby improving transmission efficiency, further reducing transmission delay, and ensuring clear and smooth image quality.

The method, system, and devices for cross-device collaborative shooting can be further extended to a distributed audio scenario, for example, by applying the distributed camera framework to a distributed audio framework. The distributed audio scenario can unify distributed audio and video and improve the efficiency of overall cross-device communication.

As used in the above embodiments, the term "when …" may be interpreted to mean "if …" or "after …" or "in response to determination …" or "in response to detection …" depending on the context. Similarly, the phrase "at the time of determination …" or "if detected (a stated condition or event)" may be interpreted to mean "if determined …" or "in response to determination …" or "at the time of detection (a stated condition or event)" or "in response to detection (a stated condition or event)" depending on the context.

In the above embodiments, it may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer commands. When the computer program command is loaded and executed on a computer, the flow or functions described in the embodiments of the present application are all or partially produced. The computer may be a general purpose computer, a special purpose computer, a computer network, or other programmable apparatus. The computer commands may be stored in or transmitted from one computer readable storage medium to another, for example, from one website, computer, server, or data center by wired (e.g., coaxial cable, fiber optic, digital subscriber line), or wireless (e.g., infrared, wireless, microwave, etc.). The computer readable storage medium may be any available medium that can be accessed by a computer or a data storage device such as a server, data center, etc. that contains an integration of one or more available media. The usable medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., solid state disk), etc.

Those of ordinary skill in the art will appreciate that implementing all or part of the above-described method embodiments may be accomplished by a computer program commanding associated hardware, the program being stored on a computer readable storage medium, the program when executed comprising the process of the above-described method embodiments. And the aforementioned storage medium includes: ROM or random access memory RAM, magnetic or optical disk, etc.