
CN107103840B - Foldable device and control method thereof - Google Patents

  • Fri Mar 20 2020

Detailed Description

This application claims priority to Korean Patent Application Nos. 10-2014-14-0062544, 10-2014-0127683, and 10-2015-0026080, filed in the Korean Intellectual Property Office on 23/5/2014, 24/9/2014, and 1/24/2015, respectively, the disclosures of which are incorporated herein by reference in their entireties.

Advantages and features of exemplary embodiments and methods of implementing exemplary embodiments may be understood more readily by reference to the following detailed description of specific exemplary embodiments and the accompanying drawings. The inventive concept may be embodied in many different forms and should not be construed as limited to the exemplary embodiment or exemplary embodiments set forth herein. Rather, the one or more exemplary embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of one or more exemplary embodiments to those skilled in the art, and the present concept will be defined only by the appended claims and equivalents thereof.

All terms (including descriptive terms or technical terms) used herein should be interpreted as understood by one of ordinary skill in the art. However, these terms may have different meanings according to the intention of one of ordinary skill in the art, precedent cases, or appearance of new technology. Further, some terms may be arbitrarily selected by the applicant, and in this case, the meanings of the selected terms will be described in detail in the detailed description of the present invention.

Throughout the specification, when a component "comprises" an element, the component may also comprise other elements unless there is a specific description to the contrary. Furthermore, throughout the specification, the term "unit" means a software component, a hardware component, or a combination of software and hardware components (e.g., a Field Programmable Gate Array (FPGA) or an Application Specific Integrated Circuit (ASIC)), and performs a specific function. However, the term "unit" is not limited to software or hardware. A "unit" may be formed in an addressable storage medium or may be formed to operate on one or more processors. Thus, for example, the term "unit" may refer to components (e.g., software components, object-oriented software components, class components, and task components) and may include processes, functions, attributes, procedures, subroutines, segments of program code, drivers, firmware, microcode, circuitry, data, a database, data structures, tables, arrays, or variables. The functionality provided by the components and "units" may be combined into a smaller number of components and "units" or may be divided into additional components and "units".

Throughout the specification, "foldable device" means a foldable electronic device or apparatus.

Further, the term "fold" may refer to deforming a flat electronic device such that two facing surfaces of the electronic device are brought close enough to each other that they almost contact each other (refer to figs. 1 and 2). The flat electronic device may be deformed by folding with a hinge or by bending.

For example, when the hinge is installed on the foldable device 100, the foldable device 100 may be folded in such a manner that two surfaces of the foldable device 100 are folded by the hinge to such an extent that the two surfaces are in contact with each other or almost in contact with each other to become parallel or almost parallel. Further, when the foldable device 100 is formed of a flexible material, the foldable device 100 may be folded along an arbitrary line to the extent that two surfaces of the foldable device 100 approach each other and become parallel.

One or more exemplary embodiments will now be described more fully with reference to the accompanying drawings, in which certain exemplary embodiments are shown. In the following description, well-known functions or constructions are not described in detail since they would obscure the disclosure with unnecessary detail.

As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items. When a statement such as "at least one of ..." follows a list of elements, it modifies the entire list of elements rather than the individual elements listed.

Hereinafter, the term "simultaneously displayed" is used when a screen of an A application and a screen of a B application are displayed on a device at the same time.

Fig. 1A illustrates a device 100 (i.e., electronic device 100) folded with a hinge according to an exemplary embodiment.

As shown in fig. 1A, the device 100 may be a foldable electronic device 100 that is folded using a hinge. As shown in fig. 1A, the electronic device 100 may be used in a folded state or may be used in an unfolded state. The device 100 may switch its screen according to the state of the electronic device 100.

Throughout the specification, the state of an electronic device may be defined as one of two states: a folded state and an unfolded state. The folded state and the unfolded state will be described in more detail later.

Further, throughout the specification, the state of the electronic device 100 is changed from the open state to the folded state by the folding motion. The electronic device 100 changes from the folded state to the unfolded state by the unfolding motion.

According to one or more exemplary embodiments, the device 100 may include a smartphone, a tablet Personal Computer (PC), a notebook, a wearable device, an electronic book, or the like.

In an exemplary embodiment, when the execution screen 110 of the first application is displayed on the device 100, if the device 100 detects a folding motion or an unfolding motion, the electronic device 100 may simultaneously display the execution screen 110 of the first application and the execution screen 120 of the second application. The second application may vary according to various exemplary embodiments.

Throughout the specification, "the execution screen of the first application", "the first application screen", or "the screen of the first application" refers to a screen displayed on the display as a result of executing the first application. Further, "the execution screen of the second application", "the second application screen", or "the screen of the second application" refers to a screen displayed on the display as a result of executing the second application.

Referring to fig. 1A, when the electronic device 100 is in a folded state, the areas of the two surfaces facing each other are the same. Accordingly, the electronic device 100 may be symmetrically folded, but this is merely an example, and the areas of the two surfaces facing each other in the folded state may be different. Thus, the electronic device 100 may be folded asymmetrically.

Fig. 1B illustrates an electronic device 100 folded by bending according to another exemplary embodiment.

As shown in fig. 1B, the device 100 may be bent along an arbitrary line, and since the electronic device 100 is bent, a region where the arbitrary line exists may be bent to a predetermined degree. Further, when two surfaces facing each other become parallel due to the bending of the electronic device, the device 100 may be in a folded state.

Fig. 2 is a block diagram of an apparatus 100a according to another example embodiment.

The apparatus 100a includes a display 210 (i.e., a display unit), a state detector 220 (i.e., a state detection unit), and a controller 230 (i.e., a control unit).

The display 210 displays a screen of an application run by the controller 230, a user interface screen, or a screen showing a state of the device 100a. The controller 230 runs at least one application and controls the display 210 to display a screen of the at least one running application on the display 210. When the execution screen of at least one first application is displayed, if the controller 230 detects a folding motion or an unfolding motion with respect to the apparatus 100a, the controller 230 causes the execution screen of the at least one first application and the execution screen of at least one second application to be simultaneously displayed.

For example, as shown in fig. 1A, when the apparatus 100a in the folded state displays the execution screen 110 of the first application, if the apparatus 100a is unfolded and the controller 230 thus detects the unfolding motion, the apparatus 100a may simultaneously display the execution screen 110 of the first application and the execution screen 120 of the second application.

In one or more exemplary embodiments, when the execution screen of the first application and the execution screen of the second application are displayed in the folded state and the opened state, the execution screens may display the same contents, and only the sizes of the execution screens may be different.

In one or more exemplary embodiments, when the execution screen of the first application and the execution screen of the second application are displayed in the folded state and the opened state, a layout of each execution screen may become different. For example, the position or size of the menu on each screen may become different, or the position or size of the icon may become different, according to the folded state or the opened state.

The controller 230 may generally control the operation of the device 100a. For example, the controller 230 may run and control an Operating System (OS) of the device 100a, may process various types of data, and may control elements of the device 100a (including the display 210, the state detector 220, etc.).

The state detector 220 detects a folding motion and an unfolding motion for the apparatus 100a.

In one or more exemplary embodiments, the state detector 220 may detect a movement that changes the state of the apparatus 100a from the folded state to the open state, or a movement that changes the state of the apparatus 100a from the open state to the folded state.

The device 100 may include a sensor 7880 (see fig. 78). The sensor 7880 may sense a state of the device 100, a motion of the device 100, or a situation around the device 100, and may transmit the sensed information to the controller 230.

The sensor 7880 may include at least one of the following sensors: a magnetic sensor, an acceleration sensor, a hall sensor, a bending sensor, a gyroscope sensor, a proximity sensor, a temperature/humidity sensor, an infrared sensor, a location sensor (e.g., a Global Positioning System (GPS) sensor), an atmospheric pressure sensor, and an RGB sensor (i.e., an illuminance sensor), but one or more exemplary embodiments are not limited thereto. The function of each sensor may be understood by one of ordinary skill in the art based on the name of the sensor, or is well known in the related art.

In one or more exemplary embodiments, the state detector 220 of the apparatus 100a may detect a movement that changes the state of the apparatus 100a to an open state or a folded state by using the sensor 7880. For example, the state detector 220 may detect the folding motion or the unfolding motion by using a hall sensor or a magnetic sensor provided in the folding area.

In one or more exemplary embodiments, the state detector 220 may detect whether the current state of the apparatus 100a is an open state or a folded state, and the state detector 220 may detect a folding motion or an unfolding motion when the current state is changed.

In one or more exemplary embodiments, the state detector 220 may measure a bending angle or a folding angle of the apparatus 100a. If the device 100a has a hinge, the state detector 220 may measure the folding angle of the hinge.

In one or more exemplary embodiments, a state detection sensor is included in regions of the apparatus 100a that come close to each other when the apparatus 100a is bent or folded, and thus the state detector 220 may detect the folded state. The state detection sensor may include at least one of a proximity sensor, an illuminance sensor, a magnetic sensor, a hall sensor, a touch sensor, a bending sensor, and an infrared sensor, or a combination thereof.

In one or more exemplary embodiments, the state detector 220 may determine whether the apparatus 100a is in a folded state or an open state, and may provide the determination result to the controller 230. In this case, the controller 230 may not need to separately determine the folded state or the opened state, but may recognize whether the state of the apparatus 100a is the folded state or the opened state according to the output from the state detector 220.

In one or more exemplary embodiments, when the state detector 220 provides the controller 230 with information about a bending angle or a folding angle, or sensed information obtained by the state detection sensor, the controller 230 may determine the folded state or the opened state of the apparatus 100a.
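The angle-based decision described above can be sketched as follows. This is an illustrative sketch only: the threshold values and the function and type names are assumptions for the example, not values or names taken from the specification.

```python
# Sketch: map a folding angle reported by the state detector to a device state.
# Thresholds are illustrative; the specification only says "a first range"
# (close to 0 degrees) and "a second range" (close to 180 degrees).
from enum import Enum

class FoldState(Enum):
    FOLDED = "folded"
    OPEN = "open"
    INTERMEDIATE = "intermediate"

FOLDED_MAX_DEG = 10.0   # folding angle close to 0 degrees -> folded state
OPEN_MIN_DEG = 170.0    # folding angle close to 180 degrees -> open state

def classify_fold_state(folding_angle_deg: float) -> FoldState:
    """Classify the device state from the measured folding angle."""
    if folding_angle_deg <= FOLDED_MAX_DEG:
        return FoldState.FOLDED
    if folding_angle_deg >= OPEN_MIN_DEG:
        return FoldState.OPEN
    return FoldState.INTERMEDIATE
```

In a real device the angle would come from the hinge sensor or be derived from the state detection sensor's readings; here it is simply a parameter.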

Fig. 3 is a block diagram of an apparatus 100b according to another example embodiment.

The device 100b may include a first display 310 and a second display 320. The first display 310 and the second display 320 may be distinguished from each other in a hardware manner or a software manner.

For example, the first display 310 and the second display 320 may be two hardware screens. The first display 310 and the second display 320 may be respectively disposed on any surfaces of the folding device 100b along the same direction. In addition, the first display 310 may be disposed at a front surface of the foldable device 100b, and the second display 320 may be disposed at a rear surface of the foldable device 100b.

In one or more exemplary embodiments, the first display 310 and the second display 320 may be two different regions, defined in software, of one physical display.

One display formed on one surface of the folding device 100b may be divided into at least two software regions, each of which may be used as a separate display part. For example, a hardware screen provided at the front surface of the foldable device 100b may be divided into at least two software areas, which may be the first display 310 and the second display 320, respectively.
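The software division of one hardware screen into two display regions can be sketched as below. The `Region` type, the `split_screen` helper, and the assumption of a single vertical fold line are all hypothetical simplifications for illustration; the specification does not prescribe this representation.

```python
# Sketch: divide one hardware screen into two software display regions
# along a single vertical fold line (a simplifying assumption).
from dataclasses import dataclass

@dataclass
class Region:
    x: int
    y: int
    width: int
    height: int

def split_screen(width: int, height: int, fold_x: int) -> tuple:
    """Split a width-by-height screen at fold_x into two logical regions."""
    if not 0 < fold_x < width:
        raise ValueError("fold line must lie inside the screen")
    first = Region(0, 0, fold_x, height)                 # "first display" area
    second = Region(fold_x, 0, width - fold_x, height)   # "second display" area
    return first, second
```

As the text notes, when displays are defined this way their position and size can be recomputed dynamically, e.g., by calling the split again with a new fold position.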

Further, a portion of the hardware screen disposed at the front surface of the foldable device 100b may be the first display 310, and a portion of the hardware screen disposed at the rear surface of the foldable device 100b may be the second display 320.

When the display is defined in software, the position and size of the display may be changed according to whether the foldable device 100b is folded or rolled. The position or size of the display may be changed dynamically.

Although fig. 3 illustrates two displays (i.e., the first display 310 and the second display 320), in one or more exemplary embodiments the foldable device 100b may include at least three displays.

The controller 230 may control the first and second application screens to be displayed on at least one of the first display 310 and the second display 320. In one or more exemplary embodiments, a first application screen may be displayed on the first display 310 and a second application screen may be displayed on the second display 320.

Fig. 4 illustrates an arrangement of a first display 310 and a second display 320 according to an exemplary embodiment.

As shown in fig. 4, when the front surface of the device 100b is a first surface and the rear surface of the device 100b is a second surface, the first display 310 may be disposed on the first surface of the electronic device 100b and the second display 320 may be disposed on the second surface of the electronic device 100b.

Each of the first display 310 and the second display 320 may correspond to a portion of a surface of the electronic device 100b, or may correspond to the entire surface. In one or more exemplary embodiments, the first display 310 may be disposed on the first surface of the apparatus 100b as a whole, or may be divided into a plurality of regions disposed on the first surface of the apparatus 100b. In one or more exemplary embodiments, the second display 320 may be disposed on the second surface of the apparatus 100b as a whole, or may be divided into a plurality of regions disposed on the second surface of the apparatus 100b.

Fig. 5 illustrates an apparatus 100 implemented to have a foldable structure according to an exemplary embodiment.

The apparatus 100 may be implemented in such a way that the elements of the device 100 (including the first display 310, the second display 320, the controller 230, the housing, etc.) are flexible. In one or more exemplary embodiments, only some elements of the apparatus 100 (e.g., the housing, etc.) may be flexible, and other elements (e.g., the battery, etc.) may be rigid.

The display 210 may be implemented to have at least one of a foldable structure, a flexible structure, and a rigid structure. In addition, the first display 310 may be foldable and the second display 320 may be rigid.

When the display 210 has a foldable characteristic, a folding area of the display 210 may be set to match a folding area of the apparatus 100.

When the display 210 has a flexible characteristic, the display 210 may be freely disposed on the apparatus 100.

When the display 210 has a rigid characteristic, the display 210 may be disposed in an area of the apparatus 100 other than the foldable area or the bendable area.

Figs. 6 through 11 illustrate displays that are physically distinguished from each other, according to various exemplary embodiments.

The display may be a flexible display.

The device 100 comprising the flexible display may be deformed by using a hinge or by being bent. The device 100 may be folded by such deformation. In figs. 6 through 11, when the device 100 is folded, the folding area 610 indicates the portion that is deformed by being folded.

As shown in fig. 6, the display 210a may be unitary, spanning the fold region 610 at the first surface of the device 100a. The display 210a may be a foldable display or a flexible display. A display may not be disposed on the second surface of the device 100a.

The folding motion may be defined as a motion of folding the two areas of the first surface, which are divided by the folding area 610, to face each other, thereby exposing the second surface. An unfolding motion may be defined as a motion that causes the device 100a to unfold after the device 100a is folded.

Fig. 7 illustrates structures of display 210a1 and display 210a2 according to another exemplary embodiment.

Display 210a1 and display 210a2 may be disposed in two regions of the first surface of the device 100a, where the two regions do not overlap the fold region 610 and are divided by the fold region 610. Each of display 210a1 and display 210a2 may be at least one of a foldable display device, a flexible display device, and a rigid display device.

The folding motion may be defined as a motion in which the first surface is exposed and the two regions of the second surface divided by the folding region 610 are folded to face each other. An unfolding motion may be defined as a motion that causes the device 100a to unfold after the device 100a is folded.

Fig. 8 illustrates a first display 310a and a second display 320a according to another exemplary embodiment.

The first display 310a may be a unitary body, spanning the fold region 610 at the first surface of the device 100b. The second display 320a may be disposed in one of two regions of the second surface of the device 100b, wherein the two regions are separated by the fold region 610. The first display 310a may be a foldable display device or a flexible display device. The second display 320a may be at least one of a foldable display device, a flexible display device, and a rigid display device.

Fig. 9 illustrates a structure of a first display 310a and a second display 320b according to another exemplary embodiment.

The first display 310a may be unitary, spanning the fold region 610 at the first surface of the device 100b. The second display 320b may be unitary, spanning the fold region 610 at the second surface of the device 100b. Each of the first display 310a and the second display 320b may be a foldable display device or a flexible display device.

Fig. 10 illustrates a structure of the first display 310a and the second displays 320c1 and 320c2 according to another exemplary embodiment.

The first display 310a may be unitary, spanning the fold region 610 at the first surface of the device 100b. The second displays 320c1 and 320c2 may be disposed in two regions of the second surface of the device 100b, wherein the two regions are separated by the fold region 610. The first display 310a may be a foldable display device or a flexible display device. Each of the second displays 320c1 and 320c2 may be at least one of a foldable display device, a flexible display device, and a rigid display device.

Fig. 11 illustrates a structure of the first displays 310b1 and 310b2 and the second display 320a according to another exemplary embodiment.

The first displays 310b1 and 310b2 may be respectively disposed at two regions of the first surface of the device 100b, wherein the two regions do not overlap the folding region 610 and are divided by the folding region 610. The second display 320a may be disposed in one of two regions of the second surface of the device 100b, wherein the two regions are divided by the folding region 610. Each of the first displays 310b1 and 310b2 and the second display 320a may be at least one of a foldable display device, a flexible display device, and a rigid display device.

Fig. 12 and 13 illustrate displays differentiated in software according to various exemplary embodiments.

As shown in fig. 12, the first display 310a may correspond to a portion of a hardware screen disposed on a first surface of the device 100b. The second display 320a may correspond to a portion of a hardware screen disposed on a second surface of the device 100b.

As shown in fig. 13, the first display 310a1 may correspond to a portion of a hardware screen disposed on a first surface of the device 100b. The second display 310a2 may correspond to the remaining portion of the hardware screen other than the portion of the first display 310a1.

Figs. 14A to 14F illustrate folded states of the apparatus 100 according to various exemplary embodiments.

The folded states of figs. 14A, 14B, and 14C are examples in which the foldable device 100 is folded using a hinge. The folded states of figs. 14D, 14E, and 14F are examples in which the foldable device 100 is folded by bending.

The folded state and the unfolded state may be quantized into folded shape estimates within a predetermined range. The folded shape estimate may vary depending on the configuration of the state detector 220. For example, the folded shape estimation value may be a bending angle of the apparatus 100, a folding angle of the apparatus 100, or a sensing value of a state detection sensor. For example, the sensing value of the state detection sensor may be a sensing value of a proximity sensor, an illuminance sensor, a magnetic sensor, a hall sensor, or a touch sensor.

The folded state of the apparatus 100 may be estimated based on the folding angle of the apparatus 100, but one or more exemplary embodiments are not limited thereto.

The folded state refers to a state in which the folding angle of the apparatus 100 is in a first range. For example, the folded state may refer to a state in which the folding angle of the apparatus 100 is close to 0 degrees. As shown in figs. 14A to 14F, the folding angle refers to an angle defined from a vertex at which two facing surfaces intersect due to folding.

In some exemplary embodiments, the folded state refers to a state in which a sensed value in the proximity sensor, the magnetic sensor, the infrared sensor, or the like is within a third range, wherein the sensed value indicates a degree to which the two regions are close due to folding.

The fold lines of the device 100 may be preset such that the device 100 may be folded only at the preset fold lines. The number of fold lines may vary according to various exemplary embodiments. For example, the device 100 may be folded in various ways, e.g., the device 100 may be folded two or three times.

In one or more exemplary embodiments, the device 100 may be folded or bent at any point of the device 100. In this case, the apparatus 100 may detect the folded or bent shape by using a sensor.

Fig. 15 shows a folded state of the apparatus 100 according to an exemplary embodiment.

The folded state of the device 100 here refers to a state in which the device 100 is symmetrically folded. As shown in fig. 15, the device 100 may be folded symmetrically at the folding region 610. The device 100 may be folded symmetrically by using a hinge. In one or more exemplary embodiments, the device 100 may be symmetrically folded by bending.

Fig. 16 shows a folded state of the device 100 according to another exemplary embodiment.

Here, the folded state of the apparatus 100 refers to a state in which the apparatus 100 is folded asymmetrically. As shown in fig. 16, the device 100 may be folded asymmetrically, exposing the first region 1610. The device 100 may be folded asymmetrically by using a hinge. In one or more exemplary embodiments, the device 100 may be folded asymmetrically by bending.

In one or more exemplary embodiments, key buttons, a portion of the display 210, a touch sensor, etc. may be disposed in the first region 1610.

Fig. 17 shows the folded state of the device 100b according to an exemplary embodiment.

Here, the folded state of the apparatus 100b refers to a state in which, when one of the first display 310 and the second display 320 of the apparatus 100b is disposed to face the user, a quadrangular portion of the other display, which does not face the user, is asymmetrically folded so as to face the user. For example, as shown in fig. 17, when the first display 310 is disposed to face a user, a portion of the second display 320 may be asymmetrically folded, thereby facing the user. The device 100b may be folded asymmetrically by using a hinge. In one or more exemplary embodiments, the device 100b may be asymmetrically folded by bending.

Fig. 18 shows the folded state of the device 100b according to another exemplary embodiment.

The folded state of the apparatus 100b refers to a state in which, when one of the first display 310 and the second display 320 of the apparatus 100b is disposed to face the user, a triangular portion of the other display, which does not face the user, is asymmetrically folded so as to face the user. For example, as shown in fig. 18, when the first display 310 is disposed to face a user, a triangular portion of the second display 320 may be folded, thereby facing the user.

The device 100b may be folded asymmetrically by using a hinge or by bending.

Fig. 19 shows the folded state of the device 100b according to another exemplary embodiment.

The folded state of the apparatus 100b refers to a state in which, when one of the first display 310 and the second display 320 of the apparatus 100b is disposed to face the user, portions of the other display, which does not face the user, are folded so as to be exposed in a plurality of regions. For example, as shown in fig. 19, when the first display 310 is disposed to face a user, the third and fourth areas 1910 and 1920 of the second display 320 may be folded to face the user. The folded state shown in fig. 19 may be referred to as a multi-region folded state.

The device 100b may be folded asymmetrically by using a hinge or by bending.

Figs. 20A to 20F illustrate an opened state of the apparatus 100 according to various exemplary embodiments.

The states of figs. 20A, 20B, and 20C are examples in which the foldable device 100 is folded by using a hinge. The states of figs. 20D, 20E, and 20F are examples in which the foldable device 100 is folded by bending.

The unfolded state refers to a state in which the folding angle of the apparatus 100 is within a second range. For example, the unfolded state may refer to a state in which the folding angle of the apparatus 100 is approximately equal to 180 degrees.

In some exemplary embodiments, the opened state refers to a state in which a sensed value in a proximity sensor, a magnetic sensor, an infrared sensor, or the like is within a fourth range, wherein the sensed value indicates a degree to which two regions are close due to folding.

Fig. 21 shows an open state of the device 100 according to an exemplary embodiment.

As shown in fig. 21, in the unfolded state, the folding angle of the device 100 is 180 degrees and the device 100 is unfolded. As shown in fig. 21, the window of the display 210 shown in the open state may be divided so that a plurality of application screens may be displayed on the divided windows (S2102).

In one or more exemplary embodiments, at least one display having one of various shapes may be disposed on a surface opposite to the display 210 exposed in the open state. Further, as shown in fig. 21, the display on the opposite surface may be in a closed state (S2104).

Fig. 22 shows an open state of the device 100 according to another exemplary embodiment.

As shown in fig. 22, in the open state, the device 100 is unfolded, and thus, the folding angle of the device 100 is approximately 180 degrees. As shown in fig. 22, one application screen may be displayed on the display 210 exposed in the open state (S2202).

In one or more exemplary embodiments, at least one display having one of various shapes may be disposed on a surface opposite to the display 210 exposed in the open state. Further, as shown in fig. 22, the display on the opposite surface may be in a closed state (S2204).

Fig. 23 illustrates a folding motion and an unfolding motion according to an exemplary embodiment. The folding and unfolding movements may be performed in the manner shown in fig. 23.

Fig. 24 illustrates a folding motion and an unfolding motion according to an exemplary embodiment. The folding and unfolding movements may be performed in the manner shown in fig. 24.

Fig. 25 is a flowchart illustrating a method of controlling a device according to an exemplary embodiment.

The device to be controlled by using the method may be any electronic device that is folded by using a hinge or by bending and has at least one display. For example, the method may be performed on the device 100a shown in fig. 2 or the device 100b shown in fig. 3. In the following description, the method is described as being performed on the apparatus 100a shown in fig. 2, but one or more exemplary embodiments are not limited thereto.

The device 100a displays a screen of the first application on the display 210 (S2502).

When the state detector 220 or the controller 230 detects the folding motion or the unfolding motion (S2504), the controller 230 simultaneously displays the screen of the first application and the screen of the second application on the display 210 (S2506).

As shown in fig. 7 to 11, when a plurality of physically separated displays are used, operation S2506 of displaying a screen of a first application and a screen of a second application may be performed by using another display other than the display displaying the screen of the first application in operation S2502, or operation S2506 may be performed by using both the other display and the display displaying the screen of the first application.
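The flow of operations S2502-S2506 can be sketched as follows. This is a minimal illustrative model, not the patent's implementation; the class and method names are assumptions.

```python
# Hypothetical sketch of the fig. 25 control flow: the controller shows the
# first application's screen (S2502), and when a folding or unfolding motion
# is detected (S2504) it shows both screens at once (S2506).
class FoldableController:
    def __init__(self):
        self.visible_screens = []

    def show_first_app(self, first_app):
        # S2502: only the first application's screen is displayed.
        self.visible_screens = [first_app]

    def on_fold_state_changed(self, first_app, second_app):
        # S2504 -> S2506: a fold/unfold was detected, so both screens
        # are displayed simultaneously.
        self.visible_screens = [first_app, second_app]

ctrl = FoldableController()
ctrl.show_first_app("video")
ctrl.on_fold_state_changed("video", "messages")
print(ctrl.visible_screens)  # -> ['video', 'messages']
```

The same sketch applies whether the two screens land on one display or on two physically separated displays; only the rendering target differs.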

Fig. 26 is a flowchart illustrating a method of controlling a device according to another exemplary embodiment.

When an unfolding motion that changes the state of the device 100 from a folded state to an open state is detected, the execution screen of the first application and the execution screen of the second application may be simultaneously displayed.

First, the device 100 in the folded state displays the execution screen of the first application (S2602).

When the foldable device 100 is in the folded state, the controller 230 may display the execution screen of the first application on the first surface of the flexible display serving as the display. As shown in fig. 6, in a case where the display 210a is disposed on the first surface of the foldable device 100 as a whole, the execution screen of the first application may be displayed in an area of the display 210a. As shown in fig. 7, in a case where the displays 210a1 and 210a2 are respectively disposed at two regions of the first surface of the device 100a (where the two regions are separated by the folding region 610), the execution screen of the first application may be displayed on only one of the displays 210a1 and 210a2.

As shown in figs. 9 and 12, when the second display 320b is disposed to span the folding area 610, the execution screen of the first application may be displayed on an area of the second display 320b. The area of the second display 320b may be an area facing a user, a predefined area, etc. As shown in fig. 10, in a case where the second displays 320c1 and 320c2 are respectively disposed at two regions of the second surface of the device 100a, which are separated by the folding region 610, the execution screen of the first application may be displayed on only one of the second displays 320c1 and 320c2. The one of the second displays 320c1 and 320c2 on which the execution screen of the first application is displayed may be a user-facing area, a predefined area, or the like.

Next, when the state detector 220 detects the unfolding motion (S2604), the controller 230 simultaneously displays the execution screen of the first application and the execution screen of the second application (S2606).

When the foldable device 100 in the folded state is unfolded, the controller 230 may display the execution screen of the first application and the execution screen of the second application on the second surface of the flexible display serving as the display. The second application may be related to the first application. The user may generally use the device 100 in a folded state and then, when the user desires a larger screen, unfold and use the device 100. Further, since the device 100 in the open state can simultaneously display the execution screen of the first application and the execution screen of the second application, the user can conveniently use a plurality of applications.
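The folded-to-open transition of fig. 26 can be sketched as a small state machine. The state names and the related-application table below are illustrative assumptions, not the patent's data.

```python
# Illustrative state machine for the fig. 26 flow: in the folded state only
# the first application is shown; an unfolding motion (S2604) moves the
# device to the open state, where the first application and a related
# second application are shown together (S2606).
FOLDED, OPEN = "folded", "open"

# Hypothetical table pairing a first application with a related second one.
RELATED = {"gallery": "photo_editor", "messages": "phone"}

def on_unfold(state, first_app):
    """Return the new device state and the list of screens to display."""
    if state != FOLDED:
        return state, [first_app]          # nothing to do if already open
    second_app = RELATED.get(first_app)
    screens = [first_app, second_app] if second_app else [first_app]
    return OPEN, screens

state, screens = on_unfold(FOLDED, "gallery")
```

After the call, `state` is `open` and both the gallery and its related editor screen are scheduled for display.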

Fig. 27 shows a screen in a folded state and a screen in an open state according to an exemplary embodiment.

As shown in fig. 27, when the state of the device 100 is changed from the folded state, in which the screen of the first application is displayed on the display 210, to the open state, the screen of the first application and the screen of the second application may be simultaneously displayed on the display 210.

When the device 100 is in the folded state, the screen of the first application is displayed on an area of the display 210 of the first surface of the device 100, and when the device 100 is in the open state, the screen of the first application and the screen of the second application are displayed on the entire area of the display 210 of the first surface.

In one or more exemplary embodiments, when the device 100 is in the folded state, the screen of the first application is displayed on an area of the display 210 of the first surface of the device 100, and when the device 100 is in the open state, the screen of the first application and the screen of the second application are displayed on the display 210 of the second surface of the device 100.

FIG. 28 is a flow chart of a method of controlling a device according to another exemplary embodiment.

When the device 100b detects a folding motion that changes the state of the device 100b from the open state to the partially folded state, the device 100b simultaneously displays the screen of the first application and the screen of the second application. The partially folded state (as shown in figs. 17 to 19) refers to the following state: while one of the first display 310 and the second display 320 of the device 100b is disposed to face the user, a portion of the other display, which otherwise does not face the user, is asymmetrically folded so as to face the user.

First, the device 100b in the open state displays a screen of a first application on the first display 310 (S2802). When the state detector 220 detects a folding motion that changes the device 100b into a partially folded state (S2804), the controller 230 displays the screen of the first application on an area of the first display 310 and displays a screen of a second application on an area of the second display 320 (S2806).

In the embodiment of fig. 28, the first display 310 and the second display 320 may be disposed on different surfaces of the device 100b, respectively.

The region of the first display 310 may be a region of the first display 310 that is exposed to a user depending on how the portion of the device 100b is folded, i.e., the area of the first display 310 not covered by the second display 320.

The region of the second display 320 refers to a region disposed in a direction exposed to the user according to the partially folded state of the device 100b.
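Determining the exposed region of the first display amounts to subtracting the folded-over portion. The sketch below is a geometric illustration under the assumption that the second display folds over from one edge; the dimensions and function name are hypothetical.

```python
# Illustrative computation of the region of the first display exposed to
# the user in the partially folded state: the part not covered by the
# folded-over portion of the second display (assumed to fold over from
# the right edge).
def exposed_region(first_width, first_height, covered_width):
    """Return (x, y, w, h) of the uncovered area of the first display."""
    w = max(first_width - covered_width, 0)
    return (0, 0, w, first_height)

region = exposed_region(1600, 900, 600)  # -> (0, 0, 1000, 900)
```

The first application's screen would then be drawn (covered, or resized as described below) within that rectangle.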

Figs. 29A and 29B illustrate screens in an open state and a partially folded state according to various exemplary embodiments.

As shown in figs. 29A and 29B, when the screen of the first application is displayed on the first display 310 of the device 100b in the open state (S2902) and the device 100b is changed to the partially folded state having the quadrangular shape (S2904), the screen of the first application may be displayed on an area of the first display 310 and the screen of the second application may be displayed on an area of the second display 320.

As shown in fig. 29A, the screen of the first application may be displayed with a portion of the screen of the first application covered. That is, the second display 320 covers a portion of the screen of the first application.

Alternatively, as shown in fig. 29B, in one or more exemplary embodiments, at least one of the size and scale of the screen of the first application may change to match the area of the region not covered by the second display 320. The screen of the second application may be displayed on an area of the second display 320 that, due to the folding, is disposed on the same side as the first display 310. The screen of the second application may be adjusted to match the size of the screen of the first application that changes due to the folding, or may be displayed with a portion of the screen of the second application covered.

Figs. 30A and 30B illustrate screens in an open state and a folded state according to various exemplary embodiments.

As shown in figs. 30A and 30B, when the screen of the first application is displayed on the first display 310 of the device 100b in the open state and the device 100b is changed to the partially folded state having the triangular shape, the screen of the first application may be displayed on an area of the first display 310 and the screen of the second application may be displayed on an area of the second display 320.

As shown in fig. 30A, the screen of the first application may be displayed in a state where a portion of the screen of the first application is covered. That is, the second display 320 may cover a portion of the screen of the first application.

Alternatively, as shown in fig. 30B, in one or more exemplary embodiments, at least one of the size and scale of the screen of the first application may be changed to correspond to the area of the region not covered by the second display 320. Further, in one or more exemplary embodiments, a blank area where no content is displayed may be included in the area not covered by the second display 320. The blank area may have a size or shape that causes the screen of the first application to be displayed in a rectangular shape.

The screen of the second application may be displayed on an area of the second display 320 that, due to the folding, is disposed on the same side as the first display 310. The screen of the second application may be adjusted to match the size of that area of the second display 320.
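The fig. 30B behaviour, where the first application's screen is scaled to fit the uncovered area and the leftover space becomes a blank region that keeps the screen rectangular, can be sketched as an aspect-preserving fit. The numbers and the function name are illustrative assumptions.

```python
# Hedged sketch of fig. 30B: scale the first application's screen to fit
# the area left uncovered by the second display, preserving its aspect
# ratio; the remainder of the area is a blank (letterbox) margin, so the
# application is still shown as a rectangle.
def fit_with_blank(src_w, src_h, avail_w, avail_h):
    """Return the scaled (w, h) of the screen and the (w, h) of the
    blank margins left inside the available area."""
    scale = min(avail_w / src_w, avail_h / src_h)
    out_w, out_h = round(src_w * scale), round(src_h * scale)
    return (out_w, out_h), (avail_w - out_w, avail_h - out_h)

# A 1600x900 screen fitted into a 1000x900 uncovered area.
size, blank = fit_with_blank(1600, 900, 1000, 900)
```

Here the scale factor is limited by the width, so the screen keeps its 16:9 shape and the unused vertical space becomes the blank area.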

Figs. 31A and 31B illustrate screens in an open state and a folded state according to various exemplary embodiments.

As shown in figs. 31A and 31B, when a screen of a first application is displayed on the first display 310 of the device 100b in the open state and a plurality of regions of the device 100b are changed to a partially folded state, the screen of the first application may be displayed on the region of the first display 310 and the screen of a second application may be displayed on one or more regions of the second display 320. For example, a text message received from a person named Cheolso KIM may be displayed on area 3110, and a phone application screen displaying an icon for calling Cheolso KIM may be displayed on area 3120.

As shown in fig. 31A, the screen of the first application may be displayed with a portion of the screen of the first application covered. That is, the plurality of regions 3110 and 3120 of the second display 320 may cover a portion of the screen of the first application.

Alternatively, as shown in fig. 31B, in one or more exemplary embodiments, at least one of the size and scale of the screen of the first application may be changed to match the area of the region not covered by the regions 3110 and 3120 of the second display 320.

The screen of the second application may be displayed on the regions 3110 and 3120 of the second display 320 that, due to the folding, are disposed on the same side as the first display 310. In addition, screens of different applications may be displayed on the plurality of regions 3110 and 3120 of the second display 320.

The screen of the second application may be adjusted to match the sizes of the regions 3110 and 3120 of the second display 320 that, due to the folding, face the same direction as the first display 310.

According to the various exemplary embodiments described with reference to figs. 28 to 31B, the device 100b may be folded by using a hinge or by being formed to be bendable.

When the user uses the device 100b in the open state, if the user wants to view simple content, the user may fold a region of the device 100b and thus view the simple content by using the display on the opposite surface of the device 100b. For example, while the user watches a video, if the notification 2910 indicating that a text message has been received is displayed, the user may fold the region of the device 100b and simply view the content of the text message.

Fig. 32 is a flowchart of a method of controlling the device 100 according to another exemplary embodiment.

When the device 100 detects a folding motion or an unfolding motion, the device 100 determines the second application according to a preset criterion. According to one or more exemplary embodiments, the preset criterion for determining the second application may be changed.

First, the device 100 displays a screen of a first application (S3202), and when the state detector 220 detects a folding motion or an unfolding motion (S3204), the controller 230 determines a second application (S3206). Next, the controller 230 simultaneously displays the screen of the determined second application and the screen of the first application (S3208).

The second application may be determined according to one of various criteria: for example, the second application may be selected randomly or pseudo-randomly, an application predefined by the user may be determined as the second application, the second application may be determined according to a preset list of related applications, the most recently run application may be determined as the second application, or a frequently used application may be determined as the second application. Various examples of determining the second application will be described in more detail later.
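Operation S3206 can be sketched as a dispatcher over the criteria listed above. The criterion names, parameters, and sample data are illustrative assumptions, not an API defined by the patent.

```python
# Illustrative sketch of operation S3206: determining the second
# application according to one of the preset criteria named above.
import random

def determine_second_app(criterion, *, user_choice=None,
                         related=None, recent=None, frequent=None):
    if criterion == "user_predefined" and user_choice:
        return user_choice                    # application preset by the user
    if criterion == "related_list" and related:
        return related[0]                     # first entry of the related list
    if criterion == "most_recent" and recent:
        return recent[-1]                     # most recently run application
    if criterion == "frequent" and frequent:
        return max(frequent, key=frequent.get)  # most frequently used
    if criterion == "random":
        return random.choice(["memo", "clock", "music"])
    return None

# E.g. picking the most frequently used application by launch count.
app = determine_second_app("frequent",
                           frequent={"mail": 3, "camera": 9, "maps": 5})
```

Whichever criterion is active, the result feeds directly into operation S3208, which displays the chosen screen next to the first application's screen.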

Fig. 33 shows a structure of a device 100c according to another exemplary embodiment.

The device 100c includes a display 210, a state detector 220, a controller 230, and a user interface 3310 (i.e., a user input unit or an input/output (I/O) unit).

The display 210 displays a screen of an application run by the controller 230, a screen of a user interface, or a screen indicating a state of the device 100c.

The display 210 may include a first display 310 disposed on a first surface of the device 100c and a second display 320 disposed on a second surface of the device 100c.

The controller 230 runs at least one application and controls the display 210 to display a screen of the at least one application. When the screen of the first application is displayed, if the controller 230 detects a folding motion or an unfolding motion, the controller 230 simultaneously displays the screen of the first application and the screen of the second application.

The controller 230 may generally control the operation of the device 100c. For example, the controller 230 may run an operating system of the device 100c, may process various types of data, and may control the elements of the device 100c (including the display 210, the state detector 220, the user interface 3310, etc.).

The user interface 3310 receives user input. The user interface 3310 may include at least one of a key, a touch sensor, a touch screen, a pen recognition panel, a bending sensor, a bio-signal detection sensor, and a microphone, or a combination thereof. The user interface 3310 may be provided together with a user interface screen through a graphical user interface (GUI).

The user interface 3310 may receive user inputs for controlling the device 100c. For example, the user interface 3310 may receive user input to turn the device 100c on and off, user input to run an application, user input directed to an application, and so on. Further, the user interface 3310 may receive a user input selecting the second application, a user input selecting whether to simultaneously display the screen of the first application and the screen of the second application when a folding motion or an unfolding motion is detected, or a user input selecting a display method for the screen of the second application when a folding motion or an unfolding motion is detected.

The user interface 3310 may receive a user input selecting the second application. For example, the controller 230 may display a second application selection menu on the display 210 and may receive user input through the second application selection menu. The controller 230 may determine the second application based on user input via the user interface 3310.

In one or more exemplary embodiments, when a folding motion or an unfolding motion is detected, the user interface 3310 may receive a user input selecting whether the screen of the first application is simultaneously displayed with the screen of the second application. The controller 230 displays the screen of the second application together with the screen of the first application only when the user selects to display the screen of the second application.

Fig. 34 is a flowchart of a method of controlling the device 100 according to another exemplary embodiment.

A second application selection menu for selecting a second application may be provided along with the screen of the first application, and the second application may be selected according to a user selection.

First, the controller 230 simultaneously displays the second application selection menu and the screen of the first application (S3402). The second application selection menu is configured to allow the user to select the second application.

The second application selection menu may be displayed as a GUI on the display 210.

In one or more exemplary embodiments, the second application selection menu is provided to receive a user input via a touch sensor, key buttons, or the like provided in a predetermined area of the housing of the device 100c. Here, information regarding the second application selection menu may be displayed on an area of the display 210.

Next, when the state detector 220 detects a folding motion or an unfolding motion (S3404), the controller 230 determines whether a user selection has been received through the second application selection menu (S3406).

If a user selection has been received (S3406), the controller 230 may determine the application selected by the user as the second application and may simultaneously display the screen of the first application and the screen of the second application (S3408).

If a user selection has not been received (S3406), the controller 230 changes the screen of the display 210 according to preset criteria (S3410). The preset criteria may vary according to various exemplary embodiments. The preset criteria may include determining an application that is preset to run with the first application as the second application. In one or more exemplary embodiments, the preset criteria may include determining an application preset by the user via a settings menu or the like as the second application. In one or more exemplary embodiments, the preset criteria may include changing the screen of the first application by changing its size or settings when the folding motion or the unfolding motion is detected.

When the second application is determined, the determination may reflect the user's intent. Further, when the user has selected a second application that the user wants to view together with the screen of the first application, the screen of the second application may be automatically displayed together with the screen of the first application when a folding motion or an unfolding motion is performed. Therefore, the user can conveniently use the multi-window interface.

Fig. 35 illustrates an example of providing the second application selection menu 3520 according to an exemplary embodiment.

The device 100c may be folded asymmetrically. In the folded state, the device 100c displays a screen of a first application on the display 210 and provides the second application selection menu 3520 using the first region 3510.

The second application selection menu 3520 displays one or more user-selectable applications and receives user input selecting an application. As shown in fig. 35, the first region 3510 may be divided into a plurality of regions, to which the user-selectable applications may be respectively matched.

A screen of a first application may be displayed on the second display 320, and the first region 3510 may be located on the first display 310. The first region 3510 may be defined as a portion of the first display 310 exposed in the folded state. If the first display 310 is a touch screen, the second application selection menu 3520 may be displayed on the first region 3510, and the device 100c may receive user input in the form of a touch input to the first display 310.

In one or more exemplary embodiments, the first region 3510 may be a region on a housing of the device 100c where a touch sensor, a proximity sensor, or the like is disposed. In one or more exemplary embodiments, information regarding the second application selection menu 3520 may be displayed on the first region 3510 on the housing or may be displayed on an outwardly exposed region of the display 210.

The second application selection menu 3520 is provided via a predetermined area physically separated from the display 210, which folds asymmetrically and displays the first application. The user can thus intuitively access the second application selection menu 3520. Also, since the second application selection menu 3520 is provided by using an area of a lower layer exposed due to the asymmetric folding, the second application selection menu 3520 may be displayed without covering the screen of the first application and without complicating the screen layout.

Fig. 36 illustrates an example of providing a second application selection menu 3650 according to another exemplary embodiment.

Referring to fig. 36, the second application selection menu 3650 may be provided only in response to a user's request for the second application selection menu 3650. For example, the controller 230 of the device 100c in the folded state may display a screen of a first application and a first menu 3640 for requesting the second application selection menu 3650 on the display 210 (S3610). The user may request the device 100c to display the second application selection menu 3650 by selecting the first menu 3640, for example by clicking or dragging the first menu 3640.

When the user requests the device 100c to display the second application selection menu 3650, the controller 230 may display the screen of the first application and the second application selection menu 3650 on the display 210 (S3620). The user may select the second application from the second application selection menu 3650. The user input may be received in the form of a touch input, a key input, a pen input, a voice input, or the like.

When the user selects the second application from the second application selection menu 3650 and changes the device 100c to the open state, the controller 230 displays the screen of the first application and the screen of the second application on the display 210 (S3630). As described above with reference to fig. 34, if the user does not select the second application, when the folding motion or the unfolding motion is detected, the screen of the display 210 may be determined and displayed according to the preset criteria.

Referring to fig. 36, the state of the device 100c is changed from the folded state to the open state. In one or more exemplary embodiments, even if it is preset that the screen of a first application and the screen of a second application are simultaneously displayed when the state of the device 100c is changed from the open state to the folded state, the first menu 3640 requesting the second application selection menu 3650 may still be provided. That is, in the folded state of the device 100c, the screen of the first application and the first menu 3640 may be provided, and when the user requests the second application selection menu 3650, the device 100c in the folded state may simultaneously display the screen of the first application and the second application selection menu 3650. Further, if the user selects the second application, when the state of the device 100c is changed from the folded state to the open state, the device 100c in the open state may simultaneously display the screen of the first application and the screen of the second application selected by the user.

The second application selection menu 3650 may be displayed together with the screen of the first application only when the user desires. Accordingly, the problems of the screen layout being complicated by the second application selection menu 3650, or of the screen of the first application being covered by the second application selection menu 3650, can be significantly reduced.

Fig. 37 illustrates an example of providing a second application selection menu 3730 according to another exemplary embodiment.

Referring to fig. 37, the controller 230 of the device 100c in the open state displays a screen of a first application and the second application selection menu 3730 on the first display 310 (S3710). When a user input selecting the second application is received via the second application selection menu 3730 and the device 100c is partially folded, the controller 230 displays the screen of the first application on the first display 310 and displays the screen of the second application selected by the user on the exposed area of the second display 320 (S3720).

Having selected the second application, the user can view the screen of the second application by performing a simple gesture (e.g., partially bending or folding a portion of the device 100c), and thus the user can conveniently view the screen of the second application.

Fig. 38 illustrates an example of providing a second application selection menu according to another exemplary embodiment.

Referring to fig. 38, the device 100c simultaneously displays a first application screen and information on a second application selection menu, and receives the user's selection by using elements of the user interface 3310 formed on the housing of the device 100c. For example, the device 100c may receive the user's selection by using the touch sensors 3810a and 3810b as elements of the user interface 3310.

Because the device 100c receives the user input by using the user interface 3310 provided on the housing of the device 100c, the input can be received accurately, and the area of the screen occupied by the second application selection menu is reduced.

FIG. 39 is a flow chart of a method of controlling a device according to another exemplary embodiment.

Referring to fig. 39, when a folding motion or an unfolding motion is detected, a screen of a first application and a second application selection menu are simultaneously displayed, and when a user input via the second application selection menu is detected, the screen of the first application and the screen of the second application are simultaneously displayed.

First, the controller 230 displays a screen of a first application on the display 210 (S3902).

Next, when the state detector 220 detects a folding motion or an unfolding motion (S3904), the controller 230 simultaneously displays the screen of the first application and the second application selection menu (S3906).

When a user selection of the second application is received (S3908), the controller 230 determines the application selected by the user as the second application and simultaneously displays the screen of the first application and the screen of the second application (S3910).

After the folding motion or the unfolding motion is detected (S3904), if no user selection is received within a preset time, the controller 230 may remove the second application selection menu from the screen.
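The timeout behaviour can be sketched as a pure function of elapsed time and selection state. The timeout value, units, and function name are illustrative assumptions.

```python
# Illustrative sketch of the menu timeout: the second application
# selection menu stays on screen until the user makes a selection or a
# preset time elapses after the fold/unfold motion, after which it is
# removed. Times are in seconds and purely illustrative.
MENU_TIMEOUT = 5.0

def menu_visible(shown_at, now, selected):
    """Return whether the selection menu should still be displayed."""
    if selected:
        return False                       # selection made: menu dismissed
    return (now - shown_at) < MENU_TIMEOUT  # otherwise, until timeout

still_shown = menu_visible(0.0, 3.0, selected=False)
timed_out = menu_visible(0.0, 6.0, selected=False)
```

In a real implementation this predicate would be driven by a UI timer rather than polled, but the dismissal rule is the same.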

When the folding motion or the unfolding motion is detected, a second application selection menu is provided so that the user can easily select applications to be simultaneously displayed after performing the folding motion or the unfolding motion.

Fig. 40 shows an example of displaying a screen of the first application and the second application selection menu in operation S3906 of the flowchart shown in fig. 39.

As shown in fig. 40, when an unfolding motion of the device 100c is detected, the controller 230 may simultaneously display a screen of a first application and a second application selection menu 4010 on the display 210. A similar example applies when a folding motion of the device 100c is detected.

FIG. 41 is a flow chart of a method of controlling a device according to another exemplary embodiment.

Referring to fig. 41, when a folding motion or an unfolding motion is detected, a screen of a first application, a screen of a second application, and a second application selection menu are simultaneously displayed.

First, the controller 230 displays a screen of a first application on the display 210 (S4102).

Next, when the state detector 220 detects a folding motion or an unfolding motion (S4104), the screen of the first application, a screen of a second application, and a second application selection menu are simultaneously displayed (S4106).

The screen of the second application displayed together with the screen of the first application may be determined according to a preset criterion. The second application may be determined according to one of various criteria: for example, the second application may be selected randomly or pseudo-randomly, an application predefined by the user may be determined as the second application, the second application may be determined according to a preset list of related applications, the most recently run application may be determined as the second application, or a commonly used application may be determined as the second application.

When a user input selecting the second application is received (S4108), the controller 230 simultaneously displays the screen of the first application and the screen of the second application selected by the user (S4110). The previously displayed screen of the second application is replaced by the screen of the second application selected by the user. In one or more exemplary embodiments, the previously displayed screen of the second application and the screen of the second application selected by the user are simultaneously displayed.

After the folding motion or the unfolding motion is detected (S4104), if a user input selecting the second application is not received within a preset time, the controller 230 may remove the second application selection menu from the screen.

When the folding motion or the unfolding motion is detected, a screen of the first application and a screen of the second application may be displayed, but the user may easily change the screen of the second application. Accordingly, the user can conveniently view a plurality of application screens.

Fig. 42 illustrates an example of providing a second application selection menu 4210 according to another exemplary embodiment.

Referring to fig. 42, the controller 230 of the device 100c in the folded state displays a screen of a first application on the display 210, and when the device 100c is changed to the open state, the controller 230 may simultaneously display the screen of the first application, a screen of a second application, and the second application selection menu 4210 on the display 210.

Fig. 43 shows an example of providing a second application selection menu 4360 according to another exemplary embodiment.

Referring to fig. 43, the controller 230 of the device 100c in the folded state displays a screen of a first application on the display 210 (S4310), and when the device 100c is changed to the open state, the controller 230 may simultaneously display the screen of the first application, a screen of a second application, and a first menu 4350 for requesting the second application selection menu 4360 on the display 210 (S4320). When the user requests the device 100c to provide the second application selection menu through the first menu 4350, the controller 230 simultaneously displays the screen of the first application, the screen of the second application, and the second application selection menu 4360 (S4330). When the user input is received through the second application selection menu 4360, the controller 230 simultaneously displays the screen of the first application and the screen of the second application selected by the user (S4340).

Fig. 44 is a flowchart of a method of controlling the device 100a according to another exemplary embodiment.

Referring to fig. 44, when the folding motion or the unfolding motion is detected, the device 100a determines the second application based on a list of pre-stored related applications.

First, the device 100a displays a screen of a first application (S4402). When the state detector 220 detects a folding motion or an unfolding motion (S4404), the controller 230 determines a second application based on the list of related applications (S4406). The list of related applications indicates the type of the second application to be matched according to the type of the first application. The list of related applications may be stored in the device 100c; for example, it may be stored in a memory of the device 100c.

Next, the controller 230 simultaneously displays the screen of the determined second application and the screen of the first application (S4408).

Fig. 45 shows an example of a list of related applications according to an exemplary embodiment.

The list of related applications includes information on a type of the second application to be matched according to the type of the first application. For example, as shown in fig. 45, if the first application is a book-related application, the second application may be a dictionary-related application. In addition, if the first application is a camera-related application, the second application may be an album-related application.
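The related-application list of Fig. 45 amounts to a lookup table from the first application's type to the second application's type. A minimal sketch follows; the book→dictionary and camera→album pairs come from the text, while the function name and the `None` fallback are illustrative assumptions:

```python
# Related-application list, as described for Fig. 45: the key is the type
# of the first application, the value is the type of the second application.
RELATED_APPS = {
    "book": "dictionary",
    "camera": "album",
}

def determine_second_app(first_app_type, related=RELATED_APPS):
    """Return the matched second-application type, or None when no match
    is stored (the device would then fall back to another preset criterion)."""
    return related.get(first_app_type)
```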

Information about related applications that the user is likely to use together is stored, and the first application and the second application are automatically matched. Accordingly, when the folding motion or the unfolding motion is detected, the user can conveniently use the related application.

Fig. 46 illustrates an example of simultaneously displaying a screen of a first application and a screen of a second application based on a list of related applications according to an exemplary embodiment.

In fig. 46, a Social Network Service (SNS) application is used on the device 100c in the folded state. When the device 100c is changed to the open state, the controller 230 determines the second application to be an album application based on the list of related applications and simultaneously displays a screen of the SNS application and a screen of the album application. The user may upload a picture to the SNS application by dragging and dropping the picture from the album application.

Fig. 47 is a flowchart of a method of controlling the device 100c according to another exemplary embodiment.

Referring to fig. 47, when the device 100c detects a folding motion or an unfolding motion, the device 100c determines a second application based on an input to the first application. Since the input to the first application typically reflects the user's current interests, information about the current interests of the user may be obtained from it.

First, the device 100c displays a screen of a first application (S4702). When the state detector 220 detects a folding motion or an unfolding motion (S4704), the controller 230 analyzes an input to the first application (S4706). For example, the input to the first application may include the voice of a caller or recipient in a telephony application, a messenger message, and so on. Next, the controller 230 determines a second application based on the analysis result of the input to the first application (S4708).
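The analysis of S4706–S4708 can be sketched as a simple cue scan over the text input to the first application. The cue sets below are illustrative placeholders (a real device would use proper speech recognition and text analysis), and all names are assumptions:

```python
# Assumed cue lexicons; illustrative only.
PLACE_NAMES = {"Gangnam station", "City Hall"}       # place names -> map app
PRODUCT_CUES = {"released", "published", "launch"}   # product news -> search app

def determine_second_app_from_input(text):
    """Pick a second application from cues found in the first application's
    input (e.g. a transcribed call or a messenger message)."""
    if any(place in text for place in PLACE_NAMES):
        return "map"      # a specific region name suggests the map application
    if any(cue in text for cue in PRODUCT_CUES):
        return "search"   # product news suggests the search application
    return None           # no cue: fall back to another preset criterion
```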

When the second application is determined, the controller 230 simultaneously displays the screen of the determined second application and the screen of the first application (S4710).

Fig. 48 illustrates an example of a screen to determine a second application and display the second application according to an exemplary embodiment.

Referring to fig. 48, when the first application is a call application, the controller 230 recognizes the content of a phone conversation between a caller and a recipient, and determines the second application according to the content of the phone conversation. For example, if one of the call participants says "How about strolling around Gangnam station on the weekend?", the controller 230 recognizes that a specific region name is included in the phone conversation, and thus may determine the map application as the second application.

Thereafter, when the device 100c detects a folding motion or an unfolding motion, the controller 230 displays a screen of the call application as the screen of the first application, and displays a screen of the map application as the screen of the second application. When the screen of the map application is displayed, the controller 230 may display a map screen on which "Gangnam station", identified from the phone conversation, is marked.

The device 100c can recognize the contents of the user's telephone conversation, and thus can easily and conveniently provide the user with information of interest.

Fig. 49 illustrates an example of a screen to determine a second application and display the second application according to another exemplary embodiment.

Referring to fig. 49, when the first application is a messenger application or a short message application, the controller 230 may analyze the content of a message transmitted or received by the first application, and thus may determine the second application. For example, if a message transmitted or received by the first application includes "new Galaxy S5 released", the controller 230 may analyze the content of the message and may determine a search application as the second application to search for the content of the message on a website.

Thereafter, when the device 100c detects a folding motion or an unfolding motion, the controller 230 may simultaneously display a screen of the messenger application or the short message application as the screen of the first application and a screen of the search application as the screen of the second application. The controller 230 may display, on the screen of the search application, search results obtained by inputting a keyword included in the message into a search window.

Fig. 50 is a flowchart of a method of controlling the device 100a according to another exemplary embodiment.

Referring to fig. 50, when a notification event occurs and a folding motion or an unfolding motion is detected, the device 100a determines an application related to the notification event as the second application. The user may run the application related to the notification event by performing a simple motion (e.g., by changing the folded state of the device 100a).

First, the device 100a displays a screen of a first application (S5002), and when the state detector 220 detects a folding motion or an unfolding motion (S5004), the controller 230 determines a second application.

The controller 230 determines whether a notification event occurs (S5006). The notification event may include receiving a short message, receiving an incoming call, receiving an email, receiving an instant messaging message, receiving a notification from an SNS application, or receiving a notification from various other applications.

If the notification event occurs (S5006), the controller 230 determines the application related to the notification event as the second application (S5008). For example, if a short message has been received, the short message application is determined as the second application. If an email has been received, the mail application is determined as the second application. If an instant messaging message has been received, the instant messaging application is determined as the second application. If a notification from an SNS application has been received, the SNS application is determined as the second application.
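The S5006–S5008 branch is another event-to-application mapping. A minimal sketch, with illustrative event keys and application names drawn from the examples above:

```python
# Notification-event type -> application that handles it (illustrative keys).
NOTIFICATION_APP = {
    "short_message": "short message application",
    "incoming_call": "call application",
    "email": "mail application",
    "instant_message": "instant messaging application",
    "sns": "SNS application",
}

def app_for_notification(event_type):
    """Return the application related to the notification event, or None
    when no notification event occurred (the S5010 fallback then applies)."""
    return NOTIFICATION_APP.get(event_type)
```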

Next, the controller 230 simultaneously displays the screen of the determined second application and the screen of the first application (S5012).

If the notification event does not occur (S5006), the controller 230 changes the screen of the display 210 according to preset criteria (S5010). The preset criteria may vary according to various exemplary embodiments. The preset criteria may include determining an application that is preconfigured to run with the first application as the second application. In one or more exemplary embodiments, the preset criteria may include determining an application that is preset by a user via a settings menu or the like as the second application. In one or more exemplary embodiments, the preset criteria may include changing the size or setting of the screen of the first application when the folding motion or the unfolding motion is detected.

Fig. 51 illustrates an example of a screen to determine a second application and display the second application according to another exemplary embodiment.

Referring to fig. 51, the device 100c receives an instant messenger message while the device 100c displays a screen of a first application, and when the device 100c detects a folding motion, the controller 230 simultaneously displays a screen of the messenger application and the screen of the first application. When the screen of the first application is displayed together with the icon 5110 indicating the notification event, the user may perform an unfolding motion on the device 100c, and thus may run an application related to the notification event and view the notification event.

Fig. 52 is a flowchart of a method of controlling the device 100a according to another exemplary embodiment.

Referring to fig. 52, when receiving content or a link and detecting a folding motion or an unfolding motion, the device 100a determines an application for executing the content or the link as the second application. The user can run an application for executing the content or link by simply folding or unfolding the device 100a.

First, the device 100a displays a screen of a first application (S5202), and when the state detector 220 detects a folding motion or an unfolding motion (S5204), the controller 230 determines whether content or a link is received via the first application (S5206). The content may include pictures, videos, music, contact information, electronic business cards, and the like. The links may include internet address links, application installation links, map information links for a map application, and the like.

When the content or the link is received (S5206), the controller 230 determines an application for running the content or the link received via the first application as the second application (S5208). For example, if a picture is received, the second application may be an album application, a photo playback application, a photo viewer application, or the like. If a video is received, the second application may be a video playback application, a video viewer application, or the like. If music is received, the second application may be a music application. If contact information is received, the second application may be a contact information application. If an electronic business card is received, the second application may be a contact information application, an electronic business card management application, or the like. If an internet site link is received, the second application may be an internet browser application. If an application installation link is received, the second application may be an application installation application. If a map information link of a map application is received, the second application may be the map application.
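The S5206–S5208 decision can likewise be sketched as a content-type lookup. The table mirrors the examples above; where the text lists several candidate applications, only the first is kept here, and all keys and names are illustrative:

```python
# Received content/link type -> second application (illustrative, first
# candidate only where the text lists alternatives).
CONTENT_APP = {
    "picture": "album application",
    "video": "video playback application",
    "music": "music application",
    "contact": "contact information application",
    "business_card": "contact information application",
    "internet_link": "internet browser application",
    "app_install_link": "application installation application",
    "map_link": "map application",
}

def app_for_content(content_type):
    """Return the application that runs the received content or link, or
    None when nothing was received (the S5210 fallback then applies)."""
    return CONTENT_APP.get(content_type)
```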

Next, the controller 230 simultaneously displays the screen of the determined second application and the screen of the first application (S5212).

When the content or the link is not received (S5206), the controller 230 changes the screen of the display 210 according to a preset criterion (S5210). The preset criteria may vary according to various exemplary embodiments. For example, the preset criteria may include determining an application that is preset to run together with the first application as the second application. In some embodiments, the preset criteria may include determining an application that is preset by the user via a settings menu or the like as the second application. In some embodiments, the preset criteria may include changing the size or setting of the screen of the first application when the folding motion or the unfolding motion is detected.

Fig. 53 illustrates an example of a screen to determine a second application and display the second application according to another exemplary embodiment.

Referring to fig. 53, in a case where the first application is an instant messenger application, a message accompanied by a picture is received via the instant messenger application, and an unfolding motion of the device 100c is detected, the controller 230 may determine the album application as the second application. In this case, when the unfolding motion of the device 100c is detected, the screen of the instant messenger application and the screen of the album application are simultaneously displayed. In addition, the controller 230 may display a screen in which the picture attached to the message in the instant messenger application is displayed in the album application.

Fig. 54 illustrates an example of a screen to determine a second application and display the second application according to another exemplary embodiment.

Referring to fig. 54, when the first application is an instant messenger application, a message with an attached map link is received via the instant messenger application, and an unfolding motion of the device 100c is detected, the controller 230 may determine the map application as the second application. In this case, when the unfolding motion of the device 100c is detected, the screen of the instant messenger application and the screen of the map application are simultaneously displayed. In addition, the controller 230 may display, in the map application, a map corresponding to the map link from the instant messenger application.

Fig. 55 is a flowchart of a method of controlling the device 100a according to another exemplary embodiment.

Referring to fig. 55, when the folding motion or the unfolding motion is detected, the device 100a determines the second application based on the application use history of the user.

First, the device 100a displays a screen of a first application (S5502), and when the state detector 220 detects a folding motion or an unfolding motion (S5504), the controller 230 determines a second application based on the application use history of the user (S5506). For example, the controller 230 determines an application commonly used by the user, the most recently used application, or an application running in the background as the second application. In one or more exemplary embodiments, when the user requests the device 100a to run a specific application via the second application selection menu, the controller 230 may obtain information on the user's preferred application by learning the user's application selection history, and may determine the user's preferred application as the second application.
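One simple way to "learn" the preferred application from the selection history, as S5506 describes, is a frequency count over past selections. This is a sketch under that assumption (the text equally allows recency or background state as the criterion); the function name is illustrative:

```python
from collections import Counter

def preferred_app(selection_history):
    """Return the application the user selected most often in the past,
    or None when there is no history (another preset criterion would
    then determine the second application)."""
    if not selection_history:
        return None
    # Counter.most_common(1) yields [(app, count)] for the top app.
    return Counter(selection_history).most_common(1)[0][0]
```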

Next, the controller 230 simultaneously displays the screen of the determined second application and the screen of the first application (S5508).

Fig. 56 is a flowchart of a method of controlling the device 100a according to another exemplary embodiment.

Referring to fig. 56, when the device 100a detects a folding motion or an unfolding motion, the device 100a determines an application preset by a user as the second application.

First, the device 100a displays a screen of a first application (S5602), and when the state detector 220 detects a folding motion or an unfolding motion (S5604), the controller 230 determines an application previously set by the user as the second application (S5606). For example, the setting may be made in such a manner that, when the user unfolds the device 100a (i.e., when the state of the device 100a changes from the folded state to the open state), the device 100a always displays the screen of the first application and the screen of an internet browser application at the same time. The user may set the second application in advance by using a settings application or a settings menu of the device 100a.

Next, the controller 230 simultaneously displays the screen of the determined second application and the screen of the first application (S5608).

The user may determine an application commonly used by the user as the second application. Thus, the user can access a commonly used application by simply performing a folding motion or an unfolding motion.

Fig. 57 shows a flowchart of a method of controlling the device 100a according to another exemplary embodiment.

Referring to fig. 57, when the device 100a detects a folding motion or an unfolding motion, the device 100a determines the second application based on the state of the user.

First, the device 100a displays a screen of a first application (S5702), and when the state detector 220 detects a folding motion or an unfolding motion (S5704), the controller 230 determines a second application based on the state of the user (S5706). For example, the controller 230 may determine the second application in consideration of the current location, time, schedule, etc. of the user.

Next, the controller 230 simultaneously displays the screen of the determined second application and the screen of the first application (S5708).

Fig. 58 illustrates an example of a screen to determine a second application and display the second application according to another exemplary embodiment.

Referring to fig. 58, in a case where the first application is a schedule management application, the controller 230 may determine the second application based on the current location of the user, the current time, and the schedule. For example, in a case where the schedule management application includes a recorded schedule entry for a wedding at Daphne Cathedral at 3 p.m. today and the current time is 2 p.m., when the user performs an unfolding motion on the device while the schedule management application is displayed, the controller 230 may simultaneously display both the schedule management application and a map application showing a planned route from the current location to Daphne Cathedral.
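The Fig. 58 scenario can be sketched as a comparison of the current time against the next schedule entry. The one-hour threshold, the minutes-since-midnight encoding, and all names are assumptions made for the sketch:

```python
NEAR_THRESHOLD_MIN = 60  # assumed: an event within the next hour is "near"

def second_app_for_schedule(now_min, next_event_min, venue):
    """If the next scheduled event at a known venue is near, pair the
    schedule screen with a map screen routed to that venue.
    Times are minutes since midnight for simplicity."""
    if venue and 0 <= next_event_min - now_min <= NEAR_THRESHOLD_MIN:
        return ("map application", venue)  # show route to the upcoming venue
    return (None, None)  # not near: another preset criterion applies
```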

Fig. 59 shows a flowchart of a method of controlling the device 100c according to another exemplary embodiment.

Referring to fig. 59, when the device 100c detects a folding motion or an unfolding motion, the device 100c may, according to a user input, either simultaneously display a screen of a first application and a screen of a second application, or display only the screen of the first application without displaying the screen of the second application.

First, the device 100c displays a screen of a first application and simultaneously provides a window division selection menu (S5902). In one or more exemplary embodiments, the window division selection menu may be provided as a GUI displayed on the display 210. The window division selection menu may also be provided in a preset region of the housing of the device 100c (similar to the second application selection menu 3520 shown in fig. 35).

Next, when the state detector 220 detects a folding motion or an unfolding motion (S5904), the controller 230 determines whether window division is selected (S5906).

When window division is selected, the controller 230 simultaneously displays the screen of the first application and the screen of the second application (S5908). The second application may be determined according to the various methods described above.

When window division is not selected (S5906), the controller 230 displays a screen of the first application (S5910). That is, when window division is not selected, the controller 230 displays only the screen of the first application, and does not display the screen of the second application.
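The S5906–S5910 branch reduces to a single decision on the window-division flag. A minimal sketch, with an illustrative return convention (list of screens to show):

```python
def screens_to_display(division_selected, first_app, second_app):
    """After a fold/unfold event (S5904), return which application
    screens to display based on the window-division selection."""
    if division_selected:
        return [first_app, second_app]  # S5908: split window, both screens
    return [first_app]                  # S5910: first application only
```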

Fig. 60 illustrates an example of providing a window division selection menu 6020 according to an exemplary embodiment.

Referring to fig. 60, when the device 100c is asymmetrically folded, a screen of a first application is displayed on the display 210, and a window division selection menu 6020 is provided through the fourth area 6010.

From the window division selection menu 6020, the user may select a divided window so that the screen of the second application and the screen of the first application are simultaneously displayed, or may select a full-screen window so that only the screen of the first application is displayed without displaying the screen of the second application. As shown in fig. 60, the fourth area 6010 may be divided into two areas, and the window division options may be matched with the two areas.

The fourth area 6010 may be on the display 210. The fourth area 6010 may be defined as a portion of the area of the display 210 exposed to the outside in the folded state. When the display 210 is a touch screen, the window division selection menu 6020 may be displayed on the fourth area 6010 and may receive a user selection in the form of a touch input to the display 210.

In one or more exemplary embodiments, the fourth area 6010 may be disposed on the housing of the device 100c; for example, the fourth area 6010 may be an area where a touch sensor, a proximity sensor, or the like is disposed. Information regarding the window division selection menu 6020 may be displayed in the fourth area 6010 on the housing, or may be displayed on a portion of the outwardly exposed area of the display 210.

When the device 100c is asymmetrically folded, the window division selection menu 6020 is disposed in a preset region physically separated from the portion of the display 210 on which the screen of the first application is displayed, so that the user can intuitively access the window division selection menu 6020. In addition, by providing the window division selection menu 6020 in the area of the lower layer exposed by the asymmetric folding, the window division selection menu 6020 can be displayed without covering the screen of the first application and without complicating the screen layout.

Fig. 61 illustrates an example of selecting, from the window division selection menu, an option to display the screen of the second application, according to an exemplary embodiment.

Referring to fig. 61, in the folded state of the device 100c, when the user selects the option of displaying the screen of the second application and changes the state of the device 100c from the folded state to the open state, the controller 230 simultaneously displays the screen of the first application and the screen of the second application.

Fig. 62 illustrates an example of selecting, from the window division selection menu, an option not to display the screen of the second application, according to an exemplary embodiment.

Referring to fig. 62, in the folded state of the device 100c, when the user selects the option not to display the screen of the second application and changes the state of the device 100c from the folded state to the open state, the controller 230 displays only the screen of the first application. In the open state, the size or setting of the screen of the first application may be changed according to a preset criterion.

Fig. 63 illustrates an example of providing a window setting selection menu 6320 according to an exemplary embodiment.

According to the present embodiment, when the device 100c simultaneously displays the screen of the first application and the screen of the second application, the device 100c may provide a window setting selection menu 6320 for receiving an input selecting the layout of the screen of the first application and the screen of the second application. The user may select the layout of the screen of the first application and the screen of the second application from the window setting selection menu 6320. The window setting selection menu 6320 may be provided through the display 210 or the housing of the device 100c.

Referring to fig. 63, the device 100c is asymmetrically folded, and the device 100c in the folded state displays a screen of the first application on the display 210 and provides a window setting selection menu 6320 in the fifth region 6310.

The window setting selection menu 6320 is a menu from which the user can select options related to the layout of the screens of the first and second applications. For example, the window setting selection menu 6320 may include options as to whether the first application is set in the left area, in the right area, or in full screen. The fifth region 6310 may be divided into three regions, and the window setting options may be matched with the three regions.
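The three window-setting options described above determine where each application's screen goes (see also figs. 64–66). A minimal sketch; the option strings and the dictionary return shape are illustrative:

```python
def layout_for_option(option):
    """Map the selected window-setting option to the regions occupied by
    the first and second application screens."""
    if option == "left":
        return {"first": "left", "second": "right"}
    if option == "right":
        return {"first": "right", "second": "left"}
    if option == "full":
        # Full screen: the second application's screen is not displayed.
        return {"first": "full", "second": None}
    raise ValueError("unknown window-setting option: " + option)
```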

The fifth region 6310 may be a region on the display 210. The fifth region 6310 may be defined as a portion of the area of the display 210 exposed to the outside in the folded state. When the display 210 is a touch screen, the window setting selection menu 6320 may be displayed on the fifth region 6310 and may receive a user selection in the form of a touch input to the display 210.

In one or more exemplary embodiments, the fifth region 6310 may be provided on the housing of the device 100c, and may be, for example, a region where a touch sensor, a proximity sensor, or the like is provided. Information about the window setting selection menu 6320 may be displayed in the fifth region 6310 on the housing, or may be displayed on a portion of the outwardly exposed area of the display 210.

When the device 100c is asymmetrically folded, the window setting selection menu 6320 may be disposed in a preset area physically separated from the portion of the display 210 on which the screen of the first application is displayed, so that the user may intuitively access the window setting selection menu 6320. In addition, the window setting selection menu 6320 may be provided in the area of the lower layer exposed by the asymmetric folding; the window setting selection menu 6320 may thus be displayed without covering the screen of the first application and without complicating the screen layout.

Fig. 64 illustrates an example of selecting, from the window setting selection menu, an option to display the screen of the first application in the right area of the screen, according to an exemplary embodiment.

Referring to fig. 64, in the folded state of the device 100c, when the user selects the option of displaying the screen of the first application in the right area and changes the state of the device 100c from the folded state to the open state, the controller 230 displays the screen of the first application in the right area and displays the screen of the second application in the left area of the screen.

Fig. 65 shows an example of selecting, from the window setting selection menu, an option to display the first application in full screen, according to an exemplary embodiment.

Referring to fig. 65, in the folded state of the device 100c, when the user selects the option to display the first application in full screen and changes the state of the device 100c from the folded state to the open state, the controller 230 displays the first application in full screen without displaying the screen of the second application.

Fig. 66 illustrates an example of selecting, from the window setting selection menu, an option to display the screen of the first application in the left area of the screen, according to an exemplary embodiment.

Referring to fig. 66, in the folded state of the device 100c, when the user selects the option to display the screen of the first application in the left area and changes the state of the device 100c from the folded state to the open state, the controller 230 displays the screen of the first application in the left area and displays the screen of the second application in the right area of the screen.

Fig. 67 illustrates settings of a screen of a first application and a screen 6710 of a second application according to an exemplary embodiment.

In one or more exemplary embodiments, when the first application screen and the second application screen 6710 are simultaneously displayed, the two screens may be arranged in various ways.

As shown in fig. 67, in the case where the folded shape of the device 100a is changed and the first application screen and the second application screen 6710 are thus displayed simultaneously, the second application screen 6710 may be displayed with the first application screen partially covered. In this case, for example, the second application screen 6710 may be displayed in the form of a pop-up window.

Further, in a case where the folded shape of the device 100a is changed and the first application screen and the second application screen 6710 are thus displayed simultaneously, the first application screen may be displayed in full-screen form and the second application screen 6710 may be displayed so as to cover a portion of the first application screen. Alternatively, the second application screen 6710 may be displayed in full-screen form, and the first application screen may be displayed so as to cover a portion of the second application screen 6710.

In some embodiments, as described above, the first application screen and the second application screen 6710 may be displayed in equally divided regions of the screen of the device 100a.

Fig. 68 illustrates settings of a first application screen and a second application screen according to another exemplary embodiment.

Referring to fig. 68, when the device 100a detects a folding motion and thus displays a first application screen and a second application screen simultaneously, a plurality of second application screens may be displayed simultaneously with the first application screen. The plurality of second application screens may be screens of the same application or screens of different applications.

In the case where the folded shape of the device 100a is changed and the first application screen and the second application screen are thus simultaneously displayed, the screen of the device 100a may be divided as shown in fig. 68, and the first application screen and the plurality of second application screens may be arranged in the divided screen of the device 100a. According to various exemplary embodiments, the size of the screen of the first application and the size of the screen of the second application may be the same as or different from each other.

When the first application screen and the second application screen are simultaneously displayed, the size of the first application screen and/or the size of the second application screen may be changed, the position of the first application screen and/or the position of the second application screen may be changed, or one of the plurality of screens may be deleted, according to a user input.

Fig. 69 illustrates the operation of the device 100 according to an exemplary embodiment.

Referring to fig. 69, the device 100 detects the gaze of the user, and thus the device 100 may turn on only the one of the first display 310 and the second display 320 toward which the user's gaze is directed, and may turn off the other display. For example, when the user views the surface on which the first display 310 is disposed, the second display 320 may be turned off. In order to detect the gaze of the user, the device 100 may include a camera, a gaze detection sensor, or the like.
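The gaze-driven power policy above can be sketched as follows; the dictionary keys and the string values for the gazed surface are illustrative assumptions, not identifiers from the patent.

```python
# Hypothetical sketch of the gaze-driven display policy: only the display
# on the surface the user is looking at stays on; the other is turned off.

def select_active_display(gazed_surface: str) -> dict:
    """Return the power state (True = on) of each display for a gazed surface."""
    displays = {"first_display_310": False, "second_display_320": False}
    if gazed_surface == "first":
        displays["first_display_310"] = True   # user views the first surface
    elif gazed_surface == "second":
        displays["second_display_320"] = True  # user views the second surface
    # If no gaze is detected, both displays may remain off to save power.
    return displays

print(select_active_display("first"))
# {'first_display_310': True, 'second_display_320': False}
```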

Fig. 70 illustrates an example of providing information on a second application according to an exemplary embodiment.

Referring to fig. 70, when a screen of a first application is displayed, information on a second application may also be provided. For example, in the case where the second application is a contacts application (as shown in fig. 70), when the device 100a is unfolded, a screen of the contacts application may be displayed on the device 100a.

Fig. 71 illustrates an example of providing information on a second application according to another exemplary embodiment.

Referring to fig. 71, in the case where a notification event occurs while a screen of a first application is displayed, the device 100a may provide information indicating that the user may check the notification event by unfolding the device 100a. For example, in the case where a text message is received while the screen of the first application is displayed (as shown in fig. 71), the device 100a may indicate that the text message can be checked by unfolding the device 100a.

Fig. 72 shows a structure of a device 100d according to another exemplary embodiment.

Referring to fig. 72, the display 210c may be disposed on a first surface of the device 100d, which is hidden when the device 100d is folded, and no display may be disposed on a second surface of the device 100d, which is on the opposite side of the first surface. Further, when the device 100d is folded asymmetrically, a portion of the first surface may be exposed outward.

A second application selection menu 3520 may be disposed in an area of the first surface that is exposed outward when the device 100d is asymmetrically folded.

The second application selection menu 3520 may be displayed as a GUI on the display 210c. The display 210c may include a sixth region 7210 that is exposed to the outside when the device 100d is asymmetrically folded. The display 210c may include the sixth region 7210 and a seventh region 7220 that are integrally formed or divided by a frame or the like, wherein the second application selection menu 3520 is displayed in the sixth region 7210 and an operation screen of the device 100d is displayed in the seventh region 7220.

In one or more exemplary embodiments, the second application selection menu 3520 may be provided so as to receive a user input by using a touch sensor or key buttons provided in a preset area of the housing of the device 100d. Information regarding the second application selection menu 3520 may be displayed on an area of the display 210c.

Fig. 73 illustrates an example of the second application selection menu 3520 according to an exemplary embodiment.

The second application selection menu 3520 may display user-selectable applications, and a user may select an application from the second application selection menu 3520. As shown in fig. 73, the sixth region 7210 may be divided into a plurality of areas, and the user-selectable applications may be matched with the plurality of areas.

Fig. 74 illustrates an example of selecting a second application from the second application selection menu 3520 according to an exemplary embodiment.

Referring to fig. 74, the device 100d in a folded state provides the second application selection menu 3520. When the user selects one of the plurality of second applications included in the second application selection menu 3520 and unfolds the device 100d (S7402), the device 100d may display a screen of the user-selected second application on a predetermined region of the display 210c. For example, in the case where the device 100d is a smartphone in a folded state and a waiting mode, when the user selects a call function from the second application selection menu 3520 and changes the folded state of the device 100d to an open state (S7402), the second application screen may be displayed in an eighth area 7410 of the display 210c, and the first application screen may be displayed in a ninth area 7420 (S7404). The first application screen may include a home screen, an application list screen, a screen of an application preset by the user, or the like, which is displayed when the screen is turned on in the waiting mode.
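The unfold-and-display behavior above can be sketched as a handler for fold-state transitions. All names (region keys, state strings, the default home screen) are assumptions for illustration; the patent does not specify an API.

```python
# Illustrative sketch: on a folded-to-open transition, the selected second
# application is shown in one region and a default first application screen
# (e.g., a home screen) in another (cf. steps S7402 and S7404).
from typing import Optional

def on_state_change(old_state: str, new_state: str,
                    selected_second_app: Optional[str]) -> dict:
    layout = {}
    if old_state == "folded" and new_state == "open":
        # Second application screen in area 7410, first application screen
        # (home screen in this sketch) in area 7420.
        layout["eighth_area_7410"] = selected_second_app or "none"
        layout["ninth_area_7420"] = "home_screen"
    return layout

print(on_state_change("folded", "open", "call"))
# {'eighth_area_7410': 'call', 'ninth_area_7420': 'home_screen'}
```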

Fig. 75 illustrates a process of providing a second application screen according to an exemplary embodiment.

When the user makes a call by using the device 100d, the device 100d can recognize the contents of the user's voice conversation. When the device 100d recognizes a word related to a specific application from the content of the voice conversation, the device 100d includes the specific application in the second application selection menu 3520. Recognition of the content of a voice conversation can be performed by utilizing various speech recognition algorithms. Information regarding the particular application associated with the recognized word may be stored in the device 100d or may be set by the user.

For example, as shown in fig. 75, when the user makes a call by using the device 100d in the folded state, the user may mention the word "schedule" (S7532). If the word "schedule" and an application related to the word "schedule" are registered in the device 100d, the device 100d recognizes the word "schedule" (S7532). The device 100d may set a selection menu 7510 for selecting an application related to the word "schedule" on the second application selection menu 3520 (S7534). The device 100d may set the selection menu 7510 corresponding to an application related to the recognized word on a touch interface using the display 210c, and may display a picture or text corresponding to the selection menu 7510 on the touch interface.

The user can select the related application by using the selection menu 7510, and can unfold the device 100d (S7534) to display an execution screen 7520 of the related application (S7536). Because the device 100d recognizes the contents of the user's voice conversation and allows the user to easily use the related application, user convenience is increased.
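The keyword-to-application step above can be sketched with a simple lookup table, assuming recognized words are matched against registered keywords and matching applications are added to the selection menu (cf. S7532 and S7534). The table contents and function names are illustrative assumptions.

```python
# Minimal sketch: map words recognized during a call to registered
# applications and add them to the second application selection menu.

KEYWORD_TO_APP = {
    "schedule": "calendar_app",   # e.g., the "schedule" example in fig. 75
    "photo": "gallery_app",
    "address": "contacts_app",
}

def update_selection_menu(recognized_words, menu):
    """Add applications related to recognized words to the selection menu."""
    for word in recognized_words:
        app = KEYWORD_TO_APP.get(word.lower())
        if app and app not in menu:  # avoid duplicate menu entries
            menu.append(app)
    return menu

menu = update_selection_menu(["Let's", "set", "a", "schedule"], [])
print(menu)  # ['calendar_app']
```

In practice the recognized words would come from a speech recognition algorithm, and the table could be preset in the device or configured by the user, as the text describes.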

Fig. 76 shows the structure of a device 100e according to an exemplary embodiment.

The device 100e is a rollable device. As shown in fig. 76, the display 210d of the device 100e may have two states, namely, a rolled state and an unrolled state. The user may change the state of the device 100e from the rolled state to the unrolled state by grasping and pulling apart a portion 7610 of the device 100e or by pressing a preset button of the device 100e. The second application selection menu 3520 may be provided in a preset area of the device 100e that is exposed to the outside when the device 100e is rolled. For example, as shown in fig. 76, the second application selection menu 3520 may be provided in a preset area of a housing of the device 100e, wherein the housing keeps the display 210d in the rolled state.

The second application selection menu 3520 may be provided in the form of a touch screen, a touch sensor, a button, or the like. In one or more exemplary embodiments, the type of application selectable from the second application selection menu 3520 may be preset in the device 100e, may be determined according to a user's selection, or may be changed according to an operation mode of the device 100e.

Fig. 77 illustrates a process in which the device 100e provides the second application selection menu according to an exemplary embodiment.

Referring to fig. 77, the device 100e in the rolled state provides the second application selection menu 3520, and the user selects an application from the second application selection menu 3520 (S7702). Thereafter, when the user changes the state of the device 100e to the unrolled state (S7704), the display 210d is exposed, and a screen of the application selected by the user is displayed on the display 210d (S7706). The selection of the application may be performed before the user unrolls the display 210d (S7702), may be performed while the user unrolls the display 210d, or may be performed after the display 210d is unrolled.

When the user selects the second application from the second application selection menu 3520, the device 100e may simultaneously display a screen of the second application selected by the user and a screen of a preset first application or a screen of a currently executed first application.
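The rolled/unrolled behavior described above can be sketched as a small state machine in which selection may happen before or after unrolling, and the selected application's screen is shown once the display is exposed (cf. S7702 to S7706). The class and method names are assumptions for illustration.

```python
# Hypothetical state sketch for the rollable device 100e.

class RollableDevice:
    def __init__(self):
        self.state = "rolled"       # display starts rolled up
        self.selected_app = None    # app chosen from the selection menu
        self.displayed_app = None   # app shown once the display is exposed

    def select_app(self, app: str):
        # Selection may occur before, during, or after unrolling.
        self.selected_app = app
        if self.state == "unrolled":
            self.displayed_app = app

    def unroll(self):
        # S7704/S7706: the display is exposed and the selected screen shown.
        self.state = "unrolled"
        if self.selected_app:
            self.displayed_app = self.selected_app

dev = RollableDevice()
dev.select_app("music_player")  # S7702: select while still rolled
dev.unroll()
print(dev.displayed_app)  # music_player
```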

Fig. 78 shows a block diagram of a device 100f according to another exemplary embodiment.

As shown in fig. 78, the structure of the device 100f may be applied to various types of devices, including a mobile phone, a tablet PC, a personal digital assistant (PDA), an MP3 player, an autonomous service terminal, an electronic photo frame, a navigation device, a digital TV, or a wearable device (such as a watch or a head-mounted display (HMD)).

Referring to fig. 78, the device 100f may include at least one selected from: a display 7810 (i.e., a display unit), a controller 7870 (i.e., a control unit), a memory 7820, a Global Positioning System (GPS) chip 7825, a communicator 7830 (i.e., a communication unit or transceiver), a video processor 7835, an audio processor 7840, a user interface 7845 (i.e., a user input unit), a microphone 7850 (i.e., a microphone unit), a camera 7855 (i.e., an image capture unit), a speaker 7860 (i.e., a speaker unit), a motion detector 7865 (i.e., a motion detection unit), and a sensor 7880 (i.e., a sensing unit).

The display 7810 may include a display panel 7811 and a controller for controlling the display panel 7811. The display panel 7811 may be any of various displays, including a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, an active-matrix organic light-emitting diode (AMOLED) display, a plasma display panel (PDP), etc. The display panel 7811 may be flexible, transparent, and/or wearable. The display 7810 may be combined with the touch panel 7847 of the user interface 7845 and thus may be provided as a touch screen. The touch screen may include a module having a stacked structure including the display panel 7811 and the touch panel 7847.

The memory 7820 may include at least one of an internal memory and an external memory.

The internal memory may include at least one of: volatile memory (e.g., dynamic random access memory (DRAM), static RAM (SRAM), synchronous dynamic RAM (SDRAM), etc.), non-volatile memory (e.g., one-time programmable read-only memory (OTPROM), programmable ROM (PROM), erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), mask ROM, flash ROM, etc.), a hard disk drive (HDD), and a solid-state drive (SSD). The controller 7870 may load instructions or data from at least one of the non-volatile memory and other elements into the volatile memory and may process the instructions or data. Further, the controller 7870 may store data received from or generated by another element in the non-volatile memory.

The external memory may include at least one of: compact flash (CF) memory, secure digital (SD) memory, micro secure digital (Micro-SD) memory, mini secure digital (Mini-SD) memory, extreme digital (xD) memory, and a memory stick.

The memory 7820 may store various programs and data used in the operation of the device 100f. For example, the memory 7820 may temporarily or semi-permanently store portions of content to be displayed on a lock screen.

The controller 7870 may control the display 7810 to display a portion of the content stored in the memory 7820. In other words, the controller 7870 may display a portion of the content stored in the memory 7820 on the display 7810. When a user gesture is performed in an area of the display 7810, the controller 7870 may perform a control operation corresponding to the user gesture.

The controller 7870 may include at least one of: a RAM 7871, a ROM 7872, a central processing unit (CPU) 7873, a graphics processing unit (GPU) 7874, and a bus 7875. The RAM 7871, the ROM 7872, the CPU 7873, and the GPU 7874 may be interconnected via the bus 7875.

The CPU 7873 accesses the memory 7820 and performs a boot operation by using the operating system (OS) stored in the memory 7820. Then, the CPU 7873 performs various operations by using various programs, contents, or data stored in the memory 7820.

The ROM 7872 may store a set of instructions for starting the system. For example, when a boot instruction is input to the device 100f and power is supplied to the device 100f, the CPU 7873 may copy the OS stored in the memory 7820 to the RAM 7871 according to the instructions stored in the ROM 7872, run the OS, and thereby boot the system. When the boot operation is completed, the CPU 7873 may copy various programs stored in the memory 7820 to the RAM 7871, and may perform various operations by running the programs copied to the RAM 7871. When the device 100f starts up, the GPU 7874 displays a UI screen on the display 7810. In more detail, the GPU 7874 may generate a screen that displays an electronic document including various objects (e.g., content, icons, menus, etc.). The GPU 7874 calculates coordinate values of the objects to be displayed according to the layout of the UI screen, and calculates attribute values such as the shape, size, or color of each object. The GPU 7874 may then generate UI screens having various layouts including the objects, based on the calculated attribute values. A UI screen generated by the GPU 7874 may be provided to the display 7810 and may thus be displayed in an area of the display 7810.

The GPS chip 7825 may receive GPS signals from GPS satellites and may calculate the current location of the device 100f. In the case of using a navigation program or when the current location of the user is required, the controller 7870 may calculate the location of the user by using the GPS chip 7825.

The communicator 7830 may perform communication with various external devices according to various types of communication methods. The communicator 7830 may include at least one selected from a wireless fidelity (Wi-Fi) chip 7831, a Bluetooth chip 7832, a wireless communication chip 7833, and a near field communication (NFC) chip 7834. The controller 7870 may perform communication with various external devices by using the communicator 7830.

The Wi-Fi chip 7831 and the Bluetooth chip 7832 can perform communication by using Wi-Fi and Bluetooth, respectively. When the Wi-Fi chip 7831 or the Bluetooth chip 7832 is used, it may first transmit and receive various types of connection information (including a service set identifier (SSID), a session key, etc.), establish a connection for communication by using the connection information, and then transmit and receive various types of information. The wireless communication chip 7833 may be a chip that performs communication according to various communication standards, such as Institute of Electrical and Electronics Engineers (IEEE), ZigBee, third-generation mobile communication technology (3G), third-generation partnership project (3GPP), long-term evolution (LTE), and the like. The NFC chip 7834 refers to a chip that operates with NFC using the 13.56 MHz band among various radio-frequency identification (RF-ID) bands (e.g., 135 kHz, 13.56 MHz, 433 MHz, 860-960 MHz, 2.45 GHz, etc.).

The video processor 7835 may process video data included in content received by using the communicator 7830, or video data included in content stored in the memory 7820. The video processor 7835 may perform various image processing operations (e.g., decoding, scaling, noise filtering, frame rate conversion, resolution conversion, etc.) on the video data.

The audio processor 7840 may process audio data included in content received by using the communicator 7830, or audio data included in content stored in the memory 7820. The audio processor 7840 may perform various processing operations (e.g., decoding, amplification, noise filtering, etc.) on the audio data.

When a reproducing program for multimedia content is run, the controller 7870 may reproduce the multimedia content by driving the video processor 7835 and the audio processor 7840. The speaker 7860 may output audio data generated by the audio processor 7840.

The user interface 7845 may receive various instructions input from a user. The user interface 7845 may include at least one selected from keys 7846, a touch panel 7847, and a pen recognition panel 7848.

The keys 7846 may be of various types (e.g., mechanical buttons, rollers, etc.) that may be formed on the front, side, rear, etc. of the outer surface of the body of the device 100f.

The touch panel 7847 may sense a touch input by a user and may output a touch event value corresponding to a signal generated by the sensed touch input. When the touch panel 7847 is combined with the display panel 7811 and thus formed as a touch screen, the touch screen may be configured as a capacitive touch screen, a resistive touch screen, or a piezoelectric touch screen by using various types of touch sensors. A capacitive touch screen calculates touch coordinates by sensing the small amount of electricity generated when a body part of the user touches its surface, which is coated with a dielectric material. A resistive touch screen includes two embedded electrode plates, and calculates touch coordinates by sensing the flow of current that occurs when a user touches the screen, causing the upper and lower plates at the touch point to contact each other. Touch events occurring on a touch screen may be generated by a user's finger, but may also be generated by an object formed of a conductive material capable of changing capacitance.

The pen recognition panel 7848 may sense a proximity input or a touch input of a stylus performed by a user and may output a sensed pen proximity event or a sensed pen touch event. The pen recognition panel 7848 may be an electromagnetic resonance (EMR) type pen recognition panel, and may sense a touch input or a proximity input according to a change in electromagnetic field intensity occurring when a stylus approaches or touches the panel. In more detail, the pen recognition panel 7848 may include an electromagnetic induction coil sensor having a mesh structure and an electric signal processor for sequentially supplying an alternating current (AC) signal having a predetermined frequency to each loop coil of the electromagnetic induction coil sensor. When a pen having an internal resonant circuit is placed near a loop coil of the pen recognition panel 7848, the magnetic field transmitted from the loop coil generates a current in the resonant circuit of the pen based on mutual electromagnetic induction. Due to this current, an induced magnetic field is generated from the coil forming the resonant circuit in the pen, and the pen recognition panel 7848 detects the induced magnetic field via a loop coil capable of receiving a signal, thereby sensing the touch input or proximity input of the pen. The pen recognition panel 7848 may be disposed to occupy a predetermined area under the display panel 7811, and may have a size capable of covering the display area of the display panel 7811, for example.

The microphone 7850 may receive an input of the user's voice or other sounds and may convert the user's voice or other sounds into audio data. The controller 7870 may use the user's voice input via the microphone 7850 in a call-related operation, or may convert the user's voice into audio data and store the audio data in the memory 7820.

The camera 7855 may capture still images or moving pictures according to the user's control. The camera 7855 may include a front camera, a rear camera, and the like.

If the camera 7855 and the microphone 7850 are provided, the controller 7870 may perform a control operation according to the user's voice input via the microphone 7850 or the user's motion recognized by the camera 7855. For example, the device 100f may operate in a motion control mode or a voice control mode. If the device 100f operates in the motion control mode, the controller 7870 may activate the camera 7855, capture images of the user, track changes in the user's motion, and perform a control operation corresponding thereto. If the device 100f operates in the voice control mode (i.e., a voice recognition mode), the controller 7870 may analyze the user's voice input via the microphone 7850 and may perform a control operation according to the analyzed voice.

The motion detector 7865 may detect motion of the body of the device 100f. The device 100f may be rotated or tilted in various directions. The motion detector 7865 may detect motion characteristics (e.g., a rotation direction, a rotation angle, a tilt angle, etc.) by using at least one of a magnetic sensor, a gyro sensor, an acceleration sensor, and the like.
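As one illustration of how a tilt angle can be derived from an acceleration sensor (this is a standard computation, not the patent's specific method): the angle between the device's z-axis and gravity follows from the gravity component on each axis.

```python
# Illustrative tilt estimation from a 3-axis accelerometer reading.
import math

def tilt_angle_deg(ax: float, ay: float, az: float) -> float:
    """Angle between the device's z-axis and the gravity vector, in degrees."""
    g = math.sqrt(ax * ax + ay * ay + az * az)  # magnitude of measured gravity
    return math.degrees(math.acos(az / g))

# Flat on a table: gravity entirely on the z-axis -> 0 degrees of tilt.
print(round(tilt_angle_deg(0.0, 0.0, 9.81), 1))  # 0.0
# Standing upright: gravity on the y-axis -> 90 degrees of tilt.
print(round(tilt_angle_deg(0.0, 9.81, 0.0), 1))  # 90.0
```

A gyro sensor would instead integrate angular velocity over time, and a magnetic sensor would provide an absolute heading; combining them reduces drift and noise.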

The device 100f may further include a Universal Serial Bus (USB) port for connecting the device 100f with a USB connector, various external input ports for connecting with various external terminals (such as an earphone, a mouse, or a Local Area Network (LAN)), a Digital Media Broadcasting (DMB) chip for receiving and processing DMB signals, various sensors, and the like.

The names of the elements of the device 100f described above may vary. Further, the device 100f may include at least one selected from the above elements, or may be implemented with more or fewer elements than those described above.

The display 210 may correspond to the display 7810 of fig. 78. The controller 230 may correspond to the controller 7870 of fig. 78. The user interface 3310 may correspond to the user interface 7845 of fig. 78. The status detector 220 may correspond to the sensor 7880 of fig. 78.

The various operations and methods described above as responses to an unfolding motion of the device may generally also be applied in response to a folding motion of the device. Similarly, the various operations and methods described above as responses to a folding motion of the device may generally also be applied in response to an unfolding motion of the device.

As described above, according to one or more of the above exemplary embodiments, a user can conveniently view a screen for an application in a device having a foldable characteristic.

Further, according to one or more of the above exemplary embodiments, a user can easily use related functions in the foldable device.

Further, according to one or more of the above exemplary embodiments, a user can view an application screen appropriate for the current situation by simply folding or unfolding the flexible device.

One or more exemplary embodiments may also be embodied as computer readable code on a non-transitory computer readable recording medium. The non-transitory computer-readable recording medium may be any data storage device that can store data which can thereafter be read or executed by a computer or a processor.

According to one or more exemplary embodiments, the non-transitory computer readable code, when read or executed by a processor or computer, performs a method of controlling a device. The computer readable code can be constructed in a variety of programming languages. Also, functional programs, codes, and code segments for implementing one or more exemplary embodiments may be easily construed by programmers skilled in the art to which the inventive concept pertains.

Examples of the non-transitory computer readable recording medium include ROM, RAM, CD-ROMs, magnetic tapes, floppy disks, optical data storage devices, and the like. The non-transitory computer-readable recording medium may also be distributed over network-connected computer systems so that the computer-readable code is stored and executed in a decentralized manner.

It is to be understood that the exemplary embodiments described herein are to be considered in a descriptive sense and not for purposes of limitation. The description of features or aspects in each exemplary embodiment should be representatively considered as applicable to other similar features or aspects in other exemplary embodiments.

Although one or more exemplary embodiments have been described with reference to the accompanying drawings, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope defined by the following claims.