CN114968454A - Process arrangement, display method, head-mounted display device and computer-readable medium - Google Patents
- Tue Aug 30 2022
Info
- Publication number: CN114968454A
- Application number: CN202210470109.6A
Authority
- CN (China)
Prior art keywords
- task
- user
- workflow
- process node
- head
Prior art date
- 2022-04-28
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/44—Arrangements for executing specific programs
- G06F9/451—Execution arrangements for user interfaces
- G06F9/453—Help systems
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Software Systems (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Management, Administration, Business Operations System, And Electronic Commerce (AREA)
Abstract
Embodiments of the present disclosure disclose a flow orchestration method, a display method, a head-mounted display device, and a computer-readable medium. One embodiment of the method comprises: generating a workflow according to a process node information set and a process node connection information set corresponding to the process node information set, wherein the process node information set corresponds to a start node, at least one process node, and an end node, and the workflow corresponds to a set of task interfaces; and in response to detecting a publishing operation for the workflow, determining the publishing state of the workflow to be a published state, wherein each task interface corresponding to the workflow is to be displayed in the head-mounted display device corresponding to the workflow, and the process node information of the corresponding process node comprises an instruction for starting an executable unit of the head-mounted display device. With this embodiment, an operator can carry out actual operations without interruption, no data needs to be filled in manually, and the data generated during the operation can be stored through the head-mounted display device.
Description
Technical Field
Embodiments of the present disclosure relate to the field of computer technology, and in particular to a flow orchestration method, a display method, a head-mounted display device, and a computer-readable medium.
Background
A workflow is used to guide the actual work of an operator. At present, the approach commonly adopted by an operator at work is to operate according to a workflow in the form of a paper or electronic document.
However, this approach often suffers from the following technical problems: the operator must interrupt the current operation to consult the workflow in paper or electronic-document form, and a paper workflow must be filled in manually by the operator, so the filled-in data is difficult to store.
Disclosure of Invention
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
Some embodiments of the present disclosure propose a flow orchestration method, a display method, head-mounted display devices, and computer-readable media to address one or more of the technical problems mentioned in the background section above.
In a first aspect, some embodiments of the present disclosure provide a flow orchestration method, the method including: generating a workflow according to a process node information set and a process node connection information set corresponding to the process node information set, wherein the process node information set corresponds to a start node, at least one process node, and an end node, each piece of process node connection information in the process node connection information set corresponds to two pieces of process node information in the process node information set, and the workflow corresponds to a set of task interfaces; and in response to detecting a publishing operation for the workflow, determining the publishing state of the workflow to be a published state, wherein each task interface corresponding to the workflow is to be displayed in a head-mounted display device corresponding to the workflow, and the process node information of the corresponding process node comprises an instruction for starting an executable unit of the head-mounted display device.
In a second aspect, some embodiments of the present disclosure provide a flow display method applied to a head-mounted display device, the method including: in response to receiving a work task or recognizing a work task corresponding to a work task identification code, displaying, in a display screen of the head-mounted display device, a task interface corresponding to the workflow of the received or recognized work task, wherein the workflow is generated by the method described in any one of the implementations of the first aspect; determining whether an operation of the user on the task interface satisfies a jump condition corresponding to the task interface; and in response to detecting that the operation of the user on the task interface satisfies the jump condition, jumping to the next task interface corresponding to the task interface and the operation.
In a third aspect, some embodiments of the present disclosure provide an electronic device, comprising: one or more processors; one or more display screens for imaging in front of the eyes of a user wearing the head-mounted display device; a storage device having one or more programs stored thereon, which when executed by one or more processors, cause the one or more processors to implement the method described in any of the implementations of the second aspect.
In a fourth aspect, some embodiments of the disclosure provide a computer readable medium on which a computer program is stored, wherein the program, when executed by a processor, implements the method described in any of the implementations of the first or second aspects.
The above embodiments of the present disclosure have the following advantages: through the flow orchestration method of some embodiments of the disclosure, an operator can perform actual operations without interruption and without filling in data manually, and the data generated during the operation can be stored through a head-mounted display device. Specifically, the reasons the operator must interrupt the operation, fill in data manually, and struggle to store the data are as follows: the operator has to interrupt the current operation to consult the workflow in paper or electronic-document form, and a paper workflow must be filled in manually, so the filled-in data is difficult to store. Based on this, the flow orchestration method of some embodiments of the present disclosure first generates a workflow according to a process node information set and a process node connection information set corresponding to the process node information set, wherein the process node information set corresponds to a start node, at least one process node, and an end node; each piece of process node connection information corresponds to two pieces of process node information in the process node information set; and the workflow corresponds to a set of task interfaces. Therefore, the generated workflow can represent the task items that the operator can directly execute by reference during actual operation, and each task item can be presented to the operator directly in the form of a task interface. Then, in response to detecting a publishing operation for the workflow, the publishing state of the workflow is determined to be a published state. Each task interface corresponding to the workflow is to be displayed in the head-mounted display device corresponding to the workflow, and the process node information of the corresponding process node includes an instruction for starting an executable unit of the head-mounted display device. Therefore, the workflow in the published state can be displayed directly in the corresponding head-mounted display device, so that an operator wearing the device can view the displayed task interfaces in sequence and carry out the actual operation corresponding to each task interface in the order given by the workflow. Because the workflow is displayed in the head-mounted display device in the form of task interfaces, the operator can work directly under the guidance of the device without separately consulting a workflow in paper or electronic-document form. Data can also be entered and saved via the head-mounted display device. Thus, the operator can work continuously without filling in data manually, and the data produced during the operation can be stored through the head-mounted display device.
Drawings
The above and other features, advantages, and aspects of various embodiments of the present disclosure will become more apparent from the following detailed description taken in conjunction with the accompanying drawings. Throughout the drawings, the same or similar reference numbers refer to the same or similar elements. It should be understood that the drawings are schematic and that elements are not necessarily drawn to scale.
FIG. 1 is an architectural diagram of an exemplary system in which some embodiments of the present disclosure may be applied;
FIG. 2 is a flow diagram of some embodiments of a flow orchestration method according to the present disclosure;
FIG. 3 is a flow diagram of further embodiments of a flow orchestration method according to the present disclosure;
FIG. 4 is a flow chart of some embodiments of a flow display method according to the present disclosure;
FIG. 5 is a schematic structural diagram of a head-mounted display device suitable for implementing some embodiments of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the drawings, it is to be understood that the disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided for a more thorough and complete understanding of the present disclosure. It should be understood that the drawings and embodiments of the disclosure are for illustration purposes only and are not intended to limit the scope of the disclosure.
It should be noted that, for convenience of description, only the portions related to the related invention are shown in the drawings. The embodiments and features of the embodiments in the present disclosure may be combined with each other without conflict.
It should be noted that the terms "first", "second", and the like in the present disclosure are only used for distinguishing different devices, modules or units, and are not used for limiting the order or interdependence relationship of the functions performed by the devices, modules or units.
It is noted that references to "a", "an", and "the" modifications in this disclosure are intended to be illustrative rather than limiting, and that those skilled in the art will recognize that "one or more" may be used unless the context clearly dictates otherwise.
The names of messages or information exchanged between devices in the embodiments of the present disclosure are for illustrative purposes only, and are not intended to limit the scope of the messages or information.
The present disclosure will be described in detail below with reference to the accompanying drawings in conjunction with embodiments.
Fig. 1 illustrates an exemplary system architecture 100 to which the flow orchestration method or the flow display method of some embodiments of the present disclosure may be applied.
As shown in Fig. 1, the system architecture 100 may include terminal devices 101, 102, and 103, a network 104, and a server 105. The network 104 serves as a medium for providing communication links between the terminal devices 101, 102, 103 and the server 105. The network 104 may include various connection types, such as wired or wireless communication links, or fiber-optic cables, to name a few.
A user may use the terminal devices 101, 102, 103 to interact with the server 105 via the network 104 to receive or send messages and the like. Various communication client applications may be installed on the terminal devices 101, 102, 103, such as a web browser application, a search application, an instant messaging tool, a mailbox client, and social platform software.
The terminal devices 101, 102, and 103 may be hardware or software. When they are hardware, they may be electronic devices that have a display screen and support information display. For example, the terminal devices 101 and 102 include, but are not limited to, a mobile phone and a computer, while the terminal device 103 includes, but is not limited to, AR glasses, MR glasses, and the like. When the terminal devices 101, 102, 103 are software, they can be installed in the electronic devices listed above, implemented, for example, as multiple pieces of software or software modules for providing distributed services, or as a single piece of software or software module. No specific limitation is made here.
The server 105 may be a server providing various services, such as a background server supporting the information displayed on the terminal devices 101, 102, 103. The background server can analyze and process received data such as requests and feed the processing results back to the terminal devices.
It should be noted that the flow orchestration method provided by embodiments of the present disclosure may be executed by the server 105 or by the terminal device 101 or 102, while the flow display method provided by embodiments of the present disclosure may be executed by the terminal device 103.
The server may be hardware or software. When the server is hardware, it may be implemented as a distributed server cluster formed by multiple servers, or may be implemented as a single server. When the server is software, it may be implemented as multiple pieces of software or software modules, for example, to provide distributed services, or as a single piece of software or software module. And is not particularly limited herein.
It should be understood that the number of terminal devices, networks, and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
With continued reference to Fig. 2, a flow 200 of some embodiments of a flow orchestration method according to the present disclosure is shown. The flow orchestration method comprises the following steps.
Step 201: generating a workflow according to the process node information set and the process node connection information set corresponding to the process node information set.
In some embodiments, the execution body of the flow orchestration method (e.g., the server 105 or the terminal device 101 or 102 shown in Fig. 1) may generate a workflow based on the process node information set and the process node connection information set corresponding to the process node information set. The process node information set may be the information of each process node determined by a flow orchestration user. The process node connection information set may be the connection information between those process nodes that the flow orchestration user determines for the process node information set. Both sets can be pushed by the flow orchestration user through an API (application programming interface).
The process node information may be attribute-related information of a node corresponding to the workflow. The process node information set may correspond to a start node, at least one process node, and an end node. The process node information corresponding to the start node and the end node may include a node type and a node identification. The above-mentioned process nodes may be understood as intermediate nodes. The process node information corresponding to the process node may include a node type and process node attribute information. The process node attribute information may be information for configuring service data of the process node, and may include but is not limited to: the system comprises a node identifier, task interface information used for being displayed in a corresponding task interface, an API identifier used for receiving operation data of an operator, and an API identifier representing task execution logic of the node. The task interface information may include text, and may further include, but is not limited to, at least one of the following: title, button configuration information.
It should be noted that the process node attribute information is not limited to the configuration-related information listed above. The process node attribute information of a process node corresponds to a task item in actual operation, and different process nodes may include different configuration-related information. For example, the process node attribute information of a process node corresponding to a voice input task item may include: a node identification, a title, body text, a voice input type (which may be text or numbers), and button configuration information (which may include button display text and button color). As an example, the process node attribute information of a voice input task item may be "node identification: 02, title: voice input, text: please read the number displayed at A, voice input type: number, button configuration information: confirm, cancel, next". The process node attribute information of a photographing task item may be "node identification: 02, title: shooting, text: please take a complete image of B, button configuration information: photograph, next". The process node attribute information of a process node corresponding to a scan task item may include a check result. The check result is scan configuration information against which the scan result is verified, i.e., it is determined whether the scan result is the same as the check result. For example, the check result may be "xxx1b". If the scan result is "xxx2b", the scan result differs from the check result, execution of the scan task item fails, and the operator needs to scan again.
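The scan-verification behavior described above can be sketched as a simple comparison against the configured check result; the function name and return convention below are illustrative assumptions, not the patent's actual implementation:

```python
# Hypothetical sketch of the scan task item's verification step: the scan
# result is compared against the configured check result, and the operator
# must rescan on a mismatch. All names here are illustrative.

def verify_scan(scan_result: str, check_result: str) -> bool:
    """Return True when the scanned code matches the configured check result."""
    return scan_result == check_result

# Mirroring the example in the text, where the check result is "xxx1b":
assert verify_scan("xxx1b", "xxx1b") is True   # scan task item succeeds
assert verify_scan("xxx2b", "xxx1b") is False  # operator must scan again
```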
The process node connection information may be information for connecting two nodes and may include a front node identification and a back node identification. Their relationship is that a connecting line leads from the node corresponding to the front node identification to the node corresponding to the back node identification. When the node corresponding to the front node identification is a process node whose attribute information includes button configuration information, the process node connection information may further include a jump button identification. The jump button identification indicates that after the operator clicks the corresponding button in the task interface of the front node, the display jumps to the task interface corresponding to the back node identification. The jump button identification may be an identification that uniquely represents the button, and the button may display text or a button identification code.
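A minimal data model for the node and connection information described above might look as follows; the field names echo the example identifiers in the text, while the class names and types are assumptions for illustration only:

```python
# Illustrative data model for flow node information and flow node connection
# information. The patent does not fix a concrete schema; everything beyond
# the field meanings described in the text is an assumption.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class FlowNodeInfo:
    node_id: str                  # node identification
    node_type: str                # "start", "end", or a process-node type
    attributes: dict = field(default_factory=dict)  # title, text, buttons, ...

@dataclass
class FlowNodeConnection:
    source_id: str                # front (previous) node identification
    target_id: str                # back (next) node identification
    jump_button_id: Optional[str] = None  # set when the front node has buttons

link = FlowNodeConnection(source_id="001", target_id="002", jump_button_id="next")
assert link.target_id == "002"
```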
In practice, the execution body may create the workflow represented by the process node information set and the process node connection information set. The workflow may be process-related information that enables an operator to perform the corresponding operations according to the process node information arranged in execution order. Any two connected pieces of process node information in the workflow are connected according to the corresponding process node connection information: the first of the two corresponds to the front node identification included in the connection information, and the second corresponds to the back node identification.
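The workflow-generation step above can be sketched as building a lookup from (front node, jump button) to the back node; the data layout is an assumption, since the patent does not prescribe an implementation:

```python
# Minimal sketch of generating a workflow: connect each pair of flow node
# information entries according to the connection information, producing a
# lookup from (node, button) to the next node. Structure is illustrative.

def build_workflow(nodes: dict, links: list) -> dict:
    """nodes: {node_id: node_info}; links: [(source_id, target_id, jump_button_id)]."""
    next_node = {}
    for source_id, target_id, button_id in links:
        # The front node's button leads to the back node's task interface.
        next_node[(source_id, button_id)] = target_id
    return {"nodes": nodes, "next": next_node}

wf = build_workflow(
    {"001": {"name": "device photo"}, "002": {"name": "device scan"}},
    [("001", "next", )[:1] + ("002", "next")[:0] + ("002", "next")][:0] or
    [("001", "002", "next")],
)
assert wf["next"][("001", "next")] == "002"
```

(The lookup makes the later display step a constant-time jump per button press.)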
By way of example, a workflow may be represented as:

{ workflowName: assembly two-line alternate-day polling process;
  linksInputList: [ workflowNodeIdSouce: 001, workflowNodeIdTarget: 002;
                    workflowNodeIdSouce: 002, workflowNodeIdTarget: 003 ];
  nodeInputList: [ workflowNodeId: 001, workflowNodeName: device photo, info: "text: please take a full-body photo of the device";
                   workflowNodeId: 002, workflowNodeName: device scan, info: "text: please scan the device";
                   workflowNodeId: 003, info: "text: please read out the instrument display value" ] }

Here, workflowName is the name of the workflow; linksInputList is the process node connection information set; nodeInputList is the process node information set; workflowNodeIdSouce is the front node identification; workflowNodeIdTarget is the back node identification; workflowNodeId is the node identification; workflowNodeName is the node name; and info is the task interface information.
It is noted that in one or more embodiments of the present application, the workflow may be in a data exchange format, such as, but not limited to, one of the following: JSON, XML.
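As the text notes, the workflow may be carried in a data-exchange format such as JSON. A sketch of serializing the example workflow follows; the field names (including the `workflowNodeIdSouce` spelling) follow the example above, and the exact schema is otherwise an assumption:

```python
# Serializing the example workflow to JSON. The key names mirror the
# example in the text; the precise schema is an assumption.
import json

workflow = {
    "workflowName": "assembly two-line alternate-day polling process",
    "linksInputList": [
        {"workflowNodeIdSouce": "001", "workflowNodeIdTarget": "002"},
        {"workflowNodeIdSouce": "002", "workflowNodeIdTarget": "003"},
    ],
    "nodeInputList": [
        {"workflowNodeId": "001", "workflowNodeName": "device photo"},
        {"workflowNodeId": "002", "workflowNodeName": "device scan"},
    ],
}

payload = json.dumps(workflow, ensure_ascii=False)
# Round-trip check: the serialized form preserves the connection set.
assert json.loads(payload)["linksInputList"][0]["workflowNodeIdTarget"] == "002"
```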
Optionally, before step 201, the execution body may determine the process node templates selected by the user from a process node template library as the target process node templates. The process node template library may be a set of process node templates corresponding to various task items; for example, it may include, but is not limited to: a start node template, an end node template, a condition judgement node template, a shooting node template, and a voice input node template. The library may further include custom node templates, i.e., node templates created by users according to their needs. For example, the process node template library may be displayed on a workflow editing page, and the flow orchestration user may select a process node template by dragging it into the editing area. The workflow editing page may be a page for editing the nodes of a workflow and the connecting lines between them.
Then, the process node attribute information configured by the user for each of the target process node templates may be determined as process node information, yielding the process node information set. Next, the connecting lines drawn by the user between the target process node templates may be determined as the target connecting lines; each connecting line is a directed line that the user leads out from one target process node template and points to another.
Finally, for each target connecting line, the two node identifications corresponding to that line and the jump button configuration information corresponding to the first node identification may be combined into process node connection information, yielding the process node connection information set. The target connecting line points from the target process node template corresponding to the first node identification to the one corresponding to the second node identification. The jump button configuration information may be information for configuring the button used to jump to the next target process node template and may include a jump button identification; when the process node attribute information corresponding to the first node identification does not include a jump button identification, the jump button configuration information is null. In this way, the flow orchestration user can configure, by dragging, the process node information set and the process node connection information set used to generate the workflow.
Optionally, before the connecting lines drawn by the user between the target process node templates are determined as the target connecting lines, the execution body may determine the first target process node template selected by the user as the front process node template. Then, in response to determining that the number of connections of the front process node template is smaller than its connection-count threshold, the second target process node template selected by the user is determined as the back process node template. The number of connections is the number of connecting lines already led out of the front process node template, and the connection-count threshold is the maximum number of connecting lines that may be led out of it. The front process node template and the back process node template may then be connected by a connecting line.
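The optional connection check above amounts to comparing a count against a per-template threshold before allowing another line to be drawn; the function name and the example threshold are illustrative assumptions:

```python
# Sketch of the connection-count check: a connecting line may be led out of
# a front node template only while the number of lines already drawn is
# below that template's connection-count threshold. Names are illustrative.

def can_add_connection(drawn_count: int, threshold: int) -> bool:
    """True when another connecting line may be led out of the template."""
    return drawn_count < threshold

# A condition-judgement node template might, for example, allow two outgoing
# lines (one per branch), while a shooting node allows one:
assert can_add_connection(drawn_count=1, threshold=2) is True
assert can_add_connection(drawn_count=2, threshold=2) is False
```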
Step 202: in response to detecting a publishing operation for the workflow, determining the publishing state of the workflow to be a published state.
In some embodiments, the execution body may determine the publishing state of the workflow to be a published state in response to detecting a publishing operation for the workflow. The publishing operation may be the operation by which the flow orchestration user confirms publication of the workflow, for example, a confirmation issued through an API. The task interfaces of a workflow in the published state may be displayed in the corresponding head-mounted display devices. The head-mounted display device corresponding to the workflow may be one that receives the workflow or one that recognizes it, and there may be one or more such devices. The display order of the task interfaces in the head-mounted display device is determined by the execution logic of the workflow.
The head-mounted display device may be a device through which the wearing user views the presented virtual scene and which can receive data input by the user, including but not limited to voice, images, and video. For example, the head-mounted display device may be AR glasses or MR glasses. The process node information of the corresponding process node may contain an instruction for starting an executable unit of the head-mounted display device.
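Putting the pieces together, the display behavior of a published workflow (show a task interface, wait for an operation, jump when the jump condition is satisfied) can be sketched as below; the loop structure and all names are assumptions, not the patent's implementation:

```python
# Hypothetical sketch of how a head-mounted display device might walk the
# published workflow's task interfaces: display an interface, wait for an
# operator action, and jump when the action satisfies the interface's jump
# condition. All names and the data layout are illustrative assumptions.

def run_workflow(interfaces: dict, next_node: dict, start_id: str, operations):
    """interfaces: {node_id: interface}; next_node: {(node_id, op): node_id}."""
    shown = []
    current = start_id
    for op in operations:
        shown.append(current)           # this node's task interface is displayed
        key = (current, op)
        if key in next_node:            # jump condition satisfied by the action
            current = next_node[key]
        if current == "end":
            break
    return shown + [current]

order = run_workflow(
    {"start": "s", "photo": "p"},
    {("start", "next"): "photo", ("photo", "photo_taken"): "end"},
    "start",
    ["next", "photo_taken"],
)
assert order == ["start", "photo", "end"]
```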
The above embodiments of the present disclosure have the following advantages: through the flow orchestration method of some embodiments of the disclosure, an operator can perform actual operations without interruption and without filling in data manually, and the data generated during the operation can be stored through a head-mounted display device. Specifically, the reasons the operator must interrupt the operation, fill in data manually, and struggle to store the data are as follows: the operator has to interrupt the current operation to consult the workflow in paper or electronic-document form, and a paper workflow must be filled in manually, so the filled-in data is difficult to store. Based on this, the flow orchestration method of some embodiments of the present disclosure first generates a workflow according to a process node information set and a process node connection information set corresponding to the process node information set. The process node information set corresponds to a start node, at least one process node, and an end node; each piece of process node connection information corresponds to two pieces of process node information in the process node information set; and the workflow corresponds to a set of task interfaces. Therefore, the generated workflow can represent the task items that the operator can directly execute by reference during actual operation, and each task item can be presented to the operator directly in the form of a task interface. Then, in response to detecting a publishing operation for the workflow, the publishing state of the workflow is determined to be a published state. Each task interface corresponding to the workflow is to be displayed in the head-mounted display device corresponding to the workflow, and the process node information of the corresponding process node includes an instruction for starting an executable unit of the head-mounted display device. Therefore, the workflow in the published state can be displayed directly in the corresponding head-mounted display device, so that an operator wearing the device can view the displayed task interfaces in sequence and carry out the actual operation corresponding to each task interface in the order given by the workflow. Because the workflow is displayed in the head-mounted display device in the form of task interfaces, the operator can work directly under the guidance of the device without separately consulting a workflow in paper or electronic-document form. Data can also be entered and saved via the head-mounted display device. Thus, the operator can work continuously without filling in data manually, and the data produced during the operation can be stored through the head-mounted display device.
With further reference to Fig. 3, a flow 300 of further embodiments of a flow orchestration method is illustrated. The flow 300 of the flow orchestration method comprises the following steps.
Step 301: generating a workflow according to the workflow global configuration information, the process node information set, and the process node connection information set.
In some embodiments, the execution subject of the flow orchestration method (e.g., the server or terminal device 101/102 shown in FIG. 1) may generate a workflow based on the workflow global configuration information, the flow node information set, and the flow node connection information set. The workflow global configuration information may be configuration-related information set by a flow orchestration user for the nodes corresponding to the flow node information set. For example, the workflow global configuration information may include, but is not limited to: a flag indicating whether remote collaboration is supported, a flag indicating whether video recording is supported, the identifier of the API used for single-step operation result reporting, the identifier of the API used for overall task operation result reporting, the identifier of the API used for static resource reporting, and an attachment file or website for operators to consult. The workflow global configuration information may be pushed by the flow orchestration user through an API, or may be configured by the flow orchestration user through a workflow editing page. In practice, the execution subject may create a workflow represented by the workflow global configuration information, the flow node information set, and the flow node connection information set. The workflow may be uniquely represented by a workflow identification.
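As an illustrative sketch only (not part of the disclosed embodiments), the workflow record created in this step might be assembled as follows; the field names, the `generate_workflow` helper, and the use of a UUID as the workflow identification are all assumptions for the example:

```python
import uuid


def generate_workflow(global_config, node_infos, node_connections):
    """Create a workflow record from workflow global configuration
    information, a flow node information set, and a flow node
    connection information set (schema is illustrative)."""
    return {
        # A workflow may be uniquely represented by a workflow identification.
        "workflow_id": str(uuid.uuid4()),
        "global_config": global_config,   # e.g. remote-collaboration flag, reporting APIs
        "nodes": node_infos,              # start node, flow node(s), end node
        "connections": node_connections,  # each entry links two flow node identifications
        "release_status": "unreleased",   # becomes "released" upon a publishing operation
    }


workflow = generate_workflow(
    global_config={"supports_remote_collaboration": True, "supports_recording": False},
    node_infos=[{"node_id": "start"}, {"node_id": "scan-1"}, {"node_id": "end"}],
    node_connections=[("start", "scan-1"), ("scan-1", "end")],
)
```

A publishing operation (step 302) would then simply set `release_status` to `"released"` on this record.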
Step 302, in response to detecting a publishing operation for the workflow, determining the publishing status of the workflow as a published status.
In some embodiments, the specific implementation of step 302 and the technical effect it brings may refer to step 202 in the embodiments corresponding to FIG. 2, which are not described herein again.
Step 303, determining the workflow selected by the user, from among the workflows whose release status is the released status, as the target workflow.
In some embodiments, the execution subject may determine a workflow selected by a user, from among the workflows whose release status is the released status, as the target workflow. The user may be a flow orchestration user. When the execution subject is a server, the user may select a workflow from the workflows whose release status is the released status through the API. When the execution subject is a terminal device, the user may also select the workflow from the workflows, whose release status is the released status, displayed in the terminal device.
Optionally, before step 303, in response to detecting user binding information sent by any head-mounted display device, the execution subject may determine the bound user corresponding to the user binding information as a candidate task receiving user. The user binding information may represent that a user is bound to that head-mounted display device, and may include the user identifier of the user and the device identifier of that head-mounted display device. That head-mounted display device may then be determined as the head-mounted display device corresponding to the candidate task receiving user. Candidate task receiving users are available for the flow orchestration user to select as task receiving users. Thus, when access by a new head-mounted display device is detected, a corresponding user can be bound to that head-mounted display device.
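A minimal sketch of this binding step follows, assuming in-memory structures (`candidate_users`, `device_of_user`) and dictionary keys that the text does not specify:

```python
def register_user_binding(binding_info, candidate_users, device_of_user):
    """Record the user bound to a newly accessed head-mounted display
    device as a candidate task receiving user, and remember which
    device corresponds to that user."""
    user_id = binding_info["user_id"]      # user identifier from the binding information
    device_id = binding_info["device_id"]  # device identifier of the head-mounted display
    candidate_users.add(user_id)
    device_of_user[user_id] = device_id


candidates = set()
devices = {}
register_user_binding({"user_id": "operator-7", "device_id": "hmd-01"}, candidates, devices)
```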
Optionally, the execution subject may collect each determined candidate task receiving user into a candidate task receiving user set. Then, in response to detecting a selection operation of the user on a candidate task receiving user in the candidate task receiving user set, the candidate task receiving user selected by the user may be determined as a task receiving user. The selection operation may be an operation on a group of candidate task receiving users or on a single candidate task receiving user.
Step 304, generating a work task according to the target workflow and at least one task receiving user selected by the user.
In some embodiments, the execution subject may generate a work task according to the target workflow and at least one task receiving user selected by the user. A task receiving user may be an operator who performs a task. The at least one task receiving user may be selected by the user through group selection or one by one. A task receiving user may be represented in the form of a user identifier. The work task may be a task that enables a designated operator to perform actual work according to a workflow. One work task corresponds to one workflow. One workflow may correspond to at least one work task. In practice, the execution subject may create a work task represented by the identification of the target workflow and the at least one task receiving user. The work task may include the workflow identification of the target workflow and the user identification of the at least one task receiving user. It should be noted that in one or more embodiments of the present application, the work task may be in a data exchange format, for example, but not limited to, one of the following: JSON, XML.
In some optional implementations of some embodiments, the execution subject may generate the work task according to the target workflow, the at least one task receiving user, and the task execution mode and task network type selected by the user. The task execution mode may be a mode of executing the work task. The task execution mode may be, but is not limited to, one of the following: a one-time task, a permanent task, or a periodic task. The task network type may be the network connection type supported when executing the work task. The task network type may be, but is not limited to, one of the following: an online task or a semi-offline task. An online task may require that a networked state be maintained while the work task is executed. A semi-offline task may allow the work task to be executed in a disconnected state, with the data uploaded to the data storage end after networking is restored. In practice, the execution subject may create a work task represented by the identification of the target workflow, the at least one task receiving user, the task execution mode, and the task network type. The work task may be uniquely represented by a work task identifier.
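The optional implementation above can be sketched as follows; the mode and network-type labels, the validation, and the UUID-based work task identifier are assumptions for illustration only:

```python
import uuid

# Illustrative labels for the task execution modes and network types named in the text.
EXECUTION_MODES = {"one-time", "permanent", "periodic"}
NETWORK_TYPES = {"online", "semi-offline"}


def generate_work_task(workflow_id, task_receiving_users, execution_mode, network_type):
    """Create a work task from a target workflow identification, the selected
    task receiving users, a task execution mode, and a task network type."""
    if execution_mode not in EXECUTION_MODES:
        raise ValueError(f"unknown task execution mode: {execution_mode}")
    if network_type not in NETWORK_TYPES:
        raise ValueError(f"unknown task network type: {network_type}")
    if not task_receiving_users:
        raise ValueError("at least one task receiving user is required")
    return {
        # A work task may be uniquely represented by a work task identifier.
        "work_task_id": str(uuid.uuid4()),
        "workflow_id": workflow_id,
        "task_receiving_users": list(task_receiving_users),
        "execution_mode": execution_mode,
        "network_type": network_type,
    }
```

Serializing this dictionary with `json.dumps` would yield the JSON data-exchange form mentioned above.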
Step 305, in response to detecting the issuing operation for the work task, issuing the work task.
In some embodiments, the execution subject may perform issuing processing on the work task in response to detecting an issuing operation for the work task. The issuing operation may be an operation by which the flow orchestration user determines to issue the work task. For example, when the execution subject is a server, the issuing operation may be a confirm-and-issue operation performed by the flow orchestration user through an API. When the execution subject is a terminal device, the issuing operation may be a selection operation by the flow orchestration user on an issuing control corresponding to the work task. In practice, the execution subject may send the work task to each head-mounted display device corresponding to the at least one task receiving user, so that each head-mounted display device displays a task interface among the task interfaces corresponding to the target workflow. The release status of the work task may also be determined as a released status.
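The distribution described in this step might look like the following sketch, where `send_to_device` stands in for whatever transport the head-mounted display devices use (an assumption, as the disclosure does not name one):

```python
def publish_work_task(work_task, device_of_user, send_to_device):
    """Send a work task to the head-mounted display device of each task
    receiving user, then mark the work task as released."""
    for user_id in work_task["task_receiving_users"]:
        device_id = device_of_user.get(user_id)
        if device_id is not None:
            # Deliver the work task so the device can display the task interfaces.
            send_to_device(device_id, work_task)
    work_task["release_status"] = "released"
    return work_task
```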
In some optional implementations of some embodiments, the execution subject may first generate a work task identification code according to the work task. The work task identification code may be scanned by each head-mounted display device corresponding to the at least one task receiving user, so as to display a task interface among the task interfaces corresponding to the target workflow. The work task identification code may be, but is not limited to: a two-dimensional code or a bar code. The work task identification code may also be any other identification code. The work task identification code may then be transmitted to an associated terminal device. The associated terminal device may be any terminal device associated with the execution subject. For example, the associated terminal device may be a device at the site where the actual work is performed and may directly display the work task identification code. Staff may also print the work task identification code stored in the associated terminal device on paper and post it at the work site for the head-mounted display devices to scan. In this way, a head-mounted display device can identify a work task by means of the work task identification code.
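Rendering the two-dimensional code or bar code itself would require an external library, but the payload such a code encodes can be sketched with the standard library alone; the payload layout below is an assumption, not the disclosed format:

```python
import base64
import json


def encode_task_code_payload(work_task_id):
    """Pack a work task identifier into a compact, URL-safe text payload
    suitable for embedding in a QR code or bar code."""
    raw = json.dumps({"work_task_id": work_task_id}).encode("utf-8")
    return base64.urlsafe_b64encode(raw).decode("ascii")


def decode_task_code_payload(payload):
    """Recover the work task identifier after a head-mounted display
    device has scanned the code."""
    raw = base64.urlsafe_b64decode(payload.encode("ascii"))
    return json.loads(raw)["work_task_id"]
```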
Optionally, the execution subject may receive a task execution information record, corresponding to the work task, sent by one of the head-mounted display devices. The task execution information record may be operation data of the task receiving user (operator) wearing that head-mounted display device during execution of the work task. The task execution information record may include, but is not limited to: an operation time, a node identification, and an operation button. When the operated node corresponds to a task item requiring data entry, the task execution information record may further include the entered data. The entered data may be, but is not limited to, in the form of: an image, a video, or voice.
The execution subject may display a task record page of the work task in response to detecting a task record viewing operation of the user for the work task, where the task record viewing operation may be an operation of viewing the task execution information records of the work task. The task execution information record set corresponding to the work task is displayed in the task record page. Here, the execution subject is a terminal device.
The execution subject may display a task report file of the work task in response to detecting a task report viewing operation of the user for the work task. The task report viewing operation may be an operation of viewing the task report file of the work task. The task report file may be a report file presenting the task execution information record set. Here, the execution subject is a terminal device.
As can be seen from FIG. 3, compared with the description of some embodiments corresponding to FIG. 2, the flow 300 of the flow orchestration method in some embodiments corresponding to FIG. 3 adds the steps of generating the work task. Thus, the solutions described in these embodiments can generate work tasks from already published workflows, so that the task receiving users corresponding to a work task can execute it through their head-mounted display devices.
With further reference to FIG. 4, a flow 400 of some embodiments of a flow display method is illustrated. The flow 400 of the flow display method, applied to a head-mounted display device, includes the following steps:
Step 401, in response to receiving a work task or identifying the work task corresponding to a work task identification code, displaying a task interface corresponding to the workflow corresponding to the received or identified work task in a display screen of the head-mounted display device.
In some embodiments, an execution subject of the flow display method (for example, the head-mounted display device shown in FIG. 1) may display, in response to receiving a work task or identifying the work task corresponding to a work task identification code, a task interface corresponding to the workflow corresponding to the received or identified work task in a display screen of the head-mounted display device. The workflow may be generated using the steps in the embodiments corresponding to FIG. 2 or FIG. 3. The work task may be generated using the steps in the embodiments corresponding to FIG. 3. The execution subject can identify the work task corresponding to the work task identification code through a camera. In practice, the execution subject may display different task interfaces according to the execution logic of the workflow.
Step 402, determining whether the operation of the user for the task interface meets the jump condition corresponding to the task interface.
In some embodiments, the execution subject may determine whether the operation of the user on the task interface satisfies the jump condition corresponding to the task interface. The jump condition may be a condition for determining whether to jump to the next task interface. In practice, the execution subject may determine that the operation of the user on the task interface satisfies the jump condition corresponding to the task interface in response to determining that the task interface is a scan task interface and detecting a scan result for a real scene. The scan task interface may be an interface for scanning a real scene. The execution subject may alternatively determine that the jump condition is satisfied in response to determining that the task interface is a scan task interface and that the detected scan result for the real scene is the same as the check result pre-configured for the process node corresponding to the task interface. The execution subject may also determine that the operation of the user satisfies the jump condition corresponding to the task interface in response to determining that the task interface is a voice input task interface and detecting voice input by the user. The voice input task interface may be an interface for prompting the user to input voice. The execution subject may likewise determine that the jump condition is satisfied in response to determining that the task interface is a shooting task interface and detecting an image or a video shot by the user. The shooting task interface may be an interface for prompting the user to shoot an image or a video.
In some optional implementation manners of some embodiments, the execution subject may control the associated camera to capture an image in response to determining that the task interface is an image capture task interface and detecting a selection operation of a user on a capture control displayed in the image capture task interface. Then, in response to detecting a confirmation operation of the user on the image, it is determined that the operation of the user on the task interface satisfies the jump condition. The confirmation operation may be a selection operation performed by a user on an image confirmation control corresponding to the image. Therefore, the user can automatically jump to the next task interface after confirming the shot image.
In some optional implementations of some embodiments, the execution subject may determine that the user's operation on the task interface satisfies the jump condition in response to detecting a selection operation of the user wearing the head-mounted display device on a jump control, or detecting that the voice password issued by the user is the same as the voice instruction corresponding to the jump control. And the jump control is displayed in the task interface. The jump control can be a control for receiving a selection operation of a user to jump to a next task interface. For example, the jump control described above may be displayed as "next". The user may select the jump control via a knob provided on the head-mounted display device. For example, the jump control can be selected by hovering over the jump control for a preset time period via a knob. As another example, the jump control can be hovered over through a knob, and the knob can be pressed to select the jump control. The voice instruction may be a voice instruction corresponding to the display text of the jump control. For example, the voice command may be "next". When the user speaks the "next" step, it may be determined that the user's operation for the task interface satisfies the jump condition.
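The per-interface jump checks described above can be collected into one dispatch function; the interface-type labels and event fields below are illustrative assumptions, not names from the disclosure:

```python
def jump_condition_satisfied(interface_type, event):
    """Return True when the user's operation on the current task
    interface satisfies that interface's jump condition."""
    if interface_type == "scan":
        result = event.get("scan_result")
        expected = event.get("expected_check_result")  # optional pre-configured check
        return result is not None and (expected is None or result == expected)
    if interface_type == "voice_input":
        return event.get("voice_detected", False)
    if interface_type == "shoot":
        return event.get("media_captured", False)
    if interface_type == "image_capture":
        # Jump only after the user confirms the captured image.
        return event.get("image_confirmed", False)
    if interface_type == "manual":
        # Jump control selected via the knob, or the matching voice password spoken.
        return (event.get("control_selected", False)
                or event.get("spoken_text") == event.get("voice_instruction"))
    return False
```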
Step 403, in response to detecting that the operation of the user for the task interface meets the jump condition, jumping to the next task interface corresponding to the task interface and the operation.
In some embodiments, the execution subject may jump to a next task interface corresponding to the task interface and the operation in response to detecting that the operation of the user on the task interface satisfies the jump condition. The next task interface may be a task interface arranged behind the task interface, to which the operation is directed, under the execution logic of the workflow.
Optionally, the executing agent may send a task execution information record generated by a user with respect to the operation of the task interface to the associated storage end and/or the cooperative end in response to determining that the current network state is the online state. The storage terminal may be a terminal for storing task execution information records. For example, the storage end may be an execution body of the flow arrangement method corresponding to fig. 2 or fig. 3. The cooperation terminal may be a terminal cooperating with the head-mounted display device. For example, an operator may call the coordination terminal through the head-mounted display device worn by the operator, so that a user of the coordination terminal may remotely guide the actual operation of the operator, and at this time, the operator directly communicates with the user of the coordination terminal through the head-mounted display device worn by the operator. Then, in response to determining that the current network state is the offline state, the task execution information record may be stored locally. For example, the task execution information record may be stored in a local memory. And then, in response to determining that the current network state is the online state, sending the locally stored task execution information record which is not sent to the storage end. Therefore, the task execution information record can be synchronized in real time in the online state. The task execution information record may be temporarily stored locally while in the offline state. When the offline state is changed into the online state, the task execution information records temporarily stored in the local can be synchronized to the storage end.
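The online/offline handling in this optional step amounts to store-and-forward buffering; the following is a sketch, with the transport `send` supplied by the caller (an assumption, since the disclosure does not specify one):

```python
class TaskRecordSyncer:
    """Forward task execution information records immediately while online,
    buffer them locally while offline, and flush the buffer on reconnect."""

    def __init__(self, send):
        self.send = send   # callable that delivers a record to the storage end
        self.pending = []  # stand-in for local (on-device) storage

    def submit(self, record, online):
        if online:
            self.send(record)
        else:
            self.pending.append(record)

    def on_reconnect(self):
        # Synchronize records that accumulated while offline, oldest first.
        while self.pending:
            self.send(self.pending.pop(0))
```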
The above embodiments of the present disclosure have the following advantages: through the flow display method of some embodiments of the present disclosure, an operator can perform actual work without interruption and without filling in data manually, and the data produced during the work process can be stored through the head-mounted display device. Specifically, the reasons why an operator must interrupt the work process, fill in data manually, and struggle to store the data are as follows: the operator needs to interrupt the current work process to consult the workflow in paper or electronic-document form, and a workflow in paper form must be filled in manually by the operator, so the filled-in data is difficult to store. Based on this, according to the flow display method of some embodiments of the present disclosure, first, in response to receiving a work task or identifying the work task corresponding to a work task identification code, a task interface corresponding to the workflow corresponding to the received or identified work task is displayed in the display screen of the head-mounted display device. Thus, the operator can directly refer to the information displayed on the task interface during actual work. Then, it is determined whether the operation of the user for the task interface meets the jump condition corresponding to the task interface. Then, in response to the operation of the user for the task interface meeting the jump condition, the next task interface corresponding to the task interface and the operation is jumped to. Therefore, after the task item of the current task interface is executed, the next task interface can be jumped to automatically for the operator to view.
Because the workflow can be displayed in the head-mounted display device in the form of task interfaces, the operator can perform actual work directly under the guidance of the head-mounted display device, without additionally consulting a workflow in paper or electronic-document form. Data may also be entered and saved via the head-mounted display device. Therefore, the operator can perform actual work continuously without filling in data manually, and the data produced during the work process can be stored through the head-mounted display device.
Referring now to FIG. 5, a schematic structural diagram of a head-mounted display device 500 (e.g., the head-mounted display device of FIG. 1) suitable for implementing some embodiments of the present disclosure is shown. Head-mounted display devices in some embodiments of the present disclosure may include, but are not limited to, AR glasses and MR glasses. The head-mounted display device shown in FIG. 5 is only an example and should not impose any limitation on the functions and scope of use of the embodiments of the present disclosure.
As shown in FIG. 5, the head-mounted display device 500 may include a processing device (e.g., a central processing unit, a graphics processor, etc.) 501 that may perform various appropriate actions and processes according to a program stored in a read-only memory (ROM) 502 or a program loaded from a storage device 508 into a random access memory (RAM) 503. In the RAM 503, various programs and data necessary for the operation of the head-mounted display device 500 are also stored. The processing device 501, the ROM 502, and the RAM 503 are connected to each other through a bus 504. An input/output (I/O) interface 505 is also connected to the bus 504.
Generally, the following devices may be connected to the I/O interface 505: input devices 506 including, for example, a touch screen, a touch pad, a keyboard, a mouse, a camera, a microphone, an accelerometer, a gyroscope, etc.; output devices 507 including, for example, at least one display screen for imaging in front of the user's eyes (e.g., the display screen may include a micro-display and optical elements), speakers, vibrators, and the like; and a communication device 509. The communication device 509 may allow the head-mounted display device 500 to communicate with other devices wirelessly or by wire to exchange data. While FIG. 5 illustrates a head-mounted display device 500 having various devices, it is to be understood that not all of the illustrated devices are required to be implemented or provided. More or fewer devices may alternatively be implemented or provided. Each block shown in FIG. 5 may represent one device or may represent multiple devices as needed.
In particular, according to some embodiments of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, some embodiments of the present disclosure include a computer program product comprising a computer program carried on a computer-readable medium, the computer program comprising program code for performing the method illustrated by the flowchart. In some such embodiments, the computer program may be downloaded and installed from a network via the communication device 509, or installed from the storage device 508, or installed from the ROM 502. The computer program, when executed by the processing device 501, performs the above-described functions defined in the methods of some embodiments of the present disclosure.
It should be noted that the computer readable medium described in some embodiments of the present disclosure may be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In some embodiments of the disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In some embodiments of the present disclosure, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. 
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.
In some embodiments, the clients and servers may communicate using any currently known or future developed network protocol, such as HTTP (HyperText Transfer Protocol), and may be interconnected with any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), an internetwork (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future developed network.
The computer readable medium may be embodied in the head-mounted display apparatus; or may exist separately and not be incorporated into the head-mounted display device. The computer readable medium carries one or more programs which, when executed by the head mounted display device, cause the head mounted display device to: in response to receiving the work task or identifying the work task corresponding to the work task identification code, displaying a task interface corresponding to the workflow corresponding to the received or identified work task in a display screen of the head-mounted display device; determining whether the operation of the user aiming at the task interface meets the jump condition corresponding to the task interface; and jumping to a next task interface corresponding to the task interface and the operation in response to detecting that the operation of the user on the task interface meets the jump condition.
Computer program code for carrying out operations of some embodiments of the present disclosure may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java, Smalltalk, C++, Python, Ruby, Node.js, and JavaScript, and also including conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The functions described herein above may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), systems on a chip (SOCs), Complex Programmable Logic Devices (CPLDs), and the like.
The foregoing description is only exemplary of the preferred embodiments of the disclosure and is illustrative of the principles of the technology employed. It will be appreciated by those skilled in the art that the scope of the invention in the embodiments of the present disclosure is not limited to the specific combination of the above-mentioned features, but also encompasses other embodiments in which any combination of the above-mentioned features or their equivalents is made without departing from the inventive concept as defined above. For example, the above features and (but not limited to) technical features with similar functions disclosed in the embodiments of the present disclosure are mutually replaced to form the technical solution.
Claims (17)
1. A process orchestration method, comprising:
generating a workflow according to a process node information set and a process node connection information set corresponding to the process node information set, wherein the process node information set corresponds to a starting node, at least one process node and an ending node, the process node connection information in the process node connection information set corresponds to two pieces of process node information in the process node information set, and the workflow corresponds to each task interface;
and in response to detecting the issuing operation aiming at the workflow, determining the issuing state of the workflow as an issued state, wherein each task interface corresponding to the workflow is used for being displayed in a head-mounted display device corresponding to the workflow, and the process node information of the corresponding process node comprises an instruction for starting an executable unit of the head-mounted display device.
2. The method of claim 1, wherein the method further comprises:
determining the workflow selected by the user from the workflows of which the release state is the released state as a target workflow;
generating a work task according to the target workflow and at least one task receiving user selected by the user, wherein the work task corresponds to the target workflow;
and responding to the detected issuing operation aiming at the work task, and issuing the work task.
3. The method of claim 2, wherein generating a workflow from a set of process node information and a set of process node connection information corresponding to the set of process node information comprises:
and generating the workflow according to the workflow global configuration information, the process node information set and the process node connection information set.
4. The method of claim 1, wherein prior to generating a workflow from a set of process node information and a set of process node connection information corresponding to the set of process node information, the method further comprises:
determining respective process node templates selected by the user from a process node template library as respective target process node templates;
determining process node attribute information configured by the user for each target process node template as process node information, to obtain the process node information set;
determining respective connecting lines drawn by the user between the target process node templates as respective target connecting lines;
and for each target connecting line, combining the two node identifications corresponding to the target connecting line and the configuration information of the jump button corresponding to the first node identification into process node connection information, to obtain the process node connection information set, wherein the target connecting line is a connecting line pointing from the target process node template corresponding to the first node identification to the target process node template corresponding to the second node identification.
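Claim 4's final step — pairing the two node identifications of each connecting line with the jump-button configuration attached to the source node — can be sketched as follows. The function and field names are illustrative assumptions:

```python
def build_connection_info(target_lines, jump_button_configs):
    """Combine node-id pairs and jump-button configs into connection records.

    target_lines: iterable of (first_node_id, second_node_id) tuples, each a
        connecting line pointing from the first node template to the second.
    jump_button_configs: mapping from node id to that node's jump-button config.
    """
    connection_set = []
    for first_id, second_id in target_lines:
        connection_set.append({
            "from": first_id,
            "to": second_id,
            # the jump button shown on the *source* node's task interface
            "jump_button": jump_button_configs.get(first_id, {}),
        })
    return connection_set
```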
5. The method of claim 4, wherein prior to said determining respective connecting lines drawn by the user between the target process node templates as respective target connecting lines, the method further comprises:
determining a first target process node template selected by the user as a preceding process node template;
in response to determining that the number of connections corresponding to the preceding process node template is less than the connection-number threshold corresponding to the preceding process node template, determining a second target process node template selected by the user as a succeeding process node template;
and connecting the preceding process node template with the succeeding process node template by a connecting line.
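The guard in claim 5 — only allowing a new line when the preceding node's connection count is still below its threshold — amounts to an out-degree check. A minimal sketch, with names assumed for illustration:

```python
def try_connect(edges, predecessor, successor, thresholds):
    """Append (predecessor, successor) to edges only if the predecessor has not
    yet reached its outgoing-connection threshold; return True on success."""
    out_degree = sum(1 for a, _ in edges if a == predecessor)
    if out_degree >= thresholds.get(predecessor, 1):
        return False  # threshold reached: refuse the new connecting line
    edges.append((predecessor, successor))
    return True
```

With a threshold of 1, a second outgoing line from the same node would be rejected, which matches the sequencing the claim implies.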
6. The method of claim 2, wherein said generating a work task according to the target workflow and the at least one task-receiving user selected by the user comprises:
and generating the work task according to the target workflow, the at least one task-receiving user, and a task execution mode and a task network type selected by the user.
7. The method of claim 2, wherein said publishing the work task comprises:
and sending the work task to each head-mounted display device corresponding to the at least one task-receiving user, so that each head-mounted display device displays a task interface among the task interfaces corresponding to the target workflow.
8. The method of claim 2, wherein said publishing the work task comprises:
generating a work task identification code according to the work task, wherein the work task identification code is configured to be scanned by each head-mounted display device corresponding to the at least one task-receiving user, so as to display a task interface among the task interfaces corresponding to the target workflow;
and sending the work task identification code to an associated terminal device.
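Claim 8's identification code must carry enough information for a headset to resolve the work task after scanning. A round-trip sketch using base64-encoded JSON — an illustrative stand-in for whatever encoding the patent contemplates; in practice the code would typically be rendered as a QR image for the headset's camera:

```python
import base64
import json


def encode_work_task(task_id, workflow_id, receivers):
    """Serialize a work task into a compact, scannable text code."""
    payload = {"task": task_id, "workflow": workflow_id, "receivers": receivers}
    return base64.urlsafe_b64encode(json.dumps(payload).encode()).decode()


def decode_work_task(code):
    """Recover the work task payload on the scanning device."""
    return json.loads(base64.urlsafe_b64decode(code.encode()).decode())
```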
9. The method of claim 7 or 8, wherein the method further comprises:
receiving a task execution information record corresponding to the work task sent by a head-mounted display device among the head-mounted display devices;
in response to detecting a task-record viewing operation of the user for the work task, displaying a task record page of the work task, wherein a task execution information record set corresponding to the work task is displayed on the task record page;
and in response to detecting a task-report viewing operation of the user for the work task, displaying a task report file of the work task.
10. The method of claim 2, wherein prior to said determining a workflow selected by the user from among workflows whose publication state is the published state as the target workflow, the method further comprises:
in response to detecting user binding information sent by any head-mounted display device, determining a binding user corresponding to the user binding information as an alternative task-receiving user;
and determining the head-mounted display device as the head-mounted display device corresponding to the alternative task-receiving user.
11. The method of claim 10, wherein prior to said generating a work task according to the target workflow and the at least one task-receiving user selected by the user, the method further comprises:
combining the determined alternative task-receiving users into an alternative task-receiving user set;
and in response to detecting a selection operation of the user on an alternative task-receiving user in the alternative task-receiving user set, determining the alternative task-receiving user selected by the user as a task-receiving user.
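The binding-and-selection flow of claims 10-11 can be sketched as a registry mapping bound users to their headsets, from which the publisher picks task receivers. The structures and function names are assumptions for illustration:

```python
candidate_receivers = {}  # user id -> bound head-mounted display device id


def on_binding_info(user_id, device_id):
    """Register the binding user as an alternative task-receiving user
    and associate the sending headset with that user."""
    candidate_receivers[user_id] = device_id


def select_receivers(selected_user_ids):
    """Resolve the publisher's selection to (user, device) pairs,
    skipping users who never bound a device."""
    return [(u, candidate_receivers[u]) for u in selected_user_ids if u in candidate_receivers]
```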
12. A process display method, applied to a head-mounted display device, the method comprising:
in response to receiving a work task or identifying a work task corresponding to a work task identification code, displaying, on a display screen of the head-mounted display device, a task interface corresponding to a workflow corresponding to the received or identified work task, wherein the workflow is generated by the method of any one of claims 1-11;
determining whether an operation of the user on the task interface satisfies a jump condition corresponding to the task interface;
and in response to detecting that the operation of the user on the task interface satisfies the jump condition, jumping to a next task interface corresponding to the task interface and the operation.
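The display loop of claims 12-14 — show the current task interface, advance only when the user's operation satisfies its jump condition (a tap on the jump control, or a voice command matching the control's phrase) — can be sketched as follows. The interface dictionaries and condition checks are illustrative assumptions:

```python
def jump_satisfied(interface, operation):
    """Return True if the user's operation meets the interface's jump condition."""
    if operation.get("type") == "tap" and operation.get("target") == "jump_control":
        return True
    if operation.get("type") == "voice":
        # voice command must match the jump control's configured phrase
        return operation.get("phrase") == interface.get("voice_instruction")
    return False


def next_interface(flow, current_id, operation):
    """Return the id of the next task interface, or stay put if the condition fails."""
    interface = flow[current_id]
    if jump_satisfied(interface, operation):
        return interface["next"]
    return current_id
```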
13. The method of claim 12, wherein the determining whether the operation of the user on the task interface satisfies a jump condition corresponding to the task interface comprises:
in response to determining that the task interface is an image-shooting task interface and detecting a selection operation of the user on a shooting control displayed in the image-shooting task interface, controlling an associated camera to shoot an image;
and in response to detecting a confirmation operation of the user on the image, determining that the operation of the user on the task interface satisfies the jump condition.
14. The method of claim 12, wherein the determining whether the operation of the user on the task interface satisfies the jump condition corresponding to the task interface comprises:
and in response to detecting a selection operation of the user wearing the head-mounted display device on a jump control, or detecting that a voice command uttered by the user is the same as the voice instruction corresponding to the jump control, determining that the operation of the user on the task interface satisfies the jump condition, wherein the jump control is displayed in the task interface.
15. The method according to any one of claims 12-14, wherein the method further comprises:
in response to determining that the current network state is an online state, sending a task execution information record generated by the operation of the user on the task interface to an associated storage end and/or an associated coordination end;
in response to determining that the current network state is an offline state, storing the task execution information record locally;
and in response to determining that the current network state is the online state, sending locally stored task execution information records that have not yet been sent to the storage end.
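Claim 15's online/offline behavior is a store-and-forward queue: upload immediately when online, queue locally when offline, flush the queue when connectivity returns. A minimal sketch, with the `Recorder` class and its methods assumed for illustration:

```python
class Recorder:
    """Store-and-forward handling of task execution information records."""

    def __init__(self, upload):
        self.upload = upload  # callable sending one record to the storage end
        self.pending = []     # records stored locally while offline
        self.online = True

    def record(self, entry):
        if self.online:
            self.upload(entry)
        else:
            self.pending.append(entry)

    def set_online(self, online):
        self.online = online
        if online:
            # send locally stored records that were not yet uploaded, oldest first
            while self.pending:
                self.upload(self.pending.pop(0))
```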
16. A head-mounted display device, comprising:
one or more processors;
one or more display screens for imaging in front of the eyes of a user wearing the head-mounted display device;
a storage device having one or more programs stored thereon,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 12-15.
17. A computer-readable medium having a computer program stored thereon, wherein the program, when executed by a processor, implements the method of any one of claims 1-11 or 12-15.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210470109.6A CN114968454B (en) | 2022-04-28 | 2022-04-28 | Flow arrangement, display method, head-mounted display device, and computer-readable medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114968454A true CN114968454A (en) | 2022-08-30 |
CN114968454B CN114968454B (en) | 2024-04-12 |
Family
ID=82978618
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210470109.6A Active CN114968454B (en) | 2022-04-28 | 2022-04-28 | Flow arrangement, display method, head-mounted display device, and computer-readable medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114968454B (en) |
Cited By (3)
* Cited by examiner, † Cited by third partyPublication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115145560A (en) * | 2022-09-06 | 2022-10-04 | 北京国电通网络技术有限公司 | Business orchestration method, device, equipment, computer readable medium and program product |
CN116701181A (en) * | 2023-05-10 | 2023-09-05 | 海南泽山软件科技有限责任公司 | Information verification flow display method, device, equipment and computer readable medium |
CN117132245A (en) * | 2023-10-27 | 2023-11-28 | 北京国电通网络技术有限公司 | Online item acquisition business process reorganization method, device, equipment and readable medium |
Citations (12)
* Cited by examiner, † Cited by third partyPublication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20030004770A1 (en) * | 2001-06-28 | 2003-01-02 | International Business Machines Corporation | Method, system, and program for generating a workflow |
US20140310595A1 (en) * | 2012-12-20 | 2014-10-16 | Sri International | Augmented reality virtual personal assistant for external representation |
CN106406907A (en) * | 2016-10-11 | 2017-02-15 | 传线网络科技(上海)有限公司 | Application program flow execution control method and device |
WO2018000606A1 (en) * | 2016-06-30 | 2018-01-04 | 乐视控股(北京)有限公司 | Virtual-reality interaction interface switching method and electronic device |
CN108089696A (en) * | 2016-11-08 | 2018-05-29 | 罗克韦尔自动化技术公司 | For the virtual reality and augmented reality of industrial automation |
CN110853115A (en) * | 2019-10-14 | 2020-02-28 | 平安国际智慧城市科技股份有限公司 | Method and equipment for creating development process page |
CN110910081A (en) * | 2018-09-17 | 2020-03-24 | 上海宝信软件股份有限公司 | Workflow configuration implementation method and system based on laboratory information management system |
CN111930372A (en) * | 2020-08-06 | 2020-11-13 | 科大国创云网科技有限公司 | Service arrangement solution method and system realized through draggable flow chart |
CN112685036A (en) * | 2021-01-13 | 2021-04-20 | 北京三快在线科技有限公司 | Front-end code generation method and device, computer equipment and storage medium |
CN113220118A (en) * | 2021-04-20 | 2021-08-06 | 杭州灵伴科技有限公司 | Virtual interface display method, head-mounted display device and computer readable medium |
US11151898B1 (en) * | 2020-04-15 | 2021-10-19 | Klatt Works, Inc. | Techniques for enhancing workflows relating to equipment maintenance |
CN114035884A (en) * | 2021-12-07 | 2022-02-11 | 深圳市锐思华创技术有限公司 | UI interaction design method of AR HUD train control system |
Cited By (5)
* Cited by examiner, † Cited by third partyPublication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115145560A (en) * | 2022-09-06 | 2022-10-04 | 北京国电通网络技术有限公司 | Business orchestration method, device, equipment, computer readable medium and program product |
CN116701181A (en) * | 2023-05-10 | 2023-09-05 | 海南泽山软件科技有限责任公司 | Information verification flow display method, device, equipment and computer readable medium |
CN116701181B (en) * | 2023-05-10 | 2024-02-02 | 海南泽山软件科技有限责任公司 | Information verification flow display method, device, equipment and computer readable medium |
CN117132245A (en) * | 2023-10-27 | 2023-11-28 | 北京国电通网络技术有限公司 | Online item acquisition business process reorganization method, device, equipment and readable medium |
CN117132245B (en) * | 2023-10-27 | 2024-02-06 | 北京国电通网络技术有限公司 | Online item acquisition business process reorganization method, device, equipment and readable medium |
Also Published As
Publication number | Publication date |
---|---|
CN114968454B (en) | 2024-04-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN114968454B (en) | 2024-04-12 | Flow arrangement, display method, head-mounted display device, and computer-readable medium |
JP6916167B2 (en) | 2021-08-11 | Interactive control methods and devices for voice and video calls |
CN109816447B (en) | 2023-12-01 | Intelligent monitoring method, device and storage medium for cabinet advertisement |
EP3622458B1 (en) | 2024-03-27 | Self-learning adaptive routing system |
JP7230465B2 (en) | 2023-03-01 | ERROR DISPLAY SYSTEM, ERROR DISPLAY METHOD, INFORMATION PROCESSING DEVICE |
CN104065699A (en) | 2014-09-24 | Vehicle-mounted device system, portable terminal, and vehicle-mounted device |
CN110634220B (en) | 2021-11-23 | Information processing method and device |
CN112015654B (en) | 2024-11-22 | Method and apparatus for testing |
KR20160128145A (en) | 2016-11-07 | Message processing method and electronic device supporting the same |
US20200272514A1 (en) | 2020-08-27 | Information processing terminal, non-transitory recording medium, and control method |
EP3296935A1 (en) | 2018-03-21 | Device check support system, work progress confirmation method, and work progress confirmation program |
CN110618768B (en) | 2021-05-04 | Information presentation method and device |
US20200160243A1 (en) | 2020-05-21 | Resource reservation system, information display method, server system, and information processing terminal |
CN111147872A (en) | 2020-05-12 | Information display method and device and electronic equipment |
US20210144697A1 (en) | 2021-05-13 | Resource reservation system and resource usage method |
JP2020095675A (en) | 2020-06-18 | Resource reservation system, terminal setting method, program, utilization system, and information processing apparatus |
CN113326013A (en) | 2021-08-31 | Information interaction method and device and electronic equipment |
CN112668283A (en) | 2021-04-16 | Document editing method and device and electronic equipment |
CN112346947A (en) | 2021-02-09 | Performance detection method and device, electronic equipment and computer readable medium |
US11018987B2 (en) | 2021-05-25 | Resource reservation system, setting method, and non-transitory computer readable storage medium |
JP2018185560A (en) | 2018-11-22 | PC support system, information display program and information reading transmission program |
JP2021081865A (en) | 2021-05-27 | Resource reservation system and presentation method and information processor |
CN116264603B (en) | 2025-02-14 | Live broadcast information processing method, device, equipment and storage medium |
CN113420133B (en) | 2023-07-25 | Session processing method, device, equipment and storage medium |
CN114253520B (en) | 2024-03-12 | Interface code generation method and device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
2022-08-30 | PB01 | Publication | |
2022-09-16 | SE01 | Entry into force of request for substantive examination | |
2024-04-12 | GR01 | Patent grant | |