
CN112558554A - Task tracking method and system - Google Patents


Task tracking method and system

Info

Publication number
CN112558554A
Authority
CN
China
Prior art keywords
task
image
production
target image
current
Prior art date
2019-09-10
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910855076.5A
Other languages
Chinese (zh)
Other versions
CN112558554B (en)
Inventor
赵永飞
龙一民
徐博文
吴剑
陈新
陈荥
马文波
胡露露
神克乐
王逸成
张天霖
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Alibaba Group Holding Ltd
Original Assignee
Alibaba Group Holding Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
2019-09-10
Filing date
2019-09-10
Publication date
2021-03-26
2019-09-10: Application filed by Alibaba Group Holding Ltd
2019-09-10: Priority to CN201910855076.5A
2021-03-26: Publication of CN112558554A
2024-10-18: Application granted
2024-10-18: Publication of CN112558554B
Status: Active
2039-09-10: Anticipated expiration

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05B: CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00: Programme-control systems
    • G05B19/02: Programme-control systems, electric
    • G05B19/418: Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
    • G05B19/41875: Total factory control characterised by quality surveillance of production
    • G05B2219/00: Program-control systems
    • G05B2219/30: Nc systems
    • G05B2219/32: Operator till task planning
    • G05B2219/32368: Quality control
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02: Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Manufacturing & Machinery (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • General Factory Administration (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

Disclosed are a task tracking method, system and platform, the method comprising: acquiring a target image of a task; judging the execution status of the task according to the target image; acquiring the progress status of the task; and determining a current status of the task based on the execution status and the progress status. In this way, the execution of the task is tracked through image comparison, and the current status of the task can be accurately grasped by also acquiring the progress status. The solution is particularly suitable for implementation as an order-following system for digital printing production tasks.

Description

Task tracking method and system

Technical Field

The invention relates to the field of automation, in particular to a task tracking method and a task tracking system.

Background

Order tracking means following the operational flow of production tasks driven by customer orders in the course of enterprise operation. The follow-up sheet, also called a processing route sheet, is a production document that travels with a workpiece and records its receipt and transfer between the successive processes of machining. Production planning and scheduling personnel hand it to the workers of the first process, after which it is passed on and filled in process by process until the last process is finished. Existing order-following practice relies mainly on manual data entry: dedicated workers must associate production with orders and key in the data, and the production process on the line is decoupled from the order progress, so order production progress cannot be fed back in real time, which greatly increases factory production cost. In other areas such as logistics, the lack of timely tracking and feedback of task status likewise impairs the system's dispatching and monitoring of tasks, and thus overall efficiency.

Accordingly, there is a need for an improved task tracking scheme.

Disclosure of Invention

In order to solve at least one problem, the invention provides a task tracking scheme, which tracks tasks through computer vision comparison and combines the acquisition of production progress conditions, so that the current conditions of the production tasks can be accurately grasped. This solution is particularly suitable for implementing a tracking system for production tasks, for example for digital printing tasks.

According to an aspect of the present invention, a task tracking method is provided, including: acquiring a target image of a task; judging the execution state of the task according to the target image; acquiring the progress status of the task; and determining a current status of the task based on the execution status and the progress status. Therefore, the execution state of the production task is acquired based on the image through task image acquisition, and the current production state can be accurately grasped in combination with the acquisition of the production progress state.
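The combination of the two status sources can be sketched as follows. This is a minimal illustration, not the patent's implementation; the status codes "conforms"/"mismatch" and the record layout are hypothetical.

```python
def determine_current_status(execution_status: str, progress_status: int) -> dict:
    """Combine the image-derived execution status with the sensor-derived
    progress reading into one current-status record for the task.

    execution_status: result of the image comparison ("conforms"/"mismatch")
    progress_status:  e.g. the reading of a length counter, in metres
    """
    return {
        "executing_normally": execution_status == "conforms",
        "progress_reading": progress_status,
    }
```

A downstream order-following function could then report both fields to a client, or trigger an alert when `executing_normally` is false.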

Preferably, the task is a production task, and the target image of the production task on the production line may be acquired by an image acquisition device installed on the production line. Preferably, the reading of a production progress sensor arranged on or near the production line, for example a length or piece counter, may be acquired as the production progress status of the production task.

Preferably, the determination of the execution status may include determining whether the target image conforms to a prescribed image of the production task. For example, feature extraction may be performed on the target image; the features extracted from the target image may be compared with features extracted from the prescribed image of the production task; and it may be judged, according to the comparison result, whether the target image conforms to the prescribed image of the task. The image feature extraction may be, for example, feature-point extraction based on conventional CV (computer vision), or neural-network feature extraction based on deep learning.
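The feature comparison step can be sketched with cosine similarity over feature vectors. This is a hedged illustration: the patent does not fix a similarity measure or threshold, and the vectors are assumed to come from whatever extractor is used (keypoint descriptors or a CNN embedding).

```python
import numpy as np

def cosine_similarity(a, b) -> float:
    """Cosine similarity between two feature vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def conforms_to(target_feat, prescribed_feat, threshold: float = 0.9) -> bool:
    """Judge, from the comparison result, whether the target image
    conforms to the prescribed image of the task.
    The 0.9 threshold is an illustrative assumption, not from the patent."""
    return cosine_similarity(target_feat, prescribed_feat) >= threshold
```

In practice the threshold would be tuned on labelled pairs of conforming and non-conforming print images.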

Specifically, a target area image used for the determination may be extracted from the acquired target image. For example, successive target images of the current production task on the production line can be acquired and motion-area detection performed against an established background model, so that the target area image is extracted from the motion areas detected across the successive target images.
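The background-model step can be sketched as a running-average model with frame differencing. This is a minimal stand-in for a proper background subtractor (such as OpenCV's MOG2); the update rate and threshold are illustrative assumptions.

```python
import numpy as np

class BackgroundModel:
    """Running-average background model with per-pixel frame differencing."""
    def __init__(self, first_frame, alpha=0.05, threshold=30):
        self.bg = first_frame.astype(np.float64)
        self.alpha = alpha          # background update rate (assumed value)
        self.threshold = threshold  # per-pixel change threshold (assumed value)

    def motion_bbox(self, frame):
        """Return the bounding box (top, left, bottom, right) of the motion
        area in `frame`, or None if nothing moved; then update the model."""
        diff = np.abs(frame.astype(np.float64) - self.bg)
        mask = diff > self.threshold
        # Blend the new frame into the background model.
        self.bg = (1 - self.alpha) * self.bg + self.alpha * frame
        if not mask.any():
            return None
        ys, xs = np.nonzero(mask)
        return (int(ys.min()), int(xs.min()), int(ys.max()), int(xs.max()))
```

The returned bounding box would be used to crop the target area image that is passed to feature extraction.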

The comparison may be a direct comparison against a given image, or a search comparison against an existing database. To this end, determining whether the target image conforms to a prescribed image of the production task may include: searching a task prescribed-image database for a prescribed image that matches the target image.
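The database search can be sketched as a top-1 nearest-neighbour lookup over stored feature vectors, with a similarity floor so that unknown patterns return no match. The mapping layout and the threshold are assumptions for illustration.

```python
import numpy as np

def search_prescribed_image(target_feat, database, min_sim=0.9):
    """Top-1 search: return the id of the prescribed image in `database`
    (a mapping: image id -> feature vector) most similar to `target_feat`,
    or None when no entry passes the similarity threshold."""
    best_id, best_sim = None, -1.0
    for img_id, feat in database.items():
        sim = float(np.dot(target_feat, feat)
                    / (np.linalg.norm(target_feat) * np.linalg.norm(feat)))
        if sim > best_sim:
            best_id, best_sim = img_id, sim
    return best_id if best_sim >= min_sim else None
```

A real deployment with a large base library would replace the linear scan with an approximate nearest-neighbour index.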

Further, the tracking method of the present invention may be used to implement automatic order following, in which case the method may further include: searching for the order information of the task based on the matched prescribed image, and determining the current completion degree of the task based on the order information and the progress status.
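The completion-degree calculation reduces to the sensor reading over the ordered quantity. A minimal sketch, assuming the order quantity and counter reading share a unit (e.g. metres of printed cloth):

```python
def completion_degree(order_quantity: float, counter_reading: float) -> float:
    """Fraction of the order completed: metres produced so far (from the
    length counter) over metres ordered, clamped to [0, 1]."""
    if order_quantity <= 0:
        raise ValueError("order quantity must be positive")
    return min(counter_reading / order_quantity, 1.0)
```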

Further, the tracking method of the present invention may be used for task status aggregation and reporting, in which case the method may further include: reporting the current status of the task. Reporting the current status of the task may include at least one of: actively pushing the current status of the task to a client; providing the current status of the task upon query from a client; and displaying the current status of the task in real time. In the invention, the method can be executed for a plurality of tasks simultaneously, so as to track the current statuses of the plurality of tasks at the same time.

Further, the plurality of tasks may belong to different task executors, and the method further includes: calculating task-status information for the different task executors based on the current statuses of the plurality of tasks; a task issuer can then select a task executor to execute a new task based on the task-status information of the different executors.
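One plausible selection rule, shown purely as an illustration (the patent leaves the selection policy open), is to dispatch the new task to the executor with the lightest current load:

```python
def select_executor(executor_load):
    """Pick the executor with the lightest current load.
    `executor_load` maps executor id -> a load figure aggregated from the
    current statuses of its tracked tasks (e.g. metres still to produce).
    The least-loaded-first policy is an assumption for illustration."""
    return min(executor_load, key=executor_load.get)
```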

The plurality of tasks may also belong to different task executors within the same product supply chain, and the method further comprises: calculating task-status information for the different task executors based on the current statuses of the plurality of tasks; determining current supply-chain status information for the supply chain based on the executors' task-status information; and determining a new task to perform based on the current supply-chain status information.

Further, the tracking method of the present invention may be used to chain automated production with order following, in which case the method may further comprise: starting the next production task at least when the current status indicates that the current production task is completed.

In the invention, the production task may in particular be a fabric printing task; a target image taken after the printing process on the production line is finished can be obtained, so that the execution status of the production task is judged from the printed pattern on the fabric.

In the present invention, the task may also be a logistics task. Determining the execution status of the task according to the target image then includes judging the task object and/or the condition of the task object from the target image, and acquiring the progress status of the task includes acquiring it according to the position of the logistics task.

According to yet another aspect of the present invention, there is provided a task tracking system, comprising: an image acquisition device for acquiring a target image of the task; a computing device for judging the execution status of the task according to the target image; and a sensor, such as a length or piece counter, for collecting the production progress status of the task, wherein the computing device determines the current status of the task based on the execution status and the progress status.

Preferably, the computing device may be configured to: and judging whether the target image conforms to the specified image of the task.

Preferably, the computing device may be further configured to: and extracting a target area image for the judgment from the acquired target image.

Preferably, the image acquisition device may be configured to acquire successive target images of the task (e.g., a current production task on a production line), and the computing device may be further configured to: detect motion areas against the established background model; and extract the target area image from the motion areas detected in the successive target images.

Preferably, the computing device may be further configured to: extracting the features of the target image; comparing features extracted from the target image with features extracted from a prescribed image of the task; and judging whether the target image meets the specified image of the task or not according to the comparison result.

Preferably, the system may further comprise: a database storing prescribed images of production tasks, wherein the computing device is further configured to search the database for a prescribed image that matches the target image. Preferably, the computing device is further configured to: search for the order information of the task based on the matched prescribed image; and determine the current completion degree of the task based on the order information and the progress status.

Preferably, the system may further comprise: a client for acquiring and reporting the current status of the production task. The client may receive the actively pushed current status of the task, or obtain the current status of the task upon a user query.

Preferably, the system can track the current status of multiple tasks simultaneously. The computing device may be further to: and calculating system task condition information based on the current conditions of the plurality of tasks, and reporting the system task condition information. The system may also be one that belongs to a particular supply chain, the computing device receiving current supply chain condition information and determining a new task to perform based on the current supply chain condition information.

Preferably, the computing device is further configured to: determining a next task to be started if the current condition is at least the task completed.

The tracking system of the present invention may be a production task tracking system, the system further comprising: a production line on which production tasks are performed. The production task tracking system is particularly suitable for being implemented as a fabric printing production tracking system, whereupon the image acquisition device may be mounted to acquire a target image after the printing process on the production line is finished. The computing device can then judge the execution status of the production task from the printed pattern on the fabric.

The tracking system of the present invention may also be a logistics task tracking system, and the computing device is configured to: and judging the task object and/or the task object condition of the logistics task according to the target image acquired by the image acquisition equipment, and determining the progress condition of the logistics task according to the position of the logistics task acquired by the sensor.

According to one aspect of the invention, a task tracking platform is provided, comprising a server and a plurality of task tracking systems as described above connected to the server via the internet.

In the case where edge computing is performed by the local computing device, the server may be used only to issue image feature extraction models to the computing devices of the task tracking systems. In the case of cloud computing, the server needs to: acquire the target image uploaded by a task tracking system; judge the execution status of the task according to the target image; and issue the execution status to the computing device of that task tracking system.

Preferably, the platform may further comprise: a client for: receiving the current state of the task actively pushed by the server; and/or obtaining the current state of the task from the server when queried by a user.

Preferably, the plurality of task tracking systems connected to the server may include a plurality of task executors performing tasks of the same kind, and the server is configured to: acquire task-status information for the plurality of task executors, calculated based on the current statuses of the plurality of tasks; and provide the task-status information to a task issuer so that the task issuer selects a task executor to execute a new task based on the task-status information of the plurality of task executors.

Preferably, the plurality of task tracking systems connected to the server may include a plurality of task executors belonging to the same product supply chain, and the server is configured to: acquire task-status information for the plurality of task executors, calculated based on the current statuses of the plurality of tasks; determine current supply-chain status information for the supply chain based on that task-status information; determine a new task to perform based on the current supply-chain status information; and issue the determined new task to the corresponding task executor.

According to yet another aspect of the invention, there is provided a computing device comprising: a processor; and a memory having executable code stored thereon, which when executed by the processor, causes the processor to perform the production task tracking method as described above.

According to yet another aspect of the present invention, a non-transitory machine-readable storage medium is presented having executable code stored thereon, which when executed by a processor of an electronic device, causes the processor to perform the production task tracking method as described above.

According to the invention, a camera is installed to collect image information during task execution for intelligent algorithmic analysis: adaptive area detection locates the effective image area; image features of the effective target area are extracted; the resulting feature vector is compared and ranked against a feature library; the matching image is found in the base library and associated with its order information; and, combined with the reading of a sensing counter, the production progress of the current task is obtained. On this basis, various functions built on task tracking, such as automatic order following, are finally realized.
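The pipeline above can be tied together in one sketch. Every helper here is a deliberately trivial placeholder (whole-frame "region", mean-brightness "feature", absolute-difference "search") standing in for the real components described earlier; only the control flow mirrors the patent's description.

```python
def extract_region(frame):
    """Placeholder for adaptive target-area detection: use the whole frame."""
    return frame

def extract_features(region):
    """Placeholder feature extractor: mean brightness as a scalar feature."""
    return sum(region) / len(region)

def search_library(feat, library, tol=10):
    """Placeholder top-1 search in the feature/base library."""
    best = min(library, key=lambda k: abs(library[k] - feat))
    return best if abs(library[best] - feat) <= tol else None

def track(frame, feature_library, orders, counter_reading):
    """End-to-end sketch: region -> features -> library search ->
    order association -> progress from the sensing counter."""
    region = extract_region(frame)
    feat = extract_features(region)
    img_id = search_library(feat, feature_library)
    if img_id is None:
        return {"status": "no match"}
    order = orders[img_id]
    return {"status": "in production",
            "order": order["id"],
            "completed": min(counter_reading / order["quantity"], 1.0)}
```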

Drawings

The above and other objects, features and advantages of the present disclosure will become more apparent by describing in greater detail exemplary embodiments thereof with reference to the attached drawings, in which like reference numerals generally represent like parts throughout.

FIG. 1 shows a schematic flow diagram of a production task tracking method according to one embodiment of the invention.

Fig. 2 shows an embodiment of the flow of the digital printing automatic order following according to the invention.

FIG. 3 illustrates a schematic diagram of the components of a production task tracking system according to one embodiment of the present invention.

FIG. 4 is a schematic diagram of a computing device that can be used to implement the production task tracking method according to an embodiment of the present invention.

Fig. 5 shows an example of a technical implementation flow of the digital printing automatic order following according to the invention.

Fig. 6 shows a diagram of the presentation effect of the order-following example shown in Fig. 5.

Detailed Description

Preferred embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While the preferred embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.

Order tracking means following the operational flow of production tasks driven by customer orders in the course of enterprise operation. The follow-up sheet, also called a processing route sheet, is a production document that travels with a workpiece and records its receipt and transfer between the successive processes of machining. Production planning and scheduling personnel hand it to the workers of the first process, after which it is passed on and filled in process by process until the last process is finished. Existing order-following practice relies mainly on manual data entry: dedicated workers must associate production with orders and key in the data, and the production process on the line is decoupled from the order progress, so order production progress cannot be fed back in real time, which greatly increases factory production cost.

Taking digital printing as an example: the management efficiency of current digital printing factories is low, and production is disconnected from sales. For order following, most of the industry relies on dedicated manual data entry performed daily by order-following staff, so production and order-progress control are decoupled. A complete digital solution is therefore needed to help factories manage production, while helping users obtain printing process data in real time and know the order production progress, ultimately achieving efficient production-sales coordination, improving factory management efficiency, reducing cost, and raising productivity.

In other areas such as logistics, the lack of timely tracking and feedback of task status likewise impairs the system's dispatching and monitoring of tasks, and thus overall efficiency.

Therefore, the invention provides a task tracking scheme that requires little modification of existing facilities such as factories or logistics warehouses and does not change how workers operate: a camera only needs to be installed in the area near the machine that forms the main link of the task. By performing algorithmic analysis on the image/video data, automatic task tracking, such as an associated order-following function, can be completed in a non-invasive manner, without workers even noticing.

Fig. 1 shows a schematic flow diagram of a task tracking method according to an embodiment of the invention. The method is suitable for the control center of a factory, a logistics service provider, or the like, and can be implemented as a cloud service for acquiring the current task status, or by combining local and cloud computing.

In step S110, a target image of the task is acquired. Herein, a "task" refers to a job, such as one dispatched by a customer. For example, a production line of a factory may perform production tasks according to customer orders, and a logistics provider may perform logistics transportation tasks according to customer orders. Acquiring the target image of the task may mean acquiring a target image of the object of the task, for example an image of an object being produced on a production line or an image of an object being shipped by logistics.

Here, a "production line", also called an assembly line, is an industrial production method in which each production unit is dedicated to processing only a certain piece of the work. A production line can be used to produce or assemble parts, to carry out different steps on a processed object, or for any combination of assembly and processing steps. "Production task" refers to work processed on a production line: in some embodiments a "production task" is the processed object itself, and in other embodiments it is the task of processing a plurality of objects in a batch, or those objects themselves. The target image of the production task is then the image of the object being processed on the production line.

For example, in a digital printing plant, the raw piece of cloth produced needs to be digitally printed. Digital printing is printing by digital technology: digital patterns are input into a computer by various digital means such as scanning, digital photos, images, or computer design, processed by a computer color-separation printing system, and then sprayed directly onto various fabrics or other media through a printing system driven by dedicated RIP (Raster Image Processor) software using special dyes (mainly reactive, disperse, and acid dyes), so that the required high-precision printed products are obtained on various textile fabrics after post-processing. The digital printing process is mainly divided into five links: printing, steaming, washing, shaping (weft straightening and sizing), and drying.

Accordingly, in an application scenario of digital printing, a production line may refer to an industrial line that carries out the above links of digital printing to complete accurate printing on the original cloth. The production task may then refer to a certain task in an order, for example producing 1000 meters of cloth with a star pattern, or the piece of star-patterned cloth itself. The target image of the production task may then be the printed pattern of the piece of cloth being processed on the production line.

In the following, the invention will be described mainly in conjunction with digital printing as an example. Those skilled in the art will appreciate that the invention is also applicable to other production lines where the production status is determined based on appearance comparison. In other embodiments, the invention is also applicable to tasks other than production tasks, for example logistics transportation tasks, as long as the performance of these tasks can be analyzed from captured images.

In step S110, an image capturing device installed on the production line may be used to capture a target image of a production task on the production line. Here, "on the production line" may refer to an arrangement above, below, to the left or right of, around or near the production line, depending on actual needs. For example, in a digital printing scenario, a target image taken after completion of the printing step can be acquired by an image capture device installed between the printing and steaming steps, for example a camera mounted directly above the production line and pointing vertically downward. In other embodiments, the image capturing device may be installed after any or all of the other production steps to capture and analyze images, and the invention is not limited in this respect.

In a logistics task, the target image may be acquired using an image acquisition device installed at a fixed warehouse location (e.g., an entrance or exit) or at a fixed location on a means of transport (e.g., the cabin of a cargo plane).

In various embodiments, the "image capture device" may be a still camera or a video camera, depending on whether subsequent steps require images or video. In some scenarios the two roles are interchangeable: still pictures taken at sufficiently short intervals can serve as a video stream, and frames sampled from a video at sufficiently large intervals can be processed as individual images. In one embodiment, the target image is preferably acquired in real time or near real time so that it can be processed promptly, thereby facilitating immediate control of the production process.

Subsequently, in step S120, the execution status of the task is determined from the target image. Determining the execution status may mean identifying the specific task from the image, for example determining from the pattern and colors of the print which order the current cloth production task belongs to, or determining from the image whether the task is being executed normally, for example checking whether the parcel box in a logistics task is intact, or whether the printed pattern and colors are wrong.

Specifically, the target image may be compared with an existing image to determine whether the task is executing normally; in particular, it may be determined whether the target image conforms to a prescribed image of the task (e.g., an image in a print pattern library). For example, the determination may be based on a comparison of image features obtained with an image feature extraction technique. Here, "image feature extraction" is part of computer vision, in which an image acquisition device (e.g., a camera) and a computer take the place of human eyes to recognize, track and measure a target, and the image is further processed into a form better suited to human viewing or to transmission to an instrument for inspection. Computer vision attempts to create artificial intelligence systems that can extract "information" from images or multidimensional data; the information here is that defined by Shannon and can be used to support a "decision". In the present invention, therefore, "image feature extraction" refers to using a computer to extract meaningful features (features carrying such "information") from the acquired image. The feature extraction technique used may be feature point extraction based on conventional CV (computer vision), or feature extraction with a deep-learning neural network (e.g., a CNN, convolutional neural network).
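As a minimal sketch of such a feature comparison (not the patent's actual implementation; the feature vectors and the 0.8 threshold below are hypothetical stand-ins for what a real feature extractor would produce), the conformity check can be reduced to a similarity score between two feature vectors:

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity between two feature vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def matches_prescribed(target_feat, prescribed_feat, threshold=0.8):
    # The task is judged as executing normally if the similarity between the
    # captured image's features and the prescribed image's features is high.
    return cosine_similarity(target_feat, prescribed_feat) >= threshold

target = [0.9, 0.1, 0.4]      # hypothetical features of the captured image
prescribed = [1.0, 0.0, 0.5]  # hypothetical features of the prescribed image
print(matches_prescribed(target, prescribed))  # True
```

A production system would obtain these vectors from a CV or CNN feature extractor rather than hand-written lists; only the comparison logic is illustrated here.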

Although careful camera selection and mounting can bring the acquired image close to the prescribed acquisition angle, various differences are still introduced in actual operation. For example, the actual pattern image acquired by the camera during digital printing production still differs from the template image issued from the order pattern library in size, rotation, illumination, color depth, degree of pattern repetition, image noise from other content, viewing angle, and so on; overcoming these differences is the main technical difficulty in the matching stage. Therefore, the acquired images often require further processing before comparison.

Accordingly, in one embodiment, the tracking method of the present invention further includes extracting, from the acquired target image, a target area image to be used for the determination. Specifically, continuous target images (e.g., a video) of the current production task on the production line may be acquired, motion region detection may be performed against an established background model, and the target area image may be extracted based on the motion regions detected in the continuous target images.
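The motion-region extraction described above can be sketched on toy single-channel frames as follows; a real system would maintain a learned background model (e.g., a Gaussian mixture) over video frames, so the simple per-pixel differencing here is only illustrative:

```python
def motion_bbox(background, frame, diff_thresh=30):
    # Compare each pixel to the background model; pixels whose absolute
    # difference exceeds diff_thresh are treated as motion.
    moving = [(r, c)
              for r, row in enumerate(frame)
              for c, px in enumerate(row)
              if abs(px - background[r][c]) > diff_thresh]
    if not moving:
        return None  # no motion region detected
    rows = [r for r, _ in moving]
    cols = [c for _, c in moving]
    # Bounding box of the detected motion region (the printing ROI).
    return min(rows), min(cols), max(rows), max(cols)

def crop(frame, bbox):
    r0, c0, r1, c1 = bbox
    return [row[c0:c1 + 1] for row in frame[r0:r1 + 1]]

background = [[0] * 4 for _ in range(4)]  # static background model
frame = [[0, 0, 0, 0],
         [0, 200, 210, 0],
         [0, 190, 205, 0],
         [0, 0, 0, 0]]                    # cloth moving through the view
bbox = motion_bbox(background, frame)
print(bbox)              # (1, 1, 2, 2)
print(crop(frame, bbox))
```

The cropped region plays the role of the target area image that is passed on to feature extraction and comparison.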

In one embodiment, the comparison of the target image with the existing image may be a comparison of extracted features. To this end, step S120 may include: extracting features of the target image; comparing the features extracted from the target image with features extracted from a prescribed image of the production task; and judging from the comparison result whether the target image conforms to the prescribed image of the task.

The comparison may be a direct comparison against a known image, or a search-based comparison against an existing database. To this end, step S120 may include searching a database of prescribed task images for a prescribed image that matches the target image, for example by comparing the extracted target image features with the stored features in the prescribed image database to find the prescribed image whose features match.

Subsequently, based on the matched prescribed image, the order information of the task may be looked up, and the current degree of completion of the task may then be determined from the order information together with the progress status acquired in the subsequent step S130.

In step S130, the progress status of the task is acquired, e.g., readings of a production progress sensor arranged on or near the production line. The production progress sensor may be a length or quantity counter. For example, the finished length of the printed cloth can be determined by a meter counter mounted on a rotation axis of the production line. As another example, the current position of the transported object in a logistics task may be acquired to estimate the transportation completion.
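A hedged sketch of how a meter-counter reading might be turned into a completion figure (the function and parameter names are illustrative, not from the patent):

```python
def completion_ratio(counter_now, counter_at_task_start, order_length_m):
    # Meters produced for the current task = current counter reading minus
    # the reading recorded when this task (pattern) started.
    produced = counter_now - counter_at_task_start
    if produced < 0:
        raise ValueError("counter reading went backwards")
    return min(produced / order_length_m, 1.0)

# Hypothetical example: an order for 1000 m; the counter read 2400 m when
# the task started and reads 3150 m now -> 750 m produced, 75% complete.
print(completion_ratio(3150, 2400, 1000))  # 0.75
```

Combining this ratio with the order information retrieved in step S120 yields the "current completion" used in step S140.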

Subsequently, in step S140, the current status of the task is determined based on the execution status and the progress status.

Thus, the present invention can determine the current completion of a task by acquiring the execution status and the progress status (e.g., by real-time acquisition and analysis of images and sensor data). The current status of the task can be fed to a local or cloud control center for subsequent operations.

In one embodiment, the method may further comprise reporting the current status of the production task. For example, the determined current status may be displayed in real time on a display of the plant control center, or even on the customer's client. The current status of the task may be actively pushed to the client, or provided when the client issues a query.

In the invention, the method can be executed for a plurality of tasks simultaneously so as to track their current statuses at the same time. For example, with multiple production lines, the method of the present invention can track the production status of the current production task on each line, and summarize and report those statuses.

In a cloud implementation, the server can track the current production status of every production line in every factory and feed that status back to the corresponding client.

In one embodiment, the plurality of tracked tasks belong to different task executors, and the method further comprises: calculating executor task status information for the different task executors based on the current statuses of the plurality of tasks; the task issuer then selects a task executor to execute a new task based on the executor task status information of the different task executors.

Here, the different task executors may be executors of the same type of task, such as cloth suppliers A, B and C, each of which has multiple production lines for printed cloth. The printing time, efficiency and busyness (e.g., backlog count) of each cloth supplier may be aggregated and provided to a task issuer (e.g., an ordering party), who may view the producers' status on a client, either in real time or updated at fixed times (e.g., daily), and select one of suppliers A, B and C to place an order with, e.g., an electronic order, based on that status.
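One illustrative way the task issuer might act on the aggregated status (a sketch only; the "backlog" metric is a hypothetical stand-in for the busyness information mentioned above):

```python
def pick_supplier(suppliers):
    # suppliers: mapping from supplier name to its summarized status, here a
    # hypothetical backlog in meters of cloth still queued on its lines.
    # The issuer picks the least busy supplier.
    return min(suppliers, key=lambda name: suppliers[name]["backlog_m"])

status = {
    "supplier_A": {"backlog_m": 12000},
    "supplier_B": {"backlog_m": 3500},
    "supplier_C": {"backlog_m": 8000},
}
print(pick_supplier(status))  # supplier_B
```

A real ordering decision would weigh several aggregated metrics (time, efficiency, price); the single-metric minimum here is only the simplest possible policy.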

In one embodiment, the plurality of tracked tasks belong to different task executors within the same product supply chain, and the method further comprises: calculating executor task status information for the different task executors based on the current statuses of the plurality of tasks; determining current supply chain status information for the supply chain based on that executor task status information; and determining a new task to execute based on the current supply chain status information. For example, the supply of tires A and engines B within the supply chain constrains the production of complete vehicles. To this end, the current production status of the individual task executors across the entire supply chain can be summarized, and the subsequent production tasks of each executor's production lines determined from it.
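The supply-chain constraint in the tires/engines example can be sketched as a bottleneck computation (the stage names and rates are hypothetical illustrations, not data from the patent):

```python
def chain_capacity(stage_rates):
    # The whole chain's achievable output rate is bounded by its slowest
    # stage (e.g., tire supply and engine supply both constrain how many
    # complete vehicles can be assembled).
    return min(stage_rates.values())

rates = {"tires_A": 400, "engines_B": 250, "assembly": 300}  # units/day
print(chain_capacity(rates))  # 250
```

Scheduling new production tasks so that faster stages do not outrun the bottleneck is one way the aggregated supply chain status information could be used.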

In one embodiment, the tracking scheme of the present invention may also be incorporated into a factory automation system, whereby the method may further comprise: when the current status indicates at least that the task has been completed, confirming the next task to be started, so that this next task can be started automatically or after manual confirmation. For example, during order scheduling, the tasks to be executed can be scheduled reasonably according to the current overall orders and their execution status.
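A minimal sketch of that automation step, assuming a simple pending-task queue (the status strings and queue structure are illustrative assumptions):

```python
from collections import deque

def advance(queue, current_status):
    # Start the next pending task only once the current task's tracked
    # status is reported as complete; otherwise keep waiting.
    if current_status == "completed" and queue:
        return queue.popleft()  # next task to start
    return None

pending = deque(["order_B", "order_C"])
print(advance(pending, "completed"))  # order_B
```

In a deployment, `advance` would be triggered by the status determined in step S140, with the returned task either started automatically or shown to an operator for manual confirmation.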

The production task tracking method according to the present invention and its preferred embodiments have been described above in connection with FIG. 1. In the case of digital printing production, video images acquired by cameras mounted at each step of the printing process are analyzed by an intelligent algorithm: the printed area of the cloth is detected adaptively and an effective printed pattern region is extracted; features are then extracted from that region and compared and ranked against the feature vectors in the pattern library, so that the library sample corresponding to the pattern currently in production is determined and linked to the order information in the database; the number of meters produced for the current pattern is obtained from the current machine's meter counter; and the function of automatic order following is thereby realized.

Fig. 2 shows an embodiment of the flow of digital printing automatic order following according to the invention. As shown in the figure, a video image of a main step, for example a video image taken after the printing step, is first acquired in step S210. Subsequently, in step S220, the input real-time video image is filtered and denoised; for example, Gaussian denoising may be adopted.
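As a hedged illustration of the denoising idea in step S220 (real systems apply a 2-D Gaussian kernel to each frame via a CV library; the 1-D version below only shows the principle):

```python
import math

def gaussian_kernel(radius, sigma):
    # Discrete Gaussian weights, normalized so they sum to 1.
    k = [math.exp(-(i * i) / (2 * sigma * sigma))
         for i in range(-radius, radius + 1)]
    s = sum(k)
    return [v / s for v in k]

def smooth(signal, radius=1, sigma=1.0):
    kernel = gaussian_kernel(radius, sigma)
    out = []
    for i in range(len(signal)):
        acc = 0.0
        for j, w in enumerate(kernel):
            idx = min(max(i + j - radius, 0), len(signal) - 1)  # clamp edges
            acc += w * signal[idx]
        out.append(acc)
    return out

noisy = [10, 10, 200, 10, 10]  # a single noise spike
print(smooth(noisy))           # spike attenuated, neighbors raised slightly
```

The same weighted-averaging idea, applied in two dimensions per frame, suppresses sensor noise before background modeling and ROI extraction.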

Then, in step S230, background modeling is performed on the working environment in the camera view, for example with Gaussian mixture background modeling. Since the printed cloth is laid flat and moves through the five main steps of printing while being processed, motion region detection can be performed against the established background model, and the printing ROI (region of interest) can then be segmented from the motion regions detected in the video. Here, the printing ROI may refer to the image content, within the video frames acquired by the camera during printing production, that corresponds to the cloth area on which the print is actually applied.

In step S240, an image feature extraction algorithm is used for image retrieval and comparison. Image retrieval here mainly refers to Content-Based Image Retrieval (CBIR), i.e., retrieval that analyzes the content semantics of an image, such as its color, texture and layout. In different implementations, features may be extracted from the printing ROI with a computer vision algorithm, such as feature point extraction from traditional CV or convolutional neural network feature extraction from deep learning; in this example, resnet50 may be used for feature extraction, and the result is compared with the feature vectors of the order pattern library image dataset. Here, the order pattern library may refer to the set of pattern template images issued to the printing side by the order management system. The comparison yields, for example, the Top10 most similar patterns, and if the similarity of the library pattern ranked Top1 exceeds a certain threshold (0.8 is assumed in this example), the pattern correspondence is judged correct.
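The retrieval and thresholding logic of step S240 can be sketched as follows (a simplified illustration: the pattern identifiers and precomputed similarity scores are hypothetical, and a real system would compute similarities from resnet50-style feature vectors):

```python
def rank_and_match(query_sim, library, k=10, threshold=0.8):
    # query_sim: function mapping a library pattern id to its similarity
    # with the printing ROI features. Returns the top-k (id, score) pairs
    # and the matched pattern id (or None if Top1 is below the threshold).
    ranked = sorted(((pid, query_sim(pid)) for pid in library),
                    key=lambda t: t[1], reverse=True)[:k]
    top1_id, top1_score = ranked[0]
    match = top1_id if top1_score >= threshold else None
    return ranked, match

# Hypothetical precomputed similarities between the printing ROI features
# and each base-library pattern.
sims = {"pattern_07": 0.92, "pattern_31": 0.61, "pattern_02": 0.34}
ranked, match = rank_and_match(lambda pid: sims[pid], sims)
print(match)  # pattern_07
```

When `match` is None, the frame fails to correspond to any library pattern, which is the error case reported to the user elsewhere in this description.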

Then, in step S250, the number of meters processed for the current pattern is obtained by reading the current meter counter and subtracting the counter value recorded when the previous pattern finished, and the result is reported together with the order information of the corresponding library pattern in the database. Finally, in step S260, the automatic order following system receives the reported information, and summarizes, analyzes and displays the results.

In one embodiment, the tracking scheme of the present invention may also track logistics tasks. To this end, step S120 may include determining the task object and/or the task object's condition of the logistics task from the target image, and step S130 may include acquiring the progress status of the logistics task from its location. For example, the current shipping status of a previously produced printed-cloth order may be determined from images of the printed goods being shipped: by inspection, cloth bearing pattern A in warehouse A can be identified as belonging to order A. Meanwhile, the transportation progress of the logistics task can be obtained from the location of warehouse A or from real-time GPS data during subsequent transport. The current state of the logistics task is thereby determined in step S140.

The tracking scheme of the present invention may also be implemented as a task tracking system. A task tracking system according to the present invention may include: an image acquisition device for acquiring a target image of the task; a computing device for judging the execution status of the task from the target image; and a sensor for collecting the progress status of the task, wherein the computing device determines the current status of the task based on the execution status and the progress status. The system of the present invention is described below using the example of a production task tracking system that further includes a production line.

FIG. 3 illustrates a schematic diagram of the components of a production task tracking system according to one embodiment of the present invention. As shown, system 300 includes a production line 310, an image acquisition device 320, a computing device 330, and a sensor 340.

The production line 310 is used to perform production tasks. For example, the production line 310 may be a production line for printing cloth, on which a production task is performed according to an order, for example producing a 300-meter piece of printed cloth bearing pattern A.

The image capturing device 320 is used to capture a target image of a production task on the production line. The image capturing device 320 may be a still camera or a video camera; it may, for example, be arranged vertically above the production line to photograph the object being processed, capturing a target image after completion of the printing step, for example a cloth image containing the printed pattern.

The computing device 330 can determine the execution status of the production task from the target image.

The sensor 340 may be used to collect the production progress status of the production task. The production progress sensor may be a length or quantity counter. For example, the sensor 340 may be a meter counter arranged on the cloth roller that counts the length of the cloth based on the rotation speed, or any other sensor, of a type commonly used in the art or developed in the future, suitable for characterizing the progress of a task.

Subsequently, the computing device 330 may determine the current status of the production task based on the execution status and the production progress status.

Specifically, the computing device 330 may make the determination of the execution status. For example, it may determine, based on an existing determination model (e.g., a CNN feature extraction model described below), the probability that the target image conforms to the prescribed image of the production task.

Because the captured image has parallax with the prescribed image, the computing device 330 may be further configured to extract, from the acquired target image, a target area image to be used for the determination. For example, the image capturing device 320 may acquire successive target images of the current production task on the production line, and the computing device 330 may perform motion region detection against the established background model and extract the target area image from the motion regions detected in the successive target images.

In one embodiment, the computing device 330 may be used to: extract the features of the target image; compare the features extracted from the target image with features extracted from a prescribed image of the production task; and judge from the comparison result whether the target image conforms to the prescribed image of the production task. Here, the image feature extraction technique used may be feature point extraction based on conventional CV (computer vision), or feature extraction with a deep-learning neural network (e.g., a CNN, convolutional neural network).

The comparison may be a direct comparison against a known image, or a search-based comparison against an existing database. To this end, the system 300 may also include a database for storing the prescribed images of production tasks. The computing device 330 may then look up, from the database, the prescribed image that corresponds to the target image.

Thus, the present invention can determine the current completion of a production task by acquiring the execution status and the production progress status (e.g., by real-time acquisition and analysis of images and sensor data). The current status of the production task can be fed to a local or cloud control center for subsequent operations.

Further, the computing device 330 may look up the order information of the production task based on the matched prescribed image, and determine the current completion of the production task based on the order information and the production progress status.

The computing device 330 may collect the current status of the production tasks and notify the user, for example via a display screen of the computing device 330. With multiple production lines, the method of the present invention can track the production status of the current production task on each line, and summarize and report those statuses. In one embodiment, the system 300 may include multiple production lines, each equipped with image capture devices and sensors and connected to a computing device. In other embodiments, different tasks may share one image capturing device and/or sensor, for example an image capturing device installed at a passageway of a logistics warehouse.

In some embodiments, the system 300 may further include a client (e.g., a smartphone with a corresponding App installed) for obtaining and reporting the current status of the production task to a user. The client may receive the current status of a task that is actively pushed, or obtain it when queried by the user.

In a cloud implementation, the server can track the current production status of every production line in every factory and feed that status back to the corresponding client. In this case, the computing device 330 may be connected to the cloud platform and further configured to calculate system task status information based on the current statuses of the plurality of tasks and to report that information. The cloud platform can collect the system task status information from each networked system, provide ordering guidance to an ordering party, and transmit new orders to the corresponding system.

In one embodiment, the system belongs to a particular supply chain. The cloud platform may aggregate information from the entire supply chain to calculate current supply chain status information and to determine which system performs subsequent tasks. The computing devices of the respective systems may receive the current supply chain status information and determine a new task to execute based on it.

In other embodiments, the task tracking system of the present invention may also be a logistics task tracking system. The logistics task tracking system can be independent of the production task tracking system, or be a downstream system linked to it, thereby realizing comprehensive tracking from production to transportation. The computing device of the logistics task tracking system can be configured to judge the task object and/or the task object's condition of the logistics task from the target image acquired by the image acquisition device, and to determine the progress status of the logistics task from its location as acquired by the sensor.

As described above, in the cloud case, the present invention can be implemented as a production task tracking platform comprising a server and a plurality of task tracking systems 300 as described above, connected to the server via the Internet.

In cases involving a cloud, each computing device 330 may perform edge computing, or the cloud may perform centralized computing. For example, where a factory is equipped with computing devices of sufficient computing power, the server may be configured to issue a model for image feature extraction, such as a trained CNN model, to the computing devices of the production task tracking system. The computing device 330 may load the model, and perform real-time image analysis and determination of the execution status of the production task on the captured video of each production line in the plant. When a task starts, it is usually necessary to compare an image, acquired and subjected to region extraction, with the prescribed images in the database to find a match and thereby determine the order number. Then, for the duration of the task, subsequently extracted images need only be compared with the image corresponding to that order to confirm that production for the order is proceeding normally. When an erroneous image occurs, for example when the current image fails to match any prescribed image, or other images occur during the course of a task, the user may be notified directly by the computing device 330, or via a bound client.

The tracking scheme of the present invention may also be incorporated into a factory automation system. Thus, the computing device 330 may also be used to determine, when the current status indicates at least that the production task has been completed, to start the production line on the next production task. For example, the computing device 330 may directly initiate the subsequent production task, or notify the user that the current task is finished and initiate the subsequent task upon user confirmation.

When local computing power is insufficient and network bandwidth permits, the computation can be performed in the cloud. In that case, the server may be configured to: acquire the target image uploaded by the production task tracking system; judge the execution status of the production task from the target image; and issue the execution status to the computing device of the production task tracking system. Specifically, the server may perform feature extraction on the target image in the cloud based on the trained CNN model, compare the features with those in the database to find the corresponding prescribed image, and send the search and comparison result back to the local side.

To facilitate reporting of information, the platform may further comprise a client for receiving the current state of a task actively pushed by the server, and/or obtaining the current state of a task from the server when queried by a user. For example, the tracking platform may publish a tracking App for clients to download; a user with the App installed can then view, at any time and subject to the appropriate viewing permissions, the target task, the provider, and the current operating condition of the supply chain, and control the progress of tasks as required.

In one embodiment, the plurality of task tracking systems connected to the server include a plurality of task executors performing the same type of task, and the server is configured to: acquire executor task status information of the plurality of task executors calculated from the current statuses of the plurality of tasks; and provide the executor task status information to a task issuer so that the task issuer selects a task executor to execute a new task based on the executor task status information of the plurality of task executors.

In one embodiment, the plurality of task tracking systems connected to the server include a plurality of task executors belonging to the same product supply chain, and the server is configured to: acquire executor task status information of the plurality of task executors calculated from the current statuses of the plurality of tasks; determine current supply chain status information for the supply chain based on that information; determine a new task to execute based on the current supply chain status information; and issue the determined new task to the corresponding task executor.

In addition, the platform may further include tracking systems of different kinds, such as a production task tracking system and a logistics task tracking system, so that a specific object (such as the printed cloth of printing order A) is tracked from production through transportation by the linkage of these tracking systems.

FIG. 4 is a schematic diagram of a computing device that can be used to implement the production task tracking method according to an embodiment of the present invention.

Referring to fig. 4, computing device 400 includes memory 410 and processor 420.

The processor 420 may be a multi-core processor or may include a plurality of processors. In some embodiments, processor 420 may include a general-purpose host processor and one or more special coprocessors such as a Graphics Processing Unit (GPU) or a Digital Signal Processor (DSP). In some embodiments, processor 420 may be implemented using custom circuits, such as an Application Specific Integrated Circuit (ASIC) or a Field Programmable Gate Array (FPGA).

The memory 410 may include various types of storage units, such as system memory, Read Only Memory (ROM), and permanent storage. The ROM may store static data or instructions required by the processor 420 or other modules of the computer. The persistent storage device may be a readable and writable storage device, and may be a non-volatile device that does not lose stored instructions and data even after the computer is powered off. In some embodiments, a mass storage device (e.g., a magnetic or optical disk, or flash memory) is employed as the persistent storage. In other embodiments, the permanent storage may be a removable storage device (e.g., a floppy disk or optical drive). The system memory may be a readable and writable memory device or a volatile readable and writable memory device, such as a dynamic random access memory, and may store instructions and data that some or all of the processors require at runtime. Further, the memory 410 may include any combination of computer-readable storage media, including various types of semiconductor memory chips (DRAM, SRAM, SDRAM, flash memory, programmable read-only memory) and magnetic and/or optical disks. In some embodiments, memory 410 may include a readable and/or writable removable storage device, such as a Compact Disc (CD), a read-only digital versatile disc (e.g., DVD-ROM, dual-layer DVD-ROM), a read-only Blu-ray disc, an ultra-density optical disc, a flash memory card (e.g., SD card, mini SD card, Micro-SD card), or a magnetic floppy disk. Computer-readable storage media do not contain carrier waves or transitory electronic signals transmitted by wireless or wired means.

The memory 410 has stored thereon executable code that, when processed by the processor 420, causes the processor 420 to perform the production task tracking method described above.

[Application Example]

The task tracking scheme of the present invention is particularly useful for production tasks in which normal production can be confirmed by image comparison, such as digital printing tasks. Fig. 5 shows an example of a technical implementation flow of digital printing automatic order following according to the invention.

In step S510, a camera captures live video of the printing machine. For example, the figure shows a live image frame acquired at 15:35:22 on July 4, 20XX. Subsequently, in step S520, adaptive ROI detection extracts the actual printed-pattern region, shown as the red box in the figure. Then, in step S530, feature extraction is performed on the printing ROI.
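The adaptive ROI step (S520) can be implemented with a background model and motion-region detection, consistent with the background-model approach described later in the claims. Below is a minimal, numpy-only sketch under the assumption of a running-average background model and a single moving region; the function names (`update_background`, `motion_roi`), the learning rate, and the threshold are illustrative choices, not values from the patent.

```python
import numpy as np

def update_background(bg, frame, alpha=0.05):
    """Running-average background model (a common, simple choice)."""
    return (1 - alpha) * bg + alpha * frame.astype(np.float32)

def motion_roi(bg, frame, thresh=25):
    """Bounding box (x0, y0, x1, y1) of pixels differing from the
    background by more than `thresh`, or None if the scene is static."""
    diff = np.abs(frame.astype(np.float32) - bg) > thresh
    ys, xs = np.nonzero(diff)
    if xs.size == 0:
        return None
    return int(xs.min()), int(ys.min()), int(xs.max()) + 1, int(ys.max()) + 1

# Toy demo: a static background with one bright moving patch
# standing in for the printed-pattern region.
bg = np.zeros((64, 64), dtype=np.float32)
frame = bg.copy()
frame[10:20, 30:40] = 255
roi = motion_roi(bg, frame)  # (30, 10, 40, 20)
```

In a deployed system the background would be updated frame by frame with `update_background`, and the ROI would be the union of motion regions over several consecutive frames.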

In the lower branch, a base-library pattern feature database is built from the printing library pattern dataset at step S540. Step S540 may be completed well before the upper branch (steps S510-S530) runs, or may be performed continuously: whenever a new pattern is entered, its features are analyzed and the base-library pattern feature database is updated.
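Step S540 amounts to maintaining a mapping from pattern identifier to feature vector that is updated whenever a new pattern enters the library. The sketch below uses a toy grey-level histogram as the feature purely for illustration; the patent's actual features would come from traditional CV feature points or deep-learning embeddings, and all identifiers here (`pattern_features`, `register_pattern`, `feature_db`) are hypothetical.

```python
import numpy as np

def pattern_features(img, bins=16):
    """Toy global feature: L2-normalised grey-level histogram.
    A production system would use CV keypoints or CNN embeddings."""
    hist, _ = np.histogram(img, bins=bins, range=(0, 256))
    v = hist.astype(np.float32)
    n = np.linalg.norm(v)
    return v / n if n else v

feature_db = {}  # pattern id -> normalised feature vector

def register_pattern(pattern_id, img):
    """Called whenever a new pattern is added to the base library."""
    feature_db[pattern_id] = pattern_features(img)

rng = np.random.default_rng(0)
register_pattern("P001", rng.integers(0, 256, (32, 32)))
register_pattern("P002", rng.integers(0, 256, (32, 32)))
```

Because registration is independent of the live branch, the database can be rebuilt or extended offline without interrupting order following.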

Thus, in step S550, the features extracted from the printing ROI can be compared against the base-library pattern feature database, and the comparison results ranked by similarity score.

Then, in step S560, the Top-10 ranking results are output, the Top-1 result is associated with its order information, and the current number of meters processed for the pattern is calculated from the meter-counter reading.
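Steps S550-S560 reduce to a similarity ranking over the feature database, followed by joining the Top-1 pattern to its order record and the current meter-counter reading. A hedged sketch, assuming cosine similarity on pre-normalised vectors; the `orders` table and every name below are hypothetical stand-ins, not the patent's data model:

```python
import numpy as np

def top_k(query, feature_db, k=10):
    """Rank library patterns by cosine similarity to the query feature
    (vectors are assumed L2-normalised, so the dot product suffices)."""
    scored = sorted(
        ((float(np.dot(query, v)), pid) for pid, v in feature_db.items()),
        reverse=True)
    return scored[:k]

feature_db = {
    "P001": np.array([1.0, 0.0]),
    "P002": np.array([0.6, 0.8]),
}
orders = {"P001": {"order_id": "A-17", "total_m": 1200}}  # hypothetical

query = np.array([0.9, 0.1])          # features from the printing ROI
ranking = top_k(query, feature_db)    # Top-10 (here only 2 entries)
best_score, best_id = ranking[0]      # Top-1 pattern

meter_reading = 350  # current meter-counter value for this run
progress = {**orders.get(best_id, {}), "processed_m": meter_reading}
```

A recognition threshold on `best_score`, as the summary below mentions, would reject matches when no library pattern is sufficiently similar.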

The automatic order-following system receives the reported information, then aggregates, analyzes, and displays the results. Fig. 6 shows the presentation effect for the example shown in Fig. 5. As shown, the left side of the figure shows the corresponding printed pattern found in the feature database by image feature comparison. The comparison result is then combined with the current meter-counter reading, aggregated, and reported to a display interface of the automatic order-following system, such as a display of the local computing device or a display screen of a client.

The production task tracking method, system and platform according to the present invention have been described in detail above with reference to the accompanying drawings. The scheme is particularly suitable for implementation as a computer-vision-based digital printing automatic order-following scheme: intelligent video analysis detects the effective printing ROI (region of interest) from the changing motion region in the camera picture; an image feature extraction algorithm then extracts image features of the printing ROI and compares them against the feature vectors of the printing base-library pattern dataset for ranking; the Top-1 pattern is confirmed as the correct corresponding pattern according to a recognition threshold; the corresponding order information and the number of production meters processed for the pattern are obtained from the meter-counter data; and finally the results are output to the automatic order-following system for data analysis, integration, and result display.

The technology of the invention effectively reduces the cost and difficulty of digital transformation for a digital printing factory, and features lightweight deployment and strong reproducibility. Without changing the workers' original working mode, the real-time printing progress of the factory is obtained through real-time algorithm analysis and synchronized to the producer, the platform side, and the consumer, achieving efficient production-marketing coordination, accurate control of printing order production, improved factory management efficiency, reduced cost, and increased productivity. The invention provides a digital printing automatic order-following technical framework based on image feature extraction, together with adaptive printing-area ROI detection, which reduces the influence of non-printing content in the video picture on printing-ROI feature extraction and improves the matching accuracy between the printing ROI and the pattern base library.

Furthermore, the method according to the invention may also be implemented as a computer program or computer program product comprising computer program code instructions for carrying out the above-mentioned steps defined in the above-mentioned method of the invention.

Alternatively, the invention may also be embodied as a non-transitory machine-readable storage medium (or computer-readable storage medium, or machine-readable storage medium) having stored thereon executable code (or a computer program, or computer instruction code) which, when executed by a processor of an electronic device (or computing device, server, etc.), causes the processor to perform the steps of the above-described method according to the invention.

Those of skill would further appreciate that the various illustrative logical blocks, modules, circuits, and algorithm steps described in connection with the disclosure herein may be implemented as electronic hardware, computer software, or combinations of both.

The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems and methods according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.

Having described embodiments of the present invention, the foregoing description is intended to be exemplary, not exhaustive, and not limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein is chosen in order to best explain the principles of the embodiments, the practical application, or improvements made to the technology in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.

Claims (47)

1. A task tracking method, comprising:

acquiring a target image of a task;

judging the execution state of the task according to the target image;

acquiring the progress status of the task; and

determining a current status of the task based on the execution status and the progress status.

2. The method of claim 1, wherein the task is a production task,

acquiring a target image of a task includes:

acquiring a target image of a production task on a production line by using image acquisition equipment installed on the production line.

3. The method of claim 1, wherein judging the execution status of the task according to the target image comprises:

judging whether the target image conforms to a specified image of the task.

4. The method of claim 3, further comprising:

extracting a target area image for the judgment from the acquired target image.

5. The method of claim 4, wherein acquiring a target image of a task comprises:

acquiring successive target images of the task, and

extracting a target area image for the determination from the acquired target image includes:

detecting a motion area according to the established background model;

extracting the target area image from the motion areas detected from the continuous target images.

6. The method of claim 3, wherein determining whether the target image conforms to a prescribed image of the task comprises:

extracting the features of the target image;

comparing features extracted from the target image with features extracted from a prescribed image of the task; and

judging, according to the comparison result, whether the target image conforms to the specified image of the task.

7. The method of claim 6, wherein feature extracting the target image comprises at least one of:

extracting image feature points based on traditional computer vision (CV);

performing neural network feature extraction based on deep learning.

8. The method of claim 3, wherein determining whether the target image conforms to a prescribed image of the task comprises:

searching a task specified-image database for a specified image that conforms to the target image.

9. The method of claim 8, further comprising:

based on said conformed specification image, searching order information of said task, an

Based on the execution status and the progress status, determining a current status of the task comprises:

determining the current completion degree of the task based on the order information and the progress status.

10. The method of claim 1, wherein obtaining the progress status of the task comprises:

taking a reading of a production progress sensor disposed on or near the production line, the reading characterizing the progress status.

11. The method of claim 10, wherein the production progress sensor is a length or number counter.

12. The method of claim 1, further comprising:

reporting a current status of the task.

13. The method of claim 12, wherein reporting the current status of the task comprises at least one of:

actively pushing the current state of the task to a client;

providing the current state of the task upon query by the client;

displaying the current state of the task in real time.

14. The method of claim 1, wherein the method is performed simultaneously for multiple tasks to track current conditions of the multiple tasks simultaneously.

15. The method of claim 14, wherein the plurality of tasks are tasks belonging to different task performers, and further comprising:

calculating executive task condition information of the different task executors based on the current conditions of the plurality of tasks;

selecting, by a task issuing party, a task executor to execute a new task based on the executive task condition information of the different task executors.

16. The method of claim 14, wherein the plurality of tasks are tasks belonging to different task performers belonging to a same product supply chain, and further comprising:

calculating executive task condition information of the different task executors based on the current conditions of the plurality of tasks;

determining current supply chain condition information for the supply chain based on the executive task condition information of the different task executors; and

determining a new task to perform based on the current supply chain condition information.

17. The method of claim 1, further comprising:

determining a next task to be started if the current status indicates that the task is completed.

18. The method of claim 1, wherein the task is a cloth printing task, and determining the execution status of the task from the target image comprises:

judging the execution status of the production task according to the printed image on the cloth.

19. The method of claim 18, wherein acquiring a target image of a task comprises:

acquiring a target image after the printing process on the production line is finished.

20. The method of claim 1, wherein the task is a logistics task, and

judging the execution state of the task according to the target image comprises the following steps:

judging the task object and/or the task object condition of the logistics task according to the target image,

the step of acquiring the progress status of the task comprises the following steps:

acquiring the progress status of the task according to the position of the logistics task.

21. A task tracking system, comprising:

the image acquisition equipment is used for acquiring a target image of the task;

the computing equipment is used for judging the execution condition of the task according to the target image;

a sensor for collecting a progress status of the task, wherein,

the computing device determines a current status of the task based on the execution status and the progress status.

22. The system of claim 21, wherein the computing device is further to:

judging whether the target image conforms to a specified image of the task.

23. The system of claim 22, wherein the computing device is further to:

extracting a target area image for the judgment from the acquired target image.

24. The system of claim 23, wherein the image acquisition device is to:

acquiring successive target images of the task, and

the computing device is further to:

detecting a motion area according to the established background model;

extracting the target area image from the motion areas detected from the continuous target images.

25. The system of claim 22, wherein the computing device is further to:

extracting the features of the target image;

comparing features extracted from the target image with features extracted from a prescribed image of the task; and

judging, according to the comparison result, whether the target image conforms to the specified image of the task.

26. The system of claim 25, wherein feature extracting the target image comprises at least one of:

extracting image feature points based on traditional computer vision (CV);

performing neural network feature extraction based on deep learning.

27. The system of claim 22, further comprising:

a database for storing production task specification images,

wherein the computing device is further to:

searching the database for a specified image that conforms to the target image.

28. The system of claim 27, wherein the computing device is further to:

searching order information of the task based on the matched specified image;

determining the current completion degree of the task based on the order information and the progress status.

29. The system of claim 21, wherein the sensor is a length or number counter.

30. The system of claim 21, further comprising:

and the client is used for acquiring and reporting the current state of the production task.

31. The system of claim 30, wherein the client receives the current status of the task being actively pushed or retrieves the current status of the task upon user query.

32. The system of claim 21, wherein the system simultaneously tracks current conditions of multiple tasks.

33. The system of claim 32, wherein the computing device is further to: and calculating system task condition information based on the current conditions of the plurality of tasks, and reporting the system task condition information.

34. The system of claim 21, wherein the system belongs to a particular supply chain, and the computing device receives current supply chain condition information and determines a new task to perform based on the current supply chain condition information.

35. The system of claim 21, wherein the computing device is further to:

in a case where the current condition indicates that the production task is completed, determining a next task to be initiated.

36. The system of claim 21, wherein the system is a production task tracking system, the system further comprising:

a production line for performing production tasks thereon.

37. The system of claim 36, wherein the system is a cloth printing production tracking system, and the computing device determines the execution status of the production task based on the printed image on the cloth.

38. The system of claim 37, wherein the image capture device is mounted for:

acquiring a target image after the printing process on the production line is finished.

39. The system of claim 21, wherein the system is a logistics task tracking system, and,

the computing device is to:

judging the task object and/or the task object condition of the logistics task according to the target image acquired by the image acquisition equipment,

and determining the progress condition of the logistics task according to the position of the logistics task acquired by the sensor.

40. A production task tracking platform comprising a server and a plurality of task tracking systems according to any one of claims 21 to 39 connected to the server via the Internet.

41. The platform of claim 40, wherein the server is to issue a computational model for image feature extraction to a computing device of the task tracking system.

42. The platform of claim 40, wherein the server is to:

acquiring a target image uploaded by the production task tracking system;

judging the execution state of the production task according to the target image; and

issuing the execution status to the computing equipment of the task tracking system.

43. The platform of claim 40, further comprising:

a client for:

receiving the current state of the task actively pushed by the server; and/or

acquiring the current state of the task from the server upon user query.

44. The platform of claim 40, wherein the plurality of task tracking systems connected to the server include a plurality of task performers performing homogeneous tasks, and

the server is configured to:

calculating executive task condition information of the plurality of task executors based on the current conditions of the plurality of tasks; and

providing the executive task condition information to a task issuing party so that the task issuing party selects a task executor to execute a new task based on the executive task condition information of the plurality of task executors.

45. The platform of claim 40, wherein the plurality of task tracking systems connected to the server include a plurality of task executors belonging to a same product supply chain, and

the server is configured to:

acquiring executive task condition information of the plurality of task executors calculated based on the current conditions of the plurality of tasks;

determining current supply chain condition information for the supply chain based on executive task condition information for the plurality of task executors;

determining a new task to perform based on the current supply chain condition information; and

issuing the determined new task to the corresponding task executor.

46. A computing device, comprising:

a processor; and

a memory having executable code stored thereon, which when executed by the processor, causes the processor to perform the method of any one of claims 1-20.

47. A non-transitory machine-readable storage medium having stored thereon executable code, which when executed by a processor of an electronic device, causes the processor to perform the method of any one of claims 1-20.

CN201910855076.5A 2019-09-10 2019-09-10 Task tracking method and system Active CN112558554B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910855076.5A CN112558554B (en) 2019-09-10 2019-09-10 Task tracking method and system


Publications (2)

Publication Number Publication Date
CN112558554A true CN112558554A (en) 2021-03-26
CN112558554B CN112558554B (en) 2024-10-18

Family

ID=75028971

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910855076.5A Active CN112558554B (en) 2019-09-10 2019-09-10 Task tracking method and system

Country Status (1)

Country Link
CN (1) CN112558554B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116027743A (en) * 2022-12-13 2023-04-28 三一汽车起重机械有限公司 Intelligent monitoring method, device and system for production line
CN117893178A (en) * 2024-03-15 2024-04-16 北京谷器数据科技有限公司 Real-time work progress display method, system, equipment and medium


Patent Citations (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08123946A (en) * 1994-10-27 1996-05-17 Toshiba Corp Plant monitoring device
JPH1034499A (en) * 1996-07-23 1998-02-10 Hitachi Ltd Acquisition method of equipment capacity information and production management system
JPH1094947A (en) * 1996-09-25 1998-04-14 Calsonic Corp Physical distribution system in factory
JP2002373012A (en) * 2001-06-14 2002-12-26 Ntn Corp Method for designing/working plant facility and supporting system
US20040073360A1 (en) * 2002-08-09 2004-04-15 Eric Foxlin Tracking, auto-calibration, and map-building system
JP2004206570A (en) * 2002-12-26 2004-07-22 Toppan Forms Co Ltd Production control system
WO2005050194A1 (en) * 2003-11-21 2005-06-02 Ralph Gregory Burke A device for inspecting and controlling the density of a moving web of cloth in a production line
JP2006031135A (en) * 2004-07-13 2006-02-02 Shirai Group Kk Waste processing tracking verification system
JP2007188373A (en) * 2006-01-16 2007-07-26 Hitachi Ltd Process progress management system
JP2007226466A (en) * 2006-02-22 2007-09-06 Open I Systems Kk Distribution system for cloth for kimono having pattern
JP2010055334A (en) * 2008-08-28 2010-03-11 Olympus Corp Method of controlling production system, and production system
JP2011003000A (en) * 2009-06-18 2011-01-06 Mitsubishi Electric Corp Line monitoring device
JP2012023414A (en) * 2010-07-12 2012-02-02 Kozo Keikaku Engineering Inc Simulation apparatus, simulation method, and program
JP2012221460A (en) * 2011-04-14 2012-11-12 Nippon Steel Corp Monitoring operation system, tracking support device and tracking support method
CN104707868A (en) * 2013-12-11 2015-06-17 东芝三菱电机产业系统株式会社 Data analysis device
US20150228078A1 (en) * 2014-02-11 2015-08-13 Microsoft Corporation Manufacturing line monitoring
JP2016018242A (en) * 2014-07-04 2016-02-01 オムロン株式会社 Production process analysis system
CN205620794U (en) * 2016-03-25 2016-10-05 福州职业技术学院 Tricot machine production data acquisition and management system
JP2018195083A (en) * 2017-05-17 2018-12-06 三菱電機株式会社 Mobile tracking system
DE102017128331A1 (en) * 2017-11-29 2019-05-29 Ioss Intelligente Optische Sensoren & Systeme Gmbh Bildaufnehmersystem
WO2019105818A1 (en) * 2017-11-29 2019-06-06 Ioss Intelligente Optische Sensoren & Systeme Gmbh Image sensor system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
SHEN Hao; ZHUANG Jianjun; ZHENG Qianying; WU Jianyao: "Design and Implementation of a Target Tracking System Based on the MeanShift Algorithm", Electronic Measurement Technology, no. 14, 23 July 2018 (2018-07-23) *


Also Published As

Publication number Publication date
CN112558554B (en) 2024-10-18


Legal Events

Date Code Title Description
2021-03-26 PB01 Publication
2021-04-13 SE01 Entry into force of request for substantive examination
2021-11-05 REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40046337

Country of ref document: HK

2024-10-18 GR01 Patent grant