
CN112070087B - Train number identification method and device with end bit and readable storage medium - Google Patents


Info

Publication number
CN112070087B
CN112070087B (application CN202010961523.8A)
Authority
CN
China
Prior art keywords
end position
position information
image
information
train
Prior art date
2020-09-14
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010961523.8A
Other languages
Chinese (zh)
Other versions
CN112070087A (en)
Inventor
张渝
彭建平
赵波
王祯
黄炜
章祥
马莉
王小伟
胡继东
史亚利
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chengdu Leading Software Technology Co ltd
Original Assignee
Chengdu Leading Software Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
2020-09-14
Filing date
2020-09-14
Publication date
2023-06-02
2020-09-14 Application filed by Chengdu Leading Software Technology Co ltd filed Critical Chengdu Leading Software Technology Co ltd
2020-09-14 Priority to CN202010961523.8A priority Critical patent/CN112070087B/en
2020-12-11 Publication of CN112070087A publication Critical patent/CN112070087A/en
2023-06-02 Application granted granted Critical
2023-06-02 Publication of CN112070087B publication Critical patent/CN112070087B/en
Status Active legal-status Critical Current
2040-09-14 Anticipated expiration legal-status Critical



Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/60Type of objects
    • G06V20/62Text, e.g. of license plates, overlay texts or captions on TV images
    • G06V20/63Scene text, e.g. street names
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/26Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
    • G06V10/267Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion by performing operations on regions, e.g. growing, shrinking or watersheds
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V30/00Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
    • G06V30/10Character recognition
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02TCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00Road transport of goods or passengers
    • Y02T10/10Internal combustion engine [ICE] based vehicles
    • Y02T10/40Engine management systems


Abstract

The invention relates to the technical field of image recognition, and particularly discloses a train number recognition method and device with end positions and a readable storage medium. The identification method comprises: acquiring a complete train number image of a train; identifying the train number information and end position information with a deep learning algorithm; identifying the end position information with a dedicated end position identification algorithm; and judging whether the end position information identified by the end position identification algorithm is correct. If it is correct, the train number information and the end position information identified by the end position identification algorithm are output; if it is incorrect, the final end position information is generated from the end position information identified by both the deep learning algorithm and the end position identification algorithm, and is output together with the train number information.

Description

Train number identification method and device with end bit and readable storage medium

Technical Field

The invention relates to the technical field of image recognition, in particular to a train number recognition method and device with end bits and a readable storage medium.

Background

In addition to a string of letters and numerals, a train's number includes end position information consisting of the Roman numerals I and II or the letters A and B. The end position information is located at the end of, above, or below the car number composed of letters and numbers. During train operation, the train number is identified by vehicle-mounted, wayside, ground read-out and other equipment; the identification result guides train maintenance and the lookup of the position information of the corresponding actual parts. Besides the character string composed of letters and numbers, daily train maintenance therefore also depends on the end position information.

In the current railway system, image recognition technology is used to automatically recognize the locomotive's train number, avoiding the situation where the train number cannot be read because an electronic tag is damaged or lost. However, whether conventional pattern recognition or deep-learning-based recognition is used, image recognition of the train number requires clear photographs taken at consistent angles. Owing to objective factors, and because the many train types differ greatly in the size, position and typeface of the end position characters, the accuracy of end position recognition still needs to be improved.

Disclosure of Invention

In view of the foregoing, the present application provides a train number identification method and apparatus with end bits, and a readable storage medium, which can solve or at least partially solve the above-mentioned problems.

In order to solve the above technical problems, the technical solution provided by the invention is a train number identification method with end positions, comprising the following steps:

acquiring a complete train number image of a train, wherein the complete train number image comprises train number information and end position information of the train;

identifying the car number information and the end position information in the complete car number image by adopting a deep learning algorithm;

identifying end position information in the complete vehicle number image by adopting an end position identification algorithm;

judging whether the end bit information identified by the end bit identification algorithm is correct, taking the end bit information as final end bit information in response to the end bit information being correct, and generating final end bit information according to the end bit information identified by the deep learning algorithm and the end bit identification algorithm in response to the end bit information being incorrect;

and outputting train number information and final end position information of the train.
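The branching in the steps above can be sketched in a few lines of Python; the function name and the fall-back rule are illustrative assumptions on my part, a simplification of the head/tail combination logic detailed later in the description, not the patent's exact procedure.

```python
def fuse_end_position(dl_end_pos, ep_end_pos, ep_is_correct):
    """Pick the final end position from the two recognizers' outputs.

    dl_end_pos: result of the deep learning algorithm
    ep_end_pos: result of the dedicated end position algorithm
    ep_is_correct: outcome of the correctness check on ep_end_pos
    """
    if ep_is_correct:
        # Check passed: use the dedicated algorithm's result directly.
        return ep_end_pos
    # Check failed: fall back on the deep learning result (a simplification
    # of the patent's head/tail reconciliation logic).
    return dl_end_pos
```

The train number string itself would be output alongside whichever end position this returns.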

Preferably, the method for identifying the end position information in the complete vehicle number image by adopting an end position identification algorithm comprises the following steps:

performing image graying treatment and image smoothing treatment on the complete vehicle number image to obtain a preprocessed image;

positioning and cutting an end position information area in the preprocessed image to obtain an end position area image;

performing binarization processing on the end position region image, and then performing feature extraction to obtain an end position feature image;

and identifying the terminal characters in the terminal feature image to obtain terminal information.

Preferably, the method for obtaining the preprocessed image by performing image graying processing and image smoothing processing on the complete car number image comprises the following steps:

assuming that f (i, j) is the gray value of the point with the coordinates (i, j) in the complete car number image, and R (i, j), G (i, j) and B (i, j) are the values on the red, green and blue components of the point with the coordinates (i, j), respectively, different weights are given to R, G, B, so as to obtain the gray value of each point in the complete car number image, wherein the formula is as follows:

f(i,j)=0.30×R(i,j)+0.59×G(i,j)+0.11×B(i,j);

smoothing the complete vehicle number image by median filtering: median filtering is applied to the two-dimensional sequence {x_(i,j)}, and the two-dimensional filter window can be expressed as:

y_(i,j) = Med_A { x_(i,j) }

where A is the filter window, of size 3×3.
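A minimal sketch of this preprocessing step in Python with NumPy, using the claim's grayscale weights and a 3×3 median window; the replicate padding at the image border is an assumption the claim does not specify.

```python
import numpy as np

def to_gray(rgb):
    # f(i,j) = 0.30*R(i,j) + 0.59*G(i,j) + 0.11*B(i,j)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return 0.30 * r + 0.59 * g + 0.11 * b

def median_filter_3x3(img):
    # Slide a two-dimensional 3x3 window A over the image and replace each
    # pixel by the median of the window (replicate-padded at the border).
    padded = np.pad(img, 1, mode="edge")
    out = np.empty(img.shape)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.median(padded[i:i + 3, j:j + 3])
    return out
```

A single bright outlier pixel (salt noise) surrounded by dark pixels is removed entirely, since eight of the nine window values are dark.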

Preferably, the method for locating and clipping the end position information area in the preprocessed image to obtain the end position area image includes:

performing edge recognition by adopting the Sobel operator, wherein the Sobel operator comprises two 3×3 matrices, one transverse and one longitudinal, which are convolved with the preprocessed image in the plane to obtain the transverse and longitudinal brightness difference approximations respectively;

if f is used to represent the preprocessed image, and Gx and Gy represent the transverse and longitudinal edge-detected images respectively, the formulas are as follows:

Gx = [[-1, 0, +1], [-2, 0, +2], [-1, 0, +1]] * f

Gy = [[-1, -2, -1], [0, 0, 0], [+1, +2, +1]] * f

where * denotes the plane convolution operation;

the transverse and longitudinal gradient approximations of each pixel in the edge-detected images are combined using the following formula to calculate the gradient magnitude:

G = sqrt(Gx^2 + Gy^2)

the gradient direction is calculated with the following formula:

Θ = arctan(Gy / Gx)
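Since the patent text does not print the kernel matrices, the sketch below uses the conventional Sobel kernels; the hand-rolled `valid` convolution is only for self-containment.

```python
import numpy as np

SOBEL_X = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]])  # transverse
SOBEL_Y = np.array([[-1, -2, -1], [0, 0, 0], [1, 2, 1]])  # longitudinal

def convolve2d_valid(img, kernel):
    # True plane convolution (kernel flipped), 'valid' region only.
    k = np.flipud(np.fliplr(kernel))
    h, w = img.shape
    out = np.empty((h - 2, w - 2))
    for i in range(h - 2):
        for j in range(w - 2):
            out[i, j] = np.sum(img[i:i + 3, j:j + 3] * k)
    return out

def sobel_gradients(img):
    gx = convolve2d_valid(img, SOBEL_X)
    gy = convolve2d_valid(img, SOBEL_Y)
    magnitude = np.sqrt(gx ** 2 + gy ** 2)   # G = sqrt(Gx^2 + Gy^2)
    direction = np.arctan2(gy, gx)           # gradient direction
    return magnitude, direction
```

On a vertical step edge the magnitude peaks along the step, which is what lets the end position character edges be located and cropped.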

preferably, the method for obtaining the terminal feature image includes:

an optimal threshold is obtained by an iterative method, and the end position area image is binarized with it to obtain a binarized image; the iterative method is as follows: the threshold is initialized to half the sum of the maximum and minimum gray levels; at each iteration the current threshold divides the end region image into foreground and background, the average gray level of all foreground pixels and the average gray level of all background pixels are computed, and the new threshold is half the sum of these two averages; when the new threshold equals the threshold computed in the previous iteration, the process has converged and the optimal threshold is obtained;

character features are extracted by calculating white-pixel proportions of the binarized image: the character is divided into 16 equal parts and the proportion of white pixels in each of the 16 parts is counted as the first 16 feature vectors; the proportions of white pixels in four areas in the vertical direction are counted as the last 4 feature vectors.
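A sketch of the iterative (isodata-style) thresholding and the 20-dimensional feature vector; interpreting the "16 equal parts" as a 4×4 grid and the four vertical areas as quarter-width bands is my assumption, and the convergence tolerance and iteration cap are safety additions not stated in the claim.

```python
import numpy as np

def iterative_threshold(gray, tol=1e-6, max_iter=100):
    # Initial threshold: half the sum of the maximum and minimum gray levels.
    t = (gray.max() + gray.min()) / 2.0
    for _ in range(max_iter):
        fg = gray[gray > t]                    # foreground pixels
        bg = gray[gray <= t]                   # background pixels
        new_t = (fg.mean() + bg.mean()) / 2.0  # half the sum of the means
        if abs(new_t - t) < tol:               # converged
            break
        t = new_t
    return t

def extract_features(binary):
    # 16 white-pixel ratios from a 4x4 grid plus 4 ratios from vertical
    # bands -> the 20-dimensional vector fed to the classifier.
    h, w = binary.shape
    feats = [binary[bi*h//4:(bi+1)*h//4, bj*w//4:(bj+1)*w//4].mean()
             for bi in range(4) for bj in range(4)]
    feats += [binary[:, bj*w//4:(bj+1)*w//4].mean() for bj in range(4)]
    return np.array(feats)
```

Note the feature count matches the 20 input nodes of the neural network described next in the document.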

Preferably, the method for identifying the terminal characters in the terminal feature image and obtaining terminal information includes:

identifying the cropped end position feature images of uniform size by a neural network; the neural network is composed of basic neurons and comprises an input layer, a hidden layer and an output layer, where the input layer comprises 20 nodes, the output layer comprises 4 nodes, and there is 1 hidden layer; the network is trained on end position information to obtain an end position classifier, and the end position information is identified by this classifier.
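A toy forward pass with the stated layer sizes (20 inputs, one hidden layer, 4 outputs); the hidden width of 16, the tanh/softmax activations, the class labels, and the random (untrained) weights are all assumptions, since the patent specifies only the node counts.

```python
import numpy as np

rng = np.random.default_rng(0)

# 20 input nodes -> 1 hidden layer (width 16, assumed) -> 4 output nodes,
# one per end position class (assumed here to be I, II, A, B).
W1, b1 = rng.normal(size=(20, 16)), np.zeros(16)
W2, b2 = rng.normal(size=(16, 4)), np.zeros(4)

def predict(features):
    h = np.tanh(features @ W1 + b1)       # hidden layer
    logits = h @ W2 + b2                  # output layer
    e = np.exp(logits - logits.max())     # numerically stable softmax
    return e / e.sum()

CLASSES = ["I", "II", "A", "B"]
probs = predict(np.ones(20))              # a 20-dim feature vector goes in
label = CLASSES[int(np.argmax(probs))]    # untrained, so the label is arbitrary
```

In the patent's scheme the weights would of course come from training on labeled end position feature vectors, not from a random generator.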

Preferably, the method for determining whether the end bit information identified by the end bit identification algorithm is correct, taking the end bit information as final end bit information in response to the end bit information being correct, and generating the final end bit information according to the end bit information identified by the deep learning algorithm and the end bit identification algorithm in response to the end bit information being incorrect includes:

judging whether the identified head end position information and the identified tail end position information of the same train are consistent,

responding to the coincidence of the locomotive end position information and the train tail end position information, and taking the first end position information obtained when the train passes through as final end position information;

in response to the head end position information not according with the tail end position information, the head and tail end position information is compared with that identified by the deep learning algorithm: if the two pieces of head end position information are identical, the tail end position information is wrong and is assigned the result corresponding to the head end position information; if the two pieces of tail end position information are identical, the head end position information is wrong and is assigned the result corresponding to the tail end position information; finally, the first end position information obtained when the train passes is taken as the final end position information.

The invention also provides a train number identification device with end positions, which comprises:

the complete train number image acquisition module is used for acquiring a complete train number image of the train, wherein the complete train number image comprises train number information and end position information of the train;

the car number end position information identification module is used for identifying car number information and end position information in the complete car number image by adopting a deep learning algorithm;

the terminal position information identification module is used for identifying terminal position information in the complete car number image by adopting a terminal position identification algorithm;

the terminal information judging module is used for judging whether the terminal information identified by the terminal identification algorithm is correct, taking the terminal information as final terminal information in response to the terminal information being correct, and generating final terminal information according to the terminal information identified by the deep learning algorithm and the terminal identification algorithm in response to the terminal information being wrong;

and the train number end position information output module is used for outputting train number information and final end position information of the train.

Preferably, the terminal information identification module includes:

the image preprocessing unit is used for carrying out image graying processing and image smoothing processing on the complete car number image to obtain a preprocessed image;

the terminal positioning unit is used for positioning and cutting a terminal information area in the preprocessed image to obtain a terminal area image;

the feature extraction unit is used for carrying out binarization processing on the end position region image and then carrying out feature extraction to obtain an end position feature image;

and the character recognition unit is used for recognizing the terminal characters in the terminal characteristic image to obtain terminal information.

Preferably, the terminal information judging module includes:

an end position judging unit for judging whether the identified head end position information and the identified tail end position information of the same train are consistent,

the matching processing unit is used for responding to the matching of the locomotive end position information and the train tail end position information and taking the first end position information obtained when the train passes through as final end position information;

and the non-coincidence processing unit is used for, in response to the head end position information not according with the tail end position information, comparing the head and tail end position information with that identified by the deep learning algorithm: if the two pieces of head end position information are identical, the tail end position information is wrong and is assigned the result corresponding to the head end position information; if the two pieces of tail end position information are identical, the head end position information is wrong and is assigned the result corresponding to the tail end position information; finally, the first end position information obtained when the train passes is taken as the final end position information.

The invention also provides a train number identification device with end positions, which comprises:

a memory for storing a computer program;

and the processor is used for executing the computer program to realize the steps of the train number identification method with the end bit.

The invention also provides a readable storage medium storing a computer program which when executed by a processor implements the steps of the train number identification method with end bits.

Compared with the prior art, the application has the following beneficial effects. According to the train number identification method with end positions, a complete train number image of the train is acquired; a deep learning algorithm identifies the train number information and end position information; an end position identification algorithm identifies the end position information; and whether the end position information identified by the end position identification algorithm is correct is judged. If it is correct, the train number information and that end position information are output; if it is incorrect, the final end position information is obtained from the end position information identified by both the deep learning algorithm and the end position identification algorithm, and is then output together with the train number information. This addresses the low accuracy of end position recognition in existing deep learning algorithms: combining the deep learning algorithm with the end position identification algorithm and cross-checking their results improves the recognition accuracy of the end position information.

Drawings

For a clearer description of the embodiments of the present invention, the drawings required in the embodiments are briefly described below. It is apparent that the drawings in the following description are only some embodiments of the present invention; other drawings may be obtained from them by those skilled in the art without inventive effort.

Fig. 1 is a schematic flow chart of a train number identification method with end position according to an embodiment of the present invention;

fig. 2 is a flowchart of a method for identifying end position information in a complete vehicle number image by using an end position identification algorithm according to an embodiment of the present invention;

FIG. 3 is a flowchart of a method for generating final end position information according to end position information identified by a deep learning algorithm and an end position identification algorithm according to an embodiment of the present invention;

fig. 4 is a schematic structural diagram of a train number identification device with end positions according to an embodiment of the present invention.

Detailed Description

The technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings. It is apparent that the described embodiments are only some, not all, of the embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by a person of ordinary skill in the art without inventive effort fall within the scope of the present invention.

In order to make the technical solution of the present invention better understood by those skilled in the art, the present invention will be further described in detail with reference to the accompanying drawings and specific embodiments.

Because of the wide variety of trains, the end position information differs greatly in height (position), character size and typeface. In practical applications, the lighting of end position photographs is uneven, so the photographed car numbers differ in size, angle, color and definition. In car number image processing, whether traditional pattern recognition or deep-learning-based car number recognition is used, certain requirements are imposed on the form and structure of the image. An algorithm adapted to one scene is therefore not applicable to other scenes, so the current recognition rate for end position information is not high.

As shown in fig. 1, an embodiment of the present invention provides a train number identification method with end positions, including:

s11: acquiring a complete train number image of a train, wherein the complete train number image comprises train number information and end position information of the train;

s12: identifying the car number information and the terminal position information in the complete car number image by adopting a deep learning algorithm;

s13: identifying end position information in the complete vehicle number image by adopting an end position identification algorithm;

s14: judging whether the end bit information identified by the end bit identification algorithm is correct, taking the end bit information as final end bit information in response to the end bit information being correct, and generating final end bit information according to the end bit information identified by the deep learning algorithm and the end bit identification algorithm in response to the end bit information being incorrect;

s15: and outputting train number information and final end position information of the train.

Specifically, to address the above problems, the invention builds on currently mature deep learning recognition technology and develops an algorithm for improving the recognition accuracy of the end position information of the car number; used together with the deep learning car number recognition algorithm, it improves the recognition accuracy of the end position information of the car number.

It should be noted that in S12 a deep learning algorithm is adopted to identify the car number information and end position information in the complete car number image. A currently mature deep learning algorithm is used, consisting mainly of image preprocessing, car number region locating and character recognition; such algorithms already achieve high accuracy in identifying the car number itself, without the end position information.

As shown in fig. 2, it should be noted that the method for identifying the end position information in the complete vehicle number image by using the end position identification algorithm in S13 includes:

s131: performing image graying treatment and image smoothing treatment on the complete vehicle number image to obtain a preprocessed image;

s132: positioning and cutting an end position information area in the preprocessed image to obtain an end position area image;

s133: performing binarization processing on the end position region image, and then performing feature extraction to obtain an end position feature image;

s134: and identifying the terminal characters in the terminal feature image to obtain terminal information.

Specifically, the method for identifying the end position information in the complete vehicle number image with the end position identification algorithm comprises image preprocessing, car number end position locating, end position feature extraction and character recognition. The image preprocessing is based on image graying and image smoothing, which reduce the influence of noise and poor shooting conditions and highlight the end position features of the car number. The end position region is located by edge detection; the cropped end position image is binarized and its features are extracted; a classifier trained on this feature information performs end position recognition, and the character recognition is carried out by a neural network.
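The four stages just listed compose into a single pipeline; every function below is an illustrative stub with hypothetical names and pass-through bodies, standing in for the graying/smoothing, Sobel locating, thresholding/feature-extraction, and neural network steps discussed in this description.

```python
# Illustrative stubs for the four stages of the end position algorithm;
# names and placeholder bodies are mine, not the patent's.
def preprocess(image):             # S131: image graying + median smoothing
    return image

def locate_end_region(image):      # S132: Sobel-based locating and cropping
    return image

def extract_end_features(region):  # S133: binarization + 20-dim features
    return region

def classify_end_chars(features):  # S134: neural network classifier
    return "I"                     # placeholder label

def end_position_algorithm(image):
    # S131 -> S132 -> S133 -> S134
    return classify_end_chars(
        extract_end_features(locate_end_region(preprocess(image))))
```

The point of the composition is that each stage consumes exactly what the previous one produces, so the stages can be developed and tested independently.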

In S131, the method for performing image graying processing and image smoothing processing on the complete car number image to obtain the preprocessed image includes:

assuming that f (i, j) is the gray value of the point with the coordinates (i, j) in the complete car number image, and R (i, j), G (i, j) and B (i, j) are the values on the red, green and blue components of the point with the coordinates (i, j), respectively, different weights are given to R, G, B, so as to obtain the gray value of each point in the complete car number image, wherein the formula is as follows:

f(i,j)=0.30×R(i,j)+0.59×G(i,j)+0.11×B(i,j);

Smoothing of the complete vehicle number image is carried out by median filtering: median filtering is applied to the two-dimensional sequence {x_(i,j)}, and the two-dimensional filter window can be expressed as:

y_(i,j) = Med_A { x_(i,j) }

where A is the filter window, of size 3×3.

Specifically, image graying preprocessing filters out the color information and so reduces the amount of computation. The specific processing is as follows: let f(i,j) be the gray value at (i,j) in the two-dimensional image, and let R(i,j), G(i,j), B(i,j) be the values of the red, green and blue components at the point with coordinates (i,j). According to the sensitivity of the human eye to R, G and B, different weights are given to R, G, B, and the gray value of each position of the image is obtained with the formula:

f(i,j) = 0.30×R(i,j) + 0.59×G(i,j) + 0.11×B(i,j).

In order to accurately identify the edge information of the car number end position, median filtering is then adopted to smooth the image and remove outliers and salt-and-pepper noise.

For the two-dimensional sequence {x_{i,j}}, median filtering with a two-dimensional filter window A can be expressed as:

y(i,j) = median{ x(i+r, j+s) | (r,s) ∈ A };

A is the filter window; the invention uses a filter window of size 3×3.
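The graying formula and the 3×3 median filter above can be sketched as follows; the NumPy implementation, the edge padding, and the toy input are illustrative assumptions, not code from the patent:

```python
import numpy as np

def to_gray(rgb):
    """Weighted graying: f(i,j) = 0.30*R + 0.59*G + 0.11*B."""
    return 0.30 * rgb[..., 0] + 0.59 * rgb[..., 1] + 0.11 * rgb[..., 2]

def median_filter_3x3(img):
    """Median filtering with a 3x3 window A; borders are edge-padded."""
    padded = np.pad(img, 1, mode="edge")
    out = np.empty_like(img, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.median(padded[i:i + 3, j:j + 3])
    return out

rgb = np.random.default_rng(0).integers(0, 256, (8, 8, 3)).astype(float)
smooth = median_filter_3x3(to_gray(rgb))
```

A real pipeline would apply this to the captured car number image; median filtering is chosen, as the text notes, because it removes isolated outliers and salt-and-pepper noise while preserving character edges better than mean filtering.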

It should be noted that, in S132, the method for locating and clipping the terminal information area in the preprocessed image to obtain the terminal area image includes:

performing edge recognition by adopting a Sobel operator, wherein the Sobel operator comprises two groups of 3×3 matrices, one transverse and one longitudinal, and performing plane convolution of these matrices with the preprocessed image to obtain the transverse and longitudinal brightness difference approximations respectively;

if f is used to represent the preprocessed image and Gx and Gy represent the images produced by transverse and longitudinal edge detection respectively, the formulas are as follows:

Gx = [[-1, 0, +1], [-2, 0, +2], [-1, 0, +1]] * f;

Gy = [[-1, -2, -1], [0, 0, 0], [+1, +2, +1]] * f;

the transverse and longitudinal gradient approximations of each pixel, obtained from the transverse and longitudinal edge detection, are combined to calculate the magnitude of the gradient with the following formula:

G = sqrt(Gx^2 + Gy^2);

the gradient direction is calculated with the following formula:

Θ = arctan(Gy / Gx).

Specifically, the car number end position is located by edge detection. In order to cut out an accurate car number area, the end position area must be identified, i.e. the edges of the end position characters. Edge detection relies on the abrupt change between character and background along the gradient direction, and the Sobel operator is adopted for edge recognition. The Sobel operator comprises two groups of 3×3 matrices, one transverse and one longitudinal; plane convolution of these matrices with the image yields the transverse and longitudinal brightness difference approximations. If f represents the original image and Gx and Gy represent the images produced by transverse and longitudinal edge detection respectively, the formulas are as follows:

Gx = [[-1, 0, +1], [-2, 0, +2], [-1, 0, +1]] * f;

Gy = [[-1, -2, -1], [0, 0, 0], [+1, +2, +1]] * f;

the transverse and longitudinal gradient approximations of each pixel can be combined to calculate the magnitude of the gradient:

G = sqrt(Gx^2 + Gy^2);

the gradient direction can then be calculated with:

Θ = arctan(Gy / Gx).
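The Sobel step above can be sketched as follows; the cross-correlation implementation (the convention under which Sobel kernels are normally written) and the ramp test image are assumptions for illustration:

```python
import numpy as np

KX = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)   # transverse kernel
KY = np.array([[-1, -2, -1], [0, 0, 0], [1, 2, 1]], float)   # longitudinal kernel

def filter2d(img, k):
    """Slide the 3x3 kernel over the image interior (cross-correlation)."""
    h, w = img.shape
    out = np.zeros((h - 2, w - 2))
    for i in range(h - 2):
        for j in range(w - 2):
            out[i, j] = np.sum(img[i:i + 3, j:j + 3] * k)
    return out

f = np.tile(np.arange(8.0), (8, 1))      # brightness ramp, left to right
gx, gy = filter2d(f, KX), filter2d(f, KY)
grad = np.sqrt(gx ** 2 + gy ** 2)        # G = sqrt(Gx^2 + Gy^2)
theta = np.arctan2(gy, gx)               # gradient direction
```

For this purely horizontal ramp, gx is uniformly nonzero while gy is zero, so the gradient points along the x axis; thresholding grad is what exposes the character strokes of the end position.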

in S133, binarizing the end region image, and extracting features to obtain an end feature image, where the method includes:

an optimal threshold is obtained by an iteration method and the end position area image is binarized to obtain a binarized image, wherein the iteration method for obtaining the optimal threshold is as follows: the threshold is initialized to half of the sum of the maximum gray level and the minimum gray level; at each iteration the end position area image is divided by the current threshold into foreground and background, the average gray level of all foreground pixels and the average gray level of all background pixels are computed, and the new threshold is half of the sum of the foreground and background average gray levels; if the new threshold equals the previously calculated one, the iteration has converged and the optimal threshold is obtained;

character features are extracted by calculating white-pixel proportion values of the binarized image: the character is divided into 16 equal parts and the proportion of white pixels of the binarized image in each of the 16 parts is counted, giving 16 feature values; the proportions of white pixels in four areas in the vertical direction are counted as the last 4 feature values, for a 20-dimensional feature vector in total.

Specifically, the invention obtains the optimal threshold with an iteration method and then binarizes the image: pixels whose gray value is larger than the threshold are set to white, the others to black. The iteration method for finding the optimal threshold is as follows: the threshold is initialized to half of the sum of the maximum and minimum gray levels; at each iteration the image is divided by the current threshold into foreground and background, the average gray level of all foreground pixels and that of all background pixels are computed, and the new threshold is half of their sum; when the new threshold equals the previously calculated one, the iteration has converged and the optimal threshold is obtained.
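The iterative thresholding described above can be sketched as follows; the convergence tolerance and the toy bimodal image are assumptions:

```python
import numpy as np

def iterative_threshold(img, tol=0.5):
    """Iterative optimal threshold: start at (max+min)/2, then repeatedly
    set the threshold to half the sum of the foreground mean and the
    background mean until it stops changing."""
    t = (img.max() + img.min()) / 2.0
    while True:
        fg, bg = img[img > t], img[img <= t]
        t_new = (fg.mean() + bg.mean()) / 2.0
        if abs(t_new - t) <= tol:
            return t_new
        t = t_new

img = np.concatenate([np.full(50, 40.0), np.full(50, 200.0)])  # toy bimodal gray levels
t = iterative_threshold(img)
binary = (img > t).astype(np.uint8)  # 1 = white (above threshold), 0 = black
```

On a clearly bimodal end position image the iteration settles between the character and background gray levels after a few passes; a small tolerance stands in for the strict "equals the previous threshold" test of the text.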

The invention extracts character features from the white-pixel proportion values of the binarized image. The specific implementation is as follows: the character is divided into 16 equal parts, and the proportion of white pixels of the binarized image in each part is counted, giving 16 feature values; the white-pixel proportions of the four areas in the vertical direction are counted as the last four feature values.
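The 16 + 4 = 20 white-pixel ratios can be computed as below; the assumption that the 16 equal parts form a 4×4 grid is ours, since the text does not state the block layout:

```python
import numpy as np

def char_features(binary):
    """20-dimensional feature vector: white-pixel ratios of 16 equal blocks
    (assumed 4x4 grid) plus the ratios of 4 vertical strips."""
    h, w = binary.shape
    feats = [binary[bi * h // 4:(bi + 1) * h // 4,
                    bj * w // 4:(bj + 1) * w // 4].mean()
             for bi in range(4) for bj in range(4)]
    feats += [binary[:, bj * w // 4:(bj + 1) * w // 4].mean()
              for bj in range(4)]
    return np.array(feats)

char = np.zeros((16, 16))
char[:, 8:] = 1.0                 # toy character: right half white
v = char_features(char)           # length 20, matching the 20 input nodes
```

The 20 values line up with the 20 input nodes of the classifier described in the next step.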

It should be noted that, in S134, the method for identifying the terminal character in the terminal feature image and obtaining the terminal information includes:

identifying the cut end position feature images, which are of identical size, with a neural network; the neural network is composed of basic neurons and comprises an input layer, one hidden layer and an output layer, the input layer comprising 20 nodes and the output layer 4 nodes; the network with its single hidden layer is trained on end position information to obtain an end position classifier, and the end position information is identified by this classifier.

Specifically, end position characters are recognized by applying the neural network to end position image slices of identical size. The neural network here is a network of basic neurons comprising an input layer, one hidden layer and an output layer. The input layer comprises 20 nodes and the output layer comprises 4 nodes; after training on car number end positions with the single hidden layer, the classifier for the car number end position is obtained.
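A minimal forward pass with the stated 20-input, 4-output topology might look like this; the hidden width of 16, the sigmoid activation, the random weights, and the class order are all assumptions (a real classifier would use weights learned from labelled end position samples):

```python
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.normal(0, 0.1, (20, 16)), np.zeros(16)  # input 20 -> hidden 16 (width assumed)
W2, b2 = rng.normal(0, 0.1, (16, 4)), np.zeros(4)    # hidden 16 -> output 4

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

CLASSES = ["I", "II", "A", "B"]  # one output node per end position character (order assumed)

def classify(features):
    """Forward pass; the arg-max output node gives the end position class."""
    hidden = sigmoid(features @ W1 + b1)
    out = sigmoid(hidden @ W2 + b2)
    return CLASSES[int(np.argmax(out))]

label = classify(rng.random(20))
```

The 20-dimensional input is exactly the white-pixel feature vector of the previous step, so feature extraction and classification compose directly.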

As shown in fig. 3, it should be noted that, in S14, the method for judging whether the end bit information identified by the end bit identification algorithm is correct, taking the end bit information as the final end bit information in response to it being correct, and generating the final end bit information from the end bit information identified by the deep learning algorithm and the end bit identification algorithm in response to it being incorrect, comprises:

S141: judging whether the identified head end position information and tail end position information of the same train are mutually consistent (i.e. form a matching pair such as I with II or A with B); in response to their being consistent, entering S145;

S142: in response to the head end position information and the tail end position information not being consistent, comparing them with the head end position information and tail end position information identified by the deep learning algorithm;

S143: if the two pieces of head end position information are the same, the tail end position information is wrong; the tail end position information is assigned the result corresponding to the head end position information, and the flow enters S145;

S144: if the two pieces of tail end position information are the same, the head end position information is wrong; the head end position information is assigned the result corresponding to the tail end position information, and the flow enters S145;

S145: taking the first end position information obtained when the train passes as the final end position information.

Specifically, when determining the train end position, the whole passage of the train is captured. End positions generally exist at both the head and the tail of the train number: the head character is I or A and the tail character is II or B, i.e. I corresponds to II and A corresponds to B. Based on this, a dedicated logic judgment is added to the algorithm to further confirm the end position. The specific method is as follows: after the head and tail end positions of the train are identified by the trained classifier, they are compared. If they form a correct pair, e.g. I at the head and II at the tail, the end position recognition is accurate and the first end position identified during the train's passage is output. Otherwise there is a recognition error at either the head or the tail; in this case the end position results of the deep learning recognition are compared with those identified by the classifier. If the head results agree, the tail recognition is wrong and the tail result is assigned the counterpart of the head result; if the tail results agree, the head recognition is wrong and the head result is assigned the counterpart of the tail result. Finally, the first end position information identified during the train's passage is output.
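Steps S141 to S145 can be sketched as plain control flow; the function name and argument convention are illustrative, while the pairing itself (I with II, A with B) is stated in the text:

```python
# I at the head corresponds to II at the tail, and A corresponds to B.
COUNTERPART = {"I": "II", "II": "I", "A": "B", "B": "A"}

def final_end_position(head_cls, tail_cls, head_dl, tail_dl):
    """Resolve the final end position. head_cls/tail_cls come from the
    end position classifier, head_dl/tail_dl from the deep learning branch."""
    if COUNTERPART.get(head_cls) == tail_cls:   # S141: head and tail form a pair
        return head_cls                         # S145: keep the first result
    if head_cls == head_dl:                     # S143: head confirmed -> tail was wrong
        tail_cls = COUNTERPART[head_cls]
    elif tail_cls == tail_dl:                   # S144: tail confirmed -> head was wrong
        head_cls = COUNTERPART[tail_cls]
    return head_cls                             # S145: first end position of the pass

print(final_end_position("I", "I", "I", "II"))  # head confirmed, tail corrected
```

Returning the head-side result stands in for "the first end position information obtained when the train passes", since the head is captured first.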

As shown in fig. 4, the embodiment of the invention further provides a train number identification device with end positions, which comprises:

the complete train number image acquisition module 21 is used for acquiring a complete train number image of the train, wherein the complete train number image comprises train number information and end position information of the train;

the car number end position information identification module 22 is used for identifying the car number information and the end position information in the complete car number image by adopting a deep learning algorithm;

the end position information identification module 23 is used for identifying the end position information in the complete car number image by adopting an end position identification algorithm;

the end position information judging module 24 is configured to judge whether the end position information identified by the end position identification algorithm is correct, take that end position information as the final end position information in response to it being correct, and generate the final end position information according to the end position information identified by the deep learning algorithm and the end position identification algorithm in response to it being incorrect;

the train number end position information output module 25 is used for outputting the train number information and the final end position information of the train.

The end position information identification module 23 includes:

the image preprocessing unit is used for carrying out image graying processing and image smoothing processing on the complete vehicle number image to obtain a preprocessed image;

the terminal positioning unit is used for positioning and cutting a terminal information area in the preprocessed image to obtain a terminal area image;

the feature extraction unit is used for carrying out binarization processing on the end position region image and then carrying out feature extraction to obtain an end position feature image;

and the character recognition unit is used for recognizing the terminal characters in the terminal characteristic image to obtain terminal information.

It should be noted that the end position information judging module 24 includes:

the end position judging unit is used for judging whether the identified head end position information and tail end position information of the same train are consistent;

the consistency processing unit is used for, in response to the head end position information and the tail end position information being consistent, taking the first end position information obtained when the train passes as the final end position information;

the inconsistency processing unit is used for, in response to the head end position information and the tail end position information not being consistent, comparing them with the head end position information and tail end position information identified by the deep learning algorithm: if the two pieces of head end position information are the same, the tail end position information is wrong and is assigned the result corresponding to the head end position information; if the two pieces of tail end position information are the same, the head end position information is wrong and is assigned the result corresponding to the tail end position information; finally, the first end position information obtained when the train passes is taken as the final end position information.

The embodiment of the invention also provides a train number identification device with end positions, which comprises: a memory for storing a computer program; and the processor is used for executing the computer program to realize the steps of the train number identification method with the end bit.

The embodiment of the invention also provides a readable storage medium, wherein the readable storage medium stores a computer program, and the computer program realizes the steps of the train number identification method with the end bit when being executed by a processor.

The description of the features of the embodiment corresponding to fig. 4 may be referred to the related description of the embodiment corresponding to fig. 1-3, and will not be repeated here.

The train number identification method and device with end position and the readable storage medium provided by the embodiments of the invention are described in detail above. The embodiments are described in a progressive manner; each embodiment focuses on its differences from the others, and for the parts they share, the embodiments may refer to one another. For the device disclosed in the embodiments, since it corresponds to the method disclosed in the embodiments, its description is relatively brief and the relevant points can be found in the description of the method. It should be noted that various modifications and adaptations of the invention apparent to those skilled in the art can be made without departing from its principles, and such modifications and adaptations are intended to fall within the scope of the invention as defined by the following claims.

Those of skill would further appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both, and that the various illustrative elements and steps are described above generally in terms of functionality in order to clearly illustrate the interchangeability of hardware and software. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.

The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. The software module may reside in Random Access Memory (RAM), memory, Read-Only Memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.

Claims (10)

1. The train number identification method with the end position is characterized by comprising the following steps of:

acquiring a complete train number image of a train, wherein the complete train number image comprises train number information and end position information of the train;

identifying the car number information and the end position information in the complete car number image by adopting a deep learning algorithm;

identifying end position information in the complete vehicle number image by adopting an end position identification algorithm;

judging whether the end bit information identified by the end bit identification algorithm is correct, taking the end bit information as final end bit information in response to the end bit information being correct, and generating final end bit information according to the end bit information identified by the deep learning algorithm and the end bit identification algorithm in response to the end bit information being incorrect;

and outputting train number information and final end position information of the train.

2. The method for identifying a train number with an end position according to claim 1, wherein the method for identifying the end position information in the complete train number image by using an end position identification algorithm comprises the following steps:

performing image graying treatment and image smoothing treatment on the complete vehicle number image to obtain a preprocessed image;

positioning and cutting an end position information area in the preprocessed image to obtain an end position area image;

performing binarization processing on the end position region image, and then performing feature extraction to obtain an end position feature image;

and identifying the terminal characters in the terminal feature image to obtain terminal information.

3. The method for recognizing train numbers with end positions according to claim 2, wherein the method for performing image graying processing and image smoothing processing on the complete train number image to obtain the preprocessed image comprises:

assuming that f (i, j) is the gray value of the point with the coordinates (i, j) in the complete car number image, and R (i, j), G (i, j) and B (i, j) are the values on the red, green and blue components of the point with the coordinates (i, j), respectively, different weights are given to R, G, B, so as to obtain the gray value of each point in the complete car number image, wherein the formula is as follows:

f(i,j)=0.30×R(i,j)+0.59×G(i,j)+0.11×B(i,j);

smoothing the complete car number image by median filtering; for the two-dimensional sequence {x_{i,j}}, median filtering with a two-dimensional filter window A can be expressed as:

y(i,j) = median{ x(i+r, j+s) | (r,s) ∈ A };

A is the filter window, of size 3×3.

4. The method for identifying a train number with an end position according to claim 2, wherein the method for locating and clipping the end position information area in the preprocessed image to obtain the end position area image comprises the following steps:

performing edge recognition by adopting a Sobel operator, wherein the Sobel operator comprises two groups of 3×3 matrices, one transverse and one longitudinal, and performing plane convolution of these matrices with the preprocessed image to obtain the transverse and longitudinal brightness difference approximations respectively;

if f is used to represent the preprocessed image and Gx and Gy represent the images produced by transverse and longitudinal edge detection respectively, the formulas are as follows:

Gx = [[-1, 0, +1], [-2, 0, +2], [-1, 0, +1]] * f;

Gy = [[-1, -2, -1], [0, 0, 0], [+1, +2, +1]] * f;

the transverse and longitudinal gradient approximations of each pixel, obtained from the transverse and longitudinal edge detection, are combined to calculate the magnitude of the gradient with the following formula:

G = sqrt(Gx^2 + Gy^2);

the gradient direction is calculated with the following formula:

Θ = arctan(Gy / Gx).

5. the method for identifying a train number with an end position according to claim 2, wherein the method for obtaining an end position feature image by performing binarization processing and feature extraction on the end position region image comprises the steps of:

an optimal threshold is obtained by an iteration method and the end position area image is binarized to obtain a binarized image, wherein the iteration method for obtaining the optimal threshold is as follows: the threshold is initialized to half of the sum of the maximum gray level and the minimum gray level; at each iteration the end position area image is divided by the current threshold into foreground and background, the average gray level of all foreground pixels and the average gray level of all background pixels are computed, and the new threshold is half of the sum of the foreground and background average gray levels; if the new threshold equals the previously calculated one, the iteration has converged and the optimal threshold is obtained;

character features are extracted by calculating white-pixel proportion values of the binarized image: the character is divided into 16 equal parts and the proportion of white pixels of the binarized image in each of the 16 parts is counted, giving 16 feature values; the proportions of white pixels in four areas in the vertical direction are counted as the last 4 feature values.

6. The method for identifying a train number with an end position according to claim 2, wherein the method for identifying an end position character in the end position feature image to obtain end position information comprises:

identifying the cut end position feature images, which are of identical size, with a neural network; the neural network is composed of basic neurons and comprises an input layer, one hidden layer and an output layer, the input layer comprising 20 nodes and the output layer 4 nodes; the network with its single hidden layer is trained on end position information to obtain an end position classifier, and the end position information is identified by this classifier.

7. The method for identifying a train number with an end bit according to claim 1, wherein the method for judging whether the end bit information identified by the end bit identification algorithm is correct, taking the end bit information as the final end bit information in response to it being correct, and generating the final end bit information according to the end bit information identified by the deep learning algorithm and the end bit identification algorithm in response to it being incorrect, comprises:

judging whether the identified head end position information and the identified tail end position information of the same train are consistent,

in response to the head end position information and the tail end position information being consistent, taking the first end position information obtained when the train passes as the final end position information;

in response to the head end position information and the tail end position information not being consistent, comparing them with the head end position information and tail end position information identified by the deep learning algorithm: if the two pieces of head end position information are the same, the tail end position information is wrong and is assigned the result corresponding to the head end position information; if the two pieces of tail end position information are the same, the head end position information is wrong and is assigned the result corresponding to the tail end position information; finally, the first end position information obtained when the train passes is taken as the final end position information.

8. A train number identification device with end position, the device comprising:

the complete train number image acquisition module is used for acquiring a complete train number image of the train, wherein the complete train number image comprises train number information and end position information of the train;

the car number end position information identification module is used for identifying car number information and end position information in the complete car number image by adopting a deep learning algorithm;

the terminal position information identification module is used for identifying terminal position information in the complete car number image by adopting a terminal position identification algorithm;

the terminal information judging module is used for judging whether the terminal information identified by the terminal identification algorithm is correct, taking the terminal information as final terminal information in response to the terminal information being correct, and generating final terminal information according to the terminal information identified by the deep learning algorithm and the terminal identification algorithm in response to the terminal information being wrong;

and the train number end position information output module is used for outputting train number information and final end position information of the train.

9. Train number recognition device of area end position, characterized in that includes:

a memory for storing a computer program;

a processor for executing the computer program to implement the steps of the end-capped train number identification method as claimed in any one of claims 1 to 7.

10. A readable storage medium, characterized in that the readable storage medium stores a computer program which, when executed by a processor, implements the steps of the train number identification method with end bits as claimed in any one of claims 1 to 7.

CN202010961523.8A 2020-09-14 2020-09-14 Train number identification method and device with end bit and readable storage medium Active CN112070087B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010961523.8A CN112070087B (en) 2020-09-14 2020-09-14 Train number identification method and device with end bit and readable storage medium


Publications (2)

Publication Number Publication Date
CN112070087A CN112070087A (en) 2020-12-11
CN112070087B true CN112070087B (en) 2023-06-02

Family

ID=73695887

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010961523.8A Active CN112070087B (en) 2020-09-14 2020-09-14 Train number identification method and device with end bit and readable storage medium

Country Status (1)

Country Link
CN (1) CN112070087B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113609983B (en) * 2021-08-05 2024-11-22 武汉黎赛科技有限责任公司 Locomotive tracking and positioning method, device, electronic equipment and storage medium

Citations (7)

Publication number Priority date Publication date Assignee Title
JPH0475187A (en) * 1990-07-17 1992-03-10 Matsushita Electric Ind Co Ltd In-list character recognizing device
JP2002344952A (en) * 2001-05-22 2002-11-29 Fujitsu Ltd Vehicle number recognition device and method
CN105354574A (en) * 2015-12-04 2016-02-24 山东博昂信息科技有限公司 Vehicle number recognition method and device
CN106940884A (en) * 2015-12-15 2017-07-11 北京康拓红外技术股份有限公司 A kind of EMUs operation troubles image detecting system and method comprising depth information
WO2018233038A1 (en) * 2017-06-23 2018-12-27 平安科技(深圳)有限公司 Deep learning-based method, apparatus and device for recognizing license plate, and storage medium
CN109840523A (en) * 2018-12-29 2019-06-04 南京睿速轨道交通科技有限公司 A kind of municipal rail train Train number recognition algorithm based on image procossing
CN110378332A (en) * 2019-06-14 2019-10-25 上海咪啰信息科技有限公司 A kind of container terminal case number (CN) and Train number recognition method and system

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
CN107066933B (en) * 2017-01-25 2020-06-05 武汉极目智能技术有限公司 Road sign identification method and system


Non-Patent Citations (1)

Title
Research on a Train Number Recognition System Based on Image Processing; Zhang Xiaoli; Cao Ning; Ren Jie; Electronic World (No. 11); full text *

Also Published As

Publication number Publication date
CN112070087A (en) 2020-12-11

Similar Documents

Publication Publication Date Title
CN109657632B (en) 2022-05-06 A method of lane line detection and recognition
CN104778721B (en) 2017-08-11 The distance measurement method of conspicuousness target in a kind of binocular image
CN112686812B (en) 2023-08-29 Bank card inclination correction detection method and device, readable storage medium and terminal
CN109740595B (en) 2022-12-30 Oblique vehicle detection and tracking system and method based on machine vision
CN108596166A (en) 2018-09-28 A kind of container number identification method based on convolutional neural networks classification
CN108615034A (en) 2018-10-02 A kind of licence plate recognition method that template matches are combined with neural network algorithm
CN108694393A (en) 2018-10-23 A kind of certificate image text area extraction method based on depth convolution
US20130294653A1 (en) 2013-11-07 Methods and systems for optimized parameter selection in automated license plate recognition
CN107103317A (en) 2017-08-29 Fuzzy license plate image recognition algorithm based on image co-registration and blind deconvolution
CN107066933A (en) 2017-08-18 A kind of road sign recognition methods and system
CN107122777A (en) 2017-09-01 A kind of vehicle analysis system and analysis method based on video file
CN110414506B (en) 2022-09-06 Bank card number automatic identification method based on data augmentation and convolution neural network
CN105205480A (en) 2015-12-30 Complex scene human eye locating method and system
CN111753749A (en) 2020-10-09 A Lane Line Detection Method Based on Feature Matching
CN109460722B (en) 2021-11-23 Intelligent license plate recognition method
CN110569801B (en) 2023-06-30 Identification method for key content of driving license
CN112528917A (en) 2021-03-19 Zebra crossing region identification method and device, electronic equipment and storage medium
CN110060221B (en) 2023-01-17 A Bridge Vehicle Detection Method Based on UAV Aerial Images
CN112101058A (en) 2020-12-18 Method and device for automatically identifying test paper bar code
EP4287137A1 (en) 2023-12-06 Method, device, equipment, storage media and system for detecting drivable space of road
CN111046866B (en) 2023-04-18 Method for detecting RMB crown word number region by combining CTPN and SVM
CN112070087B (en) 2023-06-02 Train number identification method and device with end bit and readable storage medium
CN115841633A (en) 2023-03-24 Power tower and power line associated correction power tower and power line detection method
CN108520252B (en) 2022-03-01 Road sign recognition method based on generalized Hough transform and wavelet transform
CN112364844B (en) 2021-05-18 Data acquisition method and system based on computer vision technology

Legal Events

Date Code Title Description
2020-12-11 PB01 Publication
2020-12-29 SE01 Entry into force of request for substantive examination
2023-06-02 GR01 Patent grant