CN109460046B - A method for unmanned aerial vehicle natural landmark recognition and autonomous landing - Google Patents
- Granted: 2021-08-06
Info
-
Publication number
- CN109460046B (application CN201811213147.3A)
Authority
- CN (China)
Prior art keywords
- image, landing, coordinates, area, point
Prior art date
- 2018-10-17
Legal status
- Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/04—Control of altitude or depth
- G05D1/06—Rate of change of altitude or depth
- G05D1/0607—Rate of change of altitude or depth specially adapted for aircraft
- G05D1/0653—Rate of change of altitude or depth specially adapted for aircraft during a phase of take-off or landing
- G05D1/0676—Rate of change of altitude or depth specially adapted for aircraft during a phase of take-off or landing specially adapted for landing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
- G06F18/241—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
- G06F18/2413—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on distances to training or reference patterns
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/44—Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
- G06V10/443—Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components by matching or filtering
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/48—Extraction of image or video features by mapping characteristic values of the pattern into a parameter space, e.g. Hough transformation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/74—Image or video pattern matching; Proximity measures in feature spaces
- G06V10/75—Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
- G06V10/757—Matching configurations of points or features
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
- G06V20/13—Satellite images
Abstract
A method for natural landmark recognition and autonomous landing of an unmanned aerial vehicle (UAV) belongs to the technical field of machine vision navigation. The method determines a landing area on a satellite digital map according to given pre-landing coordinates and uses the UAV to capture aerial images at those coordinates. Both the satellite digital map and the aerial image are subjected to filtering, graying, binarization, edge feature extraction and the Hough transform to extract continuous geometric curves, and the two are matched with a weighted Hausdorff distance matching algorithm. The coordinates of the centroid of the landing area in the aerial image relative to the UAV are computed by Green's theorem, the spatial coordinates of the centroid are computed from the projection relationship, and the UAV is guided to land autonomously at those spatial coordinates. The method ensures that the UAV autonomously identifies the best landing point within a specified range and lands accurately, compensating for the large autonomous landing error of GPS navigation and improving the safety and reliability of autonomous landing.
Description
Technical Field
The invention belongs to the technical field of machine vision navigation, and particularly relates to a natural landmark identification and autonomous landing method for an unmanned aerial vehicle.
Background
In recent years, with the development of micro inertial navigation systems, flight control systems, micro-electromechanical systems and novel materials, research on micro unmanned aerial vehicles has progressed greatly. Rotary-wing micro UAVs offer good flexibility, compact structure, low cost and fast data acquisition, and their applications cover many fields, including pesticide spraying, geological surveying, search and rescue, cargo transportation and mapping. Because human operators are limited in response speed and efficiency when acquiring information, such tasks should be completed by the UAV autonomously as far as possible: automatic take-off and landing, path planning, obstacle avoidance and terrain-following flight are realized through preset programs or the UAV's own autonomous planning, ensuring the accuracy and reliability of the operation process.
For autonomous landing, most current UAVs rely on GPS navigation: a GPS sensor carried by the UAV records the geographic coordinates of the airframe at take-off, or a geographic coordinate is specified manually, and on landing the GPS positioning system guides the UAV to hover over the recorded coordinates and descend. GPS navigation suffers from strong interference by non-air media and low positioning accuracy, so in remote areas, or areas with many obstructions, the autonomous landing error is large and the landing task cannot be completed accurately.
Machine-vision-based autonomous landing is one approach to overcoming the inaccurate positioning of GPS, and autonomous landing based on artificial landmarks is currently the variant most widely applied to rotor UAVs. As UAVs are applied ever more widely, however, the requirements on their environmental adaptability grow. Some tasks require the UAV to land where artificial landmarks cannot be placed, or even to autonomously find a suitable landing site within a specified area, which requires the ability to recognize natural landmarks. Therefore, to provide accurate navigation information to the UAV and complete such autonomous landing tasks, a natural landmark recognition and autonomous landing method is urgently needed.
Disclosure of Invention
The invention aims to provide a UAV natural landmark recognition and autonomous landing method based on machine vision and a satellite digital map, so as to solve the problems existing in the prior art.
The natural landmark identification and autonomous landing method for the unmanned aerial vehicle comprises the following steps:
1.1 According to the given pre-landing coordinates (X_0, Y_0, Z_0), determine a landing area P with a convex polygonal outline on a satellite digital map; first apply filtering, graying and binarization to the image of area P, then perform edge feature extraction, remove some noise points and retain the main region-based edge features, and finally extract continuous geometric curves through the Hough transform, obtaining contour curve I of area P on the satellite digital map and the reference image A; the binarization uses the maximum inter-class variance (Otsu) method;
1.2 Fly the UAV to above the given pre-landing coordinates, apply filtering, graying and binarization to the aerial image, then perform edge feature extraction, remove some noise points and retain the main region-based edge features, and finally extract continuous geometric curves through the Hough transform, obtaining contour curve II of area P in the aerial image and the measured image B; the binarization likewise uses the maximum inter-class variance method;
1.3, matching the reference image A obtained in the step 1.1 with the actual measurement image B obtained in the step 1.2, and confirming a landing area P in the aerial image of the unmanned aerial vehicle; the image matching adopts a weighted Hausdorff distance matching algorithm, and comprises the following steps:
1.3.1 in the reference image A and the measured image B, perform distance transformation of the feature point sets in two-dimensional space using the 3-4 DT algorithm to obtain the image distance transformation matrices J_A and J_B;
1.3.2 extract the branch points in the reference image A and the measured image B, and store them in matrices A and B respectively;
1.3.3 according to J_A, J_B, A and B, calculate the weighted Hausdorff distance:

H(A, B) = max(h_WHD(A, B), h_WHD(B, A))

h_WHD(A, B) = (1/N_a) · Σ_{a∈A} d(a, B)

wherein: A and B are the two point sets; N_a is the total number of feature points in point set A; a is a feature point belonging to A; d(a, B) is the distance from feature point a in point set A to point set B; h_WHD(A, B) denotes the directed distance from point set A to point set B; h_WHD(B, A) denotes the directed distance from point set B to point set A;
the point with the minimum Hausdorff distance is the final matching point, so that the preliminary positioning information is obtained;
1.3.4 use a least squares algorithm to put all matching point pairs into one-to-one correspondence, obtaining more accurate position information;
1.4 Establish a two-dimensional plane rectangular coordinate system with the UAV camera as the coordinate origin, and calculate the coordinates (x_c, y_c) of the centroid of area P in the UAV aerial image relative to the UAV according to Green's theorem;
1.5 Calculate the spatial coordinates (X_c, Y_c, Z_c) of the centroid of area P from the projection relationship, specifically comprising:
1.5.1 calculate the ground resolution GSD:

GSD = p·H / f

wherein: GSD denotes the ground resolution (m); f is the lens focal length (mm); p is the pixel size of the imaging sensor (mm); H is the corresponding flight height of the UAV (m);
1.5.2 calculate the actual ground distance of the image diagonal; from the image width w and height h (in pixels), the ground distance L spanned by the image diagonal is:

L = GSD·√(w² + h²)

wherein: GSD denotes the ground resolution (m); w is the image width; h is the image height;
1.5.3 from the longitude and latitude of the image center point and the distance D and direction angle θ_0 of the centroid of area P relative to the center point, obtain the geographic coordinates of the centroid of area P:

E_c = R_j + (R_i − R_j)·(90 − Lat_a)/90
E_d = E_c·cos(Lat_a·π/180)
Lon_c = Lon_a + (D·sin θ_0)/E_d·(180/π)
Lat_c = Lat_a + (D·cos θ_0)/E_c·(180/π)

wherein: θ_0 ∈ (0, 2π); Lon_a is the longitude of the image center point; Lat_a is the latitude of the image center point; R_i is the equatorial radius, taken as 6378137 m; R_j is the polar radius, taken as 6356725 m;
1.5.4 convert the geographic coordinates into spatial coordinates, obtaining the spatial coordinates (X_c, Y_c, Z_c) of the centroid of area P:

X_c = (N + H)·cos(Lat)·cos(Lon)
Y_c = (N + H)·cos(Lat)·sin(Lon)
Z_c = [N·(1 − e²) + H]·sin(Lat)

wherein: N is the radius of curvature in the prime vertical; e is the first eccentricity of the reference ellipsoid; Lon is the longitude; Lat is the latitude; H is the elevation;
1.6 The UAV flies to above the spatial coordinates (X_c, Y_c, Z_c) and lands in the vertical direction.
The method ensures that the UAV autonomously identifies the optimal landing point within the designated range and lands accurately; it compensates for the large autonomous landing error under GPS navigation and improves the safety and reliability of autonomous landing.
Drawings
FIG. 1 is a flowchart of a method for identifying natural landmarks and autonomously landing for unmanned aerial vehicles
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below.
Step 1: according to the given pre-landing coordinates (X_0, Y_0, Z_0), determine a suitable landing area P (whose outline is required to be a convex polygon) on a satellite digital map. First apply filtering, graying and binarization to the image of area P, then perform edge feature extraction, remove some noise points and retain the main region-based edge features, and finally extract continuous geometric curves through the Hough transform to obtain the contour curve of area P on the satellite digital map, yielding the reference image A. The binarization uses the maximum inter-class variance (Otsu) method: suppose T is the selected global threshold dividing all pixels of the image into background and foreground, and let ω_0 and ω_1 denote the proportions of pixels belonging to the background and the foreground in the whole image respectively, then:

ω_0(T) = Σ_{i=0}^{T} p(i),  ω_1(T) = Σ_{i=T+1}^{255} p(i)

wherein: p(i) represents the probability that a pixel with gray value i appears in the image.
Let μ_0 and μ_1 denote the mean gray values of the background and foreground pixels respectively, and let μ be the mean gray value of all pixels, then:

μ(T) = ω_0(T)·μ_0(T) + ω_1(T)·μ_1(T)
the variance σ between classes corresponding to the threshold2(T) is defined as:
σ2(T)=ω0(T)[μ0(T)-μ(T)]2+ω1(T)[μ1(T)-μ(T)]2=ω0(T)ω1(T)[μ0(T)-μ1(T)]2
Traverse every gray value and find the T corresponding to the maximum inter-class variance; this T is the binarization threshold.
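The threshold search described above (maximize σ²(T) over all gray values) can be sketched in NumPy; the cumulative-sum vectorization and the synthetic test image are illustrative choices, not taken from the patent:

```python
import numpy as np

def otsu_threshold(gray: np.ndarray) -> int:
    """Return the gray level T that maximizes the inter-class variance."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()                   # p(i): probability of gray value i
    omega0 = np.cumsum(p)                   # background proportion for T = 0..255
    mu_cum = np.cumsum(p * np.arange(256))  # cumulative first moment
    mu_total = mu_cum[-1]
    omega1 = 1.0 - omega0
    # sigma^2(T) = omega0 * omega1 * (mu0 - mu1)^2; guard against empty classes
    with np.errstate(divide="ignore", invalid="ignore"):
        mu0 = mu_cum / omega0
        mu1 = (mu_total - mu_cum) / omega1
        sigma2 = omega0 * omega1 * (mu0 - mu1) ** 2
    sigma2 = np.nan_to_num(sigma2)
    return int(np.argmax(sigma2))

# bimodal test image: a dark block (gray 20) and a bright block (gray 200)
img = np.zeros((10, 10), dtype=np.uint8)
img[:, :5] = 20
img[:, 5:] = 200
T = otsu_threshold(img)
binary = (img > T).astype(np.uint8)
```

Any T between the two modes separates the blocks cleanly, which is exactly what the maximum inter-class variance criterion selects.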
Step 2: fly the UAV to above the given pre-landing coordinates, apply filtering, graying and binarization to the aerial image, then perform edge feature extraction, remove some noise points and retain the main region-based edge features, and finally extract continuous geometric curves through the Hough transform to obtain the contour curve of area P in the aerial image, yielding the measured image B. The binarization likewise uses the maximum inter-class variance method.
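The Hough transform step can be illustrated with a minimal straight-line voting accumulator under the standard ρ = x·cos θ + y·sin θ parameterization; this is a sketch, not the patent's implementation (a production pipeline would typically use a library routine, which the patent does not name):

```python
import numpy as np

def hough_strongest_line(binary: np.ndarray, n_theta: int = 180):
    """Vote every edge pixel into (rho, theta) space and return the best bin."""
    ys, xs = np.nonzero(binary)
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    diag = int(np.ceil(np.hypot(*binary.shape)))     # max possible |rho|
    acc = np.zeros((2 * diag + 1, n_theta), dtype=int)
    for x, y in zip(xs, ys):
        # rho = x*cos(theta) + y*sin(theta), rounded to the nearest bin
        rhos = np.round(x * np.cos(thetas) + y * np.sin(thetas)).astype(int)
        acc[rhos + diag, np.arange(n_theta)] += 1
    r, t = np.unravel_index(np.argmax(acc), acc.shape)
    return r - diag, thetas[t]                        # (rho, theta) of strongest line

# synthetic edge image containing the vertical line x = 5
edges = np.zeros((20, 20), dtype=np.uint8)
edges[:, 5] = 1
rho, theta = hough_strongest_line(edges)
```

The vertical line is recovered as |ρ| ≈ 5 with θ near 0 (or, equivalently, near π with ρ negated, since (ρ, θ) and (−ρ, θ + π) describe the same line).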
Step 3: match the reference image A against the measured image B to confirm the landing area P in the UAV aerial image. The image matching uses a weighted Hausdorff distance matching algorithm, with the following specific steps:
(1) In the reference image A and the measured image B, perform distance transformation of the feature point sets in two-dimensional space using the 3-4 DT algorithm to obtain the image distance transformation matrices J_A and J_B;
(2) extract the branch points in the reference image A and the measured image B, and store them in matrices A and B respectively;
(3) according to J_A, J_B, A and B, calculate the weighted Hausdorff distance:

H(A, B) = max(h_WHD(A, B), h_WHD(B, A))

h_WHD(A, B) = (1/N_a) · Σ_{a∈A} d(a, B)

wherein: A and B are the two point sets, N_a is the total number of feature points in point set A, a is a feature point belonging to A, d(a, B) is the distance from feature point a in point set A to point set B, and h_WHD(A, B), h_WHD(B, A) denote the directed distances from point set A to point set B and from point set B to point set A respectively.
The point with the minimum Hausdorff distance is the final matching point to obtain preliminary positioning information.
(4) Use a least squares algorithm to put all the matching point pairs into one-to-one correspondence, obtaining more accurate position information.
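Steps (1)-(3) reduce to computing directed distances between two feature point sets and taking their maximum. A minimal sketch follows; the weights default to uniform because the excerpt does not specify the branch-point weighting function (an assumption), and brute-force pairwise distances stand in for the 3-4 DT distance-transform lookup:

```python
import numpy as np

def directed_hausdorff(A: np.ndarray, B: np.ndarray, w=None) -> float:
    """h_WHD(A, B): weighted mean over a in A of the min-distance d(a, B)."""
    # pairwise Euclidean distances between the two feature point sets
    d = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=-1)
    dmin = d.min(axis=1)                       # d(a, B) for every a in A
    w = np.ones(len(A)) if w is None else np.asarray(w, dtype=float)
    return float(np.sum(w * dmin) / np.sum(w))

def weighted_hausdorff(A, B, wA=None, wB=None) -> float:
    """H(A, B) = max(h_WHD(A, B), h_WHD(B, A))."""
    return max(directed_hausdorff(A, B, wA), directed_hausdorff(B, A, wB))

# two parallel point pairs: every point lies at distance 1 from the other set
A = np.array([[0.0, 0.0], [1.0, 0.0]])
B = np.array([[0.0, 1.0], [1.0, 1.0]])
H = weighted_hausdorff(A, B)
```

The candidate alignment with the smallest H(A, B) is taken as the match, as the text describes.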
Step 4: establish a two-dimensional plane rectangular coordinate system with the UAV camera as the coordinate origin, and calculate the coordinates (x_c, y_c) of the centroid of area P in the UAV aerial image relative to the UAV.
According to Green's theorem, integrating along the closed contour of region P:

A = (1/2)·∮(x·dy − y·dx)
x_c = (1/(2A))·∮x²·dy
y_c = −(1/(2A))·∮y²·dx

After discretization over the contour points (x_i, y_i), i = 1, …, n (with x_{n+1} = x_1, y_{n+1} = y_1), the above formulas become:

A = (1/2)·Σ_{i=1}^{n} (x_i·y_{i+1} − x_{i+1}·y_i)
x_c = (1/(6A))·Σ_{i=1}^{n} (x_i + x_{i+1})·(x_i·y_{i+1} − x_{i+1}·y_i)
y_c = (1/(6A))·Σ_{i=1}^{n} (y_i + y_{i+1})·(x_i·y_{i+1} − x_{i+1}·y_i)
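The discretized Green's-theorem centroid computation can be sketched directly from the contour vertices (a minimal sketch; the unit-square example is illustrative):

```python
def polygon_centroid(pts):
    """Signed area and centroid of a closed polygon via the discretized
    Green's theorem (shoelace form); pts lists the vertices in order."""
    n = len(pts)
    A = cx = cy = 0.0
    for i in range(n):
        x0, y0 = pts[i]
        x1, y1 = pts[(i + 1) % n]          # wrap around to close the contour
        cross = x0 * y1 - x1 * y0          # x_i*y_{i+1} - x_{i+1}*y_i
        A += cross
        cx += (x0 + x1) * cross
        cy += (y0 + y1) * cross
    A *= 0.5
    return A, cx / (6.0 * A), cy / (6.0 * A)

# unit square: area 1, centroid at (0.5, 0.5)
area, xc, yc = polygon_centroid([(0, 0), (1, 0), (1, 1), (0, 1)])
```

Because the landing area P is required to be a convex polygon, its Hough-extracted contour feeds this formula directly.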
Step 5: calculate the spatial coordinates (X_c, Y_c, Z_c) of the centroid of area P according to the projection relationship:
(1) Calculate the ground resolution:

GSD = p·H / f

wherein: GSD denotes the ground resolution (m), f is the lens focal length (mm), p is the pixel size of the imaging sensor (mm), and H is the corresponding flight height of the UAV (m).
(2) Calculate the actual ground distance of the image diagonal; from the image width w and height h, the ground distance spanned by the image diagonal is:

L = GSD·√(w² + h²)
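Steps (1) and (2) amount to two one-line formulas; a sketch (the sensor and flight parameters in the example are illustrative, not taken from the patent):

```python
import math

def ground_resolution_m(pixel_size_mm: float, height_m: float, focal_mm: float) -> float:
    """GSD = p * H / f  (p and f in mm, H in m -> GSD in m per pixel)."""
    return pixel_size_mm * height_m / focal_mm

def diagonal_ground_distance_m(gsd_m: float, width_px: int, height_px: int) -> float:
    """L = GSD * sqrt(w^2 + h^2): ground length covered by the image diagonal."""
    return gsd_m * math.hypot(width_px, height_px)

# e.g. 0.004 mm pixels, 50 m altitude, 8 mm lens -> 0.025 m/pixel
gsd = ground_resolution_m(0.004, 50.0, 8.0)
L = diagonal_ground_distance_m(gsd, 4000, 3000)
```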
(3) From the longitude and latitude of the image center point and the distance D and direction angle θ_0 of the centroid of area P relative to the center point, obtain the geographic coordinates of the centroid of area P:

E_c = R_j + (R_i − R_j)·(90 − Lat_a)/90
E_d = E_c·cos(Lat_a·π/180)
Lon_c = Lon_a + (D·sin θ_0)/E_d·(180/π)
Lat_c = Lat_a + (D·cos θ_0)/E_c·(180/π)

wherein: θ_0 ∈ (0, 2π); Lon_a, Lat_a are the longitude and latitude of the image center point; R_i is the equatorial radius, taken as 6378137 m; R_j is the polar radius, taken as 6356725 m.
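A sketch of this distance-and-bearing offset, writing D for the distance from the image center to the centroid, and using the latitude-interpolated Earth-radius approximation suggested by the quoted values of R_i and R_j (an assumption; the patent's exact formula is not reproduced in the text):

```python
import math

R_I = 6378137.0   # equatorial radius (m), as quoted in the patent
R_J = 6356725.0   # polar radius (m), as quoted in the patent

def offset_geo(lon_a: float, lat_a: float, dist_m: float, theta0: float):
    """Longitude/latitude (deg) of a point dist_m away from (lon_a, lat_a)
    at azimuth theta0 (radians, clockwise from north)."""
    ec = R_J + (R_I - R_J) * (90.0 - lat_a) / 90.0   # meridian-direction radius
    ed = ec * math.cos(math.radians(lat_a))          # parallel-direction radius
    lon = lon_a + math.degrees(dist_m * math.sin(theta0) / ed)
    lat = lat_a + math.degrees(dist_m * math.cos(theta0) / ec)
    return lon, lat

lon, lat = offset_geo(125.0, 43.0, 1000.0, 0.0)  # 1 km due north
```

This flat-offset approximation is adequate for the short image-to-centroid distances involved in a landing maneuver.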
(4) Convert the geographic coordinates into the spatial rectangular coordinate system:

X_c = (N + H)·cos(Lat)·cos(Lon)
Y_c = (N + H)·cos(Lat)·sin(Lon)
Z_c = [N·(1 − e²) + H]·sin(Lat)

wherein: N is the radius of curvature in the prime vertical, e is the first eccentricity of the reference ellipsoid, and Lon, Lat, H are the longitude, latitude and elevation respectively, yielding the spatial coordinates (X_c, Y_c, Z_c) of the centroid of area P.
Step 6: the UAV flies to above the spatial coordinates (X_c, Y_c, Z_c) and lands in the vertical direction.
Claims (1)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811213147.3A CN109460046B (en) | 2018-10-17 | 2018-10-17 | A method for unmanned aerial vehicle natural landmark recognition and autonomous landing |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109460046A CN109460046A (en) | 2019-03-12 |
CN109460046B true CN109460046B (en) | 2021-08-06 |
Family
ID=65607782
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811213147.3A Active CN109460046B (en) | 2018-10-17 | 2018-10-17 | A method for unmanned aerial vehicle natural landmark recognition and autonomous landing |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109460046B (en) |
Families Citing this family (8)
* Cited by examiner, † Cited by third partyPublication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110968112B (en) * | 2019-12-12 | 2023-08-01 | 哈尔滨工程大学 | Unmanned aerial vehicle autonomous landing method based on monocular vision |
CN111324145B (en) * | 2020-02-28 | 2022-08-16 | 厦门理工学院 | Unmanned aerial vehicle autonomous landing method, device, equipment and storage medium |
CN111626260A (en) * | 2020-06-05 | 2020-09-04 | 贵州省草业研究所 | Aerial photo ground object feature point extraction method based on unmanned aerial vehicle remote sensing technology |
CN113920439B (en) * | 2020-07-10 | 2024-09-06 | 千寻位置网络有限公司 | Extraction method and device for arrow point |
CN112419374B (en) * | 2020-11-11 | 2022-12-27 | 北京航空航天大学 | Unmanned aerial vehicle positioning method based on image registration |
CN115526896A (en) * | 2021-07-19 | 2022-12-27 | 中核利华消防工程有限公司 | Fire prevention and control method and device, electronic equipment and readable storage medium |
CN114998773B (en) * | 2022-08-08 | 2023-02-17 | 四川腾盾科技有限公司 | Characteristic mismatching elimination method and system suitable for aerial image of unmanned aerial vehicle system |
CN115237158B (en) * | 2022-08-17 | 2024-11-19 | 吉林大学 | Multi-rotor unmanned aerial vehicle autonomous tracking and landing control system and control method |
Citations (5)
* Cited by examiner, † Cited by third partyPublication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20100095665A (en) * | 2009-02-12 | 2010-09-01 | 한양대학교 산학협력단 | Automatic landing method, landing apparatus of scanning probe microscope and scanning probe microscope using the same |
CN103424126A (en) * | 2013-08-12 | 2013-12-04 | 西安电子科技大学 | System and method for verifying visual autonomous landing simulation of unmanned aerial vehicle |
CN105000194A (en) * | 2015-08-13 | 2015-10-28 | 史彩成 | UAV (unmanned aerial vehicle) assisted landing visual guiding method and airborne system based on ground cooperative mark |
CN105550994A (en) * | 2016-01-26 | 2016-05-04 | 河海大学 | Satellite image based unmanned aerial vehicle image rapid and approximate splicing method |
CN107063261A (en) * | 2017-03-29 | 2017-08-18 | 东北大学 | The multicharacteristic information terrestrial reference detection method precisely landed for unmanned plane |
Non-Patent Citations (2)
* Cited by examiner, † Cited by third partyTitle |
---|
基于视觉的无人机自主着陆地标识别方法 (Vision-based landmark recognition method for autonomous UAV landing); Li Yu et al.; Application Research of Computers; July 2012; Vol. 29, No. 7; pp. 2780-2783 *
新型的无人机自主着陆地标设计与研究 (Design and research of a novel landmark for autonomous UAV landing); Chen Yong et al.; Journal of University of Electronic Science and Technology of China; November 2016; Vol. 45, No. 6; pp. 934-938 *
Also Published As
Publication number | Publication date |
---|---|
CN109460046A (en) | 2019-03-12 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
2019-03-12 | PB01 | Publication | |
2019-04-05 | SE01 | Entry into force of request for substantive examination | |
2021-08-06 | GR01 | Patent grant | |