CN109460046A - Natural landmark recognition and autonomous landing method for an unmanned aerial vehicle - Google Patents
- Tue Mar 12 2019
Natural landmark recognition and autonomous landing method for an unmanned aerial vehicle
-
Publication number
- CN109460046A (application number CN201811213147.3A)
Authority
- CN
- China
Prior art keywords
- image
- landing
- coordinates
- point
- area Prior art date
- 2018-10-17
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Links
- 238000000034 method Methods 0.000 title claims abstract description 16
- 238000000605 extraction Methods 0.000 claims abstract description 4
- 238000006243 chemical reaction Methods 0.000 claims description 13
- 238000012545 processing Methods 0.000 claims description 10
- 239000000284 extract Substances 0.000 claims description 6
- 239000011159 matrix material Substances 0.000 claims description 6
- 238000003384 imaging method Methods 0.000 claims description 3
- 230000000717 retained effect Effects 0.000 claims 1
- 230000009187 flying Effects 0.000 description 6
- 230000007704 transition Effects 0.000 description 2
- 238000013316 zoning Methods 0.000 description 2
- 230000003044 adaptive effect Effects 0.000 description 1
- 238000013459 approach Methods 0.000 description 1
- 230000007423 decrease Effects 0.000 description 1
- 230000007812 deficiency Effects 0.000 description 1
- 238000011161 development Methods 0.000 description 1
- 238000013507 mapping Methods 0.000 description 1
- 239000000463 material Substances 0.000 description 1
- 239000000575 pesticide Substances 0.000 description 1
- 238000013439 planning Methods 0.000 description 1
- 230000036632 reaction speed Effects 0.000 description 1
- 238000011160 research Methods 0.000 description 1
- 238000005507 spraying Methods 0.000 description 1
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/04—Control of altitude or depth
- G05D1/06—Rate of change of altitude or depth
- G05D1/0607—Rate of change of altitude or depth specially adapted for aircraft
- G05D1/0653—Rate of change of altitude or depth specially adapted for aircraft during a phase of take-off or landing
- G05D1/0676—Rate of change of altitude or depth specially adapted for aircraft during a phase of take-off or landing specially adapted for landing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
- G06F18/241—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
- G06F18/2413—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on distances to training or reference patterns
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/44—Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
- G06V10/443—Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components by matching or filtering
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/48—Extraction of image or video features by mapping characteristic values of the pattern into a parameter space, e.g. Hough transformation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/74—Image or video pattern matching; Proximity measures in feature spaces
- G06V10/75—Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
- G06V10/757—Matching configurations of points or features
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
- G06V20/13—Satellite images
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Multimedia (AREA)
- Data Mining & Analysis (AREA)
- Remote Sensing (AREA)
- Evolutionary Computation (AREA)
- Artificial Intelligence (AREA)
- General Engineering & Computer Science (AREA)
- General Health & Medical Sciences (AREA)
- Bioinformatics & Computational Biology (AREA)
- Astronomy & Astrophysics (AREA)
- Evolutionary Biology (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Health & Medical Sciences (AREA)
- Computing Systems (AREA)
- Databases & Information Systems (AREA)
- Life Sciences & Earth Sciences (AREA)
- Medical Informatics (AREA)
- Software Systems (AREA)
- Aviation & Aerospace Engineering (AREA)
- Radar, Positioning & Navigation (AREA)
- Automation & Control Theory (AREA)
- Image Analysis (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Abstract
A natural landmark recognition and autonomous landing method for an unmanned aerial vehicle (UAV), belonging to the technical field of machine vision navigation. The method determines a landing area on a satellite digital map from given pre-landing coordinates and has the UAV capture aerial images above those coordinates. Both the satellite map and the aerial image are filtered, converted to grayscale, binarized, subjected to edge feature extraction, and processed with a Hough transform to extract continuous geometric curves, which are then matched using a weighted Hausdorff distance algorithm. The centroid of the matched area in the aerial image, relative to the UAV, is computed by Green's theorem; its spatial coordinates are obtained from the projection relationship, and the UAV is guided to land autonomously at those coordinates. The method enables a UAV to autonomously identify the best landing point within a specified range and land accurately, compensating for the large landing error of GPS-only navigation and improving the safety and reliability of autonomous landing.
Description
Technical field
The invention belongs to the technical field of machine vision navigation, and in particular relates to a natural landmark recognition and autonomous landing method for an unmanned aerial vehicle.
Background technique
In recent years, with the development of micro inertial navigation systems, flight control systems, MEMS, and new materials, research on miniature drones has made great progress. Rotary-wing micro UAVs in particular offer good flexibility, compact structure, low cost, and rapid data acquisition, and their applications now cover many fields, including but not limited to pesticide spraying, geological exploration, search and rescue, cargo transport, and surveying and mapping. Because human operators are limited in reaction speed and working efficiency, these tasks should as far as possible be completed autonomously by the UAV, which must carry out autonomous landing, path planning, obstacle avoidance, terrain following, and similar actions, either from preset waypoints or through its own planning, so as to guarantee the accuracy and reliability of the operation.
For UAV autonomous landing, the approach most widely used at present is based on GPS navigation: the onboard GPS sensor records the geographic coordinates of the airframe at takeoff, or some geographic coordinates are specified manually, and when the UAV lands, the GPS positioning system guides it to hover above the recorded coordinates and descend. However, GPS navigation suffers from strong interference by non-air media and low positioning accuracy, so a UAV operating in remote or heavily sheltered regions shows large autonomous landing errors and cannot complete the landing task accurately.
Machine-vision-based autonomous landing is one way to overcome the positioning inaccuracy of GPS. The methods currently applied to rotor UAVs mostly rely on artificial landmarks. As UAVs are used more and more widely in every field, the demands on their environmental adaptability keep rising: some missions require a UAV to land where an artificial landmark cannot be placed, or even to find a suitable landing site within a specified region on its own, which requires the UAV to be able to recognize natural targets. Therefore, to provide the UAV with accurate navigation information and complete such autonomous landing tasks, a natural landmark recognition and autonomous landing method is needed.
Summary of the invention
In view of the problems of the prior art described above, the object of the present invention is to propose a natural landmark recognition and autonomous landing method for an unmanned aerial vehicle based on machine vision and a satellite digital map: a suitable landing area is found near the given pre-landing coordinates on the satellite digital map, the aerial images from the UAV's monocular vision system are matched against the satellite digital map, and the optimal landing point coordinates within the landing area are computed by processing the UAV's aerial images, eliminating the error of GPS navigation and achieving a precise landing.
The natural landmark recognition and autonomous landing method of the invention comprises the following steps:
1.1 According to the given pre-landing coordinates (X_0, Y_0, Z_0), determine on the satellite digital map a landing area P whose contour is a convex polygon. First filter, grayscale, and binarize the image of area P, then perform edge feature extraction, rejecting scattered noise points and retaining the main region-based edge features; finally extract continuous geometric curves by the Hough transform, obtaining contour curve I of area P on the satellite digital map and the reference image A. The binarization uses the maximum between-class variance (Otsu) method;
1.2 Fly the UAV to above the given pre-landing coordinates, filter, grayscale, and binarize the aerial image, then perform edge feature extraction, rejecting scattered noise points and retaining the main region-based edge features; finally extract continuous geometric curves by the Hough transform, obtaining contour curve II of area P and the measured image B. The binarization likewise uses the maximum between-class variance method;
1.3 Match the reference image A obtained in step 1.1 with the measured image B obtained in step 1.2 to confirm the landing area P in the UAV's aerial image. Image matching uses the weighted Hausdorff distance matching algorithm, with the following steps:
1.3.1 In reference image A and measured image B, perform the distance transform of the feature point sets in two-dimensional space using the 3-4 DT algorithm, obtaining the image distance transform matrices J_A and J_B;
1.3.2 Extract the branch points in reference image A and measured image B, and store them in matrices A and B respectively;
1.3.3 From J_A, J_B, A, and B, compute the weighted Hausdorff distance:

H(A, B) = max(h_WHD(A, B), h_WHD(B, A)), with the directed distance h_WHD(A, B) = (1/N_a) Σ_{a∈A} d(a, B)

where A and B are two point sets; N_a is the total number of feature points in point set A; a is a feature point belonging to A; d(a, B) is the distance from feature point a of point set A to point set B; h_WHD(A, B) is the directed distance from point set A to point set B; and h_WHD(B, A) is the directed distance from point set B to point set A;
The point with the minimum Hausdorff distance is the final match point, which yields the preliminary localization information;
1.3.4 Put all matching point pairs into one-to-one correspondence using the least-squares algorithm to obtain more accurate position information;
1.4 Establish a two-dimensional plane rectangular coordinate system with the UAV camera as the coordinate origin, and compute by Green's theorem the coordinates (x_c, y_c) of the centroid of area P in the UAV's aerial image relative to the UAV;
1.5 Compute the coordinates (X_c, Y_c, Z_c) of the centroid of area P from the projection relationship, specifically:
1.5.1 Compute the ground sample distance GSD:

GSD = H · P / f

where GSD is the ground resolution (m); f is the lens focal length (mm); P is the pixel size of the imaging sensor (mm); and H is the UAV's flight height (m);
1.5.2 Compute the actual ground distance of the image diagonal; from the image width w and height h, the ground distance L along the diagonal is

L = GSD · √(w² + h²)

where GSD is the ground resolution (m), w is the image width, and h is the image height (both in pixels);
1.5.3 From the longitude and latitude of the image center point and the distance and bearing of the centroid of area P relative to the center point, obtain the geographic coordinates of the centroid of area P, where θ_0 ∈ (0, 2π); Lon_a is the longitude of the image center point; Lat_a is the latitude of the image center point; R_i is the equatorial radius, taken as 6378137 m; and R_j is the polar radius, taken as 6356725 m;
1.5.4 Convert the geographic coordinates to spatial coordinates to obtain the spatial coordinates (X_c, Y_c, Z_c) of the centroid of area P, where N is the radius of curvature; Lon is the longitude; Lat is the latitude; and H is the elevation;
1.6 Fly the UAV to above the spatial coordinates (X_c, Y_c, Z_c) and land vertically.
The invention ensures that the UAV can autonomously identify the best landing point within a specified range and land precisely, compensating for the large autonomous landing error of GPS navigation and improving the safety and reliability of autonomous landing.
Detailed description of the invention
Fig. 1 is a flow chart of the natural landmark recognition and autonomous landing method of the unmanned aerial vehicle.
Specific embodiment
To make the purpose, technical scheme, and advantages of the present invention clearer, the present invention is described in further detail below.
Step 1: According to the given pre-landing coordinates (X_0, Y_0, Z_0), determine on the satellite digital map a suitable landing area P (its contour is required to be a convex polygon). First filter, grayscale, and binarize the image of area P, then perform edge feature extraction, rejecting scattered noise points and retaining the main region-based edge features; finally, extract continuous geometric curves by the Hough transform to obtain the contour curve of area P on the satellite digital map, yielding the reference image A. The binarization uses the maximum between-class variance (Otsu) method: let T be the chosen global threshold dividing all pixels of the image into background and foreground, and let ω_0 and ω_1 denote the proportions of the whole image occupied by background and foreground pixels respectively; then:

ω_0 = Σ_{i=0}^{T−1} p(i),  ω_1 = Σ_{i=T}^{255} p(i)
where p(i) denotes the probability that a pixel with gray value i occurs in the image.
μ_0 and μ_1 denote the mean gray values of the background and foreground pixels respectively, and μ is the mean gray value of all pixels; then:

μ_0 = Σ_{i=0}^{T−1} i·p(i) / ω_0,  μ_1 = Σ_{i=T}^{255} i·p(i) / ω_1,  μ = Σ_{i=0}^{255} i·p(i)

The between-class variance σ²(T) corresponding to threshold T is defined as:

σ²(T) = ω_0(T)[μ_0(T) − μ(T)]² + ω_1(T)[μ_1(T) − μ(T)]² = ω_0(T)ω_1(T)[μ_0(T) − μ_1(T)]²

Every gray value is traversed, and the threshold T that maximizes the between-class variance is the required threshold.
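The traversal just described (try every gray value, keep the T that maximizes σ²(T)) can be sketched in Python; `otsu_threshold` is an illustrative name, and in practice OpenCV's `cv2.threshold` with the `cv2.THRESH_OTSU` flag performs the same search:

```python
import numpy as np

def otsu_threshold(gray):
    """Return the threshold T maximizing the between-class variance
    sigma^2(T) = w0(T) * w1(T) * (mu0(T) - mu1(T))^2 for an 8-bit image."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(np.float64)
    p = hist / hist.sum()                      # p(i): probability of gray value i
    best_t, best_var = 0, -1.0
    for t in range(1, 256):                    # traverse every gray value
        w0, w1 = p[:t].sum(), p[t:].sum()      # background / foreground weights
        if w0 == 0 or w1 == 0:                 # skip degenerate splits
            continue
        mu0 = (np.arange(t) * p[:t]).sum() / w0        # background mean
        mu1 = (np.arange(t, 256) * p[t:]).sum() / w1   # foreground mean
        var = w0 * w1 * (mu0 - mu1) ** 2       # between-class variance
        if var > best_var:
            best_t, best_var = t, var
    return best_t
```

For a clearly bimodal image the chosen T falls between the two modes, separating background from foreground.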
Step 2: The UAV flies to above the given pre-landing coordinates. The aerial image is filtered, grayscaled, and binarized, then edge feature extraction is performed, scattered noise points are rejected, and the main region-based edge features are retained; finally, continuous geometric curves are extracted by the Hough transform to obtain the contour curve of area P, yielding the measured image B. The binarization likewise uses the maximum between-class variance method.
Step 3: Match the reference image A with the measured image B to confirm the landing area P in the UAV's aerial image. Image matching uses the weighted Hausdorff distance matching algorithm, with the following specific steps:
(1) In reference image A and measured image B, perform the distance transform of the feature point sets in two-dimensional space using the 3-4 DT algorithm, obtaining the image distance transform matrices J_A and J_B;
(2) Extract the branch points in reference image A and measured image B, and store them in matrices A and B respectively;
(3) From J_A, J_B, A, and B, compute the weighted Hausdorff distance:

H(A, B) = max(h_WHD(A, B), h_WHD(B, A)), with the directed distance h_WHD(A, B) = (1/N_a) Σ_{a∈A} d(a, B)

where A and B are two point sets, N_a is the total number of feature points in point set A, a is a feature point belonging to A, d(a, B) is the distance from feature point a of point set A to point set B, and h_WHD(A, B) and h_WHD(B, A) are the directed distances from point set A to point set B and from point set B to point set A respectively.
The point with the minimum Hausdorff distance is the final match point, which yields the preliminary localization information.
(4) Put all matching point pairs into one-to-one correspondence using the least-squares algorithm to obtain more accurate position information.
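A minimal sketch of the matching metric, assuming the common weighted directed distance h_WHD(A, B) = (1/N_a) Σ_{a∈A} d(a, B) (consistent with the symbol definitions above), and computing exact Euclidean nearest-neighbour distances in place of the 3-4 DT chamfer approximation:

```python
import numpy as np

def directed_whd(A, B):
    """h_WHD(A, B): mean over points a in A of the distance d(a, B)
    from a to its nearest neighbour in point set B (N_a = len(A))."""
    # Pairwise Euclidean distances, shape (len(A), len(B)).
    d = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=2)
    return d.min(axis=1).mean()

def weighted_hausdorff(A, B):
    """H(A, B) = max(h_WHD(A, B), h_WHD(B, A))."""
    return max(directed_whd(A, B), directed_whd(B, A))
```

In the full method this score would be evaluated over candidate alignments of the branch-point sets, and the alignment with the minimum H(A, B) taken as the match.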
Step 4: Establish a two-dimensional plane rectangular coordinate system with the UAV camera as the coordinate origin, and compute the coordinates (x_c, y_c) of the centroid of area P in the UAV's aerial image relative to the UAV.
According to Green's theorem, integrating along the closed contour of area P with enclosed area S,

x_c = (1/(2S)) ∮ x² dy,  y_c = −(1/(2S)) ∮ y² dx

After discretization over the contour vertices (x_i, y_i), the formula becomes:

S = (1/2) Σ (x_i y_{i+1} − x_{i+1} y_i),
x_c = (1/(6S)) Σ (x_i + x_{i+1})(x_i y_{i+1} − x_{i+1} y_i),
y_c = (1/(6S)) Σ (y_i + y_{i+1})(x_i y_{i+1} − x_{i+1} y_i)
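Assuming the usual shoelace discretization of the Green's-theorem contour integral, the centroid computation of step 4 can be sketched as:

```python
def polygon_centroid(vertices):
    """Centroid of a closed polygon by the discretized Green's-theorem
    (shoelace) formulas. vertices: ordered (x, y) pairs, first point
    not repeated at the end."""
    n = len(vertices)
    area2 = 0.0            # twice the signed area
    cx = cy = 0.0
    for i in range(n):
        x0, y0 = vertices[i]
        x1, y1 = vertices[(i + 1) % n]     # wrap around to close the contour
        cross = x0 * y1 - x1 * y0
        area2 += cross
        cx += (x0 + x1) * cross
        cy += (y0 + y1) * cross
    area = area2 / 2.0
    return cx / (6.0 * area), cy / (6.0 * area)
```

The result holds for any simple polygon, convex or not; the method restricts area P to convex contours only to guarantee a safe landing surface.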
Step 5: Compute the coordinates (X_c, Y_c, Z_c) of the centroid of area P from the projection relationship:
(1) Compute the ground resolution:

GSD = H · P / f

where GSD is the ground resolution (m), f is the lens focal length (mm), P is the pixel size of the imaging sensor (mm), and H is the UAV's flight height (m).
(2) Compute the actual ground distance of the image diagonal; from the image width w and height h (in pixels), the ground distance along the diagonal is

L = GSD · √(w² + h²)
(3) From the longitude and latitude of the image center point and the distance and bearing of the centroid of area P relative to the center point, obtain the geographic coordinates of the centroid, where θ_0 ∈ (0, 2π), Lon_a and Lat_a are the longitude and latitude of the image center point, R_i is the equatorial radius, taken as 6378137 m, and R_j is the polar radius, taken as 6356725 m.
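The exact expression for step (3) is not spelled out in this text, so the sketch below is an assumption: a common small-distance approximation that interpolates the local meridian radius between the equatorial radius R_i and polar radius R_j, then offsets the centre latitude/longitude by the north and east components of the ground distance. The function name `offset_lat_lon` is illustrative:

```python
import math

R_I = 6378137.0   # equatorial radius (m)
R_J = 6356725.0   # polar radius (m)

def offset_lat_lon(lon_a, lat_a, distance_m, bearing_rad):
    """Longitude/latitude (degrees) of a point at the given ground distance
    and bearing (radians, clockwise from north) from (lon_a, lat_a).
    The linear interpolation of the local radius is an assumption, not the
    patent's own formula."""
    ec = R_J + (R_I - R_J) * (90.0 - lat_a) / 90.0   # local meridian radius
    ed = ec * math.cos(math.radians(lat_a))          # local parallel radius
    dlon = distance_m * math.sin(bearing_rad) / ed   # east offset -> radians
    dlat = distance_m * math.cos(bearing_rad) / ec   # north offset -> radians
    return lon_a + math.degrees(dlon), lat_a + math.degrees(dlat)
```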
(4) Convert the geographic coordinates to spatial coordinates, obtaining the spatial coordinates (X_c, Y_c, Z_c) of the centroid of area P, where N is the radius of curvature and Lon, Lat, and H are the longitude, latitude, and elevation respectively.
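Step (4) matches the standard geodetic-to-Earth-centred (ECEF) conversion with prime-vertical radius of curvature N; the WGS-84 semi-major axis and flattening used below are assumptions, consistent with the equatorial radius quoted above:

```python
import math

A_WGS = 6378137.0            # semi-major axis (m)
F_WGS = 1.0 / 298.257223563  # flattening (WGS-84 assumption)
E2 = F_WGS * (2.0 - F_WGS)   # first eccentricity squared

def geodetic_to_ecef(lon_deg, lat_deg, h_m):
    """Standard conversion:
    N = a / sqrt(1 - e^2 sin^2(Lat)),
    X = (N + H) cos(Lat) cos(Lon),
    Y = (N + H) cos(Lat) sin(Lon),
    Z = (N (1 - e^2) + H) sin(Lat)."""
    lon, lat = math.radians(lon_deg), math.radians(lat_deg)
    n = A_WGS / math.sqrt(1.0 - E2 * math.sin(lat) ** 2)  # radius of curvature
    x = (n + h_m) * math.cos(lat) * math.cos(lon)
    y = (n + h_m) * math.cos(lat) * math.sin(lon)
    z = (n * (1.0 - E2) + h_m) * math.sin(lat)
    return x, y, z
```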
Step 6: The UAV flies to above the spatial coordinates (X_c, Y_c, Z_c) and lands vertically.
Claims (1)
1. A natural landmark recognition and autonomous landing method for an unmanned aerial vehicle, characterized by comprising the following steps:
1.1 According to the given pre-landing coordinates (X_0, Y_0, Z_0), determine on a satellite digital map a landing area P whose contour is a convex polygon; first filter, grayscale, and binarize the image of area P, then perform edge feature extraction, rejecting scattered noise points and retaining the main region-based edge features; finally extract continuous geometric curves by the Hough transform, obtaining contour curve I of area P on the satellite digital map and the reference image A, wherein the binarization uses the maximum between-class variance method;
1.2 Fly the UAV to above the given pre-landing coordinates, filter, grayscale, and binarize the aerial image, then perform edge feature extraction, rejecting scattered noise points and retaining the main region-based edge features; finally extract continuous geometric curves by the Hough transform, obtaining contour curve II of area P and the measured image B, wherein the binarization likewise uses the maximum between-class variance method;
1.3 Match the reference image A obtained in step 1.1 with the measured image B obtained in step 1.2 to confirm the landing area P in the UAV's aerial image; image matching uses the weighted Hausdorff distance matching algorithm, comprising the following steps:
1.3.1 In reference image A and measured image B, perform the distance transform of the feature point sets in two-dimensional space using the 3-4 DT algorithm, obtaining the image distance transform matrices J_A and J_B;
1.3.2 Extract the branch points in reference image A and measured image B, and store them in matrices A and B respectively;
1.3.3 From J_A, J_B, A, and B, compute the weighted Hausdorff distance:
H(A, B) = max(h_WHD(A, B), h_WHD(B, A))
where A and B are two point sets; N_a is the total number of feature points in point set A; a is a feature point belonging to A; d(a, B) is the distance from feature point a of point set A to point set B; h_WHD(A, B) is the directed distance from point set A to point set B; and h_WHD(B, A) is the directed distance from point set B to point set A;
the point with the minimum Hausdorff distance is the final match point, which yields the preliminary localization information;
1.3.4 Put all matching point pairs into one-to-one correspondence using the least-squares algorithm to obtain more accurate position information;
1.4 Establish a two-dimensional plane rectangular coordinate system with the UAV camera as the coordinate origin, and compute by Green's theorem the coordinates (x_c, y_c) of the centroid of area P in the UAV's aerial image relative to the UAV;
1.5 Compute the coordinates (X_c, Y_c, Z_c) of the centroid of area P from the projection relationship, specifically:
1.5.1 Compute the ground sample distance GSD, where GSD is the ground resolution (m); f is the lens focal length (mm); P is the pixel size of the imaging sensor (mm); and H is the UAV's flight height (m);
1.5.2 Compute the actual ground distance of the image diagonal; from the image width w and height h, obtain the ground distance L along the diagonal, where GSD is the ground resolution (m); w is the image width; and h is the image height;
1.5.3 From the longitude and latitude of the image center point and the distance and bearing of the centroid of area P relative to the center point, obtain the geographic coordinates of the centroid of area P, where θ_0 ∈ (0, 2π); Lon_a is the longitude of the image center point; Lat_a is the latitude of the image center point; R_i is the equatorial radius, taken as 6378137 m; and R_j is the polar radius, taken as 6356725 m;
1.5.4 Convert the geographic coordinates to spatial coordinates, obtaining the spatial coordinates (X_c, Y_c, Z_c) of the centroid of area P, where N is the radius of curvature; Lon is the longitude; Lat is the latitude; and H is the elevation;
1.6 Fly the UAV to above the spatial coordinates (X_c, Y_c, Z_c) and land vertically.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811213147.3A CN109460046B (en) | 2018-10-17 | 2018-10-17 | A method for unmanned aerial vehicle natural landmark recognition and autonomous landing |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811213147.3A CN109460046B (en) | 2018-10-17 | 2018-10-17 | A method for unmanned aerial vehicle natural landmark recognition and autonomous landing |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109460046A true CN109460046A (en) | 2019-03-12 |
CN109460046B CN109460046B (en) | 2021-08-06 |
Family
ID=65607782
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811213147.3A Active CN109460046B (en) | 2018-10-17 | 2018-10-17 | A method for unmanned aerial vehicle natural landmark recognition and autonomous landing |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109460046B (en) |
Citations (5)
* Cited by examiner, † Cited by third partyPublication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20100095665A (en) * | 2009-02-12 | 2010-09-01 | 한양대학교 산학협력단 | Automatic landing method, landing apparatus of scanning probe microscope and scanning probe microscope using the same |
CN103424126A (en) * | 2013-08-12 | 2013-12-04 | 西安电子科技大学 | System and method for verifying visual autonomous landing simulation of unmanned aerial vehicle |
CN105000194A (en) * | 2015-08-13 | 2015-10-28 | 史彩成 | UAV (unmanned aerial vehicle) assisted landing visual guiding method and airborne system based on ground cooperative mark |
CN105550994A (en) * | 2016-01-26 | 2016-05-04 | 河海大学 | Satellite image based unmanned aerial vehicle image rapid and approximate splicing method |
CN107063261A (en) * | 2017-03-29 | 2017-08-18 | 东北大学 | The multicharacteristic information terrestrial reference detection method precisely landed for unmanned plane |
-
2018
- 2018-10-17 CN CN201811213147.3A patent/CN109460046B/en active Active
Non-Patent Citations (2)
* Cited by examiner, † Cited by third partyTitle |
---|
Li Yu et al.: "Vision-based landmark recognition method for UAV autonomous landing", Application Research of Computers (《计算机应用研究》) * |
Chen Yong et al.: "Design and research of a novel landmark for UAV autonomous landing", Journal of University of Electronic Science and Technology of China (《电子科技大学学报》) * |
Cited By (11)
* Cited by examiner, † Cited by third partyPublication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110968112A (en) * | 2019-12-12 | 2020-04-07 | 哈尔滨工程大学 | Unmanned aerial vehicle autonomous landing system and method based on monocular vision |
CN111324145A (en) * | 2020-02-28 | 2020-06-23 | 厦门理工学院 | Unmanned aerial vehicle autonomous landing method, device, equipment and storage medium |
CN111324145B (en) * | 2020-02-28 | 2022-08-16 | 厦门理工学院 | Unmanned aerial vehicle autonomous landing method, device, equipment and storage medium |
CN111626260A (en) * | 2020-06-05 | 2020-09-04 | 贵州省草业研究所 | Aerial photo ground object feature point extraction method based on unmanned aerial vehicle remote sensing technology |
CN113920439A (en) * | 2020-07-10 | 2022-01-11 | 千寻位置网络有限公司 | Arrow point extraction method and device |
CN113920439B (en) * | 2020-07-10 | 2024-09-06 | 千寻位置网络有限公司 | Extraction method and device for arrow point |
CN112419374A (en) * | 2020-11-11 | 2021-02-26 | 北京航空航天大学 | A UAV Localization Method Based on Image Registration |
CN112419374B (en) * | 2020-11-11 | 2022-12-27 | 北京航空航天大学 | Unmanned aerial vehicle positioning method based on image registration |
CN115526896A (en) * | 2021-07-19 | 2022-12-27 | 中核利华消防工程有限公司 | Fire prevention and control method and device, electronic equipment and readable storage medium |
CN114998773A (en) * | 2022-08-08 | 2022-09-02 | 四川腾盾科技有限公司 | Feature mismatch elimination method and system for aerial images of unmanned aerial vehicle systems |
CN115237158A (en) * | 2022-08-17 | 2022-10-25 | 吉林大学 | Multi-rotor unmanned aerial vehicle autonomous tracking and landing control system and control method |
Also Published As
Publication number | Publication date |
---|---|
CN109460046B (en) | 2021-08-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109460046A (en) | 2019-03-12 | Natural landmark recognition and autonomous landing method for unmanned aerial vehicles |
CN110569838B (en) | 2022-05-24 | An autonomous landing method of quadrotor UAV based on visual positioning |
CN106054929B (en) | 2018-10-16 | Optical flow-based automatic landing guidance method for unmanned aerial vehicles |
Gui et al. | 2013 | Airborne vision-based navigation method for UAV accuracy landing using infrared lamps |
CN107544550B (en) | 2021-01-15 | Unmanned aerial vehicle automatic landing method based on visual guidance |
CN105021184B (en) | 2017-10-24 | Pose estimation system and method for vision-based carrier landing navigation on mobile platforms |
US20200344464A1 (en) | 2020-10-29 | Systems and Methods for Improving Performance of a Robotic Vehicle by Managing On-board Camera Defects |
CN108305264A (en) | 2018-07-20 | Unmanned aerial vehicle precision landing method based on image processing |
US20190068829A1 (en) | 2019-02-28 | Systems and Methods for Improving Performance of a Robotic Vehicle by Managing On-board Camera Obstructions |
CN103822635A (en) | 2014-05-28 | Real-time calculation method for the spatial position of a flying unmanned aircraft based on visual information |
Martínez et al. | 2011 | On-board and ground visual pose estimation techniques for UAV control |
CN109949361A (en) | 2019-06-28 | An Attitude Estimation Method for Rotor UAV Based on Monocular Vision Positioning |
CN101109640A (en) | 2008-01-23 | Vision-based autonomous landing navigation system for unmanned aircraft |
CN105197252A (en) | 2015-12-30 | Small-size unmanned aerial vehicle landing method and system |
Choi et al. | 2020 | BRM localization: UAV localization in GNSS-denied environments based on matching of numerical map and UAV images |
KR102289752B1 (en) | 2021-08-13 | A drone for performring route flight in gps blocked area and methed therefor |
CN111024072B (en) | 2021-06-11 | Satellite map aided navigation positioning method based on deep learning |
CN109341686B (en) | 2023-10-27 | Aircraft landing pose estimation method based on visual-inertial tight coupling |
CN113406975B (en) | 2021-11-30 | Bionic intelligent multi-unmanned aerial vehicle cluster autonomous formation navigation control method and device |
CN110081875B (en) | 2020-10-09 | A pigeon-like intelligent unmanned aerial vehicle autonomous navigation system and method |
CN110058604A (en) | 2019-07-26 | Precision landing system for unmanned aerial vehicles based on computer vision |
CN105405126A (en) | 2016-03-16 | Multi-scale air-ground parameter automatic calibration method based on monocular vision system |
CN114815871A (en) | 2022-07-29 | A vision-based autonomous landing method for vertical take-off and landing UAV mobile platforms |
CN107424156B (en) | 2019-12-06 | Accurate measurement method for autonomous formation of unmanned aerial vehicles based on visual attention imitating barn owl eyes |
Dumble et al. | 2015 | Airborne vision-aided navigation using road intersection features |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
2019-03-12 | PB01 | Publication | |
2019-04-05 | SE01 | Entry into force of request for substantive examination | |
2021-08-06 | GR01 | Patent grant | |