CN115655157A - Leaf Area Index Calculation Method Based on Fisheye Image - Google Patents


Info

Publication number
CN115655157A
Authority
CN
China
Prior art keywords
image
leaf area
canopy
fisheye
lai
Prior art date
2022-10-21
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211290462.2A
Other languages
Chinese (zh)
Inventor
查元源
张宇凡
汪家伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuhan University WHU
Original Assignee
Wuhan University WHU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
2022-10-21
Filing date
2022-10-21
Publication date
2023-01-31
2022-10-21 Application filed by Wuhan University WHU
2022-10-21 Priority to CN202211290462.2A
2023-01-31 Publication of CN115655157A
Status: Pending


Abstract

The invention discloses a fisheye-image-based method for measuring the leaf area index (LAI) of farmland: a smartphone fitted with an ultra-wide-angle lens photographs fisheye images of the plant canopy, and the images are processed with machine-vision techniques. The key design points are applying the smartphone-plus-fisheye-camera shooting mode to crop canopies, processing the fisheye images efficiently, and deriving a simple, fast difference-based LAI calculation from the definition and integral formula of LAI. The method was tested and applied in multiple field and experimental scenarios, and its accuracy was verified against the measurements of traditional instruments and professional software. The results show that the device and algorithm adapt effectively to LAI measurement tasks in different scenarios such as sunny and rainy days, achieve good accuracy and stability in both paddy fields and dry land, and generalize well to different agricultural monitoring tasks.

Description

Leaf Area Index Calculation Method Based on Fisheye Images

Technical Field

The invention belongs to the technical field of crop phenotype monitoring. It specifically concerns a method in which a smartphone equipped with an ultra-wide-angle camera captures fisheye images of the plant canopy, processes the images with machine-vision techniques, and calculates the crop leaf area index.

Background Art

The leaf area index (LAI) of a vegetation canopy is an important reference for studying plant growth state and physiological characteristics: it reflects both the collective morphological structure of the vegetation and the canopy's response to changes in the external environment. Accurate LAI measurement is therefore important for evaluating crop growth, identifying environmental stress in farmland, and exploring the laws of plant growth.

LAI is numerically defined as the integral of leaf area density over canopy depth (Equation 1), i.e., the distribution of leaves on horizontal planes at different heights; conceptually it equals the total upper (light-receiving) surface area of all leaves per unit land area.

\mathrm{LAI} = \int_0^H l(h)\,dh    (1)

where H is the specified canopy height and l(h) is the leaf area density function at height h.

Traditional LAI measurement methods include destructive sampling, litterfall collection, and inclined point quadrats. These methods require manual collection of crop leaves, involve heavy workloads, and damage the plants, so continuous observations of the same vegetated area cannot be obtained. Among indirect techniques, optical instruments are widely used; common devices include the LI-COR canopy analyzer, the TRAC instrument, and AccuPAR equipment. They are non-destructive, convenient, and efficient, and are widely applied in ground-based LAI measurement practice. However, mature measuring instruments still have several practical problems: (1) because of the low resolution of the instrument lens and its non-adjustable aperture, measurements can only be made when the solar altitude angle is below 15-20°, so the daily window for measurement work is limited; (2) instruments designed on optical principles must be inserted below the crop canopy and aimed horizontally toward the sky, so they cannot measure low-growing vegetation or seedling-stage crops (the canopy is not deep enough for the instrument to enter); (3) mainstream LAI instruments are expensive, rely on imports, and require professional operators, making them difficult to popularize in the field.

Summary of the Invention

Addressing the problems of the prior art, the present invention provides a fisheye-image-based leaf area index calculation method: a smartphone equipped with an ultra-wide-angle lens captures fisheye images of the plant canopy, and the farmland leaf area index is calculated with machine-vision techniques.

The key design points of the invention are applying the phone-plus-fisheye-camera shooting mode to crop canopies, processing the fisheye images efficiently, and using the definition and integral formula of LAI to design a simple, fast difference-based LAI calculation. Based on the definition of LAI and optical measurement principles, the invention designs an extended shooting method in which a smartphone carries a fisheye lens: the phone's rear camera and a lightweight external lens form a convenient fisheye image acquisition device, and machine-vision algorithms optimize the image processing so that the phone can monitor the leaf area index of farmland crops in real time. The method suits a variety of scenarios, offers high accuracy, strong stability, and low cost, and is significant for large-scale, high-throughput crop phenotype monitoring and growth evaluation.

The technical solution adopted by the present invention is a fisheye-image-based leaf area index calculation method comprising the following steps:

Step 1: collect fisheye images of different types of canopies;

Step 2: process the collected fisheye images with image-processing techniques to reduce background noise and obtain a distortion-free, computable pixel matrix;

Step 3: divide the processed fisheye image into (in the limit, infinitely many) concentric rings, where the pixels in each ring lie approximately on a plane at the same height; compute the vegetation contact frequency in each ring and integrate to obtain the leaf area index value.

Further, in step 1 the fisheye images of different canopy types are collected with a smartphone and a fisheye lens. The smartphone is first paired with the shooting pole via Bluetooth, and the fisheye lens is aligned with and fixed over the main rear camera, forming a simple measuring rod. After the lens is mounted, the camera angle is chosen according to the type and development stage of the plants being photographed; during shooting, keep away from canopy edges with uneven growth and hold the camera level.

Further, the imaging plane of the fisheye lens is a circle inscribed in the camera sensor. Establishing a coordinate system with the circle center as the origin, the view angle θ at each point of the image can be computed from Equation 2:

\theta = \frac{\pi}{2}\cdot\frac{\sqrt{x^2+y^2}}{R}    (2)

where x and y are the pixel coordinates and R is the radius of the image.
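The mapping of Equation 2 can be sketched in a few lines of NumPy. The equidistant projection (view angle growing linearly with radial distance and reaching 90° at the image rim) and the 1920×1080 resolution are assumptions for illustration; `view_angle_map` is not a function from the patent.

```python
import numpy as np

def view_angle_map(height, width):
    """Per-pixel view zenith angle (radians) for an equidistant fisheye image.

    Assumes the image circle is inscribed in the sensor (radius R = height/2)
    and the equidistant projection of Equation 2: theta = (pi/2) * r / R.
    """
    R = height / 2.0
    cy, cx = (height - 1) / 2.0, (width - 1) / 2.0
    y, x = np.mgrid[0:height, 0:width]
    r = np.hypot(x - cx, y - cy)            # radial distance from image center
    return (np.pi / 2.0) * (r / R)          # values above pi/2 lie outside the image circle

theta = view_angle_map(1080, 1920)          # center pixels map to theta near 0
```

Pixels on the inscribed circle (r = R) map to 90°, matching the 180° hemispherical field of view.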

Further, step 2 is implemented as follows:

(21) Down-sample the image to proportionally reduce its resolution and speed up processing, and crop the invalid regions around the image with an image-cropping package;

(22) Classify vegetation pixels with thresholds in the HSV color space and extract the vegetation fraction and gap fraction in each ring; different segmentation strategies are used depending on the shooting angle. The RGB image is converted to HSV, upper and lower thresholds are set, and pixels outside the bounds are classified as non-vegetation and excluded from the contact-fraction calculation. When the image is taken upward from inside the canopy, the sky pixels are the segmentation target, and the vegetation pixels are obtained after removing them; when the image is taken looking down, the vegetation itself is segmented, giving the vegetation cover directly.
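The HSV threshold classification described in (22) can be sketched with Pillow and NumPy as follows. The hue/saturation/value bounds and the function name `vegetation_mask` are illustrative assumptions, not the patent's calibrated thresholds.

```python
import numpy as np
from PIL import Image

def vegetation_mask(img, h_lo=60, h_hi=110, s_lo=40, v_lo=30):
    """Vegetation pixel classification by HSV thresholds.

    Pillow's HSV mode scales hue, saturation, and value to 0-255, so green
    hues fall roughly in H = 60-110 on that scale. Pixels outside the
    bounds are classified as non-vegetation.
    """
    hsv = np.asarray(img.convert("HSV"))
    h, s, v = hsv[..., 0], hsv[..., 1], hsv[..., 2]
    return (h >= h_lo) & (h <= h_hi) & (s >= s_lo) & (v >= v_lo)

# Downward shots use the mask directly as vegetation cover:
green_tile = Image.new("RGB", (4, 4), (40, 160, 40))   # synthetic all-green tile
cover = vegetation_mask(green_tile).mean()             # fraction of vegetation pixels
```

For an upward shot, the thresholds would instead target sky pixels (high value, low saturation), and the vegetation fraction is one minus the sky fraction.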

Further, in step 3 the image is divided into annular regions with the PIL package, giving the central circle covering observation zenith angles of 0-15° and concentric rings covering 15-30°, 30-45°, 45-60°, and 60-75°. Because the region beyond 75° is close to the sampling edge, heavily distorted, and noisy, only the part within 75° is used.

Further, step 3 is implemented as follows:

According to the concept of LAI and the definition in Equation (1), a Poisson calculation model based on contact frequency and gap fraction is adopted. The contact frequency, introduced by Warren Wilson, is the probability that sunlight entering the canopy makes contact with a vegetation element inside it, while the gap fraction is the probability that a natural light beam reaches the reference plane directly. Under the assumption that leaves are opaque, the leaf cover measured in image analysis is the one-directional contact fraction:

\mathrm{LAI} = \int_0^H l(h)\,dh    (1)

where H is the canopy height and l(h) is the leaf area density function at height h.

At canopy height h, the average contact fraction is the integral over plant height of the one-directional contact fraction of each leaf layer, computed as:

\bar{N}(h;\theta_v,\varphi_v) = \frac{1}{\cos\theta_v}\int_0^{h} G(\theta_v,\varphi_v,z)\,L(z)\,dz    (3)

where H is the canopy height; L(h) is the leaf area density of the layer at each height h of the canopy, i.e., the leaf area per unit canopy volume; (\theta_v,\varphi_v) gives the direction vector of the observation, with \theta_v the zenith angle of the observation direction and \varphi_v its azimuth; and G is the leaf area projection function at height h. Substituting Equation (1) gives:

\bar{N}(\theta_v,\varphi_v) = \frac{G(\theta_v,\varphi_v)\,\mathrm{LAI}}{\cos\theta_v}    (4)

Equation (4) expresses the correlation between LAI and the contact frequency, where G is obtained by combining Equations (5) and (6):

G(\theta_v,\varphi_v) = \frac{1}{2\pi}\int_0^{2\pi}\!\int_0^{\pi/2} g(\theta_l,\varphi_l)\,\lvert\cos\psi\rvert\,\sin\theta_l\,d\theta_l\,d\varphi_l    (5)

\cos\psi = \cos\theta_v\cos\theta_l + \sin\theta_v\sin\theta_l\cos(\varphi_v-\varphi_l)    (6)

Here g(\theta_l,\varphi_l) is the probability density function of the leaf inclination angle distribution model, where \theta_l is the zenith angle of the leaf inclination direction and \varphi_l its azimuth, subject to the normalization constraints of Equations (7) and (8) on the zenith and azimuth components of the distribution:

\int_0^{\pi/2} g(\theta_l)\,\sin\theta_l\,d\theta_l = 1    (7)

\frac{1}{2\pi}\int_0^{2\pi} g(\varphi_l)\,d\varphi_l = 1    (8)

Combining the formulas above, the relationship among the canopy gap fraction P(\theta_v,\varphi_v), the average contact fraction \bar{N}(\theta_v,\varphi_v), and LAI was optimized by Nilson into the exponential relation of Equation (9):

P(\theta_v,\varphi_v) = e^{-\bar{N}(\theta_v,\varphi_v)} = \exp\!\left(-\frac{G(\theta_v,\varphi_v)\,\mathrm{LAI}}{\cos\theta_v}\right)    (9)

where G(\theta_v,\varphi_v) is the same projection function as in Equation (4). Based on the circular field of view of the fisheye image, the azimuth of the incident light is not considered, and the gap fraction measurement is assumed to depend only on the observation zenith angle \theta_v, i.e., the angle between the incident direction of the light and the normal vector of the light sensor at the bottom of the canopy. The formula for the calculated leaf area index LAI_cal can then be arranged as:

\mathrm{LAI}_{cal} = 2\int_0^{\pi/2} -\ln P(\theta_v)\,\cos\theta_v\,\sin\theta_v\,d\theta_v    (10)

Welles proposed a discrete numerical solution of the integral in Equation (10) based on multi-angle observation: several zenith observation angles divide the image into rings, the average vegetation gap fraction \bar{P}(\theta_i) is extracted within each ring, and the integral of Equation (10) is replaced by the difference form:

\mathrm{LAI}_{cal} = 2\sum_{i} \frac{-\ln \bar{P}(\theta_i)}{S_i(\theta_i)}\,W_i    (11)

where S_i(\theta_v) is 1/\cos\theta_v and W_i is \sin\theta_v\,d\theta; i indexes the ring, and the coefficients differ with the ring angles chosen.

Further, five zenith observation angle ranges are used to divide the rings: 0-15°, 15-30°, 30-45°, 45-60°, and 60-75°, corresponding to i = 1 through i = 5.
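The difference calculation over these five rings can be sketched as below. The midpoint quadrature (ring centers at 7.5°, 22.5°, 37.5°, 52.5°, 67.5° with a 15° width) is an assumption made for illustration, not the patent's exact coefficients.

```python
import math

def lai_from_gap_fractions(gap_fractions):
    """Discrete difference form of the LAI gap-fraction integral.

    Evaluates LAI = 2 * sum_i (-ln P_i) * cos(theta_i) * sin(theta_i) * dtheta,
    i.e., the discretized inversion with path length S_i = 1/cos(theta_i)
    and weight W_i = sin(theta_i) * dtheta, at the five ring midpoints.
    """
    dtheta = math.radians(15.0)
    lai = 0.0
    for i, p in enumerate(gap_fractions):
        th = math.radians(7.5 + 15.0 * i)            # ring midpoint zenith angle
        lai += 2.0 * (-math.log(p)) * math.cos(th) * math.sin(th) * dtheta
    return lai

# A canopy with no vegetation (every ring fully open) yields LAI = 0.
lai_open = lai_from_gap_fractions([1.0, 1.0, 1.0, 1.0, 1.0])
```

Smaller gap fractions (denser canopy) give larger LAI, as expected from the negative logarithm.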

The present invention is an applied development built on years of research into non-destructive leaf area index monitoring in the fields of plant phenotyping and optical remote sensing. Compared with other instruments and software of the same category, it offers several new advances and advantages:

(1) Using a smartphone with a mounted lens, combined with an extension pole for shooting, simplifies the measurement operation, streamlines data storage and transmission through the smartphone, and significantly reduces cost, making the method suitable for promotion to the field-management level.

(2) Image processing and integral calculation are implemented in Python; the back-end code is simple to write and easy to modify, suits both single-image debugging and batch processing of large image sets, reduces the influence of weather and illumination on gap-fraction measurement through preprocessing, and provides full visualization for running and debugging.

(3) Both downward-looking and upward-looking viewing angles are designed for different crops and field scenes, remedying the inability of LAI instruments to monitor low canopies and removing the restriction that instruments cannot measure under strong light.

(4) By reading the EXIF information (position, time) of the phone image, the algorithm adds a real-time solar zenith angle calculation function that can correct the LAI results.

(5) During the development of the invention, large-scale sampling experiments were carried out, and comparison experiments against traditional methods and common instruments were conducted in fields of common crops such as rice, corn, and wheat, verifying the practicality and accuracy of the invention.
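The real-time solar zenith angle mentioned in point (4) can be approximated from the EXIF position and timestamp with standard low-order formulas. The sketch below uses Cooper's declination approximation and a simple equation of time, which are generic textbook expressions assumed here for illustration, not the patent's own correction routine; the coordinates and timestamp are made up.

```python
import math
from datetime import datetime, timezone

def solar_zenith_deg(lat_deg, lon_deg, when_utc):
    """Approximate solar zenith angle (degrees) at a location and UTC time."""
    doy = when_utc.timetuple().tm_yday
    frac_hour = when_utc.hour + when_utc.minute / 60.0 + when_utc.second / 3600.0
    # Solar declination (Cooper's approximation)
    decl = math.radians(23.45) * math.sin(math.radians(360.0 * (284 + doy) / 365.0))
    # Equation of time, in minutes
    b = math.radians(360.0 * (doy - 81) / 364.0)
    eot = 9.87 * math.sin(2 * b) - 7.53 * math.cos(b) - 1.5 * math.sin(b)
    # Local solar time and hour angle
    solar_time = frac_hour + lon_deg / 15.0 + eot / 60.0
    hour_angle = math.radians(15.0 * (solar_time - 12.0))
    lat = math.radians(lat_deg)
    cos_zen = (math.sin(lat) * math.sin(decl)
               + math.cos(lat) * math.cos(decl) * math.cos(hour_angle))
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_zen))))

# 04:00 UTC at ~114.3E is close to local solar noon; near the June solstice
# at ~30.5N the sun is high, so the zenith angle is small.
z = solar_zenith_deg(30.5, 114.3, datetime(2022, 6, 21, 4, 0, tzinfo=timezone.utc))
```

The position and capture time would come from the image's EXIF GPS and timestamp tags in an actual pipeline.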

Brief Description of the Drawings

Figure 1: Schematic diagram of the fisheye-lens imaging principle;

Figure 2: Example combination of phone + fisheye lens + extension shooting pole;

Figure 3: Canopy fisheye image acquisition method and workflow;

Figure 4: Definition of the observation zenith angle and ring-division calculation of the fisheye image;

Figure 5: Fisheye image processing and calculation workflow;

Figure 6: Accuracy verification of upward-shot rice test results (in the right panel, the abscissa is the LAI computed by the fisheye-camera algorithm and the ordinate is the LAI measured by the instrument);

Figure 7: Verification of upward-shot rice test results under different scenarios (weather) (axes as in Figure 6);

Figure 8: Verification of downward-shot rice test results (abscissa: LAI computed by the fisheye-camera algorithm; ordinate: LAI measured by professional software);

Figure 9: Verification of downward-shot wheat test results at different growth stages (abscissa: LAI computed by the fisheye-camera algorithm; ordinate: LAI measured by the oblique photography method);

Figure 10: Verification of downward-shot wheat test results (axes as in Figure 9).

Detailed Description of the Embodiments

The technical solution of the present invention is further described below with reference to the figures and embodiments. The fisheye-image-based leaf area index (LAI) calculation method provided by the invention comprises the following steps:

Step 1: collect fisheye images of different types of canopies based on fisheye imaging;

Mainstream optical instruments measure LAI by measuring the degree of light transmission inside the canopy; the wider the field of view, the greater the range of canopy depth that can be captured and the more accurate the estimated LAI. The photographic fish-eye lens can be applied to this scene, which led to the digital hemispherical photography (DHP) method for the leaf area index. Unlike an ordinary camera lens, a fisheye lens has a shorter focal length and a much wider field of view: external light is strongly refracted at the convex front element, so even incident rays at a very small angle to the camera plane fall on the imaging sensor, yielding a 180° hemispherical image. The imaging plane is a circle inscribed in the camera sensor; establishing a coordinate system with the circle center as the origin, the view angle θ at each point (pixel) of the image can be computed from Equation 2 (Figure 1):

\theta = \frac{\pi}{2}\cdot\frac{\sqrt{x^2+y^2}}{R}    (2)

where x and y are the pixel coordinates and R is the radius of the image (half the height of the camera sensor; e.g., an image with resolution 1920×1080 has height 1080 and radius 540).

In the embodiment of the present invention, fisheye images of different canopy types are collected with a smartphone and a fisheye lens. The shooting function and resolution of a smartphone satisfy the requirements of canopy image acquisition. The fisheye lens is mounted and the extension pole assembled as shown in Figure 2: the phone is first paired with the shooting pole via Bluetooth, then the fisheye lens is aligned with and fixed over the main rear camera, forming a simple measuring rod. This assembly effectively keeps road surfaces, people, and other noise out of the wide-angle field of view and reduces the influence of edge effects on the calculation results. After the lens is mounted, fisheye images of different canopy types are collected following the workflow in Figure 3. The camera angle is first chosen according to the type and development stage of the plants being photographed; Figure 3 takes corn at the tasseling stage (upward shot inside the canopy) and rice at the seedling stage (downward shot above the canopy) as examples. During shooting, keep away from canopy edges with uneven growth and hold the camera level.

Step 2: process the collected fisheye images with image-processing techniques to reduce background noise and obtain a distortion-free, computable pixel matrix;

After the fisheye images are collected, they must be processed into a computable form; machine-vision image processing implements the chain of operations and calculations from image to LAI value. Image preprocessing uses vision algorithms to adjust and equalize attributes of the raw image such as color space, perceived brightness, and topological structure. Its purpose is to emphasize the reference subject (vegetation), reduce noise, and eliminate problems such as inconsistent image brightness caused by changing ambient light and camera aperture adjustment; standardizing the color space and parameterizing brightness and gamut improve the accuracy and speed of subsequent visual-feature extraction and calculation. Subject segmentation is a key part of image processing: a canopy fisheye image divides into foreground and background, and segmentation extracts the plant pixels from the background. Threshold-based segmentation is a region-parallel technique that divides the gray-level histogram of the image into several classes with one or more set thresholds, treating pixels whose gray levels fall in the same interval as one class of object; working directly on gray-level properties makes this class of methods computationally efficient and widely applicable. Image post-processing performs pixel statistics and geometric transformation on the vegetation part after background removal, extracts the canopy gap fraction, and finally computes the LAI.

(21) After image acquisition, the image is preprocessed: the resolution is reduced proportionally by downsampling to speed up processing, and the invalid regions around the image are cropped with an image-cropping package. The fisheye image is the circular region inscribed in the camera's rectangular sensor, so the occluded, invalid parts at the four corners must be trimmed away.
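The preprocessing in (21) can be sketched with plain NumPy; the function names and the stride-based downsampling below are illustrative choices, not the patent's actual implementation:

```python
import numpy as np

def downsample(img: np.ndarray, factor: int) -> np.ndarray:
    """Reduce the resolution proportionally by keeping every `factor`-th pixel."""
    return img[::factor, ::factor]

def mask_corners(img: np.ndarray) -> np.ndarray:
    """Zero out the occluded corners outside the circle inscribed in the frame."""
    h, w = img.shape[:2]
    radius = min(h, w) / 2.0
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    yy, xx = np.mgrid[0:h, 0:w]
    inside = (yy - cy) ** 2 + (xx - cx) ** 2 <= radius ** 2
    if img.ndim == 3:  # broadcast the mask over color channels
        inside = inside[..., None]
    return np.where(inside, img, 0)
```

In practice the corner masking would be applied once, before any per-ring statistics, so that the occluded pixels never enter the gap-fraction counts.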

(22) Vegetation pixels are classified with HSV color-space thresholds to extract the vegetation proportion and gap fraction of the image; the segmentation strategy depends on the shooting angle. The HSV color space (Hue, Saturation, Value) separates brightness from color, making it easier to perceive differences in color and brightness between objects in the image. After the RGB image is converted to HSV by an algorithm, upper and lower thresholds are set; pixels outside the limits are classified as non-vegetation pixels and excluded from the contact-fraction calculation. When the image is taken upward from inside the canopy, the sky pixels are the segmentation subject and the vegetation pixels are obtained by removing them; when the image is taken looking down, the vegetation itself is the segmentation subject and the vegetation coverage is obtained directly.
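A minimal NumPy sketch of the HSV thresholding in (22). The RGB-to-HSV conversion follows the standard hexcone formulas; the hue and saturation bounds are illustrative placeholders, not the calibrated thresholds of the patent:

```python
import numpy as np

def rgb_to_hsv(rgb: np.ndarray) -> np.ndarray:
    """Vectorised RGB -> HSV; input floats in [0, 1], hue in degrees [0, 360)."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    v = rgb.max(axis=-1)
    c = v - rgb.min(axis=-1)                      # chroma
    s = np.where(v > 0, c / np.where(v > 0, v, 1), 0)
    safe_c = np.where(c > 0, c, 1)
    hue = np.zeros_like(v)
    # Hue sector depends on which channel holds the maximum.
    hue = np.where(v == r, (g - b) / safe_c % 6, hue)
    hue = np.where(v == g, (b - r) / safe_c + 2, hue)
    hue = np.where(v == b, (r - g) / safe_c + 4, hue)
    hue = np.where(c > 0, hue * 60.0, 0.0)
    return np.stack([hue, s, v], axis=-1)

def vegetation_mask(rgb, h_range=(60.0, 180.0), s_min=0.15):
    """Classify pixels as vegetation when hue/saturation fall inside the limits.
    The bounds here are placeholders for the thresholds set in step (22)."""
    hsv = rgb_to_hsv(rgb)
    return ((hsv[..., 0] >= h_range[0]) & (hsv[..., 0] <= h_range[1])
            & (hsv[..., 1] >= s_min))
```

Averaging `vegetation_mask(...)` over a ring then gives the vegetation pixel fraction, and its complement the gap fraction of that ring.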

(23) The image is partitioned into annular regions with the PIL package, yielding a central circle covering observation zenith angles of 0-15° and concentric rings covering 15-30°, 30-45°, 45-60° and 60-75°. The region beyond 75° lies near the sampling edge, where distortion is severe and noise is high, so only the part within 75° is calculated.
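The annular partition of (23) can be expressed as boolean masks, assuming the fisheye mapping sin θ_v = r/R that the description uses for the observation zenith angle:

```python
import numpy as np

def ring_masks(shape, bounds_deg=(0, 15, 30, 45, 60, 75)):
    """Boolean masks for the concentric zenith-angle rings of a fisheye image.

    Uses sin(theta_v) = r / R, where r is the pixel distance to the image
    centre and R the radius of the fisheye circle.
    """
    h, w = shape
    R = min(h, w) / 2.0
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    yy, xx = np.mgrid[0:h, 0:w]
    r = np.hypot(yy - cy, xx - cx)
    theta = np.degrees(np.arcsin(np.clip(r / R, 0.0, 1.0)))
    return [(theta >= lo) & (theta < hi)
            for lo, hi in zip(bounds_deg[:-1], bounds_deg[1:])]
```

Pixels with θ_v ≥ 75° fall in no mask, which implements the exclusion of the distorted edge region.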

The segmentation operation described in (22) is applied to each ring obtained in (23), yielding the vegetation/non-vegetation pixel classification of every ring; finally, the per-ring results, including the vegetation pixel proportion and the directional gap fraction, are combined in equation (11) to compute the value of LAI.

Step 3: divide the HSV-segmented fisheye image into (notionally infinitely many) rings; the vegetation elements corresponding to the pixels of each ring lie approximately in a plane at the same height. From the per-ring vegetation/non-vegetation classification, compute the vegetation contact frequency (coverage) in each ring and integrate to obtain the value of LAI.

LAI is a variable describing the leaf-area density of the canopy at a given height. Starting from the concept of LAI and the definition in equation (1), researchers proposed a Poisson calculation model based on contact frequency and gap fraction. The contact frequency, introduced by Warren Wilson, is the probability that sunlight entering the canopy touches a vegetation element within it; the gap fraction is the probability that a natural light beam reaches the reference plane directly. Under the assumption that leaves are opaque, the leaf coverage (ratio) measured in image analysis is the directional contact fraction. For a canopy of height H, the average contact fraction is the plant-height integral of the directional contact fraction of each leaf layer, computed as:

$$\bar N(\theta_v,\varphi_v)=\frac{1}{\cos\theta_v}\int_0^H G(h,\theta_v,\varphi_v)\,L(h)\,\mathrm{d}h\tag{2}$$

In the formula, $H$ is the canopy height; $L(h)$ is the leaf-area density of the layer at height $h$, i.e. the leaf area per unit canopy volume; $\vec\Omega_v(\theta_v,\varphi_v)$ is the direction vector of the observation position ($v$ for view), with $\theta_v$ the zenith angle and $\varphi_v$ the azimuth of the observation direction; and $G(h,\theta_v,\varphi_v)$ is the leaf-area projection function at height $h$. Substituting equation (1) gives:

$$\bar N(\theta_v,\varphi_v)=\frac{G(\theta_v,\varphi_v)}{\cos\theta_v}\,\mathrm{LAI}\tag{3}$$

Equation (4) expresses the dependence of LAI on the contact frequency:

$$\mathrm{LAI}=\frac{\bar N(\theta_v,\varphi_v)\cos\theta_v}{G(\theta_v,\varphi_v)}\tag{4}$$

where the projection function $G$ is given by the simultaneous equations (5) and (6):

$$G(\theta_v,\varphi_v)=\frac{1}{2\pi}\int_0^{2\pi}\!\!\int_0^{\pi/2} g(\theta_l,\varphi_l)\,\lvert\cos\gamma\rvert\,\sin\theta_l\,\mathrm{d}\theta_l\,\mathrm{d}\varphi_l\tag{5}$$

$$\cos\gamma=\cos\theta_v\cos\theta_l+\sin\theta_v\sin\theta_l\cos(\varphi_v-\varphi_l)\tag{6}$$

Here $g(\theta_l,\varphi_l)$ ($l$ for leaf) is introduced as the probability density function of the leaf inclination-angle (zenith angle, azimuth angle) distribution model, where $\theta_l$ is the zenith angle of the leaf inclination direction and $\varphi_l$ is the azimuth angle of the leaf inclination; it is subject to the normalization constraints of equations (7) and (8), the second being the azimuthally symmetric form:

$$\frac{1}{2\pi}\int_0^{2\pi}\!\!\int_0^{\pi/2} g(\theta_l,\varphi_l)\,\sin\theta_l\,\mathrm{d}\theta_l\,\mathrm{d}\varphi_l=1\tag{7}$$

$$\int_0^{\pi/2} g(\theta_l)\,\sin\theta_l\,\mathrm{d}\theta_l=1\tag{8}$$

Combining the formulas above yields the relationship among the canopy gap fraction $P_0(\theta_v,\varphi_v)$, the average contact fraction $\bar N(\theta_v,\varphi_v)$, and the LAI, optimized by Nilson into the exponential relation of equation (9):

$$P_0(\theta_v,\varphi_v)=\exp\bigl(-\bar N(\theta_v,\varphi_v)\bigr)=\exp\!\Bigl(-\frac{G(\theta_v,\varphi_v)\,\mathrm{LAI}}{\cos\theta_v}\Bigr)\tag{9}$$

Here, owing to the circular field of view of the fisheye image, $P_0(\theta_v,\varphi_v)$ is identical to its azimuth-independent form $P_0(\theta_v)$: the azimuth of the incident light can be ignored, and the gap-fraction measurement is assumed to depend only on the observation zenith angle $\theta_v$, i.e. the angle between the incident direction of the light and the normal vector of the light sensor at the bottom of the canopy. The computed leaf area index $\mathrm{LAI}_{cal}$ of this example can then be written as:

$$\mathrm{LAI}_{cal}=2\int_0^{\pi/2} -\ln P_0(\theta_v)\,\cos\theta_v\,\sin\theta_v\,\mathrm{d}\theta_v\tag{10}$$

$P_0(\theta_v)$ is identified with the per-ring average gap fraction $\bar P_0(\theta_{v,i})$ measured from the segmented image. Welles proposed a discrete numerical solution of the integral (10) based on multi-view observation: five zenith observation angles divide the image into rings, the average vegetation gap fraction $\bar P_0(\theta_{v,i})$ is obtained in each ring, and the integral (10) is replaced by the difference form:

$$\mathrm{LAI}_{cal}=2\sum_{i=1}^{5}\frac{-\ln \bar P_0(\theta_{v,i})}{S_i(\theta_v)}\,W_i\tag{11}$$

In the formula, $S_i(\theta_v)=(\cos\theta_{v,i})^{-1}$ and $W_i=\sin\theta_v\,\mathrm{d}\theta$; the coefficients differ according to the ring angles chosen. The invention computes the LAI_cal value from the derivation above and the difference formula (11); Figure 4 illustrates the definition of the observation zenith angle and the principle of the ring-division calculation. Taking a point q at canopy height h as an example: if q lies at distance r_q from the image center (the zenith) in the fisheye image and the full circle has radius R, then the observation zenith angle of q satisfies sin θ_v = r_q/R, from which the values of W_i and S_i(θ_v) at each point can be computed. The zenith observation angles used to divide the rings are 0-15°, 15-30°, 30-45°, 45-60° and 60-75°, corresponding to i = 1-5.
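The ring-wise computation can be sketched as follows. The midpoint angles follow the five bands above; normalising the weights W_i so they sum to 1, to compensate for truncating the integral at 75°, is our assumption (in the spirit of LAI-2000-style weighting), not something stated here:

```python
import math

# Ring midpoints (degrees) for the five zenith bands 0-15, ..., 60-75.
THETA_MID = [7.5, 22.5, 37.5, 52.5, 67.5]

def lai_from_gap_fractions(gap_fractions):
    """Discrete form of the Miller integral, eq. (11):
    LAI = 2 * sum_i [-ln(P0_i) / S_i] * W_i,
    with S_i = 1/cos(theta_i) and W_i proportional to sin(theta_i) * dtheta.
    The W_i are normalised to sum to 1 (our assumption, to offset the
    truncation of the integral at 75 degrees).
    """
    theta = [math.radians(t) for t in THETA_MID]
    dtheta = math.radians(15.0)
    w_raw = [math.sin(t) * dtheta for t in theta]
    w = [x / sum(w_raw) for x in w_raw]       # normalised weights W_i
    s = [1.0 / math.cos(t) for t in theta]    # slant path lengths S_i
    return 2.0 * sum(-math.log(p) / si * wi
                     for p, si, wi in zip(gap_fractions, s, w))
```

As a sanity check, for a homogeneous canopy with a spherical leaf-angle distribution (G = 0.5), feeding the gap fractions P_0(θ) = exp(-0.5·LAI/cos θ) into this discretisation recovers the original LAI exactly.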

Throughout the design of the invention, fisheye images of field crop canopies were repeatedly collected alongside instrument LAI measurements, in order to verify the accuracy of the image-processing method against references and to test its applicability in different scenarios. The test crops include rice and wheat; the test scenes include experimental plots and working farmland; the weather conditions include sunrise, sunset, sunny and overcast days; and image-acquisition tests were also carried out at different crop growth stages.

The results of the method are tested below with designed field experiments. Table 1 lists the details of the verification experiments, the way the invention was applied in each scenario, and the verification method. After selecting sampling points in the experimental plots of the relevant crops or in working farmland, a fisheye camera and a professional LAI instrument were carried to each point for measurement. For upward shots, the camera was held at the same angle as the instrument and the same position inside the canopy was measured; the instrument sampled four points from left to right whose LAI was averaged, while the camera captured 6-10 fisheye images for subsequent processing. For downward shots the camera has a larger field of view, so the results were additionally verified against the measurements of professional software. In the verification experiments, the instrument was an LI-COR LAI-2200 canopy analyzer, and the professional software was Can-Eye, developed by the French agricultural research institute on the basis of MCR.
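The agreement statistics reported below (correlation coefficient R and root-mean-square error RMSE) follow the usual definitions; that Pearson's R is the coefficient meant here is our assumption:

```python
import math

def correlation_r(x, y):
    """Pearson correlation coefficient between calculated and measured LAI."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def rmse(x, y):
    """Root-mean-square error between the two series."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(x, y)) / len(x))
```

Applied to paired series of LAI_cal and LAI_mea values, these two functions reproduce the kind of R/RMSE figures quoted for each scenario.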

Table 1. Application scenarios of the verification experiments of the invention

Figure BDA0003901138720000091

The invention was applied in several field and experimental scenarios, and its accuracy was verified against the measurements of traditional instruments and professional software. Figure 6 shows example images taken upward in a rice field at the jointing-booting stage, together with the accuracy of the LAI_cal values computed by the algorithm against the instrument measurements LAI_mea: the correlation coefficient R is 0.6332 and the root-mean-square error RMSE is 0.1216, a good result. Across the 135 sampling points of the verification experiments, shooting and measurement covered different times of day and different weather scenarios (Table 1); Figure 7 shows example fisheye images for each scenario and the verification accuracy against the instrument. Overcast conditions are the period usually chosen for instrument measurement, when the instrument readings are most stable; there the algorithm achieves R = 0.6283 with RMSE = 0.1785, a small error. On sunny days and at sunset the algorithm's error is higher, reaching 0.3081 and 0.3419, and in rain, water droplets on the lens affect the image segmentation, giving an RMSE of 0.2378. In the upward-shot rice-field tests, the algorithm adapts to different weather and environments: the error is stable under overcast skies, and on sunny and rainy days the computed values still correlate well with the measured values.

In the early growth stages of rice (mainly the seedling and tillering stages), when the plants are short, the instrument cannot enter the canopy to take measurements, so oblique photography or sampling-and-scanning methods are usually adopted. In our tests, a downward shooting angle was used at the rice seedling stage, as shown in Figure 8; the computed values show a high correlation of R = 0.9049 with the measurements of other methods, with an RMSE of only 0.0223. The results show that under downward shooting the invention both avoids the constraint that the instrument cannot enter the field and retains good accuracy and stability.

In the application to wheat, a dryland crop, we mainly verified the correlation and accuracy of the downward viewing angle against the instrument and other measurement methods, photographing and sampling at each growth stage of the wheat. According to Figure 9, at the short seedling and tillering stages the test errors are 0.0333 and 0.0316, indicating high accuracy; at the jointing-booting stage, as the wheat gradually closes over the ridges, the correlation coefficient R reaches 0.9909 with an RMSE of 0.1840, in good agreement with the values measured by traditional methods.

From the test results across different scenarios and different crops, the implementation of the invention supports the following conclusions:

(1) The equipment and algorithm of the invention adapt effectively to LAI measurement tasks in different scenarios, with good accuracy and stability in both paddy fields and dryland.

(2) The invention avoids the time and labor costs of traditional methods and, in the early growth stages when crop plants of all kinds are still short, provides an accurate solution for LAI measurement, filling the gap in LAI monitoring at the initial stage of crop development.

Claims (7)

1. A method for measuring and calculating the leaf area index based on fisheye images, characterized by comprising the following steps:

step 1, collecting fish-eye images of different types of canopies;

step 2, processing the collected fisheye image based on an image processing technology, reducing background noise and obtaining a pixel matrix which is free of distortion and can be calculated;

and step 3, cutting the processed fisheye image into an infinite number of rings, wherein the pixels of each ring are approximately positioned on a plane of the same height, calculating the vegetation contact frequency in the rings and integrating to obtain the leaf area index value.

2. The fisheye image-based leaf area index estimation method according to claim 1, characterized in that: in step 1, fisheye images of different types of canopies are collected with a smartphone and a fisheye lens; the smartphone is connected to a shooting rod via Bluetooth, and the fisheye lens is aligned with and fixed to the smartphone's main camera lens, forming a simple measuring rod; after the lens is mounted, the camera angle is determined according to the type and development stage of the photographed plants; during shooting, canopy edges with uneven growth are kept at a distance, and the camera is kept level.

3. The fisheye image-based leaf area index estimation method according to claim 2, characterized in that: the imaging photosensitive plane of the fisheye lens is a circle inscribed in the target surface of the camera; a coordinate system is established with the circle center as the origin of coordinates, and the viewing angle θ of each point in the image is calculated by formula (2):

$$\theta=\arcsin\!\Bigl(\frac{\sqrt{x^{2}+y^{2}}}{R}\Bigr)\tag{2}$$

in the formula, x and y are the pixel coordinates and R is the radius of the image.

4. The fisheye image-based leaf area index estimation method according to claim 1, characterized in that: the specific implementation manner of the step 2 is as follows;

(21) Reducing the resolution of the image proportionally by downsampling to accelerate processing, and cropping the invalid areas around the image with an image-cropping package;

(22) Classifying vegetation pixels based on HSV color-space thresholds and extracting the vegetation proportion and gap fraction in every ring, with different segmentation methods adopted according to the shooting angle: the RGB image is converted into an HSV image by an algorithm, upper and lower threshold limits are set, and pixels beyond the limits are classified as non-vegetation pixels and excluded from the contact-fraction calculation; when the image is shot upward inside the canopy, the sky pixels are the segmentation subject, and the vegetation pixels are obtained by removing them; when the shooting angle is downward, the vegetation is the segmentation subject, and the vegetation coverage is obtained directly.

5. The fisheye image-based leaf area index estimation method according to claim 1, characterized in that: in step 3, the image is partitioned into annular regions based on the PIL package, obtaining a central circle covering observation zenith angles of 0-15° and concentric rings covering 15-30°, 30-45°, 45-60° and 60-75°; since the region beyond 75° is close to the sampling edge, with severe distortion and more noise, only the part within 75° is calculated.

6. The fisheye image based leaf area index estimation method of claim 1, wherein: the specific implementation manner of the step 3 is as follows;

according to the LAI concept and the definition in formula (1), a Poisson calculation model based on contact frequency and gap fraction is used; the contact frequency, proposed by Warren Wilson, is the probability that sunlight entering the canopy touches a vegetation element within it; the gap fraction is the probability that a natural light beam reaches the reference plane directly; under the assumption that the leaves are opaque, the leaf coverage measured in image analysis is the directional contact fraction;

$$\mathrm{LAI}=\int_0^H L(h)\,\mathrm{d}h\tag{1}$$

wherein H is the canopy height and L(h) is the leaf-area density function at height h;

for a canopy of height H, the average contact fraction is the plant-height integral of the directional contact fraction of each leaf layer, computed as:

$$\bar N(\theta_v,\varphi_v)=\frac{1}{\cos\theta_v}\int_0^H G(h,\theta_v,\varphi_v)\,L(h)\,\mathrm{d}h\tag{2}$$

wherein H is the canopy height; L(h) is the leaf-area density of the layer at height h, i.e. the leaf area per unit canopy volume; $\vec\Omega_v(\theta_v,\varphi_v)$ is the direction vector of the observation position, with $\theta_v$ the zenith angle and $\varphi_v$ the azimuth of the observation direction; G is the leaf-area projection function at height h; substituting equation (1) gives:

$$\bar N(\theta_v,\varphi_v)=\frac{G(\theta_v,\varphi_v)}{\cos\theta_v}\,\mathrm{LAI}\tag{3}$$

(4) expresses the dependence of LAI on the contact frequency:

$$\mathrm{LAI}=\frac{\bar N(\theta_v,\varphi_v)\cos\theta_v}{G(\theta_v,\varphi_v)}\tag{4}$$

where the projection function G is given by the simultaneous formulas (5) and (6):

$$G(\theta_v,\varphi_v)=\frac{1}{2\pi}\int_0^{2\pi}\!\!\int_0^{\pi/2} g(\theta_l,\varphi_l)\,\lvert\cos\gamma\rvert\,\sin\theta_l\,\mathrm{d}\theta_l\,\mathrm{d}\varphi_l\tag{5}$$

$$\cos\gamma=\cos\theta_v\cos\theta_l+\sin\theta_v\sin\theta_l\cos(\varphi_v-\varphi_l)\tag{6}$$

introducing $g(\theta_l,\varphi_l)$, the probability density function of the leaf inclination-angle distribution model, where $\theta_l$ is the zenith angle of the leaf inclination direction and $\varphi_l$ is the azimuth angle of the leaf inclination, subject to the normalization constraints of equations (7) and (8), the second being the azimuthally symmetric form:

$$\frac{1}{2\pi}\int_0^{2\pi}\!\!\int_0^{\pi/2} g(\theta_l,\varphi_l)\,\sin\theta_l\,\mathrm{d}\theta_l\,\mathrm{d}\varphi_l=1\tag{7}$$

$$\int_0^{\pi/2} g(\theta_l)\,\sin\theta_l\,\mathrm{d}\theta_l=1\tag{8}$$

the above formulas are combined to obtain the relationship among the canopy gap fraction $P_0(\theta_v,\varphi_v)$, the average contact fraction $\bar N(\theta_v,\varphi_v)$, and the LAI, optimized by Nilson into the exponential relationship of equation (9):

$$P_0(\theta_v,\varphi_v)=\exp\bigl(-\bar N(\theta_v,\varphi_v)\bigr)=\exp\!\Bigl(-\frac{G(\theta_v,\varphi_v)\,\mathrm{LAI}}{\cos\theta_v}\Bigr)\tag{9}$$

wherein, owing to the circular field of view of the fisheye image, $P_0(\theta_v,\varphi_v)$ reduces to its azimuth-independent form $P_0(\theta_v)$: the azimuth of the incident ray is disregarded, and the gap-fraction measurement is assumed to depend only on the observation zenith angle $\theta_v$, i.e. the angle between the incident direction of the light and the normal vector of the light sensor at the bottom of the canopy; the computed leaf area index $\mathrm{LAI}_{cal}$ is then:

$$\mathrm{LAI}_{cal}=2\int_0^{\pi/2} -\ln P_0(\theta_v)\,\cos\theta_v\,\sin\theta_v\,\mathrm{d}\theta_v\tag{10}$$

Welles proposed a discrete numerical solution of the integral (10) based on multi-view observation, in which several zenith observation angles divide the image into rings and the average vegetation gap fraction $\bar P_0(\theta_{v,i})$ is obtained in each ring; the integral (10) is replaced by the difference form:

$$\mathrm{LAI}_{cal}=2\sum_{i}\frac{-\ln \bar P_0(\theta_{v,i})}{S_i(\theta_v)}\,W_i\tag{11}$$

in the formula, $S_i(\theta_v)=(\cos\theta_{v,i})^{-1}$, $W_i=\sin\theta_v\,\mathrm{d}\theta$, and i indexes the rings; the coefficients differ according to the ring angles chosen.

7. The fisheye image-based leaf area index estimation method of claim 6, wherein: the rings are divided by 5 zenith observation angle bands, respectively 0-15°, 15-30°, 30-45°, 45-60° and 60-75°, corresponding to i = 1 to i = 5.

CN202211290462.2A 2022-10-21 2022-10-21 Leaf Area Index Calculation Method Based on Fisheye Image Pending CN115655157A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211290462.2A CN115655157A (en) 2022-10-21 2022-10-21 Leaf Area Index Calculation Method Based on Fisheye Image


Publications (1)

Publication Number Publication Date
CN115655157A true CN115655157A (en) 2023-01-31

Family

ID=84989744

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211290462.2A Pending CN115655157A (en) 2022-10-21 2022-10-21 Leaf Area Index Calculation Method Based on Fisheye Image

Country Status (1)

Country Link
CN (1) CN115655157A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118397075A (en) * 2024-06-24 2024-07-26 合肥工业大学 Calculation method of mountain forest effective leaf area index based on fisheye camera


Similar Documents

Publication Publication Date Title
CN109816680B (en) 2020-10-27 A high-throughput calculation method for crop plant height
CN111340826A (en) 2020-06-26 Single tree crown segmentation algorithm for aerial image based on superpixels and topological features
CN110988909B (en) 2023-06-27 Vegetation Coverage Measurement Method of Sandy Land Vegetation in Alpine Vulnerable Area Based on TLS
CN108918820B (en) 2020-08-14 Method and device for acquiring graded distribution of salinization degree of farmland soil
CN112907520B (en) 2024-09-17 Single tree crown detection method based on end-to-end deep learning method
CN111289441A (en) 2020-06-16 Multispectral field crop water content determination method, system and equipment
CN116883480A (en) 2023-10-13 Corn plant height detection method based on binocular image and ground-based radar fusion point cloud
US20240290089A1 (en) 2024-08-29 Method for extracting forest parameters of wetland with high canopy density based on consumer-grade uav image
CN112906719A (en) 2021-06-04 Standing tree factor measuring method based on consumption-level depth camera
CN116645321B (en) 2024-03-08 Statistical method, device, electronic equipment and storage medium for calculating vegetation leaf inclination angle
CN111598955A (en) 2020-08-28 A mobile terminal intelligent foundation pit monitoring system and method based on photogrammetry
CN115290054A (en) 2022-11-04 A method for estimating and forecasting stand volume based on image point cloud
CN115655157A (en) 2023-01-31 Leaf Area Index Calculation Method Based on Fisheye Image
CN113655003B (en) 2024-01-12 Method for estimating soil moisture content of winter wheat in green-turning period by using unmanned aerial vehicle photo
CN111598874B (en) 2023-05-16 An investigation method of mangrove canopy density based on intelligent mobile terminal
Yang et al. 2021 Feature extraction of cotton plant height based on DSM difference method
CN117451012A (en) 2024-01-26 Unmanned aerial vehicle aerial photography measurement method and system
CN115598071A (en) 2023-01-13 Plant growth distribution state detection method and device
CN115841615A (en) 2023-03-24 Tobacco yield prediction method and device based on multispectral data of unmanned aerial vehicle
CN115205366A (en) 2022-10-18 Leaf area index measuring method and measuring device thereof
Chen et al. 2022 Improving fractional vegetation cover estimation with shadow effects using high dynamic range images
CN111680659A (en) 2020-09-18 Relative radiance normalization method for RGB nighttime light images of the International Space Station
Wang 2019 Estimating forest attributes from spherical images
CN117292154B (en) 2024-02-06 Automatic production method of long-time-sequence ground object samples based on dense time-sequence remote sensing images
CN118155101B (en) 2024-11-26 Automatic biomass sampling method, medium and equipment based on UAV remote sensing calculation

Legal Events

Date Code Title Description
2023-01-31 PB01 Publication
2023-02-17 SE01 Entry into force of request for substantive examination