Topic: Frontier Technology and Application of Agricultural Phenotype

Research Status and Prospect on Height Estimation of Field Crop Using Near-Field Remote Sensing Technology

  • ZHANG Jian 1,
  • XIE Tianjin 1,
  • YANG Wanneng 2,
  • ZHOU Guangsheng 3
  • 1. College of Resources and Environmental Sciences/Macro Agriculture Research Institute, Huazhong Agricultural University, Wuhan 430070, China
  • 2. National Key Laboratory of Crop Genetic Improvement, Huazhong Agricultural University, Wuhan 430070, China
  • 3. College of Plant Science and Technology, Huazhong Agricultural University, Wuhan 430070, China

Received date: 2021-02-11

  Revised date: 2021-03-10

  Online published: 2021-06-01

Highlights

Plant height is a key indicator for dynamically measuring crop health and overall growth status and is widely used to estimate the biological yield and final grain yield of crops. The traditional manual measurement method is subjective, inefficient, and time-consuming, and plant height obtained by sampling cannot represent the whole field. In the last decade, remote sensing technology has developed rapidly in agriculture, making it possible to collect crop height information with high accuracy, high frequency, and high efficiency. This paper first reviewed the literature on obtaining plant height using remote sensing technology to understand the research progress of height estimation in the field. Unmanned aerial vehicle (UAV) platforms carrying visible-light cameras and light detection and ranging (LiDAR) were the most frequently used methods, and the main crops studied included wheat, corn, rice, and other staple food crops. Moreover, crop height measurement was mainly based on near-field remote sensing platforms such as ground, UAV, and airborne platforms. Secondly, the basic principles, advantages, and limitations of different platforms and sensors for obtaining plant height were analyzed. The height measurement workflow and the key techniques of LiDAR and visible-light cameras were discussed with emphasis on the extraction of crop canopy and soil elevation information and on feature matching in image-based methods. Then, applications of plant height data, including biomass inversion, lodging identification, yield prediction, and crop breeding, were summarized. However, the commonly used empirical models have problems such as the need for large amounts of measured data, unclear physical meaning, and poor universality. Finally, the problems and challenges of near-field remote sensing technology in plant height acquisition were discussed: selecting appropriate data to balance cost and accuracy, improving measurement accuracy, and matching remote sensing height estimates with agricultural applications all need to be considered. In addition, future development was prospected from four aspects: 1) platforms and sensors, 2) bare soil detection and interpolation algorithms, 3) applied research on plant height, and 4) the difference between agronomic and remote sensing measurements of plant height, which can provide references for future research and method application of near-field remote sensing height measurement.

Cite this article

ZHANG Jian, XIE Tianjin, YANG Wanneng, ZHOU Guangsheng. Research Status and Prospect on Height Estimation of Field Crop Using Near-Field Remote Sensing Technology[J]. Smart Agriculture, 2021, 3(1): 1-15. DOI: 10.12133/j.smartag.2021.3.1.202102-SA033

1 Introduction

Plant height is an important growth indicator of crops, and a reasonable plant height is the basis for achieving stable and high yields. During the first Green Revolution, wheat and rice varieties carrying semi-dwarf genes were bred, which reduced plant height while improving lodging resistance and yield potential[1,2]. However, an excessively low plant height also carries a risk of yield reduction[3]. Studying the genetic mechanism of plant height and formulating targeted breeding programs is therefore of great practical significance for increasing crop yield. In addition, plant height is closely related to crop biomass and yield[4,5]. Monitoring changes in crop height to determine crop health and growth status can provide an important reference for strengthening the management and regulation of crop production, such as fertilization, weeding, and harvesting.
In agronomy, plant height is generally measured manually with a ruler (hereafter "height measurement"), using one of three conventions: natural plant height, physiological plant height, and ligule height[6]. The most common is natural plant height, the vertical distance from the ground to the top of the main stem in its natural state. Because of environment, genotype, and management, plant morphology can differ considerably among crops. For erect crops such as wheat, the upper leaves may droop[7]; in this case the leaves are straightened, or upright plants are selected, and the vertical distance from the base to the top is measured as the physiological plant height. Ligule height, the distance from the ground to the uppermost ligule, is also widely used. However, manual height measurement requires extensive field surveys, is inefficient, and its accuracy is easily affected by subjective factors, and plant height obtained by sampling cannot represent conditions across the whole field. The development of remote sensing offers a new solution for crop height measurement. This paper comprehensively reviews research on crop height extraction by remote sensing: starting from different sensor types and platforms, it summarizes height acquisition methods and their limitations, surveys applications of plant height in phenotypic trait extraction, lodging monitoring, yield estimation, and breeding, and discusses development trends and future challenges of near-field remote sensing for crop height acquisition.

2 Global Research Trends in Height Measurement Using Remote Sensing

To understand research progress in remote sensing height measurement and summarize the mainstream sensors, platforms, and main observation targets used in existing studies, academic papers on remote sensing height measurement published worldwide in the past decade (2010-2019) were retrieved from the "Web of Science" and "ScienceDirect" search engines, with the title keyword rule: "canopy or crop or plant or vegetation or wheat or maize or corn or rice or barley or soybean or sorghum or rapeseed" and "height or lodging or biomass or yield or lai". The results are shown in Fig. 1. Fig. 1(a) counts the number of papers for the four most frequently used methods, namely unmanned aerial vehicle (UAV) platforms carrying visible-light cameras, light detection and ranging (LiDAR), synthetic aperture radar, and ultrasonic sensors, as well as other methods; the overall number shows an increasing trend. Owing to the rapid development of low-altitude remote sensing and computer vision, UAVs with visible-light cameras have become the most commonly used means of acquiring crop height, followed by LiDAR. Fig. 1(b) shows that most crop height papers focus on staple food crops such as wheat, maize, and rice. Moreover, crop height measurement is mainly carried out on near-field platforms such as ground and UAV platforms (Fig. 1(c)), because near-field observation is better suited to short, densely planted crops.
Fig. 1 Research status of height measurement using remote sensing technology from 2010 to 2019

3 Research Progress of Near-Field Remote Sensing Height Measurement

According to how the sensor works, near-field remote sensing height measurement can be divided into active and passive remote sensing. Active sensors carry a radiation source that emits signals such as electromagnetic or acoustic waves and simultaneously receive and record the signals reflected by the target; they are therefore little affected by illumination and can work day and night. Passive sensors detect the target by directly receiving and recording electromagnetic waves reflected from a natural radiation source or emitted by the target itself. Such sensors are usually low-cost but are more susceptible to illumination conditions and cannot penetrate the canopy.

3.1 Active Remote Sensing Height Measurement and Its Characteristics

LiDAR precisely locates the laser footprint on an object by recording the time of flight (ToF) of the laser pulse. Thanks to its strong penetration capability, multiple return echoes of a pulse can usually record point clouds of both the canopy and the soil, from which object height can be derived after classification and filtering. In the 1990s, LiDAR systems mounted on helicopters or fixed-wing aircraft developed rapidly and were widely used in precision forestry to measure canopy height, mean stand height, above-ground biomass, and so on[8,9]. However, airborne LiDAR typically flies at altitudes of hundreds of meters to kilometers, and its limited ranging accuracy makes it difficult to distinguish soil from short crops[10,11]. In addition, data acquisition cost and processing complexity make LiDAR-based multi-temporal monitoring of crop height difficult[12]. Terrestrial LiDAR systems can acquire three-dimensional point clouds with millimeter-level accuracy and are better suited to crop height extraction. Ground platforms mainly include fixed ground platforms[13-15], robotic platforms[16], and vehicle-mounted platforms[17,18]. High-accuracy height estimates have been obtained for many crops, including maize[19], wheat[20,21], rice[22,23], and sorghum[24]. However, fixed ground platforms suffer from low acquisition efficiency and severe occlusion, while mobile ground platforms are constrained by row spacing, plant spacing, and crop height, and driving in the field can also cause soil compaction[25]. With the miniaturization of LiDAR it has become possible to mount it on UAVs, which increases observation frequency and efficiency, reduces disturbance to the crop during data acquisition, and fills the gap between ground-based LiDAR systems (high accuracy but low efficiency) and manned airborne LiDAR systems (large-area coverage but less detail)[26].
Ultrasonic sensors emit pulses at frequencies above 20 kHz and compute the distance from the sensor to an object from the time difference between emission and return of the sound. They are easy to deploy, simple in data processing, and low-cost, and with fine tuning can achieve centimeter-level plant height accuracy[24,27,28]. However, ultrasonic sensing is a proximal technique with rapid signal attenuation, and height accuracy decreases as the distance between the sensor and the target crop increases. Measuring distances are therefore mostly kept within 10 m[29], and the sensors are mostly mounted on mobile ground platforms[30,31].
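As an illustration of the ultrasonic time-of-flight principle described above, the minimal sketch below converts an echo time into a sensor-to-canopy distance and then into plant height, assuming a downward-looking sensor mounted at a known height above bare soil; the sound speed, mounting height, and echo time are hypothetical values, not taken from any cited study.

```python
def ultrasonic_plant_height(echo_time_s: float,
                            mount_height_m: float,
                            sound_speed_ms: float = 343.0) -> float:
    """Estimate plant height from an ultrasonic echo.

    echo_time_s    -- round-trip travel time of the pulse (s)
    mount_height_m -- height of the downward-looking sensor above bare soil (m)
    sound_speed_ms -- speed of sound in air (about 343 m/s at 20 degrees C)
    """
    # One-way distance from sensor to the canopy surface
    sensor_to_canopy = sound_speed_ms * echo_time_s / 2.0
    # Plant height is the mounting height minus the canopy distance
    return mount_height_m - sensor_to_canopy


# Hypothetical reading: 8.75 ms round trip from a sensor mounted 2.0 m above the soil
print(ultrasonic_plant_height(echo_time_s=0.00875, mount_height_m=2.0))  # about 0.50 m
```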

3.2 Passive Remote Sensing Height Measurement and Its Characteristics

The Kinect v2 released by Microsoft can acquire depth images with a resolution of 512 × 424 px at 30 f/s, with a depth detection range of 0.5-4.5 m[32,33]. Its imaging speed is fast enough to basically meet real-time measurement requirements. However, depth cameras have a limited measurement range and low image resolution, so they are usually mounted on ground platforms[34,35] and mostly used to monitor the height of potted plants in greenhouses[36].
Compared with depth cameras, visible-light cameras can acquire higher-resolution images[37]. Some studies place a reference object of known height next to the crop and compute crop height from a single visible-light image containing both the reference and the crop[38,39]. With the development of computer vision, crop height measurement methods have emerged that generate multi-view images by arranging multiple visible-light cameras on a ground platform or moving a single camera, based on binocular stereo matching[40] or multi-view stereo[41,42], but their efficiency is low and measurement range small. Over the last decade, UAVs carrying high-resolution visible-light cameras have become the most widely used means of acquiring crop height in near-field remote sensing because of their low cost, high resolution, and easy deployment[43-46]. Overlapping images are collected with the camera, and features are detected and matched among the overlapping images based on Structure from Motion (SfM)[47]. After a sparse point cloud is built from feature matching, a dense point cloud is reconstructed with multi-view stereo, and finally point-cloud interpolation generates rasters representing crop or soil elevation. The technique has been applied to maize[48-51], rice[52], sorghum[53,54], wheat[5,46,55], soybean[56], cotton[57,58], and other crops; compared with ground measurements, the coefficient of determination R² exceeds 0.80 and the root mean square error (RMSE) is below 10 cm for most crops.
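To make the single-image reference-object approach cited above[38,39] concrete, the sketch below shows the underlying proportionality: if a pole of known height stands at roughly the same distance from the camera as the plant, plant height can be scaled from the two pixel heights. The pixel counts and pole height are hypothetical, and this is a simplified illustration rather than the procedure of any specific study.

```python
def height_from_reference(plant_px: float, ref_px: float, ref_height_m: float) -> float:
    """Scale plant height from image pixel heights using a reference object.

    Assumes the plant and the reference pole are at roughly the same distance
    from the camera, so one pixel spans the same real-world length for both.
    """
    metres_per_pixel = ref_height_m / ref_px
    return plant_px * metres_per_pixel


# Hypothetical example: a 1.5 m pole spans 600 px, the wheat canopy spans 320 px
print(height_from_reference(plant_px=320, ref_px=600, ref_height_m=1.5))  # about 0.80 m
```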
In summary, LiDAR, ultrasonic sensors, depth cameras, and visible-light cameras can all estimate field crop height with good accuracy from ground platforms. However, the travel efficiency and flexibility of ground platforms limit their range of application to some extent, and UAV platforms compensate well for these shortcomings. At present, UAVs combined with visible-light cameras are widely used in crop height research, and as LiDAR continues to become lighter and smaller, UAV-mounted LiDAR has become feasible and is gradually being adopted for crop height measurement.

3.3 Workflow and Key Techniques of Near-Field Remote Sensing Height Measurement

Taking the two mainstream methods, ground-based LiDAR and UAVs with visible-light cameras, as examples, this section introduces the main workflow of plant height acquisition (Fig. 2) and the key techniques involved.
Fig. 2 Plant height acquisition flow charts of ground-based LiDAR and UAV equipped with visible light camera remote sensing systems

Note: Dashed boxes indicate steps that can be omitted

3.3.1 Ground-Based LiDAR Height Measurement

To avoid occlusion of the ground-based platform, multiple acquisition stations are usually set up at different heights and angles. Before height extraction, the multi-station data therefore need to be registered using the iterative closest point (ICP) algorithm[59,60] or external reference features such as target spheres[13,14]. After preprocessing steps such as registration and denoising of the point cloud, accurately extracting the crop top and soil regions is one of the key steps for precise height measurement[61]. Cheng et al.[62] fitted polynomial curves to the peanut canopy profile extracted from laser point clouds and found that a fifth-order curve fitted best; the maxima and minima of the fitted curve were used to delineate the canopy profile boundary and obtain peanut plant height, as shown in Fig. 3. Polynomial fitting suits crops with uniform, rounded canopies; for uneven, spire-shaped crops it tends to underestimate height and requires excessively high fitting orders. Su et al.[19] separated the point clouds of individual maize plants from the canopy point cloud and traversed each plant's points to obtain spatial coordinates and point spacing; plant height was then the Euclidean distance between the points with the maximum and minimum height coordinates. In addition, acquiring the soil-surface point cloud right after sowing can effectively mitigate the effect of plant occlusion on soil point extraction. Mounting a visible-light camera on the LiDAR system records the color and texture of the target along with the spatial coordinates of every point; the resulting colorized point cloud facilitates crop-soil classification and allows precise extraction of soil and canopy points. A minimal point-cloud height sketch is given after Fig. 3.
Fig. 3 Ground-based LiDAR height estimation method[62]
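As a simplified illustration of the height-extraction step described above (not the specific pipelines of the cited studies), the sketch below estimates plot-level plant height from a registered, denoised point cloud by taking a low z-percentile as the soil surface and a high z-percentile as the canopy top; percentiles are used instead of the strict minimum and maximum to suppress residual noise. The point cloud here is a synthetic stand-in for real scanner output.

```python
import numpy as np

def plot_height_from_points(points_xyz: np.ndarray,
                            ground_pct: float = 1.0,
                            canopy_pct: float = 99.0) -> float:
    """Estimate plant height for one plot from an N x 3 point cloud.

    ground_pct -- low z-percentile taken as the soil surface
    canopy_pct -- high z-percentile taken as the canopy top
    Percentiles are more robust to stray points than min()/max().
    """
    z = points_xyz[:, 2]
    ground_z = np.percentile(z, ground_pct)
    canopy_z = np.percentile(z, canopy_pct)
    return float(canopy_z - ground_z)


# Hypothetical plot: soil points near z = 0 m, canopy points near z = 0.9 m
rng = np.random.default_rng(0)
soil = np.column_stack([rng.uniform(0, 1, 500), rng.uniform(0, 1, 500),
                        rng.normal(0.0, 0.01, 500)])
canopy = np.column_stack([rng.uniform(0, 1, 2000), rng.uniform(0, 1, 2000),
                          rng.normal(0.9, 0.05, 2000)])
cloud = np.vstack([soil, canopy])
print(round(plot_height_from_points(cloud), 2))  # roughly 1.0 m (canopy upper tail minus soil)
```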

3.3.2 Height Measurement with a UAV-Mounted Visible-Light Camera

Passive sensors cannot penetrate the canopy to reach the soil surface. In the early growth stage before row closure, or when planting intervals are inherently large, soil regions can be extracted and interpolated by kriging[63,64], inverse distance weighting[53,65], or natural neighbor interpolation[45] to obtain a complete and accurate digital terrain model (DTM) (Fig. 4). In this way the DTM and the crop surface elevation (digital surface model, DSM) are obtained simultaneously, reducing cost while avoiding soil elevation changes caused by human activity or severe weather. However, cash crops are often sown at high densities in production so that the canopy closes quickly and suppresses weeds[66,67], which makes it harder to derive the DTM from the DSM. For densely planted, uniform-canopy, or ridge-grown crops with pronounced surface relief, such as sorghum, rapeseed, potato, and sugar beet, most experiments therefore carry out an additional flight before emergence or after harvest to obtain the DTM[5,18,68,69]. A minimal interpolation-and-subtraction sketch is given after Fig. 4.
Fig. 4 Height estimation method of UAV equipped with visible light camera[49]

Note: DTM stands for digital terrain model
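As a minimal sketch of the interpolation-and-subtraction idea above (not the exact workflow of any cited study), the code interpolates bare-soil elevations sampled from the DSM into a full DTM with SciPy's griddata and then subtracts the DTM from the DSM to obtain a canopy height model. The small synthetic DSM and the bare-soil mask are hypothetical.

```python
import numpy as np
from scipy.interpolate import griddata

# Hypothetical 50 x 50 DSM (m): gently sloping terrain plus a 0.8 m crop canopy patch
ny, nx = 50, 50
yy, xx = np.mgrid[0:ny, 0:nx]
terrain = 100.0 + 0.002 * xx + 0.001 * yy            # true ground elevation
dsm = terrain.copy()
crop = (xx > 10) & (xx < 40) & (yy > 10) & (yy < 40)
dsm[crop] += 0.8                                      # canopy sits 0.8 m above the ground

# Bare-soil mask, e.g. from an RGB soil/vegetation classification
soil_mask = ~crop

# Interpolate soil elevations to every cell to build the DTM
soil_pts = np.column_stack([xx[soil_mask], yy[soil_mask]])
dtm = griddata(soil_pts, dsm[soil_mask], (xx, yy), method="linear")

# Canopy height model: per-cell plant height
chm = dsm - dtm
print(round(float(np.nanmean(chm[crop])), 2))         # about 0.8 m inside the crop patch
```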

SfM-based crop height extraction requires precise detection and matching of large numbers of image features to achieve a high-quality canopy reconstruction[70]. The scale-invariant feature transform (SIFT) algorithm is usually used to find corresponding points. However, compared with buildings or forests, field crops show extensive self-occlusion and uniform texture, which makes it difficult to match leaf targets precisely across views, increasing feature matching errors and causing loss of fine morphology and texture. For dense canopies, a higher image overlap and a higher flight altitude are commonly used to cope with these problems. Hasheminasab et al.[71] used a high-accuracy global positioning system/inertial measurement unit (GPS/IMU) to reduce the feature matching search space instead of traditional exhaustive search, which alleviates matching ambiguity caused by repetitive texture. In addition, downwash from low-flying UAVs or windy conditions moves the canopy, changing the position of leaves and ears between images, which also adversely affects feature matching[72].
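To make the feature matching step concrete, the following OpenCV sketch detects SIFT keypoints in two overlapping images and keeps matches that pass Lowe's ratio test. It illustrates generic SIFT matching rather than the matching strategy of any particular SfM package, and the image file names are placeholders.

```python
import cv2

# Placeholder file names for two overlapping UAV images
img1 = cv2.imread("uav_frame_001.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("uav_frame_002.jpg", cv2.IMREAD_GRAYSCALE)

# Detect SIFT keypoints and compute descriptors in both images
sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)

# Brute-force matching with Lowe's ratio test to reject ambiguous matches,
# which are common over repetitive crop texture
matcher = cv2.BFMatcher(cv2.NORM_L2)
raw_matches = matcher.knnMatch(des1, des2, k=2)
good = [m for m, n in raw_matches if m.distance < 0.75 * n.distance]

print(f"{len(good)} putative correspondences retained for pose estimation")
```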

4 Agricultural Applications of Near-Field Remote Sensing Height Measurement

Because crop height can be measured directly, non-destructively, and with high accuracy by remote sensing, it is often used as a model variable in the inversion of physiological and biochemical indicators, lodging identification, yield prediction, and breeding (Table 1).
Table 1 Application of near-field remote sensing method to obtain plant height of field crops

| Application | Sensor | Platform / measurement height | Crop | Model | RMSE | R² |
| --- | --- | --- | --- | --- | --- | --- |
| Biomass estimation | LiDAR[73] | Fixed ground platform | Wheat | Power function regression | 1.76 t/ha | 0.82 |
| | Ultrasonic[74] | Fixed ground platform | Lettuce | Exponential regression | —— | 0.80 |
| | Visible-light camera[75] | UAV / 50 m | Wheat | Partial least squares regression | 0.96 t/ha | 0.74 |
| | Visible-light camera[76] | UAV / 25 m | Rice | Random forest | 2.10 t/ha | 0.90 |
| | Visible-light camera[77] | UAV / 44 m | Onion | Crop volume model | 1.53 t/ha | 0.95 |
| Lodging monitoring | LiDAR[78] | UAV / 15 m | Maize | Lodging degree quantified from plant height change; height accuracy R² = 0.964, RMSE = 0.127 m | | |
| | Visible-light camera[79] | UAV / 20~50 m | Maize | Lodging rate quantified with a threshold; vs. ground truth R² = 0.50, RMSE = 0.09 | | |
| | Visible-light camera[80] | UAV / 35 m | Barley | Lodging rate quantified with a threshold; best accuracy vs. ground truth R² = 0.96, RMSE = 0.08 | | |
| Yield prediction | Visible-light camera[81] | UAV / 50 m | Maize | Multiple regression | 0.13 t/ha | 0.74 |
| | Visible-light camera[82] | UAV / 50 m | Sugarcane | Crop model | 1.09 t/ha | 0.44 |
| | Visible-light camera[83] | UAV / 50 m | Cotton | Multiple regression | 0.16 t/ha | 0.94 |
| | Visible-light camera[84] | UAV / 30 m | Soybean | Partial least squares regression | 0.42 t/ha | 0.81 |
| | Hyperspectral camera[63] | UAV / 50 m | Wheat | Partial least squares regression | 0.65 t/ha | 0.77 |
| Breeding support | Visible-light camera[85] | UAV / 30 m | Wheat | Genome-wide and QTL analysis of plant height; correlation between predicted genomic values and observed values of 0.47~0.53 | | |
| | Visible-light camera[86] | UAV / 40~60 m | Maize | Genome-wide association study of 7 height-related traits identified 68 QTL, 35% of which overlap previously reported plant height QTL | | |
(1) Biomass estimation. Current studies mainly predict above-ground biomass from plant height, alone or combined with phenotypic parameters such as spectral indices and canopy cover, using linear regression[87,88], exponential regression[69,73,89], partial least squares regression[90], random forest[76,91], support vector machines[92], and other modeling methods (a minimal regression sketch is given after item (4)). Crop volume models built by accumulating plant height over a given area can also yield accurate biomass predictions[93,94]. Compared with spectral indicators, morphological indicators are less affected by illumination, and spectral indices saturate late in the growing season[95], so biomass derived from plant height is more accurate and stable.
(2) Lodging monitoring. Lodging is the permanent displacement of the upright parts of a crop[96], and lodging resistance is an important heritable trait and a key selection criterion in breeding[97]. Lodged area and severity are usually measured from spectral features, texture, or the change in plant height before and after lodging[98,99]. Singh et al.[96] subtracted the post-lodging DSM of wheat from a pre-lodging DSM to obtain a difference DSM and extracted the mean elevation of each plot; correlations with manually scored lodging incidence, severity, and lodging index ranged from 0.77 to 0.93. Su et al.[97] extracted texture features of maize visible-light images before and after lodging with the gray-level co-occurrence matrix and also derived lodged area by differencing the pre- and post-lodging DSMs; the estimation errors were 10.00% and 0.85%, respectively, indicating that plant height measures lodging degree and area more accurately than texture indicators (a DSM-differencing sketch is also given after item (4)).
(3) Yield prediction. Plant height is also one of the important predictors of yield. Li et al.[100] extracted plant height and multiple vegetation indices from UAV visible-light and multispectral sensors to predict wheat yield and found that, in both lasso and random forest models, plant height at the grain-filling stage ranked first in importance for yield estimation. For yield models built on plant height, estimation accuracy is generally considered to increase as harvest approaches[63,101].
(4) Breeding support. Plant height is a quantitative trait controlled by multiple genes[102] and is easily affected by environment, genotype, and their interaction; the dynamics of height help elucidate the genetic mechanisms of crop growth[85,103,104]. Hassan et al.[85] performed genome-wide and quantitative trait locus (QTL) analyses of plant height and found that the genomic values predicted from UAV-estimated wheat height correlated with observed values at 0.47-0.53, showing genomic prediction ability similar to that of ground measurements. More and more studies now obtain crop height by remote sensing for breeding[105,106]; remote sensing has been shown in many experiments to deliver frequent, accurate, and repeatable canopy height data, which is of real significance for crop breeding[97].
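As a minimal sketch of the empirical regression approach in item (1), the code below fits a power-function biomass model of the form AGB = a·H^b to plot-level height and biomass pairs with SciPy. The data points are synthetic placeholders, and the power form is just one of the model families cited above.

```python
import numpy as np
from scipy.optimize import curve_fit

def power_model(h, a, b):
    """Power-function biomass model: AGB = a * H**b."""
    return a * np.power(h, b)

# Hypothetical plot-level data: canopy height (m) and above-ground biomass (t/ha)
height = np.array([0.25, 0.40, 0.55, 0.70, 0.85, 1.00])
biomass = np.array([1.1, 2.4, 4.0, 6.1, 8.3, 11.0])

params, _ = curve_fit(power_model, height, biomass, p0=(10.0, 1.5))
a, b = params
pred = power_model(height, a, b)
rmse = float(np.sqrt(np.mean((pred - biomass) ** 2)))
print(f"AGB = {a:.2f} * H^{b:.2f}, RMSE = {rmse:.2f} t/ha")
```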
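For item (2), the following sketch quantifies lodging from the height drop between canopy height rasters of the same field acquired before and after a lodging event, flagging cells whose height fell by more than a threshold. The rasters and the 0.3 m threshold are hypothetical; published studies tune such thresholds per crop and severity class.

```python
import numpy as np

def lodging_rate(chm_before: np.ndarray, chm_after: np.ndarray,
                 drop_threshold_m: float = 0.3) -> float:
    """Fraction of cells whose canopy height dropped by more than the threshold."""
    height_drop = chm_before - chm_after
    lodged = height_drop > drop_threshold_m
    return float(lodged.mean())

# Hypothetical 100 x 100 canopy height models (m): one corner of the field lodges
rng = np.random.default_rng(1)
chm_before = rng.normal(0.9, 0.03, (100, 100))
chm_after = chm_before.copy()
chm_after[:40, :50] -= 0.5          # lodged patch loses about 0.5 m of height
print(lodging_rate(chm_before, chm_after))  # about 0.20 (20% of the area lodged)
```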
In summary, in agricultural applications of near-field height measurement, crop growth parameters are mostly estimated with empirical statistical regression because of its low technical barrier, few inversion parameters, and simple, effective procedure[107] (Table 1). However, such models require large amounts of measured data for inversion and lack clear physical meaning. Some applications build machine learning models on plant height and other traits to improve growth inversion, but in practice separate models generally have to be built for different varieties, growth stages, and other factors. Yu et al.[82] coupled sugarcane plant height with a field hydrological model to build a new data assimilation system, which helps improve yield estimation accuracy for gramineous crops. Assimilating plant height into crop models can improve the inversion accuracy of crop traits and extend models in space and time. Future research could therefore integrate plant height into crop models to improve the accuracy of growth parameter inversion and address the weak generality and poor stability of empirical and traditional machine learning models.

5 Problems and Prospects

5.1 Balancing Height Measurement Accuracy and Cost

UAVs with visible-light cameras can improve height accuracy with various auxiliary spatial data, such as a DTM, DSM, ground control points (GCPs), and ground-measured plant height. Different fields, such as scientific research and agricultural production, trade off auxiliary data against accuracy and cost requirements. Reference [108] systematically evaluated the accuracy and cost of UAV visible-light height measurement under various combinations of auxiliary spatial data. Drawing on the discussion in [108], this review uses the coefficient of determination R² and RMSE as accuracy metrics and labor, time, and operation costs as cost metrics to jointly evaluate height measurement under different combinations of four data types: DTM, GCPs, ground-measured height, and canopy density, as shown in Table 2.
Table 2 Evaluation summary of accuracy and cost for crop height estimation

| Category | Canopy density | R² | RMSE | Labor cost | Time cost | Operation cost |
| --- | --- | --- | --- | --- | --- | --- |
| 1 | Sparse/Dense | ★★★★★ | ★★★★ | ★★☆ | ★☆ | ★★ |
| | Sparse/Dense | ★★★★★ | ★★★★★ | | | |
| 2 | Sparse | ★★★★★ | ★★★★ | ★★★☆ | ★★★★ | ★★★ |
| | Dense | ★☆ | / | ★★★☆ | ★★★★ | ★★★ |
| | Sparse | ★★★★★ | ★★★★★ | ★☆ | ★★★ | ★☆ |
| | Dense | ★☆ | ★★ | ★☆ | ★★★ | ★☆ |
| 3 | Sparse | ★★ | / | ★★★★ | ★★☆ | ★★★★ |
| | Dense | ★★ | / | ★★ | ★★ | |
| | Sparse | ★★ | ★★★ | ★★★★ | ★★☆ | ★★★★ |
| | Dense | ★★ | ★★★ | ★★ | ★★ | |
| 4 | Sparse | / | | ★★★★★ | ★★★★★ | ★★★★★ |
| | Dense | / | | ★★★★★ | ★★★★★ | ★★★★★ |
| | Sparse | ★☆ | ★★★ | ★★★★ | ★★★☆ | |
| | Dense | ★☆ | ★★★ | ★★★★ | ★★★☆ | |

Note: Rows within each category correspond to different combinations of DTM, GCP, and ground-measured height data (see the discussion in Section 5.1). Ratings use a star-based 10-point scale, where "☆" counts 1 point and "★" counts 2 points; more points indicate higher accuracy or lower cost

In Category 1, both DTM and GCP data are collected and combined with the DSM to achieve high-accuracy height measurement. Adding ground-measured data to build a linear regression model can further reduce the absolute error of plant height. Using the complete set of auxiliary spatial data raises acquisition cost but is necessary for fine-grained height extraction. For example, gene mapping of the plant height trait has important theoretical and applied value for breeding[109]; breeders follow height dynamics over the whole growing season, and such cases require complete auxiliary spatial data to achieve high-accuracy estimates[110].
DTM data are generally collected before emergence or after harvest to avoid occlusion of the soil by the crop. When the canopy is sparse, however, bare-soil elevations can be extracted from the DSM as the base surface, which lowers acquisition cost and, with enough exposed soil, yields accuracy close to that obtained with complete data (see Section 3.3.2). When the canopy has closed and the terrain is rugged, it is hard to construct an accurate DTM by interpolating bare ground, and acquiring a separate DTM dataset becomes necessary.
During UAV image mosaicking, importing the same set of GCPs registers multiple images and improves image quality, enabling observation of the height dynamics of particular plants or populations. However, in rugged terrain, scattered fields, or fields with irrigation systems, laying out GCPs is difficult[75]. GCP positions must also be surveyed with real-time kinematic (RTK) instruments, whose signal strength is easily affected by the surroundings, such as high-voltage lines, transformers, or topography. From the standpoint of agricultural production, GCP deployment and position surveying are therefore demanding and costly. When GCPs cannot be deployed in the field, extracting GCPs directly from the imagery helps register multi-date data, but height accuracy is usually lower than with complete data.
By evaluating both accuracy requirements and data acquisition costs, Table 2 provides a reference for designing crop height measurement schemes in research and production, allowing auxiliary spatial data to be chosen sensibly while still meeting accuracy requirements.

5.2 Fine-Scale Height Measurement from UAV Platforms

Low-altitude passive UAV remote sensing derives field crop height from image-based 3D reconstruction. For crops with erect, narrow leaves such as maize, rice, and wheat, it is difficult to recover height information at the ears or leaf tips, which easily leads to underestimation of plant height. Liu et al.[111] used a Mavic Pro2 flying at 5 m to build a canopy point cloud for direct height measurement but still could not recover the complete ear structure.
Deriving crop elevation from visible-light images is an indirect approach, whereas LiDAR measures directly from point clouds, and its height accuracy is usually better[20,21]. Several lightweight LiDAR units have been flown on UAVs in preliminary crop height studies (Table 3); all weigh under 4 kg, with ranging accuracies of 0.5-5 cm. As Table 3 shows, however, good plant height accuracy was obtained only when the measurement height was reduced (below 20 m); for the remaining crop entries in Table 3, R² stays below 0.8. Moreover, these LiDAR units remain expensive. These are the main factors limiting UAV-mounted LiDAR in agriculture. In October 2020, DJI released the "Zenmuse L1", which integrates the low-cost, lightweight Livox AVIA LiDAR. Hu et al.[112] evaluated the same brand's MID 40 for forest inventory and obtained point clouds with densities above 464 pts/m² at a flight height of 100 m, enabling accurate calculation of forest phenotypes such as tree height, canopy cover, and gap fraction. Compared with the MID 40, the AVIA has a larger field of view (FOV) and a higher point rate, which can increase acquisition efficiency and point density, but no study has yet applied it to field crop phenotyping. At present, the high cost of LiDAR systems, together with point density and ranging accuracy that still fall short of precise crop phenotyping, remains an urgent problem in remote sensing height measurement.
Table 3 Studies on the height measurement of UAVs equipped with LiDAR system

| Sensor | Target | FOV/(°) | Ranging accuracy/cm | Weight/kg | Flight speed/(m·s⁻¹) | Measurement height/m | Point density/(pts·m⁻²) | Accuracy |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Livox MID 40 | Forest[112] | 38.40 | 2.00 | 0.76 | 4.00 | 100.00 | 464.5 | R² = 0.96, RMSE = 0.59 m |
| RIEGL VUX-1UAV | Maize[78] | 330 | 0.50 | 3.50 | 3.00 | 15.00 | 112.0~570.0 | R² = 0.96, RMSE = 0.13 m |
| | Wheat[26] | | | | 5.85 | 41.84 | 997.0 | R² = 0.78, RMSE = 0.03 m |
| | Potato[26] | | | | 5.85 | 41.84 | 833.0 | R² = 0.50, RMSE = 0.12 m |
| | Sugar beet[26] | | | | 5.85 | 41.84 | 933.0 | R² = 0.70, RMSE = 0.07 m |
| | Maize[113] | | | | | 150.00 | 420.0 | R² = 0.65, RMSE = 0.24 m |
| | Soybean[113] | | | | | 150.00 | 420.0 | R² = 0.40, RMSE = 0.09 m |
| Velodyne VLP-16 | Cotton[4] | 360 | 3.00 | 0.83 | 2.00 | 9.00 | 1682.0 | RE = 12.73%, RMSE = 0.03 m |
| | Soybean[114] | | | | 0.50 | 9.00 | 1600.0 | RE = 5.14% |

Note: RE stands for relative error
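As a rough complement to Table 3, average ground point density can be approximated from pulse rate, flight height, speed, and FOV. This is a back-of-envelope relation for a nadir-looking line-scanning LiDAR with a moderate FOV, not a vendor specification formula and not applicable to non-repetitive scan patterns such as Livox sensors; the pulse rate below is hypothetical. It shows why the lower, slower flights in Table 3 yield denser clouds.

```python
import math

def lidar_point_density(pulse_rate_hz: float, flight_height_m: float,
                        speed_ms: float, fov_deg: float) -> float:
    """Approximate ground point density (points per m^2) for a single pass
    of a nadir-looking, line-scanning UAV LiDAR.

    Swath width grows with height and FOV, the strip area covered per second
    is swath_width * speed, so density = pulses per second / area per second.
    """
    swath_width = 2.0 * flight_height_m * math.tan(math.radians(fov_deg) / 2.0)
    area_per_second = swath_width * speed_ms
    return pulse_rate_hz / area_per_second


# Hypothetical 300 kHz scanner: halving the flight height roughly doubles density
print(round(lidar_point_density(300_000, flight_height_m=50, speed_ms=5, fov_deg=60), 1))
print(round(lidar_point_density(300_000, flight_height_m=25, speed_ms=5, fov_deg=60), 1))
```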

5.3 Differences Between Remote Sensing and Agronomic Height Measurement

Crop morphology varies with cultivation practices, environment, and variety. Moreover, agronomic height measurement usually excludes parts such as the awns of gramineous crops and the tendrils of legumes[115]. Remote sensing, in contrast, generally measures the vertical distance from the top of the entire plant in its natural state to the ground, so its results differ from agronomic measurements. For clarity, "natural plant height" and "plant length" are used below for the heights obtained by remote sensing and agronomy, respectively (Fig. 5). For example, wheat plant architecture changes with cultivation practice and can be classified as "erect" or "drooping" according to flag leaf morphology[116]; remote sensing easily underestimates the true plant length of drooping types. The natural plant height obtained by remote sensing can support identification of lodged areas and assessment of lodging severity, but it cannot provide the true plant length after lodging. Likewise, remote sensing includes the awns in the measurement. In all these cases the true plant length cannot be obtained, which in turn affects applications such as yield estimation. Height measurement schemes therefore need to be designed around the requirements of the agricultural application. When natural plant height and true plant length differ, multiple sensors can be combined for height extraction: for example, a visible-light camera can identify the main body of the plant from texture images, LiDAR can then extract the plant skeleton, and bent sections can be measured piecewise. Multi-camera oblique photogrammetry captures rich texture information of ground objects[117,118], and building 3D crop models in this way also offers a possible solution for measuring the length of inclined plants.
Fig. 5 Difference of height estimation between agronomic and remote sensing methods

(a) Lodged crop (b) Crop with awns

5.4 Future Research Directions

Over the past decade, near-field remote sensing has been widely applied to field crop height measurement, enabling synchronous monitoring over large areas and delivering accurate, repeatable plant height data. Given the problems it still faces, future research in this field can focus on the following four aspects.
(1) As the main platform for crop height acquisition, UAVs need greater payload capacity and endurance, while height sensors need to become lighter, smaller, and cheaper, so that crop height can be observed efficiently over large areas.
(2) Passive sensors cannot penetrate the crop canopy, so the DTM must come either from a separate flight over bare ground or from interpolating soil pixels extracted from the DSM; the former increases acquisition cost, and the latter loses accuracy when little soil is exposed. Bare-soil detection and interpolation algorithms therefore need to be improved to allow interpolation from small samples of bare ground and accurate soil detection in complex field environments, improving both acquisition efficiency and height accuracy.
(3) Plant height is widely used in agriculture. On the one hand, it serves to estimate many growth parameters, but inversion still relies mainly on empirical statistics and traditional machine learning, and generalizable growth inversion models across crops, growth stages, and environments need to be explored. On the other hand, tighter integration of remote sensing with genetics and breeding can supply high-throughput plant height data for studying the genetic mechanism of plant height, break the efficiency bottleneck in acquiring crop morphological data, and advance field crop breeding to raise grain yield and quality.
(4) Remote sensing and agronomic height measurements differ to some extent; crop height extraction methods should be developed with the crop's architecture and the scientific question in mind, so as to meet the needs of both research and practical applications.
1
PENG J, RICHARDS D E, HARTLEY N M, et al. 'Green revolution' genes encode mutant gibberellin response modulators[J]. Nature, 1999, 400: 256-261.

2
SASAKI A, ASHIKARI M, UEGUCHI-TANAKA M, et al. Green revolution: A mutant gibberellin-synthesis gene in rice[J]. Nature, 2002, 416: 701-702.

3
KUMAR K, NEELAM K, BHATIA D, et al. High resolution genetic mapping and identification of a candidate gene(s) for the purple sheath color and plant height in an interspecific F-2 population derived from Oryza nivara Sharma & Shastry × Oryza sativa L. cross[J]. Genetic Resources and Crop Evolution, 2020, 67(1): 97-105.

4
LIU K, DONG X, QIU B, et al. Analysis of cotton height spatial variability based on UAV-LiDAR[J]. International Journal of Precision Agricultural Aviation, 2020, 3(3): 72-76.

5
刘治开, 牛亚晓, 王毅, 等. 基于无人机可见光遥感的冬小麦株高估算[J]. 麦类作物学报, 2019, 39(7): 859-866.

LIU Z, NIU Y, WANG Y, et al. Estimation of plant height of winter wheat based on UAV visible image[J]. Journal of Triticeae Crops, 2019, 39(7): 859-866.

6
黄瑞冬, 李广权. 玉米株高整齐度及其测定方法的比较[J]. 玉米科学, 1995, 3(2): 61-63.

HUANG R, LI G. Plant height consistencies in maize population and a comparison of their measuring techniques[J]. Maize Science, 1995, 3(2): 61-63.

7
赵广才. 关于调查小麦株高标准的讨论[J]. 北京农业科学, 1996, 14(1): 18.

ZHAO G. Discussion on investigating high standard of wheat plant[J]. Beijing Agricultural Sciences, 1996, 14(1): 18.

8
刘建刚, 赵春江, 杨贵军, 等. 无人机遥感解析田间作物表型信息研究进展[J]. 农业工程学报, 2016, 32(24): 98-106.

LIU J, ZHAO C, YANG G, et al. Review of field-based phenotyping by unmanned aerial vehicle remote sensing platform[J]. Transactions of the CSAE, 2016, 32(24): 98-106.

9
HMIDA S BEN, KALLEL A, PGASTELLU-ETCHEGORRY J, et al. Crop biophysical properties estimation based on LiDAR full-waveform inversion using the DART RTM[J]. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 2017, 10(11): 4853-4868.

10
陈松尧, 程新文. 机载LiDAR系统原理及应用综述[J]. 测绘工程, 2007, 16(1): 27-31.

CHEN S, CHENG X. The principle and application of airborne LiDAR[J]. Engineering of Surveying and Mapping, 2007, 16(1): 27-31.

11
LI W, NIU Z, WANG C, et al. Combined use of airborne LiDAR and satellite GF-1 data to estimate leaf area index, height, and aboveground biomass of maize during peak growing season[J]. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 2015, 8(9): 4489-4501.

12
EITEL J U, HÖFLE B, VIERLING L A, et al. Beyond 3-D: The new spectrum of LiDAR applications for earth and ecological sciences[J]. Remote Sensing of Environment, 2016, 186: 372-392.

13
FRIEDLI M, KIRCHGESSNER N, GRIEDER C, et al. Terrestrial 3D laser scanning to track the increase in canopy height of both monocot and dicot crop species under field conditions[J]. Plant Methods, 2016, 12: ID 9.

14
CROMMELINCK S, HOEFLE B. Simulating an autonomously operating low-cost static terrestrial LiDAR for multitemporal maize crop height measurements[J]. Remote Sensing, 2016, 8(3): ID 205.

15
EITEL J U H, MAGNEY T S, VIERLING L A, et al. An automated method to quantify crop height and calibrate satellite-derived biomass using hypertemporal LiDAR[J]. Remote Sensing of Environment, 2016, 187: 414-422.

16
QIU Q, SUN N, BAI H, et al. Field-based high-throughput phenotyping for maize plant using 3D LiDAR point cloud generated with a "Phenomobile"[J]. Frontiers in Plant Science, 2019, 10: ID 554.

17
ANDUJAR D, ESCOLA A, ROSELL-POLO J R, et al. Potential of a terrestrial LiDAR-based system to characterise weed vegetation in maize crops[J]. Computers and Electronics in Agriculture, 2013, 92: 11-15.

18
MALAMBO L, POPESCU S C, MURRAY S C, et al. Multitemporal field-based plant height estimation using 3D point clouds generated from small unmanned aerial systems high-resolution imagery[J]. International Journal of Applied Earth Observation and Geoinformation, 2018, 64: 31-42.

19
苏伟, 蒋坤萍, 郭浩, 等. 地基激光雷达提取大田玉米植株表型信息[J]. 农业工程学报, 2019, 35(10): 125-130.

SU W, JIANG K, GUO H, et al. Extraction of phenotypic information of maize plants in field by terrestrial laser scanning[J]. Transactions of the CSAE, 2019, 35(10): 125-130.

20
YUAN W, LI J, BHATTA M, et al. Wheat height estimation using LiDAR in comparison to ultrasonic sensor and UAS[J]. Sensors, 2018, 18(11): ID 3731.

21
MADEC S, BARET F, DE SOLAN B, et al. High-throughput phenotyping of plant height: Comparing unmanned aerial vehicles and ground LiDAR Estimates[J]. Frontiers in Plant Science, 2017, 8: ID 2002.

22
PHAN A T T, TAKAHASHI K, RIKIMARU A, et al. Method for estimating rice plant height without ground surface detection using laser scanner measurement[J]. Journal of Applied Remote Sensing, 2016, 10(4): ID 046018.

23
JIMENEZ-BERNI J A, DEERY D M, PABLO R L, et al. Throughput determination of plant height, ground cover, and above-ground biomass in wheat with LiDAR[J]. Frontiers in Plant Science, 2018, 9: ID 237.

24
WANG X, SINGH D, MARLA S, et al. Field-based high-throughput phenotyping of plant height in sorghum using different sensing technologies[J]. Plant Methods, 2018, 14: ID 53.

25
BARKER J, ZHANG N, SHARON J, et al. Development of a field-based high-throughput mobile phenotyping platform[J]. Computers and Electronics in Agriculture, 2016, 122: 74-85.

26
HARKEL T J, BARTHOLOMEUS H, KOOISTRA L, et al. Biomass and crop height estimation of different crops using UAV-based LiDAR[J]. Remote Sensing, 2020, 12(1): ID 17.

27
THOMPSON A L, THORP K R, CONLEY M M, et al. Comparing nadir and multi-angle view sensor technologies for measuring in-field plant height of upland cotton[J]. Remote Sensing, 2019, 11: ID 700.

28
SCHIRRMANN M, HAMDORF A, GIEBEL A, et al. Regression kriging for improving crop height models fusing ultra-sonic sensing with UAV imagery[J]. Remote Sensing, 2017, 9(7): ID 665.

29
YUAN H, BENNETT R S, WANG N, et al. Development of a peanut canopy measurement system using a ground-based LiDAR sensor[J]. Frontiers in Plant Science, 2019, 10: ID 203.

30
BARMEIER G, MISTELE B, SCHMIDHALTER U, et al. Referencing laser and ultrasonic height measurements of barley cultivars by using a herbometre as standard[J]. Crop & Pasture Science, 2016, 67(12): 1215-1222.

31
PITTMAN J J, ARNALL D B, INTERRANTE S M, et al. Estimation of biomass and canopy height in bermudagrass, alfalfa, and wheat using ultrasonic, laser, and spectral sensors[J]. Sensors, 2015, 15(2): 2920-2943.

32
冯佳睿, 马晓丹, 关海鸥, 等. 基于深度信息的大豆株高计算方法[J]. 光学学报, 2019, 39(5): 258-268.

FENG J, MA X, GUAN H, et al. Calculation method of soybean plant height based on depth information[J]. Acta Optica Sinica, 2019, 39(5): 258-268.

33
MARTINEZ-GUANTER J, RIBEIRO A, PETEINATOS G G, et al. Low-cost three-dimensional modeling of crop plants[J]. Sensors, 2019, 19: ID 2883.

34
VAZQUEZ-ARELLANO M, PARAFOROS D S, REISER D, et al. Determination of stem position and height of reconstructed maize plants using a time-of-flight camera[J]. Computers and Electronics in Agriculture, 2018, 154: 276-288.

35
HAEMMERLE M, HOEFLE B. Mobile low-cost 3D camera maize crop height measurements under field conditions[J]. Precision Agriculture, 2018, 19(4): 630-647.

36
MA X, ZHU K, GUAN H, et al. High-throughput phenotyping analysis of potted soybean plants using colorized depth images based on a proximal platform[J]. Remote Sensing, 2019, 11(9): ID 1085.

37
XIONG X, YU L, YANG W, et al. A high-throughput stereo-imaging system for quantifying rape leaf traits during the seedling stage[J]. Plant Methods, 2017, 13: ID 7.

38
MANO M. Precise and continuous measurement of plant heights in an agricultural field using a time-lapse camera[J]. Journal of Agricultural Meteorology, 2017, 73(3): 100-108.

39
SRITARAPIPAT T, RAKWATIN P, KASETKASEM T, et al. Automatic rice crop height measurement using a field server and digital image processing[J]. Sensors, 2014, 14(1): 900-926.

40
CAI J, KUMAR P, CHOPIN J, et al. Land-based crop phenotyping by image analysis: Accurate estimation of canopy height distributions using stereo images[J]. PloS One, 2018, 13(5): ID e0196671.

41
BROCKS S, BARETH G. Evaluating dense 3D reconstruction software packages for oblique monitoring of crop canopy surface[C]// The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences. Prague, Czech Republic: XXIII ISPRS Congress, 2016: 785-789.

42
ZHANG Y, TENG P, AONO M, et al. 3D monitoring for plant growth parameters in field with a single camera by multi-view approach[J]. Journal of Agricultural Meteorology, 2018, 74(4): 129-139.

43
BENDIG J, BOLTEN A, BARETH G, et al. UAV-based imaging for multi-temporal, very high resolution crop surface models to monitor crop growth variability[J]. Photogramm Fernerkund Geoinf, 2013, 47(6): 551-562.

44
HOLMAN F H, RICHE A B, MICHALSKI A, et al. High throughput field phenotyping of wheat plant height and growth rate in field plot trials using UAV based remote sensing[J]. Remote Sensing, 2016, 8(12): ID 1031.

45
CHANG A, JUNG J, MAEDA M M, et al. Crop height monitoring with digital imagery from Unmanned Aerial System (UAS)[J]. Computers and Electronics in Agriculture, 2017, 141: 232-237.

46
陶惠林, 徐良骥, 冯海宽, 等. 基于无人机数码影像的冬小麦株高和生物量估算[J]. 农业工程学报, 2019, 35(19): 107-116.

TAO H, XU L, FENG H, et al. Estimation of plant height and biomass of winter wheat based on UAV digital image[J]. Transactions of the CSAE, 2019, 35(19): 107-116.

47
LOWE D G. Distinctive image features from scale-invariant keypoints[J]. International Journal of Computer Vision, 2004, 60(2): 91-110.

48
BELTON D, HELMHOLZ P, LONG J, et al. Crop height monitoring using a consumer-grade camera and UAV technology[J]. Journal of Photogrammetry Remote Sensing and Geoinformation Science, 2019, 87: 249-262.

49
HAN L, YANG G, DAI H, et al. Fuzzy clustering of maize plant-height patterns using time series of UAV remote-sensing images and variety traits[J]. Frontiers in Plant Science, 2019, 10: ID 926.

50
TIRADO S B, HIRSCH C N, SPRINGER N M, et al. UAV-based imaging platform for monitoring maize growth throughout development[J]. Plant Direct, 2020, 4(6): ID e00230.

51
牛庆林, 冯海宽, 杨贵军, 等. 基于无人机数码影像的玉米育种材料株高和LAI监测[J]. 农业工程学报, 2018, 34(5): 73-82.

NIU Q, FENG H, YANG G, et al. Monitoring plant height and leaf area index of maize breeding material based on UAV digital images[J]. Transactions of the CSAE, 2018, 34(5): 73-82.

52
LIU H, ZHANG J, PAN Y, et al. An efficient approach based on UAV orthographic imagery to map paddy with support of field-level canopy height from point cloud data[J]. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 2018, 11(6): 2034-2046.

53
HU P, CHAPMAN S C, WANG X, et al. Estimation of plant height using a high throughput phenotyping platform based on unmanned aerial vehicle and self-calibration: Example for sorghum breeding[J]. European Journal of Agronomy, 2018, 95: 24-32.

54
HAN X, THOMASSON J A, BAGNALL G C, et al. Measurement and calibration of plant-height from fixed-wing UAV images[J]. Sensors, 2018, 18(12): ID 4092.

55
SONG Y, WANG J. Winter wheat canopy height extraction from UAV-based point cloud data with a moving cuboid filter[J]. Remote Sensing, 2019, 11(5): ID 1239.

56
BORRA-SERRANO I, DE SWAEF T, QUATAERT P, et al. Closing the phenotyping gap: High resolution UAV time series for soybean growth analysis provides objective data from field trials[J]. Remote Sensing, 2020, 12(10): ID 1644.

57
XU R, LI C, PATERSON A H, et al. Multispectral imaging and unmanned aerial systems for cotton plant phenotyping[J]. PloS One, 2019, 14(2): ID e0205083.

58
WU M, YANG C, SONG X, et al. Evaluation of orthomosics and digital surface models derived from aerial imagery for crop type mapping[J]. Remote Sensing, 2017, 9(3): ID 239.

59
GUO T, FANG Y, CHENG T, et al. Detection of wheat height using optimized multi-scan mode of LiDAR during the entire growth stages[J]. Computers and Electronics in Agriculture, 2019, 165: ID 104959.

60
HOFFMEISTER D, WALDHOFF G, KORRES W, et al. Crop height variability detection in a single field by multi-temporal terrestrial laser scanning[J]. Precision Agriculture, 2016, 17(3): 296-312.

61
郭新年, 周恒瑞, 张国良, 等. 基于激光视觉的农作物株高测量系统[J]. 农业机械学报, 2018, 49(2): 22-27.

GUO X, ZHOU H, ZHANG G, et al. Crop height measurement system based on laser vision[J]. Transactions of the CSAM, 2018, 49(2): 22-27.

62
程曼, 蔡振江, 袁洪波, 等. 基于地面激光雷达的田间花生冠层高度测量系统研制[J]. 农业工程学报, 2019, 35(1): 180-187.

CHENG M, CAI Z, YUAN H, et al. System design for peanut canopy height information acquisition based on LiDAR[J]. Transactions of the CSAE, 2019, 35(1): 180-187.

63
TAO H, FENG H, XU L, et al. Estimation of the yield and plant height of winter wheat using UAV-based hyperspectral images[J]. Sensors, 2020, 20(4): ID 1231.

64
HAN L, YANG G, YANG H, et al. Clustering field-based maize phenotyping of plant-height growth and canopy spectral dynamics using a UAV remote-sensing approach[J]. Frontiers in Plant Science, 2018, 9: ID 1638.

65
VARELA S, ASSEFA Y, PRASAD P V V, et al. Spatio-temporal evaluation of plant height in corn via unmanned aerial systems[J]. Journal of Applied Remote Sensing, 2017, 11(3): ID 036013.

66
MHLANGA B, CHAUHAN B S, THIERFELDER C, et al. Weed management in maize using crop competition: A review[J]. Crop Protection, 2016, 88: 28-36.

67
YOUNGERMAN C Z, DITOMMASO A, CURRAN W S, et al. Corn density effect on interseeded cover crops, weeds, and grain yield[J]. Agronomy Journal, 2018, 110(6): 2478-2487.

68
ENCISO J, AVILA C A, JUNG J, et al. Validation of agronomic UAV and field measurements for tomato varieties[J]. Computers and Electronics in Agriculture, 2019, 158: 278-283.

69
BENDIG J, BOLTEN A, BENNERTZ S, et al. Estimating biomass of barley using crop surface models (CSMs) derived from UAV-based RGB imaging[J]. Remote Sensing, 2014, 6(11): 10395-10412.

70
POUND M P, FRENCH A P, MURCHIE E H, et al. Automated recovery of three-dimensional models of plant shoots from multiple color images[J]. Plant Physiology, 2014, 166(4): 1688-1698.

71
HASHEMINASAB S M, ZHOU T, HABIB A, et al. GNSS/INS-Assisted structure from motion strategies for UAV-Based imagery over mechanized agricultural fields[J]. Remote Sensing, 2020, 12(3): ID 351.

72
DANDRIFOSSE S, BOUVRY A, LEEMANS V, et al. Imaging wheat canopy through stereo vision: Overcoming the challenges of the laboratory to field transition for morphological features extraction[J]. Frontiers in Plant Science, 2020, 11: ID 96.

73
邱小雷, 方圆, 郭泰, 等. 基于地基LiDAR高度指标的小麦生物量监测研究[J]. 农业机械学报, 2019, 50(10): 159-166.

QIU X, FANG Y, GUO T, et al. Monitoring of wheat biomass based on terrestrial-LiDAR height metric[J]. Transactions of the CSAM, 2019, 50(10): 159-166.

74
BUELVAS R M, ADAMCHUK V I, LEKSONO E, et al. Biomass estimation from canopy measurements for leafy vegetables based on ultrasonic and laser sensors[J]. Computers and Electronics in Agriculture, 2019, 164: ID 104896.

75
YUE J, YANG G, LI C, et al. Estimation of winter wheat above-ground biomass using Unmanned Aerial Vehicle-based snapshot hyperspectral sensor and crop height improved models[J]. Remote Sensing, 2017, 9(7): ID 708.

76
CEN H, WAN L, ZHU J, et al. Dynamic monitoring of biomass of rice under different nitrogen treatments using a lightweight UAV with dual image-frame snapshot cameras[J]. Plant Methods, 2019, 15: ID 32.

77
BALLESTEROS R, FERNANDO ORTEGA J, HERNANDEZ D, et al. Onion biomass monitoring using UAV-based RGB imaging[J]. Precision Agriculture, 2018, 19(5): 840-857.

78
ZHOU L, GU X, CHENG S, et al. Analysis of plant height changes of lodged maize using UAV-LiDAR data[J]. Agriculture, 2020, 10(5): ID 146.

79
CHU T, STAREK M J, BREWER M J, et al. Assessing lodging severity over an experimental maize (Zea mays L.) field using UAS images[J]. Remote Sensing, 2017, 9(9): ID 923.

80
WILKE N, SIEGMANN B, KLINGBEIL L, et al. Quantifying lodging percentage and lodging severity using a UAV-based canopy height model combined with an objective threshold approach[J]. Remote Sensing, 2019, 11(5): ID 515.

81
GEIPEL J, LINK J, CLAUPEIN W, et al. Combined spectral and spatial modeling of corn yield based on aerial images and crop surface models acquired with an unmanned aircraft system[J]. Remote Sensing, 2014, 6(11): 10335-10355.

82
YU D, ZHA Y, SHI L, et al. Improvement of sugarcane yield estimation by assimilating UAV-derived plant height observations[J]. European Journal of Agronomy, 2020, 121: ID 126159.

83
FENG A, ZHOU J, VORIES E D, et al. Yield estimation in cotton using UAV-based multi-sensor imagery[J]. Biosystems Engineering, 2020, 193: 101-114.

84
LI B, XU X, ZHANG L, et al. Above-ground biomass estimation and yield prediction in potato by using UAV-based RGB and hyperspectral imaging[J]. Isprs Journal of Photogrammetry and Remote Sensing, 2020, 162: 161-172.

85
HASSAN M A, YANG M, FU L, et al. Accuracy assessment of plant height using an unmanned aerial vehicle for quantitative genomic analysis in bread wheat[J]. Plant Methods, 2019, 15: ID 37.

86
WANG X, ZHANG R, SONG W, et al. Dynamic plant height QTL revealed in maize through remote sensing phenotyping using a high-throughput unmanned aerial vehicle (UAV)[J]. Scientific Reports, 2019, 9: ID 3458.

87
ROTH L, STREIT B. Predicting cover crop biomass by lightweight UAS-based RGB and NIR photography: An applied photogrammetric approach[J]. Precision Agriculture, 2018, 19(1): 93-114.

88
BENDIG J, YU K, AASEN H, et al. Combining UAV-based plant height from crop surface models, visible, and near infrared vegetation indices for biomass monitoring in barley[J]. International Journal of Applied Earth Observation and Geoinformation, 2015, 39: 79-87.

89
LI J, SHI Y, VEERANAMPALAYAM-SIVAKUMAR AN, et al. Elucidating sorghum biomass, nitrogen and chlorophyll contents with spectral and morphological traits derived from unmanned aircraft system[J]. Frontiers in Plant Science, 2018, 9: ID 1406.

90
MICHEZ A, BAUWENS S, BROSTAUX Y, et al. How far can consumer-grade uav rgb imagery describe crop production? A 3D and multitemporal modeling approach applied to Zea mays[J]. Remote Sensing, 2018, 10(11): ID 1798.

91
HAN L, YANG G, DAI H, et al. Modeling maize above-ground biomass based on machine learning approaches using UAV remote-sensing data[J]. Plant Methods, 2019, 15: ID 10.

92
ZHU W, SUN Z, PENG J, et al. Estimating maize above-ground biomass using 3D point clouds of multi-source unmanned aerial vehicle data at multi-spatial scales[J]. Remote Sensing, 2019, 11(22): ID 2678.

93
GREAVES H E, VIERLING L A, EITEL J U H, et al. Estimating aboveground biomass and leaf area of low-stature Arctic shrubs with terrestrial LiDAR[J]. Remote Sensing of Environment, 2015, 164: 26-35.

94
GIL-DOCAMPO M L, ARZA-GARCIA M, ORTIZ-SANZ J, et al. Above-ground biomass estimation of arable crops using UAV-based SfM photogrammetry[J]. Geocarto International, 2020, 35(7): 687-699.

95
杨琦, 叶豪, 黄凯, 等. 利用无人机影像构建作物表面模型估测甘蔗LAI[J]. 农业工程学报, 2017, 33(8): 104-111.

YANG Q, YE H, HUANG K, et al. Estimation of leaf area index of sugarcane using crop surface model based on UAV image[J]. Transactions of the CSAE, 2017, 33(8): 104-111.

96
SINGH D, WANG X, KUMAR U, et al. High-throughput phenotyping enabled genetic dissection of crop lodging in wheat[J]. Frontiers in Plant Science, 2019, 10: ID 394.

97
SU W, ZHANG M, BIAN D, et al. Phenotyping of corn plants using Unmanned Aerial Vehicle (UAV) images[J]. Remote Sensing, 2019, 11(17): ID 2021.

98
HAN L, YANG G, FENG H, et al. Quantitative identification of maize lodging-causing feature factors using unmanned aerial vehicle images and a nomogram computation[J]. Remote Sensing, 2018, 10(10): ID 1528.

99
ACORSI M G, MARTELLO M, ANGNES G, et al. Identification of maize lodging: A case study using a remotely piloted aircraft system[J]. Engenharia Agricola, 2019, 39: 66-73.

100
LI J, VEERANAMPALAYAM-SIVAKUMAR AN, BHATTA M, et al. Principal variable selection to explain grain yield variation in winter wheat from features extracted from UAV imagery[J]. Plant Methods, 2019, 15: ID 123.

101
ZHOU G, YIN X. Relationship of cotton nitrogen and yield with normalized difference vegetation index and plant height[J]. Nutrient Cycling in Agroecosystems, 2014, 100(2): 147-160.

102
XUE H, TIAN X, ZHANG K, et al. Mapping developmental QTL for plant height in soybean [Glycine max (L.) Merr.] using a four-way recombinant inbred line population[J]. PloS One, 2019, 14(11): ID e0224897.

103
HERTER C P, EBMEYER E, KOLLERS S, et al. Rht24 reduces height in the winter wheat population 'Solitar x Bussard' without adverse effects on Fusarium head blight infection[J]. Theoretical and Applied Genetics, 2018, 131(6): 1263-1272.

104
MA X, FENG F, WEI H, et al. Genome-wide association study for plant height and grain yield in rice under contrasting moisture regimes[J]. Frontiers in Plant Science, 2016, 7: ID 1801.

105
WATANABE K, GUO W, ARAI K, et al. High-throughput phenotyping of sorghum plant height using an Unmanned Aerial Vehicle and its application to genomic prediction modeling[J]. Frontiers in Plant Science, 2017, 8: ID 421.

106
KAKERU W, WEI G, KEIGO A, et al. High-throughput phenotyping of sorghum plant height using an unmanned aerial vehicle and its application to genomic prediction modeling[J]. Frontiers in Plant Science, 2017, 8: ID 421.

107
刘忠, 万炜, 黄晋宇, 等. 基于无人机遥感的农作物长势关键参数反演研究进展[J]. 农业工程学报, 2018, 34(24): 60-71.

LIU Z, WAN W, HUANG J, et al. Progress on key parameters inversion of crop growth based on unmanned aerial vehicle remote sensing[J]. Transactions of the CSAE, 2018, 34(24): 60-71.

108
XIE T, LI J, YANG C, et al. Crop height estimation based on UAV images: methods, errors, and strategies[J]. Computers and Electronics in Agriculture, 2021, 185: ID 106155.

109
WALTER J D C, EDWARDS J, MCDONALD G, et al. Estimating biomass and canopy height with LiDAR for field crop breeding[J]. Frontiers in Plant Science, 2019, 10: ID 1145.

110
WANG H, WANG R, LIU B, et al. QTL analysis of salt tolerance in Sorghum bicolor during whole-plant growth stages[J]. Plant Breeding, 2020, 139(3): 455-465.

111
LIU F, HU P, ZHENG B, et al. A field-based high-throughput method for acquiring canopy architecture using unmanned aerial vehicle images[J]. Agricultural and Forest Meteorology, 2021, 296: ID 108231.

112
HU T, SUN X, SU Y, et al. Development and performance evaluation of a very low-cost UAV-LiDAR system for forestry applications[J]. Remote Sensing, 2020, 13(1): ID 77.

113
LUO S, LIU W, ZHANG Y, et al. Maize and soybean heights estimation from unmanned aerial vehicle (UAV) LiDAR data[J]. Computers and Electronics in Agriculture, 2021, 182: ID 106005.

114
管贤平, 刘宽, 邱白晶, 等. 基于机载三维激光扫描的大豆冠层几何参数提取[J]. 农业工程学报, 2019, 35(23): 96-103.

GUAN X, LIU K, QIU B, et al. Extraction of geometric parameters of soybean canopy by airborne 3D laser scanning[J]. Transactions of the CSAE, 2019, 35(23): 96-103.

115
VIKHE P, VENKATESAN S, CHAVAN A, et al. Mapping of dwarfing gene Rht14 in durum wheat and its effect on seedling vigor, internode length and plant height[J]. The Crop Journal, 2019, 7(2): 187-197.

116
刘永康, 李明军, 李景原, 等. 小麦旗叶直立转披动态过程对其高光效的影响[J]. 科学通报, 2009(15): 2205-2211.

LIU Y, LI M, LI J, et al. Dynamic changes in flag leaf angle contribute to high photosynthetic capacity[J]. Chinese Science Bulletin, 2009, 54(15): 2205-2211.

117
CHENG T, LU N, WANG W, et al. Estimation of nitrogen nutrition status in winter wheat from unmanned aerial vehicle based multi-angular multispectral imagery[J]. Frontiers in Plant Science, 2019, 10: ID 1601.

118
CHE Y, WANG Q, XIE Z, et al. Estimation of maize plant height and leaf area index dynamic using unmanned aerial vehicle with oblique and nadir photography[J]. Annals of Botany, 2020, 126(4): 765-773.
