
Smart Agriculture, 2025, Vol. 7, Issue (6): 124-135. doi: 10.12133/j.smartag.SA202509029

• Special Issue: Remote Sensing + AI Empowering the Modernization of Agriculture and Rural Areas •

Robust UAV-Based Method for Peanut Plant Height Estimation Using Bare-Soil Invariant Constraints

SONG Mingxuan1, BAI Bo2, YANG Juntao1, ZHANG Yutao1, LI Sa1, LI Zhenhai1, WAN Shubo2, LI Guowei2

  1. College of Geodesy and Geomatics, Shandong University of Science and Technology, Qingdao 266590, China
    2. Shandong Provincial Key Laboratory of Physiology, Ecology and Efficient Production of Field Crops, Institute of Crop Germplasm Resources, Shandong Academy of Agricultural Sciences (in preparation), Jinan 250100, China
  • Received: 2025-09-25  Online: 2025-11-30
  • Foundation items: Shandong Provincial Key Research and Development Program (2022LZGC021); National Natural Science Foundation of China (32472116); Shandong Provincial Higher Education Young Scholars Science and Technology Support Program (2024KJH062)
  • About authors:
    SONG Mingxuan, E-mail: ;
    BAI Bo, E-mail:
  • Corresponding authors:
    YANG Juntao, E-mail: ;
    LI Guowei, E-mail:

Abstract:

[Objective] Peanut plant height is a key structural trait for assessing crop growth and nitrogen response. Accurate and efficient height acquisition is essential for monitoring canopy vigor, supporting genotype selection, and enabling precision management. However, conventional ground control point (GCP)-based methods require substantial field deployment and are highly sensitive to local misregistration between multi-temporal digital surface models (DSMs) and digital elevation models (DEMs). In low-stature, prostrate peanut canopies on uneven terrain, such residual elevation errors propagate directly into the canopy height model, severely reducing estimation accuracy. To overcome these limitations, a robust unmanned aerial vehicle (UAV)-based method is developed for peanut plant height estimation using bare-soil invariant constraints. The workflow incorporates crop-mask-assisted fine registration to optimize DSM-DEM alignment and eliminates the need for dense GCP distribution.

[Methods] Field experiments were conducted at a peanut experimental station in Wangbian Community, Ningyang County, Tai'an City, Shandong Province, China, using multiple UAV platforms (DJI Mavic 3 Multispectral and DJI MATRICE 350 RTK equipped with a DJI Zenmuse P1 camera), two growth stages (42 and 49 days after sowing, DAS), and two nitrogen fertilization levels (high nitrogen and low nitrogen). To validate the peanut plant height estimates, representative plants in each plot were selected before and after each UAV image acquisition, and manual measurements from the ground surface to the canopy apex were recorded as the reference plant height. High-resolution digital orthomosaic (DOM) images were then generated from the UAV data, and peanut canopy regions were extracted using the excess green (ExG) index. To mitigate threshold instability caused by variable illumination and soil background conditions, a fixed empirical threshold was combined with an adaptive strategy that integrated Otsu's between-class variance method and the median absolute deviation (MAD), thereby ensuring robust canopy segmentation across growth stages and nitrogen treatments. After canopy extraction, the peanut canopy mask derived from the DOM was used to remove the corresponding pixels from the DSM on a per-pixel basis. The DSMs with and without canopy points were then separately used for 3D reconstruction, yielding a canopy point cloud and a bare-soil point cloud. The bare-soil point cloud and a bare-soil DSM acquired before crop emergence (used as the DEM reference) were jointly input into the iterative closest point (ICP) algorithm to solve for a three-dimensional rigid transformation matrix. The resulting matrix jointly corrected translations and rotations along the X, Y, and Z axes and was applied uniformly to the DSM containing peanut canopy points, thereby achieving fine-scale alignment between the DSM and DEM at the block level. Following registration, the DEM was used as the ground reference, and the canopy height model was constructed by differencing the DSM and DEM pixel by pixel. The 95th percentile (P95) of canopy height within each plot, derived from the canopy height histogram, was used as the representative plant height to reduce the influence of local noise on the statistics. Illustrative sketches of the canopy segmentation step and of the registration and height-extraction step are provided below.
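To make the canopy-segmentation step concrete, the following is a minimal Python sketch of ExG masking with a fixed empirical threshold backed by an adaptive Otsu + MAD check. The abstract does not state how the fixed threshold, the Otsu estimate, and the MAD are combined, so the clipping rule below, the function name exg_canopy_mask, and the scikit-image dependency are illustrative assumptions rather than the authors' exact implementation.

```python
# Minimal sketch: ExG canopy segmentation with a fixed threshold plus an
# adaptive Otsu + MAD safeguard. The combination rule is an assumption; the
# abstract only states that the three ingredients are integrated.
import numpy as np
from skimage.filters import threshold_otsu  # assumed dependency

def exg_canopy_mask(rgb, fixed_thresh=0.10):
    """rgb: float array of shape (H, W, 3) with values scaled to [0, 1]."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    total = r + g + b + 1e-6                      # avoid division by zero
    rn, gn, bn = r / total, g / total, b / total  # normalized chromatic coordinates
    exg = 2.0 * gn - rn - bn                      # excess green (ExG) index

    otsu_t = threshold_otsu(exg)                  # adaptive between-class-variance threshold
    med = np.median(exg)
    mad = np.median(np.abs(exg - med))            # median absolute deviation (MAD)

    # Assumed rule: accept the Otsu value only while it stays within a
    # MAD-scaled band around the fixed empirical threshold; otherwise fall
    # back to the fixed value (0.10 performed best in the experiments).
    lo, hi = fixed_thresh - 3.0 * mad, fixed_thresh + 3.0 * mad
    thresh = otsu_t if lo <= otsu_t <= hi else fixed_thresh

    return exg > thresh                           # boolean canopy mask
```

The resulting mask serves both to delimit canopy pixels in the DOM and to strip the corresponding cells from the DSM before registration.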
[Results and Discussions] The results showed that varying the ExG threshold among 0.05, 0.10, and 0.15 had only a limited effect on overall plant height estimation accuracy, with the best performance observed at 0.10. At this threshold, the Mavic 3 platform achieved an R2 of 0.864 7 and a root-mean-square error (RMSE) of 2.57 cm, whereas the P1 platform achieved an R2 of 0.918 6 and an RMSE of 2.05 cm, indicating that the proposed threshold selection strategy provided a good balance between accuracy and robustness. Error analysis across different canopy-height percentiles showed that, as the percentile increased from P90 to P99, accuracy first improved and then degraded: R2 first increased and then decreased, while RMSE first decreased and then increased. Among these percentiles, P95 yielded the highest R2 and the lowest RMSE, representing the best trade-off between noise suppression and canopy-top information retention; therefore, P95 was adopted as the representative plant height for this method. Under the P95-based definition of plant height, the traditional GCP method produced R2 values of only 0.592 3-0.669 9 and RMSE values of 4.60-4.94 cm, and the "GCP+ICP" workflow, in which canopy points were not removed prior to ICP registration, was most strongly affected by noise in the point clouds, with R2 dropping below 0.3 in some cases. In contrast, the proposed method maintained R2 values of 0.864 7-0.918 6 and RMSE values of 2.05-2.57 cm across both platforms, markedly improving the agreement between estimated and measured plant height relative to the traditional GCP-based approach. Further platform-specific analysis showed that, owing to its higher spatial resolution, the P1 platform reconstructed a more complete canopy-top structure and yielded better plant height estimates than the Mavic 3 platform at each growth stage. Nevertheless, when combined with the proposed plant height extraction workflow, the Mavic 3 platform still achieved reliable performance (R2 > 0.817 7) under different nitrogen levels, confirming the method's multi-platform applicability. From the perspective of canopy cover and nitrogen level, as the crop progressed from 42 to 49 DAS, the peanut canopy gradually approached full closure, the proportion of high-value pixels in the canopy height model increased, the canopy-top point cloud in the DSM became more continuous, and plant height estimation accuracy improved accordingly. Under high nitrogen treatment, the canopy was denser and structurally more complete than under low nitrogen treatment, resulting in slightly higher R2 and slightly lower RMSE on both platforms; these differences nevertheless remained small, demonstrating that the bare-soil-based registration workflow was robust to fertility differences and that the proposed method was stable and transferable across growth stages and fertility conditions.

[Conclusions] Overall, the proposed method for estimating peanut plant height substantially alleviates the constraints that residual DSM-DEM misregistration imposes on plant height inversion for low-stature, prostrate crops, achieving centimetre-level accuracy across platforms and nitrogen treatments. By greatly reducing the dependence on densely distributed GCPs and offering a simple, reproducible, and low-cost processing pipeline, the method provides a scalable technical route for monitoring peanut nitrogen responses, deriving high-throughput agronomic structural traits, and measuring plant height in other low-stature, prostrate crops.
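Complementing the segmentation sketch above, the following minimal Python sketch illustrates the bare-soil-constrained ICP alignment and P95 height extraction described in the Methods. Open3D is an assumed tooling choice (the abstract names no library), the DSM and DEM are assumed to be co-gridded numpy elevation arrays, and writing the transformed elevations back onto the original grid is a simplification that neglects the small in-plane shift of the rigid transform.

```python
# Minimal sketch: bare-soil ICP alignment of the DSM to the pre-emergence DEM,
# followed by CHM differencing and plot-level P95 plant height. Open3D and the
# helper names are assumptions; the paper's exact implementation is not given.
import numpy as np
import open3d as o3d  # assumed dependency

def grid_to_cloud(z, mask, gsd):
    """Convert selected raster cells to an Open3D point cloud (gsd: ground sample distance, m)."""
    rows, cols = np.where(mask & np.isfinite(z))
    pts = np.column_stack([cols * gsd, rows * gsd, z[rows, cols]])
    pc = o3d.geometry.PointCloud()
    pc.points = o3d.utility.Vector3dVector(pts)
    return pc

def estimate_plot_height(dsm, dem, canopy_mask, gsd, max_corr=0.5):
    """dsm, dem: co-gridded 2-D elevation arrays; canopy_mask: boolean canopy pixels from ExG."""
    valid = np.isfinite(dsm)

    # 1. Only bare-soil cells (canopy removed) drive the rigid ICP solution.
    soil_pc = grid_to_cloud(dsm, valid & ~canopy_mask, gsd)
    dem_pc = grid_to_cloud(dem, np.isfinite(dem), gsd)
    reg = o3d.pipelines.registration.registration_icp(
        soil_pc, dem_pc, max_corr, np.eye(4),
        o3d.pipelines.registration.TransformationEstimationPointToPoint())

    # 2. Apply the same rigid transform to the full DSM (canopy included).
    full_pc = grid_to_cloud(dsm, valid, gsd)
    full_pc.transform(reg.transformation)
    aligned_z = np.full(dsm.shape, np.nan)
    aligned_z[valid] = np.asarray(full_pc.points)[:, 2]  # simplification: in-plane shift neglected

    # 3. Pixel-wise CHM and the 95th-percentile plant height for the plot.
    chm = aligned_z - dem
    heights = chm[canopy_mask & np.isfinite(chm)]
    return np.percentile(heights, 95)
```

In the workflow described above, the ICP transform would be solved once per block and then applied uniformly before the per-plot P95 statistics are computed.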

Key words: UAV remote sensing, peanut plant height, canopy height model, point cloud registration, DSM, DEM

CLC Number: