
Smart Agriculture ›› 2021, Vol. 3 ›› Issue (2): 23-34. DOI: 10.12133/j.smartag.2021.3.2.202104-SA003

• Topic--Application of Spatial Information Technology in Agriculture •

Wheat Lodging Ratio Detection Based on UAS Imagery Coupled with Different Machine Learning and Deep Learning Algorithms

FLORES Paulo, ZHANG Zhao

  1. Department of Agricultural and Biosystems Engineering, North Dakota State University, Fargo, ND 58102, USA
  • Received: 2021-04-26  Revised: 2021-06-28  Online: 2021-06-30  Published: 2021-08-25
  • Corresponding author: Zhao ZHANG. E-mail: paulo.flores@ndsu.edu; zhao.zhang.1@ndsu.edu
  • About author: Paulo FLORES (1979-), male, assistant professor; research interests include remote sensing technologies in agriculture. E-mail: paulo.flores@ndsu.edu.
  • Supported by:
    North Dakota Agricultural Experiment Station Precision Agriculture Graduate Research Assistantship(6064-21660-001-32S);USDA Agricultural Research Service Project(435589)


Wheat lodging is a negative factor affecting wheat yield, so obtaining timely and accurate lodging information is critical. Using unmanned aerial system (UAS) imagery for wheat lodging detection is a relatively new approach, in which researchers usually rely on a manual method to generate datasets of individual plot images. Because the manual method is inefficient, inaccurate, and subjective, this study developed a new image processing-based approach for automatically generating individual field plot datasets. Images of wheat field trials were collected at three flight heights (15, 46, and 91 m) and analyzed using machine learning (support vector machine, random forest, and K nearest neighbors) and deep learning (ResNet101, GoogLeNet, and VGG16) algorithms to test their performance at detecting three levels of wheat lodging percentage: non- (0%), light (<50%), and severe (>50%) lodging. The results indicated that images collected at the 91 m flight height (2.5 cm/pixel) yielded detection accuracy similar to, or even slightly higher than, images collected at the 46 m (1.2 cm/pixel) and 15 m (0.4 cm/pixel) mission heights. A comparison of the best models from each family showed that ResNet101 (75% accuracy) outperformed random forest (71% accuracy), making ResNet101 a suitable model for wheat lodging ratio detection. This study therefore recommends UAS images collected at a height of about 91 m (2.5 cm/pixel resolution) coupled with the ResNet101 model as a useful and efficient approach for wheat lodging ratio detection.
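The three lodging levels and the height-resolution pairings above can be summarized in a short sketch. The function names below are ours, not the study's; the handling of a ratio of exactly 50% is an assumption (the abstract defines light as <50% and severe as >50%), and the linear ground-sampling-distance model is only an approximation fitted to the reported 91 m → 2.5 cm/pixel pairing.

```python
def lodging_class(ratio):
    """Map a plot-level lodging ratio in [0, 1] to the study's three
    classes: non- (0%), light (<50%), and severe (>50%) lodging.

    Hypothetical helper: the boundary case of exactly 50% is not
    specified in the abstract, so it is assigned to "severe" here.
    """
    if not 0.0 <= ratio <= 1.0:
        raise ValueError("lodging ratio must be in [0, 1]")
    if ratio == 0.0:
        return "non"
    if ratio < 0.5:
        return "light"
    return "severe"


def approx_gsd_cm(height_m, cm_per_pixel_per_m=2.5 / 91):
    """Rough ground sampling distance (cm/pixel) at a given flight
    height, assuming GSD scales linearly with altitude. The default
    scale factor is fitted to the reported 91 m -> 2.5 cm/pixel pair,
    and reproduces the other reported pairings only approximately."""
    return height_m * cm_per_pixel_per_m
```

Applying `approx_gsd_cm` to the two lower mission heights gives roughly 1.26 cm/pixel at 46 m and 0.41 cm/pixel at 15 m, close to the 1.2 and 0.4 cm/pixel values reported above.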

Key words: wheat lodging ratio, machine learning, deep learning, mission height, UAS, ResNet101
