
Smart Agriculture ›› 2020, Vol. 2 ›› Issue (3): 75-85. doi: 10.12133/j.smartag.2020.2.3.202008-SA001

• Topic: Agricultural Artificial Intelligence and Big Data

Identification and Morphological Analysis of Adult Spodoptera frugiperda and Its Closely Related Species Using Deep Learning

WEI Jing1, WANG Yuting1, YUAN Huizhu2, ZHANG Menglei1, WANG Zhenying2

  1. Shenzhen SenseAgro Technology Co., Ltd., Shenzhen 518063, China
    2. Institute of Plant Protection, Chinese Academy of Agricultural Sciences, Beijing 100193, China
  • Received: 2020-08-01  Revised: 2020-08-31  Online: 2020-09-30
  • Corresponding authors:

    1. WANG Yuting, E-mail: yuting.wang@senseagro.com;

    2. WANG Zhenying, E-mail: zywang@ippcaas.cn

  • About author: WEI Jing, E-mail: jing.wei@senseagro.com
  • Supported by:
Central Public-interest Scientific Institution Basal Research Fund (Y2019YJ06); Major Research Task of the Chinese Academy of Agricultural Sciences (CAAS-ZDRW202007)

Abstract:

The invasive pest fall armyworm (FAW), Spodoptera frugiperda, is a serious threat to food security, and early warning and control play a key role in FAW management. Deep learning technology has recently been applied to recognize images of FAW. However, current studies suffer from a serious shortage of training data, which may mislead models into learning features unrelated to the key visual characteristics (ring pattern, reniform pattern, etc.) of FAW adults and their closely related species. Therefore, this research established a database of 10,177 images of adults of 7 noctuid species, comprising FAW and 6 closely related species. Based on this small-scale dataset, transfer learning was used to build recognition models of FAW adults by employing three deep learning models (VGG-16, ResNet-50 and DenseNet-121) pretrained on ImageNet. All of the models achieved more than 98% recognition accuracy on the same test set. Moreover, using feature visualization techniques, this research visualized the features learned by the deep learning models and compared them with the key visual characteristics recognized by human experts. The results showed high consistency between the two: the average feature recognition rate of ResNet-50 and DenseNet-121 was around 85%, further demonstrating that deep learning technology can be used for real-time monitoring of FAW adults. In addition, this study found that different models learned the key visual characteristics to different degrees even though they achieved similar recognition accuracy. We therefore suggest that model evaluation should not focus on recognition accuracy alone; the ability to learn individual visual characteristics should also be given weight. For important taxonomic traits, if the visualization results indicate that a model has not learned them, the dataset should be modified or the training strategy adjusted to improve the model's learning. In conclusion, this study verified that visualizing the features learned by a model is a good way to evaluate the learning ability of deep learning models, and it offers a possible approach for other researchers who want to understand the features that such models learn.
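
As a rough illustration of the transfer-learning setup described in the abstract, the following minimal sketch (not the authors' code; the framework, data layout, preprocessing and hyperparameters shown here are assumptions) fine-tunes an ImageNet-pretrained ResNet-50 on a 7-class insect image folder using PyTorch/torchvision:

import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

NUM_CLASSES = 7  # FAW plus 6 closely related noctuid species

# Standard ImageNet preprocessing; the paper's exact augmentation is not specified here.
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# Hypothetical directory layout: one sub-folder per species under data/train.
train_set = datasets.ImageFolder("data/train", transform=preprocess)
train_loader = torch.utils.data.DataLoader(train_set, batch_size=32, shuffle=True)

# Load ImageNet weights and replace the classification head for 7 classes.
model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V1)
model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = model.to(device)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3, momentum=0.9)

# Fine-tune the whole network on the small insect dataset.
model.train()
for epoch in range(10):
    for images, labels in train_loader:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()

Feature-visualization methods such as Grad-CAM (used here only as an illustrative choice) can then be applied to the fine-tuned model to check whether its attention falls on the ring pattern and reniform pattern that human experts rely on.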

Key words: Spodoptera frugiperda, noctuid, adult moth recognition, deep learning, visual characteristics, feature visualization, transfer learning
