Smart Agriculture ›› 2021, Vol. 3 ›› Issue (1): 109-117. DOI: 10.12133/j.smartag.2021.3.1.202009-SA004

• Information Processing and Decision Making •

Distilled-MobileNet: A Simplified Convolutional Neural Network Model for Plant Disease Recognition

QIU Wenjie1(), YE Jin1(), HU Liangqing1, YANG Juan2, LI Qili3, MO Jianyou3, YI Wanmao1   

  1. School of Computer and Electronic Information, Guangxi University, Nanning 530004, China
    2.School of Agriculture, Guangxi University, Nanning 530004, China
    3.Institute of Plant Protection, Guangxi Academy of Agricultural Sciences, Nanning 530007, China
  • Received: 2020-09-29 Revised: 2020-12-03 Online: 2021-03-30
  • Foundation items: National Key Research and Development Program of China (2017YFD0202106-3); Guangxi Innovation-Driven Development Special Project (GuiKe AA17204059-9); Research and Practice on the Cultivation of "Five Have" Leader-type Talents in Local Comprehensive Research-oriented Universities (2018JGZ103); New Agricultural Science Research and Reform Practice Project of the Ministry of Education, 2020
  • About author: QIU Wenjie (1997-), male, M.S. candidate; research interests: computer vision and recognition of agricultural diseases and pests. E-mail: qiuwenjie1997@163.com
  • Corresponding author: YE Jin (1970-), female, Ph.D., professor; research interests: computer software and system development. Tel: 13877192800. E-mail:

Abstract:

The development of convolutional neural networks (CNN) has brought a large number of network parameters and huge model volumes, which greatly limits their application on devices with limited computing resources, such as single-chip microcomputers and mobile devices. To solve this problem, a structured model compression method was studied in this research. Its core idea was to use knowledge distillation to transfer the knowledge from a complex ensemble model to a lightweight small-scale neural network. First, VGG16 was used to train a teacher model with a high recognition rate, whose volume was much larger than that of the student model. The knowledge in this model was then transferred to MobileNet by distillation, greatly reducing the number of parameters relative to the VGG16 model. The knowledge-distilled model, named Distilled-MobileNet, was applied to the classification of 38 common diseases (powdery mildew, Huanglongbing, etc.) of 14 crops (soybean, cucumber, tomato, etc.). Performance tests of knowledge distillation on four different network structures (VGG16, AlexNet, GoogLeNet, and ResNet) showed that when VGG16 was used as the teacher model, the accuracy of the model improved to 97.54%. Four indicators, namely single-disease recognition rate, average accuracy, model memory, and average recognition time, were used to evaluate the trained Distilled-MobileNet model in a real environment. The results showed that the average accuracy of the model reached 97.62%, and the average recognition time was shortened to 0.218 s, only 13.20% of that of the VGG16 model, while the model size was compressed to only 19.83 MB, 93.60% smaller than VGG16, meeting both accuracy and real-time requirements. Compared with traditional neural networks, the Distilled-MobileNet model offers a clear improvement in reducing model size and shortening recognition time, and provides a new approach for disease recognition on devices with limited memory and computing resources.
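
To make the teacher-student procedure described above concrete, the following is a minimal sketch of Hinton-style knowledge distillation in PyTorch. The framework, the temperature T, the loss weight alpha, and the use of torchvision's MobileNetV2 as a stand-in student (torchvision does not ship the original MobileNet) are all assumptions for illustration; the paper does not publish its training code.

    import torch
    import torch.nn.functional as F
    from torchvision import models

    def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
        """Soft targets from the teacher (KL divergence at temperature T)
        plus hard-label cross-entropy. T and alpha are assumed values."""
        soft = F.kl_div(
            F.log_softmax(student_logits / T, dim=1),
            F.softmax(teacher_logits / T, dim=1),
            reduction="batchmean",
        ) * (T * T)  # rescale so soft-target gradients match the hard-label term
        hard = F.cross_entropy(student_logits, labels)
        return alpha * soft + (1.0 - alpha) * hard

    # Teacher: VGG16 assumed already trained on the 38-class disease set;
    # student: a MobileNet variant trained from the teacher's outputs.
    teacher = models.vgg16(num_classes=38).eval()
    student = models.mobilenet_v2(num_classes=38)
    optimizer = torch.optim.SGD(student.parameters(), lr=0.01, momentum=0.9)

    def train_step(images, labels):
        with torch.no_grad():
            teacher_logits = teacher(images)  # soft targets, no gradient
        student_logits = student(images)
        loss = distillation_loss(student_logits, teacher_logits, labels)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        return loss.item()

Scaling the soft-target term by T² keeps its gradient magnitude comparable to the hard-label cross-entropy, which is why both terms can be mixed with a single weight alpha.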

Key words: disease identification, deep learning, model compression, knowledge distillation, convolutional neural network

CLC number: