
Smart Agriculture ›› 2020, Vol. 2 ›› Issue (3): 48-60. doi: 10.12133/j.smartag.2020.2.3.202007-SA006

• Topic: Agricultural Artificial Intelligence and Big Data •

Design and Application of Facility Greenhouse Image Collecting and Environmental Data Monitoring Robot System

GUO Wei1, WU Huarui1,2,3, ZHU Huaji1,2,3

  1.National Engineering Research Center for Information Technology in Agriculture, Beijing 100097, China
    2.Beijing Research Center for Information Technology in Agriculture, Beijing Academy of Agriculture and Forestry Sciences, Beijing 100097, China
    3.Key Laboratory of Agri-Informatics, Ministry of Agriculture and Rural Affairs, Beijing 100097, China
  • Received: 2020-07-22 Revised: 2020-09-08 Online: 2020-09-30
  • Foundation items:
    Pinggu Agricultural Science and Technology Innovation Service Platform Construction and Demonstration Application of Agricultural Artificial Intelligence (Z191100004019007); National Natural Science Foundation of China (61871041); Hebei Province Key Research and Development Program Project (19226919D)
  • About author: GUO Wei, E-mail: guowei@nercita.org.cn
  • Corresponding author: WU Huarui, E-mail: wuhr@nercita.org.cn

Abstract:

China's facility horticulture has developed rapidly in the past 30 years and now ranks first in the world in terms of area. However, the number of farmers is decreasing, and "replacing labor with machines" has become a current research hotspot. To realize fine-grained collection of crop images and environmental monitoring data, a three-dimensional environmental monitoring robot system for crops was designed. The robot consists of three parts: a perception center, a decision center and an execution center, which respectively carry out environmental perception from the machine's perspective, data analysis and decision instruction generation, and action execution. In the perception layer, the system realized real-time, multi-angle, high-accuracy, grid-scale monitoring of videos, images and environmental data such as air temperature, air humidity, illumination intensity and carbon dioxide concentration. At the system level, automatic speech recognition was integrated to make the system easier to use, especially for farmers who usually work in the fields. In the transport layer, monitoring data and control instructions were converged to a local data center through wireless bridges. Concretely, the transmission mode was chosen according to the characteristics of the data: wired transmission was used for large data such as images and videos, while wireless transmission was mainly applied to small data such as environmental monitoring parameters. In the data processing layer, feedback and control instructions were generated through crop model analysis of multi-source heterogeneous data; in terms of commands, an independent inspection mode and a real-time remote-control mode were available to users. Plant type, user information, historical data and management data were taken into consideration. Finally, in the application layer, the system provided web and mobile intelligent services covering image, real-time video and monitoring data collection and analysis over the whole growth period of cucumbers, tomatoes, greenhouse peaches, etc. The system has been demonstrated and applied in solar greenhouse No. 7 of the Beijing Xiaotangshan National Precision Agriculture Base and greenhouse No. 5 of the Shijiazhuang Agricultural and Forestry Science Research Institute with good results. Farmers and researchers have realized real-time monitoring, remote control and management. On one hand, the system can be used to avoid working in extreme environments, such as high-temperature or pesticide-laden conditions. On the other hand, with the help of the robot, independent inspection and data collection can be carried out without on-site personnel, which clearly saves time and indirect costs for producers and researchers. The results showed that the system could be widely applied in facility greenhouse production and research.
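To make the transport-layer routing rule in the abstract concrete (wired links for bulky images and videos, wireless bridges for small environmental readings), the following is a minimal Python sketch; the class names, data kinds and size threshold are illustrative assumptions only and do not appear in the paper.

```python
# Hypothetical sketch of the transport-layer routing rule described in the
# abstract: large payloads (images, videos) go over the wired link, while
# small environmental readings (temperature, humidity, light, CO2) go over
# the wireless bridge. All names and thresholds are illustrative.
from dataclasses import dataclass
from enum import Enum


class Link(Enum):
    WIRED = "wired"
    WIRELESS = "wireless"


@dataclass
class Payload:
    kind: str        # e.g. "image", "video", "env_reading"
    size_bytes: int  # payload size in bytes


# Assumed definition of "big" data; the paper does not specify a threshold.
BIG_DATA_KINDS = {"image", "video"}
SIZE_THRESHOLD = 512 * 1024  # 512 KB, illustrative only


def choose_link(payload: Payload) -> Link:
    """Route bulky media over the wire, small sensor readings over wireless."""
    if payload.kind in BIG_DATA_KINDS or payload.size_bytes > SIZE_THRESHOLD:
        return Link.WIRED
    return Link.WIRELESS


if __name__ == "__main__":
    print(choose_link(Payload("image", 2_000_000)))   # Link.WIRED
    print(choose_link(Payload("env_reading", 128)))   # Link.WIRELESS
```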

Key words: unmanned farm, agricultural robots, environmental monitoring, machine vision, disease recognition, remote control, deep learning
