Special Topic--Machine Vision and Intelligent Agricultural Perception


Rapid Recognition and Picking Points Automatic Positioning Method for Table Grape in Natural Environment

  • ZHU Yanjun ,
  • DU Wensheng ,
  • WANG Chunying ,
  • LIU Ping ,
  • LI Xiang
  • 1.Shandong Agricultural Equipment Intelligent Engineering Laboratory / Shandong Provincial Key Laboratory of Horticultural Machinery and Equipment, College of Mechanical and Electronic Engineering, Shandong Agricultural University, Taian 271018, China
    2.College of Construction Machinery, Shandong Jiaotong University, Jinan 250357, China
    3.National Key Laboratory of Wheat Improvement, College of Life Sciences, Shandong Agricultural University, Taian 271018, China
ZHU Yanjun, research interests: agricultural informatization. E-mail: zhu__yanjun@163.com

LIU Ping, Ph.D., Professor, research interests: intelligent agricultural control technology, robot control and navigation, and object detection and recognition. E-mail: liupingsdau@126.com

LI Xiang, Ph.D. candidate, research interests: developmental biology of yield-related traits in crops, and phenomics research and applications. E-mail: lixiang@sdau.edu.cn

Received date: 2023-04-04

  Online published: 2023-07-04

Supported by

Key Research and Development Program of Shandong Province (2022TZXD0010)


Cite this article

ZHU Yanjun, DU Wensheng, WANG Chunying, LIU Ping, LI Xiang. Rapid recognition and picking points automatic positioning method for table grape in natural environment[J]. Smart Agriculture, 2023, 5(2): 23-34. DOI: 10.12133/j.smartag.SA202304001

Abstract

[Objective] Rapid recognition and automatic positioning of table grapes in the natural environment is the prerequisite for automatic picking of table grapes by a picking robot. [Methods] A rapid recognition and automatic picking-point positioning method based on an improved K-means clustering algorithm and contour analysis was proposed. First, the Euclidean distance was replaced by a weighted gray threshold as the similarity criterion of K-means. The images of table grapes were then rasterized according to the K value, and the initial clustering centers were obtained. Next, the average gray value of each cluster and the percentage of its pixel points in the total pixel points were calculated, and the weighted gray threshold was obtained from the average gray values and percentages of adjacent clusters. Clustering was considered complete once the weighted gray threshold remained unchanged, and the cluster image of the table grape was thus obtained. The improved clustering algorithm not only saved clustering time, but also ensured that the K value could change adaptively. Moreover, the adaptive Otsu algorithm was used to extract grape cluster information, so that the initial binary image of the table grape was obtained. In order to reduce the interference of redundant noise on recognition accuracy, morphological operations (opening, closing, hole filling and maximum connected domain extraction) were used to remove noise, so an accurate binary image of the table grapes was obtained. The contours of the table grapes were then obtained with the Sobel operator. Furthermore, table grape clusters grow perpendicular to the ground due to gravity in the natural environment, so the extreme points and the center of gravity of the grape cluster were obtained based on contour analysis. In addition, the bundle of lines passing through the extreme point and the center of gravity was taken as the carrier, and the similarity of pixel points on both sides of each line was taken as the judgment basis; the line with the lowest similarity value was taken as the grape stem, so the stem axis of the grape was located. Moreover, according to the agronomic picking requirements of table grapes, and combined with contour analysis, the region of interest (ROI) containing the picking point could be obtained. The intersection of the grape stem and the contour was taken as the midpoint of the bottom edge of the ROI, 0.8 times the distance between the left and right extreme points was taken as the length of the ROI, and 0.25 times the distance between the center of gravity and the intersection of the grape stem and the contour was taken as the height of the ROI. After that, the central point of the ROI was computed, the point on the grape stem nearest to it was determined, and this point was taken as the picking point of the table grapes. Finally, 917 grape images (including the Summer Black, Moldova, and Youyong varieties) taken with the rear camera of a MI 8 mobile phone at the Jinniu Mountain Base of the Shandong Fruit and Vegetable Research Institute were used for experimental verification. [Results and Discussions] The results showed that the success rate was 90.51% when the error between the located picking points and the optimal picking points was less than 12 pixels, and the average positioning time was 0.87 s. The method realized fast and accurate localization of table grape picking points.
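As a rough illustration of the contour analysis and ROI geometry described above, the following Python/OpenCV sketch locates a picking point from a single-channel image of one grape cluster. It is a minimal sketch under simplifying assumptions, not the authors' implementation: the improved weighted-gray-threshold K-means and the line-bundle stem search are not reproduced, Otsu thresholding alone is assumed to segment the cluster, and the stem axis is approximated by the vertical line through the center of gravity. The 0.8 and 0.25 scale factors follow the ROI definition given in the abstract; all function and variable names (e.g. locate_picking_point) are illustrative:

    import cv2
    import numpy as np

    def locate_picking_point(gray):
        """gray: uint8 single-channel image containing one grape cluster."""
        # Otsu binarization stands in for the clustering-based segmentation.
        _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

        # Morphological opening and closing to suppress small noise
        # (hole filling is omitted in this sketch).
        kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
        binary = cv2.morphologyEx(binary, cv2.MORPH_OPEN, kernel)
        binary = cv2.morphologyEx(binary, cv2.MORPH_CLOSE, kernel)

        # Keep only the maximum connected domain (assumed to be the cluster).
        _, labels, stats, _ = cv2.connectedComponentsWithStats(binary)
        largest = 1 + int(np.argmax(stats[1:, cv2.CC_STAT_AREA]))
        mask = (labels == largest).astype(np.uint8) * 255

        # Cluster contour, center of gravity and extreme points.
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
        cnt = max(contours, key=cv2.contourArea)
        m = cv2.moments(cnt)
        cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]  # center of gravity
        left = cnt[cnt[:, :, 0].argmin()][0]               # leftmost contour point
        right = cnt[cnt[:, :, 0].argmax()][0]              # rightmost contour point
        top = cnt[cnt[:, :, 1].argmin()][0]                # topmost contour point

        # Simplification: the stem axis is the vertical line x = cx, and its
        # crossing with the contour is approximated by the topmost point.
        x_stem, y_cross = int(round(cx)), int(top[1])

        # ROI per the abstract: bottom-edge midpoint at the stem/contour
        # crossing, length = 0.8 * horizontal extent between extreme points,
        # height = 0.25 * distance(center of gravity, crossing).
        roi_length = 0.8 * float(right[0] - left[0])
        roi_height = 0.25 * (cy - y_cross)
        roi = (int(round(x_stem - roi_length / 2.0)),
               int(round(y_cross - roi_height)),
               int(round(roi_length)), int(round(roi_height)))  # x, y, w, h

        # Picking point: the point on the stem axis closest to the ROI center,
        # i.e. half the ROI height above the stem/contour crossing.
        picking_point = (x_stem, int(round(y_cross - roi_height / 2.0)))
        return picking_point, roi

In the paper's full method the stem axis would instead come from the line in the bundle with the lowest two-sided pixel similarity, and the segmentation from the adaptive-K clustering, so this sketch only mirrors the final geometric step.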
In addition, according to the two cultivation modes of table grapes (hedgerow planting and trellis planting), a simulation test platform based on the Dense mechanical arm and a single-chip microcomputer was set up in the study. Fifty simulation tests were carried out for each of the four conditions: the success rate of picking-point localization for purple grapes under hedgerow planting was 86.00%, with an average localization time of 0.89 s; the success rate of identification and localization for purple grapes under trellis planting was 92.00%, with an average localization time of 0.67 s; the success rate of picking-point localization for green grapes under hedgerow planting was 78.00%, with an average localization time of 0.72 s; and the success rate of identification and localization for green grapes under trellis planting was 80.00%, with an average localization time of 0.71 s. [Conclusions] The experimental results showed that the method proposed in the study can meet the requirements of table grape picking and can provide technical support for the development of grape picking robots.
