
Smart Agriculture, 2023, 5(2): 1-12. doi: 10.12133/j.smartag.SA202305004

• Topic--Machine Vision and Agricultural Intelligent Perception •

A Lightweight Fruit Load Estimation Model for Edge Computing Equipment

XIA Xue1, CHAI Xiujuan1, ZHANG Ning1, ZHOU Shuo1, SUN Qixin1, SUN Tan2

  1. Agricultural Information Institute, Chinese Academy of Agricultural Sciences/Key Laboratory of Agricultural Big Data, Ministry of Agriculture and Rural Affairs, Beijing 100081, China
    2. Chinese Academy of Agricultural Sciences, Beijing 100081, China
  • Received: 2023-05-11  Online: 2023-06-30

Abstract:

[Objective] Fruit load estimation of fruit trees is essential for horticulture management. The traditional estimation method based on manual sampling is not only labor-intensive and time-consuming but also prone to errors. Most existing models cannot be deployed on edge computing equipment with limited computing resources because of their high complexity. This study aims to develop a lightweight model for edge computing equipment to estimate fruit load automatically in the orchard.

[Methods] The experimental data were captured with a smartphone in a citrus orchard in Jiangnan District, Nanning City, Guangxi Province. From the dataset, 30 videos were randomly selected for model training and the other 10 for testing. The proposed algorithm consisted of two parts: detecting fruits and extracting their ReID features in each frame of the video, then tracking the fruits and estimating the fruit load. Specifically, the CSPDarknet53 network was used as the backbone of the model for feature extraction because it consumed fewer hardware computing resources and was therefore suitable for edge computing equipment. The path aggregation feature pyramid network (PAFPN) was introduced as the neck for feature fusion via skip connections between low-level and high-level features. The fused features from the PAFPN were fed into two parallel branches: a fruit detection branch and an identity embedding branch. The fruit detection branch consisted of three prediction heads, each of which performed a 3×3 convolution and a 1×1 convolution on the feature map output by the PAFPN to predict the fruit keypoint heatmap, local offset and bounding box size, respectively. The identity embedding branch distinguished the identity features of different fruits. In the fruit tracking stage, the Byte mechanism from the ByteTrack algorithm was introduced to improve the data association of the FairMOT method, enhancing the performance of fruit load estimation in the video. The Byte algorithm considered both high-score and low-score detection boxes when associating fruit motion trajectories, then matched the similarity of fruit identity features between frames. The number of fruit IDs whose tracking duration was longer than five frames was counted as the number of citrus fruits in the video.
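The detection and identity-embedding layout described above can be summarized in code. The following is a minimal PyTorch-style sketch, not the authors' implementation: the CSPDarknet53 backbone and PAFPN neck are stubbed with placeholder modules, and the channel width and embedding dimension are illustrative assumptions.

    # Minimal sketch of the detection/ReID head layout described in [Methods].
    # Backbone and neck are placeholders; channel sizes are assumptions.
    import torch
    import torch.nn as nn

    def conv_head(in_ch, out_ch, mid_ch=256):
        """3x3 convolution followed by 1x1 convolution, one per prediction head."""
        return nn.Sequential(
            nn.Conv2d(in_ch, mid_ch, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(mid_ch, out_ch, 1),
        )

    class FruitCountingNet(nn.Module):
        """Backbone + PAFPN-style neck feeding a detection branch and an ID branch."""

        def __init__(self, backbone, neck, feat_ch=128, emb_dim=128):
            super().__init__()
            self.backbone = backbone          # e.g. CSPDarknet53 (not shown here)
            self.neck = neck                  # e.g. PAFPN (not shown here)
            # Detection branch: keypoint heatmap, local offset, bounding box size.
            self.heatmap = conv_head(feat_ch, 1)    # one class: fruit
            self.offset = conv_head(feat_ch, 2)     # (dx, dy)
            self.box_size = conv_head(feat_ch, 2)   # (w, h)
            # Identity embedding branch for ReID features.
            self.identity = conv_head(feat_ch, emb_dim)

        def forward(self, image):
            feats = self.neck(self.backbone(image))  # fused feature map
            return {
                "heatmap": torch.sigmoid(self.heatmap(feats)),
                "offset": self.offset(feats),
                "box_size": self.box_size(feats),
                "embedding": self.identity(feats),
            }

    # Usage with dummy stand-ins for the backbone and neck:
    dummy_backbone = nn.Conv2d(3, 128, 3, stride=4, padding=1)
    model = FruitCountingNet(backbone=dummy_backbone, neck=nn.Identity())
    outputs = model(torch.randn(1, 3, 512, 512))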
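The tracking-and-counting stage can likewise be illustrated with a simplified sketch. It keeps only the two-stage (high-score, then low-score) association idea and the rule of counting IDs tracked for more than five frames; the score thresholds, the greedy IoU matcher and the omission of ReID and motion (Kalman) cues are simplifications for illustration, not the method as published.

    # Simplified sketch of Byte-style two-stage association plus the counting rule.
    HIGH_SCORE, LOW_SCORE, MATCH_THRESH, MIN_FRAMES = 0.6, 0.1, 0.5, 5

    def iou(a, b):
        """IoU of two boxes in (x1, y1, x2, y2) format."""
        x1, y1 = max(a[0], b[0]), max(a[1], b[1])
        x2, y2 = min(a[2], b[2]), min(a[3], b[3])
        inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
        area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
        return inter / (area(a) + area(b) - inter + 1e-9)

    def greedy_match(tracks, dets, thresh):
        """Greedily pair tracks with detections whose IoU exceeds the threshold."""
        pairs, used = [], set()
        for ti, trk in enumerate(tracks):
            best, best_j = thresh, None
            for dj, det in enumerate(dets):
                if dj in used:
                    continue
                score = iou(trk["box"], det["box"])
                if score > best:
                    best, best_j = score, dj
            if best_j is not None:
                used.add(best_j)
                pairs.append((ti, best_j))
        return pairs, used

    def count_fruits(frames):
        """frames: per-frame lists of detections, each {'box': (x1,y1,x2,y2), 'score': float}."""
        tracks, next_id = [], 0
        for dets in frames:
            high = [d for d in dets if d["score"] >= HIGH_SCORE]
            low = [d for d in dets if LOW_SCORE <= d["score"] < HIGH_SCORE]
            # Stage 1: associate existing tracks with high-score detections.
            pairs, used = greedy_match(tracks, high, MATCH_THRESH)
            matched = set()
            for ti, dj in pairs:
                tracks[ti]["box"], tracks[ti]["age"] = high[dj]["box"], tracks[ti]["age"] + 1
                matched.add(ti)
            # Stage 2: remaining tracks try the low-score detections.
            remaining = [t for i, t in enumerate(tracks) if i not in matched]
            pairs2, _ = greedy_match(remaining, low, MATCH_THRESH)
            for ti, dj in pairs2:
                remaining[ti]["box"], remaining[ti]["age"] = low[dj]["box"], remaining[ti]["age"] + 1
            # Unmatched high-score detections start new tracks.
            for dj, det in enumerate(high):
                if dj not in used:
                    tracks.append({"id": next_id, "box": det["box"], "age": 1})
                    next_id += 1
        # Count IDs whose tracking duration exceeds five frames.
        return sum(1 for t in tracks if t["age"] > MIN_FRAMES)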
[Results and Discussions] All experiments were conducted on edge computing equipment. The fruit detection experiment was conducted on the same test dataset containing 211 citrus tree images. The experimental results showed that the CSPDarkNet53+PAFPN structure in the proposed model achieved a precision of 83.6%, a recall of 89.2% and an F1 score of 86.3%, which were superior to those of the FairMOT (ResNet34), FairMOT (HRNet18) and Faster R-CNN models. The CSPDarkNet53+PAFPN structure adopted in the proposed model could better detect the fruits in the images, laying a foundation for estimating the number of citrus fruits on trees. The model complexity experiments showed that the number of parameters, FLOPs (Floating Point Operations) and size of the proposed model were 5.01 M, 36.44 G and 70.2 MB, respectively. The number of parameters of the proposed model was 20.19% of that of the FairMOT (ResNet34) model and 41.51% of that of the FairMOT (HRNet18) model. The FLOPs of the proposed model were 78.31% lower than those of the FairMOT (ResNet34) model and 87.63% lower than those of the FairMOT (HRNet18) model. The model size of the proposed model was 23.96% of that of the FairMOT (ResNet34) model and 45.00% of that of the FairMOT (HRNet18) model. Compared with Faster R-CNN, the model built in this study also showed advantages in the number of parameters, FLOPs and model size. This low complexity proved that the proposed model was more suitable for edge computing equipment. Compared with the lightweight backbone network EfficientNet-Lite, the CSPDarkNet53 backbone adopted in the proposed model delivered better fruit detection performance and lower model complexity. For fruit load estimation, the improved tracking strategy that integrated the Byte algorithm into FairMOT effectively boosted the estimation accuracy. The experimental results on the test videos showed that the AEP (Average Estimating Precision) and FPS (Frames Per Second) of the proposed model reached 91.61% and 14.76 f/s, indicating that the proposed model maintained high estimation accuracy while its FPS was 2.4 and 4.7 times that of the comparison models, respectively. The RMSE (Root Mean Square Error) of the proposed model was 4.1713, which was 47.61% lower than that of the FairMOT (ResNet34) model and 22.94% lower than that of the FairMOT (HRNet18) model. The coefficient of determination (R2) between the algorithm-estimated value and the manually counted value was 0.9858, which was superior to those of the other comparison models. Overall, the proposed model showed better fruit load estimation performance and lower model complexity than the comparison models.

[Conclusions] The experimental results proved the validity of the proposed model for fruit load estimation on edge computing equipment. This research could provide technical references for the automatic monitoring and analysis of orchard productivity. Future research will continue to enrich the data resources, further improve the model's performance, and explore more efficient methods to serve more fruit tree varieties.
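For reference, the RMSE and coefficient of determination reported above can be computed as follows; the per-video counts in this sketch are made-up numbers for illustration, not data from the paper.

    # Minimal sketch of the evaluation metrics (RMSE and R2) between
    # algorithm-estimated and manually counted fruit numbers.
    import numpy as np

    manual = np.array([120, 95, 143, 88, 102], dtype=float)     # hypothetical ground truth
    estimated = np.array([116, 99, 138, 85, 104], dtype=float)  # hypothetical model output

    rmse = np.sqrt(np.mean((estimated - manual) ** 2))
    ss_res = np.sum((manual - estimated) ** 2)
    ss_tot = np.sum((manual - manual.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot

    print(f"RMSE = {rmse:.4f}, R2 = {r2:.4f}")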

Key words: smart orchard, fruit load estimation, edge computing, deep learning, multiple object tracking, lightweight model
