Smart Agriculture ›› 2024, Vol. 6 ›› Issue (2): 40-48. DOI: 10.12133/j.smartag.SA202310010

• Special Issue--Agricultural Information Perception and Models •

Segmentation Method for Oilseed Rape Sclerotinia in Hyperspectral Images Based on Bi-GRU and Spatial-Spectral Information Fusion

ZHANG Jing1, ZHAO Zexuan1, ZHAO Yanru2, BU Hongchao1, WU Xingyu1

  1. School of Management Engineering, Capital University of Economics and Business, Beijing 100070, China
    2. College of Mechanical and Electronic Engineering, Northwest A&F University, Yangling 712100, China
  • Received: 2023-10-12  Online: 2024-03-30
  • Corresponding author:
    ZHANG Jing, E-mail:
  • Supported by:
    Capital University of Economics and Business Teaching Reform Project 2024(01892454202148); National Natural Science Foundation of China(31901403)


[Objective] The widespread prevalence of sclerotinia disease poses a significant challenge to the cultivation and supply of oilseed rape: it not only causes substantial yield losses and decreased oil content in the seeds of infected plants, but also severely impacts crop productivity and quality, leading to significant economic losses. To overcome the complex operation, environmental pollution, sample destruction, and low detection efficiency of traditional chemical detection methods, a Bi-directional Gated Recurrent Unit (Bi-GRU) model based on spatial-spectral feature fusion was constructed to segment sclerotinia-infected areas of oilseed rape in hyperspectral images (HSIs).

[Methods] The spectral characteristics of sclerotinia disease were first explored. Spectral reflectance differed markedly around 550 nm and within the 750-1 000 nm range at different locations on rapeseed leaves, and these differences became more pronounced as the severity of infection increased. Subsequently, a rapeseed leaf sclerotinia disease dataset comprising 400 HSIs was curated using an intelligent data annotation tool and divided into three subsets: a training set of 280 HSIs, a validation set of 40 HSIs, and a test set of 80 HSIs. Building on this, a 7×7 pixel neighborhood was extracted as the spatial feature of each target pixel, fusing spatial and spectral features effectively. The Bi-GRU model can extract features at any point within the sequence data simultaneously, eliminating the influence of the ordering of the fused spatial-spectral data on model performance. The model comprised four key components: an input layer, hidden layers, fully connected layers, and an output layer. The Bi-GRU model in this study contained two hidden layers, each with 512 GRU neurons.
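The 7×7 neighborhood extraction described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the cube dimensions, the edge-padding strategy at image borders, and the flattening of the neighborhood into a 49-step sequence of per-band spectra are all assumptions made for the example.

```python
import numpy as np

def extract_patch(cube, row, col, size=7):
    """Extract a size x size spatial neighborhood around a target pixel
    from a hyperspectral cube of shape (H, W, bands). Borders are
    edge-padded (an assumption) so every pixel has a full neighborhood."""
    pad = size // 2
    padded = np.pad(cube, ((pad, pad), (pad, pad), (0, 0)), mode="edge")
    patch = padded[row:row + size, col:col + size, :]
    # Flatten the spatial axes: each of the size*size neighbors becomes one
    # step of a sequence whose feature vector is the pixel's full spectrum.
    return patch.reshape(size * size, cube.shape[2])

# Hypothetical cube: 32x32 pixels, 128 spectral bands
cube = np.random.rand(32, 32, 128)
seq = extract_patch(cube, 0, 0)   # works at the image border too
print(seq.shape)                  # (49, 128)
```

Each target pixel thus yields a 49-step spatial-spectral sequence that can be fed to a recurrent model.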
The forward hidden layer computed sequence information at the current time step, while the backward hidden layer processed the sequence in reverse, incorporating reversed-order information. These two hidden layers were linked to a fully connected layer, providing both forward and reversed-order information to all neurons during training. The Bi-GRU model included two fully connected layers, each with 1 000 neurons, and an output layer with two neurons representing the healthy and diseased classes, respectively.

[Results and Discussions] To validate the comprehensive performance of the proposed Bi-GRU model and assess the effectiveness of the spatial-spectral information fusion mechanism, comparative experiments were conducted. These experiments focused on five key metrics: ClassAP(1), ClassAP(2), mean average precision (mAP), mean intersection over union (mIoU), and the Kappa coefficient. The analysis revealed that, compared with mainstream convolutional neural network (CNN) and long short-term memory (LSTM) models, the Bi-GRU model demonstrated superior overall performance in detecting rapeseed sclerotinia disease. Notably, it achieved an mAP of 93.7%, a 7.1% improvement in precision over the CNN model. The bidirectional architecture, coupled with the fused spatial-spectral data, effectively enhanced detection accuracy. The study also visually presented the segmentation results of sclerotinia-infected areas produced by the CNN, Bi-LSTM, and Bi-GRU models; comparison with the ground-truth data showed that the Bi-GRU model outperformed the CNN and Bi-LSTM models in detecting sclerotinia disease at various infection stages. Additionally, the Dice coefficient was employed to comprehensively assess the actual detection performance of the different models at the early, middle, and late infection stages.
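The bidirectional recurrence described above can be sketched with a minimal NumPy GRU cell. This is an illustrative forward pass only, assuming random weights and a reduced hidden size (16 instead of the paper's 512); it shows the gating mechanism and how the forward and backward passes are combined, not the trained model.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class GRUCell:
    """Minimal GRU cell (forward pass only). Weight scales and
    initialization are illustrative assumptions."""
    def __init__(self, n_in, n_hid, seed=0):
        rng = np.random.default_rng(seed)
        # Stacked weights for update gate z, reset gate r, candidate state
        self.W = rng.standard_normal((3, n_hid, n_in)) * 0.1
        self.U = rng.standard_normal((3, n_hid, n_hid)) * 0.1
        self.b = np.zeros((3, n_hid))

    def step(self, x, h):
        z = sigmoid(self.W[0] @ x + self.U[0] @ h + self.b[0])
        r = sigmoid(self.W[1] @ x + self.U[1] @ h + self.b[1])
        h_tilde = np.tanh(self.W[2] @ x + self.U[2] @ (r * h) + self.b[2])
        return (1.0 - z) * h + z * h_tilde

def bi_gru(seq, fwd, bwd):
    """Run one GRU over the sequence in order and another in reverse,
    then concatenate the final hidden states, so the representation
    carries both forward and reversed-order information."""
    n_hid = fwd.b.shape[1]
    h_f = np.zeros(n_hid)
    for x in seq:                 # forward pass, sequence order
        h_f = fwd.step(x, h_f)
    h_b = np.zeros(n_hid)
    for x in seq[::-1]:           # backward pass, reversed order
        h_b = bwd.step(x, h_b)
    return np.concatenate([h_f, h_b])

# Hypothetical input: a 49-step sequence (7x7 neighborhood) of 128 bands
seq = np.random.rand(49, 128)
out = bi_gru(seq, GRUCell(128, 16, seed=1), GRUCell(128, 16, seed=2))
print(out.shape)                  # (32,)
```

In the paper's architecture, the concatenated state would then feed the two 1 000-neuron fully connected layers and the two-class output layer.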
The Dice coefficients of the Bi-GRU model at these stages were 83.8%, 89.4% and 89.2%, respectively. Although early-infection detection accuracy was relatively lower, the spatial-spectral data fusion mechanism significantly enhanced the effectiveness of detecting early sclerotinia infections in oilseed rape.

[Conclusions] This study introduces a Bi-GRU model that integrates spatial and spectral information to accurately and efficiently identify the areas of oilseed rape infected by sclerotinia disease. This approach not only addresses the challenge of detecting early-stage sclerotinia infection but also establishes a basis for high-throughput, non-destructive detection of the disease.
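The Dice coefficient used in the stage-wise evaluation has a standard definition, 2|A ∩ B| / (|A| + |B|) for binary masks A and B, and can be sketched as follows. The masks and the convention of returning 1.0 for two empty masks are assumptions for the example.

```python
import numpy as np

def dice_coefficient(pred, truth):
    """Dice coefficient between two binary segmentation masks:
    2 * |A ∩ B| / (|A| + |B|). Returns 1.0 when both masks are empty
    (a common convention, assumed here)."""
    pred = np.asarray(pred, dtype=bool)
    truth = np.asarray(truth, dtype=bool)
    total = pred.sum() + truth.sum()
    if total == 0:
        return 1.0
    return 2.0 * np.logical_and(pred, truth).sum() / total

# Hypothetical 4x4 predicted and ground-truth infection masks
pred  = np.array([[1, 1, 0, 0]] * 4)
truth = np.array([[1, 0, 0, 0]] * 4)
print(dice_coefficient(pred, truth))   # 2*4 / (8 + 4) ≈ 0.667
```

A Dice value near 1 indicates close agreement between the predicted infected area and the ground truth, which is why it is a natural per-stage complement to the pixel-level mAP and mIoU metrics.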

Key words: oilseed rape sclerotinia detection, hyperspectral image classification, Bi-GRU, spatial-spectral feature fusion, deep learning