Banana Fusarium wilt (BFW) is a devastating disease for which no effective cure is available. Timely and reliable detection of the disease and evaluation of its spreading trend help farmers make informed plantation-management decisions. The main purpose of this study was to identify the spectral features of BFW-infected canopies and build optimal BFW classification models for different stages of infection. A RedEdge-MX camera mounted on an unmanned aerial vehicle (UAV) was used to collect multispectral images of a banana plantation infected with BFW in July and August 2020. Three types of spectral features were used as inputs to the classification models: three-visible-band images, five-multispectral-band images, and vegetation indices (VIs). Four supervised methods, Support Vector Machine (SVM), Random Forest (RF), Back Propagation Neural Network (BPNN), and Logistic Regression (LR), and two unsupervised methods, Hotspot Analysis (HA) and the Iterative Self-Organizing Data Analysis Technique Algorithm (ISODATA), were adopted to detect BFW-infected canopies. Compared with healthy canopies, the BFW-infected canopies had higher reflectance in the visible region but lower reflectance in the NIR region. The classification results showed that most of the supervised and unsupervised methods achieved high accuracies. Among the supervised methods, RF based on the five multispectral bands was considered the optimal model, with the highest overall accuracy (OA) of 97.28% and a short running time of 22 min. Among the unsupervised methods, HA achieved high and balanced OAs of more than 95% based on the selected VIs derived from the red and NIR bands, especially WDRVI, NDVI, and TDVI. By comprehensively evaluating the classification results across metrics, the unsupervised method HA is recommended for BFW recognition, especially in the late stage of infection, while the supervised method RF is recommended in the early stage of infection for slightly higher accuracy. The results of this study can inform banana plantation management and provide approaches for plant disease detection.
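As a minimal sketch of the pixel-level workflow described in this abstract, the snippet below computes the red/NIR vegetation indices highlighted above and trains a Random Forest classifier on five-band reflectance. The file names, the WDRVI weighting coefficient, the train/test split, and the RF hyperparameters are illustrative assumptions, not the study's actual settings.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# X: per-pixel reflectance in five bands (blue, green, red, red-edge, NIR); y: 0 = healthy, 1 = BFW-infected
X = np.load("canopy_reflectance.npy")   # hypothetical file names
y = np.load("canopy_labels.npy")
red, nir = X[:, 2], X[:, 4]

# Red/NIR vegetation indices mentioned in the abstract (WDRVI weight of 0.12 is an assumed value)
ndvi = (nir - red) / (nir + red)
wdrvi = (0.12 * nir - red) / (0.12 * nir + red)
tdvi = 1.5 * (nir - red) / np.sqrt(nir**2 + red + 0.5)
print("Mean NDVI (healthy vs. infected):", ndvi[y == 0].mean(), ndvi[y == 1].mean())

# Random Forest on the five-band reflectance (hyperparameters assumed)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
rf = RandomForestClassifier(n_estimators=200, random_state=0)
rf.fit(X_train, y_train)
print("Overall accuracy:", accuracy_score(y_test, rf.predict(X_test)))
```

The NDVI contrast between classes reflects the spectral pattern reported above (lower NIR and higher visible reflectance for infected canopies), which is also why the red/NIR indices work well as inputs to the unsupervised HA method.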
Sugarcane is the main industrial crop for sugar production, and its growth status is closely related to fertilizer, water, and light inputs. Unmanned aerial vehicle (UAV)-based multispectral imagery is widely used for high-throughput phenotyping because it can rapidly assess crop vigor at field scale. This study examined the potential of UAV multispectral images for predicting canopy nitrogen concentration (CNC) and irrigation levels in sugarcane. An experiment was carried out in a sugarcane field with three irrigation levels and five fertilizer levels. Multispectral images were acquired at an altitude of 40 m during the elongation stage. Partial least squares (PLS), backpropagation neural network (BPNN), and extreme learning machine (ELM) models were adopted to predict CNC from various combinations of band reflectance and vegetation indices. The simple ratio pigment index (SRPI), normalized pigment chlorophyll index (NPCI), and normalized green-blue difference index (NGBDI) were selected as model inputs because of their high grey relational degree with CNC and low correlations with one another. The PLS model based on the five-band reflectance and the three vegetation indices achieved the best accuracy (Rv = 0.79, RMSEv = 0.11). Support vector machine (SVM) and BPNN models were then used to classify the irrigation levels based on five spectral features that correlated strongly with irrigation level; SVM reached the higher accuracy of 80.6%. The results of this study demonstrate that high-resolution multispectral images can provide effective information for CNC prediction and irrigation-level recognition in sugarcane.
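The following sketch illustrates the CNC-regression step described above: pigment indices are computed from band reflectance and combined with the five bands as PLS inputs. The index formulas shown are commonly cited forms, and substituting the camera's broad blue/green/red bands for the original narrow-band definitions, along with the file names and the number of latent components, are assumptions rather than the study's exact choices.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

def pigment_indices(blue, green, red):
    """Selected indices (broad-band substitutions for the narrow-band definitions are assumed)."""
    srpi = blue / red                        # simple ratio pigment index
    npci = (red - blue) / (red + blue)       # normalized pigment chlorophyll index
    ngbdi = (green - blue) / (green + blue)  # normalized green-blue difference index
    return np.column_stack([srpi, npci, ngbdi])

# bands: plot-level mean reflectance (blue, green, red, red-edge, NIR); cnc: measured canopy N concentration
bands = np.load("sugarcane_band_means.npy")   # hypothetical file names
cnc = np.load("sugarcane_cnc.npy")
X = np.hstack([bands, pigment_indices(bands[:, 0], bands[:, 1], bands[:, 2])])

X_tr, X_va, y_tr, y_va = train_test_split(X, cnc, test_size=0.25, random_state=0)
pls = PLSRegression(n_components=4)           # number of latent components assumed
pls.fit(X_tr, y_tr)
pred = pls.predict(X_va).ravel()
print("Rv:", np.corrcoef(y_va, pred)[0, 1])
print("RMSEv:", np.sqrt(mean_squared_error(y_va, pred)))
```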
Introduction: Sugarcane is the main industrial crop for sugar production; its growth status is closely related to fertilizer, water, and light inputs. Unmanned aerial vehicle (UAV)-based multispectral imagery is widely used for high-throughput phenotyping because it can rapidly assess crop vigor. This paper studied the potential of multispectral images obtained by low-altitude UAV systems for predicting canopy nitrogen (N) content and irrigation level in sugarcane.
Methods: An experiment was carried out in a sugarcane field with three irrigation levels and five nitrogen levels. A multispectral image was acquired at a flight height of 40 m during the elongation stage, and the canopy nitrogen content was measured as the ground truth. N prediction models, including partial least squares (PLS), backpropagation neural network (BPNN), and extreme learning machine (ELM) models, were established based on different input variables. A support vector machine (SVM) model was used to recognize the irrigation level.
Results: The PLS model based on band reflectance and five vegetation indices achieved better accuracy (R = 0.7693, root mean square error (RMSE) = 0.1109) than the BPNN and ELM models. Several spectral features extracted from the multispectral image differed clearly among the irrigation levels, and the SVM classifier reached an irrigation-level classification accuracy of 77.8%.
Conclusion: Low-altitude multispectral images can provide effective information for N prediction and irrigation-level recognition.
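The abstracts do not report the SVM kernel or the exact spectral features used for irrigation-level classification, so the sketch below is a generic illustration of that step, assuming an RBF-kernel SVM on standardized plot-level features and hypothetical input files.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

# X: plot-level spectral features correlated with irrigation level; y: irrigation level (0, 1, 2)
X = np.load("irrigation_features.npy")   # hypothetical file names
y = np.load("irrigation_labels.npy")

# RBF-kernel SVM with feature standardization; kernel and C are assumed, not from the paper
svm = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10))
scores = cross_val_score(svm, X, y, cv=5)
print("Mean cross-validated accuracy:", scores.mean())
```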