Cotton is a significant economic crop that is vulnerable to aphids (Aphis gossypii Glover) during the growing period, so rapid and early detection has become an important means of managing aphid infestation. In this study, a visible/near-infrared (Vis/NIR) hyperspectral imaging system (376–1044 nm) and machine learning methods were used to identify aphid infection in cotton leaves. Both tall and short cotton plants (Lumianyan 24) were inoculated with aphids, and corresponding plants without aphids were used as controls. Hyperspectral images (HSIs) were acquired five times at intervals of 5 days. The healthy and infected leaves were used to build the datasets, with each leaf as a sample. The spectra and RGB images of each cotton leaf were extracted from the hyperspectral images for one-dimensional (1D) and two-dimensional (2D) analysis, and the hyperspectral image of each leaf was used for three-dimensional (3D) analysis. Convolutional neural networks (CNNs) were used for identification and compared with conventional machine learning methods. For the extracted spectra, the 1D CNN achieved good classification performance, with an accuracy of up to 98%. For the RGB images, the 2D CNN achieved better classification performance. For the HSIs, the 3D CNN performed moderately well and outperformed the 2D CNN. Overall, the CNNs performed better than the conventional machine learning methods. In the visualization of the trained networks, the important wavelength ranges were analyzed for the 1D and 3D CNNs, and the important wavelength ranges and spatial regions were analyzed for the 2D and 3D CNNs. The overall results illustrate the feasibility of using hyperspectral imaging combined with multi-dimensional CNNs to detect aphid infection in cotton leaves, providing a new alternative for pest infection detection in plants.
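As an illustration of the 1D route described above, the following is a minimal sketch (in PyTorch) of a 1D CNN that classifies extracted leaf spectra as healthy or aphid-infested. The band count, layer configuration, and training details are assumptions chosen for illustration and are not taken from the paper.

```python
# Minimal 1D CNN sketch for classifying leaf mean spectra (healthy vs. aphid-infested).
# Assumptions (not from the paper): 288 spectral bands per sample, binary labels,
# and a small two-block architecture; the authors' exact configuration may differ.
import torch
import torch.nn as nn

class Spectra1DCNN(nn.Module):
    def __init__(self, n_bands=288, n_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=7, padding=3),  # convolve along the wavelength axis
            nn.BatchNorm1d(16),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=5, padding=2),
            nn.BatchNorm1d(32),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),                      # global pooling over bands
        )
        self.classifier = nn.Linear(32, n_classes)

    def forward(self, x):                                 # x: (batch, 1, n_bands)
        return self.classifier(self.features(x).squeeze(-1))

model = Spectra1DCNN()
dummy = torch.randn(8, 1, 288)                            # 8 simulated leaf spectra
print(model(dummy).shape)                                 # torch.Size([8, 2])
```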
Rapid and accurate detection of pesticide residue levels can help to prevent harm from pesticide residues. This study used visible/near-infrared (Vis-NIR) (376–1044 nm) and near-infrared (NIR) (915–1699 nm) hyperspectral imaging systems to detect pesticide residue levels. Three different grape varieties were sprayed with four levels of pesticides. Logistic regression (LR), support vector machine (SVM), random forest (RF), convolutional neural network (CNN), and residual neural network (ResNet) models were built to classify pesticide residue levels, and saliency maps of the CNN and ResNet models were computed to visualize the contribution of individual wavelengths. Overall, the NIR spectra performed better than the Vis-NIR spectra. For the Vis-NIR spectra, the best model was ResNet, with an accuracy of over 93%. For the NIR spectra, LR was the best, with an accuracy of over 97%, while SVM, CNN, and ResNet achieved close, satisfactory results. The saliency maps of the CNN and ResNet highlighted similar ranges of crucial wavelengths. The overall results indicate that deep learning performed better than conventional machine learning and that hyperspectral imaging combined with machine learning can effectively detect pesticide residue levels in grapes.
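The saliency-map analysis mentioned above can be sketched as a simple gradient-based attribution: the absolute gradient of the predicted class score with respect to the input spectrum indicates which wavelengths drive the decision. The snippet below is a hedged sketch that reuses the 1D CNN sketched earlier; the papers' trained CNN/ResNet models and preprocessing are not reproduced here.

```python
# Sketch of a gradient-based saliency map for a spectral classifier, showing which
# wavelengths contribute most to the predicted pesticide-residue level. The model and
# band count are placeholders; in practice the trained networks would be used.
import torch

def spectral_saliency(model, spectrum, target_class):
    """spectrum: tensor of shape (1, 1, n_bands); returns per-band |d score / d input|."""
    model.eval()
    x = spectrum.clone().requires_grad_(True)
    score = model(x)[0, target_class]      # logit of the class of interest
    score.backward()                       # gradients w.r.t. the input spectrum
    return x.grad.abs().squeeze()          # high values = influential wavelengths

# Hypothetical usage with the Spectra1DCNN sketched above:
# sal = spectral_saliency(model, torch.randn(1, 1, 288), target_class=1)
# top_bands = sal.topk(10).indices         # indices of the 10 most influential bands
```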
Hyperspectral imaging provides an effective way to identify the geographical origin of Radix Glycyrrhizae to assess its quality.
Rapid and accurate prediction of crop nitrogen content is of great significance for guiding precise fertilization. In this study, an unmanned aerial vehicle (UAV) digital camera was used to collect cotton canopy RGB images at a height of 20 m, and two cotton varieties grown under six nitrogen gradients were used to predict nitrogen content in the cotton canopy. After image preprocessing, 46 hand-crafted features were extracted, and deep features were extracted with a convolutional neural network (CNN). Partial least squares and Pearson correlation analysis were used for dimensionality reduction of the two feature sets, respectively. Linear regression, support vector machine, and one-dimensional CNN regression models were constructed with the hand-crafted features as input, and the deep features were used as input to a two-dimensional CNN regression model, to achieve accurate prediction of cotton canopy nitrogen. Both the hand-crafted-feature and deep-feature models built from UAV RGB images achieved good prediction performance: the optimal model for Xinluzao 45 reached R2 = 0.80 and RMSE = 1.67 g kg−1, and the optimal model for Xinluzao 53 reached R2 = 0.42 and RMSE = 3.13 g kg−1. The results show that UAV RGB imagery and machine learning can be used to predict nitrogen content in cotton at large scale, although, owing to the limited number of samples, the accuracy and stability of the prediction models still need to be improved.
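To make the hand-crafted-feature route concrete, the sketch below derives a few simple canopy color indices from an RGB image and fits a linear regression against measured nitrogen content. The specific indices, image sizes, and data are placeholders; the study itself uses 46 hand-crafted features and also evaluates SVM and CNN regressors.

```python
# Sketch of the hand-crafted-feature route: derive simple color indices from a UAV RGB
# canopy image and regress them against measured canopy nitrogen content.
import numpy as np
from sklearn.linear_model import LinearRegression

def color_features(rgb):
    """rgb: H x W x 3 array in [0, 1]; returns a small vector of canopy color indices."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    exg = 2 * g - r - b                                  # excess green index
    return np.array([r.mean(), g.mean(), b.mean(), exg.mean(), exg.std()])

# Placeholder training data: one feature vector per canopy image, with measured N (g kg-1).
X = np.stack([color_features(np.random.rand(64, 64, 3)) for _ in range(20)])
y = np.random.uniform(20, 50, size=20)                   # placeholder nitrogen values
reg = LinearRegression().fit(X, y)
print("R2 on training data:", reg.score(X, y))
```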