Manual evaluation of herbicide injury to crops is time-consuming. Unmanned aerial systems (UAS) carrying high-resolution multispectral sensors, combined with machine learning classification techniques, have the potential to save time and improve precision in the evaluation of herbicide injury in crops, including grain sorghum [Sorghum bicolor (L.) Moench]. The objectives of our research are to (1) evaluate three supervised classification algorithms [support vector machine (SVM), maximum likelihood, and random forest] for categorizing high-resolution UAS imagery to aid in data extraction and (2) evaluate the use of vegetative indices (VIs) derived from UAS imagery as an alternative to traditional visual assessment of herbicide injury in mesotrione-tolerant grain sorghum breeding trials. An experiment was conducted in a randomized complete block design with a factorial treatment arrangement of three genotypes by four mesotrione doses. Herbicide injury was rated visually on a scale of 0 (no injury) to 100 (complete plant mortality). UAS flights were flown at 9, 15, 21, 27, and 35 days after treatment. Results show the SVM algorithm to be the most consistently accurate, and high correlations (r = −0.83 to −0.94; p < 0.0001) were observed between the normalized difference VI and ground-measured herbicide injury. Therefore, we conclude that VIs collected with UAS, coupled with machine learning image classification, have the potential to be an effective method of evaluating mesotrione injury in grain sorghum. © The Authors. Published by SPIE under a Creative Commons Attribution 4.0 Unported License. Distribution or reproduction of this work in whole or in part requires full attribution of the original publication, including its DOI.
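To make the pixel-classification-plus-VI workflow concrete, the following Python sketch (using scikit-learn and SciPy, which the study does not name) trains an SVM on hypothetical per-pixel multispectral features to separate sorghum canopy from background and then correlates hypothetical plot-mean NDVI values with 0 to 100 visual injury ratings. Every array below is a placeholder, not data from the trial.

```python
# Minimal sketch (not the authors' pipeline): SVM canopy classification
# followed by NDVI-vs-injury correlation on hypothetical values.
import numpy as np
from scipy.stats import pearsonr
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical per-pixel reflectance features (blue, green, red, red edge, NIR)
# with labels 0 = soil/background, 1 = sorghum canopy.
X = rng.random((2000, 5))
y = (X[:, 4] > X[:, 2]).astype(int)  # toy labeling rule standing in for training data

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
svm = SVC(kernel="rbf", C=10, gamma="scale").fit(X_train, y_train)
print(f"SVM overall accuracy: {svm.score(X_test, y_test):.2f}")

def ndvi(nir, red):
    """Normalized difference vegetation index for matched band arrays."""
    return (nir - red) / (nir + red + 1e-9)

# Hypothetical plot-mean reflectances extracted from classified canopy pixels,
# paired with the corresponding 0-100 visual injury ratings.
plot_nir = np.array([0.55, 0.48, 0.40, 0.32, 0.45, 0.36, 0.52, 0.28])
plot_red = np.array([0.05, 0.07, 0.10, 0.13, 0.08, 0.11, 0.06, 0.15])
injury   = np.array([5, 20, 45, 70, 30, 60, 10, 85])

r, p = pearsonr(ndvi(plot_nir, plot_red), injury)
print(f"Pearson r = {r:.2f}, p = {p:.4f}")  # a strong negative r mirrors the reported relationship
```

In practice the plot-mean NDVI would be extracted from the SVM-classified canopy mask of each plot rather than supplied directly, but the correlation step is the same.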
Grain sorghum [Sorghum bicolor (L.) Moench] is a crucial crop in the world's semiarid regions, as it can produce grain and biomass yields in precipitation-limited environments. Many genotypes have a characterized form of drought resistance known as the stay-green (SG) trait, which enables sorghum plants to resist postflowering drought stress that can severely reduce yields. Breeding for SG sorghum lines is considered vital for sorghum breeders around the world, but selecting for SG traits currently relies on methods that are labor-intensive and time-consuming. Unmanned aerial systems capable of capturing high-resolution imagery offer a solution for reducing the time and energy required to select for these traits. A field study was conducted in Manhattan, Kansas, where 20 Pioneer® sorghum hybrids were planted in a randomized complete block design with three replications per hybrid. Imagery was collected with a DJI® Matrice 200™ equipped with a MicaSense® RedEdge-MX™ multispectral camera. Flight altitude was 30 m, and flights were conducted under clear, sunny skies within ±2.5 h of solar noon. Ground-measured data included visual senescence ratings, fresh and dry plant biomass, leaf area index, and final grain yield. Correlation and regression analyses indicated significant relationships between the near-infrared spectral band and fresh and dry plant biomass, that green normalized difference vegetation index scores at flowering were most strongly related to final grain yield, and that the visible atmospherically resistant index was most strongly related to visual senescence scores. Significant spectral bands and vegetative indices were clustered into groups, and significant differences were found among traits. We have developed a methodology for SG sorghum growers to collect, process, and extract data for more efficient identification of traits of interest. © The Authors. Published by SPIE under a Creative Commons Attribution 4.0 International License. Distribution or reproduction of this work in whole or in part requires full attribution of the original publication, including its DOI.
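As an illustration of how the reported indices are computed from RedEdge-MX band reflectances, the sketch below calculates GNDVI and VARI from hypothetical plot-mean blue, green, red, and NIR values and regresses them against hypothetical grain yield and senescence scores. The index formulas are the standard published definitions; every number is a placeholder rather than study data.

```python
# Minimal sketch (not the authors' processing chain): compute GNDVI and VARI
# from plot-mean band reflectances and regress them against ground truth.
import numpy as np
from scipy.stats import linregress

# Hypothetical plot-mean reflectances (blue, green, red, NIR) at flowering.
blue  = np.array([0.04, 0.05, 0.05, 0.06, 0.04, 0.05])
green = np.array([0.09, 0.11, 0.10, 0.12, 0.08, 0.10])
red   = np.array([0.06, 0.07, 0.08, 0.09, 0.05, 0.07])
nir   = np.array([0.52, 0.48, 0.44, 0.40, 0.55, 0.46])

gndvi = (nir - green) / (nir + green)          # green normalized difference vegetation index
vari  = (green - red) / (green + red - blue)   # visible atmospherically resistant index

# Hypothetical ground data: final grain yield (kg/ha) and visual senescence scores.
yield_kg_ha = np.array([6200, 5800, 5300, 4900, 6500, 5500])
senescence  = np.array([2, 3, 4, 6, 1, 4])

for name, vi, target in [("GNDVI vs yield", gndvi, yield_kg_ha),
                         ("VARI vs senescence", vari, senescence)]:
    fit = linregress(vi, target)
    print(f"{name}: r = {fit.rvalue:.2f}, slope = {fit.slope:.1f}")
```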
Site-specific weed management using open-source object detection algorithms could accurately detect weeds in cropping systems. We investigated the use of object detection algorithms to detect Palmer amaranth (Amaranthus palmeri S. Watson) in soybean [Glycine max (L.) Merr.]. The objectives were to (1) develop an annotated image database of A. palmeri and soybean to fine-tune object detection algorithms, (2) compare the effectiveness of multiple open-source algorithms in detecting A. palmeri, and (3) evaluate the relationship between A. palmeri growth features and A. palmeri detection ability. Soybean field sites were established in Manhattan, KS, and Gypsum, KS, with natural populations of A. palmeri. A total of 1108 and 392 images were taken aerially and at ground level, respectively, between May 27 and July 27, 2021. After image annotation, a total of 4492 images were selected. Annotated images were used to fine-tune the open-source Faster Region-based Convolutional Neural Network (Faster R-CNN) and Single Shot Detector (SSD) algorithms with a ResNet backbone, as well as the You Only Look Once (YOLO) series of algorithms. Results demonstrated that YOLO version 5 (YOLOv5) achieved the highest mean average precision score of 0.77. For both A. palmeri and soybean detections within this algorithm, the highest F1 score was 0.72, obtained at a confidence threshold of 0.298. A lower confidence threshold of 0.15 increased the likelihood of species detection but also increased the likelihood of false-positive detections. The trained YOLOv5 model was then used to identify A. palmeri in a dataset paired with measured growth features. Linear regression models predicted that precision, recall, and F1 scores decreased as A. palmeri density and height increased. We conclude that open-source algorithms such as YOLOv5 show great potential for detecting A. palmeri in soybean cropping systems.
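A minimal inference sketch is shown below, assuming the public ultralytics/yolov5 torch.hub interface, a hypothetical fine-tuned weights file, and a hypothetical field image; it applies the 0.298 confidence threshold reported above and summarizes detections per class. This illustrates how such a trained model might be queried, not the authors' evaluation pipeline.

```python
# Minimal sketch: run a fine-tuned YOLOv5 model on one field image and
# summarize A. palmeri and soybean detections per class.
import torch

# Load custom weights via torch.hub; the weights path is a placeholder and
# the call downloads the yolov5 repository on first use.
model = torch.hub.load("ultralytics/yolov5", "custom", path="palmer_soybean_best.pt")
model.conf = 0.298   # confidence threshold that maximized F1 in the study
model.iou = 0.45     # NMS IoU threshold (an assumed, default-like value)

# Hypothetical aerial or ground-level image of a soybean row.
results = model("field_image_2021_06_15.jpg")

detections = results.pandas().xyxy[0]  # columns: xmin, ymin, xmax, ymax, confidence, class, name
for name, group in detections.groupby("name"):
    print(f"{name}: {len(group)} detections, mean confidence {group['confidence'].mean():.2f}")
```

Lowering model.conf to 0.15 would reproduce the looser threshold discussed above, trading more detections for more false positives.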
The objective of this paper is to explore data privacy and data-sharing challenges associated with conducting regional data collection using large-scale unmanned aircraft system (UAS) technologies. During the 2016 growing season, the North Dakota State University Extension Service and Elbit Systems of America conducted a proof-of-concept UAS Research and Development (R&D) project to use large-scale UAS platforms to address some of the issues limiting UAS technology adoption. While the researchers initially thought that the high-resolution spatial and temporal field imagery collected would be well received by area farmers, the team learned that some had concerns over the privacy of their farm operations and about who would have access to the data. The team conducted multiple stakeholder meetings to address the issues raised. Although the flights were successfully executed, problems were identified related to farmer interest in the collected data and how to deliver that data to farmers. Though the project demonstrated how third-party operation of large-scale UAS can remove the operational burden from farmers, it introduced additional privacy concerns and highlighted the need for better rural broadband to make grower access to data in the cloud practical.