Wildfires can be beneficial for native vegetation, yet they can also impact property values, human safety, and ecosystem function. Resource managers therefore require safe, easy-to-use, timely, and cost-effective methods for quantifying wildfire damage and regeneration. In this work, we demonstrate an approach using an unmanned aerial system (UAS) equipped with a MicaSense RedEdge multispectral sensor to classify and estimate wildfire damage in a coastal marsh. We collected approximately 7.2 km² of five-band multispectral imagery after a wildfire event in February 2016 and used it to create a photogrammetry-based digital surface model (DSM) and orthomosaic for object-based classification analysis. Airborne light detection and ranging data were used to validate the accuracy of the DSM. Four-band airborne imagery from the pre- and post-fire periods was used to estimate pre-fire health, post-fire damage, and vegetation recovery. Immediate and long-term post-fire classifications, along with the area and volume of burned regions, were produced to track revegetation progress. The UAS-based classification, produced from the normalized difference vegetation index (NDVI) and the DSM, was compared to the Landsat-based Burned Area Reflectance Classification. Experimental results show the potential of the presented UAS approach relative to satellite-based mapping in terms of classification accuracy, turnaround time, and spatial and temporal resolution.
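As a minimal illustration of the index underlying the classification, the sketch below computes NDVI from red and near-infrared bands with NumPy. The no-data handling and the burn threshold are assumptions for illustration, not the study's actual pipeline.

```python
# Minimal NDVI sketch, assuming `red` and `nir` are NumPy arrays holding the
# red and near-infrared bands of the orthomosaic (band layout is hypothetical).
import numpy as np

def ndvi(red: np.ndarray, nir: np.ndarray) -> np.ndarray:
    """Normalized difference vegetation index: (NIR - Red) / (NIR + Red)."""
    red = red.astype(np.float64)
    nir = nir.astype(np.float64)
    denom = nir + red
    with np.errstate(divide="ignore", invalid="ignore"):
        # Zero out pixels where both bands are zero (no-data).
        return np.where(denom == 0, 0.0, (nir - red) / denom)

def classify_burn(ndvi_img: np.ndarray, threshold: float = 0.2) -> np.ndarray:
    # The 0.2 cutoff is an illustrative assumption, not the study's threshold.
    return (ndvi_img < threshold).astype(np.uint8)  # 1 = likely burned
```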
This study describes an unmanned aerial system (UAS) method for accurately estimating the number and diameters of harvested loblolly pine (Pinus taeda) stumps in a final-harvest (often referred to as clear-cut) situation. The method is potentially useful for the initial detection of legal or illegal logging events, quantification of the affected area, and estimation of the volume and value of removed pine timber. The study sites comprised three adjacent pine stands in east-central Mississippi. Using image pattern recognition algorithms, the results show a counting accuracy of 77.3% and an RMSE of 4.3 cm for stump diameter estimation. The study also shows that the harvested area can be accurately estimated from the UAS-collected data. Our experiments show that the proposed UAS survey method has the potential for wide use as a monitoring and investigation tool in the forestry and land management industries.
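The abstract does not specify which pattern-recognition algorithm was used. As one plausible sketch only, the snippet below uses OpenCV's Hough circle transform to count roughly circular stump tops in an orthomosaic tile and converts pixel radii to diameters via an assumed ground sample distance; the file name, GSD_M value, and detector parameters are all hypothetical.

```python
# Illustrative stump-counting sketch via circle detection; the study's actual
# pattern-recognition algorithm may differ.
import cv2

GSD_M = 0.01  # assumed ground sample distance: 1 cm per pixel

img = cv2.imread("orthomosaic_tile.png", cv2.IMREAD_GRAYSCALE)  # hypothetical file
img = cv2.medianBlur(img, 5)  # suppress speckle before circle detection

circles = cv2.HoughCircles(
    img, cv2.HOUGH_GRADIENT, dp=1, minDist=20,
    param1=100, param2=30, minRadius=5, maxRadius=40,
)

if circles is not None:
    radii_px = circles[0, :, 2]              # detected radii in pixels
    diameters_cm = 2 * radii_px * GSD_M * 100
    print(f"stumps detected: {len(radii_px)}")
    print(f"mean diameter: {diameters_cm.mean():.1f} cm")
```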
The radiometric quality of remotely sensed imagery is crucial for precision agriculture applications because estimates of plant health rely on the underlying image quality. Sky conditions, and specifically shadowing from clouds, are critical determinants of the quality of images that can be obtained from low-altitude sensing platforms. In this work, we first compare common deep learning approaches for classifying sky conditions with regard to cloud shadows in agricultural fields using a visible-spectrum camera. We then develop an artificial-intelligence-based edge computing system to fully automate the classification process. Training data consisting of 100 oblique-angle images of the sky were provided to a convolutional neural network and two deep residual neural networks (ResNet18 and ResNet34) to learn two classes: (1) good image quality expected, and (2) degraded image quality expected. The expected quality stemmed from the sky conditions (i.e., the density, coverage, and thickness of clouds) at the time of image capture. The networks were tested on a set of 13,000 images. Our results demonstrate that the ResNet18 and ResNet34 classifiers achieved higher classification accuracy than the convolutional neural network classifier. The best overall accuracy, 92% with a Kappa statistic of 0.77, was obtained by ResNet34. These results demonstrate a low-cost quality-control solution for future autonomous farming systems that will operate without human intervention and supervision.
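A minimal sketch of the two-class ResNet18 setup described above, using torchvision; the optimizer, learning rate, input size, and dummy batch are assumptions rather than the paper's training configuration.

```python
# Two-class ResNet18 sky-condition classifier: (0) good quality expected,
# (1) degraded quality expected. Hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn
from torchvision import models

model = models.resnet18(weights=None)          # train from scratch (assumed)
model.fc = nn.Linear(model.fc.in_features, 2)  # replace 1000-class head with 2 classes

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

# One illustrative training step on a dummy batch of 224x224 RGB sky images.
images = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, 2, (8,))
optimizer.zero_grad()
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()
```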
Uncrewed aerial systems (UASs) provide high temporal and spatial resolution information for crop health monitoring and informed management decisions to improve yields. However, traditional in-season yield prediction methodologies are often inconsistent and inaccurate due to variations in soil types and environmental factors. This study aimed to identify the best phenological stage and vegetation index (VI) for estimating corn yield under rainfed conditions. Multispectral images were collected during the corn growing season over three years (2020-2022), and over fifty VIs were analyzed. Across the three-year period, thirty-one VIs exhibited significant correlations (r ≥ 0.7) with yield, sixteen VIs were significantly correlated with yield for at least two years, and five VIs were significantly correlated with yield in all three years. Strong correlations with yield were achieved by combining red, red-edge, and near-infrared-based indices. Further, combined correlation and random forest analyses between yield and VIs identified the VIs with the most consistent and highest predictive power for corn yield. Among them, the leaf chlorophyll index, the Medium Resolution Imaging Spectrometer (MERIS) terrestrial chlorophyll index, and the modified normalized difference at 705 nm were the most consistent predictors of corn yield when recorded around the reproductive stage (R1). This study demonstrates the dynamic nature of canopy reflectance and the importance of considering growth stages and environmental conditions for accurate corn yield prediction.
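The combined correlation and random forest screening can be sketched as below; the CSV layout, column names, and forest size are hypothetical stand-ins for the study's plot-level data, with only the r ≥ 0.7 cutoff taken from the abstract.

```python
# Screen vegetation indices (VIs) against yield: correlation filter, then
# random-forest importance ranking. Data layout is a hypothetical assumption.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

df = pd.read_csv("plot_vis_and_yield.csv")  # hypothetical file: one row per plot
vi_cols = [c for c in df.columns if c != "yield"]

# Step 1: keep VIs with |Pearson r| >= 0.7 against yield, as in the abstract.
corr = df[vi_cols].corrwith(df["yield"]).abs()
strong_vis = corr[corr >= 0.7].index.tolist()

# Step 2: rank the surviving VIs by random-forest feature importance.
rf = RandomForestRegressor(n_estimators=500, random_state=0)
rf.fit(df[strong_vis], df["yield"])
ranking = pd.Series(rf.feature_importances_, index=strong_vis)
print(ranking.sort_values(ascending=False).head(5))  # most predictive VIs
```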