Climate change is increasing the reproductive capacity of pest insects as temperatures rise, resulting in vast tree mortality globally. Early information on pest infestation is urgently needed so that timely decisions can be made to mitigate the damage. We investigated the mapping of trees in decline due to European spruce bark beetle infestation using multispectral unmanned aerial vehicle (UAV) imagery collected in spring and fall in four study areas in Helsinki, Finland. We used the Random Forest machine learning algorithm to classify trees based on their symptoms on both occasions. Our approach achieved overall classification accuracies of 78.2% and 84.5% for healthy, declined, and dead trees for the spring and fall datasets, respectively. The results suggest that fall, or the end of summer, provides the most accurate tree vitality classification. We also investigated the transferability of Random Forest classifiers between different areas, obtaining overall classification accuracies ranging from 59.3% to 84.7%. The findings of this study indicate that multispectral UAV-based imagery is capable of classifying tree decline in Norway spruce during a bark beetle infestation.
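For readers who want to experiment with a comparable classification step, the sketch below shows how a Random Forest classifier could be trained on per-tree spectral features. It is not the authors' pipeline; the feature names and the input file are hypothetical.

```python
# Illustrative sketch only: Random Forest classification of tree vitality
# (healthy / declined / dead) from per-tree multispectral features.
# The feature names and the file "tree_features.csv" are hypothetical.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, confusion_matrix

# Hypothetical table: one row per tree crown, spectral band means plus a label.
df = pd.read_csv("tree_features.csv")
features = ["green_mean", "red_mean", "red_edge_mean", "nir_mean", "ndvi_mean"]
X, y = df[features], df["vitality_class"]  # e.g. "healthy", "declined", "dead"

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=42
)

clf = RandomForestClassifier(n_estimators=500, random_state=42)
clf.fit(X_train, y_train)

y_pred = clf.predict(X_test)
print("Overall accuracy:", accuracy_score(y_test, y_pred))
print(confusion_matrix(y_test, y_pred))
```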
Multi- and hyperspectral cameras on drones can be valuable tools in environmental monitoring. A significant shortcoming complicating their use in quantitative remote sensing applications is the lack of sufficiently robust radiometric calibration methods. In the direct reflectance transformation method, the drone is equipped with a camera and an irradiance sensor, allowing transformation of image pixel values to reflectance factors without ground reference data. This method requires the sensors to be calibrated with higher accuracy than is usually required by the empirical line method (ELM), but in return it offers benefits in robustness, ease of operation, and suitability for beyond visual line of sight (BVLOS) flights. The objective of this study was to develop and assess a drone-based workflow for direct reflectance transformation and to implement it on our hyperspectral remote sensing system. A novel atmospheric correction method is also introduced, using two reference panels; unlike in the ELM, the correction is not directly affected by changes in the illumination. The sensor system consists of a hyperspectral camera (Rikola HSI, by Senop) and an onboard irradiance spectrometer (FGI AIRS), both of which were given thorough radiometric calibrations. In laboratory tests and in a flight experiment, the FGI AIRS tilt-corrected irradiances had an accuracy better than 1.9% at solar zenith angles up to 70°. The system's low-altitude reflectance factor accuracy was assessed in a flight experiment using reflectance reference panels, where the normalized root mean square errors (NRMSE) were less than ±2% for the light panels (25% and 50%) and less than ±4% for the dark panels (5% and 10%). In the high-altitude images, taken at 100–150 m altitude, the NRMSEs without atmospheric correction were 1.4%–8.7% for the VIS bands and 2.0%–18.5% for the NIR bands. Significant atmospheric effects appeared already at a 50 m flight altitude. The proposed atmospheric correction was found to be practical, and it decreased the high-altitude NRMSEs to 1.3%–2.6% for the VIS bands and to 2.3%–5.3% for the NIR bands. Overall, the workflow was found to be efficient and to provide accuracies similar to the ELM, while offering operational advantages in challenging scenarios such as forest monitoring, large-scale autonomous mapping tasks, and real-time applications. Tests in varying illumination conditions showed that the reflectance factors of the gravel and vegetation targets varied by up to 8% between sunny and cloudy conditions due to reflectance anisotropy effects, while the direct reflectance workflow retained better accuracy. This suggests that varying illumination conditions have to be further accounted for in drone-based quantitative remote sensing applications.
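As a minimal illustration of the core transformation, the reflectance factor of each band can be computed as R = πL/E from the calibrated at-sensor radiance L and the synchronously measured downwelling irradiance E. The sketch below assumes these quantities are already available; the array names are illustrative and do not reflect the FGI AIRS or Rikola processing chain.

```python
# Minimal sketch of the direct reflectance transformation: convert calibrated
# at-sensor radiance to reflectance factors using a synchronously measured
# downwelling irradiance. Variable names are illustrative only.
import numpy as np

def direct_reflectance(radiance, irradiance):
    """Reflectance factor R = pi * L / E for each band.

    radiance   : (bands, rows, cols) at-sensor radiance [W m-2 sr-1 nm-1]
    irradiance : (bands,) downwelling irradiance at the drone [W m-2 nm-1]
    """
    E = irradiance[:, np.newaxis, np.newaxis]
    return np.pi * radiance / E

# Toy example: 3 bands, 2x2 pixel image.
L = np.array([[[0.05, 0.06], [0.04, 0.05]],
              [[0.08, 0.09], [0.07, 0.08]],
              [[0.12, 0.11], [0.10, 0.13]]])
E = np.array([0.9, 1.1, 0.8])
R = direct_reflectance(L, E)
print(R.shape, R.min(), R.max())
```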
Positioning of unoccupied aerial systems (UAS, drones) is predominantly based on Global Navigation Satellite Systems (GNSS). Due to potential signal disruptions, redundant positioning systems are needed for reliable operation. The objective of this study was to implement and assess a redundant positioning system, based on visual-inertial odometry (VIO), for drone operation at high flight altitudes. A new sensor suite with stereo cameras and an inertial measurement unit (IMU) was developed, and a state-of-the-art VIO algorithm, VINS-Fusion, was used for localisation. Empirical testing of the system was carried out at flight altitudes of 40–100 m, which cover the common altitude range of outdoor drone operations. The performance of several implementations was studied, including stereo visual odometry (stereo-VO), monocular visual-inertial odometry (mono-VIO), and stereo visual-inertial odometry (stereo-VIO). The stereo-VIO provided the best results; flight altitudes of 40–60 m were optimal for the 30 cm stereo baseline. The best positioning accuracy was 2.186 m for an 800 m long trajectory. The performance of the stereo-VO degraded with increasing flight altitude due to the degrading base-to-height ratio. The mono-VIO provided acceptable results, although it did not reach the performance level of the stereo-VIO. This work presented new hardware and research results on localisation algorithms for drones flying at high altitudes. These results are of great importance, since autonomous drone use and beyond visual line-of-sight flying are increasing and will require redundant positioning solutions that compensate for potential disruptions in GNSS positioning. The data collected in this study are published for further analysis and studies.
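A simple way to quantify positioning accuracy of this kind is to compare time-synchronised VIO position estimates against a GNSS/RTK reference trajectory. The sketch below illustrates such an RMSE computation on synthetic toy data; it is not the evaluation code used in the study.

```python
# Illustrative sketch: evaluating VIO positioning accuracy against a
# time-synchronised GNSS/RTK reference trajectory. Toy data only.
import numpy as np

def trajectory_rmse(estimate, reference):
    """Root-mean-square 3D position error between matched samples.

    estimate, reference : (N, 3) arrays of x, y, z positions in metres,
    assumed to be expressed in the same frame and sampled at the same times.
    """
    err = np.linalg.norm(estimate - reference, axis=1)
    return np.sqrt(np.mean(err ** 2))

# Toy data: a straight 800 m track at 60 m altitude with simulated VIO drift.
t = np.linspace(0.0, 1.0, 200)
reference = np.column_stack([800.0 * t, np.zeros_like(t), 60.0 * np.ones_like(t)])
drift = np.column_stack([2.0 * t, 1.5 * t, 0.5 * t])  # grows along the track
estimate = reference + drift

print(f"RMSE: {trajectory_rmse(estimate, reference):.3f} m")
```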
Crop growth is often uneven within an agricultural parcel, even if it has been managed uniformly. Aerial images are often used to determine the presence of vegetation and its spatial variability in field parcels. However, the reasons for this uneven growth have been studied less, and they may be connected to variations in topography as well as soil properties and quality. In this study, we evaluated the relationship between drone image data and field and soil quality indicators. In total, 27 multispectral and RGB drone image datasets were collected from four real farm fields in 2016–2020. We analyzed 13 basic soil quality indicators, including penetrometer resistance in the top- and subsoil, soil texture (clay, silt, fine sand, and sand content), soil organic carbon (SOC) content, clay/SOC ratio, and soil quality assessment parameters (topsoil biological indicators, subsoil macroporosity, compacted layers in the soil profile, topsoil structure, and subsoil structure). Furthermore, a topography variable describing water flow was used as an indicator. First, we evaluated pixel-wise linear correlations between the drone datasets and the soil/field-related parameters. Correlations varied between datasets and reached 0.8 in the best case. Next, we trained and tested multiparameter non-linear models (random forest algorithm) using all 14 soil-related parameters as features to explain the multispectral (NIR band) and RGB (green band) reflectance values of each drone dataset. The results showed that the soil/field indicators could effectively explain the spatial variability in the drone images in most cases (R2 > 0.5), especially for annual crops, and in the best case the R2 value was 0.95. The most important field/soil features for explaining the variability in the drone images varied between fields and imaging times. However, it was found that basic soil quality indicators and topography variables could explain the variability observed in the drone orthomosaics under certain conditions. This knowledge about soil quality indicators causing within-field variation could be utilized when planning cultivation operations or evaluating the value of a field parcel.
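The multiparameter modelling step can be approximated as follows: a Random Forest regressor is fitted with the soil/field indicators as features and the drone reflectance as the target, and the explained variance is reported as R2 on held-out samples. This is a sketch under those assumptions; the column names and input file are hypothetical.

```python
# Illustrative sketch only: Random Forest regression explaining drone NIR
# reflectance from soil/field indicators, reported as R2 on held-out samples.
# Column names and the file "field_pixels.csv" are hypothetical.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

df = pd.read_csv("field_pixels.csv")  # one row per sampled pixel / grid cell
soil_features = [
    "penetrometer_topsoil", "penetrometer_subsoil", "clay", "silt",
    "fine_sand", "sand", "soc", "clay_soc_ratio", "topsoil_biology",
    "subsoil_macroporosity", "compacted_layers", "topsoil_structure",
    "subsoil_structure", "topographic_flow",
]
X, y = df[soil_features], df["nir_reflectance"]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
model = RandomForestRegressor(n_estimators=300, random_state=0)
model.fit(X_train, y_train)

print("R2 on held-out data:", r2_score(y_test, model.predict(X_test)))
# Feature importances hint at which soil/field indicators drive the variation.
for name, imp in sorted(zip(soil_features, model.feature_importances_),
                        key=lambda p: -p[1])[:5]:
    print(f"{name}: {imp:.3f}")
```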