Estimating animal populations is critical for wildlife management. Aerial surveys are used to generate population estimates, but they can be hampered by cost, logistical complexity, and human risk. Additionally, human counts of organisms in aerial imagery can be tedious and subjective. Automated approaches show promise but can be constrained by long setup times and difficulty discriminating animals in aggregations. We combined unmanned aircraft systems (UAS), thermal imagery, and computer vision to improve upon traditional wildlife survey methods. During spring 2015, we flew fixed-wing UAS equipped with thermal sensors, imaging two grey seal (Halichoerus grypus) breeding colonies in eastern Canada. Human analysts manually counted and classified individual seals in the imagery. Concurrently, an automated classification and detection algorithm discriminated seals based on the temperature, size, and shape of their thermal signatures. Automated counts were within 95–98% of human estimates: at Saddle Island the model estimated 894 seals compared with the analysts' count of 913, and at Hay Island it estimated 2188 seals compared with the analysts' 2311. The algorithm addresses common shortcomings of automated detection by effectively recognizing seals in aggregations while keeping model setup time minimal. Our study illustrates how UAS, thermal imagery, and automated detection can be combined to efficiently collect population data critical to wildlife management.
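The detection step described above lends itself to a simple image-processing pipeline: threshold a thermal frame by temperature, group warm pixels into blobs, and keep only blobs whose size and shape are plausible for a seal. The sketch below is a minimal, hypothetical illustration of that idea using NumPy and SciPy; the threshold values and the function name count_seals are assumptions for illustration, not the study's calibrated model.

```python
# Hypothetical sketch of temperature/size/shape-based blob detection.
# Thresholds below are illustrative placeholders, not calibrated values.
import numpy as np
from scipy import ndimage

def count_seals(thermal_frame, temp_threshold_c=10.0,
                min_area_px=20, max_area_px=400, max_elongation=4.0):
    """Count warm blobs whose size and shape are consistent with a seal.

    thermal_frame : 2-D array of per-pixel temperatures (degrees C).
    Returns the number of accepted detections.
    """
    # 1. Temperature: keep pixels warmer than the assumed background cutoff.
    warm = thermal_frame > temp_threshold_c

    # 2. Group warm pixels into connected components (candidate animals).
    labels, n_blobs = ndimage.label(warm)

    count = 0
    for blob_id in range(1, n_blobs + 1):
        mask = labels == blob_id

        # 3. Size: reject blobs too small (sensor noise) or too large
        #    (rocks or unsplit aggregations, which this sketch ignores).
        area = mask.sum()
        if not (min_area_px <= area <= max_area_px):
            continue

        # 4. Shape: crude elongation test from the blob's bounding box.
        rows, cols = np.nonzero(mask)
        height = rows.max() - rows.min() + 1
        width = cols.max() - cols.min() + 1
        if max(height, width) / min(height, width) > max_elongation:
            continue

        count += 1
    return count
```

In practice the size and shape limits would be tuned per sensor, flight altitude, and species, which is why the setup-time claim in the abstract matters.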
Very high-resolution satellite imagery (≤5 m resolution) has become available at a spatial and temporal scale appropriate for dynamic wetland management and conservation across large areas. Estuarine wetlands can potentially be mapped at a detailed habitat scale with a frequency that allows immediate monitoring after storms, in response to human disturbances, and in the face of sea-level rise. Yet such mapping requires significant fieldwork to train modern classification algorithms, and estuarine environments can be difficult to access and are environmentally sensitive. Recent advances in unoccupied aircraft systems (UAS, or drones), coupled with their increased availability, present a solution. UAS can cover a study site with ultra-high-resolution (<5 cm) imagery, allowing visual validation. In this study, we used UAS imagery to assist in training a Support Vector Machine to classify WorldView-3 and RapidEye satellite imagery of the Rachel Carson Reserve in North Carolina, USA. UAS-based and field-based accuracy assessments were employed for comparison across validation methods. We created and examined an array of indices and layers, including texture, NDVI, and a LiDAR-derived DEM. Our results demonstrate classification accuracy on par with previous extensive fieldwork campaigns (93% UAS and 93% field for WorldView-3; 92% UAS and 87% field for RapidEye). Examining change between 2004 and 2017, we found drastic shoreline change but general stability of emergent wetlands. Both WorldView-3 and RapidEye proved valuable sources of imagery for habitat classification, with the main tradeoff being WorldView-3's finer spatial resolution versus RapidEye's greater temporal frequency. We conclude that UAS imagery can be highly effective for training and validating satellite image classifications.
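For readers unfamiliar with the classification step, the sketch below shows one way a Support Vector Machine could be trained on per-pixel satellite features (raw bands, NDVI, texture, and a LiDAR DEM) with class labels digitized from UAS orthomosaics. It is a minimal illustration using scikit-learn with placeholder data; the feature set, hyperparameters, and helper names are assumptions, not the study's actual workflow.

```python
# Hypothetical sketch of pixel-based habitat classification with an SVM.
# Feature choices and placeholder data are illustrative assumptions.
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

def stack_features(red, nir, texture, dem):
    """Stack per-pixel features: raw bands, NDVI, texture, and LiDAR elevation."""
    ndvi = (nir - red) / np.clip(nir + red, 1e-6, None)
    return np.column_stack([red.ravel(), nir.ravel(), ndvi.ravel(),
                            texture.ravel(), dem.ravel()])

# Placeholder scene and labels; in the study, labels would come from
# habitat polygons digitized over UAS orthomosaics.
rng = np.random.default_rng(0)
shape = (50, 50)
red, nir, texture, dem = (rng.random(shape) for _ in range(4))
X = stack_features(red, nir, texture, dem)      # n_pixels x 5 features
y = rng.integers(0, 4, size=X.shape[0])         # placeholder habitat classes

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

# RBF-kernel SVM with feature scaling; hyperparameters are defaults, not tuned.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
clf.fit(X_train, y_train)
print("held-out accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

The held-out accuracy here stands in for the UAS-based and field-based accuracy assessments reported in the abstract.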