2017
DOI: 10.1371/journal.pone.0186193
The potential of small-Unmanned Aircraft Systems for the rapid detection of threatened unimproved grassland communities using an Enhanced Normalized Difference Vegetation Index

Abstract: The loss of unimproved grassland has led to species decline in a wide range of taxonomic groups. Agricultural intensification has resulted in fragmented patches of remnant grassland habitat both across Europe and internationally. The monitoring of remnant patches of this habitat is critically important, however, traditional surveying of large, remote landscapes is a notoriously costly and difficult task. The emergence of small-Unmanned Aircraft Systems (sUAS) equipped with low-cost multi-spectral cameras offer…
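The title's Enhanced Normalized Difference Vegetation Index (ENDVI) can be sketched numerically. This is a minimal illustration assuming the commonly used ENDVI definition, ((NIR + Green) − 2·Blue) / ((NIR + Green) + 2·Blue), for modified NIR-sensitive cameras; the paper's exact band weighting may differ, and the band values below are invented for illustration.

```python
import numpy as np

def endvi(nir, green, blue):
    """Enhanced NDVI, assuming the common ((NIR+G)-2B)/((NIR+G)+2B) form."""
    nir, green, blue = (np.asarray(b, dtype=float) for b in (nir, green, blue))
    num = (nir + green) - 2.0 * blue
    den = (nir + green) + 2.0 * blue
    # Guard against division by zero on completely dark pixels
    return np.where(den != 0, num / np.where(den == 0, 1.0, den), 0.0)

# Toy reflectances: a vegetated pixel (high NIR) and a bare-ground pixel
nir   = np.array([0.8, 0.3])
green = np.array([0.4, 0.2])
blue  = np.array([0.1, 0.2])
print(endvi(nir, green, blue))  # vegetated pixel scores higher than bare ground
```

Like NDVI, the index is bounded in [−1, 1]; healthy vegetation (strong NIR reflectance, low blue) pushes values toward 1.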

Cited by 34 publications
(21 citation statements)
References 37 publications
“…Furthermore, the application of sUAS surveying may also be applicable to help understand vegetation development and ecosystem functioning in relation to a site's morphological development through vegetation and habitat mapping, as previously performed in other settings elsewhere (e.g. Belluco et al., 2006; Strong et al., 2017). This potential application, including multi-spectral vegetation analysis, should form the basis of future research focussed on the value and use of sUAS in intertidal settings.…”
Section: Development and Ecological Change in Intertidal Wetland Envi…
confidence: 95%
“…With the advent and rapid development of GIS software and the large amount of remotely sensed data available, these tools have been increasingly used for predictive plant community mapping (Burnside & Waite, 2011). Among the wide range of remote sensing techniques and platforms, there are many studies that use passive multispectral remotely sensed data to identify plant communities (Townsend & Walsh, 2001; Brown et al., 2006; Balzarolo et al., 2009; Berni et al., 2009; Hamada et al., 2011; Strong et al., 2017). These studies used a methodology based on identifying specific reflectance values in different wavelengths of distinct vegetation by performing some form of classification, either unsupervised or supervised (Jones & Vaughn, 2010).…”
Section: Introduction
confidence: 99%
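The classification approach this citing statement describes — grouping pixels by their per-band reflectance, with or without labelled training data — can be sketched with a toy unsupervised example. This is a minimal k-means written from scratch on hypothetical (NIR, green, blue) reflectances, not the specific algorithm or data any of the cited studies used.

```python
import numpy as np

def kmeans_pixels(X, k, iters=20, seed=0):
    """Toy k-means: cluster pixels (rows of X) by spectral similarity."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)].astype(float)
    for _ in range(iters):
        # Assign each pixel to its nearest spectral centroid
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Recompute each centroid as the mean of its member pixels
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels

# Hypothetical pixels as (NIR, green, blue): two vegetated, two bare ground
pixels = np.array([[0.8, 0.4, 0.1], [0.7, 0.5, 0.1],
                   [0.2, 0.2, 0.3], [0.3, 0.2, 0.2]])
labels = kmeans_pixels(pixels, k=2)
print(labels)  # the two vegetated pixels share one label, bare ground the other
```

A supervised variant would instead fit a classifier on pixels with known community labels; the unsupervised form shown here only discovers spectrally distinct groups, which still need expert interpretation.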
“…Despite its great potential and recent progress, the use of UAVs and classification algorithms for mapping grassland plant communities has received little attention in the scientific literature. Some studies have utilized different sensors and statistical algorithms to automatically map grassland communities, from consumer-grade cameras (Gonçalves et al., 2015; Lu & He, 2017) to multispectral sensors (Strong et al., 2017). In addition, UAVs have been used to estimate aboveground biomass production in grasslands (Wang et al., 2017).…”
Section: Introduction
confidence: 99%
“…sUAS are being used increasingly across a number of scientific disciplines as an alternative approach to provide high-resolution detailed imagery (e.g., James and Robson 2014; Tonkin and Midgley 2016; Strong et al. 2017). Images can be used for rapid reconstruction of surface geometry, providing there is sufficient overlap between images, without the need for camera position or orientation data through automated photogrammetric techniques (e.g., James and Robson 2012; Westoby et al. 2012; Javernick et al. 2014; Nolan et al. 2015).…”
Section: DSM
confidence: 99%