Aquatic vegetation has important ecological and regulatory functions and should be monitored in order to detect ecosystem changes. Field data collection is often costly and time-consuming; remote sensing with unmanned aircraft systems (UASs) provides aerial images with sub-decimetre resolution and offers a potential data source for vegetation mapping. In a manual mapping approach, UAS true-colour images with 5-cm-resolution pixels allowed for the identification of non-submerged aquatic vegetation at the species level. However, manual mapping is labour-intensive, and while automated classification methods are available, they have rarely been evaluated for aquatic vegetation, particularly at the scale of individual vegetation stands. We evaluated classification accuracy and time-efficiency for mapping non-submerged aquatic vegetation at three levels of detail at five test sites (100 m × 100 m) differing in vegetation complexity. We used object-based image analysis and tested two classification methods (threshold classification and Random Forest) using eCognition®. The automated classification results were compared to results from manual mapping. Using threshold classification, overall accuracy at the five test sites ranged from 93% to 99% at the water-versus-vegetation level and from 62% to 90% at the growth-form level. Using Random Forest classification, overall accuracy ranged from 56% to 94% at the growth-form level and from 52% to 75% at the dominant-taxon level. Overall classification accuracy decreased with increasing vegetation complexity. At test sites with more complex vegetation, automated classification was more time-efficient than manual mapping. This study demonstrated that automated classification of non-submerged aquatic vegetation from true-colour UAS images is feasible, indicating good potential for operative mapping of aquatic vegetation. When choosing between mapping methods (manual versus automated), the desired level of thematic detail and the required accuracy for the mapping task need to be considered.
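The classification in this study was carried out in eCognition; purely as an illustration of the general approach, the sketch below shows, in Python with scikit-learn, how per-object spectral and textural features might be classified with Random Forest and how overall accuracy could then be assessed against reference labels. The feature names and the synthetic data are placeholders, not values or variables from the study; in the study, reference labels came from the manual mapping.

```python
# Minimal sketch (not the authors' eCognition workflow): Random Forest
# classification of per-object features and an overall-accuracy assessment.
# Feature names (mean_red, glcm_homogeneity, ...) are illustrative placeholders.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, confusion_matrix
from sklearn.model_selection import train_test_split

# Hypothetical table: one row per image object (vegetation stand segment).
objects = pd.DataFrame({
    "mean_red":         np.random.rand(200),
    "mean_green":       np.random.rand(200),
    "mean_blue":        np.random.rand(200),
    "glcm_homogeneity": np.random.rand(200),
    "label": np.random.choice(["water", "nymphaeid", "helophyte"], 200),
})

X = objects.drop(columns="label")
y = objects["label"]
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0
)

rf = RandomForestClassifier(n_estimators=500, random_state=0)
rf.fit(X_train, y_train)
pred = rf.predict(X_test)

# Overall accuracy = proportion of correctly classified validation objects.
print("Overall accuracy:", accuracy_score(y_test, pred))
print(confusion_matrix(y_test, pred))
```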
Abbreviations: GIS = geographic information system; GPS = global positioning system; PAMS = personal aerial mapping system; RPAS = remotely piloted aircraft system; UAS = unmanned aircraft system; UAV = unmanned aerial vehicle; VAT = value added tax
Nomenclature: Vascular plants: Karlsson (1997)
Methods: At one lake and at the river site, we evaluated the accuracy with which aquatic plant species can be identified on printouts of UAS images (scale 1:800, resolution 5.6 cm). As assessment units, we used homogeneous vegetation patches, referred to as vegetation stands, of one or more species. The accuracy assessment included calibration and validation based on field controls. At the river site, we produced a digital vegetation map based on a UAS orthoimage (geometrically corrected image mosaic) and the results of the species identification evaluation, applying visual image interpretation and manual mapping. At one of the lake sites, we assessed the abundance (four-grade scale) of the dominant Phragmites australis and produced a cover map.
Results: We identified the species composition of vegetation stands at the lake and the river site with an overall accuracy of 95.1% and 80.4%, respectively. It was feasible to produce a digital vegetation map, albeit with a slight reduction in detail compared with the species identification step. At the site for abundance assessment, P. australis covered 20% of the total lake surface area, and 70% of the covered area had cover ≤25%.
Conclusions: The tested UAS facilitates identification and mapping of lake and river vegetation at the species level, as well as abundance estimates.
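The cover figures reported in the Results (the share of the lake covered by P. australis and the distribution of cover classes within the covered area) reduce to simple area sums over the mapped stands. A minimal sketch of that arithmetic, assuming entirely hypothetical stand areas, cover classes, and lake area:

```python
# Minimal sketch (hypothetical numbers): summarising a cover map of
# Phragmites australis into lake-wide statistics of the kind reported above.
import pandas as pd

LAKE_AREA_M2 = 500_000  # hypothetical total lake surface area

# One row per mapped vegetation stand; cover_class is the four-grade scale.
stands = pd.DataFrame({
    "area_m2":     [40_000, 30_000, 20_000, 10_000],
    "cover_class": ["<=25%", "<=25%", "26-50%", "51-75%"],
})

covered = stands["area_m2"].sum()
print(f"P. australis stands cover {covered / LAKE_AREA_M2:.0%} of the lake")

# Share of the covered area falling in each cover class.
share_per_class = stands.groupby("cover_class")["area_m2"].sum() / covered
print(share_per_class)
```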
Abstract: Monitoring of aquatic vegetation is an important component in the assessment of freshwater ecosystems. Remote sensing with unmanned aircraft systems (UASs) can provide sub-decimetre-resolution aerial images and is a useful tool for detailed vegetation mapping. In a previous study, non-submerged aquatic vegetation was successfully mapped using automated classification of spectral and textural features from a true-colour UAS orthoimage with 5-cm pixels. In the present study, height data from a digital surface model (DSM) created from overlapping UAS images were incorporated together with the spectral and textural features from the UAS orthoimage to test whether classification accuracy can be improved further. We studied two levels of thematic detail: (a) growth forms, including the classes water, nymphaeid, and helophyte; and (b) dominant taxa, including seven vegetation classes. We hypothesized that incorporating height data together with spectral and textural features would increase classification accuracy compared with using spectral and textural features alone, at both levels of thematic detail. We tested our hypothesis at five test sites (100 m × 100 m each) with varying vegetation complexity and image quality, using automated object-based image analysis in combination with Random Forest classification. Overall accuracy at each of the five test sites ranged from 78% to 87% at the growth-form level and from 66% to 85% at the dominant-taxon level. In comparison to using spectral and textural features alone, the inclusion of height data increased the overall accuracy significantly, by 4%-21% for growth forms and 3%-30% for dominant taxa. The largest improvement gained by adding height data was observed at the test site with the most complex vegetation. Height data derived from UAS images thus have considerable potential to efficiently increase the accuracy of automated classification of non-submerged aquatic vegetation, indicating good possibilities for operative mapping.
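As a rough illustration of the comparison described above, the sketch below contrasts cross-validated Random Forest accuracy with and without a per-object height feature. The use of scikit-learn, the feature names, and the synthetic data are all assumptions for demonstration and do not reproduce the study's eCognition-based workflow.

```python
# Minimal sketch (assumed feature names, synthetic data): comparing Random
# Forest accuracy with and without an object-level height feature from a DSM.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 300
objects = pd.DataFrame({
    "mean_red":        rng.random(n),
    "mean_green":      rng.random(n),
    "glcm_contrast":   rng.random(n),
    "mean_height_dsm": rng.random(n),  # per-object height above water, from the DSM
    "label": rng.choice(["water", "nymphaeid", "helophyte"], n),
})

y = objects["label"]
spectral_textural = objects[["mean_red", "mean_green", "glcm_contrast"]]
with_height = objects.drop(columns="label")

rf = RandomForestClassifier(n_estimators=500, random_state=0)
acc_base   = cross_val_score(rf, spectral_textural, y, cv=5).mean()
acc_height = cross_val_score(rf, with_height, y, cv=5).mean()
print(f"Without height: {acc_base:.2f}  With height: {acc_height:.2f}")
```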