Rising global temperatures tied to increases in greenhouse gas emissions are impacting high-latitude regions, leading to changes in vegetation composition and feedbacks to climate through increased methane (CH4) emissions. In subarctic peatlands, permafrost collapse has driven shifts in vegetation species at landscape scales with high spatial heterogeneity. Our goal was to provide a baseline for vegetation distribution related to permafrost collapse and changes in biogeochemical processes. We collected unmanned aerial system (UAS) imagery at Stordalen Mire, Abisko, Sweden, to classify vegetation cover types. A series of digital image processing routines was used to generate texture attributes within the image to characterize vegetative cover types. An artificial neural network (ANN) was developed to classify the image. The ANN used all texture variables and color bands (three spectral bands and six metrics) to generate a probability map for each of the eight cover classes. We used the highest probability for a class at each pixel to designate the cover type in the final map. Our overall misclassification rate was 32%, while omission and commission errors by class ranged from 0% to 50%. We found that within our area of interest, the cover classes most indicative of underlying permafrost (hummock and tall shrub) comprised 43.9% of the landscape. Our effort showed the capability of an ANN applied to high-resolution UAS imagery to develop a classification that focuses on vegetation types associated with permafrost status and therefore potentially with changes in greenhouse gas exchange. We also examined the multiple class probabilities at each pixel to assess model confusion. UAS image collection can be an inexpensive and repeatable avenue for determining vegetation change at high latitudes, which can further be used to estimate and scale corresponding changes in CH4 emissions.
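The per-pixel assignment described above (taking the highest class probability, and inspecting the remaining probabilities to gauge confusion) can be sketched as follows. This is a minimal illustration, not the authors' implementation; the array shapes and random values are assumptions for demonstration.

```python
import numpy as np

# Hypothetical ANN output: one probability map per cover class,
# shape (n_classes, H, W). Values here are random placeholders.
rng = np.random.default_rng(0)
probs = rng.random((8, 4, 4))
probs /= probs.sum(axis=0, keepdims=True)  # normalize to probabilities per pixel

# Final cover map: the class with the highest probability at each pixel.
cover_map = np.argmax(probs, axis=0)

# One way to flag model confusion at the pixel level: the margin between
# the top two class probabilities (small margin = ambiguous pixel).
sorted_probs = np.sort(probs, axis=0)
confusion_margin = sorted_probs[-1] - sorted_probs[-2]
```

A small margin between the two most probable classes marks pixels where the classifier was nearly indifferent, which is one route to mapping where class confusion concentrates spatially.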
This letter reports the sensitivity of X-band interferometric synthetic aperture radar (InSAR) data from the first dual-spacecraft radar interferometer, TanDEM-X, to variations in tropical-forest aboveground biomass (AGB). It also reports the first tropical-forest AGB estimates from TanDEM-X data. Tropical forests account for about 50% of the world's forested biomass and play critical roles in the control of atmospheric carbon dioxide by emission through deforestation and uptake through forest growth. The TanDEM-X InSAR data used in this analysis were taken over the Tapajós National Forest, Pará, Brazil, where field measurements from 30 stands were acquired. The magnitude of the InSAR normalized complex correlation, which is called coherence, decreases by about 25% as AGB increases from 2 to 430 Mg ha⁻¹, suggesting more vertically distributed return-power profiles with increasing biomass. Comparison of InSAR coherences to those of small-spot (15 cm) lidar suggests that lidar penetrates deeper into the canopies than InSAR. Modeling InSAR profiles from InSAR coherence and lidar profiles yields an estimate of 0.29 dB/m for the X-band extinction coefficient relative to that of lidar. Forest AGB estimated from InSAR observations on 0.25-ha plots shows RMS scatters about the field-estimated AGB between 52 and 62 Mg ha⁻¹, which is between 29% and 35% of the average AGB of 179 Mg ha⁻¹, depending on the data analysis mode. The sensitivity and biomass-estimation performance suggest the potential of TanDEM-X observations to contribute to global tropical-forest biomass monitoring.
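The coherence quantity central to this abstract, the magnitude of the normalized complex correlation between the two interferometric channels, can be sketched numerically. This is a generic illustration with synthetic data, not the TanDEM-X processing chain; the signal model and window choice are assumptions.

```python
import numpy as np

# Two co-registered single-look complex (SLC) images from an interferometric
# pair. Here s2 is built as a correlated copy of s1 plus noise (synthetic data).
rng = np.random.default_rng(1)
s1 = rng.normal(size=(64, 64)) + 1j * rng.normal(size=(64, 64))
noise = rng.normal(size=(64, 64)) + 1j * rng.normal(size=(64, 64))
s2 = 0.8 * s1 + 0.2 * noise

# Coherence: normalized complex cross-correlation, estimated over a window
# (the whole patch here, for brevity). Its magnitude lies in [0, 1];
# volume decorrelation from a taller, more distributed canopy lowers it.
num = np.mean(s1 * np.conj(s2))
den = np.sqrt(np.mean(np.abs(s1) ** 2) * np.mean(np.abs(s2) ** 2))
coherence = np.abs(num / den)
```

In practice the estimation window is a small local neighborhood rather than a full patch, trading spatial resolution against estimator bias.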
Terrestrial and airborne laser scanning and structure-from-motion techniques have emerged as viable methods to map snow depths. While these systems have advanced snow hydrology, these techniques have noted limitations in either horizontal or vertical resolution. Lidar on an unpiloted aerial vehicle (UAV) is another potential method to observe field- and slope-scale variations at the vertical resolutions needed to resolve local variations in snowpack depth and to quantify snow depth when snowpacks are shallow. This paper provides some of the earliest landscape-scale snow depth mapping results measured using lidar on a UAV. The system, which uses modest-cost, commercially available components, was assessed in a mixed deciduous and coniferous forest and an open field for a thin snowpack (< 20 cm). The lidar-classified point clouds had an average of 90 and 364 ground returns per square meter in the forest and field, respectively. In the field, in situ and lidar mean snow depths, at 0.4 m horizontal resolution, had a mean absolute difference of 0.96 cm and a root mean square error of 1.22 cm. At 1 m horizontal resolution, the field snow depth confidence intervals were consistently less than 1 cm. The forest areas had reduced performance, with a mean absolute difference of 9.6 cm, a root mean square error of 10.5 cm, and an average one-sided confidence interval of 3.5 cm. Although the mean lidar snow depths were only 10.3 cm in the field and 6.0 cm in the forest, a pairwise Steel–Dwass test showed that snow depths were significantly different between the coniferous forest, the deciduous forest, and the field land covers (p < 0.0001). Snow depths were shallower, and snow depth confidence intervals were higher, in areas with steep slopes.
Results of this study suggest that performance depends on both the point cloud density, which can be increased or decreased by modifying the flight plan over different vegetation types, and the grid cell variability that depends on site surface conditions.
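The error metrics reported in this abstract (mean absolute difference and root mean square error between co-located in situ and lidar snow depths) are computed as sketched below. The depth values are made up for demonstration; only the formulas reflect the metrics named above.

```python
import numpy as np

# Co-located snow depth measurements in cm (hypothetical values).
in_situ = np.array([10.2, 9.8, 11.0, 10.5, 9.5])
lidar = np.array([10.0, 10.5, 10.2, 11.3, 9.0])

diff = lidar - in_situ
mae = np.mean(np.abs(diff))           # mean absolute difference
rmse = np.sqrt(np.mean(diff ** 2))    # root mean square error
```

RMSE is always at least as large as the mean absolute difference and weights large outliers more heavily, which is why the two are usually reported together.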
The ability to automatically delineate individual tree crowns using remote sensing data opens the possibility to collect detailed tree information over large geographic regions. While individual tree crown delineation (ITCD) methods have proven successful in conifer-dominated forests using Light Detection and Ranging (LiDAR) data, it remains unclear how well these methods can be applied in deciduous broadleaf-dominated forests. We applied five automated LiDAR-based ITCD methods across fifteen plots ranging from conifer- to broadleaf-dominated forest stands at Harvard Forest in Petersham, MA, USA, and assessed accuracy against manual delineation of crowns from unmanned aerial vehicle (UAV) imagery. We then identified tree- and plot-level factors influencing the success of automated delineation techniques. There was relatively little difference in accuracy between automated crown delineation methods (51–59% aggregated plot accuracy) and, despite parameter tuning, none of the methods produced high accuracy across all plots (27–90% range in plot-level accuracy). The accuracy of all methods was significantly higher with increased plot conifer fraction, and individual conifer trees were identified with higher accuracy (mean 64%) than broadleaf trees (42%) across methods. Further, while tree-level factors (e.g., diameter at breast height, height and crown area) strongly influenced the success of crown delineations, the influence of plot-level factors varied. The most important plot-level factor was species evenness, a metric of relative species abundance that is related to both conifer fraction and the degree to which trees can fill canopy space. As species evenness decreased (e.g., high conifer fraction and less efficient filling of canopy space), the probability of successful delineation increased.
Overall, our work suggests that the tested LiDAR-based ITCD methods perform equally well in a mixed temperate forest, but that delineation success is driven by forest characteristics like functional group, tree size, diversity, and crown architecture. While LiDAR-based ITCD methods are well suited for stands with distinct canopy structure, we suggest that future work explore the integration of phenology and spectral characteristics with existing LiDAR as an approach to improve crown delineation in broadleaf-dominated stands.
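One common way to score automated crown delineations against manually delineated reference crowns, as in the plot-level accuracies above, is intersection-over-union (IoU) matching: an automated crown counts as correct if its overlap with some reference crown exceeds a threshold. The sketch below is a generic illustration, not the paper's protocol; crowns are simplified to axis-aligned bounding boxes and the 0.5 threshold, coordinates, and scoring rule are assumptions.

```python
def iou(a, b):
    """Intersection-over-union of two boxes (xmin, ymin, xmax, ymax)."""
    ix = max(0.0, min(a[2], b[2]) - max(a[0], b[0]))
    iy = max(0.0, min(a[3], b[3]) - max(a[1], b[1]))
    inter = ix * iy
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

# Hypothetical crowns: manual reference vs. automated ITCD output.
reference = [(0, 0, 2, 2), (3, 3, 5, 5)]
automated = [(0.2, 0.1, 2.1, 2.0), (6, 6, 7, 7)]

# An automated crown is "correct" if it overlaps a reference crown at IoU > 0.5.
matched = sum(any(iou(a, r) > 0.5 for r in reference) for a in automated)
accuracy = matched / len(reference)
```

Real crown polygons are irregular, so published workflows typically compute the same overlap logic on polygon geometries rather than bounding boxes.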