2020
DOI: 10.5194/tc-14-2925-2020

Snow depth mapping from stereo satellite imagery in mountainous terrain: evaluation using airborne laser-scanning data

Abstract: Accurate knowledge of snow depth distributions in mountain catchments is critical for applications in hydrology and ecology. Recently, a method was proposed to map snow depth at meter-scale resolution from very-high-resolution stereo satellite imagery (e.g., Pléiades) with an accuracy close to 0.5 m. However, the validation was limited to probe measurements and unmanned aerial vehicle (UAV) photogrammetry, which sampled a limited fraction of the topographic and snow depth variability. We improve up…
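The method summarised in the abstract derives snow depth by differencing a snow-on DEM, built from the stereo satellite imagery, and a snow-off reference DEM. The snippet below is only a minimal sketch of that differencing step, not the authors' processing chain; the GeoTIFF file names are hypothetical and the two DEMs are assumed to be already co-registered on a common grid.

```python
import rasterio

# Minimal sketch of the DEM-differencing idea behind satellite snow depth
# mapping: snow depth = snow-on DEM minus snow-off DEM. Assumes both rasters
# are already co-registered on the same grid; file names are hypothetical.
with rasterio.open("dem_snow_on.tif") as src_on, \
        rasterio.open("dem_snow_off.tif") as src_off:
    snow_on = src_on.read(1, masked=True).astype("float64")
    snow_off = src_off.read(1, masked=True).astype("float64")
    profile = src_on.profile

snow_depth = snow_on - snow_off  # metres; masked where either DEM lacks data

# Write the result; negative values over snow-covered terrain indicate
# residual elevation bias and are usually inspected rather than clipped.
profile.update(dtype="float32", nodata=-9999.0)
with rasterio.open("snow_depth.tif", "w", **profile) as dst:
    dst.write(snow_depth.filled(-9999.0).astype("float32"), 1)
```

In practice, careful co-registration of the two DEMs over snow-free terrain is essential before differencing, since any residual shift maps directly into snow depth error.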


Cited by 86 publications (105 citation statements)
References: 45 publications
“…We opted for a FN-FB set as it offered a larger disparity in point clouds when calculated within the ASP routine, and it was therefore found to be the best trade-off in reducing intersection errors and data gaps due to terrain occlusion. The semi-global matching, binary transform routine in ASP was found by Deschamps-Berger et al. (2020) to reduce the random error and root mean squared error of both snow-covered and stable, snow-free terrain compared to local search algorithms previously implemented for snow depth mapping using Pléiades (Marti et al., 2016; Shaw et al., 2020a). Due to saturation in the SnowON_2017 images (see the discussion in Limitations of Pléiades for Multi-Annual Snow Depth Derivation), correlation failure of the stereo process in ASP resulted in a loss of ∼17% of the total pixels in the catchment (Shaw et al., 2020a).…”
Section: Digital Elevation Model Generation
Citation type: mentioning
Confidence: 99%
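The error figures quoted above (random error and RMSE over snow-covered and stable, snow-free terrain) follow the usual practice of summarising DEM differences on terrain assumed unchanged between acquisitions. The function below is a hedged sketch of such summary statistics (RMSE, median bias and NMAD); the array names and the stable-terrain mask are placeholders, not the evaluation code of the cited studies.

```python
import numpy as np

def stable_terrain_stats(dh, stable_mask):
    """Summarise DEM differences (dh, metres) over assumed-stable terrain.

    dh          : 2-D array of elevation differences (snow-on minus snow-off).
    stable_mask : boolean array, True where terrain is snow-free and assumed
                  unchanged between acquisitions (e.g. exposed rock).
    """
    d = dh[stable_mask & np.isfinite(dh)]
    rmse = float(np.sqrt(np.mean(d ** 2)))
    median = float(np.median(d))
    nmad = float(1.4826 * np.median(np.abs(d - median)))  # robust random error
    return {"rmse_m": rmse, "median_m": median, "nmad_m": nmad}
```

On stable terrain the median indicates any residual vertical bias after co-registration, while the NMAD gives a robust estimate of the random error that the semi-global matching correlator is reported to reduce relative to local search algorithms.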
“…A key challenge remains in linking precipitation events, wind variability and snow depth at the highest reaches of mountain catchments, especially because the available snow depth data offer only a snapshot in time (Figure 4). Nevertheless, the emergence of decimetre-accuracy Pléiades and WorldView products for estimating snow depth (Marti et al., 2016; McGrath et al., 2019; Shaw et al., 2020a; Deschamps-Berger et al., 2020) may offer new potential for constraining high-mountain precipitation, though further steps are required to relate localised in situ meteorological observations to catchment-wide snow volumes on an inter-annual basis.…”
Section: Snow Depth vs. Meteorology of the Central Andes
Citation type: mentioning
Confidence: 99%
“…The benefits of remote sensing techniques for characterising snow have long been established (e.g. Dietz et al., 2012; Dozier and Painter, 2004; König et al., 2001; Kokhanovsky et al., 2019; Nolin, 2010). Commonly, methods to derive information about snow physical properties from optical satellite observations are based on surface reflectance products (e.g.…”
Section: Introduction
Citation type: mentioning
Confidence: 99%