2021
DOI: 10.1038/s41467-021-26107-z
From calibration to parameter learning: Harnessing the scaling effects of big data in geoscientific modeling

Abstract: The behaviors and skills of models in many geosciences (e.g., hydrology and ecosystem sciences) strongly depend on spatially-varying parameters that need calibration. A well-calibrated model can reasonably propagate information from observations to unobserved variables via model physics, but traditional calibration is highly inefficient and results in non-unique solutions. Here we propose a novel differentiable parameter learning (dPL) framework that efficiently learns a global mapping between inputs (and opti…

Cited by 143 publications (163 citation statements)
References 51 publications (68 reference statements)
“…The data used here represent the best-instrumented sites from USGS, and 415 locations are only a tiny fraction of the millions of river reaches in the United States. In the future, the combination of process-based modelling and machine learning may allow more robust predictions on a global scale, work that has already been started by other scholars (Jia et al, 2021; Karpatne et al, 2018; Read et al, 2019; Tsai et al, 2021).…”
Section: Further Discussion
confidence: 92%
“…This analysis is not intended to be a formal SBI or model calibration; rather, the purpose is to further explore the validity of our emulators. This approach is much simpler than other calibration approaches that might employ an evolutionary search algorithm [41], a gradient-based method to adjust parameters in a series of more limited model simulations [42], or even use ML approaches to replace the calibration routine [43]. Given this proof of concept, future work should include more complex frameworks, including those that loop parameters back to the original physical model simulation or use a more formalized Bayesian framework [44].…”
Section: Parameter Evaluation
confidence: 99%
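The gradient-based calibration idea mentioned in the excerpt above can be sketched minimally. The one-parameter recession model, the synthetic observations, and all names below are invented for illustration and are not from any of the cited works; the sketch only shows the mechanics of adjusting a model parameter along a finite-difference gradient of the fit to observations.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical one-parameter model: recession curve q_t = k * (1 - k)**t.
def model(k, t):
    return k * (1.0 - k) ** t

t = np.arange(50)
k_true = 0.3
# Synthetic "observations": the true model plus a little noise.
obs = model(k_true, t) + rng.normal(0.0, 0.001, t.size)

def mse(k):
    return np.mean((model(k, t) - obs) ** 2)

# Gradient-based calibration: step the parameter down a
# central finite-difference estimate of d(mse)/dk.
k, lr, eps = 0.6, 5.0, 1e-6
for _ in range(200):
    dk = (mse(k + eps) - mse(k - eps)) / (2.0 * eps)
    k -= lr * dk

print(k)  # converges near k_true = 0.3
```

A real calibration of a process-based model would replace the toy model with the full simulation and typically use many parameters, which is exactly where the inefficiency and non-uniqueness discussed in the paper arise.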
“…simulations [42] or even use ML approaches to replace the calibration routine [43]. Given this proof of concept, future work should include more complex frameworks, including those that loop parameters back to the original physical model simulation or use a more formalized Bayesian framework [44].…”
Section: Base-case Model Performance and In Range Test Cases
confidence: 99%
“…Such gradient information is extremely useful for solving previously difficult or unsolvable problems. For example, Tsai et al (2021) recently proposed a novel differentiable parameter learning (dPL) framework to integrate big-data DL and differentiable PBMs for parameter calibration. Another example is the use of a deep-learning surrogate model to perform riverine bathymetry inversion (Ghorbanidehno et al, 2021).…”
Section: Introduction
confidence: 99%
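As a rough illustration of the dPL idea described above, the sketch below trains a tiny network to map a per-site attribute to the parameter of a toy linear-reservoir model, with the loss computed through the model across all sites jointly. Everything here is hypothetical: the reservoir model, the network, and the data are invented, and a real dPL setup would backpropagate through the actual differentiable process-based model with automatic differentiation rather than the finite-difference gradient used here.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy "physical" model: a linear reservoir whose storage gains
# precipitation each step and releases a fraction k of it.
def reservoir(precip, k):
    s, q = 0.0, []
    for p in precip:
        s += p
        q.append(k * s)
        s -= q[-1]
    return np.array(q)

# Synthetic sites: one attribute a per site; true parameter k = sigmoid(2a).
n_sites, n_steps = 20, 30
attrs = rng.uniform(-1.0, 1.0, n_sites)
precip = rng.uniform(0.0, 1.0, (n_sites, n_steps))
obs = np.array([reservoir(precip[i], sigmoid(2.0 * attrs[i]))
                for i in range(n_sites)])

# dPL: a tiny network g(a; w) -> k, shared and trained across all sites.
def g(a, w):
    return sigmoid(w[0] * np.tanh(w[1] * a + w[2]) + w[3])

# The loss runs through the physical model, so gradients w.r.t. the
# network weights carry information from the observations.
def loss(w):
    sim = np.array([reservoir(precip[i], g(attrs[i], w))
                    for i in range(n_sites)])
    return np.mean((sim - obs) ** 2)

# Central finite differences stand in for autodiff in a real dPL code.
def grad(w, eps=1e-5):
    out = np.zeros_like(w)
    for j in range(w.size):
        wp, wm = w.copy(), w.copy()
        wp[j] += eps
        wm[j] -= eps
        out[j] = (loss(wp) - loss(wm)) / (2.0 * eps)
    return out

w = rng.normal(size=4) * 0.1
loss0 = loss(w)
for _ in range(300):
    w -= 0.5 * grad(w)

print(loss0, loss(w))  # joint training reduces the loss across all sites
```

Because one mapping is learned for all sites at once, information pools across the big-data sample, which is the scaling effect the paper's title refers to.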