2022
DOI: 10.48550/arxiv.2206.08744
Preprint
Improved uncertainty quantification for Gaussian process regression based interatomic potentials

Cited by 1 publication (2 citation statements)
References: 26 publications
“…Overestimated uncertainty when training GP models with the exact marginal log-likelihood objective is a known issue, see e.g. [14] and [26]. A variational GP model with inducing points and a proper scalable objective, such as the predictive log-likelihood, therefore allowed us to improve the uncertainty estimates almost without sacrificing predictive quality.…”
Section: Learning Uncertainty
confidence: 99%
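The passage above contrasts the exact marginal log-likelihood objective with scalable variational alternatives. As a minimal self-contained sketch (not the cited paper's implementation — the RBF kernel, data, and hyperparameters below are invented for illustration), exact GP regression computes both this objective and the predictive uncertainty in closed form:

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel k(x, x') = s^2 exp(-||x - x'||^2 / (2 l^2))."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

def exact_gp(X, y, Xstar, noise=0.1, lengthscale=1.0, variance=1.0):
    """Exact GP regression: (log marginal likelihood, predictive mean, predictive var)."""
    n = len(y)
    K = rbf_kernel(X, X, lengthscale, variance) + noise ** 2 * np.eye(n)
    L = np.linalg.cholesky(K)                            # K = L L^T
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))  # K^{-1} y
    # Exact objective: log p(y|X) = -1/2 y^T K^{-1} y - 1/2 log|K| - n/2 log(2 pi)
    lml = -0.5 * y @ alpha - np.log(np.diag(L)).sum() - 0.5 * n * np.log(2 * np.pi)
    Ks = rbf_kernel(X, Xstar, lengthscale, variance)
    mean = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    # diag(K** - Ks^T K^{-1} Ks); diag(K**) = variance for an RBF kernel
    var = variance - (v ** 2).sum(axis=0)
    return lml, mean, var

X = np.linspace(0.0, 5.0, 8)[:, None]
y = np.sin(X).ravel()
Xstar = np.array([[2.5], [10.0]])        # one point near the data, one far away
lml, mean, var = exact_gp(X, y, Xstar)
# The predictive variance shrinks near the data and reverts to the prior far away.
```

Note the cubic cost of the Cholesky factorisation in the number of training points; this is exactly what the inducing-point approximations mentioned in the quote are designed to avoid.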
“…Uncertainty quantification is typically addressed within the Bayesian worldview either by Bayesian model averaging, see e.g. [10], [11], or by directly learning the uncertainty parameters within the framework of Gaussian processes [12], [13], [14]. Here we follow the second route and address the problem of learning the uncertainty parameters with scalable Gaussian processes.…”
Section: Introduction
confidence: 99%
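The scalable route the citing authors follow replaces the exact posterior with one conditioned on a small set of m inducing inputs. The sketch below shows the subset-of-regressors/DTC-style predictive equations behind such approximations (it omits the variational correction term of the full ELBO, and all names, data, and hyperparameters are illustrative, not taken from the paper):

```python
import numpy as np

def rbf(X1, X2, ls=1.0, var=1.0):
    """Squared-exponential kernel."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return var * np.exp(-0.5 * d2 / ls ** 2)

def sparse_gp_predict(X, y, Z, Xstar, noise=0.1, ls=1.0, var=1.0, jitter=1e-8):
    """Inducing-point GP predictive; Z holds the m inducing inputs.
    Cost is O(n m^2) per prediction setup instead of the O(n^3) exact posterior."""
    m = len(Z)
    Kuu = rbf(Z, Z, ls, var) + jitter * np.eye(m)
    Kuf = rbf(Z, X, ls, var)
    Kus = rbf(Z, Xstar, ls, var)
    Sigma = Kuu + Kuf @ Kuf.T / noise ** 2
    Sigma_inv_Kus = np.linalg.solve(Sigma, Kus)
    # mean = sigma^{-2} K*u Sigma^{-1} Kuf y
    mean = Sigma_inv_Kus.T @ (Kuf @ y) / noise ** 2
    # var = k** - Ku*^T Kuu^{-1} Ku* + Ku*^T Sigma^{-1} Ku*   (k** = var for RBF)
    var_star = (var
                - (Kus * np.linalg.solve(Kuu, Kus)).sum(axis=0)
                + (Kus * Sigma_inv_Kus).sum(axis=0))
    return mean, var_star

X = np.linspace(0.0, 5.0, 40)[:, None]
y = np.sin(X).ravel()
Z = np.linspace(0.0, 5.0, 8)[:, None]    # m = 8 inducing points for n = 40 data
mean, var_star = sparse_gp_predict(X, y, Z, np.array([[2.5], [10.0]]))
```

With the inducing inputs set equal to the training inputs, these equations recover the exact GP posterior; choosing m far smaller than n is what makes objectives such as the predictive log-likelihood scalable.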