2023
DOI: 10.48550/arxiv.2302.11024
Preprint
Gradient Flows for Sampling: Mean-Field Models, Gaussian Approximations and Affine Invariance

Abstract: Sampling a probability distribution with an unknown normalization constant is a fundamental problem in computational science and engineering. This task may be cast as an optimization problem over all probability measures, and an initial distribution can be evolved to the desired minimizer (the target distribution) dynamically via gradient flows. Mean-field models, whose law is governed by the gradient flow in the space of probability measures, may also be identified; particle approximations of these mean-field…
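The abstract describes evolving an initial distribution toward the target via a gradient flow, approximated by interacting particles. A minimal sketch of this idea (my own illustration, not the paper's algorithm) is unadjusted Langevin dynamics, the simplest particle discretization of the Wasserstein gradient flow of KL(ρ‖π); note it only needs ∇V for π(x) ∝ exp(−V(x)), never the normalization constant:

```python
import numpy as np

# Hypothetical toy example: target pi = N(0, 1), i.e. V(x) = x^2 / 2,
# so grad V(x) = x. Each particle follows the Langevin SDE
#   dX = -grad V(X) dt + sqrt(2) dW,
# discretized by Euler-Maruyama.
rng = np.random.default_rng(0)

def grad_V(x):
    return x  # gradient of V(x) = x^2 / 2

n_particles, n_steps, dt = 2000, 500, 0.01
x = rng.normal(loc=3.0, scale=0.5, size=n_particles)  # initial distribution rho_0

for _ in range(n_steps):
    # drift toward low V, plus Brownian noise scaled for invariant measure pi
    x = x - dt * grad_V(x) + np.sqrt(2 * dt) * rng.normal(size=n_particles)

# By t = n_steps * dt = 5, the empirical law is close to N(0, 1)
print(x.mean(), x.std())
```

The paper studies richer flows (Gaussian approximations, affine-invariant metrics), but this plain Langevin sketch shows the common structure: particles whose empirical law follows a gradient flow toward π.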

Cited by 1 publication (1 citation statement)
References 112 publications
“…This new assumption (28) covers almost all reasonable scenarios with ρ_0 being Gaussian, as long as π has a second moment. The upper bound in the assumption of [17] is unnecessary. As suggested in [57], the optimal asymptotic convergence rate should be e^{−2t}, which is proved in [26] under a different set of assumptions.…”
Section: Theorem 2.3 [57, Theorem 3.3] (citation type: mentioning)
Confidence: 99%
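The e^{−2t} rate quoted above can be seen in closed form for the simplest case (a toy calculation of mine, not taken from the cited works): for the 1D Ornstein–Uhlenbeck flow dX = −X dt + √2 dW with target π = N(0, 1), a Gaussian initial law stays Gaussian with mean m_t = m_0 e^{−t} and variance s_t² = 1 + (s_0² − 1)e^{−2t}, so KL(ρ_t‖π) decays asymptotically like e^{−2t} when the means differ:

```python
import math

def kl_gauss(m, s2):
    """KL( N(m, s2) || N(0, 1) ) in closed form."""
    return 0.5 * (s2 + m * m - 1.0 - math.log(s2))

def kl_at(t, m0=3.0, s02=1.0):
    # Exact OU solution: mean decays like e^{-t}, variance gap like e^{-2t}
    m = m0 * math.exp(-t)
    s2 = 1.0 + (s02 - 1.0) * math.exp(-2.0 * t)
    return kl_gauss(m, s2)

# Successive unit-time ratios KL(t+1)/KL(t) approach e^{-2} ~ 0.1353,
# matching the asymptotic rate e^{-2t}.
for t in (1.0, 2.0, 3.0):
    print(kl_at(t + 1.0) / kl_at(t))
```

With the means matched (m_0 = 0) the decay is actually faster, consistent with the rate being *asymptotically* optimal rather than universal.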