2020
DOI: 10.3934/fods.2020013

Probabilistic learning on manifolds

Abstract: This paper presents novel mathematical results in support of the probabilistic learning on manifolds (PLoM) recently introduced by the authors. An initial dataset, constituted of a small number of points given in a Euclidean space, is given. The points are independent realizations of a vector-valued random variable whose non-Gaussian probability measure is unknown but is, a priori, concentrated in an unknown subset of the Euclidean space. A learned dataset, constituted of additional realizations, is c…

Cited by 26 publications (36 citation statements) · References: 95 publications
“…The preservation of the concentration is quantified by the calculation of an L² distance on (Θ, 𝒯, 𝒫) between the learned set and the training set [13]. • Using the learned set, PLoM allows for carrying out any conditional statistics such as w ↦ E{Q | W = w} from ℝ^{n_w} into ℝ^{n_q}, and consequently, to directly construct metamodels in a probabilistic framework.…”
Section: Brief Discussion on the Hypotheses and the Objectives of PLoM
confidence: 99%
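The conditional statistics quoted above are evaluated directly from the learned samples. As a loose illustration only (not the estimator used in the cited papers), a conditional mean of the form E{Q | W = w} can be approximated from joint samples with a Gaussian-kernel (Nadaraya-Watson) average; the function name, bandwidth, and synthetic data below are assumptions made for this sketch.

```python
import numpy as np

def conditional_mean(q_samples, w_samples, w_query, bandwidth=0.2):
    """Kernel estimate of E{Q | W = w_query} from joint samples of (Q, W).

    q_samples : (N, n_q) array of Q realizations (e.g., points of a learned set)
    w_samples : (N, n_w) array of the matching W realizations
    w_query   : (n_w,) point at which the conditional mean is evaluated
    bandwidth : Gaussian kernel width (an assumed tuning parameter)
    """
    d2 = np.sum((w_samples - w_query) ** 2, axis=1)   # squared distances to w_query
    weights = np.exp(-0.5 * d2 / bandwidth ** 2)      # Gaussian kernel weights
    weights /= weights.sum()                          # normalize to sum to one
    return weights @ q_samples                        # weighted average of Q samples

# Illustrative usage on synthetic samples (not data from the paper)
rng = np.random.default_rng(0)
w = rng.uniform(-2.0, 2.0, size=(5000, 1))
q = np.sin(w) + 0.1 * rng.normal(size=(5000, 1))
print(conditional_mean(q, w, np.array([0.5])))        # close to sin(0.5) ≈ 0.479
```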
“…In the recently published mathematical foundations of PLoM [13], to establish the main theorem, we introduced a distance between the random matrix defined by PLoM and the deterministic matrix that represents all the given points of the training set. In the present article, in order to facilitate the quantification of the preservation of the concentration of the probability measure between the usual MCMC method (No PLoM), the PLoM method without partition (No-Group PLoM), and the PLoM method with partition (With-Group PLoM), we apply this distance to each group of the partition.…”
Section: Discussion
confidence: 99%
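The distance mentioned in this quote is defined precisely in the cited reference; as a rough sketch under assumed conventions, a normalized squared L² (Frobenius) distance between Monte Carlo realizations of the learned random matrix and the deterministic training matrix can be estimated as below. The array shapes, names, and normalization are illustrative choices, not the paper's exact formula.

```python
import numpy as np

def concentration_distance(learned_realizations, training_matrix):
    """Monte Carlo estimate of a normalized squared distance
    E{||H - eta_d||_F^2} / ||eta_d||_F^2 between a random matrix H,
    represented by its realizations, and the deterministic training matrix.

    learned_realizations : (n_mc, nu, N) array, n_mc realizations of the learned matrix
    training_matrix      : (nu, N) matrix whose N columns are the training points
    """
    diff = learned_realizations - training_matrix         # broadcasts over the n_mc axis
    numerator = np.mean(np.sum(diff ** 2, axis=(1, 2)))   # E{||H - eta_d||_F^2}
    denominator = np.sum(training_matrix ** 2)            # ||eta_d||_F^2
    return numerator / denominator

# Illustrative usage: small perturbations of the training matrix give a small distance
rng = np.random.default_rng(1)
eta_d = rng.normal(size=(4, 100))                          # hypothetical training matrix
h_mc = eta_d + 0.05 * rng.normal(size=(200, 4, 100))       # 200 synthetic realizations
print(concentration_distance(h_mc, eta_d))                 # small value: concentration preserved
```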
“…In probabilistic learning on manifolds, [20] proposed a Markov chain Monte Carlo (MCMC) sampler to generate new data sets, which preserve the concentration of the probability measure estimated from the original data set [21] and have applications in uncertainty quantification [26]. This paper handles the same problem, but explicitly estimates the manifold by the density ridge and generates new data by bootstrapping, which avoids the computational cost of MCMC sampling.…”
confidence: 99%
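The citing paper avoids MCMC by resampling from the original data. A generic smoothed-bootstrap sketch is shown below to illustrate bootstrap-style data generation; it is not the density-ridge procedure of that paper, and the noise scale is an assumed tuning parameter.

```python
import numpy as np

def smoothed_bootstrap(data, n_new, noise_scale=0.05, seed=None):
    """Generate n_new synthetic points by resampling the original points with
    replacement and adding small Gaussian perturbations (smoothed bootstrap).

    data        : (N, d) array of original points
    noise_scale : standard deviation of the added perturbation (assumed parameter)
    """
    rng = np.random.default_rng(seed)
    idx = rng.integers(0, len(data), size=n_new)            # resample indices with replacement
    noise = noise_scale * rng.normal(size=(n_new, data.shape[1]))
    return data[idx] + noise

# Illustrative usage on a synthetic two-dimensional point cloud
rng = np.random.default_rng(2)
original = rng.normal(size=(100, 2))
new_points = smoothed_bootstrap(original, n_new=1000, seed=3)
print(new_points.shape)                                      # (1000, 2)
```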