2021
DOI: 10.1002/nme.6856

Probabilistic learning on manifolds (PLoM) with partition

Abstract: The probabilistic learning on manifolds (PLoM) introduced in 2016 has solved difficult supervised problems for the "small data" limit where the number N of points in the training set is small. Many extensions have since been proposed, making it possible to deal with increasingly complex cases. However, the performance limit has been observed and explained for applications for which N is very small and for which the dimension of the diffusion-map basis is close to N. For these cases, we propose a novel extensio…
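The abstract notes that PLoM's performance limit appears when the dimension of the diffusion-map basis is close to N. As background for readers unfamiliar with that basis, here is a minimal sketch of the standard diffusion-maps construction (Gaussian kernel, row normalization to a Markov matrix, leading eigenvectors); this is generic background, not the authors' implementation, and the function name and smoothing parameter `epsilon` are illustrative.

```python
import numpy as np

def diffusion_map_basis(X, epsilon, m):
    """Compute an m-dimensional diffusion-maps basis for the N points
    given as rows of X: Gaussian kernel, row normalization to a Markov
    transition matrix, then its m leading (right) eigenvectors."""
    # Pairwise squared Euclidean distances between the N points.
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-d2 / (4.0 * epsilon))       # Gaussian kernel matrix
    b = K.sum(axis=1)                       # row sums
    # Symmetrize P = diag(b)^-1 K via a similarity transform so a
    # symmetric eigensolver can be used: S = diag(b)^-1/2 K diag(b)^-1/2.
    S = K / np.sqrt(np.outer(b, b))
    w, V = np.linalg.eigh(S)                # ascending eigenvalues
    order = np.argsort(w)[::-1]             # reorder to decreasing
    w, V = w[order], V[:, order]
    # Map back to right eigenvectors of the Markov matrix P.
    Phi = V / np.sqrt(b)[:, None]
    return w[:m], Phi[:, :m]
```

The largest eigenvalue is 1 (the Markov matrix is row-stochastic), and the abstract's regime corresponds to keeping a number m of these eigenvectors that approaches N itself.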

Cited by 26 publications (27 citation statements)
References 35 publications
“…There are methods for sampling underlying distributions on manifolds [95,96,97,98,99,100,101,102]. Among all these existing methods in machine learning, there is the probabilistic learning method (PLoM), which has specifically been developed for small non-Gaussian data (small value of N_d) in arbitrary dimension [103,104,105], with the possibility to take into account additional constraints coming from experiments [106] or from nonlinear partial differential equations [107], to construct a polynomial chaos representation of databases on manifolds [77], to construct Bayesian posteriors in high dimension [108], and which has been used for complex optimization problems under uncertainties [93,109] and challenging applications [110,111,112].…”
Section: Probabilistic Learning on Manifolds (PLoM) Used as a Machine...
confidence: 99%
“…In order to facilitate the reading of this paper, the reader will find in Section A.1 of Appendix A a summary of the PLoM algorithm. We give this summary because the proposed algorithm assembles ingredients that are distributed across three different papers with slightly different notations: the basic algorithm of PLoM [103,104], a novel algorithm to estimate the optimal value of the kernel parameter for the calculation of the diffusion-maps basis [105], and the handling of the normalization constraints [106].…”
Section: Probabilistic Learning on Manifolds (PLoM) Used as a Machine...
confidence: 99%
“…Presently we propose to use the Störmer-Verlet scheme (see [72] for the deterministic case and [73] for the stochastic case), an efficient scheme that ensures long-time energy conservation for non-dissipative Hamiltonian dynamical systems. In [74], we proposed an extension of the Störmer-Verlet scheme to stochastic dissipative Hamiltonian systems, which we have also used in [75,59,40,38,39,47,51].…”
Section: Numerical Implementation
confidence: 99%
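The statement above credits the Störmer-Verlet scheme with long-time energy conservation for non-dissipative Hamiltonian systems. As a reference point, here is a minimal sketch of the deterministic scheme for a separable Hamiltonian H(q, p) = |p|²/2 + V(q); the function name is illustrative, and the stochastic dissipative extension of [74] is not reproduced here.

```python
import numpy as np

def stormer_verlet(q0, p0, grad_V, dt, n_steps):
    """Deterministic Stoermer-Verlet (leapfrog) integrator for the
    separable Hamiltonian H(q, p) = |p|^2/2 + V(q)."""
    q = np.asarray(q0, dtype=float).copy()
    p = np.asarray(p0, dtype=float).copy()
    traj = [(q.copy(), p.copy())]
    for _ in range(n_steps):
        p_half = p - 0.5 * dt * grad_V(q)   # half kick on the momentum
        q = q + dt * p_half                  # full drift on the position
        p = p_half - 0.5 * dt * grad_V(q)   # second half kick
        traj.append((q.copy(), p.copy()))
    return traj
```

On the harmonic oscillator V(q) = q²/2, for instance, the energy error of this scheme stays bounded (of order dt²) instead of drifting, which is the long-time conservation property the quoted passage refers to.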
“…, N} of Q^post. Regarding the resampling of a probability measure with MCMC algorithms, it should also be noted that, when the available training set is composed of a small number of points, suitable algorithms should be used, such as those specifically developed to deal with the case of small data (see [40,41,42,43,44,45,46,39,47] for data-driven problems and [48,49,50] for optimization problems).…”
Section: Introduction
confidence: 99%
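The quoted passage cautions that resampling a probability measure with MCMC calls for algorithms suited to small training sets. For contrast, here is the generic random-walk Metropolis sampler that such specialized small-data methods refine; it is a plain sketch, not one of the cited algorithms, and all names and parameters are illustrative.

```python
import numpy as np

def metropolis(log_density, x0, step, n_samples, rng=None):
    """Random-walk Metropolis sampler targeting the (unnormalized)
    density exp(log_density), with an isotropic Gaussian proposal."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    samples = np.empty((n_samples, x.size))
    for i in range(n_samples):
        prop = x + step * rng.standard_normal(x.shape)  # propose a move
        # Accept with probability min(1, pi(prop) / pi(x)).
        if np.log(rng.random()) < log_density(prop) - log_density(x):
            x = prop
        samples[i] = x
    return samples
```

The small-data difficulty the quote alludes to is that the target density itself must first be estimated from only N points, which is precisely where the cited specialized algorithms depart from this generic sampler.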
“…(ii) Organization and novelties of the paper. First of all, let us point out that a neighboring problem has been tackled in [25], which is devoted to taking constraints into account in the PLoM (probabilistic learning on manifolds) method [44,45,46]. However, in [25], the function h_c is explicit.…”
Section: Introduction
confidence: 99%