2018
DOI: 10.1137/17m1153157

Finite-Dimensional Gaussian Approximation with Linear Inequality Constraints

Abstract: Introducing inequality constraints in Gaussian process (GP) models can lead to more realistic uncertainties in learning a great variety of real-world problems. We consider the finite-dimensional Gaussian approach from Maatouk and Bay (2017), which can satisfy inequality conditions everywhere (boundedness, monotonicity, or convexity). Our contributions are threefold. First, we extend their approach to deal with general sets of linear inequalities. Second, we explore several Markov chain Monte Carlo…
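To make the abstract's idea concrete, here is a minimal sketch (not the paper's implementation) of the finite-dimensional approach it builds on: the GP is approximated by a linear combination of hat basis functions at knots, so a functional constraint such as monotonicity reduces to finitely many linear inequalities on the Gaussian coefficient vector. The knot count, length-scale, and rejection sampler below are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Hedged sketch of the finite-dimensional GP approximation:
# Y_N(x) = sum_j xi_j * phi_j(x), with phi_j hat functions on a uniform grid.
# Monotonicity of Y_N on [0, 1] is then equivalent to the linear inequalities
# xi_{j+1} >= xi_j for all j, imposed on the Gaussian vector xi ~ N(0, K).
rng = np.random.default_rng(0)

N = 5                                     # number of knots (illustrative choice)
t = np.linspace(0.0, 1.0, N)              # knot locations
ell = 1.0                                 # squared-exponential length-scale (assumed)
K = np.exp(-0.5 * (t[:, None] - t[None, :]) ** 2 / ell**2) + 1e-8 * np.eye(N)
L = np.linalg.cholesky(K)

def sample_monotone(n_samples):
    """Naive rejection: draw xi ~ N(0, K), keep draws with nondecreasing knots."""
    out = []
    while len(out) < n_samples:
        xi = L @ rng.standard_normal(N)
        if np.all(np.diff(xi) >= 0):      # linear inequality constraints hold
            out.append(xi)
    return np.array(out)

samples = sample_monotone(3)              # three monotone sample paths (at knots)
```

Plain rejection sampling degenerates quickly as the number of knots grows, which is precisely why the paper investigates more efficient MCMC samplers for the constrained posterior.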


Cited by 60 publications (106 citation statements)
References 28 publications
“…Other possible potential future work is to derive the GP-Protein framework with multiple mRNA profiles (driving-forces) and multiple gap proteins (outputs). Finally, one may consider accounting for positiveness constraints into the GP framework using finite-dimensional Gaussian approximations (see, e.g., Maatouk and Bay, 2017;López-Lopera et al, 2018).…”
Section: Results
confidence: 99%
“…To do so, tensorized finite-dimensional projection spaces could be considered (see, e.g., [42] or [43]). If the functional inputs are functions from T ⊂ R^d → R, then for tensorized projection spaces, the projection dimension is of the form p^(1) × ... × p^(d), where p^(1), .…”
Section: Scope of the Methodology
confidence: 99%
“…, i_n), or by a more advanced rejection method called Rejection Sampling from the Mode (RSM) [19], or by more involved Markov chain Monte Carlo (MCMC) methods [4,33,23]; see also their presentations in [18].…”
Section: Sample Z
confidence: 99%
“…of Z_n. The density in (4) is arguably more complicated than a truncated Gaussian density function, for which many implemented algorithms are available, as discussed above when introducing the references [4,33,23,18]. In [26,22], several approximations of the distribution in (4) by multidimensional Gaussian distributions are presented (in particular, the Laplace and EP approximations, the variational method, and the Kullback-Leibler method).…”
Section: Comparison with Classical GPC
confidence: 99%
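The excerpts above repeatedly point to sampling from truncated multivariate Gaussian distributions as the computational core. As a minimal illustration of one of the MCMC approaches mentioned (not the specific samplers benchmarked in the paper), the following sketch runs a Gibbs sampler for a bivariate Gaussian truncated to the positive orthant; the correlation, burn-in, and starting point are assumptions for the example.

```python
import numpy as np
from scipy.stats import truncnorm

# Hedged sketch: Gibbs sampling for N(0, Sigma) with Sigma = [[1, rho], [rho, 1]],
# truncated to the positive orthant {x : x >= 0}. Each full-conditional of x_j
# given x_k is a univariate Gaussian truncated to [0, inf), sampled exactly.
rho = 0.8
rng = np.random.default_rng(1)

def gibbs_truncated(n_iter, burn=200):
    x = np.array([1.0, 1.0])              # feasible starting point (assumed)
    draws = []
    for it in range(n_iter + burn):
        for j in (0, 1):
            k = 1 - j
            mu = rho * x[k]               # conditional mean of x_j given x_k
            sd = np.sqrt(1.0 - rho**2)    # conditional standard deviation
            a = (0.0 - mu) / sd           # standardized lower truncation bound
            x[j] = truncnorm.rvs(a, np.inf, loc=mu, scale=sd, random_state=rng)
        if it >= burn:
            draws.append(x.copy())
    return np.array(draws)

draws = gibbs_truncated(500)              # 500 post-burn-in constrained draws
```

Unlike naive rejection, every Gibbs iteration yields a feasible point, which is why such MCMC schemes remain practical when the acceptance rate of rejection sampling collapses in higher dimensions.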