2013 IEEE International Workshop on Machine Learning for Signal Processing (MLSP)
DOI: 10.1109/mlsp.2013.6661916

Bounded Gaussian process regression

Abstract: We extend the Gaussian process (GP) framework for bounded regression by introducing two bounded likelihood functions that model the noise on the dependent variable explicitly. This is fundamentally different from the implicit noise assumption in the previously suggested warped GP framework. We approximate the intractable posterior distributions by the Laplace approximation and expectation propagation and show the properties of the models on an artificial example. We finally consider two real-world data sets or…
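The paper's two bounded likelihood functions are not reproduced in this abstract, so as a rough illustration of the explicit-noise idea only, here is a minimal sketch assuming a truncated-Gaussian noise model on an interval [a, b]: the latent GP value f sets the mean of a Gaussian that is renormalized to the interval, so observations near a bound are handled consistently.

```python
# Minimal sketch (not the paper's implementation): a bounded likelihood
# p(y | f) given by a Gaussian with mean f and std sigma, truncated to [a, b].
import numpy as np
from scipy.stats import truncnorm

def bounded_log_likelihood(y, f, sigma, a=0.0, b=1.0):
    """Log p(y | f) under a truncated-Gaussian noise model on [a, b]."""
    # scipy's truncnorm takes the bounds in standardized (z-score) units.
    alpha, beta = (a - f) / sigma, (b - f) / sigma
    return truncnorm.logpdf(y, alpha, beta, loc=f, scale=sigma)

# Example: observations near the upper bound stay properly normalized.
y = np.array([0.05, 0.5, 0.98])
f = np.array([0.1, 0.4, 0.9])   # latent GP function values
print(bounded_log_likelihood(y, f, sigma=0.1))
```

Because such a likelihood is non-Gaussian, the GP posterior is no longer available in closed form, which is why the abstract mentions resorting to the Laplace approximation and expectation propagation.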

Cited by 15 publications (14 citation statements) | References 4 publications
“…This approach has the advantages of avoiding the burden of a large training set that comes with a neural network model, and the inexact satisfaction of constraints that comes with penalization of constraints in the loss function. There has been significant interest in the incorporation of constraints into Gaussian process regression (GPR) models recently (Bachoc et al., 2019; Da Veiga and Marrel, 2012; Jensen et al., 2013; López-Lopera et al., 2018; Raissi et al., 2017; Riihimäki and Vehtari, 2010; Solak et al., 2003; Yang et al., 2018). Many of these approaches leverage the analytic formulation of the GP to incorporate constraints through the likelihood function or…”
Section: Introduction
confidence: 99%
“…(4)] with a non-Gaussian likelihood that satisfies the bounds, which is then used to obtain a posterior formula for predicting observations y from f. The paper by Jensen et al. (2013) provides an overview and comparison of these two methods; we review this below. For the subsequent discussion, we assume that we have a set of observations y_i that satisfy the bound constraint: a ≤ y_i ≤ b.…”
Section: Transformed Output and Likelihood
confidence: 99%
“…The field u together with the observations u_i are then treated with a traditional GP model using the steps outlined in Section 2. The probit function, which is the inverse cumulative distribution function of a standard normal random variable, Φ^{-1}(·), is commonly used as a warping function (Jensen et al., 2013). The probit function transforms bounded values y ∈…”
Section: Warping Functions
confidence: 99%
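For concreteness, the probit warping described in this citation can be sketched as follows. This is an illustrative reconstruction, not code from the cited papers: bounded observations y in (a, b) are rescaled to (0, 1), mapped through Φ^{-1} to the real line, a standard GP is fit to the warped values, and predictions are mapped back through Φ, which guarantees they respect the bounds.

```python
# Illustrative sketch of probit warping for bounded targets (assumed
# bounds a, b; not from the cited papers): warp, regress, warp back.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

a, b = 0.0, 1.0  # assumed bounds on the responses

def warp(y):
    """Map bounded y in (a, b) to the real line via the probit."""
    return norm.ppf((y - a) / (b - a))

def unwarp(z):
    """Map a real-valued prediction back into (a, b)."""
    return a + (b - a) * norm.cdf(z)

# Synthetic bounded data, clipped away from the bounds so the warp is finite.
rng = np.random.default_rng(0)
X = rng.uniform(0, 5, size=(40, 1))
y = np.clip(0.5 + 0.4 * np.sin(X[:, 0]) + 0.05 * rng.normal(size=40),
            1e-3, 1 - 1e-3)

gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel()).fit(X, warp(y))
y_pred = unwarp(gp.predict(X))   # predictions are guaranteed to lie in (a, b)
```

One caveat of this route: because the warp is nonlinear but monotone, pushing the latent predictive mean through Φ yields the predictive median rather than the predictive mean of the bounded response.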
“…The inverse cumulative Gaussian function (probit) is used to map the responses in this study. Fancier ways of transforming outputs or using non-Gaussian likelihoods are also suggested but are out of the scope of the current article (Jensen, Nielsen, & Larsen, 2013; Snelson et al., 2004).…”
Section: Testbed Cases and Data Acquisition
confidence: 99%