Sparse Information Filter for Fast Gaussian Process Regression (2021)
DOI: 10.1007/978-3-030-86523-8_32

Cited by 2 publications (15 citation statements); references 8 publications.
“…For larger datasets, stochastic optimization has been applied, e.g., [2,12,17,27], to obtain faster and more data-efficient optimization procedures. For recent reviews on the subject we refer to [20,23,24].…”
Section: Global Sparse GPs
confidence: 99%
“…The log of the marginal likelihood of our model formulated in Section 3.3 is L(θ) = log q(y|θ) = log N(0, P) with P = H S^{-1} H^T + V, which can be computed efficiently as detailed in Section A.3 and used for deterministic optimization with the full batch y for moderate sample size N. However, in order to scale this parameter optimization to a larger number of samples N in competitive time, stochastic optimization techniques exploiting subsets of the data have to be developed, as was done for the global sparse GP model (SVI [12]; REC [27]; IF [17]). We adapt the hybrid approach IF of [17], where we can likewise exploit an independent factorization of the log marginal likelihood that decomposes into a sum of J terms, so that it can be used for stochastic optimization.…”
Section: Hyperparameter Estimation
confidence: 99%
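The marginal likelihood quoted above, L(θ) = log N(y | 0, P) with P = H S^{-1} H^T + V, can be illustrated with a naive dense computation. This is only a sketch under assumed shapes (H is N×M, S is M×M, V is N×N); the names `H`, `S`, `V` mirror the symbols in the excerpt, and the cited paper computes this quantity far more efficiently than the dense linear algebra below.

```python
import numpy as np

def log_marginal_likelihood(y, H, S, V):
    """Naive dense evaluation of log N(y | 0, P) with P = H S^{-1} H^T + V.

    This mirrors the formula quoted from the citing paper; it is a
    reference sketch, not the paper's efficient implementation.
    """
    # Form the covariance P = H S^{-1} H^T + V explicitly (O(N^2) memory).
    P = H @ np.linalg.solve(S, H.T) + V
    n = y.shape[0]
    # log|P| via a numerically stable slogdet, quadratic term via a solve.
    _, logdet = np.linalg.slogdet(P)
    quad = y @ np.linalg.solve(P, y)
    return -0.5 * (n * np.log(2.0 * np.pi) + logdet + quad)

# Toy example with hypothetical dimensions.
rng = np.random.default_rng(0)
n, m = 5, 3
H = rng.standard_normal((n, m))
S = np.eye(m)
V = 0.1 * np.eye(n)
y = rng.standard_normal(n)
ll = log_marginal_likelihood(y, H, S, V)
```

In stochastic optimization, as the excerpt describes, this objective would instead be decomposed into a sum of J independent terms so that mini-batches of data can drive the hyperparameter updates.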