2024
DOI: 10.1109/tnnls.2023.3294089

Adaptive Sparse Gaussian Process

Abstract: Adaptive learning is necessary in non-stationary environments, where the learning machine needs to forget past data distributions. Efficient algorithms require a compact model whose update does not grow in computational burden with the incoming data, together with the lowest possible computational cost for online parameter updating. Existing solutions only partially cover these needs. Here, we propose the first adaptive sparse Gaussian Process (GP) able to address all these issues. We first reformulate a variational sparse GP…
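
As a rough illustration of the kind of model the abstract describes, the sketch below combines a fixed-inducing-point sparse GP (a subset-of-regressors approximation, not the paper's variational formulation) with an exponential forgetting factor for adaptivity. Every name and modelling choice here (StreamingSparseGP, rbf, lam) is an assumption of this sketch, not the algorithm proposed in the paper. Per-sample updates cost O(M²) for M inducing points, independent of stream length, which matches the abstract's requirement that the model stay compact as data arrive.

```python
import numpy as np

def rbf(X, Z, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel matrix between rows of X and rows of Z.
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

class StreamingSparseGP:
    # Sparse GP with a fixed set of M inducing points Z.
    # Sufficient statistics A ~ K_uf K_fu and b ~ K_uf y are updated
    # per sample in O(M^2); a forgetting factor lam < 1 geometrically
    # down-weights old data so the model can track a drifting distribution.
    def __init__(self, Z, noise=0.1, lam=0.99):
        self.Z, self.noise, self.lam = Z, noise, lam
        M = Z.shape[0]
        self.Kuu = rbf(Z, Z) + 1e-6 * np.eye(M)  # jitter for stability
        self.A = np.zeros((M, M))
        self.b = np.zeros(M)

    def update(self, x, y):
        # One O(M^2) online step: decay old statistics, add the new sample.
        k = rbf(x[None, :], self.Z)[0]           # k_u(x), shape (M,)
        self.A = self.lam * self.A + np.outer(k, k)
        self.b = self.lam * self.b + k * y

    def predict_mean(self, Xs):
        # Subset-of-regressors predictive mean at test inputs Xs.
        Ks = rbf(Xs, self.Z)
        S = self.noise ** 2 * self.Kuu + self.A
        return Ks @ np.linalg.solve(S, self.b)

# Usage: track a slowly drifting sine wave with 20 inducing points.
rng = np.random.default_rng(0)
gp = StreamingSparseGP(Z=np.linspace(0, 10, 20)[:, None], lam=0.98)
for t in range(2000):
    x = rng.uniform(0, 10, size=1)
    y = np.sin(x[0] + 0.002 * t) + 0.1 * rng.standard_normal()
    gp.update(x, y)
print(gp.predict_mean(np.array([[5.0]])))
```

Setting lam = 1 recovers an ordinary streaming subset-of-regressors fit; lam < 1 is one simple way to "forget past data distributions" as the abstract requires.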

Cited by 3 publications (1 citation statement).
References 13 publications.
“…This research marks a novel advancement in etch recipe tuning, where current studies lack ML applications for ScAlN etching optimization. The present methodology may present scalability issues if the input data have many samples, because the computational burden of the method is of the order of N³, N being the number of training samples. This problem has been overcome with the use of sparse Gaussian processes, which dramatically reduce the computational burden of GP modeling without compromising its accuracy, and these versions admit fully nonlinear adaptive counterparts…”
Section: Introduction (mentioning, confidence: 99%)
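
To make the quoted complexity claim concrete: the O(N³) cost of exact GP regression comes from factorizing the N×N kernel matrix, while a sparse approximation with M ≪ N inducing points reduces the dominant cost to O(NM²). The snippet below is a hypothetical timing sketch, not code from either paper; the subset-of-regressors approximation and all names are assumptions of this sketch.

```python
import numpy as np
from time import perf_counter

def rbf(X, Z, ls=1.0, var=1.0):
    # Squared-exponential kernel matrix between rows of X and rows of Z.
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return var * np.exp(-0.5 * d2 / ls ** 2)

rng = np.random.default_rng(1)
N, M, noise = 3000, 50, 0.1
X = rng.uniform(0, 10, size=(N, 1))
y = np.sin(X[:, 0]) + noise * rng.standard_normal(N)

# Exact GP: factorizing the N x N kernel matrix costs O(N^3) time.
t0 = perf_counter()
L = np.linalg.cholesky(rbf(X, X) + noise ** 2 * np.eye(N))
alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
t_exact = perf_counter() - t0

# Sparse GP (subset of regressors): O(N M^2) with M << N inducing points.
t0 = perf_counter()
Z = np.linspace(0, 10, M)[:, None]
Kuf = rbf(Z, X)                                          # (M, N)
S = noise ** 2 * (rbf(Z, Z) + 1e-6 * np.eye(M)) + Kuf @ Kuf.T
w = np.linalg.solve(S, Kuf @ y)
t_sparse = perf_counter() - t0

print(f"exact GP: {t_exact:.2f}s   sparse GP: {t_sparse:.2f}s")
```

The gap widens rapidly with N, since the exact fit scales cubically while the sparse fit scales only linearly in N for fixed M.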