2022
DOI: 10.1002/acs.3442
Adaptive estimation of external fields in reproducing kernel Hilbert spaces

Abstract: This article studies the distributed parameter system that governs adaptive estimation by mobile sensor networks of external fields in a reproducing kernel Hilbert space (RKHS). The article begins with the derivation of conditions that guarantee the well-posedness of the ideal, infinite-dimensional governing equations of evolution for the centralized estimation scheme. Subsequently, convergence of finite-dimensional approximations is studied. Rates of convergence in all formulations are established usi…

Cited by 7 publications (1 citation statement) | References 45 publications
“…Many of these approaches are based on Gaussian mixture models [39] and mixture models more generally [40], with application to car following [41], [42], [43], [44], driver influence via economic models [45], [46], and robotic manipulation [47], [48]. We choose reproducing kernel Hilbert space (RKHS) based tools because a) they are amenable to non-parametric modeling, meaning that, unlike many machine learning approaches such as mixture models, very few parameters are required, and hence there is less vulnerability to excessive tuning; b) they provide established tools for statistical inference and estimation [27], [49], [50], and are gaining traction as tools for data-driven verification [51], [52] and control [53], [54] of dynamical systems; and c) numerical methods scale with the number of samples of observed data, as opposed to the dimension of the state, which makes them a promising approach for extension to run-time adaptive automation.…”

Section: Safe
confidence: 99%
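The scaling property the citing authors mention can be illustrated with a minimal RKHS field estimate: by the representer theorem, the estimate of a scalar field from N scattered measurements is a combination of N kernel sections, so fitting reduces to an N-by-N linear solve regardless of how finely the spatial domain would otherwise be discretized. The sketch below assumes a Gaussian (RBF) kernel and synthetic measurements; the function names and regularization value are illustrative, not from the cited paper.

```python
import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    # Pairwise Gaussian (RBF) kernel matrix between sample sets X and Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma**2))

def fit_rkhs_estimate(X, y, reg=1e-6, sigma=1.0):
    # The estimate lies in the span of kernel sections at the N samples,
    # so the cost is an N x N solve: it scales with the number of samples,
    # not with the dimension of any state-space discretization.
    K = gaussian_kernel(X, X, sigma)
    alpha = np.linalg.solve(K + reg * np.eye(len(X)), y)
    return lambda Xq: gaussian_kernel(Xq, X, sigma) @ alpha

# Estimate a scalar field from 50 scattered samples in the plane.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(50, 2))
y = np.sin(3 * X[:, 0]) * np.cos(3 * X[:, 1])
f_hat = fit_rkhs_estimate(X, y, sigma=0.5)
residual = float(np.max(np.abs(f_hat(X) - y)))  # small training residual
```

This is the static, centralized analogue of the paper's setting; the cited work treats the adaptive, evolving version of such estimates.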