2020
DOI: 10.1109/tac.2019.2920087
Gradient-Based Discrete-Time Concurrent Learning for Standalone Function Approximation

Cited by 20 publications (11 citation statements)
References 21 publications
“…Maximizing λ_min(S)/λ_max(S) (i.e., maximizing λ_min(S) and minimizing λ_max(S)) also enlarges a_γ, which yields a faster settling time in (66). This result completely coincides with the results obtained in [20] and even supports the continuous-time framework studies in [2,4,26]. Therefore, while applying the proposed FTCL, the data recording algorithm in [25] is used, where the appropriate data are selected to maximize λ_min(S)/λ_max(S).…”
Section: Adaptive Approximators With Non-zero MFAEs (ε(k) ≠ 0)
Citation type: supporting
confidence: 87%
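The statement above describes selecting recorded data so as to maximize λ_min(S)/λ_max(S), where S is built from stored regressor samples. A minimal greedy sketch of that kind of data-recording rule is shown below; the function names and the accept-if-ratio-improves criterion are assumptions for illustration, not the exact algorithm of [25].

```python
import numpy as np

def eig_ratio(S):
    """Return lambda_min(S) / lambda_max(S) for a symmetric PSD matrix S."""
    w = np.linalg.eigvalsh(S)  # eigenvalues in ascending order
    return w[0] / w[-1] if w[-1] > 0 else 0.0

def maybe_record(memory, phi):
    """Greedy data recording (hypothetical sketch): keep a new regressor
    sample phi only if it increases lambda_min(S)/lambda_max(S), where
    S = sum of phi_i phi_i^T over the stored samples."""
    n = phi.size
    S_old = sum((np.outer(p, p) for p in memory), np.zeros((n, n)))
    S_new = S_old + np.outer(phi, phi)
    if eig_ratio(S_new) > eig_ratio(S_old):
        memory.append(phi)
        return True
    return False
```

A sample that merely repeats an already well-covered direction leaves λ_min unchanged while growing λ_max, so the ratio drops and the sample is rejected; a sample exciting a new direction raises λ_min and is kept.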
“…Theorem 1: Consider the approximator for the nonlinear system (1) given by (14), whose parameters are adjusted according to the update law (20) with the regressor given by (11). Let Assumptions 1-2 hold; once the rank condition on M and…”
Section: Finite-Time Convergent Analysis for the Proposed FTCL
Citation type: mentioning
confidence: 99%
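The theorem excerpt refers to a parameter update law driven by a regressor, which in concurrent learning combines the instantaneous gradient term with a sum over recorded data so that convergence does not require persistent excitation. A minimal sketch of such a discrete-time step follows; the function, gain, and memory layout are assumptions for illustration, not the paper's actual update law (20).

```python
import numpy as np

def cl_gradient_step(theta, phi, y, memory, gamma=0.1):
    """One discrete-time concurrent-learning gradient step (hypothetical
    sketch): instantaneous gradient term plus a concurrent-learning sum
    over recorded (regressor, output) pairs."""
    # instantaneous prediction error at the current sample
    update = phi * (y - phi @ theta)
    # concurrent-learning term over stored data
    for phi_j, y_j in memory:
        update += phi_j * (y_j - phi_j @ theta)
    return theta + gamma * update
```

Because the stored pairs keep every parameter direction excited, the estimate converges even when the live regressor is constant, which is the qualitative point of the concurrent-learning rank condition on the recorded data.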