2014 IEEE Symposium on Adaptive Dynamic Programming and Reinforcement Learning (ADPRL)
DOI: 10.1109/adprl.2014.7010609
Subspace identification for predictive state representation by nuclear norm minimization

Cited by 6 publications (3 citation statements)
References 11 publications
“…In this case, a bigger subspace that maps both the system dynamics and a part of the noise will achieve better performance, because the noise can subsequently be reduced during regression by means of regularization. The trade-off between approximation and estimation errors is handled by solving a rank minimization problem [18].…”
Section: The Learning Algorithm
confidence: 99%
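The excerpt above describes balancing approximation and estimation error through rank (nuclear norm) minimization. A minimal sketch of that idea, using singular-value soft-thresholding as the proximal step of the nuclear norm; the matrix H, the noise level, and the weight lam are illustrative assumptions, not values from the cited paper.

import numpy as np

def nuclear_norm_prox(H, lam):
    """Proximal step of lam * ||.||_*: soft-threshold the singular values of H."""
    U, s, Vt = np.linalg.svd(H, full_matrices=False)
    s_shrunk = np.maximum(s - lam, 0.0)  # shrink small (noise-dominated) singular values
    return U @ np.diag(s_shrunk) @ Vt

rng = np.random.default_rng(0)
true_rank = 3
H_clean = rng.standard_normal((40, true_rank)) @ rng.standard_normal((true_rank, 30))
H_noisy = H_clean + 0.1 * rng.standard_normal((40, 30))  # estimation noise

H_denoised = nuclear_norm_prox(H_noisy, lam=1.0)
print("effective rank:", np.sum(np.linalg.svd(H_denoised, compute_uv=False) > 1e-8))

A larger lam shrinks more singular values to zero, trading a coarser approximation of the dynamics for lower estimation variance, which is the trade-off the excerpt refers to.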
“…These advantages of PSRs help in solving the problems of HMM‐based methods. The most common PSR learning algorithms are the transformed PSR (TPSR) [8] and the improved PSR models [9–13], which use assumed tests to construct the system-dynamics matrix and reduce its dimension via singular value decomposition (SVD). However, these algorithms were designed for applications in artificial intelligence and machine learning.…”
Section: Introduction
confidence: 99%
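The excerpt above mentions the SVD-based dimension reduction used by TPSR-style algorithms. A minimal sketch of that step, assuming a synthetic tests-by-histories matrix P_TH that in practice would be estimated from observed data; the sizes and rank are illustrative only.

import numpy as np

rng = np.random.default_rng(1)
n_tests, n_histories, latent_dim = 50, 60, 4
# Synthetic stand-in for the estimated tests-by-histories dynamics matrix.
P_TH = rng.random((n_tests, latent_dim)) @ rng.random((latent_dim, n_histories))

U, s, Vt = np.linalg.svd(P_TH, full_matrices=False)
k = int(np.sum(s > 1e-8))        # numerical rank, ideally the latent dimension
U_k = U[:, :k]                   # low-dimensional subspace spanned by the tests
core_states = U_k.T @ P_TH       # predictive-state coordinates for each history
print("retained dimension:", k, "state matrix shape:", core_states.shape)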
“…MoM-based algorithms are often consistent, with theoretical guarantees in the form of finite-sample bounds. In addition, these algorithms are able to learn a large variety of models [1], [3], [15], [11]. In a recent work, [15] showed that numerous models are encompassed within the common framework of linear Sequential Systems (SSs) or, equivalently, Multiplicity Automata (MA).…”
Section: Introduction
confidence: 99%
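The excerpt above names linear Sequential Systems / Multiplicity Automata as a common framework. A toy sketch of how such a model assigns a weight to a string via matrix products; the alphabet, dimension, and random parameters are illustrative assumptions only.

import numpy as np

rng = np.random.default_rng(2)
dim = 3
alpha = rng.random(dim)                                   # initial weight vector
beta = rng.random(dim)                                    # final weight vector
A = {sym: 0.5 * rng.random((dim, dim)) for sym in "ab"}   # one transition matrix per symbol

def weight(string):
    """f(x1...xn) = alpha^T A_{x1} ... A_{xn} beta."""
    v = alpha
    for sym in string:
        v = v @ A[sym]
    return float(v @ beta)

print(weight("abba"))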