2018
DOI: 10.1109/jstsp.2018.2838549

Classification and Representation via Separable Subspaces: Performance Limits and Algorithms

Abstract: We study the classification performance of Kronecker-structured (K-S) models in two asymptotic regimes and develop an algorithm for separable, fast, and compact K-S dictionary learning that improves the classification and representation of multidimensional signals by exploiting the structure in the signal. First, we study the classification performance in terms of diversity order and pairwise geometry of the subspaces. We derive an exact expression for the diversity order as a function of the signal and subspace dimension…
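As a minimal sketch of the classification idea described in the abstract (not the paper's implementation), the Python/NumPy snippet below builds a Kronecker-structured subspace for each class from assumed factor bases and assigns a matrix-valued signal to the class with the smallest projection residual. The helper names (ks_basis, residual, classify), the dimensions, and the random data are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def ks_basis(A, B):
    # Columns of kron(A, B) span the separable (Kronecker-structured) subspace.
    return np.kron(A, B)

def residual(y, D):
    # Distance from y to the column space of D, via least squares.
    coeff, *_ = np.linalg.lstsq(D, y, rcond=None)
    return np.linalg.norm(y - D @ coeff)

def classify(Y, factors):
    # vec(B X A^T) = (A kron B) vec(X) under column-major vectorization,
    # so a separable signal Y = B X A^T lies in the span of kron(A, B).
    y = Y.reshape(-1, order="F")
    return int(np.argmin([residual(y, ks_basis(A, B)) for A, B in factors]))

# Illustrative class models: factor bases A_c (m1 x r1) and B_c (m2 x r2).
m1, m2, r1, r2 = 8, 6, 2, 3
factors = [(rng.standard_normal((m1, r1)), rng.standard_normal((m2, r2)))
           for _ in range(2)]

# Draw a noisy signal from class 1 and classify it by smallest residual.
A1, B1 = factors[1]
Y = B1 @ rng.standard_normal((r2, r1)) @ A1.T + 0.01 * rng.standard_normal((m2, m1))
print(classify(Y, factors))  # expected output: 1
```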

Cited by 2 publications (citing years 2019 and 2023; 1 citation statement); references 42 publications.

Citation statement:
“…Equivalently, we preserve this multi-dimensional structure of the signal by projecting the signal onto Kronecker-structured subspace, which is a Kronecker product of a number of subspaces corresponding to the dimensionality of the signal, while observing only a small subset of the elements of the signal; hence the K-S subspace model is a special case of general subspace models. Authors in [14] and [15] show how the multi-dimensional structure in data can be well exploited for better classification and representation performance.…”
Section: Introduction (classified as mentioning, confidence 99%)
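The excerpt above also notes that only a small subset of the signal's entries may be observed. The hypothetical sketch below (an assumed setup, not code from [14] or [15]) recovers the coefficients of a signal lying in a K-S subspace by solving least squares restricted to the observed rows of the Kronecker basis.

```python
import numpy as np

rng = np.random.default_rng(1)
m1, m2, r1, r2 = 10, 8, 2, 2
A = rng.standard_normal((m1, r1))
B = rng.standard_normal((m2, r2))
D = np.kron(A, B)                        # K-S basis, shape (m1*m2, r1*r2)

x = rng.standard_normal(r1 * r2)         # true coefficients
y = D @ x                                # full signal in the K-S subspace

# Observe only a random ~30% of the entries.
mask = rng.random(m1 * m2) < 0.3
x_hat, *_ = np.linalg.lstsq(D[mask], y[mask], rcond=None)

# Recovery succeeds whenever the retained rows keep the basis full column rank.
print(np.allclose(x, x_hat, atol=1e-8))
```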