2019
DOI: 10.1080/00401706.2019.1610069

Matrix Linear Discriminant Analysis

Abstract: We propose a novel linear discriminant analysis approach for the classification of high-dimensional matrix-valued data that commonly arises from imaging studies. Motivated by the equivalence of the conventional linear discriminant analysis and the ordinary least squares, we consider an efficient nuclear norm penalized regression that encourages a low-rank structure. Theoretical properties including a non-asymptotic risk bound and a rank consistency result are established. Simulation studies and an application …
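The core computational primitive behind nuclear norm penalized regression is singular value soft-thresholding, the proximal operator of the nuclear norm. A minimal NumPy sketch of that step follows; it is illustrative only and not the authors' implementation (the function name `svt` and the toy data are assumptions):

```python
import numpy as np

def svt(M, tau):
    """Singular value thresholding: the proximal operator of the
    nuclear norm. Shrinks each singular value of M by tau, which
    zeroes out small singular values and encourages low rank."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    s_shrunk = np.maximum(s - tau, 0.0)
    return U @ np.diag(s_shrunk) @ Vt

# Toy example: threshold a (generically) rank-2 matrix
rng = np.random.default_rng(0)
M = rng.standard_normal((5, 2)) @ rng.standard_normal((2, 4))
M_low = svt(M, tau=0.5)
print(np.linalg.matrix_rank(M_low))  # at most 2
```

Inside a proximal-gradient loop, this operator is applied after each gradient step on the least-squares loss, which is how a nuclear norm penalty yields a low-rank coefficient matrix estimate.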

Cited by 19 publications (13 citation statements)
References 24 publications
“…They introduced nuclear norm penalized regression methods to estimate M, which can achieve parsimonious models with enhanced interpretability. The low‐rank assumption has been commonly used in neuroimaging applications; see Zhu et al (2014); Zhou & Li (2014); Kong et al (2020); Yu et al (2020); Hu et al (2020b); Hu et al (2020a) for examples. In the current article, we also assume that M is of low rank.…”
Section: Model Setup and Estimation Procedures
confidence: 99%
“…However, OLS may perform suboptimally as it does not utilize the information that the entries of Y i are related, especially when both p and T diverge with the sample size n. Recently, Yuan et al (2007) and Chen, Dong, & Chan (2013) proposed reduced rank regression models by assuming the low rankness of M. They introduced nuclear norm penalized regression methods to estimate M, which can achieve parsimonious models with enhanced interpretability. The low-rank assumption has been commonly used in neuroimaging applications; see Zhu et al (2014); Zhou & Li (2014); Kong et al (2020); Yu et al (2020); Hu et al (2020b); Hu et al (2020a) for examples. In the current article, we also assume that M is of low rank.…”
Section: Introduction
confidence: 99%
“…The proposed feature extractors are explained in detail in this subsection. 2D-PCA: Principal Component Analysis (PCA) [11] is widely used as a dimensionality-reduction feature representation. In this paper, 2D-PCA is proposed and its implementation is explained hereafter.…”
Section: B Feature Extraction
confidence: 99%
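For context on the 2D-PCA feature extractor mentioned in the quote above, here is a minimal NumPy sketch of the standard technique: eigendecompose the image covariance matrix and project each image onto the leading eigenvectors. The function name, array shapes, and toy data are assumptions for illustration, not taken from the cited paper:

```python
import numpy as np

def two_d_pca(images, k):
    """2D-PCA sketch: compute the image (column) covariance matrix
    G = (1/n) * sum_i (A_i - mean)^T (A_i - mean), then project each
    image onto G's top-k eigenvectors.
    `images` is an (n, h, w) array; returns (n, h, k) feature matrices."""
    mean = images.mean(axis=0)
    centered = images - mean
    # Sum over samples and rows: G[w, v] = sum_{n,h} C[n,h,w] * C[n,h,v] / n
    G = np.einsum('nhw,nhv->wv', centered, centered) / len(images)
    eigvals, eigvecs = np.linalg.eigh(G)      # eigenvalues in ascending order
    X = eigvecs[:, ::-1][:, :k]               # top-k eigenvectors, shape (w, k)
    return images @ X                         # project: (n, h, w) @ (w, k)

# Toy usage: 10 random 8x6 "images" reduced to 8x2 feature matrices
feats = two_d_pca(np.random.default_rng(1).standard_normal((10, 8, 6)), k=2)
print(feats.shape)  # (10, 8, 2)
```

Unlike classical PCA, which flattens each image into a long vector, 2D-PCA works directly on the matrix form, so the covariance matrix is only w-by-w rather than (h*w)-by-(h*w).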
“…According to [11], the above function can also be rewritten as shown in equation (9), where the function ( , ) denotes the matrix trace operation.…”
Section: B Feature Extraction
confidence: 99%
“…[22] developed generalized scalar-on-image regression models via total variation regularization, which can preserve the piecewise-smooth nature of imaging data. [23] proposed an efficient nuclear norm penalized estimation method for matrix linear discriminant analysis.…”
Section: Introduction
confidence: 99%