2019
DOI: 10.1016/j.jmva.2018.09.011
Sparse quadratic classification rules via linear dimension reduction

Abstract: We consider the problem of high-dimensional classification between two groups with unequal covariance matrices. Rather than estimating the full quadratic discriminant rule, we propose to perform simultaneous variable selection and linear dimension reduction on the original data, with the subsequent application of quadratic discriminant analysis on the reduced space. In contrast to quadratic discriminant analysis, the proposed framework does not require the estimation of precision matrices; it scales linearly wi…
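The abstract describes a two-step idea: a sparse linear step that simultaneously selects variables and reduces dimension, followed by ordinary QDA on the reduced data. Below is a minimal Python sketch of that general pipeline, not the paper's actual estimator: the lasso-penalized logistic screening step is a stand-in assumption for the paper's projection method, and all data are synthetic.

# Sketch: sparse linear reduction, then QDA on the reduced space.
# The screening step below is a crude surrogate, NOT the paper's method.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis

rng = np.random.default_rng(0)
n, p = 200, 500  # n << p: high-dimensional two-group problem

# Two groups differing in mean and covariance on the first 5 variables.
X0 = rng.normal(size=(n // 2, p))
X1 = rng.normal(size=(n // 2, p))
X1[:, :5] = 1.0 + 2.0 * X1[:, :5]  # shift the mean, inflate the variance
X = np.vstack([X0, X1])
y = np.repeat([0, 1], n // 2)

# Step 1: sparse linear step -- lasso-penalized logistic regression keeps
# a small set of variables (simultaneous selection + reduction surrogate).
sel = LogisticRegression(penalty="l1", solver="liblinear", C=0.1).fit(X, y)
support = np.flatnonzero(sel.coef_.ravel())

# Step 2: QDA fitted only on the reduced space; note that no p x p
# precision matrix is ever estimated, matching the abstract's point.
qda = QuadraticDiscriminantAnalysis().fit(X[:, support], y)
print(f"{support.size} variables kept; train acc = {qda.score(X[:, support], y):.2f}")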

Citations: cited by 13 publications (7 citation statements)
References: 46 publications (113 reference statements)
“…In the conventional classification problem where m_0 = 1, a special case of the proposed CLIPS classifier becomes a new sparse quadratic discriminant analysis (QDA) method (cf. Fan et al., 2015; Li and Shao, 2015; Jiang et al., 2018; Qin, 2018; Zou, 2019; Gaynanova and Wang, 2019; Cai and Zhang, 2019; Pan and Mai, 2020). As a byproduct of our theoretical study, we show that the new QDA method enjoys better theoretical properties compared to some of the state-of-the-art sparse QDA methods such as Fan et al. (2015).…”
Section: Introduction (supporting)
confidence: 54%
“…In this section, we compare our method, Algorithm 1 (QDAP), with LDA, DSDA (Clemmensen et al., 2011), QDA, DAP (Gaynanova & Wang, 2019), and RDA (Guo et al., 2007) on both simulated and real data examples. Besides the classical methods LDA and QDA, RDA is a well-known regularization approach that works well for moderate- and high-dimensional data.…”
Section: Methods for Comparison (mentioning)
confidence: 99%
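For context on what such a comparison looks like in code, here is a hedged sketch contrasting the two classical baselines named above, LDA and QDA, on simulated two-group data with unequal covariances. DSDA, DAP, and RDA are not in scikit-learn (implementations exist in R), so only the two classical methods appear; the data-generating parameters are illustrative assumptions, not those of the cited study.

# LDA vs. QDA on two Gaussian classes with unequal covariance matrices.
import numpy as np
from sklearn.discriminant_analysis import (
    LinearDiscriminantAnalysis,
    QuadraticDiscriminantAnalysis,
)
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n, p = 400, 10
X0 = rng.multivariate_normal(np.zeros(p), np.eye(p), n // 2)
X1 = rng.multivariate_normal(0.5 * np.ones(p), 3.0 * np.eye(p), n // 2)
X = np.vstack([X0, X1])
y = np.repeat([0, 1], n // 2)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=1)

for name, clf in [("LDA", LinearDiscriminantAnalysis()),
                  ("QDA", QuadraticDiscriminantAnalysis())]:
    print(name, f"test acc = {clf.fit(Xtr, ytr).score(Xte, yte):.2f}")
# With unequal covariances, QDA typically outperforms LDA here,
# since LDA's pooled-covariance assumption is violated.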
“…A majority of these works are based on various sparsity assumptions. See Mai (2013) for a review and Gaynanova & Wang (2019) for recent progress. In contrast, we do not impose sparsity assumptions, and our method shares the invariance property with the classical LDA and QDA methods.…”
Section: Introduction (mentioning)
confidence: 99%
“…Then the same formulation as (8), with Γ replaced by Γ̂, follows. Alternatively, we can consider the two-dimensional projection approach for quadratic discriminant analysis proposed by Gaynanova and Wang (2019) in the multivariate setting. In our functional framework, we would optimize the following two objective functions…”
Section: Properties of the Regularized Classifier in the Population Model (mentioning)
confidence: 99%
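The quoted snippet breaks off before the two objective functions appear. For reference, the two-direction projection of Gaynanova and Wang (2019) can, in the multivariate population setting, be written as the minimizers of group-wise quadratic objectives; the display below is a reconstruction under that assumption, not text recovered from the citing paper:

$$\beta_k \;=\; \operatorname*{arg\,min}_{\beta \in \mathbb{R}^p} \Bigl\{ \tfrac{1}{2}\,\beta^{\top} \Sigma_k\, \beta \;-\; (\mu_1 - \mu_2)^{\top} \beta \Bigr\}, \qquad k \in \{1, 2\},$$

whose closed-form solutions $\beta_k = \Sigma_k^{-1}(\mu_1 - \mu_2)$ span the two-dimensional subspace onto which the data are projected before applying QDA.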