2017
DOI: 10.3233/aic-170729
Linear discriminant analysis: A detailed tutorial

Abstract: Linear Discriminant Analysis (LDA) is a very common technique for dimensionality reduction, used as a preprocessing step in machine learning and pattern classification applications. At the same time, it is often applied as a black box and is not always well understood. The aim of this paper is to build a solid intuition for what LDA is and how it works, enabling readers of all levels to gain a better understanding of LDA and to know how to apply this technique in different applications.

Cited by 775 publications (374 citation statements) · References 68 publications
“…A known problem with GMMs, however, is that the use of high-dimensional (short-term) features often leads to numerical problems due to singular covariance matrices. Even if off-the-shelf dimensionality reduction methods such as principal component analysis (PCA) [18] or linear discriminant analysis (LDA) [19] prior to GMM modeling may help, they are not jointly optimized with the GMM. Is there an alternative way to learn a generative model that can handle high-dimensional inputs natively?…”
Section: Introduction
Confidence: 99%
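The singularity problem quoted above can be made concrete: when samples lie near a low-dimensional subspace, the full-dimensional covariance matrix is rank-deficient and cannot be inverted by a GMM. A minimal numpy sketch (the synthetic data and dimensions are illustrative, not from the cited work) shows how an off-the-shelf PCA projection restores an invertible covariance:

```python
import numpy as np

# Hypothetical illustration: 100 samples in 50-D that actually lie in a
# 5-D subspace, so the full covariance matrix is singular.
rng = np.random.default_rng(1)
latent = rng.normal(size=(100, 5))
A = rng.normal(size=(5, 50))
X = latent @ A  # rank <= 5 despite 50 ambient dimensions

cov_full = np.cov(X, rowvar=False)
print(np.linalg.matrix_rank(cov_full))  # 5 -- far below 50, so singular

# Off-the-shelf PCA via SVD: keep the top 5 principal components
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:5].T  # 5-D representation

cov_low = np.cov(Z, rowvar=False)
print(np.linalg.matrix_rank(cov_low))  # 5 -- full rank, invertible, safe for a GMM
```

As the quote notes, the projection here is fit independently of any downstream GMM, which is exactly the lack of joint optimization the authors point out.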
“…As discussed in [29,30], high dimensionality causes problems such as long running times, high space complexity, and severe over-fitting. Through appropriate application of dimensionality reduction techniques, it is possible to project a set of high-dimensional vector samples into a much lower dimensionality while preserving the relevant global structure of the data [31].…”
Section: Proposed System
Confidence: 99%
“…LDA is a fast-to-train approach for dimensionality reduction, which helps reduce computational complexity [31]. According to [30,31], there are two types of LDA technique: class-dependent and class-independent. In the class-dependent type, a separate lower-dimensional space is calculated for each class, and each class projects its data onto its own space.…”
Section: Proposed System
Confidence: 99%
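The class-independent variant mentioned above computes a single shared projection by maximizing between-class scatter relative to within-class scatter. A minimal numpy sketch (the function name, signature, and synthetic data are illustrative, not taken from the tutorial):

```python
import numpy as np

def lda_class_independent(X, y, n_components=1):
    """Class-independent LDA: one shared projection for all classes."""
    classes = np.unique(y)
    overall_mean = X.mean(axis=0)
    n_features = X.shape[1]
    Sw = np.zeros((n_features, n_features))  # within-class scatter
    Sb = np.zeros((n_features, n_features))  # between-class scatter
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        diff = (mc - overall_mean).reshape(-1, 1)
        Sb += len(Xc) * (diff @ diff.T)
    # Solve the generalized eigenproblem Sw^-1 Sb w = lambda w and keep
    # the eigenvectors with the largest eigenvalues
    eigvals, eigvecs = np.linalg.eig(np.linalg.pinv(Sw) @ Sb)
    order = np.argsort(eigvals.real)[::-1]
    W = eigvecs.real[:, order[:n_components]]
    return X @ W  # projected data

# Two well-separated 2-D classes projected onto one discriminant axis
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(5, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
Z = lda_class_independent(X, y, n_components=1)
print(Z.shape)  # (100, 1)
```

The class-dependent variant described in the quote would instead compute a per-class within-class scatter and a separate projection for each class; the shared-projection form above is the one more commonly used in practice.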