1996
DOI: 10.1109/34.531802

Using discriminant eigenfeatures for image retrieval

Abstract: This paper describes the automatic selection of features from an image training set using the theories of multi-dimensional linear discriminant analysis and the associated optimal linear projection. We demonstrate the effectiveness of these Most Discriminating Features for view-based class retrieval from a large database of widely varying real-world objects presented as "well-framed" views, and compare it with that of principal component analysis.

Cited by 1,436 publications (655 citation statements)
References 16 publications
“…However, because of the small number of data sets compared with the dimensionality of the images, an FDA analysis directly on the original data would result in a singular within-class scatter matrix, which cannot be inverted as required in equation (3). Therefore, the FDA is applied to the PC scores resulting from projecting the original data onto the PCs of the PCA subspace used for the analysis (Swets and Weng, 1996;Markiewicz et al, 2009), rather than the original data.…”
Section: Fisher Discriminant Analysis
confidence: 99%
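The two-stage procedure quoted above is straightforward to sketch. Below is a minimal, illustrative Python example (not the cited authors' code) of fitting LDA on PCA scores with scikit-learn; the toy data, component counts, and class layout are assumptions made purely for the example.

```python
# Minimal sketch of the two-stage PCA+LDA approach described above: project
# high-dimensional images onto a PCA subspace first, so the within-class scatter
# seen by LDA is no longer singular, then run LDA on the PC scores.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 4096))   # 120 images, 64x64 pixels flattened (toy data)
y = np.repeat(np.arange(6), 20)    # 6 object classes, 20 views each

n_classes = len(np.unique(y))
pca_lda = make_pipeline(
    PCA(n_components=60),                                     # keep fewer PCs than training samples
    LinearDiscriminantAnalysis(n_components=n_classes - 1),   # at most C-1 discriminant axes
)
features = pca_lda.fit_transform(X, y)   # discriminant feature scores
print(features.shape)                    # (120, 5)
```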
“…In particular, if S_B has a nontrivial projection along these directions, LDA considers the corresponding classes to be perfectly separable. Techniques proposed to solve this classical problem include the perturbation method [31], two-stage PCA+LDA [26], and the null space methods pioneered by Chen et al [7]. The latter have dominated research in recent years.…”
Section: Seeking Optimal Discriminant Subspace With Kernel Trick
confidence: 99%
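Of the three remedies listed in the snippet, the perturbation method is the simplest to illustrate: add a small ridge to the within-class scatter so it becomes invertible. The NumPy sketch below is a hedged illustration of that idea only; the function name, the eps value, and the data shapes are arbitrary choices for the example, not taken from any of the cited papers.

```python
# Sketch of the "perturbation" fix: add a small multiple of the identity to the
# (possibly singular) within-class scatter S_w so it can be inverted, then take the
# leading eigenvectors of S_w^{-1} S_b as discriminant directions.
import numpy as np

def regularized_lda(X, y, n_components, eps=1e-3):
    classes = np.unique(y)
    mean_total = X.mean(axis=0)
    d = X.shape[1]
    S_w = np.zeros((d, d))
    S_b = np.zeros((d, d))
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        S_w += (Xc - mc).T @ (Xc - mc)              # within-class scatter
        diff = (mc - mean_total)[:, None]
        S_b += Xc.shape[0] * (diff @ diff.T)        # between-class scatter
    S_w_reg = S_w + eps * np.eye(d)                 # perturbation makes S_w invertible
    evals, evecs = np.linalg.eig(np.linalg.solve(S_w_reg, S_b))
    order = np.argsort(evals.real)[::-1]
    return evecs.real[:, order[:n_components]]      # columns = discriminant directions
```

The choice of eps trades invertibility against distortion: a larger ridge stabilizes the inverse but increasingly biases the directions associated with small within-class variance, which is exactly where the null-space methods mentioned above look instead.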
“…Let us assume that A and B represent the null spaces of S_b and S_w respectively, while A′ = R^J − A and B′ = R^J − B denote the orthogonal complements of A and B. Traditional approaches attempt to solve the problem by utilizing an intermediate PCA step to remove A and B. LDA is then performed in the lower dimensional PCA subspace, as it was done for example in [3,29]. Nevertheless, it should be noted at this point that the maximum of the ratio in Eq. 6 can be reached only when Ψ^T S_w Ψ = 0 and Ψ^T S_b Ψ ≠ 0.…”
Section: Where Are the Optimal Discriminant Features?
confidence: 99%
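The condition quoted above (within-class scatter annihilated while between-class scatter survives) is what null-space LDA methods exploit. A minimal sketch of that idea, assuming the scatter matrices S_w and S_b have already been formed (e.g. as in the previous sketch); the helper name and component count are illustrative assumptions, not the cited authors' implementation.

```python
# Restrict the search to the null space of the within-class scatter S_w
# (where Psi^T S_w Psi = 0) and pick the directions that maximize the
# between-class scatter within that subspace.
import numpy as np
from scipy.linalg import null_space

def null_space_lda(S_w, S_b, n_components):
    N = null_space(S_w)                    # orthonormal basis of null(S_w); may be empty
    if N.shape[1] == 0:
        raise ValueError("S_w has no null space; null-space LDA does not apply")
    S_b_null = N.T @ S_b @ N               # between-class scatter restricted to null(S_w)
    evals, evecs = np.linalg.eigh(S_b_null)
    top = evecs[:, np.argsort(evals)[::-1][:n_components]]
    return N @ top                         # discriminant directions in the original space
```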
“…To address the problem, one popular approach is to introduce an intermediate Principal Component Analysis (PCA) step to remove the null spaces of the two scatter matrices. LDA is then performed in the lower dimensional PCA subspace, as it was done for example in [3,29]. However, it has been shown that the discarded null spaces may contain significant discriminatory information [10].…”
Section: Introduction
confidence: 99%