1999
DOI: 10.1109/72.750575
Face recognition using the nearest feature line method

Abstract: In this paper, we propose a novel classification method, called the nearest feature line (NFL), for face recognition. Any two feature points of the same class (person) are generalized by the feature line (FL) passing through the two points. The derived FL can capture more variations of face images than the original points and thus expands the capacity of the available database. The classification is based on the nearest distance from the query feature point to each FL. With a combined face database, the NFL er…
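The NFL rule described in the abstract (classify by the smallest distance from the query feature point to any feature line through two same-class prototypes) can be sketched as follows; the feature dimensions and prototype sets are illustrative, not from the paper:

```python
import numpy as np

def nfl_distance(query, x1, x2):
    """Distance from a query feature point to the feature line (FL)
    passing through x1 and x2, two prototypes of the same class."""
    direction = x2 - x1
    # Position parameter of the orthogonal projection of `query` onto the FL.
    mu = np.dot(query - x1, direction) / np.dot(direction, direction)
    projection = x1 + mu * direction  # nearest point on the FL
    return np.linalg.norm(query - projection)

def nfl_classify(query, prototypes_by_class):
    """Assign `query` to the class whose nearest feature line is closest.
    `prototypes_by_class` maps a class label to an array of feature vectors."""
    best_label, best_dist = None, np.inf
    for label, pts in prototypes_by_class.items():
        n = len(pts)
        for i in range(n):
            for j in range(i + 1, n):
                d = nfl_distance(query, pts[i], pts[j])
                if d < best_dist:
                    best_label, best_dist = label, d
    return best_label
```

Note how a query lying between two prototypes of a class can be close to their feature line even when it is far from both original points; this is the "expanded capacity" the abstract refers to.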

Cited by 493 publications (261 citation statements); references 19 publications.
“…For example, in the SMOTE algorithm [1], virtual samples of the minority class are generated so that the number of minority training samples is increased. Here the virtual samples are generated by interpolating between each minority-class point and its k nearest minority-class neighbors, a scheme similar to the interpolation used in NFL [8] and NNL [13]. In fact, the NFL and NNL algorithms implicitly use virtual samples, since they use a virtual point instead of a real data point to compute the distance between a data point and a class.…”
Section: Virtual Samples
confidence: 99%
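The SMOTE-style interpolation described in this citation statement can be sketched as below; the parameter names and defaults are illustrative assumptions, not from the cited work:

```python
import numpy as np

def smote_virtual_samples(minority, k=2, n_new=4, rng=None):
    """Generate virtual minority-class samples by interpolating between a
    randomly chosen minority point and one of its k nearest minority-class
    neighbors (a minimal sketch of the SMOTE interpolation scheme)."""
    rng = np.random.default_rng(rng)
    minority = np.asarray(minority, dtype=float)
    n = len(minority)
    virtual = []
    for _ in range(n_new):
        i = rng.integers(n)
        # Distances from point i to all minority points (itself included).
        d = np.linalg.norm(minority - minority[i], axis=1)
        neighbors = np.argsort(d)[1:k + 1]  # skip the point itself
        j = rng.choice(neighbors)
        gap = rng.random()  # interpolation factor in [0, 1)
        virtual.append(minority[i] + gap * (minority[j] - minority[i]))
    return np.array(virtual)
```

Each virtual sample is a convex combination of two real minority points, which is exactly the kind of interpolated "virtual point" that NFL's feature lines also rely on.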
“…The NFL (nearest feature line) method was originally proposed for face recognition [8]. In NFL, a feature line is defined as the line passing through two points from the same class.…”
Section: NNL
confidence: 99%
“…Classical but still popular subspace-based classifiers include NN (nearest neighbor), NFL (nearest feature line, proposed by Stan Z. Li et al. [35]), and NS (nearest subspace).…”
Section: NN, NFL and NS
confidence: 99%
“…Different from the nearest neighbor (NN) and nearest subspace (NS) classifiers [19][36][37][38][51][58], which forbid representing the query sample across classes, the recently developed l1-regularized sparse representation [10] and l2-regularized collaborative representation [33] represent the query image by the training samples from all classes, which can effectively overcome the small-sample-size or overfitting problems of NN and NS. Let ‖·‖p denote the lp-norm; p=1 for SRC in [10], while p=2 for CRC in [33].…”
Section: Sparse Representation or Collaborative Representation Based Classification
confidence: 99%
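The l2-regularized (CRC-style) case admits a closed-form ridge solution, which makes it easy to sketch; the regularization weight and data below are illustrative assumptions, not values from the cited papers:

```python
import numpy as np

def crc_classify(query, train, labels, lam=0.01):
    """Collaborative-representation sketch: code the query over ALL
    training samples with an l2 penalty, then assign the class whose
    samples yield the smallest reconstruction residual."""
    A = np.asarray(train, dtype=float).T          # columns are training samples
    y = np.asarray(query, dtype=float)
    # Ridge-regression solution: alpha = (A^T A + lam I)^{-1} A^T y
    alpha = np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ y)
    labels = np.asarray(labels)
    best, best_res = None, np.inf
    for c in np.unique(labels):
        mask = labels == c
        # Residual using only the coefficients of class c's samples.
        res = np.linalg.norm(y - A[:, mask] @ alpha[mask])
        if res < best_res:
            best, best_res = c, res
    return best
```

An l1 penalty (SRC) replaces the closed-form solve with an iterative sparse solver, but the class-wise residual step is the same.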
“…Moreover, to better exploit the prior knowledge that face images of the same subject lie in a subspace, nearest subspace (NS) classifiers [19][36][37][38][51][58] were also developed; they are usually superior to the popular NN classifier. Recently an interesting classifier, sparse representation based classification (SRC), was proposed by Wright et al. [10] for robust FR.…”
Section: Introduction
confidence: 99%