2006 2nd International Conference on Information & Communication Technologies
DOI: 10.1109/ictta.2006.1684683
Automatic Face recognition using neural network-PCA

Cited by 8 publications (4 citation statements)
References 7 publications

“…Additionally, we also plot the feature distributions to better visualize the results via principal component analysis (PCA). PCA is a widely used approach in which a linear transformation is designed to compress the information carried by the features into relatively few dimensions [55][56][57][58][59]. Figures 4 and 5 show the distribution of the first and second principal components among all data sets for acoustic signals and medical records, respectively.…”
Section: B) Experimental Results
confidence: 99%
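The two-component visualization described in this statement can be sketched in a few lines of Python. This is a minimal, hypothetical example using scikit-learn and matplotlib, not the cited paper's code; the random feature matrix, the class labels, and the choice of two components are assumptions.

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.decomposition import PCA

# Hypothetical feature matrix: 200 samples x 64 features, two classes.
rng = np.random.default_rng(0)
features = rng.normal(size=(200, 64))
labels = rng.integers(0, 2, size=200)

# Linearly project the features onto the first two principal components.
pca = PCA(n_components=2)
projected = pca.fit_transform(features)

# Scatter plot of the first vs. second principal component per class.
for cls in np.unique(labels):
    mask = labels == cls
    plt.scatter(projected[mask, 0], projected[mask, 1], label=f"class {cls}")
plt.xlabel("first principal component")
plt.ylabel("second principal component")
plt.legend()
plt.show()
```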
“…The basic approach is to compute the eigenvectors of the covariance matrix of the original data, and to approximate the data by a linear combination of the leading eigenvectors [17]. Using the PCA procedure, a test vector can be identified by first projecting the image onto the eigenvector space to obtain the corresponding set of weights, and then comparing these weights with the sets of weights for the vectors in the training set [18,19]. The problem of low-dimensional feature representation can be stated as follows: let X = (x_1, x_2, …, x_i, …, x_N) represent the n × N data matrix, where each x_i is a vector of dimension n, concatenated from the face and online-signature feature vectors.…”
Section: Principal Component Analysis
confidence: 99%
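The eigen-decomposition and weight-comparison steps quoted above can be illustrated with NumPy. The sketch below is a minimal version of the standard eigenface-style procedure, not the cited authors' implementation; the data matrix X, the number of retained eigenvectors k, and the nearest-neighbour matching rule are assumptions.

```python
import numpy as np

def fit_pca(X, k):
    """X: n x N data matrix (one n-dimensional sample per column).
    Returns the mean vector and the k leading eigenvectors (n x k)."""
    mean = X.mean(axis=1, keepdims=True)
    centered = X - mean
    # Eigenvectors of the covariance matrix; eigh sorts eigenvalues ascending.
    cov = centered @ centered.T / X.shape[1]
    eigvals, eigvecs = np.linalg.eigh(cov)
    return mean, eigvecs[:, -k:]  # keep the k leading eigenvectors

def project(x, mean, eigvecs):
    """Project a sample onto the eigenvector space to get its weights."""
    return eigvecs.T @ (x - mean.ravel())

# Hypothetical data: N = 50 training samples of dimension n = 100.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 50))
mean, eigvecs = fit_pca(X, k=10)

# Identify a test vector by comparing its weights with the training weights.
train_weights = eigvecs.T @ (X - mean)        # k x N weight matrix
test = X[:, 7] + 0.01 * rng.normal(size=100)  # noisy copy of sample 7
w = project(test, mean, eigvecs)
match = np.argmin(np.linalg.norm(train_weights - w[:, None], axis=0))
print("closest training sample:", match)      # expected: 7
```

For large face images the n × n covariance matrix becomes expensive; practical eigenface implementations typically diagonalize the smaller N × N Gram matrix instead, which this sketch omits for brevity.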
“…In [2], PCA is described as a useful analytical technique with established applications in fields such as face recognition and minimisation of image size, and as a prevalent technique for finding patterns in high-dimensional data. It is a way of recognising patterns in data, and it expresses the data in a way that highlights their similarities and differences.…”
Section: Principal Component Analysis
confidence: 99%
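The image-size-minimisation use mentioned in this statement amounts to storing each image as a few principal-component weights instead of raw pixels. The following hypothetical Python sketch shows the idea; the image dimensions, the component count, and the random data are assumptions, not figures from the cited work.

```python
import numpy as np
from sklearn.decomposition import PCA

# Hypothetical batch of 100 grayscale "images", 32x32 pixels each,
# flattened into 1024-dimensional row vectors.
rng = np.random.default_rng(0)
images = rng.normal(size=(100, 32 * 32))

# Keep only 20 principal components: each image is then stored as
# 20 weights instead of 1024 pixel values.
pca = PCA(n_components=20)
weights = pca.fit_transform(images)          # compressed representation
reconstructed = pca.inverse_transform(weights)

err = np.mean((images - reconstructed) ** 2)
print(f"compression: 1024 -> 20 values/image, MSE = {err:.4f}")
```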