2017
DOI: 10.4018/ijaci.2017100104
PCA as Dimensionality Reduction for Large-Scale Image Retrieval Systems

Abstract: Dimensionality reduction plays an important role in the performance of large-scale image retrieval systems across applications. In this paper, we explore Principal Component Analysis (PCA) as a dimensionality reduction method. First, Scale Invariant Feature Transform (SIFT) and Speeded Up Robust Features (SURF) descriptors are extracted as image features. Second, PCA is applied to reduce the dimensions of the SIFT and SURF feature descriptors. By comparing multiple sets of experimental data …
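The abstract's pipeline (extract high-dimensional descriptors, then compress them with PCA) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the descriptors here are random stand-ins for real SIFT output (128 dimensions per keypoint), and the target dimensionality of 64 is chosen only for the example.

```python
import numpy as np

# Hypothetical stand-in for SIFT descriptors: 500 keypoints x 128 dims.
# In practice these would come from a real extractor (e.g. OpenCV's SIFT);
# random data keeps the sketch self-contained.
rng = np.random.default_rng(0)
descriptors = rng.normal(size=(500, 128))

# PCA via SVD on mean-centered data.
mean = descriptors.mean(axis=0)
centered = descriptors - mean
_, _, vt = np.linalg.svd(centered, full_matrices=False)

k = 64                             # target dimensionality (illustrative)
components = vt[:k]                # top-k principal directions (k x 128)
reduced = centered @ components.T  # 500 x 64 compressed descriptors

print(reduced.shape)  # (500, 64)
```

Each descriptor is now half its original size, which shrinks both the index and the per-query distance computations in a retrieval system.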

Cited by 41 publications (33 citation statements)
References 28 publications
“…One technique to reduce the number of attributes is dimensionality reduction [14]. One example algorithm for dimensionality reduction is principal component analysis (PCA) [15], which is an expensive process. Most machine learning algorithms require multiple scans of the dataset with time complexities based on the number of data records, therefore, reducing the number of data records seems to be the way to go for big data analysis.…”
Section: Techniques For Data Reduction
confidence: 99%
“…The next step consisted of comparing the query image features with those of the image database. The comparison was performed with two similarity matching methods: the FLANN-based matcher and the brute-force matcher [7]. We selected these two methods for their efficiency and fast execution, which is important in our case.…”
Section: Research
confidence: 99%
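The brute-force matching step this citation describes amounts to an exhaustive nearest-neighbor search over descriptors. The sketch below shows that idea with plain NumPy (OpenCV's BFMatcher does the same comparison; FLANN instead builds an approximate index to avoid it). All dimensions and data here are illustrative.

```python
import numpy as np

# Illustrative descriptors: 5 from the query image, 200 from a database image.
rng = np.random.default_rng(1)
query_desc = rng.normal(size=(5, 64))
db_desc = rng.normal(size=(200, 64))

# All pairwise L2 distances, shape (5, 200) -- the "brute force" part.
dists = np.linalg.norm(query_desc[:, None, :] - db_desc[None, :, :], axis=2)

# Match each query descriptor to its nearest database descriptor.
matches = dists.argmin(axis=1)
print(matches.shape)  # (5,)
```

The cost is one distance per (query, database) descriptor pair, which is why reducing descriptor dimensionality with PCA directly speeds up this step.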
“…Therefore, we propose a cloud-based platform that groups and integrates image and video processing algorithms, which are exploited and combined to provide an efficient method of image indexation and retrieval. The proposed combination of descriptors is well adapted to dimensionality reduction: selecting the most significant descriptor values (using the PCA method) allowed us to reduce the search time while maintaining precision [7]. As a result, our method is well suited for large-scale image retrieval.…”
Section: Introduction
confidence: 99%
“…Dimensionality reduction is a process of converting data from a high dimensional space to a lower dimensional space with the aim of preserving meaningful information from the original data. Dimensionality reduction can be applied in any field that has high dimensional data (a large number of variables) such as signal processing [1], speech recognition [2,3], neuroinformatics [4,5], bioinformatics [6,7], social media [8,9], telecoms [10], and computer vision [11], for data visualization, data exploration, noise reduction or as a pre-processing step to support classification models. An appropriate dimensionality reduction technique is related to the goodness of preserving the geometry (structure) of the data of interest.…”
Section: Introduction
confidence: 99%
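The last citation's point about "preserving the geometry of the data" can be made concrete: a good projection keeps pairwise distances close to their originals. The sketch below checks one guaranteed property of PCA, that an orthogonal projection never lengthens any pairwise distance. The data and dimensions are illustrative, not taken from the paper.

```python
import numpy as np

# Illustrative high-dimensional data: 100 points in 50 dimensions.
rng = np.random.default_rng(2)
X = rng.normal(size=(100, 50))
Xc = X - X.mean(axis=0)

# Project onto the top-10 principal directions.
_, _, vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ vt[:10].T

def pairwise(a):
    """All pairwise L2 distances between the rows of a."""
    return np.linalg.norm(a[:, None] - a[None, :], axis=2)

d_orig = pairwise(Xc)
d_proj = pairwise(Z)

# Orthogonal projection is non-expansive: projected distances never exceed
# the originals. How *close* they stay measures how well geometry is kept.
print(bool(np.all(d_proj <= d_orig + 1e-9)))  # True
```

Comparing `d_proj` against `d_orig` (e.g. their ratio or correlation) is one simple way to judge whether a chosen target dimensionality preserves enough structure for a given task.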