2015
DOI: 10.1080/02664763.2015.1005063

A comparison of different procedures for principal component analysis in the presence of outliers

Abstract: Principal component analysis (PCA) is a popular technique for dimensionality reduction, but it is affected by the presence of outliers. The outlier sensitivity of classical PCA (CPCA) has motivated the development of new approaches. The effects of replacing outliers with estimates obtained by expectation-maximization (EM) and multiple imputation (MI) were examined on artificial and real data sets. Furthermore, robust PCA based on the minimum covariance determinant (MCD), PCA based on estimates obtained…
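The abstract contrasts classical PCA with robust PCA based on the minimum covariance determinant (MCD). A minimal sketch of that contrast, assuming scikit-learn's MinCovDet as the MCD estimator and a plain eigendecomposition of each covariance matrix; the data and contamination level below are illustrative, not taken from the paper:

```python
import numpy as np
from sklearn.covariance import MinCovDet

rng = np.random.default_rng(0)

# Illustrative 2-D data with a strong principal direction plus 5% gross outliers
n = 200
X = rng.normal(size=(n, 2)) @ np.array([[3.0, 1.0], [0.0, 0.5]])
X[: n // 20] = rng.normal(loc=(10.0, -10.0), scale=0.5, size=(n // 20, 2))

def pca_from_cov(cov):
    """Eigenvalues and eigenvectors (principal axes) of a covariance matrix."""
    vals, vecs = np.linalg.eigh(cov)
    order = np.argsort(vals)[::-1]  # largest variance first
    return vals[order], vecs[:, order]

# Classical PCA: eigendecomposition of the ordinary sample covariance
cpca_vals, cpca_vecs = pca_from_cov(np.cov(X, rowvar=False))

# MCD-based robust PCA: same eigendecomposition, but of the MCD covariance estimate
mcd = MinCovDet(random_state=0).fit(X)
rpca_vals, rpca_vecs = pca_from_cov(mcd.covariance_)

print("classical first PC:", cpca_vecs[:, 0])
print("MCD-based first PC:", rpca_vecs[:, 0])
```

Scores would then be obtained by projecting the (MCD-centered) data onto the leading eigenvectors; the point of the sketch is only that the two covariance estimates, and hence the principal axes, can differ sharply once outliers are present.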

Cited by 12 publications (13 citation statements)
References 16 publications
“…When data were incomplete (10% and 20% of missing values), but without contamination, the SVD88 algorithm always obtained the smallest prediction errors. This behavior is as expected, since without outliers the procedures based on least squares generally perform well [12]. This situation changes completely when there is contamination of the data with 5, 10 and 15% of outliers.…”
Section: Results (supporting)
confidence: 81%
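The claim in the statement above, that least-squares (SVD-based) low-rank fits work well on clean data but degrade under contamination, can be illustrated with a small NumPy sketch; the matrix size, rank, and 5% contamination level are arbitrary choices, not the cited study's settings:

```python
import numpy as np

rng = np.random.default_rng(1)

# Rank-2 signal plus small noise
n, p, r = 100, 10, 2
X_clean = rng.normal(size=(n, r)) @ rng.normal(size=(r, p)) + 0.05 * rng.normal(size=(n, p))

# Contaminate 5% of the cells with gross outliers
X_cont = X_clean.copy()
cells = rng.choice(n * p, size=n * p // 20, replace=False)
X_cont.flat[cells] += rng.normal(loc=25.0, scale=5.0, size=cells.size)

def rank_k_fit(X, k):
    """Best least-squares rank-k approximation of X via the SVD."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U[:, :k] * s[:k]) @ Vt[:k]

# Measure how well each fit recovers the clean signal
err_clean = np.linalg.norm(rank_k_fit(X_clean, r) - X_clean)
err_cont = np.linalg.norm(rank_k_fit(X_cont, r) - X_clean)
print(f"fit on clean data       : {err_clean:.2f}")
print(f"fit on contaminated data: {err_cont:.2f}")
```

Because the SVD minimizes a squared-error criterion, the fit on the contaminated matrix is pulled toward the outlying cells and its error against the clean signal grows accordingly; robust SVD variants replace this criterion with a resistant one.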
“…The presence of outliers in a data set can reduce the effectiveness of least squares techniques [12], which in this case is the standard SVD. To avoid this behavior, robust lower-rank approximations or equivalent robust SVD could be used.…”
Section: Robust Singular Value Decomposition (mentioning)
confidence: 99%
“…In most studies, outliers are removed from the data and the data are remodeled, on the grounds that outliers may affect the parameter estimates of the model, change the selected model, and influence predictions based on the model. However, this shrinks the data set and also deletes the values of the independent variables in the outlying observation, which carry other important information [1]. Therefore, rather than deleting the entire observation that contains an outlier, this study aims to solve the outlier problem by deleting only the outlying value and imputing a replacement with the multiple imputation method.…”
Section: Introduction (unclassified)
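The strategy described in the statement above, deleting only the outlying value rather than the whole observation and then imputing it, can be sketched with scikit-learn's IterativeImputer. This is a single imputation run for illustration; a multiple-imputation workflow would repeat the fit with sample_posterior=True and pool the results. The z-score flagging rule and the data below are assumptions, not the cited study's procedure.

```python
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401, enables IterativeImputer
from sklearn.impute import IterativeImputer

rng = np.random.default_rng(2)

# Illustrative data: two correlated variables, one cell turned into a gross outlier
X = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.8], [0.8, 1.0]], size=50)
X[3, 1] = 12.0

# Flag outlying cells (simple z-score rule here, purely for illustration)
z = (X - X.mean(axis=0)) / X.std(axis=0)
X_masked = X.copy()
X_masked[np.abs(z) > 3] = np.nan  # delete only the outlying value, keep the row

# Impute the deleted value from the remaining information in that observation
imputer = IterativeImputer(sample_posterior=True, random_state=0)
X_imputed = imputer.fit_transform(X_masked)
print("original cell:", X[3, 1], "-> imputed cell:", round(X_imputed[3, 1], 3))
```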
“…Among other proposals for RPCA are the orthogonal PCA method developed by Maronna [3] and the spherical PCA method developed by Locantore et al [4]. Alkan et al [5] examined whether missing-value imputation methods can be used as an alternative approach to RPCA. Alkan [6] also adapted the minimum covariance determinant (MCD) method using the jackknife resampling approach and examined the impact of the resulting changes on MCD-based RPCA.…”
Section: Introduction (mentioning)
confidence: 99%