2020
DOI: 10.1007/s40300-020-00185-3
The main contributions of robust statistics to statistical science and a new challenge

Abstract: In the first part of the paper, we trace the development of robust statistics through its main contributions which have penetrated mainstream statistics. The goal of this paper is neither to provide a full overview of robust statistics, nor to make a complete list of its tools and methods, but to focus on basic concepts that have become standard ideas and tools in modern statistics. In the second part we focus on the particular challenge provided by high-dimensional statistics and discuss how robustness ideas …

Cited by 19 publications (11 citation statements)
References 50 publications
“…On the other hand, it would also be important to explore whether the circular coordinates with generalized penalty can be helpful in model selection. In dimension reduction, sparsity can be studied further with a view of robust statistics using a geometric induced loss function [39,27]. This line of research is motivated by the statistical literature on generalized penalty functions [19].…”
Section: Discussion (citation type: mentioning, confidence: 99%)
“…Hence, (72) can be viewed as fitting a nonlinear regression on covariances using a particular robust loss function ρ [40]. Again, the function H defines an M-estimation problem [29,31], and its use is justified by the hope that model errors s ij − σ ij (γ) are treated as outliers by utilizing the robust loss function ρ [19,47]. Hence, the model errors should not impact the estimate of the parameter vector γ.…”
Section: Robust Moment Estimation (RME) (citation type: mentioning, confidence: 99%)
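The quoted passage describes M-estimation: fitting parameters with a robust loss function ρ so that gross model errors act like outliers and are downweighted rather than dominating the fit. A minimal sketch of this idea, assuming a Huber loss and synthetic data (this is an illustration of the general M-estimation principle, not the RME procedure of the citing paper):

```python
import numpy as np

def huber_weight(r, c=1.345):
    # IRLS weight psi(r)/r for the Huber loss:
    # 1 for |r| <= c (quadratic regime), c/|r| for |r| > c (linear regime)
    return np.minimum(1.0, c / np.maximum(np.abs(r), 1e-12))

def m_estimate_location(x, c=1.345, tol=1e-8, max_iter=100):
    # Huber M-estimate of location via iteratively reweighted least squares,
    # with scale fixed in advance by the normalized MAD
    scale = 1.4826 * np.median(np.abs(x - np.median(x)))
    mu = np.median(x)  # robust starting value
    for _ in range(max_iter):
        w = huber_weight((x - mu) / scale, c)
        mu_new = np.sum(w * x) / np.sum(w)
        if abs(mu_new - mu) < tol:
            break
        mu = mu_new
    return mu

rng = np.random.default_rng(0)
clean = rng.normal(0.0, 1.0, size=95)
outliers = np.full(5, 50.0)              # 5% gross contamination
x = np.concatenate([clean, outliers])
print(np.mean(x))                # pulled upward by the outliers
print(m_estimate_location(x))    # stays near the bulk of the data
```

The same weighting idea carries over to regression-type problems such as the covariance fit in (72): residuals that are large relative to scale receive weight proportional to 1/|r| and so have bounded influence on the estimate.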
“…There is literature investigating the robustness of SEM estimation to non-normal distributions [13][14][15]. This kind of robustness that relies on contaminated distributions [16][17][18][19] is not the target of this article. Instead, we rely on the concept of model robustness; that is, how robust is an SEM estimation method in the presence of local model misspecifications.…”
Section: Introduction (citation type: mentioning, confidence: 99%)
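The "contaminated distributions" this statement refers to are mixtures of the form (1 − ε)F + εG, as in Tukey's gross-error model. A small numerical illustration, with assumed parameters, of how even modest contamination shifts a non-robust summary (the mean) while barely moving a robust one (the median):

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_contaminated(n, eps=0.1, shift=10.0):
    # Tukey-style eps-contamination: (1 - eps) * N(0,1) + eps * N(shift, 1)
    x = rng.normal(0.0, 1.0, size=n)
    mask = rng.random(n) < eps          # each point contaminated w.p. eps
    x[mask] = rng.normal(shift, 1.0, size=mask.sum())
    return x

x = sample_contaminated(10_000)
print(f"mean:   {np.mean(x):.2f}")    # shifted by roughly eps * shift
print(f"median: {np.median(x):.2f}")  # remains close to 0
```

With ε = 0.1 and a shift of 10, the mean moves by about 1 unit while the median changes only slightly, which is the distributional-robustness setting the citing paper contrasts with its own notion of model robustness.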
“…Here, we assume that the measurement is either observed or missing for all variables and do not consider the case where the measurements for one index are observed for some variables but missing for others. Two main characteristics are observed for the existing clustering methods: first, for imbalanced multivariate functional data, no specific distance measurements are defined and the model-based clustering methods, except for that of Misumi et al (2019), cannot be applied directly; second, the above methods rarely consider outliers, though real data are often corrupted by noises and outliers, see Ronchetti (2021) for a recent review. Although some improvements have been achieved in K-means clustering (Wang & Su 2011) and hierarchical clustering (Gagolewski et al 2016) for multivariate data to reduce the influence of outliers, robust (i.e., outlier-resistant) clustering algorithms for sparse multivariate functional data are lacking.…”
Section: Introduction (citation type: mentioning, confidence: 99%)