2005
DOI: 10.1016/j.jmva.2003.12.005
Simple consistent cluster methods based on redescending M-estimators with an application to edge identification in images

Abstract: We use the local maxima of a redescending M-estimator to identify clusters, a method proposed already by Morgenthaler (in: H.D. Lawrence, S. Arthur (Eds.), Robust Regression, Dekker, New York, 1990, pp. 105-128) for finding regression clusters. We work out the method not only for classical regression but also for orthogonal regression and multivariate location, and show that all three approaches are special cases of a general approach which also includes other cluster problems. For the general case we show consi…

Cited by 15 publications (16 citation statements). References 19 publications.
“…Markatou (2000) and Shen, Yang, and Wang (2004) proposed using a weight factor for each data point to robustify the estimation procedure for mixture regression models. There are also some related robust methods for linear clustering; see, for example, Hennig (2002, 2003), Mueller and Garlipp (2005), García-Escudero, Gordaliza, San Martín, Van Aelst, and Zamar (2009), and García-Escudero, Gordaliza, Mayo-Iscara, and San Martín (2010).…”
Section: Introduction
Confidence: 99%
“…3D-points were then projected onto the horizontal plane, resulting in a two-dimensional point cloud with xy-coordinates. In a first step, a circle was fitted to the 2D-point cloud using the circular cluster method of Müller and Garlipp [39], which is based on edge identification by redescending M-estimators (estimators whose score functions increase near the origin but redescend toward 0 for large residuals, so distant outliers have vanishing influence). The method is implemented in the R-package edci [40].…”
Section: Measurement of DBH
Confidence: 99%
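The circle fit described above maximizes a redescending M-estimator criterion over the circle parameters. A minimal sketch of such an objective, using a Gaussian kernel of the radial residuals as an illustrative choice of redescending score (not the exact criterion implemented in edci), might look like:

```python
import numpy as np

def circle_score(points, cx, cy, r, h=0.05):
    """Redescending-M objective for a circle: each point contributes a
    Gaussian kernel weight of its radial residual |dist to center - r|.
    Local maxima over (cx, cy, r) mark circular clusters; points far
    from the circle contribute essentially nothing, which is what makes
    the criterion robust to outliers. Bandwidth h is a tuning choice."""
    resid = np.hypot(points[:, 0] - cx, points[:, 1] - cy) - r
    return float(np.exp(-0.5 * (resid / h) ** 2).sum())
```

Because the kernel redescends, a circle passing through a dense arc of points stands out as a sharp local maximum, while points belonging to other structures barely move the objective.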
“…We used a dense grid of 25 × 25 starting points (i.e., possible circle centers) with a distance of 3.2 cm between two neighboring starting points, and 5 starting radii per starting point, providing in total 3125 starting values (i.e., combinations of radius and center point) for the optimization algorithm. The initial diameter estimate (obtained from the Müller and Garlipp algorithm [39]) is henceforward referred to as DBH 1 . In a second step, all data points at a distance of 5 cm or more from the circular arc associated with DBH 1 were removed.…”
Section: Measurement of DBH
Confidence: 99%
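The grid-of-starting-values strategy and the subsequent 5 cm trimming step can be sketched as follows. This is a hypothetical Python/scipy re-implementation, not the edci code: `fit_circle_from_grid` and the Gaussian objective are illustrative names, the grid here is far coarser than the 25 × 25 × 5 grid used in the study, and coordinates are assumed to be in meters so that 5 cm is `trim=0.05`.

```python
import numpy as np
from scipy.optimize import minimize

def fit_circle_from_grid(points, centers_x, centers_y, radii,
                         h=0.02, trim=0.05):
    """Run a local optimizer from every (center x, center y, radius)
    combination, keep the best local maximum of the redescending
    objective, then drop points lying at least `trim` from the fitted
    arc, mirroring the two-step DBH pipeline described above."""
    def neg_score(theta):
        cx, cy, r = theta
        resid = np.hypot(points[:, 0] - cx, points[:, 1] - cy) - r
        return -np.exp(-0.5 * (resid / h) ** 2).sum()

    best = None
    for cx in centers_x:          # grid of candidate circle centers
        for cy in centers_y:
            for r in radii:       # several starting radii per center
                res = minimize(neg_score, x0=[cx, cy, r],
                               method="Nelder-Mead")
                if best is None or res.fun < best.fun:
                    best = res

    cx, cy, r = best.x
    dist_to_arc = np.abs(np.hypot(points[:, 0] - cx,
                                  points[:, 1] - cy) - r)
    return (cx, cy, r), points[dist_to_arc < trim]
```

Running every start through a local optimizer and keeping the deepest optimum is what makes the dense grid worthwhile: each start can only find a nearby local maximum of the redescending criterion, so the grid must be fine enough that some start lands in the basin of the true circle.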
“…Yao, Wei, & Yu () and Song, Yao, & Xing () considered robust mixture regression using a t‐distribution and a Laplace distribution, respectively. There has also been extensive work in linear clustering; see, for example, Hennig (, ), Mueller & Garlipp (), and García‐Escudero et al. (, ).…”
Section: Introduction
Confidence: 99%