2006
DOI: 10.21236/ada478351
Stratification Learning: Detecting Mixed Density and Dimensionality in High Dimensional Point Clouds (PREPRINT)

Abstract: Approved for public release; distribution is unlimited.

Cited by 32 publications (38 citation statements); references 12 publications.
“…Some of them emphasize modeling of the underlying flats and then use the models to infer the clusters (see e.g., Independent Component Analysis (Hyvärinen and Oja 2000), Subspace Separation (Kanatani 2001, 2002), Generalized Principal Component Analysis (GPCA) (Vidal et al. 2005; Ma et al. 2008)). A few others address the clustering part and then use its output to estimate the parameters of the underlying flats (see e.g., Multi-way Clustering algorithms (Agarwal et al. 2005, 2006; Govindu 2005; Shashua et al. 2006), Tensor Voting (Medioni et al. 2000), k-Manifolds (Souvenir and Pless 2005), Grassmann Clustering (Gruber and Theis 2006), Poisson Mixture Model (Haro et al. 2006)). There are also algorithms that iterate between the two components of data clustering and subspace modeling (see e.g., Mixtures of PPCA (MoPPCA) (Tipping and Bishop 1999), K-Subspaces (Ho et al. 2003) / k-Planes (Bradley and Mangasarian 2000; Tseng 1999)).…”
Section: Introduction
confidence: 99%
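The last, iterative family the excerpt names (K-Subspaces / k-planes) alternates between assigning each point to its nearest flat and refitting each flat by PCA on its assigned points. The following is a minimal illustrative sketch of that scheme, not any of the cited implementations; all function names and defaults here are our own:

```python
import numpy as np

def _fit_once(X, n_clusters, dim, n_iter, rng):
    """One K-Subspaces run from a random assignment."""
    labels = rng.integers(n_clusters, size=len(X))
    for _ in range(n_iter):
        means, bases = [], []
        for c in range(n_clusters):
            Xc = X[labels == c]
            if len(Xc) <= dim:                     # degenerate cluster: re-seed it
                Xc = X[rng.choice(len(X), dim + 1, replace=False)]
            mu = Xc.mean(axis=0)
            # top-`dim` principal directions span the fitted flat
            _, _, Vt = np.linalg.svd(Xc - mu, full_matrices=False)
            means.append(mu)
            bases.append(Vt[:dim])
        # residual of x from flat c: ||(x - mu_c) - projection onto basis_c||
        resid = np.stack([
            np.linalg.norm((X - means[c]) - ((X - means[c]) @ bases[c].T) @ bases[c], axis=1)
            for c in range(n_clusters)
        ])
        new_labels = resid.argmin(axis=0)
        if np.array_equal(new_labels, labels):     # assignments stable: converged
            break
        labels = new_labels
    return labels, resid.min(axis=0).sum()

def k_subspaces(X, n_clusters=2, dim=1, n_iter=50, n_init=5, seed=0):
    """Keep the best of `n_init` random restarts (as k-means does),
    scored by the total point-to-flat residual."""
    rng = np.random.default_rng(seed)
    best, best_cost = None, np.inf
    for _ in range(n_init):
        labels, cost = _fit_once(X, n_clusters, dim, n_iter, rng)
        if cost < best_cost:
            best, best_cost = labels, cost
    return best
```

Like k-means, each alternation cannot increase the total residual, but the method only reaches a local minimum, hence the random restarts.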
“…First, if T → ∞, k → ∞, and k/T → 0, then we can approximate the binomial process by a Poisson process. Second, the density ρ(x_t) is considered constant inside the sphere, a valid assumption for small R. We should mention that in [15] we compared the original framework (with no regularization or noise modeling as developed here) with a two-step approach, where we first estimate the local dimensionality per point using the original Levina-Bickel approach and then cluster following the information bottleneck approach [34]. This has been shown to be not only less elegant and mathematically founded than the approach presented here, but much less robust, even when compared to the non-regularized and noise-transparent formulation.…”
Section: Local Intrinsic Dimension Estimation
confidence: 99%
“…In [15], we proposed to study a stratification by extending Levina and Bickel's technique. Instead of modeling each point and its local ball of radius R as a Poisson process and computing the maximum likelihood (ML) for each ball separately, all possible balls are considered at the same time in the ML function.…”
Section: Translation Poisson Mixture Model (TPMM)
confidence: 99%
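The pooling idea can be illustrated with a toy mixture: under the locally uniform Poisson model, the log-distance ratios log(T_k/T_j) behave approximately like Exponential(m) variables in the intrinsic dimension m, so the per-point sum S_i is a sufficient statistic and a standard EM fits all neighborhoods in one joint likelihood. This is only a schematic stand-in for the TPMM, written by us for illustration; it omits the translated-Poisson noise model and the regularization the paper develops:

```python
import numpy as np

def knn_log_ratios(X, k=15):
    """S_i = sum_{j<k} log(T_k(x_i)/T_j(x_i)), the per-neighbourhood
    sufficient statistic under the locally uniform Poisson model."""
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    T = np.sort(D, axis=1)[:, 1:k + 1]
    return np.log(T[:, -1:] / T[:, :-1]).sum(axis=1)

def dimension_mixture_em(S, k=15, n_classes=2, n_iter=200):
    """EM for a mixture over intrinsic dimensions: every ball enters one
    joint likelihood instead of being fitted separately.
    Per-point log-likelihood under class c: (k-1)*log(m_c) - m_c*S_i."""
    # initialize rates from quantiles of the raw per-point estimates
    m = (k - 1) / np.quantile(S, np.linspace(0.2, 0.8, n_classes))
    pi = np.full(n_classes, 1.0 / n_classes)
    for _ in range(n_iter):
        # E-step in the log domain for numerical stability
        logw = np.log(pi) + (k - 1) * np.log(m)[None, :] - S[:, None] * m[None, :]
        logw -= logw.max(axis=1, keepdims=True)
        r = np.exp(logw)
        r /= r.sum(axis=1, keepdims=True)           # responsibilities
        # M-step: responsibility-weighted ML updates
        pi = r.mean(axis=0)
        m = (k - 1) * r.sum(axis=0) / (r * S[:, None]).sum(axis=0)
    return m, pi, r
```

The responsibilities r give a soft clustering of points by local dimension, which is the sense in which all balls "share" one ML function.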
“…Similar to [18], we take a deliberate departure from the manifold assumption. While their methods are statistical in nature, we use local homology to recognize locations where the assumption is violated.…”
Section: Introduction
confidence: 99%