2010 IEEE International Conference on Acoustics, Speech and Signal Processing
DOI: 10.1109/icassp.2010.5494931

Optimized intrinsic dimension estimator using nearest neighbor graphs

Abstract: We develop an approach to intrinsic dimension estimation based on k-nearest neighbor (kNN) distances. The dimension estimator is derived using a general theory on functionals of kNN density estimates. This enables us to predict the performance of the dimension estimation algorithm. In addition, it allows for optimization of free parameters in the algorithm. We validate our theory through simulations and compare our estimator to previous kNN based dimensionality estimation approaches.
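
The abstract describes the estimator only at a high level. Below is a minimal illustrative sketch of a generic kNN-distance-based intrinsic dimension estimator (the Levina–Bickel maximum-likelihood form), not the optimized estimator derived in the paper; the function name, the choice of k, and the synthetic test data are assumptions made for illustration.

```python
# Sketch of a generic kNN-distance-based intrinsic dimension estimator
# (Levina-Bickel maximum-likelihood form). This is NOT the optimized
# estimator of the paper; it only illustrates the basic idea of inferring
# dimension from the growth rate of nearest-neighbor distances.
import numpy as np
from scipy.spatial import cKDTree


def knn_dimension_estimate(X, k=10):
    """Estimate the intrinsic dimension of the rows of X (n_samples x n_features)."""
    tree = cKDTree(X)
    # Distances to the k nearest neighbors (k+1 queried to skip the point itself).
    dists, _ = tree.query(X, k=k + 1)
    dists = dists[:, 1:]                       # drop the zero self-distance
    # Per-sample MLE: inverse mean of log(T_k / T_j), j = 1..k-1,
    # where T_j is the distance to the j-th nearest neighbor.
    log_ratios = np.log(dists[:, -1:] / dists[:, :-1])
    local_dim = 1.0 / log_ratios.mean(axis=1)
    # Average the per-sample estimates into a single global estimate.
    return local_dim.mean()


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Hypothetical test case: a 3-D manifold linearly embedded in 10-D space.
    latent = rng.standard_normal((2000, 3))
    embedded = latent @ rng.standard_normal((3, 10))
    print(knn_dimension_estimate(embedded, k=10))  # should be close to 3
```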

Cited by 5 publications (6 citation statements)
References 6 publications

Citation statements (ordered by relevance):
“…The authors evaluate the performance of their methods on synthetic datasets, some of which have been used by similar studies in the literature [79], while the others (challenging ones) are proposed by the authors to have manifolds with nonconstant curvature. The comparison of the achieved results with those obtained by the estimators proposed in [27,33,112,118] has led to the conclusion that none of the methods has a good performance on all the tested datasets. However, graph theoretic approaches would appear to behave better when manifolds of nonconstant curvature are processed.…”
Section: Nearest…
Citation type: mentioning
Confidence: 88%
“…Furthermore, the bounded support of a random variable as a source of k-NN estimator bias was already identified in papers by Sricharan et al [8,20]. The authors estimated the bias using Taylor expansion of pdf p(x), as:…”
Section: Theorem 1 (Lebesgue…
Citation type: mentioning
Confidence: 99%
“…The utility of these expressions is that they can be used to optimize over tuning parameters of the fusion criterion, thereby circumventing the need for manual parameter tuning. This theory has been applied to non-parametric estimation of the mutual information [8] and estimation of intrinsic dimension [9].…”
Section: Technical Accomplishments
Citation type: mentioning
Confidence: 99%
“…This has led to the fastest and most reliable anomaly detection method to date and can be applicable to large datasets with millions of samples. The asymptotic theory was applied to intrinsic dimension estimation in [9], which was implemented for fusion and segmentation of hyperspectral imagery in [15]. The theory was also applied to high dimension correlation screening [17].…”
Section: Expressions for Divergence Estimator Bias, Variance, and a CLT
Citation type: mentioning
Confidence: 99%