2013
DOI: 10.1007/s00454-013-9561-6
Dimension Reduction by Random Hyperplane Tessellations

Abstract: Given a subset K of the unit Euclidean sphere, we estimate the minimal number m = m(K) of hyperplanes that generate a uniform tessellation of K, in the sense that the fraction of the hyperplanes separating any pair x, y ∈ K is nearly proportional to the Euclidean distance between x and y. Random hyperplanes prove to be almost ideal for this problem; they achieve the almost optimal bound m = O(w(K)²), where w(K) is the Gaussian mean width of K. Using the map that sends x ∈ K to the sign vector with r…
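The sign-vector map described in the abstract is easy to probe numerically. The sketch below is illustrative only (it is not code from the paper; the dimensions, seed, and angle θ are arbitrary choices): it draws m random Gaussian hyperplanes through the origin and checks that the fraction of hyperplanes separating two points on the sphere concentrates around their geodesic distance divided by π, which is equivalent to the Euclidean distance up to constants.

```python
# Sketch: empirical check that random hyperplanes through the origin give a
# "uniform tessellation" of the sphere.  For unit vectors x, y and a standard
# Gaussian vector a, the hyperplane {z : <a, z> = 0} separates x and y with
# probability d_G(x, y) / pi, where d_G is the geodesic (angular) distance;
# averaging over m hyperplanes makes the separating fraction concentrate there.
import numpy as np

rng = np.random.default_rng(0)
n, m = 50, 2000                      # ambient dimension, number of hyperplanes

# Two unit vectors with a prescribed angle theta
x = np.zeros(n); x[0] = 1.0
theta = 0.8                          # geodesic distance between x and y (radians)
y = np.zeros(n); y[0] = np.cos(theta); y[1] = np.sin(theta)

A = rng.standard_normal((m, n))      # rows = normals of the random hyperplanes
sign_x, sign_y = np.sign(A @ x), np.sign(A @ y)

# Fraction of hyperplanes separating x and y, i.e. the normalized Hamming
# distance between the two sign vectors
frac = np.mean(sign_x != sign_y)
print(f"separating fraction: {frac:.3f}, theta/pi: {theta/np.pi:.3f}")
```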

Cited by 101 publications (143 citation statements). References 24 publications.
“…We begin with the following simple comparison, which follows from the Cauchy–Schwarz inequality for all v ∈ ℝⁿ: (Indeed, u = tv/‖v‖₂ lies in K since t/‖v‖₂ ≤ 1 and K is star-shaped.) Next, we will control ‖Au‖₁ with an application of the following uniform deviation inequality, which we proved in [35].…”
Section: Proof of Main Results
Citation type: mentioning, confidence: 99%
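For context, the quantity being controlled in this excerpt has a simple Gaussian interpretation: if the rows of A are standard Gaussian, then E|⟨a, u⟩| = √(2/π)·‖u‖₂ for each row a, so (1/m)‖Au‖₁ concentrates around √(2/π)·‖u‖₂. A minimal sketch of that concentration, assuming an i.i.d. standard Gaussian A (the dimensions and seed are arbitrary; this is not the uniform deviation inequality of [35] itself, only the pointwise fact behind it):

```python
# Minimal sketch, assuming A has i.i.d. standard Gaussian entries:
# E|<a, u>| = sqrt(2/pi) * ||u||_2 for each Gaussian row a, so the normalized
# l1 norm (1/m) * ||Au||_1 concentrates around sqrt(2/pi) * ||u||_2.
import numpy as np

rng = np.random.default_rng(1)
n, m = 100, 5000
A = rng.standard_normal((m, n))

u = rng.standard_normal(n)
u /= np.linalg.norm(u)               # unit vector, so the target is sqrt(2/pi)

empirical = np.abs(A @ u).mean()     # (1/m) * ||Au||_1
print(f"(1/m)||Au||_1 = {empirical:.4f}, sqrt(2/pi) = {np.sqrt(2/np.pi):.4f}")
```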
“…On the other hand, [the distortion] is a function of K, the projection's dimensionality, and scales approximately as 1/√K. In the extreme case of 1-bit scalar quantization, the embedding does not preserve signal amplitudes and, therefore, their ℓ₂ distances. Still, it does preserve their angle, i.e., their correlation coefficient [17], [18].…”
Section: Background, A. Randomized Embeddings
Citation type: mentioning, confidence: 99%
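The angle-preservation property mentioned here can be made concrete: the fraction of sign disagreements between the 1-bit measurements of two signals estimates the angle between them, and the cosine of that angle recovers their correlation coefficient. A small sketch under illustrative assumptions (Gaussian measurement matrix, arbitrary dimensions and seed; not code from [17] or [18]):

```python
# Sketch: 1-bit measurements discard amplitudes but retain angles.  The
# fraction of differing signs estimates the angle between two signals, and
# cos(pi * fraction) estimates their correlation coefficient.
import numpy as np

rng = np.random.default_rng(2)
n, m = 200, 4000
A = rng.standard_normal((m, n))

x = rng.standard_normal(n)
y = 0.7 * x + 0.3 * rng.standard_normal(n)   # signal correlated with x
true_corr = x @ y / (np.linalg.norm(x) * np.linalg.norm(y))

bits_x, bits_y = np.sign(A @ x), np.sign(A @ y)
hamming = np.mean(bits_x != bits_y)          # fraction of differing bits
est_corr = np.cos(np.pi * hamming)           # angle -> correlation coefficient

print(f"true correlation {true_corr:.3f}, 1-bit estimate {est_corr:.3f}")
```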
“…Tighter bounds can instead be developed if we are interested in preservation of the angles between two signals, i.e., their correlation or inner product, instead of the distance between them [30]–[32].…”
Section: Theorem 1 (Johnson–Lindenstrauss Lemma)
Citation type: mentioning, confidence: 99%
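As a companion illustration of inner-product (rather than distance) preservation, the following sketch uses an unquantized Gaussian random projection scaled by 1/√m, so that the projected inner product has the original inner product as its expectation. The setup is an assumption for illustration and is not taken from [30]–[32].

```python
# Sketch: a Gaussian random projection scaled by 1/sqrt(m) approximately
# preserves inner products (and hence angles/correlations), not just norms,
# since E[<Px, Py>] = <x, y>.
import numpy as np

rng = np.random.default_rng(3)
n, m = 1000, 200
P = rng.standard_normal((m, n)) / np.sqrt(m)

x, y = rng.standard_normal(n), rng.standard_normal(n)
x /= np.linalg.norm(x); y /= np.linalg.norm(y)

print(f"original  <x, y>   = {x @ y:+.3f}")
print(f"projected <Px, Py> = {(P @ x) @ (P @ y):+.3f}")
```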