Proceedings of the Seventeenth Annual ACM-SIAM Symposium on Discrete Algorithm - SODA '06 2006
DOI: 10.1145/1109557.1109681
Matrix approximation and projective clustering via volume sampling

Cited by 172 publications (341 citation statements)
References 0 publications
“…To our knowledge, such a dimension reduction result is not known for the projective clustering problem for any p, including the cases p = 2 and p = ∞. Previous results for the cases p = 1, 2, ∞ [10, 4, 13] only showed the existence of such a subspace spanned by poly(ks/ε) points; the algorithm for finding the subspace enumerated all subsets of poly(ks/ε) points. Our dimension reduction result, combined with the recent fixed-dimensional result of [6], yields an O(mn · poly(s/ε) + m(log m)^f(s/ε)) time algorithm for the projective clustering problem with k = 1.…”
Section: Subspace Projective Clustering
confidence: 89%
“…, a_m), and can be computed in time O(min{mn², m²n}) using the Singular Value Decomposition (SVD). Some recent work on the p = 2 case [1, 2, 3, 4, 5, 9, 12], initiated by a result due to Frieze, Kannan, and Vempala [7], has focused on algorithms for computing a k-dimensional subspace that gives a (1 + ε)-approximation to the optimum in time O(mn · poly(k, 1/ε)), i.e., linear in the number of coordinates we store. Most of these algorithms, with the exception of [1, 12], depend on subroutines that sample poly(k, 1/ε) points from the given a_1, a_2, ….…”
Section: Introduction
confidence: 99%
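The best rank-k approximation A_k referred to in this excerpt is exactly what the truncated SVD yields (Eckart–Young–Mirsky); a minimal NumPy sketch, with illustrative matrix sizes and variable names:

```python
import numpy as np

def best_rank_k(A, k):
    """Best rank-k approximation A_k of A in Frobenius norm,
    obtained by truncating the SVD to the top k singular triplets."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return (U[:, :k] * s[:k]) @ Vt[:k, :]

rng = np.random.default_rng(0)
A = rng.standard_normal((8, 5))   # m = 8 points in R^5 (illustrative)
A2 = best_rank_k(A, 2)

# The optimal Frobenius error equals the root-sum-of-squares
# of the discarded singular values.
s = np.linalg.svd(A, compute_uv=False)
err = np.linalg.norm(A - A2, "fro")
```

This is the O(min{mn², m²n}) baseline that the sampling-based algorithms in the excerpt aim to beat in running time.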
“…However, the approach of ref. 13 does not in general guarantee the return of an SPSD approximant when applied to an SPSD matrix. The same holds true for approaches motivated by numerical analysis; in recent work, the authors of ref.…”
Section: Proof of Theorem
confidence: 99%
“…Though our goals and corresponding algorithms are quite different in their approach and scope of application, it is of interest to note that our Theorem 1 can in fact be viewed as a kernel-level version of a theorem of ref. 13, where a related notion termed volume sampling is employed for column selection. However, in ref.…”
Section: Proof of Theorem
confidence: 99%
“…Recently, the effort has been towards eliminating the additive term in the inequality, thereby yielding a relative approximation of the form ‖A − Π_C A‖_F ≤ (1 + ε)‖A − A_k‖_F. Along these lines, Deshpande et al. [5] first showed the existence of such approximations, introducing a sampling technique related to the volume of the simplex defined by the column subsets of size k, without giving a polynomial-time algorithm. Specifically, they show that there exist k columns with which one can get a √(k+1) relative-error approximation in Frobenius norm, which is tight.…”
Section: Comparison to Related Work
confidence: 99%
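The volume-sampling guarantee quoted above (expected squared projection error at most (k+1) times optimal, hence the existence of a √(k+1) relative-error column subset) can be checked by brute force on a tiny matrix. This sketch enumerates all k-subsets of columns rather than using an efficient sampler; matrix sizes are illustrative:

```python
import itertools
import numpy as np

def volume_sampling_probs(A, k):
    """Brute-force volume sampling: each k-subset S of columns gets
    probability proportional to det(A_S^T A_S), the squared volume of
    the parallelepiped spanned by those columns. Exponential in the
    number of columns -- for illustration only."""
    subsets = list(itertools.combinations(range(A.shape[1]), k))
    w = np.array([np.linalg.det(A[:, list(S)].T @ A[:, list(S)])
                  for S in subsets])
    return subsets, w / w.sum()

def proj_err_sq(A, S):
    """Squared Frobenius error of projecting A onto span of columns S."""
    C = A[:, list(S)]
    P = C @ np.linalg.pinv(C)          # orthogonal projector onto span(C)
    return np.linalg.norm(A - P @ A, "fro") ** 2

rng = np.random.default_rng(1)
A, k = rng.standard_normal((6, 5)), 2
subsets, p = volume_sampling_probs(A, k)

# Exact expectation of the squared error under volume sampling,
# versus the optimal squared error ||A - A_k||_F^2 from the SVD.
exp_err_sq = sum(pi * proj_err_sq(A, S) for S, pi in zip(subsets, p))
opt_sq = (np.linalg.svd(A, compute_uv=False)[k:] ** 2).sum()
# Deshpande et al.: E[||A - Pi_S A||_F^2] <= (k+1) * ||A - A_k||_F^2,
# so some subset achieves error within sqrt(k+1) of optimal.
```

Since the expectation is within a factor (k+1) of optimal, at least one subset in the support achieves that bound, which is the existence result the excerpt describes.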