2016
DOI: 10.1561/9781680831412
Generalized Low Rank Models

Abstract: Principal components analysis (PCA) is a well-known technique for approximating a tabular data set by a low rank matrix. Here, we extend the idea of PCA to handle arbitrary data sets consisting of numerical, Boolean, categorical, ordinal, and other data types. This framework encompasses many well-known techniques in data analysis, such as nonnegative matrix factorization, matrix completion, sparse and robust PCA, k-means, k-SVD, and maximum margin matrix factorization. The method handles heterogeneous data set…
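The core idea described in the abstract, a low rank factorization A ≈ XY fit by minimizing a loss over the entries, can be sketched in a few lines. The following is a minimal illustration with a quadratic loss only (i.e., the PCA-like special case, not the full heterogeneous-loss framework); the function name and parameters are hypothetical, chosen for this sketch.

```python
import numpy as np

def glrm_quadratic(A, rank, n_iters=50, reg=0.0):
    """Fit A ~ X @ Y by alternating least squares, minimizing
    ||A - X Y||_F^2 (+ optional ridge penalty `reg` on each factor).
    A minimal sketch of the GLRM idea for the quadratic-loss case."""
    m, n = A.shape
    rng = np.random.default_rng(0)
    X = rng.standard_normal((m, rank))
    Y = rng.standard_normal((rank, n))
    ridge = reg * np.eye(rank)
    for _ in range(n_iters):
        # With Y fixed, the optimal X solves a least squares problem.
        X = A @ Y.T @ np.linalg.inv(Y @ Y.T + ridge)
        # With X fixed, the optimal Y solves another least squares problem.
        Y = np.linalg.inv(X.T @ X + ridge) @ X.T @ A
    return X, Y
```

Replacing the quadratic loss with entry-wise losses matched to each column's data type (logistic for Boolean, ordinal hinge for ordinal, etc.) is what generalizes this sketch to the framework of the monograph.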

Cited by 87 publications (86 citation statements)
References 115 publications
“…The global geometry of low-rank matrix recovery but with analysis customized to linear measurements and quadratic loss functions is also covered in [34], [41]. Since Theorem 1 only requires the (2r, 4r)-restricted strong convexity and smoothness property (6), aside from low-rank matrix recovery [45], it can also be applied to many other low-rank matrix optimization problems [46] which do not necessarily involve quadratic loss functions. Typical examples include 1-bit matrix completion [43], [47] and Poisson principal component analysis (PCA) [48].…”
Section: B. Global Geometry for General Low-Rank Optimization
confidence: 99%
“…In addition, our algorithmic framework also allows for direct incorporation of constraints not only on estimates of the matrix factors, but also on the estimate of X * itself to account for entry-wise structural constraints that could arise naturally in many matrix completion scenarios. Several other recent efforts in the DL literature have proposed algorithmic procedures for coping with missing data [37], [38], and a survey of algorithmic approaches to generalized low-rank modeling tasks is given in the recent work [39].…”
Section: B. Connections with Existing Work
confidence: 99%
“…But we will not follow the general algorithm proposed in [21] for the following reasons: (i) due to the special structure of the weight, our algorithm is more efficient than [21] (see Algorithm 3.1, in p. 42 [21]), (ii) it allows a detailed convergence analysis which is usually not available in other algorithms proposed in the literature [9,26,21], and (iii) it can handle bigger size matrices as we will demonstrate in the numerical result section. If k = 0, then (4.1) reduces to an unweighted rank r factorization of A 2 and can be solved as an alternating least squares problem [16,17,18].…”
confidence: 99%
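The excerpt above contrasts weighted and unweighted rank-r factorization solved by alternating least squares. A weighted variant, where a weight matrix W downweights or masks entries (W[i,j] = 0 marks a missing entry, as in matrix completion), can be sketched as follows. This is an illustrative implementation, not the specific algorithm of the cited work; the function name is hypothetical.

```python
import numpy as np

def weighted_als(A, W, rank, n_iters=50, reg=1e-6):
    """Alternating least squares for min ||W ∘ (A - X Y)||_F^2,
    where ∘ is the entry-wise product and W >= 0. Each row of X and
    column of Y has a closed-form weighted least squares update."""
    m, n = A.shape
    rng = np.random.default_rng(0)
    X = rng.standard_normal((m, rank))
    Y = rng.standard_normal((rank, n))
    ridge = reg * np.eye(rank)
    for _ in range(n_iters):
        for i in range(m):
            # Row i of X: weighted least squares against row i of A.
            Wi = np.diag(W[i])
            X[i] = np.linalg.solve(Y @ Wi @ Y.T + ridge, Y @ Wi @ A[i])
        for j in range(n):
            # Column j of Y: weighted least squares against column j of A.
            Wj = np.diag(W[:, j])
            Y[:, j] = np.linalg.solve(X.T @ Wj @ X + ridge, X.T @ Wj @ A[:, j])
    return X, Y
```

With W identically one this reduces to the unweighted alternating least squares problem the excerpt mentions; with a binary W it becomes a simple matrix completion solver.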