2008
DOI: 10.21236/ada486804

Relational Learning via Collective Matrix Factorization

Abstract: Relational learning is concerned with predicting unknown values of a relation, given a database of entities and observed relations among entities. An example of relational learning is movie rating prediction, where entities could include users, movies, genres, and actors. Relations would then encode users' ratings of movies, movies' genres, and actors' roles in movies. A common prediction technique given one pairwise relation, for example a #users × #movies ratings matrix, is low-rank matrix factorization. In …
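
The abstract's example, predicting entries of a #users × #movies ratings matrix with low-rank factorization, can be made concrete with a short sketch. The code below is illustrative rather than the paper's algorithm: it fits two rank-k factor matrices to the observed ratings by stochastic gradient descent on a squared-error loss, and every name and hyperparameter is an assumption made for the example.

import numpy as np

def factorize_ratings(observed, n_users, n_movies, rank=2, lr=0.02, reg=0.05,
                      epochs=200, seed=0):
    """Fit R ~ U @ V.T from observed (user, movie, rating) triples by SGD."""
    rng = np.random.default_rng(seed)
    U = 0.1 * rng.standard_normal((n_users, rank))
    V = 0.1 * rng.standard_normal((n_movies, rank))
    for _ in range(epochs):
        for u, m, r in observed:
            u_row = U[u].copy()
            err = r - u_row @ V[m]                   # residual on this observed rating
            U[u] += lr * (err * V[m] - reg * u_row)  # gradient steps with L2 regularization
            V[m] += lr * (err * u_row - reg * V[m])
    return U, V

# A prediction for an unobserved (user, movie) pair is the dot product of the
# corresponding factor rows.
ratings = [(0, 0, 5.0), (0, 1, 3.0), (1, 0, 4.0), (2, 1, 1.0)]
U, V = factorize_ratings(ratings, n_users=3, n_movies=2)
print(U[2] @ V[0])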

Cited by 345 publications (309 citation statements). References 4 publications.

Citation statements (ordered by relevance):

“…In contrast, our approach centers on interactions between users and items, but we will see that it is flexible enough to incorporate user and item features as well. Another approach is collective matrix factorization (Singh and Gordon (2008)), which studies matrix recovery using multiple matrices across multiple groups. In the setting of multiple matrices across two groups, our approach will in fact apply to a more general version of this problem and simultaneously allow us to leverage the modeling advantages of tensors.…”
Section: Statistical Methodology (mentioning)
confidence: 99%
“…Furthermore, a comparison is made between HTLFA and CMF (i.e., Collective Matrix Factorization) [21]. In CMF, they simultaneously factor several matrices, sharing parameters among factors when an entity participates in multiple relations.…”
Section: Impact Of Constraint Parameter (mentioning)
confidence: 99%
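
The parameter sharing described in this excerpt, several matrices factored at once with a common factor for any entity type that appears in more than one relation, can be illustrated with a simple squared-loss sketch. This is not the cited paper's implementation: it assumes two fully observed relations (a users × movies ratings matrix R and a movies × genres matrix G), full-batch gradient descent, and illustrative names throughout.

import numpy as np

def collective_factorize(R, G, rank=5, lr=0.005, reg=0.1, epochs=500, seed=0):
    """Jointly fit R ~ U @ M.T and G ~ M @ W.T with a shared movie factor M."""
    rng = np.random.default_rng(seed)
    n_users, n_movies = R.shape
    n_genres = G.shape[1]
    U = 0.1 * rng.standard_normal((n_users, rank))
    M = 0.1 * rng.standard_normal((n_movies, rank))  # shared by both relations
    W = 0.1 * rng.standard_normal((n_genres, rank))
    for _ in range(epochs):
        E_r = R - U @ M.T   # residual of the ratings relation
        E_g = G - M @ W.T   # residual of the genre relation
        U += lr * (E_r @ M - reg * U)
        W += lr * (E_g.T @ M - reg * W)
        # The shared movie factor receives gradient signal from both relations.
        M += lr * (E_r.T @ U + E_g @ W - reg * M)
    return U, M, W

Because M is updated from both residuals, information in the genre relation can influence rating predictions, which is the intuition behind sharing parameters across relations.
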
“…This means that we only measure the element-wise loss on the observed edges; and among all these edges, we treat the element-wise loss equally (referred to as ‘0/1 Weight Matrix’). This type of weight matrix is widely used in the literature, especially in the context of collaborative filtering [6,7].…”
Section: Optimization Formation (mentioning)
confidence: 99%
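
The '0/1 weight matrix' in this excerpt, an element-wise loss counted only on observed edges and weighted equally across them, amounts to a masked objective. A minimal sketch, assuming a squared-error loss and illustrative names:

import numpy as np

def masked_squared_loss(X, U, V, mask):
    """Sum of squared errors over observed entries only.

    X    : data matrix
    U, V : factor matrices with X approximated by U @ V.T
    mask : 1 where an entry (edge) is observed, 0 otherwise
    """
    residual = X - U @ V.T
    return float(np.sum(mask * residual ** 2))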