2019
DOI: 10.1007/s10994-019-05801-6

Deep collective matrix factorization for augmented multi-view learning

Abstract: Learning by integrating multiple heterogeneous data sources is a common requirement in many tasks. Collective Matrix Factorization (CMF) is a technique to learn shared latent representations from arbitrary collections of matrices. It can be used to simultaneously complete one or more matrices by predicting the unknown entries. Classical CMF methods assume linearity in the interactions of latent factors, which can be restrictive and fail to capture complex non-linear interactions. In this paper, we develop the…
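As a rough illustration of the classical linear setting the abstract contrasts against, here is a minimal CMF sketch using alternating least squares on two toy matrices that share a "users" entity. The entity names, dimensions, and the ALS solver are illustrative assumptions for this sketch, not details from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy collection of two matrices sharing the "users" entity
# (entity names and sizes are illustrative, not from the paper).
n_users, n_items, n_tags, k = 20, 15, 10, 4
truth = {e: rng.normal(size=(n, k))
         for e, n in [("users", n_users), ("items", n_items), ("tags", n_tags)]}
X1 = truth["users"] @ truth["items"].T   # users x items
X2 = truth["users"] @ truth["tags"].T    # users x tags

# Classical (linear) CMF: one shared latent factor per entity; the
# "users" factor is constrained to be identical in both factorizations.
U_users = rng.normal(size=(n_users, k))
U_items = rng.normal(size=(n_items, k))
U_tags = rng.normal(size=(n_tags, k))

def ls(A, B):
    # Solve min_W ||W @ B.T - A||_F  ->  W = A @ B @ inv(B.T @ B)
    return A @ B @ np.linalg.inv(B.T @ B)

for _ in range(30):
    U_items = ls(X1.T, U_users)
    U_tags = ls(X2.T, U_users)
    # Shared entity: fit both matrices jointly (column-concatenated).
    U_users = ls(np.hstack([X1, X2]), np.vstack([U_items, U_tags]))

err = (np.linalg.norm(U_users @ U_items.T - X1)
       + np.linalg.norm(U_users @ U_tags.T - X2))
print(f"total reconstruction error: {err:.2e}")
```

On this noiseless rank-k toy data the alternating updates fit both matrices essentially exactly; the restriction the abstract points out is that the shared factors interact only through the bilinear product U_row @ U_col.T.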


Cited by 15 publications (15 citation statements)
References 41 publications
“…Representation learning from arbitrary collections of matrices has been studied in Collective Matrix Factorization (CMF) [Singh and Gordon, 2008], group-sparse CMF [Klami et al., 2014], and a neural approach, Deep CMF [Mariappan and Rajan, 2019]. These approaches learn two latent factors for the row and column entities of each matrix, to reconstruct them.…”
Section: Related Work
confidence: 99%
“…Entity-specific representations may be learned by using N different autoencoders, one for each entity, where each autoencoder takes as input the concatenation of all matrices containing that entity. This approach is inadequate for matrices with different datatypes and sparsity levels, as discussed in [Mariappan and Rajan, 2019]. Our approach addresses these problems, but with higher computational cost.…”
Section: Neural Collective Multi-way Spectral Clustering Network
confidence: 99%
“…In this paper, we analyze patient representation learning in light of two recent advances in CMF and KG representation learning. A deep autoencoder-based architecture, called deep CMF (DCMF), was developed for CMF, which was found to outperform classical non-neural variants of CMF in several tasks [9]. Using DCMF, which provides a seamless way of integrating heterogeneous data, we evaluate the effectiveness of patient representations when the input data are augmented with additional information from literature-derived KGs.…”
Section: Introduction
confidence: 99%
“…A model for CMF based on deep learning was developed by Mariappan and Rajan [9], which is briefly described next. Given M matrices (indexed by m) that describe the relationships between E entities (indexed by e), each with dimension d_e, DCMF jointly obtains latent representations of each entity U_e and low-rank factorizations of each matrix such that U_e = f_θ([C]^(e)), where f_θ is an entity-specific nonlinear transformation obtained through a neural network–based encoder with weights θ, and [C]^(e) denotes all matrices in the collection that contain a relationship of entity e. The entities corresponding to the rows and columns of the m-th matrix are denoted by indices r_m and c_m, respectively.…”
Section: Introduction
confidence: 99%
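The construction in the excerpt above (U_e = f_θ([C]^(e)), with matrices reconstructed from the row- and column-entity representations) can be sketched as an untrained forward pass. The entity names, dimensions, and the single tanh layer standing in for f_θ are illustrative assumptions; DCMF itself uses jointly trained autoencoders:

```python
import numpy as np

rng = np.random.default_rng(1)

# M matrices over E entities; [C]^(e) concatenates every matrix
# (transposed when needed) that contains entity e in its rows or columns.
# Entity names and sizes below are illustrative, not from the paper.
dims = {"patient": 30, "drug": 12, "gene": 8}           # d_e per entity
X = {("patient", "drug"): rng.normal(size=(30, 12)),
     ("patient", "gene"): rng.normal(size=(30, 8))}

def concat_views(e):
    # Build [C]^(e): rows index entity e, columns stack all its relations.
    views = []
    for (r, c), Xm in X.items():
        if r == e:
            views.append(Xm)
        elif c == e:
            views.append(Xm.T)
    return np.hstack(views)

k = 5  # latent dimension
U = {}
for e in dims:
    C_e = concat_views(e)                      # [C]^(e)
    W = 0.1 * rng.normal(size=(C_e.shape[1], k))
    U[e] = np.tanh(C_e @ W)                    # U_e = f_theta([C]^(e)), one-layer stand-in

# Each matrix m is approximated from its row/column entities: U_{r_m} @ U_{c_m}.T
for (r, c), Xm in X.items():
    print(r, c, Xm.shape, (U[r] @ U[c].T).shape)
```

The point of the sketch is the wiring: every entity gets a single representation U_e fed by all matrices it participates in, which is what lets one encoder per entity serve several factorizations at once.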