Proceedings of the 4th Workshop on Representation Learning for NLP (RepL4NLP-2019), 2019
DOI: 10.18653/v1/w19-4301
Deep Generalized Canonical Correlation Analysis

Abstract: We present Deep Generalized Canonical Correlation Analysis (DGCCA), a method for learning nonlinear transformations of arbitrarily many views of data, such that the resulting transformations are maximally informative of each other. While methods for nonlinear two-view representation learning (Deep CCA; Andrew et al., 2013) and linear many-view representation learning (Generalized CCA; Horst, 1961) exist, DGCCA is the first CCA-style multiview representation learning technique that combines the flexibility o…
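To make the idea concrete, below is a minimal, illustrative Python/NumPy sketch of the kind of pipeline the abstract describes: each view is passed through its own nonlinear map, and a shared representation G is then recovered with a regularized MAXVAR-style GCCA solve. The random tanh feature maps, the function names (nonlinear_view_map, gcca), the regularization term, and the toy data are all assumptions made here for illustration; they stand in for the trained per-view networks of the actual method.

    import numpy as np

    rng = np.random.default_rng(0)

    def nonlinear_view_map(X, out_dim, seed):
        # Stand-in for a trained per-view network f_j: a fixed random tanh feature map.
        rng_j = np.random.default_rng(seed)
        W = rng_j.normal(size=(X.shape[1], out_dim))
        Y = np.tanh(X @ W)
        return Y - Y.mean(axis=0)                    # center each view's output

    def gcca(views, shared_dim, reg=1e-4):
        # MAXVAR-style GCCA: minimize sum_j ||G - Y_j U_j||_F^2 with orthonormal G.
        n = views[0].shape[0]
        M = np.zeros((n, n))
        for Y in views:
            C = Y.T @ Y + reg * np.eye(Y.shape[1])   # regularized per-view covariance
            M += Y @ np.linalg.solve(C, Y.T)         # projection onto the span of Y_j
        eigvals, eigvecs = np.linalg.eigh(M)         # eigenvalues in ascending order
        G = eigvecs[:, -shared_dim:]                 # shared representation, G^T G = I
        Us = [np.linalg.solve(Y.T @ Y + reg * np.eye(Y.shape[1]), Y.T @ G)
              for Y in views]
        return G, Us

    # Three toy "views" of the same 200 underlying examples (dimensions arbitrary).
    Xs = [rng.normal(size=(200, d)) for d in (10, 20, 15)]
    Ys = [nonlinear_view_map(X, out_dim=8, seed=i) for i, X in enumerate(Xs)]
    G, Us = gcca(Ys, shared_dim=4)
    print(G.shape, [U.shape for U in Us])            # (200, 4) [(8, 4), (8, 4), (8, 4)]

In the full method the per-view maps would be neural networks trained against this objective rather than fixed random features; the sketch only shows the shared linear solve that sits on top of them.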

Cited by 90 publications (96 citation statements)
References 17 publications
“…CCA has been successfully employed effectively in other contexts to integrate high-dimensional biological data 66,65. Penalized CCA 67 and deep CCA 68,69 can produce non-linear variates and may prove to be highly effective as we confront higher throughput platforms with greater cell-to-cell data.…”
Section: Discussion (mentioning)
confidence: 99%
“…One finds linear transformations $\{U_j \in \mathbb{R}^{d_j \times k}\}_{j=1}^{J}$ that minimize the mutual reconstruction error under constraints, in a way equivalent to maximizing correlation. This framework can be extended to non-linear feature extractors [29] with the objective:…”
Section: Extensions of CCA (mentioning)
confidence: 99%
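For reference, the linear MAXVAR-style GCCA objective that this excerpt alludes to can be written (in standard notation, assumed here rather than quoted from reference [29]) as

\[
\min_{G,\,\{U_j\}} \; \sum_{j=1}^{J} \left\| G - X_j U_j \right\|_F^2 \quad \text{subject to} \quad G^\top G = I_k,
\]

where $X_j \in \mathbb{R}^{n \times d_j}$ holds the $n$ observations of view $j$; the deep extension replaces $X_j$ with the output $f_j(X_j)$ of a per-view network.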
“…More recently, several deep neural network (DNN)-based algorithms have been proposed for nonlinear feature representation learning on multi-view problems [13], [14]. A deep model for CCA estimation, referred to as deep CCA (DCCA), has also been proposed [15], [16]. Like CCA, DCCA is a parametric approach and is scalable to large datasets, and like KCCA, it can model nonlinearity in the data.…”
Section: Previous Work (mentioning)
confidence: 99%