2010
DOI: 10.1007/s00357-010-9059-3

Dimensionality Reduction on the Cartesian Product of Embeddings of Multiple Dissimilarity Matrices

Abstract: We consider the problem of combining multiple dissimilarity representations via the Cartesian product of their embeddings. For concreteness, we choose the inferential task at hand to be classification. The high dimensionality of this Cartesian product space implies the necessity of dimensionality reduction before training a classifier. We propose a supervised dimensionality reduction method, which utilizes the class label information, to help achieve a favorable combination. The simulation and real da…
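
The pipeline the abstract outlines can be sketched compactly: embed each dissimilarity matrix separately, concatenate the embeddings to form the Cartesian product space, then apply a supervised reduction before training a classifier. The sketch below is illustrative only; the classical-MDS embedding, the LDA reduction step, and the synthetic two-view data are stand-ins rather than the paper's actual method.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neighbors import KNeighborsClassifier

def classical_mds(D, d):
    """Embed an n x n dissimilarity matrix D into R^d via classical MDS."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n        # centering matrix
    B = -0.5 * J @ (D ** 2) @ J                # double-centered Gram matrix
    vals, vecs = np.linalg.eigh(B)
    top = np.argsort(vals)[::-1][:d]           # d largest eigenvalues
    return vecs[:, top] * np.sqrt(np.maximum(vals[top], 0.0))

rng = np.random.default_rng(0)
y = np.repeat([0, 1], 30)                      # class labels for 60 objects
# Two synthetic "views" of the same objects, each yielding a dissimilarity matrix.
views = [rng.normal(loc=sep * y[:, None], scale=1.0, size=(60, 3))
         for sep in (1.5, 2.5)]
Ds = [np.linalg.norm(V[:, None, :] - V[None, :, :], axis=-1) for V in views]

# Cartesian product of the per-matrix embeddings = column-wise concatenation.
X = np.hstack([classical_mds(D, d=3) for D in Ds])

# Supervised dimensionality reduction using the class labels (LDA as a
# stand-in for the paper's method), then a classifier on the reduced space.
Z = LinearDiscriminantAnalysis(n_components=1).fit_transform(X, y)
print(KNeighborsClassifier().fit(Z, y).score(Z, y))  # resubstitution accuracy
```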

Cited by 6 publications (7 citation statements, all "mentioning").
References 25 publications (19 reference statements).
“…PCA rotates the data into orthogonal principal components while maintaining the information content. Because principal components are independent of each other, some prefer to use them instead of the raw data in multivariate statistical analyses [21]. The first PCA axis (PC1) represents size, unless size has been removed from the data before analysis.…”
Section: (B) Sex Identification (mentioning; confidence: 99%)
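
A minimal sketch of the properties this statement invokes: PCA scores are mutually uncorrelated, and when a common size factor drives all measurements, PC1 tracks size. The toy data below are an illustrative assumption, not the cited study's measurements.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
# Toy morphometric-style data: three measurements driven by a shared
# "size" factor plus noise (an assumption for illustration only).
size = rng.normal(10.0, 2.0, size=200)
X = np.column_stack([size + rng.normal(0.0, 0.5, size=200) for _ in range(3)])

scores = PCA().fit_transform(X)                # rotate onto orthogonal axes
print(np.round(np.corrcoef(scores, rowvar=False), 2))      # ~identity: scores uncorrelated
print(np.round(np.corrcoef(scores[:, 0], size)[0, 1], 2))  # |r| near 1: PC1 captures size
```
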
“…Ideally both sources of data contain complementary information so that their fusion leads to larger power in testing and higher accuracy in classification than using either textual content data or graph structure data alone. We achieve the fusion by combining the embeddings obtained in the P- or W-approach via the Cartesian product [6].…”
Section: Fusion (mentioning; confidence: 99%)
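
The fusion described here amounts to concatenating each object's coordinates across the per-source embeddings. A minimal sketch, with array names and dimensions assumed for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100  # number of documents (assumed)
# Hypothetical embeddings of the same n documents from two sources:
Z_text = rng.normal(size=(n, 5))    # from textual-content dissimilarities
Z_graph = rng.normal(size=(n, 4))   # from graph-structure dissimilarities

# Cartesian product fusion: concatenate each point's coordinates
# from the two embedding spaces into one fused representation.
Z_fused = np.hstack([Z_text, Z_graph])   # shape (n, 9)
```
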
“…In feature level fusion, feature vectors extracted from different data sources are combined directly into the Cartesian product space [6], or via some data transformation procedures [3]. Decision level fusion involves combining results obtained separately from all data sources.…”
Section: Introduction (mentioning; confidence: 99%)
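
The two fusion levels contrasted in this statement can be put side by side; the classifiers and synthetic data below are placeholders, not the cited paper's setup:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
y = rng.integers(0, 2, size=200)
X1 = rng.normal(loc=1.0 * y[:, None], size=(200, 4))  # features from source 1
X2 = rng.normal(loc=0.5 * y[:, None], size=(200, 6))  # features from source 2

# Feature-level fusion: combine feature vectors in the product space,
# then train a single classifier on the concatenation.
clf = LogisticRegression().fit(np.hstack([X1, X2]), y)

# Decision-level fusion: train one classifier per source, then combine
# their outputs (here, by averaging posterior probabilities).
c1 = LogisticRegression().fit(X1, y)
c2 = LogisticRegression().fit(X2, y)
p = (c1.predict_proba(X1)[:, 1] + c2.predict_proba(X2)[:, 1]) / 2
y_hat = (p >= 0.5).astype(int)
```
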
“…Combining information from disparate data sources when the information in the various spaces is fundamentally incommensurate (that is, a separate collection of useful features can be extracted from each space, but their interpoint geometry precludes profitable alignment in a common space) is considered via Cartesian product space embedding in [12].…”
Section: Related Work (mentioning; confidence: 99%)