2020
DOI: 10.48550/arxiv.2001.08975
Preprint
Sparse Semi-supervised Heterogeneous Interbattery Bayesian Analysis

Abstract: The Bayesian approach to feature extraction, known as factor analysis (FA), has been widely studied in machine learning as a way to obtain a latent representation of the data. An adequate selection of the probability distributions and priors of these Bayesian models allows the model to better adapt to the nature of the data (i.e. heterogeneity, sparsity), obtaining a more representative latent space. The objective of this article is to propose a general FA framework capable of modelling any problem. To do so, we start from the Bayesian In…

Cited by 1 publication (6 citation statements)
References 31 publications
“…We propose a novel probabilistic latent variable model to generate kernel relationships, instead of data observations, based on a linear generative model. We introduce this model using the Bayesian inter-battery factor analysis approach proposed in [14] to show its capabilities to efficiently face semi-supervised heterogeneous multi-view problems combining linear and non-linear data representations. Besides, we extend the model formulation to provide the automatic selection of SVs, obtaining scalable solutions, as well as include an ARD prior over the kernel to obtain feature selection capabilities.…”
Section: Discussion
confidence: 99%
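The ARD (automatic relevance determination) prior mentioned in the statement above assigns each input feature its own precision, so that features with large learned precision have their weights shrunk toward zero and are effectively pruned. As a toy illustration of this shrinkage effect (the precision values below are hypothetical, not taken from the paper):

```python
import numpy as np

# Toy illustration of an ARD-style prior's effect: each feature d has its own
# precision gamma_d, and the prior standard deviation of its weight is
# 1/sqrt(gamma_d), so features with large learned precision are shrunk
# toward zero (pruned).
gamma = np.array([0.5, 0.5, 1e6, 1e6])  # hypothetical per-feature precisions
scale = 1.0 / np.sqrt(gamma)            # prior std-dev of each weight
w = scale * np.ones(4)                  # unit draws scaled by the prior
relevant = np.abs(w) > 1e-2             # features surviving the shrinkage
print(relevant.tolist())  # [True, True, False, False]
```

Features with precision 0.5 keep weights of order one, while those with precision 1e6 collapse to order 1e-3, which is the mechanism behind the feature-selection capability described.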
“…Whereas kernel LVMs obtain this latent representation as a linear combination, by some dual variables, of the kernel representation of the n-th data point, here we propose to reformulate this idea from a generative point of view. In particular, we start from the SSHIBA algorithm formulation [14] and consider that there exist some latent variables z_{n,:} ~ N(0, I_{K_c}) which are linearly combined with a set of dual variables A^(m) ∈ R^{N×K_c} to generate a kernel vector, k^(m)_{n,:}, as:…”
Section: Bayesian Sparse Factor Analysis With Kernelized Observations
confidence: 99%
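The equation following "as:" is truncated in the excerpt above, but the quoted text describes a linear-Gaussian generative step: a standard-normal latent vector z_{n,:} is combined with dual variables A^(m) to produce a kernel vector k^(m)_{n,:}. A minimal sketch of such a step, assuming the combination is A^(m) z_{n,:} plus Gaussian noise with an assumed precision tau (the exact form is not shown in the excerpt):

```python
import numpy as np

rng = np.random.default_rng(0)

N, Kc = 50, 3                         # support points and latent dimension (illustrative)
z_n = rng.standard_normal(Kc)         # latent variable z_{n,:} ~ N(0, I_{K_c})
A = rng.standard_normal((N, Kc))      # dual variables A^(m) in R^{N x K_c}
tau = 10.0                            # assumed noise precision

# Generative step: the kernel vector k^(m)_{n,:} arises as a linear
# combination of the latent variables through the dual variables,
# plus isotropic Gaussian noise with precision tau.
k_n = A @ z_n + rng.standard_normal(N) / np.sqrt(tau)
print(k_n.shape)  # (50,)
```

This mirrors how, in the quoted formulation, each row of the kernel matrix is modelled generatively rather than computed from data observations directly.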