2015
DOI: 10.1109/tpami.2013.201
Bayesian Nonparametric Models for Multiway Data Analysis

Abstract: Tensor decomposition is a powerful computational tool for multiway data analysis. Many popular tensor decomposition approaches, such as the Tucker decomposition and CANDECOMP/PARAFAC (CP), amount to multi-linear factorization. They are insufficient to model (i) complex interactions between data entities, (ii) various data types (e.g., missing data and binary data), and (iii) noisy observations and outliers. To address these issues, we propose tensor-variate latent nonparametric Bayesian models for multiway data …
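As a rough illustration of what "multi-linear factorization" means here, the following minimal NumPy sketch reconstructs a third-order tensor from CP factor matrices; the dimensions, rank, and random factors are illustrative assumptions, not values from the paper.

```python
import numpy as np

# CP (CANDECOMP/PARAFAC) as multi-linear factorization: a rank-R third-order
# tensor is a sum of R outer products of per-mode factor vectors.
# All shapes and the random factors below are illustrative assumptions.
I, J, K, R = 4, 5, 6, 3
A = np.random.randn(I, R)  # mode-1 factor matrix
B = np.random.randn(J, R)  # mode-2 factor matrix
C = np.random.randn(K, R)  # mode-3 factor matrix

# X[i, j, k] = sum_r A[i, r] * B[j, r] * C[k, r]
X = np.einsum('ir,jr,kr->ijk', A, B, C)
print(X.shape)  # (4, 5, 6)
```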

Cited by 105 publications (113 citation statements). References 22 publications.
“…Tucker decomposition and CP decomposition are two popular tensor decomposition approaches. The probabilistic Tucker decomposition models mainly include probabilistic Tucker decomposition [34], exponential family tensor factorization [35] and InfTucker model [36]. Probabilistic Tucker decomposition was closely related to probabilistic PCA.…”
Section: Other Probabilistic Models of Low-Rank Matrix/Tensor Factorization (mentioning)
confidence: 99%
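To make the Tucker decomposition discussed in this citation concrete, here is a minimal sketch of reconstructing a data tensor from a core tensor and per-mode factor matrices (the latent structure that probabilistic Tucker models such as InfTucker place priors over); all dimensions and random values are illustrative assumptions.

```python
import numpy as np

# Tucker decomposition: data tensor = core tensor multiplied by a factor matrix
# along each mode. Dimensions and random values are illustrative assumptions.
I, J, K = 6, 7, 8             # observed tensor dimensions
P, Q, S = 2, 3, 4             # core (latent) dimensions
G  = np.random.randn(P, Q, S)    # core tensor
U1 = np.random.randn(I, P)       # mode-1 factor matrix
U2 = np.random.randn(J, Q)       # mode-2 factor matrix
U3 = np.random.randn(K, S)       # mode-3 factor matrix

# X[i, j, k] = sum_{p, q, s} G[p, q, s] * U1[i, p] * U2[j, q] * U3[k, s]
X = np.einsum('pqs,ip,jq,ks->ijk', G, U1, U2, U3)
print(X.shape)  # (6, 7, 8)
```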
“…Among them, the TT format is a special case of the HT and the tensor tree structure [33]. The probabilistic models of the Tucker were presented in [34][35][36] and those of the CP were developed in [37][38][39][40][41][42][43][44][45][46][47][48].…”
Section: Introduction (mentioning)
confidence: 99%
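For readers unfamiliar with the tensor-train (TT) format mentioned in the citation above, a minimal sketch of reconstructing a third-order tensor from TT cores follows; the TT ranks and shapes are illustrative assumptions.

```python
import numpy as np

# Tensor-train (TT) format: each entry X[i1, i2, i3] is a product of matrix
# slices taken from three cores. Ranks and shapes are illustrative assumptions.
I1, I2, I3 = 4, 5, 6
r1, r2 = 2, 3
G1 = np.random.randn(I1, 1, r1)   # first core (boundary TT-rank 1)
G2 = np.random.randn(I2, r1, r2)  # middle core
G3 = np.random.randn(I3, r2, 1)   # last core (boundary TT-rank 1)

# X[i1, i2, i3] = G1[i1] @ G2[i2] @ G3[i3], a 1x1 matrix collapsed to a scalar
X = np.einsum('iab,jbc,kcd->ijkad', G1, G2, G3)[..., 0, 0]
print(X.shape)  # (4, 5, 6)
```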
“…Tensor subspace learning methods, such as HOSVD, Tucker decomposition and CANDECOMP [3], MPCA [4] and probabilistic decompositions [5,6,7] have been proposed to tackle the dimensionality problem. These methods project the raw data to a lower dimensional space, in which the projected data can be considered as highly descriptive features of the raw information.…”
Section: Introduction (mentioning)
confidence: 99%
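The projection step this citation refers to can be sketched as a multilinear map of each raw sample onto lower-dimensional mode subspaces. The orthonormal projection matrices below are random stand-ins (MPCA or HOSVD would estimate them from data), and all shapes are illustrative assumptions.

```python
import numpy as np

# Tensor subspace learning, projection step only: each 2-D sample is mapped to a
# small core via per-mode bases. Bases here are random orthonormal stand-ins.
n_samples, H, W = 10, 32, 24
h, w = 5, 4                                   # reduced dimensions per mode
X  = np.random.randn(n_samples, H, W)         # raw data
U1 = np.linalg.qr(np.random.randn(H, h))[0]   # mode-1 basis (H x h, orthonormal)
U2 = np.linalg.qr(np.random.randn(W, w))[0]   # mode-2 basis (W x w, orthonormal)

# Y_n = U1^T X_n U2 for every sample n: compact features of the raw data
Y = np.einsum('nhw,hp,wq->npq', X, U1, U2)
print(Y.shape)  # (10, 5, 4)
```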
“…VB has low computational costs in comparison with the sampling methods, such as Markov Chain Monte Carlo (MCMC). It has widespread use in machine learning, signal processing and many other fields, such as the state space model [10,11], time series [12,13], Mixture Models [14], filter [15][16][17], image [18][19][20], communication [21,22], speech recognition [23] and graphical models [24,25].…”
Section: Introduction (mentioning)
confidence: 99%
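To illustrate why VB is cheap relative to MCMC, here is a minimal mean-field VB sketch for a toy conjugate model (Gaussian data with unknown mean and precision): a handful of closed-form coordinate-ascent updates replaces thousands of posterior samples. The model, priors, and data are illustrative assumptions unrelated to the cited works.

```python
import numpy as np

# Mean-field VB for x_i ~ N(mu, 1/tau), mu ~ N(mu0, 1/(lam0*tau)), tau ~ Gamma(a0, b0).
# The posterior is approximated by q(mu) q(tau); both updates are closed form.
rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=1.5, size=200)   # illustrative data
N, xbar = x.size, x.mean()
mu0, lam0, a0, b0 = 0.0, 1.0, 1.0, 1.0         # illustrative priors

E_tau = 1.0                                    # initial guess for E[tau]
for _ in range(20):                            # a few sweeps suffice to converge
    # q(mu) = N(mu_N, 1/lam_N)
    mu_N = (lam0 * mu0 + N * xbar) / (lam0 + N)
    lam_N = (lam0 + N) * E_tau
    # q(tau) = Gamma(a_N, b_N), using current moments of q(mu)
    a_N = a0 + (N + 1) / 2
    E_sq = np.sum((x - mu_N) ** 2 + 1.0 / lam_N) \
         + lam0 * ((mu_N - mu0) ** 2 + 1.0 / lam_N)
    b_N = b0 + 0.5 * E_sq
    E_tau = a_N / b_N

print(f"E[mu] ~ {mu_N:.3f}, E[tau] ~ {E_tau:.3f}")  # close to 2.0 and 1/1.5**2
```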