2008
DOI: 10.1155/2008/947438

Probabilistic Latent Variable Models as Nonnegative Factorizations

Abstract: This paper presents a family of probabilistic latent variable models that can be used for analysis of nonnegative data. We show that there are strong ties between nonnegative matrix factorization and this family, and provide some straightforward extensions which can help in dealing with shift invariances, higher-order decompositions and sparsity constraints. We argue through these extensions that the use of this approach allows for rapid development of complex statistical models for analyzing nonnegative data.
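The abstract's central claim, that probabilistic latent variable models and nonnegative matrix factorization are strongly tied, can be illustrated with a minimal sketch (not the paper's code; sizes and variable names are illustrative): the EM updates for a two-variable latent model P(x, y) = Σ_z P(x|z) P(z, y) coincide with the multiplicative NMF updates that minimize the generalized KL divergence between the data and the product of the factors.

```python
import numpy as np

rng = np.random.default_rng(0)

V = rng.poisson(3.0, size=(8, 6)).astype(float) + 1e-9  # toy count data
P = V / V.sum()                                         # normalize to a joint distribution

K = 3                                  # number of latent components z
W = rng.random((8, K))
W /= W.sum(axis=0, keepdims=True)      # columns: P(x|z)
H = rng.random((K, 6))
H /= H.sum()                           # entries: P(z, y)

def kl(P, Q):
    """KL divergence between two joint distributions of equal total mass."""
    return float(np.sum(P * np.log(P / Q)))

before = kl(P, W @ H)
for _ in range(200):
    # Shared E-step ratio; using it in both updates makes this exact EM,
    # which is identical in form to the KL-divergence NMF updates.
    R = P / (W @ H)
    W_new = W * (R @ H.T)
    H_new = H * (W.T @ R)
    W = W_new / W_new.sum(axis=0, keepdims=True)   # re-normalize P(x|z)
    H = H_new / H_new.sum()                        # re-normalize P(z, y)
after = kl(P, W @ H)
# EM monotonically decreases KL(P || W @ H), and W @ H stays a valid
# joint distribution (nonnegative, summing to one) at every step.
```

Because the factors are kept on probability simplices, nonnegativity is automatic rather than imposed as a separate constraint, which is the sense in which the latent variable model "is" a nonnegative factorization.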


Cited by 107 publications (72 citation statements)
References 22 publications
“…Those state hypotheses with an entirely correct voice assignment are represented by a thick border. [41, 3] in O_3. The two most likely state hypotheses at each step are listed in the large rectangles above each state S_t, where the voices are notated with braces.…”
Section: Inference
confidence: 99%
“…(Notes belonging to the upper voice are shown in red and notes belonging to the lower voice are shown in blue.) Again, notice the false positive pitch detection [41, 3]. In this figure, the emission sets O_t are shown on the bottom, and the boxes below each O_t node list the emitted notes in decreasing pitch order.…”
Section: Inference
confidence: 99%
“…The connection between non-negative tensor factorization and latent variable models has been previously explored in the literature (e.g., in [7], [8]). Non-negative tensor factorization models have also been applied to other language processing applications, including subject-verb-object selectional preference induction [9] and learning semantic word similarity [10].…”
Section: A. Model
confidence: 99%
“…Formally, we find a local solution to problem (8), where the feasible set is the set of element-wise non-negative tensors whose entries sum to one, i.e., the set of tensors corresponding to valid joint probability distributions; the training examples are the n-grams in the training data (obtained by sliding a window of size n over each sentence); and the model assigns each n-gram the probability given by the corresponding tensor entry. For traditional n-gram models, the maximum likelihood objective yields models that are highly overfit to the data; in particular, they are plagued with zero-probability n-grams.…”
Section: B. Training
confidence: 99%
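The training procedure this excerpt describes, maximizing likelihood over nonnegative tensors whose entries sum to one, is naturally handled by EM on a latent-variable tensor model. The sketch below (illustrative only; not the citing paper's actual model or code, and the vocabulary size, order, and component count are made up) fits P(w_1, …, w_N) = Σ_z P(z) Π_i P(w_i | z, i) to a toy trigram tensor; the factored form guarantees the reconstruction stays in the simplex of valid joint distributions.

```python
import numpy as np

rng = np.random.default_rng(1)
V_SIZE, N, K = 5, 3, 2          # vocab size, n-gram order, latent components

# Toy trigram counts (hypothetical data), normalized to a joint distribution.
C = rng.poisson(2.0, size=(V_SIZE,) * N).astype(float) + 1e-9
P_data = C / C.sum()

pz = np.full(K, 1.0 / K)                  # P(z)
pw = rng.random((N, K, V_SIZE))
pw /= pw.sum(axis=2, keepdims=True)       # P(w_i | z, i), each row a distribution

def component(z):
    """Rank-one tensor: outer product of the N conditionals for component z."""
    t = pw[0, z]
    for i in range(1, N):
        t = np.multiply.outer(t, pw[i, z])
    return t

def model():
    return sum(pz[z] * component(z) for z in range(K))

def loglik():
    return float(np.sum(P_data * np.log(model())))

before = loglik()
for _ in range(100):
    R = P_data / model()
    new_pz = np.zeros(K)
    new_pw = np.zeros_like(pw)
    for z in range(K):
        Qz = pz[z] * component(z) * R         # posterior mass for component z
        new_pz[z] = Qz.sum()
        for i in range(N):
            other = tuple(a for a in range(N) if a != i)
            m = Qz.sum(axis=other)            # marginalize onto position i
            new_pw[i, z] = m / m.sum()
    pz = new_pz / new_pz.sum()
    pw = new_pw
after = loglik()
# EM monotonically increases the training log-likelihood while the
# reconstructed tensor remains nonnegative and sums to one.
```

Because every factor is a proper distribution, the simplex constraint in problem (8) never needs explicit projection, which is one reason the latent-variable formulation is convenient for this kind of nonnegative tensor model.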