Interspeech 2018
DOI: 10.21437/interspeech.2018-1840

Expectation-Maximization Algorithms for Itakura-Saito Nonnegative Matrix Factorization

Abstract: This paper presents novel expectation-maximization (EM) algorithms for estimating the nonnegative matrix factorization model with the Itakura-Saito divergence. The common EM-based approach exploits the space-alternating generalized EM (SAGE) variant of EM, but it usually performs worse than the conventional multiplicative algorithm. We propose to explore these algorithms more exhaustively, in particular the choice of methodology (standard EM or the SAGE variant) and of the latent variable set (full or reduced).…
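
To make the setting concrete, below is a minimal sketch of SAGE-style EM updates for IS-NMF under the standard Gaussian composite model (each time-frequency bin is a sum of zero-mean complex Gaussian components with variances w_fk h_kn). The function name and the synthetic data are illustrative and do not come from the paper, and the full/reduced latent-variable variants the authors compare are not distinguished here.

```python
import numpy as np

def sage_isnmf(X, K, n_iter=100, eps=1e-12, seed=0):
    """SAGE-style EM updates for Itakura-Saito NMF of a power spectrogram.

    X: complex STFT matrix (F x N); the model is |x_fn|^2 ~ sum_k w_fk h_kn.
    Returns nonnegative factors W (F x K) and H (K x N).
    Sketch under the standard Gaussian composite model, not the paper's exact
    algorithm (which also studies plain EM and reduced latent-variable sets).
    """
    F, N = X.shape
    rng = np.random.default_rng(seed)
    W = rng.random((F, K)) + eps
    H = rng.random((K, N)) + eps
    P = np.abs(X) ** 2  # observed power spectrogram

    for _ in range(n_iter):
        for k in range(K):
            V = W @ H + eps                # current model of the total power
            Vk = np.outer(W[:, k], H[k])   # prior variance of component k
            G = Vk / V                     # Wiener gain of component k
            # E-step: posterior power of component k
            #   = |posterior mean|^2 + posterior variance
            Pk = (G ** 2) * P + (1.0 - G) * Vk
            # M-step: closed-form updates of the k-th column of W and row of H
            W[:, k] = np.mean(Pk / (H[k][None, :] + eps), axis=1)
            H[k] = np.mean(Pk / (W[:, k][:, None] + eps), axis=0)
    return W, H

# Toy usage on a random complex "spectrogram"
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.standard_normal((64, 200)) + 1j * rng.standard_normal((64, 200))
    W, H = sage_isnmf(X, K=4, n_iter=50)
    print(W.shape, H.shape)
```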

Help me understand this report

Search citation statements

Order By: Relevance

Paper Sections

Select...
1
1

Citation Types

0
2
0

Year Published

2019
2019
2022
2022

Publication Types

Select...
1
1

Relationship

1
1

Authors

Journals

citations
Cited by 2 publications
(2 citation statements)
references
References 21 publications
(40 reference statements)
0
2
0
Order By: Relevance
“…We remark that if κ = 0, then λ = ρ = 0: therefore, q_{j,ft} = 0 and p_{j,ft} becomes the posterior power of s_{j,ft}, as mentioned in Section III-C1. Then, we recognize in (25) the IS divergence between P_j and W_j H_j, as in the EM algorithm for ISNMF [46]. Consequently, the update rules (36) and (37) are similar to those obtained in such a scenario [46], up to an additional power 1/2, which is common when applying the majorize-minimization methodology for estimating ISNMF [44].…”
Section: Relation to Other Approaches (mentioning)
confidence: 82%
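
For reference, the Itakura-Saito divergence mentioned in this statement is the standard one used throughout the IS-NMF literature; a sketch of the entrywise and matrix-level objective, with generic indexing that may differ notationally from the cited work, is:

```latex
% IS divergence between a nonnegative scalar x and its approximation y
d_{\mathrm{IS}}(x \mid y) = \frac{x}{y} - \log\frac{x}{y} - 1
% summed entrywise between a power matrix P_j and its NMF model W_j H_j
D_{\mathrm{IS}}(P_j \mid W_j H_j) = \sum_{f,t} d_{\mathrm{IS}}\!\left( p_{j,ft} \,\middle|\, [W_j H_j]_{ft} \right)
```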
“…To assess the validity of this update scheme, we applied both procedures (maximization of the exact functional (44) and its approximation (46)) to the learning dataset used in the experimental evaluation (see Section IV-A). The average relative difference between the phases obtained with those two approaches was approximately .…”
Section: M-Step: Phase Parameters (mentioning)
confidence: 99%
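
As an aside, comparing two sets of phase estimates requires accounting for 2π wrapping; a minimal sketch of one such discrepancy measure (purely illustrative, not the exact metric used in the cited work) is:

```python
import numpy as np

def mean_phase_discrepancy(phi_exact, phi_approx):
    """Average wrapped phase difference, normalized by pi so it lies in [0, 1].

    phi_exact, phi_approx: arrays of phases (radians) from the exact and the
    approximate M-step procedures. Illustrative metric, not the paper's.
    """
    diff = np.angle(np.exp(1j * (phi_exact - phi_approx)))  # wrap to (-pi, pi]
    return np.mean(np.abs(diff)) / np.pi

# Toy usage with two nearly identical phase fields
rng = np.random.default_rng(0)
phi = rng.uniform(-np.pi, np.pi, size=(513, 100))
print(mean_phase_discrepancy(phi, phi + 1e-3 * rng.standard_normal(phi.shape)))
```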