2020
DOI: 10.48550/arxiv.2009.13093
Preprint

f-Divergence Variational Inference

Neng Wan,
Dapeng Li,
Naira Hovakimyan

Abstract: This paper introduces f-divergence variational inference (f-VI), which generalizes variational inference to all f-divergences. Starting from the minimization of a surrogate f-divergence that is statistically consistent with the f-divergence, the f-VI framework not only unifies a number of existing VI methods, e.g. Kullback-Leibler VI [1], Rényi's α-VI [2], and χ-VI [3], but also offers a standardized toolkit for VI subject to arbitrary divergences from the f-divergence family. A general f-variational bo…
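As the abstract notes, the f-divergence family recovers the KL, α, and χ divergences as special cases. A minimal sketch (not from the paper) of this fact for discrete distributions: the f-divergence is D_f(p‖q) = Σ_x q(x)·f(p(x)/q(x)) for a convex f with f(1) = 0, and swapping in different choices of f yields the familiar divergences.

```python
import numpy as np

def f_divergence(p, q, f):
    """D_f(p||q) = sum_x q(x) * f(p(x)/q(x)) for discrete p, q.

    f must be convex with f(1) = 0; p and q are assumed to have
    full support so the ratio p/q is well defined.
    """
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return float(np.sum(q * f(p / q)))

# f(t) = t*log(t)   recovers the Kullback-Leibler divergence KL(p||q)
kl_generator = lambda t: t * np.log(t)
# f(t) = (t - 1)**2 recovers the chi-squared divergence
chi2_generator = lambda t: (t - 1) ** 2

p = [0.5, 0.5]
q = [0.75, 0.25]
kl = f_divergence(p, q, kl_generator)      # equals sum p*log(p/q)
chi2 = f_divergence(p, q, chi2_generator)  # equals sum (p - q)**2 / q
```

Here `f_divergence` and the generator names are illustrative, not the paper's API; the point is only that one routine parameterized by f covers the whole family, which is the unification the f-VI framework exploits on the variational-bound side.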


Cited by 1 publication (2 citation statements)
References 12 publications
“…Although it is not possible to directly solve Eq. (11) in this case, we draw from Expectation Maximization (EM) to derive an iterative update scheme for the (k+1)th iteration with the form…”
Section: Update Laws
confidence: 99%
“…In Wang et al [9], Regli and Silva [10], the authors proposed variants of the α-divergence to improve the performance and robustness of the inference algorithm. Wan et al [11] further extended the VI framework to the f-divergence, a broad statistical divergence family that recovers the KL, α, and χ divergences as special cases.…”
Section: Introduction
confidence: 99%