Robotics: Science and Systems XVII 2021
DOI: 10.15607/rss.2021.xvii.073

Variational Inference MPC using Tsallis Divergence

Abstract: In this paper, we provide a generalized framework for Variational Inference-Stochastic Optimal Control by using the non-extensive Tsallis divergence. By incorporating the deformed exponential function into the optimality likelihood function, a novel Tsallis Variational Inference-Model Predictive Control algorithm is derived, which includes prior works such as Variational Inference-Model Predictive Control, Model Predictive Path Integral Control, Cross Entropy Method, and Stein Variational Inference Model Predi…
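The deformed exponential mentioned in the abstract is the Tsallis q-exponential, which recovers the ordinary exponential as q → 1. A minimal sketch (function names and numerical tolerances are ours, not from the paper):

```python
import math

def q_exponential(x, q):
    """Tsallis deformed exponential: exp_q(x) = [1 + (1-q)x]_+^(1/(1-q)).
    The bracket is clipped at zero; as q -> 1 this recovers exp(x)."""
    if abs(q - 1.0) < 1e-12:
        return math.exp(x)
    base = 1.0 + (1.0 - q) * x
    if base <= 0.0:
        return 0.0  # outside the support of exp_q for this q
    return base ** (1.0 / (1.0 - q))

def q_logarithm(x, q):
    """Inverse of exp_q on its support: ln_q(x) = (x^(1-q) - 1) / (1-q)."""
    if abs(q - 1.0) < 1e-12:
        return math.log(x)
    return (x ** (1.0 - q) - 1.0) / (1.0 - q)
```

Replacing the exponential in an exponentiated-cost optimality likelihood with exp_q changes how heavily low-cost trajectories are weighted, which is the lever the Tsallis VI-MPC framework exploits.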


Cited by 27 publications (34 citation statements). References 15 publications.
“…The first toy example is the Gaussian mixture example from Wang et al (2018). In this example, the true distribution is a finite mixture of Gaussians (MOG)…”
Section: Qualitative Results on Synthetic Examples
confidence: 99%
“…Better performance perhaps could be achieved with a family of divergences that interpolates between forward KL and χ², such as the Rényi divergence family [47]. Tail-adaptive algorithms [48] might help to select the best-performing Rényi divergence.…”
Section: Discussion
confidence: 99%
“…As the root divergence of the Rényi α-divergence, χ-divergence and many other useful divergences [17,18], f-divergence is a more inclusive statistical divergence (family) and was utilized to improve the statistical properties [19,20], sharpness [10,21], and surely the generality of variational bounds [10,21,22]. However, most of these works only dealt with some portions of f-divergences for their favorable statistical properties, e.g.…”
Section: Introduction
confidence: 99%
“…However, most of these works only dealt with some portions of f-divergences for their favorable statistical properties, e.g. mass-covering [19] and tail-adaptive [20], and did not develop a systematic VI framework that harbors all f-divergences. Meanwhile, since i) the regular f-divergence does not explicitly induce an f-variational bound as elegant as the ELBO [11], χ upper bound (CUBO) [3], or Rényi variational bound (RVB) [2], and ii) only specific choices of f-divergence result in an f-variational bound that trivially depends on the evidence [12], a thorough and comprehensive analysis on the f-divergence VI has been due for a long time.…”
Section: Introduction
confidence: 99%