2018
DOI: 10.48550/arxiv.1811.11206
Preprint

Partitioned Variational Inference: A unified framework encompassing federated and continual learning

Abstract: Variational inference (VI) has become the method of choice for fitting many modern probabilistic models. However, practitioners are faced with a fragmented literature that offers a bewildering array of algorithmic options. First, the variational family. Second, the granularity of the updates, e.g. whether the updates are local to each data point and employ message passing, or global. Third, the method of optimization (bespoke or blackbox, closed-form or stochastic updates, etc.). This paper presents a new framew…
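For orientation (the abstract is truncated above), the central object in PVI, using notation assumed from the preprint rather than quoted here, is a global approximate posterior assembled from per-partition approximate likelihood factors:

    q(\theta) \propto p(\theta) \prod_{m=1}^{M} t_m(\theta)

Here p(\theta) is the prior, the data are split into M partitions, and each factor t_m(\theta) approximates the likelihood contribution of partition m. Refining one factor at a time against its own partition gives local, message-passing-style updates; refining all factors jointly recovers standard global VI, which is how the framework spans the algorithmic choices listed in the abstract.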

Cited by 5 publications (7 citation statements). References 17 publications.
“…The Kullback-Leibler term in the variational lower bound of VCL naturally regularizes the approximate posterior toward the prior. Improved training procedures have also been developed for this type of approximate Bayesian continual learning through the use of natural gradients [10,47], fixed-point updates [52], and local approximation [6]. More expressive classes of variational distributions were also considered, including channel factorized Gaussian [22], multiplicative normalizing flow [22], or structured Laplace approximations [33].…”
Section: Continual Learning Algorithms (mentioning)
confidence: 99%
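For reference, the variational lower bound this statement refers to is the per-task VCL objective (notation assumed, not quoted from the citing paper):

    \mathcal{L}_t(q_t) = \mathbb{E}_{q_t(\theta)}[\log p(\mathcal{D}_t \mid \theta)] - \mathrm{KL}(q_t(\theta) \,\|\, q_{t-1}(\theta))

The previous approximate posterior q_{t-1} acts as the prior for task t, so maximizing \mathcal{L}_t trades data fit on \mathcal{D}_t against the KL term that pulls q_t back toward q_{t-1}.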
“…In Bayesian federated learning, the goal is to obtain a variational distribution q(θ) on the model parameter space that minimizes the global free energy (see, e.g., [1], [7], [14])…”
Section: A. Setup (mentioning)
confidence: 99%
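A sketch of the global free energy being referred to, for K clients holding local datasets \mathcal{D}_1, ..., \mathcal{D}_K (notation assumed rather than taken from the cited works):

    \mathcal{F}(q) = \mathrm{KL}(q(\theta) \,\|\, p(\theta)) - \sum_{k=1}^{K} \mathbb{E}_{q(\theta)}[\log p(\mathcal{D}_k \mid \theta)]

This is the negative evidence lower bound with the log-likelihood decomposed across clients; partitioned schemes such as PVI minimize it through per-client factor refinements instead of optimizing q directly against the pooled data.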
“…In another study [56], the authors introduce Partitioned Variational Inference (PVI) for probabilistic models over federated data. They train a Bayesian neural network (BNN) in an FL environment that allows both synchronous and asynchronous model updates across many machines; their approach, combined with other methods, could enable more communication-efficient training of BNNs on non-IID federated data.…”
Section: B. RSQ2 - How Can We Make Communication More Efficient In An F... (mentioning)
confidence: 99%
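To make the synchronous/asynchronous factor updates mentioned in this statement concrete, here is a minimal, hypothetical Python sketch of PVI-style partitioned updates on a toy conjugate model (inferring a Gaussian mean with known noise variance, data split across clients). The model, the variable names, and the undamped update rule are illustrative assumptions, not the implementation from [56].

# Hypothetical sketch of PVI-style partitioned updates on a toy conjugate model:
# theta ~ N(0, 1) prior, observations x ~ N(theta, sigma2), data split across K clients.
# Each client maintains an approximate likelihood factor t_k in natural parameters;
# the server's global posterior is prior * prod_k t_k.
import numpy as np

rng = np.random.default_rng(0)
sigma2 = 0.5                     # known observation noise variance
theta_true = 1.3
K = 4                            # number of clients / data partitions
data = [theta_true + np.sqrt(sigma2) * rng.standard_normal(20) for _ in range(K)]

# Natural parameters (eta1, eta2) = (precision * mean, precision) of a 1-D Gaussian.
prior = np.array([0.0, 1.0])                          # N(0, 1)
factors = [np.array([0.0, 0.0]) for _ in range(K)]    # t_k initialised to uniform

def global_posterior(prior, factors):
    """q(theta) proportional to prior * prod_k t_k, in natural parameters."""
    return prior + sum(factors)

for sweep in range(2):           # synchronous sweeps; async would update one k at a time
    for k in range(K):
        q = global_posterior(prior, factors)
        cavity = q - factors[k]  # remove client k's current contribution
        # Local step: combine the cavity with client k's likelihood term.
        # The toy model is conjugate, so the "local VI" step is closed form.
        lik = np.array([data[k].sum() / sigma2, len(data[k]) / sigma2])
        local_post = cavity + lik
        factors[k] = local_post - cavity   # updated approximate likelihood factor

q = global_posterior(prior, factors)
mean, var = q[0] / q[1], 1.0 / q[1]
print(f"posterior mean {mean:.3f}, variance {var:.4f}")

Because the toy model is conjugate, one synchronous sweep already recovers the exact posterior. For a BNN, each client would instead run a few steps of stochastic VI against its cavity distribution, and letting clients send their natural-parameter deltas whenever they finish gives the asynchronous variant.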
“…Ability to handle different phases of the model training well: [56], Partitioned Variational Inference (PVI)…”
Section: A. Decentralized Deep Learning Model (mentioning)
confidence: 99%