2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR)
DOI: 10.1109/cvpr.2016.630
Principled Parallel Mean-Field Inference for Discrete Random Fields

Abstract: Mean-field variational inference is one of the most popular approaches to inference in discrete random fields. Standard mean-field optimization is based on coordinate descent and in many situations can be impractical. Thus, in practice, various parallel techniques are used, which either rely on ad hoc smoothing with heuristically set parameters, or put strong constraints on the type of models. In this paper, we propose a novel proximal gradient-based approach to optimizing the variational objective. It is natura…
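As a hedged illustration of the family of methods the abstract refers to (the paper's own algorithm is not reproduced here), a generic proximal-gradient iteration alternates a gradient step on the smooth part of an objective with the proximal operator of the non-smooth part; `grad_f`, `prox_g`, and the fixed step size are assumptions of this sketch:

```python
import numpy as np

def proximal_gradient(grad_f, prox_g, x0, step, n_iters=100):
    """Generic proximal-gradient iteration: x <- prox_g(x - step * grad_f(x)).

    grad_f: gradient of the smooth part of the objective.
    prox_g: proximal operator of the non-smooth part, called as prox_g(v, step).
    This is an illustrative sketch, not the paper's algorithm.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iters):
        x = prox_g(x - step * grad_f(x), step)
    return x
```

For instance, taking `grad_f` as the gradient of a quadratic data term and `prox_g` as soft-thresholding recovers the classic ISTA iteration for the lasso.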

Cited by 19 publications (24 citation statements)
References 31 publications
“…In the parallel scheme, we update all the probabilities Q_x(s_x = l) in parallel for each x and l (Equation (10)). Although naive parallelization does not guarantee convergence to a local minimum, Baqué et al. [BBFF16] showed that adding damping to each fixed-point step guarantees convergence. In our case, however, we did not experience the need for such a damping factor.…”
Section: Parallel Mean-Field Inference on the GPU
confidence: 99%
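The damped parallel fixed-point update described in this statement can be sketched in a few lines of numpy. This is a minimal illustration, not the cited implementation: the fully connected, uniformly weighted coupling, the label-compatibility matrix, and the damping factor `alpha` are all assumptions of the example. Setting `alpha=1` recovers the naive parallel update, which in general is not guaranteed to converge.

```python
import numpy as np

def damped_parallel_mean_field(unary, pairwise, n_iters=50, alpha=0.5):
    """Damped parallel mean-field updates for a fully connected discrete MRF.

    unary:    (N, L) array of unary potentials (costs) per site and label.
    pairwise: (L, L) label-compatibility cost matrix; coupling between
              sites is assumed uniform for simplicity.
    alpha:    damping factor in (0, 1]; each step is a convex combination
              of the old marginals and the naive parallel update.
    """
    N, L = unary.shape
    # Initialise Q_x(l) from the unaries via a softmax over labels.
    Q = np.exp(-unary)
    Q /= Q.sum(axis=1, keepdims=True)
    for _ in range(n_iters):
        # Expected pairwise cost from all other sites, for every site at once.
        msg = (Q.sum(axis=0, keepdims=True) - Q) @ pairwise
        Q_new = np.exp(-unary - msg)
        Q_new /= Q_new.sum(axis=1, keepdims=True)
        # Damped fixed-point step.
        Q = (1.0 - alpha) * Q + alpha * Q_new
    return Q
```

On a small Potts-style instance the marginals stay normalised and follow the dominant unaries, while the damping keeps the simultaneous updates from oscillating.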
“…Finally, it is worth mentioning that total-variation (TV) terms can be viewed as the continuous counterpart of Potts regularization, and were the subject of a large number of vision works in recent years [7,23,24,30]. Recently, there have been significant research efforts focusing on designing parallel (or distributed) formulations for optimizing pairwise functions [1,19,25]. Distributing computations would be beneficial not only to high-resolution images and massive 3D grids but also to difficult high-order models [11,14,26], which require approximate solutions solving a large number of problems of the form (1).…”
Section: Introduction
confidence: 99%
“…Unfortunately such techniques are restricted to TV regularization terms. Mean-field inference techniques [1,16] also attracted significant attention recently as they can be parallelized, albeit at the cost of convergence guarantees [1].…”
Section: Introduction
confidence: 99%
“…While the initial work by [19] used a version of mean-field that is not guaranteed to converge, their follow-up paper [20] proposed a convergent mean-field algorithm for negative semi-definite label compatibility functions. Recently, Baqué et al. [4] presented a new algorithm that has convergence guarantees in the general case. Vineet et al. [34] extended the mean-field model to allow the addition of higher-order terms on top of the dense pairwise potentials, enabling the use of co-occurrence potentials [24] and P^n-Potts models [15].…”
confidence: 99%
“…Our work can be viewed as a complementary direction to previous research trends in dense CRFs. While [4,20,34] improved mean-field and [29,38] learnt the parameters, we focus on the energy minimisation problem.…”
confidence: 99%