2018
DOI: 10.1137/17m1150669
Image Labeling Based on Graphical Models Using Wasserstein Messages and Geometric Assignment

Abstract: We introduce a novel approach to Maximum A Posteriori inference based on discrete graphical models. By utilizing local Wasserstein distances for coupling assignment measures across edges of the underlying graph, a given discrete objective function is smoothly approximated and restricted to the assignment manifold. A corresponding multiplicative update scheme combines in a single process (i) geometric integration of the resulting Riemannian gradient flow and (ii) rounding to integral solutions that represent va…
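The multiplicative update scheme the abstract refers to is, generically, a replicator-type flow on the probability simplex: exponentiate a driving field, multiply into the current assignment, and renormalize, so that iterates stay in the interior of the simplex while drifting toward integral (vertex) solutions. The following toy sketch illustrates only this generic mechanism; `S`, `tau`, and the explicit-Euler-style renormalized step are illustrative choices, not the paper's actual scheme:

```python
import numpy as np

def replicator_step(W, S, tau=0.1):
    """One multiplicative (replicator-style) update on the simplex.

    W   : (n, k) array; each row is an assignment vector on the simplex
    S   : (n, k) array of similarity/fitness values driving the flow
    tau : step size of the geometric integration (illustrative)
    """
    # Multiplicative update: exponentiate the driving field, then renormalize
    # so every row remains a discrete probability distribution.
    W_new = W * np.exp(tau * S)
    return W_new / W_new.sum(axis=1, keepdims=True)

# Toy example: 2 nodes, 3 labels, uniform initialization (simplex barycenter).
W = np.full((2, 3), 1.0 / 3.0)
S = np.array([[2.0, 0.5, 0.1],
              [0.2, 1.5, 0.3]])
for _ in range(50):
    W = replicator_step(W, S)
# Rows converge toward simplex vertices, i.e. integral label assignments.
```

Rounding here is implicit: iterating the multiplicative update concentrates each row's mass on its dominant label, which mirrors the "single process" combination of integration and rounding described in the abstract.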

Cited by 12 publications (15 citation statements). References 50 publications.
“…Each label g_j represents the data of class j. Image labeling denotes the problem of assigning class labels to image data depending on the local context encoded by the graph G. We refer to [HSÅS18] for more details and background on the image labeling problem. Assignments of labels to data are represented by discrete probability distributions…”
Section: The Assignment Flow
mentioning, confidence: 99%
“…By contrast, the assignment flow provides a smooth dynamical system on a graph (network), where all ingredients coherently fit into the overall mathematical framework. Based on this, we recently showed how discrete graphical models for image labeling can be evaluated using the assignment flow [HSÅS18], and how unsupervised labeling can be modeled by coupling the assignment flow and Riemannian gradient flows for label evolution on feature manifolds [ZZr + 18]. Our current work, to be reported elsewhere, studies machine learning problems based on controlling the assignment flow.…”
mentioning, confidence: 98%
“…Each label j represents the data of class j. Image labeling denotes the problem of finding an assignment V → X assigning class labels to nodes depending on the image data F V and the local context encoded by the graph structure G. We refer to [HSÅS18] for more details and background on the image labeling problem. G may be a grid graph (with self-loops) as in low-level image processing or a less structured graph, with arbitrary connectivity in terms of the neighborhoods…”
Section: Image Labeling Using Geometric Assignment
mentioning, confidence: 99%
“…In comparison with discrete graphical models, an antipodal viewpoint was adopted by [ÅPSS17] for the design of the assignment flow approach: rather than performing nonsmooth convex outer relaxation and programming, followed by subsequent rounding to integral solutions, as is common when working with large-scale discrete graphical models, the assignment flow provides a smooth nonconvex interior relaxation that performs rounding to integral solutions simultaneously. In [HSÅS18] it was shown that the assignment flow can emulate a given discrete graphical model in terms of smoothed local Wasserstein distances that evaluate the edge-based parameters of the graphical model. In comparison with established belief propagation iterations [YFW05, WJW05], the assignment flow driven by ‘Wasserstein messages’ [HSÅS18] continuously takes basic constraints into account, which enables computing good suboptimal solutions simply by numerically integrating the flow in a proper way [ZSPS18].…”
mentioning, confidence: 99%
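The “smoothed local Wasserstein distances” mentioned in the excerpt above are, in general optimal-transport terms, entropy-regularized transport costs, which are classically computed by Sinkhorn iterations. A minimal generic sketch of that smoothing (not the paper's implementation; function names, `eps`, and iteration count are illustrative assumptions):

```python
import numpy as np

def sinkhorn(mu, nu, C, eps=0.5, iters=500):
    """Entropy-regularized optimal transport via Sinkhorn iterations.

    mu, nu : 1-D marginal distributions (each summing to 1)
    C      : cost matrix between the supports of mu and nu
    eps    : entropic regularization, i.e. the smoothing strength
    Returns the transport plan P whose marginals approximate mu and nu.
    """
    K = np.exp(-C / eps)          # Gibbs kernel of the cost
    u = np.ones_like(mu)
    for _ in range(iters):
        # Alternate scaling so P = diag(u) K diag(v) matches both marginals.
        v = nu / (K.T @ u)
        u = mu / (K @ v)
    return u[:, None] * K * v[None, :]

# Toy marginals over two labels with a 0/1 cost between distinct labels.
mu = np.array([0.5, 0.5])
nu = np.array([0.7, 0.3])
C = np.array([[0.0, 1.0],
              [1.0, 0.0]])
P = sinkhorn(mu, nu, C)
# Row sums of P approximate mu; column sums approximate nu.
```

The regularized cost `eps`-smooths the nonsmooth Wasserstein distance, which is what makes a gradient-flow treatment of the coupled objective possible in the first place; the excerpt's contrast with belief propagation hinges on this smoothness.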