2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition
DOI: 10.1109/cvpr.2018.00174
Discrete-Continuous ADMM for Transductive Inference in Higher-Order MRFs

Abstract: This paper introduces a novel algorithm for transductive inference in higher-order MRFs, where the unary energies are parameterized by a variable classifier. The considered task is posed as a joint optimization problem in the continuous classifier parameters and the discrete label variables. In contrast to prior approaches such as convex relaxations, we propose an advantageous decoupling of the objective function into discrete and continuous subproblems and a novel, efficient optimization method related to ADM…
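The decoupling the abstract describes — alternating between a continuous subproblem in the classifier parameters and a discrete subproblem in the label variables — can be illustrated with a toy alternating-minimization sketch. This is a hedged illustration only: the linear least-squares classifier, the chain MRF with a Potts pairwise term solved by dynamic programming, and the penalty weight `lam` are all assumptions for demonstration, not the paper's ADMM scheme.

```python
import numpy as np

# Toy joint objective E(w, y) = sum_i (w^T x_i - y_i)^2 + lam * sum_(i,i+1) [y_i != y_(i+1)],
# minimized by alternating a continuous step (fit w) and a discrete step (relabel y).
rng = np.random.default_rng(0)
n, d = 20, 3
X = rng.normal(size=(n, d))
w_true = np.array([1.0, -2.0, 0.5])
y = (X @ w_true > 0).astype(float)  # initial labels from a noisy classifier

lam = 0.1
for it in range(10):
    # Continuous subproblem: least-squares fit of w given the current labels y.
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    scores = X @ w
    # Discrete subproblem: exact dynamic programming (Viterbi) over a chain
    # with unary cost (scores[i] - k)^2, k in {0, 1}, and a Potts pairwise term.
    unary = np.stack([(scores - 0.0) ** 2, (scores - 1.0) ** 2], axis=1)
    dp = unary.copy()
    back = np.zeros((n, 2), dtype=int)
    for i in range(1, n):
        for k in (0, 1):
            trans = dp[i - 1] + lam * (np.arange(2) != k)
            back[i, k] = int(np.argmin(trans))
            dp[i, k] = unary[i, k] + trans[back[i, k]]
    # Backtrack the optimal labeling.
    y_new = np.zeros(n)
    k = int(np.argmin(dp[-1]))
    for i in range(n - 1, -1, -1):
        y_new[i] = k
        k = back[i, k]
    if np.array_equal(y_new, y):
        break  # labels stable: alternation has converged
    y = y_new

print(y)
```

Each half-step can only decrease the joint energy, so the alternation converges; the paper's contribution, per the abstract, is an ADMM-style scheme for this decoupling rather than plain alternation.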

Cited by 10 publications (6 citation statements); References 56 publications.
“…Discrete optimization is one of the hottest topics in mathematics and is widely used to solve computer vision problems (Kim et al. 2017; Laude et al. 2018). In this paper, we propose a new discrete back propagation algorithm, where a projection function is exploited to binarize or quantize the input variables in a unified framework.…”
Section: Discrete Back Propagation via Projection
Classification: mentioning (confidence: 99%)
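The projection function this excerpt mentions — binarizing or quantizing variables in a unified framework — is commonly realized as a sign projection in the forward pass paired with a straight-through gradient estimator in the backward pass. A minimal sketch under that assumption (the helper names are illustrative, not taken from the cited paper):

```python
import numpy as np

def project(w):
    """Project real-valued weights onto the discrete set {-1, +1}."""
    return np.where(w >= 0, 1.0, -1.0)

def straight_through_grad(w, grad_out):
    """Straight-through estimator: pass the gradient through unchanged
    where |w| <= 1, and zero it outside that range."""
    return grad_out * (np.abs(w) <= 1.0)

w = np.array([0.3, -0.7, 1.5, -0.1])
wb = project(w)
print(wb)  # -> [ 1. -1.  1. -1.]
g = straight_through_grad(w, np.ones_like(w))
print(g)   # -> [1. 1. 0. 1.]
```

The clipping in the backward pass prevents gradients from growing weights that the projection can no longer affect, which is one common heuristic in binarized-network training.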
“…(1) The binarization of CNNs could essentially be solved via discrete optimization, but this has long been neglected in previous work. Discrete optimization methods can often provide strong guarantees about the quality of the solutions they find and lead to much better performance in practice (Felzenszwalb and Zabih 2007; Kim et al. 2017; Laude et al. 2018).…”
Section: Introduction
Classification: mentioning (confidence: 99%)
“…An optimization process using a PGM involves defining an energy function, which can embed contextual information about the data, i.e., relationships among data points at different resolutions, scales, or time steps (prior knowledge). PGMs can be optimized by targeting the maximization of the joint probability (or the minimization of the energy function) of the graph (Laude et al, 2018), which would result in specific river network predictions. Although classes of PGMs (e.g., Bayesian network models) have been used for water quality applications (Section 2), newer models that integrate PGM with DL such as DBN have been used for water quality predictions (Solanki et al, 2015; Yan et al, 2020).…”
Section: Opportunities For Advancement Of Water Quality ML
Classification: mentioning (confidence: 99%)
“…ADMM for incorporating high-order segmentation priors on the target region's histogram of intensities [26] or compactness [17]. More recently, similar techniques have been proposed to include higher-order priors directly in the learning process [27,28]. To our knowledge, our work is the first employing a discrete-continuous framework for weakly-supervised segmentation.…”
Section: CRF-regularized Proposal
Classification: mentioning (confidence: 99%)