2022
DOI: 10.1609/aaai.v36i9.21164
Pretrained Cost Model for Distributed Constraint Optimization Problems

Abstract: Distributed Constraint Optimization Problems (DCOPs) are an important subclass of combinatorial optimization problems, where information and controls are distributed among multiple autonomous agents. Previously, Machine Learning (ML) has been largely applied to solve combinatorial optimization problems by learning effective heuristics. However, existing ML-based heuristic methods are often not generalizable to different search algorithms. Most importantly, these methods usually require full knowledge about the…

Cited by 3 publications (2 citation statements) · References 32 publications
“…(1) DBP with a damping factor of 0.9 and its splitting constraint factor graph version (DBP-SCFG) with a splitting ratio of 0.95 [36]; (2) GAT-PCM-LNS with a destroy probability of 0.2 [170], which is a local search method combining the LNS framework [82,122] with neural-learned repair heuristics; (3) Mini-bucket Elimination (MBE) with an i-bound of 9 [194], which is a memory-bounded inference algorithm; (4) Toulbar2 with a timeout of 1200s [195], which is a highly optimized exact solver written in C++. The hyperparameters for DBP and GAT-PCM-LNS are set according to the original papers, while the memory budget for MBE and the timeout for Toulbar2 are set based on our computational resources.…”
Section: Empirical Evaluations
confidence: 99%
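The damping factor of 0.9 quoted above refers to the standard damped message update in belief propagation, where each outgoing message is a convex combination of the previous message and the freshly computed one. A minimal sketch of that update (assuming the common convention in which the damping factor weights the *previous* message; the cited papers may use the opposite convention):

```python
import numpy as np

def damped_update(old_msg, new_msg, damping=0.9):
    """Damped message update as used in Damped BP (DBP):
    blend the previous message with the newly computed one.
    damping=0.9 mirrors the factor quoted in the text; a higher
    value changes messages more slowly, which can stabilize BP
    on loopy factor graphs."""
    return damping * old_msg + (1.0 - damping) * new_msg

# Example: a message over a 3-value domain barely moves per iteration
old = np.array([1.0, 2.0, 3.0])
new = np.array([3.0, 2.0, 1.0])
blended = damped_update(old, new)  # 0.9*old + 0.1*new
```

With damping 0.9 the blended message stays close to the old one, so repeated updates converge gradually rather than oscillating.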
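GAT-PCM-LNS, mentioned above, follows the Large Neighborhood Search pattern: repeatedly "destroy" part of the current assignment (each variable selected with the destroy probability, 0.2 in the quoted setup) and ask a repair heuristic to reassign the destroyed variables, keeping improving results. A minimal, generic LNS skeleton (a sketch only; the actual method uses a neural-learned repair heuristic, and the function names here are illustrative):

```python
import random

def lns(initial, cost, repair, destroy_prob=0.2, iters=100, seed=0):
    """Generic Large Neighborhood Search loop (not the GAT-PCM-LNS
    implementation). Each iteration destroys a random subset of
    variables, with each variable chosen independently with
    probability destroy_prob (0.2 mirrors the quoted setting),
    then calls repair(assignment, destroyed_vars) to reassign
    them, accepting the candidate if cost does not worsen."""
    rng = random.Random(seed)
    best = dict(initial)
    best_cost = cost(best)
    for _ in range(iters):
        destroyed = [v for v in best if rng.random() < destroy_prob]
        candidate = repair(best, destroyed)
        c = cost(candidate)
        if c <= best_cost:
            best, best_cost = candidate, c
    return best, best_cost

# Toy usage: minimize the sum of binary variables; the "repair"
# heuristic here trivially zeroes the destroyed variables.
def toy_cost(a):
    return sum(a.values())

def toy_repair(a, destroyed):
    b = dict(a)
    for v in destroyed:
        b[v] = 0
    return b

best, best_cost = lns({i: 1 for i in range(5)}, toy_cost, toy_repair, iters=200)
```

The repair heuristic is where the learned component plugs in; the surrounding loop is unchanged.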
“…The method is referred to as "Damped Max-sum" in [36]. We use Damped BP for a coherent presentation. This chapter has been published in [186].…”
confidence: 99%