2016
DOI: 10.1007/s10472-016-9515-9
Improving configuration checking for satisfiable random k-SAT instances

Cited by 17 publications (8 citation statements)
References 12 publications
“…The third category of variants of CC is the largest one. Many CC rules have been added, based on analyses of different problems, to obtain better performance, such as NCCA+ (Abramé, Habet, and Toumi 2017) for SAT, SCC (Wang, Cai, and Yin 2016) for MWCP, HC-SCC (Luo et al. 2017) for the weighted partial maximum satisfiability problem, and CCA (Cai and Su 2012; Li et al. 2018) for the minimum weighted vertex cover problem, etc.…”
Section: Related Work
confidence: 99%
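For context, the basic configuration checking (CC) rule that these variants build on forbids flipping a variable whose neighborhood assignment (its "configuration") has not changed since its last flip. A minimal sketch, assuming a CNF formula given as a list of clauses (lists of signed integers) and a 0/1 assignment; helper names such as `build_neighbors` and `conf_changed` are illustrative and not taken from any of the cited solvers:

```python
def build_neighbors(clauses, num_vars):
    """Two variables are neighbors if they co-occur in some clause."""
    neighbors = {v: set() for v in range(1, num_vars + 1)}
    for clause in clauses:
        vs = [abs(lit) for lit in clause]
        for v in vs:
            neighbors[v].update(u for u in vs if u != v)
    return neighbors

def flip(var, assignment, neighbors, conf_changed):
    """Flip `var`; under CC, only its neighbors become eligible again."""
    assignment[var] ^= 1
    conf_changed[var] = False          # var's own configuration is now "seen"
    for u in neighbors[var]:
        conf_changed[u] = True         # the neighbors' configurations changed

def cc_candidates(variables, conf_changed):
    """Only configuration-changed variables may be picked for flipping."""
    return [v for v in variables if conf_changed[v]]
```

The CC variants cited above differ mainly in how this eligibility rule is defined and combined with other scoring heuristics for each specific problem.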
“…Then, it attempts to find a solution by iteratively modifying this assignment, flipping the Boolean value of one variable at a time (changing its value from false to true, or from true to false) according to a variable selection heuristic, until a solution is found or a timeout is reached. SLS algorithms for SAT differ in the heuristic used to choose which variable to flip (for examples, see the literature 34‐36 ). Below we briefly overview some popular SLS algorithms, with justifications for why they were selected for further improvement.…”
Section: Preliminaries
confidence: 99%
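The generic SLS loop described in this statement can be illustrated with a minimal sketch: start from a random assignment and flip one variable per iteration until the formula is satisfied or a flip budget runs out. The variable-selection step below is a placeholder random pick, not one of the heuristics from the cited papers:

```python
import random

def unsat_clauses(clauses, assignment):
    """Return the clauses not satisfied by the current assignment."""
    sat = lambda lit: assignment[abs(lit)] == (lit > 0)
    return [c for c in clauses if not any(sat(lit) for lit in c)]

def sls_solve(clauses, num_vars, max_flips=100_000, seed=0):
    rng = random.Random(seed)
    assignment = {v: rng.choice([True, False]) for v in range(1, num_vars + 1)}
    for _ in range(max_flips):
        unsat = unsat_clauses(clauses, assignment)
        if not unsat:
            return assignment                  # a solution has been found
        clause = rng.choice(unsat)             # focus on an unsatisfied clause
        var = abs(rng.choice(clause))          # placeholder heuristic: random pick
        assignment[var] = not assignment[var]  # flip its truth value
    return None                                # timeout: flip budget exhausted
```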
“…As the MAX-SAT problem is very closely related to the SAT problem, one could adapt effective local search strategies for SAT, such as random walk [32], promising decreasing variable picking [23] and configuration checking (CC) [2,11,13], to solve the MAX-SAT problem. Unfortunately, making these adaptations effective for MAX-SAT is highly non-trivial because of the difference between SAT and MAX-SAT.…”
Section: Introduction
confidence: 99%
“…In order to make the Path-Relinking method competitive for the MAX-SAT, we identify two drawbacks in the previous Path-Relinking algorithm proposed in [16]: (1) complete trajectories between the two elite solutions are constructed, regardless of the quality of the solutions along these trajectories, so that many search steps are spent exploring low-quality solutions; (2) the search is not sufficiently diversified, because Path-Relinking is only used to intensify the search around the solutions produced by a GRASP (Greedy Randomized Adaptive Search Procedure) heuristic. Consequently, we propose an effective local search algorithm for the MAX-SAT called IPBMR (Iterated Path-Breaking with Mutation and Restart) to remedy these two drawbacks: (1) we establish a condition for breaking the construction of a trajectory between two elite solutions, allowing the search to focus only on high-quality solutions; (2) we randomize the construction of the trajectories between two elite solutions, and if the search falls into a local optimum, we perform weak mutations followed by strong mutations that randomly flip a subset of the variables of the local optimum in order to further diversify the search; (3) we restart [8] the search to explore new regions of the search space if the mutations do not improve the local minimum solution.…”
Section: Introduction
confidence: 99%
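The trajectory-breaking idea described in this statement can be sketched as follows, assuming the MAX-SAT objective is the number of unsatisfied clauses: walk from one elite assignment toward another by flipping the variables on which they differ, in a randomized order, and abandon the trajectory as soon as its quality drops too far below the best cost seen. The break threshold (`slack`) and the helper names are illustrative assumptions, not the exact rule used by IPBMR:

```python
import random

def cost(clauses, assignment):
    """MAX-SAT objective: number of unsatisfied clauses (lower is better)."""
    sat = lambda lit: assignment[abs(lit)] == (lit > 0)
    return sum(1 for c in clauses if not any(sat(lit) for lit in c))

def path_break(clauses, start, guide, slack=2, seed=0):
    rng = random.Random(seed)
    current = dict(start)
    best, best_cost = dict(start), cost(clauses, start)
    diff = [v for v in start if start[v] != guide[v]]
    rng.shuffle(diff)                       # randomized trajectory construction
    for v in diff:
        current[v] = guide[v]               # one relinking step toward the guide
        c = cost(clauses, current)
        if c < best_cost:
            best, best_cost = dict(current), c
        elif c > best_cost + slack:
            break                           # break the trajectory: quality too low
    return best, best_cost
```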