Recent results establish for the hard-core model (and more generally for 2-spin antiferromagnetic systems) that the computational complexity of approximating the partition function on graphs of maximum degree ∆ undergoes a phase transition that coincides with the uniqueness/nonuniqueness phase transition on the infinite ∆-regular tree. For the ferromagnetic Potts model we investigate whether analogous hardness results hold. Goldberg and Jerrum showed that approximating the partition function of the ferromagnetic Potts model is at least as hard as approximating the number of independent sets in bipartite graphs, so-called #BIS-hardness. We improve this hardness result by establishing it for bipartite graphs of maximum degree ∆. To this end, we first present a detailed picture of the phase diagram for the infinite ∆-regular tree, giving a refined picture of its first-order phase transition and establishing the critical temperature for the coexistence of the disordered and ordered phases. We then prove, for all temperatures below this critical temperature (corresponding to the region where the ordered phase "dominates"), that it is #BIS-hard to approximate the partition function on bipartite graphs of maximum degree ∆. The #BIS-hardness result uses random bipartite regular graphs as a gadget in the reduction. The analysis of these random graphs relies on recent results establishing connections between the maxima of the expectation of their partition function, attractive fixpoints of the associated tree recursions, and induced matrix norms. In this paper we extend these connections to random regular graphs for all ferromagnetic models. Using these connections, we establish the Bethe prediction for every ferromagnetic spin system on random regular graphs, which says roughly that the expectation of the log of the partition function Z is the same as the log of the expectation of Z. As a further consequence of our results, we prove for the ferromagnetic Potts model that the Swendsen-Wang algorithm is torpidly mixing (i.e., converges exponentially slowly to its stationary distribution) on random ∆-regular graphs at the critical temperature, for sufficiently large q.
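For concreteness, the quantity whose approximation is shown #BIS-hard above is the ferromagnetic Potts partition function. The following minimal Python sketch (illustrative only, not taken from the paper; the graph, q, and β are arbitrary choices) computes it by brute force on a tiny graph:

```python
import math
from itertools import product

def potts_Z(n, edges, q, beta):
    """Ferromagnetic q-state Potts partition function, by brute force:
    Z = sum over colorings sigma in [q]^n of exp(beta * m(sigma)),
    where m(sigma) counts monochromatic edges and beta > 0 is the
    ferromagnetic regime. Runs in time ~q^n, so tiny graphs only."""
    Z = 0.0
    for sigma in product(range(q), repeat=n):
        mono = sum(1 for u, v in edges if sigma[u] == sigma[v])
        Z += math.exp(beta * mono)
    return Z

# Example: the 4-cycle with q = 3 colors at inverse temperature beta = 0.5.
print(potts_Z(4, [(0, 1), (1, 2), (2, 3), (3, 0)], q=3, beta=0.5))
```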
The hard-core model has received much attention in the past couple of decades as a lattice gas model with hard constraints in statistical physics, as a multicast model of calls in communication networks, and as a weighted independent set problem in combinatorics, probability, and theoretical computer science. In this model, each independent set I in a graph G is weighted proportionally to λ^|I|, for a positive real parameter λ. For large λ, computing the partition function (namely, the normalizing constant which makes the weighting a probability distribution on a finite graph) on graphs of maximum degree ∆ ≥ 3 is a well-known computationally challenging problem. More concretely, let λc(T∆) denote the critical value for the so-called uniqueness threshold of the hard-core model on the infinite ∆-regular tree; recent breakthrough results of Dror Weitz (2006) and Allan Sly (2010) have identified λc(T∆) as a threshold where the hardness of estimating the above partition function undergoes a computational transition. We focus on the well-studied particular case of the square lattice Z², and provide a new lower bound for the uniqueness threshold, in particular taking it well above λc(T4). Our technique refines and builds on the tree of self-avoiding walks approach of Weitz, resulting in a new technical sufficient criterion (of wider applicability) for establishing strong spatial mixing (and hence uniqueness) for the hard-core model. Our new criterion achieves better bounds on strong spatial mixing when the graph has extra structure, improving upon what can be achieved by just using the maximum degree. Applying our technique to Z², we prove that strong spatial mixing holds for all λ < 2.3882, improving upon the work of Weitz that held for λ < 27/16 = 1.6875. Our results imply a fully polynomial-time deterministic approximation algorithm for estimating the partition function, as well as rapid mixing of the associated Glauber dynamics to sample from the hard-core distribution.
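As a point of reference, the partition function in question is Z_G(λ) = Σ_I λ^|I|, summing over independent sets I of G. A minimal brute-force sketch (illustrative, not from the paper) follows; it enumerates vertex subsets as bitmasks and is exponential in the number of vertices:

```python
def hardcore_Z(n, edges, lam):
    """Hard-core partition function Z_G(lambda) = sum over independent
    sets I of lam^|I|, by enumerating all 2^n vertex subsets."""
    Z = 0.0
    for mask in range(1 << n):
        # An independent set contains no edge with both endpoints selected.
        if all(not ((mask >> u) & 1 and (mask >> v) & 1) for u, v in edges):
            Z += lam ** bin(mask).count("1")
    return Z

# Example: the 4-cycle at lambda = 1 counts its independent sets (7).
print(hardcore_Z(4, [(0, 1), (1, 2), (2, 3), (3, 0)], lam=1.0))
```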
We study the computational complexity of approximately counting the number of independent sets of a graph with maximum degree ∆. More generally, for an input graph G = (V, E) and an activity λ > 0, we are interested in the quantity ZG(λ) defined as the sum over independent sets I weighted as w(I) = λ^|I|. In statistical physics, ZG(λ) is the partition function for the hard-core model, which is an idealized model of a gas where the particles have non-negligible size. Recently, an interesting phase transition was shown to occur for the complexity of approximating the partition function. Weitz showed an FPTAS for the partition function for any graph of maximum degree ∆ when ∆ is constant and λ < λc(T∆) := (∆−1)^(∆−1)/(∆−2)^∆. The quantity λc(T∆) is the critical point for the so-called uniqueness threshold on the infinite, regular tree of degree ∆. On the other side, Sly proved that there does not exist an efficient (randomized) approximation algorithm for λc(T∆) < λ < λc(T∆) + ε(∆), unless NP=RP, for some function ε(∆) > 0. We remove the upper bound in the assumptions of Sly's result for ∆ ≠ 4, 5; that is, we show that there does not exist an efficient randomized approximation algorithm for all λ > λc(T∆) for ∆ = 3 and ∆ ≥ 6. Sly's inapproximability result uses a clever reduction, combined with a second-moment analysis of Mossel, Weitz and Wormald which proves torpid mixing of the Glauber dynamics for sampling from the associated Gibbs distribution on almost every regular graph of degree ∆ for the same range of λ as in Sly's result. We extend Sly's result by improving upon the technical work of Mossel et al., via a more detailed analysis of independent sets in random regular graphs.
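The threshold λc(T∆) above has a closed form, and the uniqueness/nonuniqueness transition it marks can be seen numerically through the hard-core tree recursion x ↦ λ/(1+x)^(∆−1): below λc(T∆) the iterates converge to the unique fixpoint, while above it the fixpoint becomes unstable and the iterates approach a two-cycle (corresponding to the even/odd boundary conditions). A small illustrative Python sketch (function names are ours):

```python
def lambda_c(D):
    """Uniqueness threshold on the infinite D-regular tree:
    lambda_c(T_D) = (D-1)^(D-1) / (D-2)^D; e.g. lambda_c(T_4) = 27/16."""
    return (D - 1) ** (D - 1) / (D - 2) ** D

def tree_iterates(lam, D, iters=200):
    """Iterate the hard-core tree recursion x -> lam / (1 + x)^(D - 1).
    For lam < lambda_c(T_D) the iterates converge to the unique fixpoint;
    for lam > lambda_c(T_D) they approach a two-cycle, reflecting
    nonuniqueness of the Gibbs measure on the tree."""
    x = lam
    for _ in range(iters):
        x = lam / (1 + x) ** (D - 1)
    return x

D = 6
print(lambda_c(D))  # ~0.7629 for D = 6
# Uniqueness regime: even and odd iteration counts agree.
print(tree_iterates(0.5, D), tree_iterates(0.5, D, iters=201))
# Nonuniqueness regime: even and odd iteration counts differ (two-cycle).
print(tree_iterates(1.5, D), tree_iterates(1.5, D, iters=201))
```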