2019
DOI: 10.48550/arxiv.1911.06943
Preprint
The Overlap Gap Property and Approximate Message Passing Algorithms for $p$-spin models

Abstract: We consider the algorithmic problem of finding a near ground state (near optimal solution) of a p-spin model. We show that for a class of algorithms broadly defined as Approximate Message Passing (AMP), the presence of the Overlap Gap Property (OGP), appropriately defined, is a barrier. We conjecture that when p ≥ 4 the model does indeed exhibit OGP (and prove it for the space of binary solutions). Assuming the validity of this conjecture, as an implication, the AMP fails to find near ground states in these models…
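As background for the abstract (standard conventions supplied here, not quoted from the paper), the pure p-spin Hamiltonian is usually defined, up to normalization, as

```latex
H_N(\sigma) \;=\; \frac{1}{N^{(p-1)/2}} \sum_{i_1,\dots,i_p=1}^{N} g_{i_1 \dots i_p}\, \sigma_{i_1} \cdots \sigma_{i_p},
\qquad g_{i_1 \dots i_p} \overset{\text{i.i.d.}}{\sim} \mathcal{N}(0,1).
```

Informally, OGP holds if there exist 0 ≤ ν₁ < ν₂ ≤ 1 such that every pair of near-optimal configurations σ¹, σ² has normalized overlap |⟨σ¹, σ²⟩|/N either below ν₁ or above ν₂, with no pairs in between; this "gap" in achievable overlaps is what obstructs iterative algorithms such as AMP.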

Cited by 8 publications (13 citation statements) | References 16 publications
“…Our main result is a simple proof that all near-maxima of H N on [−1, 1] N are close to a corner in {±1} N , confirming the conjecture of [GJ19]. Moreover we obtain an explicit quantitative dependence, though we do not expect it to be tight.…”
Section: Introduction (supporting)
confidence: 65%
“…In other words, to understand the set of near-maxima of H N on [−1, 1] N , it is in some sense sufficient to understand it on the discrete cube. Conditional on (an implication of) this result, [GJ19] prove that approximate message passing algorithms fail to approximately optimize pure p-spin models (γ_p ≠ 0 for exactly one value of p) over [−1, 1] N when p ≥ 4 is even. By contrast, for certain other mixture functions ξ satisfying a no-overlap-gap condition, approximate message passing yields the only known algorithm to efficiently locate a near-maximum of H N with high probability [Mon19, AMS20].…”
Section: Introduction (mentioning)
confidence: 90%
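As background for this citation (standard convention, not quoted from the citing paper), the mixture function ξ referred to above encodes the coefficients of a mixed p-spin Hamiltonian:

```latex
\xi(s) \;=\; \sum_{p \ge 2} \gamma_p^2\, s^p,
```

with the pure p-spin model corresponding to exactly one nonzero γ_p; the "no overlap gap" condition of [Mon19, AMS20] is a property of ξ.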
“…In a different direction, Gamarnik and co-authors showed in several examples that the existence of an overlap gap rules out a (1 − ε)-approximation for certain classes of polynomial-time algorithms [GS14, GS17, CGP+19]. In particular, the recent paper [GJ19] proves that approximate message passing algorithms (of the type studied in this paper) cannot achieve a (1 − ε)-approximation of the optimum in pure p-spin Ising models, under the assumption that these exhibit an overlap gap. However, [GJ19] does not characterize the optimal approximation ratio, which we instead do here, as a special case of our results.…”
Section: Introduction (mentioning)
confidence: 99%
“…Approximate Message Passing (AMP) algorithms are a general family of iterative algorithms that have seen widespread use in a variety of applications. First developed for compressed sensing in [DMM09,DMM10a,DMM10b], they have since been applied to many high-dimensional problems arising in statistics and machine learning, including Lasso estimation and sparse linear regression [BM11b,MAYB13], generalized linear models and phase retrieval [Ran11,SR14,SC19], robust linear regression [DM16], sparse or structured principal components analysis (PCA) [RF12, DM14, DMK + 16, MV17], group synchronization problems [PWBM18], deep learning [BS16, BSR17, MMB17], and optimization in spin glass models [Mon19,GJ19,AMS20].…”
Section: Introduction (mentioning)
confidence: 99%
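The AMP template this citation describes — a matrix multiplication, an entrywise denoiser, and an "Onsager" memory correction — can be sketched in a toy setting. The following is a minimal illustration, not the algorithm of any cited paper: AMP with a tanh denoiser on a rank-one spiked Wigner matrix, where the model, the choice of denoiser, and all parameter values (`n`, `lam`, `iters`) are assumptions made for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)
n, lam, iters = 2000, 2.0, 30

# Toy rank-one spiked Wigner model: Y = (lam/n) v v^T + W / sqrt(n),
# with W a symmetric Gaussian noise matrix and v a +/-1 spike.
v = rng.choice([-1.0, 1.0], size=n)
W = rng.normal(size=(n, n))
W = (W + W.T) / np.sqrt(2.0)
Y = (lam / n) * np.outer(v, v) + W / np.sqrt(n)

x_prev = np.zeros(n)
x = rng.normal(scale=0.1, size=n)  # weakly informative random start
for _ in range(iters):
    f, f_prev = np.tanh(x), np.tanh(x_prev)
    # Onsager correction: b_t = (1/n) * sum_i f'(x_i^t), with f = tanh
    onsager = np.mean(1.0 - f ** 2)
    x, x_prev = Y @ f - onsager * f_prev, x

# Normalized overlap of the sign estimate with the planted spike, in [0, 1]
overlap = abs(np.dot(np.sign(x), v)) / n
```

The `- onsager * f_prev` term is what distinguishes AMP from plain power iteration with a nonlinearity: it cancels the correlation the iterate builds up with the noise matrix, which is what keeps the effective noise approximately Gaussian across iterations. At this signal-to-noise ratio (`lam = 2`, well above the spectral threshold), the final overlap is substantially better than random guessing.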