2020
DOI: 10.48550/arxiv.2010.14021
Preprint
Bridging Classical and Quantum with SDP initialized warm-starts for QAOA

Abstract: We study the Quantum Approximate Optimization Algorithm (QAOA) in the context of the Max-Cut problem. Near-term (noisy) quantum devices are only able to (accurately) execute QAOA at low circuit depths, while QAOA requires a relatively high circuit depth in order to "see" the whole graph [FGG20]. We introduce a classical pre-processing step that initializes QAOA with a biased superposition of all possible cuts in the graph, referred to as a warm-start. In particular, our initialization informs QAOA by a solution…
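For intuition, a "biased superposition of all possible cuts" of the kind described in the abstract can be written as a product state over the n vertices; the following is an illustrative sketch in standard warm-start notation, not a formula quoted from the paper:

$|\psi_0\rangle \;=\; \bigotimes_{i=1}^{n} \Big( \sqrt{1-c_i}\,|0\rangle \;+\; \sqrt{c_i}\,|1\rangle \Big), \qquad c_i \in [0,1],$

where $c_i$ is a relaxed assignment of vertex $i$ obtained from a classical (e.g. SDP-based) solution. Every amplitude is non-zero, so all cuts remain in superposition, but cuts close to the classical solution carry more weight; setting $c_i = 1/2$ for all $i$ recovers the uniform superposition $|+\rangle^{\otimes n}$ used by standard QAOA.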

Cited by 17 publications (33 citation statements) | References 16 publications
“…Another approach is to modify the QAOA ansatz. This includes introducing additional parameters within layers of QAOA [21], modifying the structure of the ansatz [22][23][24][25], the cost function [27], the objective function [28], and the circuit structure [26]. Such technological and algorithmic advances are likely necessary to reduce the number of layers or gates, and hence the accumulated noise, as QAOA scales to larger sizes.…”
Section: Discussion
confidence: 99%
“…These studies have shown some promising results, for example, with QAOA outperforming the conventional lower bound of the GW algorithm for MaxCut on some small instances [19,20]. There have also been a variety of proposed modifications to the algorithm to improve performance [21][22][23][24][25][26][27][28] and solve optimization problems with constraints [29][30][31]. The results from these and other studies have encouraged research into extending the QAOA to larger and more complex problems.…”
Section: Introduction
confidence: 99%
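For context, the "conventional lower bound" of the Goemans-Williamson (GW) algorithm mentioned above is its worst-case approximation guarantee of roughly 0.878: on any graph with non-negative edge weights, GW's SDP-plus-rounding procedure returns, in expectation, a cut of weight at least about 0.878 times the maximum cut.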
“…It has been discussed both theoretically and experimentally [6][7][8][9][10][11][12][13][14][15][16][17][18][19]. Variants of QAOA have also been explored [20][21][22][23][24][25][26][27]. Motivated by adiabatic evolution, QAOA uses a string of unitary evolution operators alternating between two Hamiltonian functions with time parameters that are optimized classically in order to maximize the cost function (equivalently, minimize the energy of the corresponding Hamiltonian).…”
Section: Introduction
confidence: 99%
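The alternating structure described above is the standard depth-$p$ QAOA ansatz; for reference (standard notation, not quoted from the citing paper):

$|\gamma,\beta\rangle \;=\; e^{-i\beta_p H_M}\, e^{-i\gamma_p H_C} \cdots e^{-i\beta_1 H_M}\, e^{-i\gamma_1 H_C}\, |s\rangle,$

where $H_C$ is the cost Hamiltonian encoding the objective (Max-Cut here), $H_M$ is the mixing Hamiltonian (conventionally $\sum_i X_i$), $|s\rangle$ is the initial state (the uniform superposition in standard QAOA, a warm-start state in this paper), and the $2p$ angles $(\gamma_1,\dots,\gamma_p,\beta_1,\dots,\beta_p)$ are tuned by a classical optimizer to extremize $\langle \gamma,\beta | H_C | \gamma,\beta \rangle$.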
“…In recent work, Tate et al. [4] and Egger et al. [5] explored using classical algorithms to specify the initial state for QAOA.…”
Section: Introduction
confidence: 99%
“…Following the classical optimization literature ([6][7][8]), we will refer to these classically-inspired initializations as warm starts. The approach by Tate et al. [4] considered warm-starting QAOA using rank-2 and rank-3 Burer-Monteiro locally optimal solutions for Max-Cut; however, their method plateaus even at low circuit depths of p = 1 for some initializations and is unable to improve the cut quality for some instances. Egger et al. [5] similarly considered warm starts for quadratic unconstrained binary optimization (QUBO) problems; however, they only focus on rank-1 solutions from classical linear programming relaxations for QUBO.…”
Section: Introduction
confidence: 99%
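For reference, the rank-$k$ Burer-Monteiro relaxation of Max-Cut mentioned above replaces each $\pm 1$ vertex variable with a unit vector in $\mathbb{R}^k$ (a standard formulation, included for context rather than taken from the citing paper):

$\max_{\substack{v_1,\dots,v_n \in \mathbb{R}^k \\ \|v_i\| = 1}} \;\; \sum_{(i,j) \in E} w_{ij}\, \frac{1 - \langle v_i, v_j \rangle}{2},$

where $w_{ij}$ are the edge weights. Taking $k = 1$ recovers the original binary Max-Cut problem, $k = 2$ and $k = 3$ give the low-rank warm-starts of Tate et al. [4], and $k = n$ yields the Goemans-Williamson SDP relaxation, whose solution can in turn be mapped onto a warm-start initial state.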