2021
DOI: 10.48550/arxiv.2107.10254
Preprint
Neural Fixed-Point Acceleration for Convex Optimization

Abstract: Fixed-point iterations are at the heart of numerical computing and are often a computational bottleneck in real-time applications, which typically need a fast solution of moderate accuracy. We present neural fixed-point acceleration, which combines ideas from meta-learning and classical acceleration methods to automatically learn to accelerate fixed-point problems drawn from a distribution. We apply our framework to SCS, the state-of-the-art solver for convex cone programming, and design models and loss …

Cited by 2 publications (4 citation statements)
References 22 publications (29 reference statements)
“…In addition to the development of the methodologies, another trend is to explore new applications in wireless networks. Besides the optimization problems mentioned before, learning-to-optimize techniques are expected to solve other problems involved in wireless communication for future research directions, including multi-objective optimization problems [245], bi-level optimization problems [246], conic programming [247], maximum-likelihood estimation problems [248], etc., in various emerging applications.…”
Section: Advanced Methodologies and Extended Applications of MOAs
confidence: 99%
“…Section 6.4.1 discusses models that integrate fixed-point computations into semi-amortized models. Venkataraman and Amos (2021) amortize convex cone programs by differentiating through the splitting cone solver (O'Donoghue et al., 2016), and Bai et al. (2022) amortize deep equilibrium models (Bai et al., 2019, 2020).…”
Section: Semi-amortized Models
confidence: 99%
“…Neural fixed-point acceleration (Venkataraman and Amos, 2021) proposes a semi-amortized method for computing fixed points and uses it for convex cone programming. Representing a latent state at time t with ĥ_t, they parameterize the initial iterate ŷ_0, ĥ_0 = init_θ(x) with an initialization model init_θ and perform the fixed-point computations…”
Section: Neural Fixed-Point Acceleration (NeuralFP) and Conic Optimization
confidence: 99%
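The semi-amortized scheme quoted above can be sketched as follows. This is a minimal illustration, not the paper's implementation: `init_theta` and `acc_theta` are hypothetical stand-ins for the learned initialization and acceleration models, and a simple cosine contraction replaces the SCS fixed-point map the paper actually accelerates.

```python
import numpy as np

def fixed_point_map(y):
    # Stand-in fixed-point map f(y) = cos(y); its unique fixed point
    # is the Dottie number, approximately 0.739085.
    return np.cos(y)

def init_theta(x):
    # Initialization model: maps problem data x to an initial iterate
    # y0 and latent state h0. A trivial stand-in for a learned network.
    return np.zeros_like(x), np.zeros_like(x)

def acc_theta(y, fy, h):
    # Acceleration model: given the iterate y, its image f(y), and the
    # latent state h, propose the next iterate and updated state.
    # Stand-in for a learned recurrent model: a damped, momentum-like update.
    h_next = 0.5 * h + (fy - y)
    y_next = fy + 0.1 * h_next
    return y_next, h_next

def neural_fixed_point(x, n_iters=50):
    # Semi-amortized loop: initialize from x, then repeatedly apply the
    # fixed-point map and let the acceleration model refine the iterate.
    y, h = init_theta(x)
    for _ in range(n_iters):
        fy = fixed_point_map(y)
        y, h = acc_theta(y, fy, h)
    return y

y_star = neural_fixed_point(np.array([1.0]))
```

In the paper, `init_theta` and `acc_theta` are trained over a distribution of problem instances x so that the iterates reach a target accuracy in fewer steps than the unaccelerated map; the hand-coded update here only illustrates the control flow.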