2020
DOI: 10.1137/18m1223915

Multigrid Optimization for Large-Scale Ptychographic Phase Retrieval

Abstract: Ptychography is a popular imaging technique that combines diffractive imaging with scanning microscopy. In this technique, a coherent beam is scanned across an object in a series of overlapping positions, leading to reliable and improved reconstructions. Ptychographic microscopes allow large fields to be imaged at high resolution at the cost of additional computational expense. In this work, we propose a multigrid-based optimization framework to reduce the computational burdens of large-scale p…
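
The abstract summarizes the measurement process; the sketch below simulates that forward model, multiplying a localized probe by the object patch at each overlapping scan position and recording a far-field diffraction intensity. The Gaussian probe, random-phase object, and scan grid are illustrative assumptions, not the paper's experimental setup.

```python
# Minimal sketch of a ptychographic forward model: a localized coherent probe
# is scanned across an object at overlapping positions, and a far-field
# diffraction intensity is recorded at each position. All sizes, the probe,
# and the object here are illustrative placeholders, not the paper's setup.
import numpy as np

rng = np.random.default_rng(0)

n_obj, n_probe, step = 128, 32, 16                             # object size, probe size, scan step
obj = np.exp(1j * rng.uniform(0, 2 * np.pi, (n_obj, n_obj)))   # toy complex-valued object
yy, xx = np.mgrid[:n_probe, :n_probe] - n_probe / 2
probe = np.exp(-(xx**2 + yy**2) / (2 * (n_probe / 6) ** 2))    # Gaussian probe (assumption)

positions = [(r, c)
             for r in range(0, n_obj - n_probe + 1, step)
             for c in range(0, n_obj - n_probe + 1, step)]
intensities = []
for r, c in positions:
    exit_wave = probe * obj[r:r + n_probe, c:c + n_probe]       # probe times object patch
    # far-field diffraction pattern: squared modulus of the Fourier transform
    intensities.append(np.abs(np.fft.fftshift(np.fft.fft2(exit_wave))) ** 2)

print(len(positions), "overlapping scan positions,",
      intensities[0].shape, "pixels per diffraction pattern")
```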

Cited by 17 publications (21 citation statements). References 48 publications.
“…the zero vector). To identify a faithful mapping N_Θ as in (7), we solve a training problem. This is modeled as finding weights that minimize an expected loss, which is typically solved using optimization methods like SGD [69] and Adam [70].…”
Section: Recurrent Neural Network (mentioning)
confidence: 99%
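
For concreteness, the sketch below sets up the kind of training problem this excerpt describes: the weights of a small network are found by minimizing an empirical approximation of the expected loss with Adam (or SGD). It assumes PyTorch; the architecture, synthetic data, and learning rates are placeholders, not the recurrent network of the citing paper.

```python
# Minimal training-loop sketch: minimize an empirically approximated expected
# loss with either SGD or Adam. Model, data, and hyperparameters are assumptions.
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

torch.manual_seed(0)
X = torch.randn(256, 10)                          # synthetic inputs (placeholder)
Y = torch.randn(256, 1)                           # synthetic targets (placeholder)
loader = DataLoader(TensorDataset(X, Y), batch_size=32, shuffle=True)

net = nn.Sequential(nn.Linear(10, 32), nn.Tanh(), nn.Linear(32, 1))
loss_fn = nn.MSELoss()

# either of the two optimizers mentioned in the excerpt can be used
optimizer = torch.optim.Adam(net.parameters(), lr=1e-3)
# optimizer = torch.optim.SGD(net.parameters(), lr=1e-2)

for epoch in range(100):
    for xb, yb in loader:
        optimizer.zero_grad()
        loss = loss_fn(net(xb), yb)               # mini-batch estimate of the expected loss
        loss.backward()
        optimizer.step()
```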
“…In practice, the expectation in (9) is approximated using a finite subset of data, which is referred to as training data. In addition to minimizing over the training data, we aim for (7) to hold for a set of testing data that was not used during training (which tests the network's ability to generalize).…”
Section: Recurrent Neural Network (mentioning)
confidence: 99%
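
The distinction drawn in this excerpt between training data (used to approximate the expectation) and held-out testing data (used to check generalization) can be illustrated with a small least-squares problem; the model, data split, and noise level below are illustrative assumptions, not those of the cited work.

```python
# Minimal sketch: fit on training data only, then evaluate on held-out test
# data as a proxy for generalization. NumPy only; the least-squares model,
# the 200/100 split, and the noise level are assumptions.
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(300, 20))
x_true = rng.normal(size=20)
b = A @ x_true + 0.1 * rng.normal(size=300)

# split into training and testing data
A_train, b_train = A[:200], b[:200]
A_test, b_test = A[200:], b[200:]

# fit on the training set only (finite-sample approximation of the expected loss)
x_hat = np.linalg.lstsq(A_train, b_train, rcond=None)[0]

train_loss = np.mean((A_train @ x_hat - b_train) ** 2)
test_loss = np.mean((A_test @ x_hat - b_test) ** 2)   # held-out data never used in the fit
print(f"train MSE {train_loss:.4f}  test MSE {test_loss:.4f}")
```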
“…(1.1) in a distributed manner, where f_j : R^n → R is smooth and convex, and g : R^n → R is proximable. Problems of the form (1.1) arise in many contexts, including machine learning [1,32,30,17], statistics [14], phase retrieval [9,2,6], geophysics [11,7], and image processing [26,13]. These problems often contain many samples, i.e., N is often very large, making the optimization computationally challenging.…”
Section: Introduction (mentioning)
confidence: 99%
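
A minimal sketch of the composite model this excerpt refers to, with smooth convex terms f_j and a proximable g, is given below. It uses plain proximal gradient descent on a least-squares-plus-ℓ1 instance as a stand-in; the data, step size, and regularizer are assumptions, and it does not reproduce the distributed algorithm of the citing work.

```python
# Minimal sketch of the composite problem: minimize (1/N) * sum_j f_j(x) + g(x),
# with each f_j smooth and convex and g proximable. Here f_j(x) = 0.5*(a_j^T x - b_j)^2
# and g(x) = lam*||x||_1, solved by proximal gradient descent (assumed instance).
import numpy as np

rng = np.random.default_rng(0)
N, n = 200, 50
A = rng.normal(size=(N, n))          # each row a_j defines one smooth term f_j
b = A @ (rng.normal(size=n) * (rng.random(n) < 0.2)) + 0.01 * rng.normal(size=N)
lam = 0.1
step = 1.0 / (np.linalg.norm(A, 2) ** 2 / N)   # 1 / Lipschitz constant of the smooth part

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1 (the sense in which g is 'proximable')."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

x = np.zeros(n)
for _ in range(500):
    grad = A.T @ (A @ x - b) / N               # gradient of (1/N) * sum_j f_j
    x = soft_threshold(x - step * grad, step * lam)

print("nonzeros in solution:", np.count_nonzero(np.abs(x) > 1e-6))
```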