2017
DOI: 10.1088/1361-6420/aa8d93
Projected regression method for solving Fredholm integral equations arising in the analytic continuation problem of quantum physics

Abstract: We present a supervised machine learning approach to the inversion of Fredholm integrals of the first kind as they arise, for example, in the analytic continuation problem of quantum many-body physics. The approach provides a natural regularization for the ill-conditioned inverse of the Fredholm kernel, as well as an efficient and stable treatment of constraints. The key observation is that the stability of the forward problem permits the construction of a large database of outputs for physically meaningful in…

Cited by 47 publications (43 citation statements)
References 31 publications
“…We have seen that the average spectrum method is not the parameter-free method suggested by the deceptively written functional integral (13): we have to choose a grid density ρ(x), which acts as a default model, and a number N of grid points, which acts as a regularization parameter. The reason for this is that the naive discretization (19) does not converge to a well-defined functional integral. Instead we have to sample the components of the models we are integrating over from distributions that are consistent for different discretizations.…”
Section: Discussion
confidence: 99%
“…The straightforward method for evaluating (19) is to perform a random walk in the space of non-negative vectors f, updating a single component, f_n → f_n′, at a time. Detailed balance is fulfilled if we sample f_n′ from the conditional distribution ∝ exp(−χ²(f; f_n′)/2) with…”
Section: A Components Sampling
confidence: 99%
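The component-wise sampling described in the excerpt above can be sketched in a few lines. Because χ² is quadratic in any single component f_n, the conditional distribution ∝ exp(−χ²/2) restricted to f_n ≥ 0 is a truncated Gaussian. The kernel, grid sizes, and noise level below are illustrative assumptions, not taken from the cited work:

```python
import numpy as np
from scipy.stats import truncnorm

rng = np.random.default_rng(0)

# Hypothetical toy forward problem G = K f + noise, with f >= 0
# (kernel, sizes, and noise level are illustrative choices).
N, M = 8, 20
x = np.linspace(0.1, 2.0, N)          # model grid
w = np.linspace(0.0, 3.0, M)          # data grid
K = 1.0 / (w[:, None] + x[None, :])   # a Fredholm-like kernel
f_true = np.exp(-(x - 1.0) ** 2 / 0.1)
sigma = 1e-3
G = K @ f_true + sigma * rng.normal(size=M)

def gibbs_sweep(f):
    """One sweep of component sampling: update each f_n from its
    conditional p(f_n | rest) ∝ exp(-chi^2(f)/2) on f_n >= 0.
    chi^2 is quadratic in f_n, so this is a truncated Gaussian."""
    for n in range(N):
        k_n = K[:, n]
        r = G - K @ f + k_n * f[n]          # residual with f_n removed
        var = sigma ** 2 / (k_n @ k_n)      # conditional variance
        mean = (k_n @ r) / (k_n @ k_n)      # conditional mean
        a = (0.0 - mean) / np.sqrt(var)     # truncation point f_n = 0
        f[n] = truncnorm.rvs(a, np.inf, loc=mean, scale=np.sqrt(var),
                             random_state=rng)
    return f

f = np.ones(N)
samples = []
for sweep in range(200):
    f = gibbs_sweep(f)
    if sweep >= 100:                        # discard burn-in
        samples.append(f.copy())
f_avg = np.mean(samples, axis=0)            # the averaged spectrum
```

Because every single-component update draws from the exact conditional, detailed balance holds automatically, which is the property the excerpt emphasizes.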
“…In that sense, our framework directly integrates the projection step employed in Ref. 17. We show that by applying standard training techniques, we are able to reach accuracy comparable with the MaxEnt approach while keeping lower standard deviations in the error distribution.…”
confidence: 87%
“…Fortunately, there has been extensive research in the last several years using artificial neural networks and other machine learning algorithms to attack such inverse problems [23][24][25]. The setting of such approaches is data-driven, meaning that a large number of forward problems are first solved in order to produce a set of training data.…”
Section: A Neural Network For the Inverse Problem
confidence: 99%
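The data-driven setting described above can be illustrated with a minimal sketch: solve many forward problems to build a training set, then fit a regression mapping noisy data back to the model. A plain ridge regression stands in here for the learned inverse map; the kernel, sample sizes, and noise level are illustrative assumptions, not details of the cited works:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy Fredholm-like forward kernel (illustrative, not from the paper).
N, M, n_train = 8, 20, 5000
x = np.linspace(0.1, 2.0, N)
w = np.linspace(0.0, 3.0, M)
K = 1.0 / (w[:, None] + x[None, :])

# Build the training database: random non-negative, normalized models
# pushed through the (stable) forward problem, plus synthetic noise.
F = rng.dirichlet(np.ones(N), size=n_train)           # (n_train, N)
sigma = 1e-3
G = F @ K.T + sigma * rng.normal(size=(n_train, M))   # (n_train, M)

# Fit a linear (ridge) regression from data G to models F; the noise
# level sets a natural scale for the regularization strength.
lam = n_train * sigma ** 2
W = np.linalg.solve(G.T @ G + lam * np.eye(M), G.T @ F)   # (M, N)

# Inverting fresh data is then a single matrix product.
f_test = rng.dirichlet(np.ones(N))
g_test = K @ f_test + sigma * rng.normal(size=M)
f_pred = g_test @ W
```

The training stage amortizes the cost: the expensive part (many forward solves plus the fit) happens once, after which each inversion is a cheap matrix product.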