2023
DOI: 10.1007/jhep03(2023)221
Multi-variable integration with a neural network

Abstract: In this article we present a method for automatic integration of parametric integrals over the unit hypercube using a neural network. The method fits a neural network to the primitive of the integrand using a loss function designed to minimize the difference between multiple derivatives of the network and the function to be integrated. We apply this method to two example integrals resulting from the sector decomposition of one-loop and two-loop scalar integrals. Our method can achieve per-mil and percent accuracy […]
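A minimal sketch of the approach described in the abstract, assuming a two-dimensional integrand on the unit square: a small PyTorch network plays the role of the primitive, the loss penalizes the difference between its mixed partial derivative and the integrand, and the integral is then read off from the network values at the hypercube corners by inclusion-exclusion. The toy integrand exp(-xy), the network architecture, and the training settings are illustrative assumptions, not the paper's actual configuration.

```python
import torch
import torch.nn as nn

# Illustrative integrand on the unit square (assumption, not taken from the paper)
def integrand(x, y):
    return torch.exp(-x * y)

# Small fully connected surrogate for the primitive F(x, y)
net = nn.Sequential(nn.Linear(2, 32), nn.Tanh(),
                    nn.Linear(32, 32), nn.Tanh(),
                    nn.Linear(32, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

for step in range(2000):
    x = torch.rand(256, 1, requires_grad=True)
    y = torch.rand(256, 1, requires_grad=True)
    F = net(torch.cat([x, y], dim=1))
    # Mixed partial derivative d^2F/(dx dy) via nested autograd calls
    dFdx = torch.autograd.grad(F.sum(), x, create_graph=True)[0]
    d2Fdxdy = torch.autograd.grad(dFdx.sum(), y, create_graph=True)[0]
    loss = ((d2Fdxdy - integrand(x, y)) ** 2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()

# Integral over [0,1]^2 from the learned primitive: F(1,1) - F(1,0) - F(0,1) + F(0,0)
corners = torch.tensor([[1., 1.], [1., 0.], [0., 1.], [0., 0.]])
signs = torch.tensor([1., -1., -1., 1.])
with torch.no_grad():
    estimate = (signs * net(corners).squeeze()).sum()
print(float(estimate))  # to be compared with the reference value of about 0.7966
```

In d dimensions the same construction uses the d-fold mixed derivative in the loss and a signed sum of the network values over the 2^d corners of the unit hypercube.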

Cited by 9 publications (9 citation statements) · References 20 publications
“…• The string all%n will use every iteration mod n. So if one specifies tot_iters=15 and cv_iters='all%3', then the iterations used will be [3,6,9,12]. • The previous result can be shifted by using all%n+b where b is the shift. So for tot_iters=15 and cv_iters='all%3+2', you'll get [2,5,8,11,14].…”
Section: Specifying Control Variate Iterations
confidence: 99%
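For readers who want to reproduce the cv_iters selection quoted above, here is a hedged stand-alone sketch that merely mimics the two examples in the quotation; the function name parse_cv_iters and the convention of skipping iteration 0 are illustrative assumptions, not the cited package's actual code.

```python
import re

def parse_cv_iters(spec: str, tot_iters: int) -> list[int]:
    """Illustrative parser for strings such as 'all%3' or 'all%3+2' (sketch only)."""
    m = re.fullmatch(r"all%(\d+)(?:\+(\d+))?", spec)
    if m is None:
        raise ValueError(f"unrecognised cv_iters specification: {spec!r}")
    n = int(m.group(1))
    shift = int(m.group(2) or 0)
    # Keep every iteration whose index has the requested residue mod n, skipping iteration 0
    return [i for i in range(1, tot_iters) if i % n == shift % n]

print(parse_cv_iters("all%3", 15))    # [3, 6, 9, 12]
print(parse_cv_iters("all%3+2", 15))  # [2, 5, 8, 11, 14]
```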
“…[2]. More recent techniques use quasirandom or low-discrepancy pointsets in a class of methods known as Quasi Monte Carlo [3,4], apply multigrid ideas in Multilevel Monte Carlo estimation [5], or leverage machine learning (ML) [6][7][8][9][10][11][12][13][14][15][16][17][18][19][20][21]. A parallel research thrust has been the synthesis of ideas, whereby one tries to apply two such techniques, for example, by using several control variates [22], combining antithetic variates and control variates [23,24], or combining control variates and adaptive importance sampling [25,26].…”
confidence: 99%
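Since the quotation above surveys variance-reduction techniques, a minimal control-variate example may help make the idea concrete: a quantity with known mean is subtracted from the integrand with a fitted coefficient, lowering the variance of the Monte Carlo estimate. The integrand exp(x) and the control variate x are arbitrary illustrative choices, unrelated to the cited works.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000
x = rng.random(N)

f = np.exp(x)   # samples of the integrand; the true mean over [0,1] is e - 1
g = x           # control variate with known mean 1/2

plain = f.mean()

# Estimate the (near-)optimal coefficient beta = Cov(f, g) / Var(g) from the same samples
beta = np.cov(f, g)[0, 1] / np.var(g)
cv = (f - beta * (g - 0.5)).mean()

print(plain, cv, np.e - 1.0)
```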
“…In references [13,14] a new approach has been proposed which can be utilized to circumvent this issue. These methods are based on using artificial neural networks (aNN) to build a surrogate model for the primitive of the integrand.…”
Section: Introduction
confidence: 99%
“…The precise knowledge of the amplitude structure can then be used to significantly improve the phase-space integration for a given process [9]. Generally, it is possible to improve numerical integration through neural networks by directly learning the primitive function [10], or using modified and enhanced implementations of importance sampling [11][12][13][14][15][16]. Technically, this promising approach encodes a change of integration variables in a normalizing flow [17] and then uses online training [18] while generating weighted phase space configurations, or weighted events.…”
Section: Introduction
confidence: 99%
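The importance-sampling strategy mentioned in this last quotation can be illustrated without any neural network: samples are drawn from a proposal density concentrated where the integrand is large, and each evaluation is reweighted by the inverse proposal density. The peaked toy integrand and the fixed Beta(0.5, 1) proposal below are illustrative assumptions; the cited approaches instead learn the change of variables with a normalizing flow.

```python
import numpy as np

rng = np.random.default_rng(1)

def f(x):
    # Toy integrand on [0,1] with a sharp peak at x = 0 (illustrative choice)
    return 1.0 / (x**2 + 1e-2)

N = 50_000

# Plain Monte Carlo with uniform samples
x_uniform = rng.random(N)
est_uniform = f(x_uniform).mean()

# Importance sampling: Beta(0.5, 1) proposal with density q(x) = 0.5 / sqrt(x)
x_beta = np.clip(rng.beta(0.5, 1.0, N), 1e-12, 1.0)
q = 0.5 / np.sqrt(x_beta)
est_is = (f(x_beta) / q).mean()

# Exact value for comparison: 10 * arctan(10)
print(est_uniform, est_is, 10.0 * np.arctan(10.0))
```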