2020
DOI: 10.48550/arxiv.2008.06545
Preprint

GANplifying Event Samples

Anja Butter,
Sascha Diefenbacher,
Gregor Kasieczka
et al.

Abstract: A critical question concerning generative networks applied to event generation in particle physics is whether the generated events add statistical precision beyond the training sample. We show for a simple example with increasing dimensionality how generative networks indeed amplify the training statistics. We quantify their impact through an amplification factor or equivalent numbers of sampled events.
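The "equivalent number of sampled events" can be illustrated with a toy sketch (our own construction, not the paper's benchmark): a kernel density estimate fitted on a small training set stands in for the generative network, and we compare the binned-density error of a large generated sample against the error of directly drawn samples of increasing size. The size at which the direct-sampling error matches the generated-sample error is the equivalent sample size; divided by the training size, it gives an amplification factor.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy truth: a 1D two-component Gaussian mixture.
def true_sample(n):
    comp = rng.random(n) < 0.5
    return np.where(comp, rng.normal(-2, 0.5, n), rng.normal(2, 1.0, n))

def binned_mse(events, edges, true_probs):
    """Mean squared deviation of binned fractions from the true bin probabilities."""
    counts, _ = np.histogram(events, bins=edges)
    fracs = counts / counts.sum()
    return np.mean((fracs - true_probs) ** 2)

edges = np.linspace(-5, 6, 21)
# Estimate the true bin probabilities from a very large reference sample.
ref = true_sample(2_000_000)
true_probs = np.histogram(ref, bins=edges)[0] / ref.size

# Small training set; a KDE stands in for the trained generative network.
train = true_sample(100)
def kde_sample(n, bandwidth=0.3):
    centers = rng.choice(train, size=n)
    return centers + rng.normal(0, bandwidth, n)

# Error of a large generated sample vs. error of direct samples of size m.
gen_mse = binned_mse(kde_sample(1_000_000), edges, true_probs)
for m in (100, 300, 1000, 3000):
    direct_mse = np.mean([binned_mse(true_sample(m), edges, true_probs)
                          for _ in range(50)])
    print(f"m={m:5d}  direct MSE={direct_mse:.2e}  generated MSE={gen_mse:.2e}")
```

The smoothing built into the model lets the generated sample beat the training set's own statistical error, which is the amplification effect the abstract quantifies; the crossover value of m where the direct MSE matches the generated MSE is the equivalent number of sampled events.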


Cited by 8 publications (13 citation statements)
References 47 publications
“…For the LHC, this can be seen for instance in the non-trivial uncertainty for an intermediate Breit-Wigner resonance. These results are another step in understanding GANplification patterns [4] and might even allow us to use INNs to extrapolate in phase space.…”
Section: Discussion
confidence: 82%
“…This confirms our general observation, that the (B)INN learns a functional form of the density in both directions, in close analogy to a fit. It also means that the uncertainty from the generative network training is not described by the simple statistical scaling we observed for simpler networks [63,64] and instead points towards a GANplification-like [4] pattern.…”
Section: Marginalizing Phase Space
confidence: 77%
“…Neural networks based data augmentation in high energy collision simulations have been addressed, for example, in Refs [53][54][55]…”
confidence: 99%
“…Naively, it seems that the answer must be no, based on the same reasoning that motivates the argument that a generative network cannot produce more information than exists in its statistically limited training data set. However, this argument fails to account for the implicit knowledge embedded in the architecture of the network, which can contribute information in the same manner as a functional fit [61]. A super-resolution network applied to LHC jets combines the information from the low-resolution image with QCD knowledge extracted from the training data, for instance the underlying theoretical principles of soft and collinear splittings combined with mass drop patterns.…”
confidence: 99%