2023
DOI: 10.21468/scipostphys.15.3.094
Two invertible networks for the matrix element method

Anja Butter,
Theo Heimel,
Till Martini
et al.

Abstract: The matrix element method is widely considered the ultimate LHC inference tool for small event numbers. We show how a combination of two conditional generative neural networks encodes the QCD radiation and detector effects without any simplifying assumptions, while keeping the computation of likelihoods for individual events numerically efficient. We illustrate our approach for the CP-violating phase of the top Yukawa coupling in associated Higgs and single-top production. Currently, the limiting factor for th…
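The per-event likelihoods mentioned in the abstract rest on the change-of-variables property of invertible (normalizing-flow) networks: an invertible map to a simple base density yields an exact, tractable log-likelihood. The sketch below illustrates only that generic mechanism with a single hypothetical affine layer conditioned on a context variable `c`; it is not the paper's two-network architecture, and `mu`/`sigma` are stand-ins for learned functions.

```python
import numpy as np

# Hypothetical learned conditioners of a single affine flow layer.
def mu(c):
    return 0.5 * c

def sigma(c):
    return 1.0 + 0.1 * c**2

def log_likelihood(x, c):
    """log p(x | c) via the change-of-variables formula:
    log p(x|c) = log N(z; 0, 1) + log |dz/dx|, with z = (x - mu(c)) / sigma(c).
    """
    z = (x - mu(c)) / sigma(c)
    log_base = -0.5 * (z**2 + np.log(2.0 * np.pi))  # standard-normal log density
    log_det = -np.log(sigma(c))                     # log-Jacobian of the affine map
    return log_base + log_det

# Sanity check: the conditional density integrates to one for any condition c.
xs = np.linspace(-20.0, 20.0, 20001)
integral = np.trapz(np.exp(log_likelihood(xs, c=2.0)), xs)
print(round(integral, 3))
```

Stacking many such conditional layers gives an expressive density while keeping each event's likelihood an exact, cheap evaluation rather than a high-dimensional integral.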

Cited by 12 publications (4 citation statements)
References 93 publications
“…In fact, CALOFLOW proved to be the first ever generative model in HEP that could pass the following stringent test: a binary classifier trained on the raw voxels of (CALOFLOW) generated vs. (GEANT4) reference showers could not distinguish the two with 100% accuracy. Normalizing Flows showed similar good performance in other generative tasks in high-energy physics [21][22][23][24][25][26][27][28][29][30][31][32][33][34].…”
Section: Introduction
confidence: 76%
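The "stringent test" quoted above is a classifier two-sample test: train a binary classifier to separate generated from reference samples, and interpret accuracy near 0.5 as the samples being indistinguishable. The sketch below is a simplified stand-in, using one-dimensional Gaussian toy data in place of shower voxels and a hand-rolled logistic regression in place of a deep classifier.

```python
import numpy as np

rng = np.random.default_rng(0)
ref = rng.normal(0.0, 1.0, size=(5000, 1))  # stand-in for reference (GEANT4-like) samples
gen = rng.normal(0.0, 1.0, size=(5000, 1))  # stand-in for generated samples, same distribution

X = np.vstack([ref, gen])
y = np.concatenate([np.zeros(5000), np.ones(5000)])  # 0 = reference, 1 = generated

# Logistic regression trained by plain gradient descent on the logistic loss.
w, b = np.zeros(1), 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    w -= 0.1 * (X.T @ (p - y)) / len(y)
    b -= 0.1 * np.mean(p - y)

p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
acc = np.mean((p > 0.5) == (y == 1))
print(f"classifier accuracy: {acc:.2f}")  # near 0.5: the classifier cannot tell them apart
```

With genuinely different distributions the same classifier pushes well above 0.5, which is what makes the test a sensitive diagnostic for generative-model quality.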
“…1 We can then sample over the weight distributions to produce a central value and an uncertainty distribution for the network output. In LHC physics, Bayesian networks can be applied to classification [3], regression [75,76], and generative networks [4,77,78]. While it is in general possible to separate these uncertainties into statistical and systematic (stochasticity [75] or model limitations [76]), we know that our number of training jets is sufficiently large to only leave us with systematic uncertainties from the training process.…”
Section: Bayesian ParticleNet
confidence: 99%
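The Bayesian-network procedure quoted above (sample over weight distributions, then report a central value and an uncertainty) can be illustrated in a few lines. The tiny one-layer "network" and its factorized-Gaussian weight posterior below are hypothetical stand-ins for a trained Bayesian classifier or regressor.

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed (learned) posterior over the two network weights: mean and std per weight.
w_mean = np.array([0.8, -0.3])
w_std = np.array([0.05, 0.05])

x = np.array([1.0, 2.0])  # one input event

def forward(w, x):
    # Toy one-layer network; a real Bayesian ParticleNet is far deeper.
    return np.tanh(w @ x)

# Sample weight sets from the posterior and propagate each through the network.
samples = np.array([forward(rng.normal(w_mean, w_std), x) for _ in range(1000)])
central, uncertainty = samples.mean(), samples.std()
print(f"prediction {central:.3f} +/- {uncertainty:.3f}")
```

The spread of `samples` is the training-related (systematic) uncertainty the quote refers to; with enough training data, the statistical component of the spread becomes negligible.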
“…These networks should be trained on first-principle simulations, easy to handle, efficient to ship, powerful in amplifying the training samples [49,50], and -most importantly -precise. Going beyond forward generation, conditional generative networks can also be applied to probabilistic unfolding [51][52][53][54][55][56], inference [57,58], or anomaly detection [59][60][61][62][63][64], reinforcing the precision requirements.…”
Section: Introduction
confidence: 99%