2023
DOI: 10.21468/scipostphys.14.6.159

$\nu$-Flows: Conditional neutrino regression

Abstract: We present $\nu$-Flows, a novel method for restricting the likelihood space of neutrino kinematics in high-energy collider experiments using conditional normalising flows and deep invertible neural networks. This method allows the recovery of the full neutrino momentum, which is usually left as a free parameter, and permits one to sample neutrino values under a learned conditional likelihood given event observations. We demonstrate the success of $\nu$-Flows in a case study by applying it to simulated semileptonic…
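The conditional-likelihood idea in the abstract can be illustrated with a minimal sketch: a single affine transform whose scale and shift are predicted from the event observables, trained by maximising log p(ν | observables) on simulation and sampled by pushing Gaussian noise through the learned map. This is a toy PyTorch illustration; the class and variable names (ConditionalAffineFlow, nu_dim, obs_dim) are assumptions for the example, and the paper's $\nu$-Flows model uses deeper invertible architectures rather than one affine layer.

```python
import math
import torch
import torch.nn as nn

class ConditionalAffineFlow(nn.Module):
    """Toy conditional flow: one affine transform whose scale/shift are
    predicted from the event observables (the conditioning context)."""
    def __init__(self, nu_dim: int, obs_dim: int, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(obs_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * nu_dim),  # per-component scale and shift
        )

    def forward(self, z, context):
        # Map latent z -> neutrino momenta x, conditioned on the event.
        s, t = self.net(context).chunk(2, dim=-1)
        x = z * torch.exp(s) + t
        return x, s.sum(dim=-1)  # x and log|det dx/dz|

    def log_prob(self, x, context):
        # Conditional log-likelihood log p(x | context), unit-Gaussian base.
        s, t = self.net(context).chunk(2, dim=-1)
        z = (x - t) * torch.exp(-s)
        base = -0.5 * (z ** 2 + math.log(2 * math.pi)).sum(dim=-1)
        return base - s.sum(dim=-1)

# Training minimises the negative conditional log-likelihood on simulation;
# sampling pushes Gaussian noise through forward() for each observed event.
flow = ConditionalAffineFlow(nu_dim=3, obs_dim=16)
obs = torch.randn(128, 16)        # toy per-event observables
nu_true = torch.randn(128, 3)     # toy true neutrino momenta
loss = -flow.log_prob(nu_true, obs).mean()
samples, _ = flow(torch.randn(128, 3), obs)
```

In the full method, stacks of invertible coupling layers replace the single affine transform, but the training objective and per-event sampling procedure follow the same pattern.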

Cited by 12 publications (3 citation statements) · References 61 publications
“…In fact, CALOFLOW proved to be the first ever generative model in HEP that could pass the following stringent test: a binary classifier trained on the raw voxels of (CALOFLOW) generated vs. (GEANT4) reference showers could not distinguish the two with 100% accuracy. Normalizing Flows showed similar good performance in other generative tasks in high-energy physics [21][22][23][24][25][26][27][28][29][30][31][32][33][34].…”
Section: Introduction
confidence: 76%
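The "stringent test" quoted above is a classifier two-sample test: train a binary classifier to separate generated from reference showers and check that its ROC AUC stays near 0.5. The sketch below is an assumed illustration using scikit-learn and random stand-in arrays, not the cited papers' code or data.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

generated = np.random.normal(size=(5000, 10))  # stand-in for generated showers
reference = np.random.normal(size=(5000, 10))  # stand-in for GEANT4 reference

X = np.vstack([generated, reference])
y = np.concatenate([np.zeros(len(generated)), np.ones(len(reference))])
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0)

clf = GradientBoostingClassifier().fit(X_tr, y_tr)
auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
print(f"AUC = {auc:.3f}  (values near 0.5 mean the samples are indistinguishable)")
```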
“…This rough online training is successful because normalizing flows, or invertible networks (INNs) [19,20], are especially well-suited, stable, and precise in LHC physics applications [21]. This has been shown in many instances, including event generation [22][23][24][25], detector simulations [26][27][28][29], unfolding or inverse simulations [20,30], kinematic reconstruction [31], Bayesian inference [32,33], or inference using the matrix element method [34]. On the other hand, for expensive integrands online training is clearly not optimal, because it does not make use of all previously generated data at subsequent stages of the network training.…”
Section: Introduction
confidence: 99%
“…Once we control the forward direction with NN-based event generators [51][52][53][54][55][56], conditional GANs and INNs also allow us to invert the simulation chain, to unfold detector effects [57][58][59] or to extract the hard scattering process in a statistically consistent manner [60,61]. The fully calibrated inverted simulation uses the same conditional INN (cINN) as simulation-based inference [62,63] or kinematic reconstruction [64]. Obviously, any application of (generative) networks to LHC physics requires an uncertainty treatment [56,65].…”
Section: Introduction
confidence: 99%