Simulating nature, and in particular processes in particle physics, requires expensive computations that can take much longer than scientists can afford. Here we explore ways to address this problem by investigating recent advances in generative modeling, and we present a study of the generation of events from a physical process with deep generative models. The simulation of physical processes requires not only the production of physical events, but also ensuring that these events occur with the correct frequencies. We investigate the feasibility of learning both the event generation and the frequency of occurrence with Generative Adversarial Networks (GANs) and Variational Autoencoders (VAEs), so as to produce events like Monte Carlo generators. We study three processes: a simple two-body decay, the process $e^+e^- \to Z \to l^+l^-$, and $pp \to t\bar{t}$ including the decay of the top quarks and a simulation of the detector response. We find that the tested GAN architectures and the standard VAE are not able to learn the distributions precisely. By buffering the density information of Monte Carlo events encoded by the VAE encoder, we are able to construct a prior for sampling new events from the decoder that yields distributions in very good agreement with real Monte Carlo events, generated several orders of magnitude faster. Applications of this work include generic density estimation and sampling, targeted event generation via a principal component analysis of encoded ground-truth data, anomaly detection, and more efficient importance sampling, e.g., for the phase-space integration of matrix elements in quantum field theories.
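The buffered-prior construction described in the abstract can be sketched with a toy linear "VAE". This is a minimal illustration under stated assumptions: the paper uses trained neural networks as encoder and decoder, whereas here a linear map and its pseudo-inverse stand in for them, and all names, shapes, and the smearing bandwidth are illustrative choices. The idea shown is the same: encode Monte Carlo events, buffer the latent codes, and sample new events by decoding smeared draws from that buffer instead of from a fixed standard-normal prior.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for a trained VAE (illustrative assumption): a linear
# decoder and its pseudo-inverse as encoder.
W = rng.normal(size=(4, 2))                 # decoder weights: latent -> event

def decode(z):
    return z @ W.T                          # (n, 2) latents -> (n, 4) events

def encode(x):
    return x @ np.linalg.pinv(W).T          # (n, 4) events -> (n, 2) latents

# 1) Toy "Monte Carlo events" whose latent density is NOT a standard
#    normal -- the case where sampling a fixed N(0, 1) prior mismodels
#    the event distribution.
z_true = rng.normal(size=(5000, 2)) * np.array([1.0, 0.3])
mc_events = decode(z_true)

# 2) Encode the Monte Carlo events and buffer the latent codes.
z_buffer = encode(mc_events)

# 3) Buffered prior: resample stored codes with a small Gaussian
#    smearing, i.e. draw from a kernel density estimate of the encoded
#    latent density.
def sample_buffered_prior(buffer, n, bandwidth=0.05):
    idx = rng.integers(len(buffer), size=n)
    return buffer[idx] + bandwidth * rng.normal(size=(n, buffer.shape[1]))

# 4) Decode draws from the buffered prior into new events; their
#    marginal distributions track the Monte Carlo events.
new_events = decode(sample_buffered_prior(z_buffer, 10_000))
```

Sampling here is a cheap buffer lookup plus a decoder pass, which is where the quoted speed-up over a full Monte Carlo generator comes from; the smearing bandwidth trades fidelity to the buffered density against coverage between stored codes.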
In the scenarios reviewed, GHG emission reduction targets drive electrification:
• If targets are met by 2050, 40–100% of heat and car transport will be electrified.
• The electricity consumption of heat and transport will increase to 400–800 TWh.
One of the most promising strategies to identify the nature of dark matter consists in the search for new particles at accelerators and with so-called direct detection experiments. Working within the framework of simplified models, and making use of machine learning tools to speed up statistical inference, we address the question of what we can learn about dark matter from a detection at the LHC and a forthcoming direct detection experiment. We show that with a combination of accelerator and direct detection data, it is possible to identify newly discovered particles as dark matter by reconstructing their relic density, assuming they are weakly interacting massive particles (WIMPs) thermally produced in the early Universe, and demonstrating that it is consistent with the measured dark matter abundance. An inconsistency between these two quantities would instead point either towards additional physics in the dark sector, or towards a non-standard cosmology, with a thermal history substantially different from that of the standard cosmological model. Different definitions of WIMPs are used in the literature. To avoid confusion, here we use the term WIMP to refer to any particle in the O(1) GeV to O(100) TeV mass range that interacts with Standard Model particles with a strength similar to the weak interaction, such that its abundance is obtained through the thermal freeze-out mechanism. Our definition is sometimes referred to as a 'hidden-sector WIMP' [14].