Particles beyond the Standard Model (SM) can generically have lifetimes that are long compared to SM particles at the weak scale. When produced at experiments such as the Large Hadron Collider (LHC) at CERN, these long-lived particles (LLPs) can decay far from the interaction vertex of the primary proton–proton collision. Such LLP signatures are distinct from those of promptly decaying particles that are targeted by the majority of searches for new physics at the LHC, often requiring customized techniques to identify, for example, significantly displaced decay vertices, tracks with atypical properties, and short track segments. Given their non-standard nature, a comprehensive overview of LLP signatures at the LHC is beneficial to ensure that possible avenues for the discovery of new physics are not overlooked. Here we report on the joint work of a community of theorists and experimentalists with the ATLAS, CMS, and LHCb experiments—as well as those working on dedicated experiments such as MoEDAL, milliQan, MATHUSLA, CODEX-b, and FASER—to survey the current state of LLP searches at the LHC, and to chart a path for the development of LLP searches into the future, both in the upcoming Run 3 and at the high-luminosity LHC. The work is organized around the current and future potential of LHC experiments to discover new LLPs, and takes a signature-based approach to surveying classes of models that give rise to LLPs rather than emphasizing any particular theory motivation. We develop a set of simplified models; assess the coverage of current searches; document known, often unexpected backgrounds; explore the capabilities of proposed detector upgrades; provide recommendations for the presentation of search results; and look towards the newest frontiers, namely high-multiplicity ‘dark showers’, highlighting opportunities for expanding the LHC reach for these signals.
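The central kinematic fact behind LLP searches—that the decay position depends exponentially on the proper lifetime and the boost—can be illustrated with a toy acceptance estimate. This sketch is not from the report itself; the geometry and numbers are illustrative placeholders:

```python
import numpy as np

def decay_in_volume(ctau_m, betagamma, r_in_m, r_out_m):
    """Toy probability that an LLP with proper decay length ctau (metres)
    and lab-frame boost betagamma decays inside a detector shell between
    radii r_in and r_out (exponential decay law, straight-line flight)."""
    lab_decay_length = betagamma * ctau_m
    return np.exp(-r_in_m / lab_decay_length) - np.exp(-r_out_m / lab_decay_length)

# e.g. ctau = 1 m, betagamma = 3: fraction decaying between 0.3 m and 1 m
# (roughly an inner-tracker fiducial region; numbers are illustrative)
frac = decay_in_volume(1.0, 3.0, 0.3, 1.0)
```

The same expression explains why distant dedicated detectors such as MATHUSLA or FASER gain sensitivity for very long lifetimes: for large `ctau_m` the exponentials flatten and the decay probability scales with the volume's distance and depth.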
Global fits of primary and secondary cosmic-ray (CR) fluxes measured by AMS-02 have great potential to study CR propagation models and search for exotic sources of antimatter such as annihilating dark matter (DM). Previous studies of AMS-02 antiprotons revealed a possible hint for a DM signal which, however, could be affected by systematic uncertainties. To test the robustness of such a DM signal, in this work we systematically study two important sources of uncertainties: the antiproton production cross sections needed to calculate the source spectra of secondary antiprotons, and the potential correlations in the experimental data, which have so far not been provided by the AMS-02 Collaboration. To investigate the impact of cross-section uncertainties we perform global fits of CR spectra including a covariance matrix determined from nuclear cross-section measurements. As an alternative approach, we perform a joint fit to both the CR and cross-section data. The two methods agree and show that cross-section uncertainties have a small effect on the CR fits and on the significance of a potential DM signal, which we find to be at the level of 3σ. Correlations in the data can have a much larger impact. To illustrate this effect, we determine possible benchmark models for the correlations using a data-driven method. The inclusion of correlations strongly improves the constraints on the propagation model and, furthermore, enhances the significance of the DM signal to above 5σ. Our analysis demonstrates the importance of providing the covariance of the experimental data, which is needed to fully exploit their potential.
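The covariance-matrix treatment described above can be sketched in a few lines. This is a minimal illustration of why correlations matter for a fit, not the analysis pipeline itself; the exponential correlation model, bin count, and error sizes are toy assumptions of ours:

```python
import numpy as np

def chi2_with_covariance(data, model, cov):
    """chi^2 = (d - m)^T C^{-1} (d - m) for correlated experimental errors."""
    r = data - model
    return float(r @ np.linalg.solve(cov, r))

def correlation_benchmark(sigma, corr_length, n):
    """Toy data-driven benchmark: correlations decaying exponentially with
    bin separation (an assumed form, not the paper's actual benchmarks)."""
    i = np.arange(n)
    corr = np.exp(-np.abs(i[:, None] - i[None, :]) / corr_length)
    return np.outer(sigma, sigma) * corr

# Illustration: the same residuals under uncorrelated vs correlated errors
n = 10
sigma = np.full(n, 0.05)           # 5% uncertainty per bin
data = np.full(n, 1.02)            # a 2% coherent offset from the model
model = np.ones(n)

chi2_u = chi2_with_covariance(data, model, np.diag(sigma**2))
chi2_c = chi2_with_covariance(data, model,
                              correlation_benchmark(sigma, corr_length=3.0, n=n))
# A coherent offset is absorbed by positively correlated errors, so
# chi2_c < chi2_u: the assumed correlation structure changes which spectral
# features the fit treats as significant.
```

This is the mechanism by which a published covariance would sharpen both the propagation constraints and the DM-signal significance: with correlations included, smooth offsets are penalized less and localized spectral features more.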
Chemical equilibrium is a commonly made assumption in the freeze-out calculation of coannihilating dark matter. We explore the possible failure of this assumption and find a new conversion-driven freeze-out mechanism. Considering a representative simplified model inspired by supersymmetry, with a neutralino-like and a sbottom-like particle, we find regions in parameter space with very small couplings accommodating the measured relic density. In this region freeze-out takes place out of chemical equilibrium and dark matter self-annihilation is highly inefficient. The relic density is governed primarily by the size of the conversion terms in the Boltzmann equations. Due to the small dark matter coupling, the parameter region is immune to direct detection but predicts an interesting signature of disappearing tracks or displaced vertices at the LHC. Unlike freeze-in or superWIMP scenarios, conversion-driven freeze-out is not sensitive to the initial conditions at the end of reheating.
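The role of the conversion terms can be illustrated with a deliberately crude two-species toy integration. This is our own sketch, not the paper's Boltzmann system: both species are given the same toy equilibrium yield, rates are dimensionless, and a simple Euler step replaces a proper stiff solver:

```python
import numpy as np

def yeq(x):
    # Non-relativistic equilibrium yield, ~ x^{3/2} e^{-x} (normalization arbitrary)
    return x**1.5 * np.exp(-x)

def solve(conv_rate, ann_rate, x_end=100.0, n=20000):
    """Toy yields (y1: DM with no self-annihilation; y2: coannihilator with
    efficient annihilation), coupled only by a conversion term."""
    xs = np.linspace(1.0, x_end, n)
    dx = xs[1] - xs[0]
    y1 = y2 = yeq(xs[0])
    for x in xs[:-1]:
        eq = yeq(x)
        # Conversion drives y1 towards y2 (identical toy equilibria);
        # annihilation drives y2 towards its equilibrium value.
        conv = conv_rate / x * (y1 - y2)
        ann = ann_rate / x * (y2**2 - eq**2)
        y1 += dx * (-conv)
        y2 += dx * (conv - ann)
        y1, y2 = max(y1, 0.0), max(y2, 0.0)
    return y1, y2

y1_weak, _ = solve(conv_rate=0.1, ann_rate=100.0)    # conversion too slow
y1_strong, _ = solve(conv_rate=10.0, ann_rate=100.0) # chemical equilibrium holds
```

With a weak conversion rate, `y1` decouples from the efficiently annihilating partner early and freezes out at a much larger value; in this regime the relic abundance is controlled by `conv_rate`, which is the qualitative statement of conversion-driven freeze-out.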
We present MadDM v.3.0, a numerical tool to compute particle dark matter observables in generic new physics models. The new version features a comprehensive and automated framework for dark matter searches at the interface of collider physics, astrophysics and cosmology, and is deployed as a plugin of the MadGraph5_aMC@NLO platform, inheriting most of its features. With respect to the previous version, MadDM v.3.0 can now provide predictions for indirect dark matter signatures in astrophysical environments, such as the annihilation cross section at the present time and the energy spectra of prompt photons, cosmic rays and neutrinos resulting from dark matter annihilation. The indirect-detection features of MadDM support both 2 → 2 and 2 → n dark matter annihilation processes. In addition, the ability to compare theoretical predictions with experimental constraints is extended by including the Fermi-LAT likelihood for gamma-ray constraints from dwarf spheroidal galaxies and by providing an interface with the nested sampling algorithm PyMultiNest to perform high-dimensional parameter scans efficiently. We validate the code for a wide set of dark matter models by comparing the results from MadDM v.3.0 to existing tools and results in the literature.
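The indirect-detection observables listed above are tied together by the standard expression for the prompt gamma-ray flux from annihilating dark matter. The sketch below shows only that textbook formula; it is not MadDM code, and the spectrum `dnde_toy` and the J-factor value are placeholder inputs of ours:

```python
import numpy as np

def photon_flux(e_gev, sigmav, m_dm_gev, jfactor, dnde):
    """Differential prompt-photon flux for self-conjugate DM:
    dPhi/dE = <sigma v> / (8 pi m_DM^2) * dN/dE * J.
    Units follow the inputs (e.g. sigmav in cm^3/s, J in GeV^2/cm^5)."""
    return sigmav / (8.0 * np.pi * m_dm_gev**2) * dnde(e_gev) * jfactor

def dnde_toy(e, m_dm=100.0):
    """Placeholder photon spectrum per annihilation: a soft power law with
    a cutoff at the DM mass (illustrative only)."""
    return np.where(e < m_dm, 10.0 / m_dm * (e / m_dm) ** -1.5 * np.exp(-e / m_dm), 0.0)

# Thermal-relic-scale cross section and a dwarf-galaxy-scale J-factor
flux = photon_flux(10.0, 3e-26, 100.0, 1e19, dnde_toy)
```

A tool such as MadDM computes the model-specific ingredients of this expression—the present-day annihilation cross section and the spectra `dN/dE`—which are then convolved with likelihoods such as the Fermi-LAT dwarf-spheroidal one.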
Mixed-effects multilevel models are often used to investigate cross-level interactions, a specific type of context effect that may be understood as an upper-level variable moderating the association between a lower-level predictor and the outcome. We argue that multilevel models involving cross-level interactions should always include random slopes on the lower-level components of those interactions. Failure to do so will usually result in severely anti-conservative statistical inference. We illustrate the problem with extensive Monte Carlo simulations and examine its practical relevance by studying 30 prototypical cross-level interactions with European Social Survey data for 28 countries. In these empirical applications, introducing a random slope term reduces the absolute t-ratio of the cross-level interaction term by 31 per cent or more in three quarters of cases, with an average reduction of 42 per cent. Many practitioners seem to be unaware of these issues. Roughly half of the cross-level interaction estimates published in the European Sociological Review between 2011 and 2016 are based on models that omit the crucial random slope term. Detailed analysis of the associated test statistics suggests that many of the estimates would not reach conventional thresholds for statistical significance in correctly specified models that include the random slope. This raises the question of how much robust evidence of cross-level interactions sociology has actually produced over the past decades.
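The anti-conservatism described above can be demonstrated with a small Monte Carlo in plain NumPy (rather than a multilevel-model package). All parameters are illustrative choices of ours, not the paper's simulation design: group slopes vary around their mean, a pooled OLS fit ignores that variation, and the model-based standard error understates the true sampling variability:

```python
import numpy as np

rng = np.random.default_rng(0)

def one_fit(J=30, n=50, tau=0.3, sigma=1.0):
    """Simulate J groups of n units whose slopes vary as b_j ~ N(1, tau^2),
    fit one pooled slope by OLS (no random slope), and return the
    estimate together with its conventional model-based standard error."""
    slopes = 1.0 + tau * rng.standard_normal(J)
    x = rng.standard_normal((J, n))
    y = slopes[:, None] * x + sigma * rng.standard_normal((J, n))
    xf, yf = x.ravel(), y.ravel()
    xc = xf - xf.mean()
    b_hat = (xc @ yf) / (xc @ xc)
    resid = yf - yf.mean() - b_hat * xc
    se = np.sqrt(resid @ resid / (len(yf) - 2) / (xc @ xc))
    return b_hat, se

ests, ses = zip(*(one_fit() for _ in range(300)))
empirical_sd = np.std(ests)   # true sampling variability of the estimate
nominal_se = np.mean(ses)     # what the misspecified model reports
# With tau > 0, empirical_sd clearly exceeds nominal_se, so t-ratios built
# from nominal_se are inflated -- anti-conservative inference.
```

The same mechanism operates for a cross-level interaction term: omitting the random slope attributes the between-group slope variation to sampling noise at the wrong level, shrinking the reported standard error.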