Given the vast amounts of data generated by modern particle detectors, computational efficiency is essential for many data-analysis jobs in high-energy physics. We develop a new class of physically interpretable boost-invariant polynomial features (BIPs) for jet tagging that achieves such efficiency. We show that, for both supervised and unsupervised tasks, integrating BIPs with conventional techniques leads to models achieving high accuracy on jet tagging benchmarks while being orders of magnitude faster to train and evaluate than contemporary deep learning systems.
Deep Learning approaches are becoming the go-to methods for data analysis in High Energy Physics (HEP). Nonetheless, most modern physics-inspired architectures are computationally inefficient and lack interpretability. This is especially the case for jet tagging algorithms, where computational efficiency is crucial given the large amounts of data produced by modern particle detectors. In this work, we present a novel, versatile and transparent framework for jet representation, invariant under boosts of the Lorentz group, which achieves high accuracy on jet tagging benchmarks while being orders of magnitude faster to train and evaluate than other modern approaches, in both supervised and unsupervised settings.
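To make the claimed workflow concrete, the following is a minimal sketch (not the authors' implementation) of the general pipeline the abstract describes: each jet is first summarized by a fixed-length vector of precomputed invariant features, and a lightweight conventional classifier is then trained on those vectors. The random stand-in features, the dataset dimensions, and the choice of a gradient-boosted classifier from scikit-learn are all illustrative assumptions; in the actual framework the feature vectors would be the BIP features computed from the jet constituents.

    # Sketch: fixed precomputed jet features + a fast conventional classifier.
    # The features here are random placeholders standing in for BIP features.
    import numpy as np
    from sklearn.ensemble import HistGradientBoostingClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)

    # Hypothetical dataset: n_jets jets, each already reduced to n_features
    # invariant features (stand-in for the BIP feature map).
    n_jets, n_features = 10_000, 64
    X = rng.normal(size=(n_jets, n_features))
    y = rng.integers(0, 2, size=n_jets)  # signal-vs-background labels

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=0
    )

    # A conventional learner on top of the fixed features trains in seconds on
    # a CPU, in contrast to end-to-end deep networks on raw constituents.
    clf = HistGradientBoostingClassifier(max_iter=200)
    clf.fit(X_train, y_train)

    scores = clf.predict_proba(X_test)[:, 1]
    print("AUC:", roc_auc_score(y_test, scores))

Because the placeholder features carry no physics information, the printed AUC is near 0.5; the sketch only illustrates the two-stage structure (fixed feature map, then a cheap conventional model) rather than the reported performance.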