2016
DOI: 10.1088/1748-0221/11/07/p07006

Application of Bayesian neural networks to energy reconstruction in EAS experiments for ground-based TeV astrophysics

Abstract: A toy detector array is designed to detect showers generated by the interaction between TeV cosmic rays and the atmosphere. In the present paper, the primary energies of showers detected by the array are reconstructed with Bayesian neural networks (BNNs) and with a standard method like that used in the LHAASO experiment [1], respectively. Compared to the standard method, the energy resolutions are significantly improved by the BNNs, and the improvement is more pronounced for the high-energy showers t…
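The abstract names the BNN energy estimator without spelling out the regression setup. As a purely illustrative stand-in (not the paper's detector simulation or its network), the sketch below uses Bayesian ridge regression on two invented shower observables to show the same basic pattern: observables in, a posterior over log-energy out, with an uncertainty attached to every reconstructed energy.

```python
# Hypothetical sketch: Bayesian ridge regression of log10(primary energy)
# from two toy shower observables. This is NOT the paper's BNN or detector
# simulation; the observables, ranges, and priors are invented.
import numpy as np

rng = np.random.default_rng(0)

# --- toy "shower" data: observables loosely tied to a true log-energy ---
n = 2000
log_E = rng.uniform(0.0, 2.0, n)                        # log10(E / TeV), toy range
n_hits = 10 ** (log_E + rng.normal(0, 0.10, n))         # hit multiplicity (toy)
density = 10 ** (0.8 * log_E + rng.normal(0, 0.15, n))  # particle density (toy)

X = np.column_stack([np.log10(n_hits), np.log10(density), np.ones(n)])
y = log_E

# --- analytic Gaussian posterior over the weights (conjugate linear model) ---
alpha, beta = 1.0, 25.0                      # prior precision, noise precision
S = np.linalg.inv(alpha * np.eye(X.shape[1]) + beta * X.T @ X)  # posterior covariance
m = beta * S @ X.T @ y                       # posterior mean of the weights

# --- predictive mean and uncertainty for a new shower ---
x_new = np.array([np.log10(300.0), np.log10(50.0), 1.0])
pred_mean = x_new @ m
pred_var = 1.0 / beta + x_new @ S @ x_new    # noise term + weight-uncertainty term
print(f"log10(E) = {pred_mean:.2f} +/- {np.sqrt(pred_var):.2f}")
```

A full BNN replaces the linear model with a neural network and the analytic posterior with sampling, but the interface is the same: shower observables go in, and a predictive distribution over log10(E) comes out.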

Cited by 3 publications (1 citation statement). References: 18 publications.

“…The simple idea of regressing missing kinematic information can be accomplished by a variety of architecture choices. For example, Bayesian neural networks (Mackay 1995; Bhat & Prosper 2005; Neal 2012; Gal & Ghahramani 2016; Bai et al. 2016; Blei et al. 2017; Bollweg et al. 2020; Wagner-Carena et al. 2020; Charnock et al. 2020) may provide an alternative method for incorporating uncertainties on the network output. In this case, the network would be rerun many times over the same inputs, while the weights float within some prior distribution.…”
Section: Discussion (confidence: 99%)
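The quoted passage describes the standard way a Bayesian network turns weight uncertainty into output uncertainty: rerun the same input many times while the weights are drawn from their distribution. The sketch below is a minimal, hypothetical version of that loop; the network size, weight statistics, and input values are invented for illustration and are not taken from the cited works.

```python
# Hypothetical sketch of the "rerun many times while the weights float" idea:
# draw the weights of a small network repeatedly from an assumed Gaussian
# distribution around trained values and take the spread of outputs as the
# uncertainty on the regressed quantity.
import numpy as np

rng = np.random.default_rng(1)

def mlp_forward(x, W1, b1, W2, b2):
    """One hidden layer with tanh activation; scalar regression output."""
    h = np.tanh(x @ W1 + b1)
    return h @ W2 + b2

# Pretend these are the trained (mean) weights of a 3-input, 16-unit network.
d_in, d_hidden = 3, 16
W1_mean = rng.normal(0, 0.5, (d_in, d_hidden))
b1_mean = np.zeros(d_hidden)
W2_mean = rng.normal(0, 0.5, (d_hidden, 1))
b2_mean = np.zeros(1)
weight_std = 0.05                 # assumed width of the weight distribution

x = np.array([2.3, 1.7, 0.4])     # one event's input features (toy values)

# Rerun the same input many times with perturbed weights.
samples = []
for _ in range(1000):
    W1 = W1_mean + rng.normal(0, weight_std, W1_mean.shape)
    b1 = b1_mean + rng.normal(0, weight_std, b1_mean.shape)
    W2 = W2_mean + rng.normal(0, weight_std, W2_mean.shape)
    b2 = b2_mean + rng.normal(0, weight_std, b2_mean.shape)
    samples.append(mlp_forward(x, W1, b1, W2, b2).item())

samples = np.array(samples)
print(f"prediction: {samples.mean():.3f} +/- {samples.std():.3f}")
```

With a trained BNN the draws would come from the learned posterior (or a variational or dropout approximation of it) rather than a fixed Gaussian around point weights, but the prediction loop over the same input is exactly as quoted.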