Neutrino event reconstruction has always been crucial for
the IceCube Neutrino Observatory. In the Kaggle competition “IceCube
— Neutrinos in Deep Ice”, many solutions use Transformers. We
present ISeeCube, a pure Transformer model based on
TorchScale (the backbone of BEiT-3). With a roughly
comparable number of total trainable parameters, our model outperforms the
2nd-place solution. By using TorchScale, the
lines of code drop by about 80%, and many new methods
can be tested simply by adjusting configs. We compare two
fundamental models for prediction on a continuous space, regression
and classification, trained with MSE loss and CE loss,
respectively. We also propose a new metric, overlap ratio, to
evaluate the performance of the model. Since the model is
simple, it can potentially be used for other tasks such as
energy reconstruction, and new approaches such as combining it
with GraphNeT can be tested more easily. The code and
pretrained models are available at
https://github.com/ChenLi2049/ISeeCube.
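
As a minimal illustration of the two prediction schemes compared above (not the authors' implementation — the embedding size, bin count, and head definitions here are assumptions), regression and classification can be sketched as two alternative heads on a Transformer's pooled embedding, trained with MSE loss and CE loss respectively:

```python
import torch
import torch.nn as nn

EMB = 128      # assumed embedding size of the Transformer encoder
N_BINS = 64    # assumed number of direction bins for the classification head

class RegressionHead(nn.Module):
    """Predicts a 3D unit direction vector; trained with MSE loss."""
    def __init__(self, emb=EMB):
        super().__init__()
        self.fc = nn.Linear(emb, 3)

    def forward(self, x):
        v = self.fc(x)
        return v / v.norm(dim=-1, keepdim=True)  # normalize to a unit vector

class ClassificationHead(nn.Module):
    """Predicts logits over discretized direction bins; trained with CE loss."""
    def __init__(self, emb=EMB, n_bins=N_BINS):
        super().__init__()
        self.fc = nn.Linear(emb, n_bins)

    def forward(self, x):
        return self.fc(x)  # raw logits, one per bin

# Toy batch of pooled event embeddings and targets
emb = torch.randn(8, EMB)
target_vec = torch.nn.functional.normalize(torch.randn(8, 3), dim=-1)
target_bin = torch.randint(0, N_BINS, (8,))

mse = nn.MSELoss()(RegressionHead()(emb), target_vec)
ce = nn.CrossEntropyLoss()(ClassificationHead()(emb), target_bin)
```

The trade-off sketched here is that regression predicts directly in the continuous space, while classification discretizes it into bins and recovers a direction from the predicted bin.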