2024
DOI: 10.3390/axioms13030160

A Comparison between Invariant and Equivariant Classical and Quantum Graph Neural Networks

Roy T. Forestano,
Marçal Comajoan Cara,
Gopal Ramesh Dahale
et al.

Abstract: Machine learning algorithms are heavily relied on to understand the vast amounts of data from high-energy particle collisions at the CERN Large Hadron Collider (LHC). The data from such collision events can naturally be represented with graph structures. Therefore, deep geometric methods, such as graph neural networks (GNNs), have been leveraged for various data analysis tasks in high-energy physics. One typical task is jet tagging, where jets are viewed as point clouds with distinct features and edge connecti…

Cited by 3 publications (3 citation statements)
References 37 publications
“…The recent field of GQML has paved the way for studying how adding symmetry information into quantum learning models changes their performance in terms of expressiveness, trainability, and generalization. Recent results have shown that GQML can indeed provide a heuristic advantage for several machine learning tasks [42][43][44][45][46][47][48][49] over their symmetry-agnostic counterparts. In particular, it was shown that the special case of S_n-equivariant QNNs exhibits the holy grail of desirable properties [12]: absence of barren plateaus, generalization from few training points, and the capacity to be efficiently overparametrized.…”
Section: Discussion (mentioning)
confidence: 99%
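
To illustrate what the S_n-equivariance discussed above means in practice, here is a minimal NumPy sketch (an illustrative assumption, not code from the cited works) of a permutation-equivariant layer of the kind used in equivariant GNNs: each node is updated from its own features plus a permutation-invariant aggregate, so permuting the input nodes permutes the output rows in exactly the same way.

import numpy as np

def equivariant_layer(X, W_self, W_agg):
    # Update each node from its own features plus the mean over all nodes;
    # the mean is invariant under row permutations, so the layer is equivariant.
    agg = X.mean(axis=0, keepdims=True)
    return np.tanh(X @ W_self + agg @ W_agg)

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 4))          # 5 nodes (e.g., jet constituents), 4 features each
W_self = rng.normal(size=(4, 3))
W_agg = rng.normal(size=(4, 3))

perm = rng.permutation(5)
# Permuting then applying the layer equals applying the layer then permuting.
print(np.allclose(equivariant_layer(X[perm], W_self, W_agg),
                  equivariant_layer(X, W_self, W_agg)[perm]))   # True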
“…The main idea behind quantum machine learning (QML) is to use models that are partially or fully executed on a quantum computer, replacing some subroutines of the models with quantum circuits in order to exploit the unique properties of quantum mechanics and enhance the capabilities of classical machine learning algorithms. Some notable examples are quantum support vector machines [34], quantum nearest-neighbor algorithms [35], quantum nearest centroid classifiers [36], and quantum artificial neural networks [6,10], including quantum graph neural networks [11]. In the last case, some layers are typically executed on a quantum circuit whose rotation angles are free parameters of the whole model.…”
Section: Quantum Computing and Quantum Machine Learning (mentioning)
confidence: 99%
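
To make concrete the idea of a quantum layer whose rotation angles are free parameters, the following is a minimal, self-contained NumPy sketch (an illustrative single-qubit toy, not code from the cited paper): a classical feature is encoded as a rotation angle, a trainable rotation with angle theta is applied, and the expectation value of Pauli-Z is returned as the layer's output, which a classical optimizer would tune through theta.

import numpy as np

def ry(angle):
    # Single-qubit rotation about the Y axis.
    c, s = np.cos(angle / 2), np.sin(angle / 2)
    return np.array([[c, -s], [s, c]])

def quantum_layer(x, theta):
    # Encode the feature x as a rotation, apply the trainable rotation theta,
    # and measure <Z>; theta plays the role of a free parameter of the hybrid model.
    state = ry(theta) @ ry(x) @ np.array([1.0, 0.0])   # start from |0>
    Z = np.diag([1.0, -1.0])
    return float(state @ Z @ state)                    # expectation value in [-1, 1]

# The output depends smoothly on theta (here it equals cos(x + theta)),
# so a gradient-based optimizer in the enclosing classical model can adjust it.
print(quantum_layer(x=0.3, theta=0.7))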
“…The imminent operation of the High Luminosity Large Hadron Collider (HL-LHC) [1] by the end of this decade signals an era of unprecedented data generation, necessitating vast computing resources and advanced computational strategies to effectively manage and analyze the resulting datasets [2]. A promising approach to dealing with this huge amount of data could be the application of quantum machine learning (QML), which could reduce the time complexity of classical algorithms by running on quantum computers and obtain better accuracies thanks to access to an exponentially large Hilbert space [3][4][5][6][7][8][9][10][11].…”
Section: Introduction (mentioning)
confidence: 99%