2022
DOI: 10.48550/arxiv.2201.09332
Preprint

How Expressive are Transformers in Spectral Domain for Graphs?

Cited by 2 publications (2 citation statements)
References 0 publications
“…The Graph Transformer is inspired by the Transformer architecture, which has shown remarkable performance in natural language processing [65,13,41]. The Graph Transformer extends the Transformer architecture to the graph domain, allowing the model to capture the global structure and long-range dependencies of the graph [86,15,31,29,74,48,42,80,8,43,11,5,23,88].…”
Section: B.3 Detailed Related Work (mentioning)
confidence: 99%
“…Some other works introduce structure information into attention by graph distance, path embedding or feature encoded by GNN (Park et al, 2022;Maziarka et al, 2020;Ying et al, 2021;Mialon et al, 2021;Choromanski et al, 2022). Other works use transformer as a module of the whole model (Bastos et al, 2022;Guo et al, 2022).…”
Section: Related Work (mentioning)
confidence: 99%
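The second citation statement describes injecting structure into attention via graph distance (as in Ying et al., 2021). A minimal sketch of that idea follows; the function names, single-head setup, and the scalar-per-distance bias table are illustrative assumptions, not any cited paper's exact formulation:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def graph_attention(X, dist, Wq, Wk, Wv, dist_bias):
    """Single-head self-attention with a graph-distance bias added to
    the attention logits (in the spirit of spatial encoding).
    X: (n, d) node features; dist: (n, n) integer shortest-path
    distances; dist_bias: learnable scalar per distance value
    (hypothetical parameterization)."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d = Q.shape[-1]
    # dist_bias[dist] broadcasts one bias scalar per node pair.
    logits = Q @ K.T / np.sqrt(d) + dist_bias[dist]
    return softmax(logits) @ V

# Demo on a 4-node path graph 0-1-2-3.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
dist = np.array([[0, 1, 2, 3],
                 [1, 0, 1, 2],
                 [2, 1, 0, 1],
                 [3, 2, 1, 0]])
W = np.eye(8)
bias = np.array([0.0, -1.0, -2.0, -3.0])  # penalize distant pairs
out = graph_attention(X, dist, W, W, W, bias)
```

Setting the bias more negative for larger distances makes attention concentrate on nearby nodes while still allowing every node to attend to every other, which is the long-range-dependency property the quoted passage attributes to Graph Transformers.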