ICASSP 2021 - 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
DOI: 10.1109/icassp39728.2021.9414557
Geometric Scattering Attention Networks

Abstract: Geometric scattering has recently gained recognition in graph representation learning, and recent work has shown that integrating scattering features in graph convolution networks (GCNs) can alleviate the typical oversmoothing of features in node representation learning. However, scattering often relies on handcrafted design, requiring careful selection of frequency bands via a cascade of wavelet transforms, as well as an effective weight-sharing scheme to combine low- and band-pass information. Here, we introd…
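To make the abstract's terms concrete, below is a minimal sketch of first-order geometric scattering on a graph, assuming the standard diffusion-wavelet construction Psi_j = P^(2^(j-1)) - P^(2^j) with the lazy random-walk operator P = (I + A D^-1)/2; all function names and the toy graph are illustrative, not from the paper.

```python
import numpy as np

def lazy_walk(A):
    """Lazy random-walk diffusion operator P = 0.5 * (I + A D^-1) (column-stochastic)."""
    d = A.sum(axis=0)                      # node degrees (column sums)
    return 0.5 * (np.eye(A.shape[0]) + A / d)

def scattering_features(A, x, J=3):
    """First-order scattering coefficients |Psi_j x| for scales j = 1..J,
    plus the low-pass output P^(2^J) x."""
    P = lazy_walk(A)
    feats, Pk = [], P                      # Pk holds P^(2^(j-1)), starting at P^(2^0)
    for j in range(1, J + 1):
        Pnext = Pk @ Pk                    # P^(2^j) via repeated squaring
        feats.append(np.abs(Pk @ x - Pnext @ x))  # band-pass magnitude |Psi_j x|
        Pk = Pnext
    feats.append(Pk @ x)                   # low-pass P^(2^J) x
    return np.stack(feats)                 # shape (J + 1, n_nodes)

# Toy 4-node path graph with a delta signal on the first node
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], float)
x = np.array([1.0, 0.0, 0.0, 0.0])
F = scattering_features(A, x, J=2)
print(F.shape)  # (3, 4): two band-pass channels plus one low-pass channel
```

The dyadic scales 2^j are exactly the "frequency bands via a cascade of wavelet transforms" the abstract says must be hand-selected; the paper's contribution (per the title) is to weight these channels with attention rather than a fixed scheme.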

Cited by 11 publications (10 citation statements) · References 12 publications
“…Some studies have proposed to transform the joint space of the human body to enhance different structural features, and to perform signal filtering or aggregate vertex information based on graph Laplacian eigendecomposition [15]. Graph scattering transforms increase the richness of graph representations by changing the bandwidth size [16] or by using different graph signal filters [17] to obtain the expected spectrum. On this basis, the wavelet diffusion method [18] and the parametric feature learner [19] achieve good performance.…”
Section: B. Graph Structure
confidence: 99%
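The quoted passage mentions signal filtering based on graph Laplacian eigendecomposition. A hedged sketch of that generic technique, under the standard spectral-filtering convention y = U g(Λ) Uᵀ x (the filter shape g and the toy graph are illustrative assumptions, not taken from any of the cited works):

```python
import numpy as np

def laplacian(A):
    """Combinatorial graph Laplacian L = D - A."""
    return np.diag(A.sum(axis=1)) - A

def spectral_filter(A, x, g):
    """Filter signal x by shaping its spectrum with g(lambda)
    in the Laplacian eigenbasis: y = U g(Lam) U^T x."""
    lam, U = np.linalg.eigh(laplacian(A))  # eigendecomposition L = U Lam U^T
    return U @ (g(lam) * (U.T @ x))

# Toy 3-node path graph; heat-kernel g(lambda) = exp(-lambda) acts as a low-pass
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], float)
x = np.array([1.0, -1.0, 1.0])
y = spectral_filter(A, x, lambda lam: np.exp(-lam))
print(y)
```

An all-pass filter g(lambda) = 1 returns x unchanged, which is a quick sanity check that the eigenbasis round-trip is correct.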
“…[13,66] develop diffusion wavelets. [44,45] integrate designed scattering filters and parameterized feature learners. [47] expands GSTs on the spatio-temporal domain.…”
Section: Graph Representation Learning
confidence: 99%
“…[43]. [Table residue: quoted comparison of motion-prediction errors for DMGNN [35], Traj-GCN [41], MST-GCN [9], STSGCN [53], and SPGSN across several prediction horizons; the numeric layout was lost in extraction.]…”
confidence: 99%
“…Similarly, these works prevent the over-smoothing issue by replacing the aggregation strategy of the entire network, lacking an understanding of node-specific differentiations. There are also some works that make efforts to theoretically propose methods for training deep GNNs (Xu et al, 2021;Min et al, 2020). However, these works are limited to specific types of GNNs, lacking generalizability and practical significance.…”
Section: Details of Datasets and Backbones
confidence: 99%