2022
DOI: 10.48550/arxiv.2206.00606
Preprint

Higher-Order Attention Networks

Abstract: This paper introduces higher-order attention networks (HOANs), a novel class of attention-based neural networks defined on a generalized higher-order domain called a combinatorial complex (CC). Similar to hypergraphs, CCs admit arbitrary set-like relations between a collection of abstract entities. Simultaneously, CCs permit the construction of hierarchical higher-order relations analogous to those supported by cell complexes. Thus, CCs effectively generalize both hypergraphs and cell complexes and combine the…
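To make the abstract's central object concrete, the following is a minimal, hypothetical sketch of a combinatorial complex as a data structure: cells are arbitrary subsets of abstract entities (as in a hypergraph), each carrying an integer rank that must not decrease under inclusion (giving the hierarchy of a cell complex). The class name, rank axiom check, and example cells are illustrative assumptions, not the paper's implementation.

```python
class CombinatorialComplex:
    """Toy combinatorial complex: set-like cells with an order-preserving rank."""

    def __init__(self):
        self.cells = {}  # frozenset of entities -> integer rank

    def add_cell(self, entities, rank):
        cell = frozenset(entities)
        # Rank axiom: if one cell contains another, its rank must be >= the
        # contained cell's rank (rank is order-preserving under inclusion).
        for other, r in self.cells.items():
            if other < cell and r > rank:
                raise ValueError("rank must not decrease under inclusion")
            if cell < other and rank > r:
                raise ValueError("rank must not decrease under inclusion")
        self.cells[cell] = rank


cc = CombinatorialComplex()
for v in "abcd":
    cc.add_cell({v}, rank=0)             # rank-0 cells: the entities themselves
cc.add_cell({"a", "b"}, rank=1)          # a pairwise relation, as in a graph
cc.add_cell({"a", "b", "c"}, rank=2)     # a hierarchical 2-cell, as in a cell complex
cc.add_cell({"b", "c", "d"}, rank=1)     # a set-like relation, as in a hypergraph
```

Note how the same structure hosts both a hypergraph-style rank-1 set relation over three entities and a cell-complex-style rank-2 cell, which is the sense in which CCs generalize both domains.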

Cited by 6 publications (5 citation statements)
References 75 publications
“…As a first numerical test, we compare the performance of our filtering strategy with the approach proposed in [26], adapted to filtering signals defined over cell complexes by simply replacing B2 with the matrix in (3) that captures the upper incidence relations of a cell complex. Our optimal FIR filters are derived by solving the least-squares problem in (32) and (33), where we set α0 = 0. Although the difference between our approach and that of [26] is minimal (we simply fix α0 = 0 a priori, whereas in [26] the choice of α0 is part of the optimization), we will show that there is a clear advantage in terms of accuracy.…”
Section: Filtering Over Cell Complexes
confidence: 99%
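The FIR filtering over cell complexes described in this citation can be sketched as polynomial filtering of edge signals via a Hodge Laplacian built from incidence matrices. The sketch below uses a toy complex (one filled triangle); the matrices B1 and B2, the filter coefficients, and the signal are made-up illustrations under this assumption, not the cited paper's implementation.

```python
import numpy as np

# Toy cell complex: a triangle with 3 nodes, 3 edges, and 1 filled 2-cell.
B1 = np.array([[-1, -1,  0],
               [ 1,  0, -1],
               [ 0,  1,  1]], dtype=float)      # node-to-edge incidence
B2 = np.array([[1], [-1], [1]], dtype=float)    # edge-to-cell (upper) incidence

# Edge Hodge Laplacian: lower part from B1, upper part from B2.
L1 = B1.T @ B1 + B2 @ B2.T


def fir_filter(x, coeffs):
    """Apply H = sum_k coeffs[k] * L1^k to an edge signal x."""
    y = np.zeros_like(x)
    Lx = x.copy()                 # starts at L1^0 @ x = x
    for c in coeffs:
        y += c * Lx
        Lx = L1 @ Lx              # advance to the next Laplacian power
    return y


x = np.array([1.0, 0.0, -1.0])                # example edge signal
y = fir_filter(x, coeffs=[0.0, 0.5, 0.1])     # alpha_0 = 0, as in the quoted text
```

Setting the first coefficient to zero, as the citation discusses for α0, simply drops the identity term of the filter polynomial; the remaining coefficients would be fit by least squares against a desired frequency response.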
“…An introduction to the processing of signals over cell complexes was presented in [31], where it was shown that convolutional filters can serve as building blocks in neural networks defined over cell complexes. Recently, novel attention-based neural networks have been defined over cell complexes [32], [33]. At a more fundamental level, one of the key unresolved issues in DNNs is explainability.…”
Section: Introduction
confidence: 99%
“…[Algorithm excerpt: inputs x0, x1, xh, G2 (learned signals and weights, i.e., the metric tensor); 1: function EDGE FLOW ESTIMATION(Inputs); 2: steps referencing equations (19), (20), and (…”
Section: Joint Learning Of Edge Flows and Weights
confidence: 99%
“…In [19], weighted persistent homology is proposed to analyse biomolecular data, where the weights reflect certain physical, chemical, or biological properties in the simplicial complex generation. The works in [12, 20] proposed attentional deep architectures for simplex-structured data that can also be seen as methods implicitly working on WSCs. Contribution.…”
Section: Introduction
confidence: 99%
“…These works analyze HOGDMs in the context of complex physical processes [22], [52], [234] and signal processing [213]. Recent works on topological deep learning [122], [123], [192] cover a very broad set of topics related to topological neural networks. Another work [9] focuses on broad representation learning for hypergraphs.…”
Section: Scope Of This Work Vs Related Surveys and Analyses
confidence: 99%