ICASSP 2022 - 2022 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
DOI: 10.1109/icassp43922.2022.9746017

Simplicial Convolutional Neural Networks

Abstract: Graphs can model networked data by representing them as nodes and their pairwise relationships as edges. Recently, signal processing and neural networks have been extended to process and learn from data on graphs, with achievements in tasks like graph signal reconstruction, graph or node classifications, and link prediction. However, these methods are only suitable for data defined on the nodes of a graph. In this paper, we propose a simplicial convolutional neural network (SCNN) architecture to learn from dat…

Cited by 25 publications (18 citation statements) | References 18 publications
“…Proof: See Appendix C. Intuitively, the two previous propositions state that for the simplicial filter H_k the labeling and reference orientation of simplices are inconsequential for the filter output. These two properties have been previously reported in the context of neural networks on SCs [27], [29], [32]. These equivariances imply that we can learn a filter to process a given simplex by seeing only permuted and reoriented versions of it: if two parts of an SC are topologically equivalent and the simplices support corresponding flows, a simplicial convolutional filter yields equivalent outputs.…”
Section: Simplicial Convolutional Filters (supporting)
confidence: 66%
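The orientation equivariance quoted above can be checked numerically. The sketch below is illustrative (the toy complex and filter coefficients are assumptions, not taken from the paper): it builds a small simplicial complex — a filled triangle plus one extra edge — applies a simplicial convolutional filter (a polynomial in the lower and upper Hodge Laplacians), and verifies that flipping the reference orientation of some edges commutes with filtering.

```python
import numpy as np

# Toy simplicial complex: 4 nodes, 4 oriented edges, 1 filled triangle (0,1,2).
B1 = np.array([[-1, -1,  0,  0],
               [ 1,  0, -1, -1],
               [ 0,  1,  1,  0],
               [ 0,  0,  0,  1]], dtype=float)     # node-to-edge incidence
B2 = np.array([[1], [-1], [1], [0]], dtype=float)  # edge-to-triangle incidence
assert np.allclose(B1 @ B2, 0)  # boundary-of-boundary condition

def simplicial_filter(B1, B2, alpha, beta):
    """H = alpha[0]*I + sum_l alpha[l]*L_lower^l + sum_l beta[l-1]*L_upper^l,
    a polynomial in the lower and upper Hodge Laplacians acting on edge flows."""
    Ld, Lu = B1.T @ B1, B2 @ B2.T
    H = alpha[0] * np.eye(B1.shape[1])
    for l, a in enumerate(alpha[1:], start=1):
        H += a * np.linalg.matrix_power(Ld, l)
    for l, b in enumerate(beta, start=1):
        H += b * np.linalg.matrix_power(Lu, l)
    return H

rng = np.random.default_rng(0)
x = rng.standard_normal(4)                # an arbitrary edge flow
alpha, beta = [0.5, -0.3, 0.1], [0.2, -0.1]  # illustrative coefficients

# Flip the reference orientation of edges 1 and 3: D is diagonal with +/-1.
D = np.diag([1.0, -1.0, 1.0, -1.0])
H = simplicial_filter(B1, B2, alpha, beta)
H_flipped = simplicial_filter(B1 @ D, D @ B2, alpha, beta)

# Equivariance: reorienting the edges and the flow commutes with filtering.
assert np.allclose(H_flipped @ (D @ x), D @ (H @ x))
```

Since D squares to the identity, both Laplacians transform as D L D under reorientation, so the whole polynomial transforms the same way — which is what the assertion checks.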
“…Indeed, all the filters studied in the current paper can be generalized to cell complexes, by substituting the appropriate incidence matrices. Different neural network architectures on SCs have also been developed to learn from simplicial data, e.g., [27], [28], [29], [30], [31], [32], [33]. Importantly, the linear operation in these different neural convolutional layers may be understood as a simplicial filter (as discussed here) with different filter parameters.…”
(mentioning)
confidence: 99%
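The remark that different simplicial neural layers share one underlying filter form can be made concrete. A minimal sketch, on an assumed toy complex with illustrative coefficients: because B1 B2 = 0, the lower and upper Laplacians annihilate each other, so a layer whose linear operation is a polynomial in the full Hodge Laplacian L1 = L_d + L_u is exactly the two-polynomial simplicial filter with tied coefficients.

```python
import numpy as np

# Toy simplicial complex: 4 nodes, 4 oriented edges, 1 filled triangle (0,1,2).
B1 = np.array([[-1, -1,  0,  0],
               [ 1,  0, -1, -1],
               [ 0,  1,  1,  0],
               [ 0,  0,  0,  1]], dtype=float)     # node-to-edge incidence
B2 = np.array([[1], [-1], [1], [0]], dtype=float)  # edge-to-triangle incidence

Ld, Lu = B1.T @ B1, B2 @ B2.T   # lower and upper edge Laplacians
L1 = Ld + Lu                    # full Hodge Laplacian

# B1 @ B2 = 0 implies the two Laplacians annihilate each other...
assert np.allclose(Ld @ Lu, 0) and np.allclose(Lu @ Ld, 0)

# ...so L1^l = Ld^l + Lu^l for l >= 1: a polynomial filter in L1 equals the
# lower/upper simplicial filter whose two coefficient sets are tied.
c = [0.4, -0.25, 0.05]          # illustrative filter coefficients
I = np.eye(4)
H_joint = c[0] * I + c[1] * L1 + c[2] * L1 @ L1
H_split = c[0] * I + c[1] * Ld + c[2] * Ld @ Ld + c[1] * Lu + c[2] * Lu @ Lu
assert np.allclose(H_joint, H_split)
```

Untying the two coefficient sets recovers the more expressive filter discussed in the excerpt, which is one sense in which different published layers are special parameter choices of the same filter.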
“…The convex optimization problem (18) can be efficiently solved (and distributed) by off-the-shelf convex programming toolboxes. (16) X^(1) = H^(1) X^(1) + H^(2) X^(2) + E (17)…”
Section: Topology Identification In Distribution Grids (mentioning)
confidence: 99%
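Problem (18) itself is not reproduced in the excerpt, so the sketch below is only a hypothetical, unconstrained variant of fitting the linear model in (17): all dimensions and matrices are invented, H^(1) is assumed known, and the noise E is set to zero so that H^(2) is recovered exactly by ordinary least squares. The citing paper's actual formulation presumably carries structural constraints not shown here.

```python
import numpy as np

rng = np.random.default_rng(1)
n1, n2, T = 5, 4, 60               # hypothetical dimensions, T samples

# Hypothetical ground-truth matrices (illustrative only, not from the paper).
H1 = 0.2 * rng.standard_normal((n1, n1))
H2 = rng.standard_normal((n1, n2))
X2 = rng.standard_normal((n2, T))

# Generate X1 consistent with the noise-free model X1 = H1 X1 + H2 X2.
X1 = np.linalg.solve(np.eye(n1) - H1, H2 @ X2)

# With H1 assumed known, (17) becomes a linear regression for H2:
# (X1 - H1 X1) = H2 X2, solved column-wise by least squares.
Y = X1 - H1 @ X1
H2_hat = np.linalg.lstsq(X2.T, Y.T, rcond=None)[0].T
assert np.allclose(H2_hat, H2)   # exact recovery in the noise-free case
```

With noise, the same least-squares step returns the minimizer of the Frobenius-norm residual rather than the exact H^(2); any convex constraints from (18) would be layered on top of this fit.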
“…Recently, machine learning models have been developed to model data with support on topological spaces modeled as simplicial complexes [108,86,31,77,22,45,44,11], cell complexes [43,82,20,78] and hypergraphs [101,9,50,4,35,10,37]. These models have been used in link prediction [26,44], optimal homology generator detection [52], and trajectory prediction [77].…”
Section: Introduction (mentioning)
confidence: 99%