2021
DOI: 10.1109/tkde.2021.3108192

Adaptive Hypergraph Auto-Encoder for Relational Data Clustering

Cited by 19 publications (4 citation statements)
References 28 publications
“…This unwanted case would mean that the embeddings alone contain no useful information about the LFP shape, because it would be possible to train a further auto-encoder on arbitrary permutations of the embeddings that performs equally well. However, in order to allow an interpretation of clusters in embedding space, neighbourhood relations between different LFP shapes should be conserved in the embedding layer [71]. Thus, in our study we used shallow networks to reduce the number of parameters, and added dropout layers to prevent the neural network from overfitting.…”
Section: Discussion
confidence: 99%
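The design choice quoted above (a shallow auto-encoder regularised with dropout so that the embedding stays interpretable) can be made concrete with a minimal sketch. The PyTorch code below is an illustration under assumed settings, not the cited study's implementation; the layer sizes (128 input features, an 8-dimensional embedding) and the dropout rate are hypothetical placeholders.

import torch
import torch.nn as nn

class ShallowAutoEncoder(nn.Module):
    """Shallow auto-encoder: one hidden layer per side keeps the parameter count small."""
    def __init__(self, n_features=128, n_embedding=8, p_drop=0.2):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(n_features, n_embedding),  # single shallow encoding layer
            nn.Tanh(),
            nn.Dropout(p_drop),                  # dropout guards against overfitting
        )
        self.decoder = nn.Linear(n_embedding, n_features)

    def forward(self, x):
        z = self.encoder(x)                      # low-dimensional embedding
        return self.decoder(z), z                # reconstruction and embedding

model = ShallowAutoEncoder()
x = torch.randn(32, 128)                         # a batch of hypothetical input waveforms
recon, z = model(x)
loss = nn.functional.mse_loss(recon, x)          # standard reconstruction objective

Few parameters and dropout are the two regularisers named in the statement, and both appear directly in this sketch; the neighbourhood-preservation requirement itself comes from the cited work [71].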
“…HGNN+ [23] introduces higher-order multimodal modelling on top of HGNN, learning the optimal solution in a single hypergraph and fusing the correlations of different modalities into a unified hypergraph. AHGAE [24] combines an adaptive hypergraph Laplacian smoothing filter with a relational reconstruction auto-encoder, so that the latent representation retains both the original and the higher-order information. DeepHGSL [25] proposes the hypergraph information bottleneck (HIB) principle, which is used to construct loss functions that strengthen correct connections and weaken incorrect ones during training.…”
Section: Hypergraph Convolution
confidence: 99%
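For context, the smoothing step that this statement attributes to AHGAE can be written with the standard normalized hypergraph Laplacian of Zhou et al.; the exact adaptive filter used in AHGAE may differ in detail, so the following is a sketch under that assumption. Here $H$ is the $|V| \times |E|$ incidence matrix, $W$ the diagonal matrix of hyperedge weights, and $D_v$, $D_e$ the vertex and hyperedge degree matrices:

\[
\Delta = I - D_v^{-1/2} H W D_e^{-1} H^{\top} D_v^{-1/2},
\qquad
\tilde{X} = \left(I - \sigma \Delta\right)^{k} X .
\]

Applying the low-pass filter $(I - \sigma \Delta)$ for $k$ rounds smooths vertex features over shared hyperedges, which is what allows the latent representation to retain both original and higher-order information before the reconstruction auto-encoder is trained.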
“…In recent years, Graph Neural Networks (GNNs) [44], [45] and their variants [46], [47], [48], [49] have emerged as powerful tools for data analysis, demonstrating their versatility across a broad spectrum of graph-structured tasks, including graph classification [50], [51], [52], graph clustering [53], [54], [55], and graph link prediction [56], [57]. The power of GNNs also extends beyond graph-structured data, as they have been applied effectively to non-graph-structured data as well.…”
Section: Graph and Hypergraph Neural Network
confidence: 99%