2023
DOI: 10.1109/tnnls.2022.3161453
Graph Decoupling Attention Markov Networks for Semisupervised Graph Node Classification

Cited by 16 publications (10 citation statements)
References 24 publications
“…In this experiment, the loss is computed for each optimization iteration of the DTI association network and iteratively updated using eqs and until the loss falls below the predefined threshold. Compared with a fixed number of iterations, the experiment employs a dynamic stopping optimization strategy, which proves more effective for the optimization process, especially for small-sample data sets.…”
Section: Methods
Mentioning confidence: 99%
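The dynamic stopping strategy described above can be sketched as a plain optimization loop that terminates when the loss drops below a threshold rather than after a fixed iteration count. This is a minimal illustration, not the cited paper's implementation: the objective, gradient, learning rate, and threshold are all hypothetical stand-ins for the DTI network loss, and a `max_iters` safety cap is added so the loop always terminates.

```python
import numpy as np

def optimize_until_threshold(loss_fn, grad_fn, w0, lr=0.1,
                             threshold=1e-3, max_iters=10_000):
    """Dynamic stopping: iterate until the loss falls below a
    predefined threshold instead of running a fixed iteration count."""
    w = np.asarray(w0, dtype=float)
    for it in range(max_iters):
        loss = loss_fn(w)
        if loss < threshold:        # dynamic stopping criterion
            return w, loss, it
        w = w - lr * grad_fn(w)     # one gradient-descent step
    return w, loss_fn(w), max_iters

# Toy quadratic objective as a stand-in for the DTI association loss.
loss = lambda w: float(np.sum((w - 2.0) ** 2))
grad = lambda w: 2.0 * (w - 2.0)
w, final_loss, iters = optimize_until_threshold(loss, grad, np.zeros(3))
```

For small-sample settings the appeal is that the loop stops as soon as the fit is good enough, instead of over- or under-running a hand-picked iteration budget.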
See 1 more Smart Citation
“…In this experiment, the loss is computed for each optimization iteration of the DTI association network and iteratively updated using eqs and until the loss falls below the predefined threshold. Compared to a fixed number of iterations, the experiment employs a dynamic stopping optimization strategy, which proves to be more effective for the optimization process, especially for small-sample data sets. , …”
Section: Methodsmentioning
confidence: 99%
“…However, current methods based on similarity measurements of DTI association networks typically assume that the network topology is stable and low-noise, which does not align with reality. Constructed DTI networks often come with noise or structural incompleteness issues. Noise in the network may introduce false interactions, distorting the true relationships between drugs and targets, thereby affecting model learning and prediction.…”
Section: Introduction
Mentioning confidence: 99%
“…GSL [52][53][54][55] is an important branch of the graph representation learning field that focuses on optimizing graph structure through joint training to enhance the performance of GNNs. GSL methods can be divided into three main categories: metric-learning methods [56][57][58], neural-network methods [59,60], and direct methods [52,61,62]. In prediction tasks, the traditional GSL process and the GSL process used for traffic prediction are shown in Fig.…”
Section: Graph Structure Learning
Mentioning confidence: 99%
“…Next, the final embedding representation h is transformed into a dynamic graph structure using a metric learning method [66]. In this paper, the weighted cosine distance is chosen as the metric function [57]: α_ij = cos(w ⊙ h_i, w ⊙ h_j), where ⊙ denotes the Hadamard product and w is a learnable weight vector with the same dimension as the final embedding representations h_i and h_j. To stabilize the learning process and enhance expressiveness, this paper extends the similarity metric function to a multi-head version.…”
Section: In
Mentioning confidence: 99%
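The multi-head weighted cosine metric quoted above can be sketched as follows. This is an illustrative NumPy version under stated assumptions, not the cited paper's code: each head holds one weight vector w applied via the Hadamard product w ⊙ h_i, per-head cosine similarities are computed, and heads are averaged; the embedding and head counts are arbitrary, and in the actual method W would be learned jointly with the GNN rather than sampled randomly.

```python
import numpy as np

def weighted_cosine(h, W, eps=1e-8):
    """Multi-head weighted cosine similarity.
    h: (n, d) node embeddings; W: (m, d), one weight vector per head.
    Returns an (n, n) similarity matrix averaged over the m heads."""
    sims = []
    for w in W:                  # one head per weight vector
        hw = h * w               # Hadamard product w ⊙ h_i, row-wise
        norms = np.linalg.norm(hw, axis=1, keepdims=True) + eps
        hw = hw / norms          # normalize rows to unit length
        sims.append(hw @ hw.T)   # cos(w ⊙ h_i, w ⊙ h_j) for all pairs
    return np.mean(sims, axis=0)

rng = np.random.default_rng(0)
h = rng.normal(size=(5, 8))      # 5 nodes, 8-dim embeddings (toy sizes)
W = rng.normal(size=(4, 8))      # 4 heads
A = weighted_cosine(h, W)
```

Averaging several independently weighted heads is what stabilizes the learned graph: a single weight vector can overweight a few embedding dimensions, while the mean over heads smooths out that variance.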
“…GLCN (Jiang et al. 2019) exploits a unified framework for executing graph learning and graph convolution. IDGL (Chen, Wu, and Zaki 2020) leverages graph regularization terms to enhance learning quality. Pro-GNN (Jin et al. 2020) employs alternating optimization to realize the mutual regularization of graph learning and network update.…”
Section: Related Work — Graph Structure Learning
Mentioning confidence: 99%