2022
DOI: 10.1016/j.media.2021.102272

RA-GCN: Graph convolutional network for disease prediction problems with imbalanced data

Cited by 61 publications (13 citation statements)
References 27 publications
“…The fraction of the majority classes is µ, and for all experiments we set µ = 0.5 and round down the result µ·k. The training loss L_Q from Section 2.4 is adopted. We implement two groups of baselines for comparison: (1) popular quantity-imbalance methods for general scenarios: Re-weight [20] (RW), Focal Loss [28] (Focal) and Class Balanced Loss [9] (CB); (2) graph-specific quantity-imbalance methods: DR-GCN [44], RA-GCN [14] and GraphSMOTE [58]. To jointly handle the topology- and quantity-imbalance issues and demonstrate their orthogonality, we combine our ReNode method with the three general quantity-imbalance methods (RW, Focal, CB).…”
Section: ReNode for the Compound Scene of TINL and QINL (mentioning)
confidence: 99%
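The general quantity-imbalance baselines named in this passage (RW, Focal, CB) all reshape the cross-entropy loss according to the training-label distribution. The sketch below illustrates that idea in PyTorch under stated assumptions; the function names, the `beta` default, and the choice to normalise weights to an average of 1 are illustrative, not the cited papers' code.

```python
import torch
import torch.nn.functional as F


def class_weights(labels, num_classes, scheme="rw", beta=0.999):
    """Per-class weights from the label distribution of the training nodes."""
    counts = torch.bincount(labels, minlength=num_classes).float().clamp(min=1)
    if scheme == "rw":                       # Re-weight: inverse class frequency
        w = 1.0 / counts
    else:                                    # Class Balanced: effective number of samples
        w = (1.0 - beta) / (1.0 - beta ** counts)
    return w * num_classes / w.sum()         # normalise so weights average to 1


def weighted_focal_loss(logits, labels, weights, gamma=2.0):
    """Focal cross-entropy with per-class weights; gamma = 0 recovers weighted CE."""
    log_p = F.log_softmax(logits, dim=-1)
    p_t = log_p.gather(1, labels.unsqueeze(1)).squeeze(1).exp()   # prob. of the true class
    ce = F.nll_loss(log_p, labels, weight=weights, reduction="none")
    return ((1.0 - p_t) ** gamma * ce).mean()
```

Because gamma = 0 and uniform weights reduce this to plain cross-entropy, the same function can toggle between the RW, CB, and Focal baselines in an ablation.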
“…DR-GCN [44] proposes two types of regularization to tackle quantity imbalance: class-conditioned adversarial training and a latent-distribution constraint on unlabeled nodes. RA-GCN [14] proposes to automatically learn to weight the training samples of the different classes in an adversarial training manner. AdaGCN (Shi et al.) [45] proposes to leverage a boosting algorithm to handle the quantity-imbalance issue for the node classification task.…”
Section: Related Work (mentioning)
confidence: 99%
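The adversarial sample weighting attributed to RA-GCN [14] here can be read as a min-max game between a classifier and per-class weighting networks. The sketch below is a simplified, assumption-laden reading of that idea, not the paper's exact procedure: `classifier` stands for any node classifier (e.g. a GCN producing logits from node features `x`), `weight_nets` is assumed to be an `nn.ModuleList` with one small scoring network per class, and the optimisers and update schedule are hypothetical.

```python
import torch
import torch.nn.functional as F


def adversarial_step(classifier, weight_nets, opt_cls, opt_w, x, y, num_classes):
    """One alternating update: the classifier minimises a weighted loss,
    the per-class weighting networks are updated to maximise it."""
    logits = classifier(x)
    per_node_ce = F.cross_entropy(logits, y, reduction="none")

    # One scoring network per class; weights are softmax-normalised
    # over the nodes belonging to that class.
    weights = torch.zeros_like(per_node_ce)
    for c in range(num_classes):
        idx = (y == c).nonzero(as_tuple=True)[0]
        if idx.numel() == 0:
            continue
        scores = weight_nets[c](x[idx]).squeeze(-1)
        weights[idx] = torch.softmax(scores, dim=0)

    # Classifier step: weights are treated as constants (detached).
    cls_loss = (weights.detach() * per_node_ce).sum() / num_classes
    opt_cls.zero_grad()
    cls_loss.backward()
    opt_cls.step()

    # Weighting step: gradient ascent on the same objective, with the
    # classifier outputs detached so only the weighting networks move.
    adv_loss = -(weights * per_node_ce.detach()).sum() / num_classes
    opt_w.zero_grad()
    adv_loss.backward()
    opt_w.step()
    return cls_loss.item()
```

Normalising the scores with a softmax inside each class keeps the total weight per class fixed, so the weighting networks can only redistribute attention toward harder or rarer samples rather than inflate the loss arbitrarily.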
“…Then, the pseudo labels are used to train a student graph network for Autism spectrum disorder or Alzheimer's disease prediction. RA-GCN [28], on the other hand, addressed the imbalanced class distribution in medical data by assigning each class a graph-based neural network responsible for weighting that class's samples. The whole architecture is then trained in an adversarial manner, so that the classifier adapts itself with attention to rare cases.…”
Section: Introduction (mentioning)
confidence: 99%
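The teacher-student pseudo-labelling scheme mentioned at the start of that passage can be illustrated with a short sketch. The snippet below is a generic simplification under stated assumptions, not the cited paper's pipeline: `teacher` and `student` stand for arbitrary (graph-based) subject classifiers, and the confidence threshold and loss weighting `alpha` are hypothetical choices.

```python
import torch
import torch.nn.functional as F


@torch.no_grad()
def make_pseudo_labels(teacher, x_unlabelled, threshold=0.9):
    """Label unlabelled subjects with the teacher; keep only confident predictions."""
    probs = F.softmax(teacher(x_unlabelled), dim=-1)
    conf, pseudo = probs.max(dim=-1)
    keep = conf >= threshold
    return pseudo, keep


def student_loss(student, x_lab, y_lab, x_unl, pseudo, keep, alpha=0.5):
    """Supervised loss on labelled subjects plus a down-weighted pseudo-label term."""
    loss = F.cross_entropy(student(x_lab), y_lab)
    if keep.any():
        loss = loss + alpha * F.cross_entropy(student(x_unl[keep]), pseudo[keep])
    return loss
```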
“…Indeed, GCNs are widely used for disease diagnosis [5], [6], because biomarkers shared across all subjects are critical for recognizing the common patterns associated with diseases. Previous studies have explored the development of GCNs for various prediction tasks on fMRI data [3], [5], [7], [8]. For example, Parisot et al. [5] apply a vanilla GCN for supervised disease prediction with fMRI data.…”
Section: Introduction (mentioning)
confidence: 99%
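The "vanilla GCN" referred to in this passage is the standard propagation rule H^(l+1) = D̂^(-1/2) Â D̂^(-1/2) H^(l) W^(l) with Â = A + I. A minimal dense PyTorch layer implementing it is sketched below; how the population graph is built (e.g. adjacency from imaging and phenotypic similarity, as in Parisot et al. [5]) is omitted and would be an assumption.

```python
import torch
import torch.nn as nn


class GCNLayer(nn.Module):
    """One dense graph-convolution layer: symmetric normalisation then projection."""

    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim, bias=False)

    def forward(self, x, adj):
        a_hat = adj + torch.eye(adj.size(0), device=adj.device)     # add self-loops
        d_inv_sqrt = a_hat.sum(dim=1).pow(-0.5)                      # D^(-1/2)
        a_norm = d_inv_sqrt.unsqueeze(1) * a_hat * d_inv_sqrt.unsqueeze(0)
        return a_norm @ self.lin(x)                                  # propagate, then project
```

Stacking two such layers with a ReLU in between, e.g. `logits = layer2(torch.relu(layer1(x, adj)), adj)`, gives a simple subject-level classifier of the kind these population-graph studies describe.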