2021
DOI: 10.1007/978-3-030-73194-6_10
Label Contrastive Coding Based Graph Neural Network for Graph Classification

Cited by 20 publications (14 citation statements)
References 9 publications
“…In addition, the current log data is cumbersome and difficult to label, and how to subtract the reliance on label features in anomaly detection is one of the current difficulties. Therefore, label contrastive coding [38] is introduced to improve the intra-class separability of instance-level by using label contrastive loss.…”
Section: Figure 4: The Framework of the HP-SAGE Model
confidence: 99%
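The statement above describes using a label contrastive loss to improve intra-class separability: instances sharing a label act as positives, all others as negatives. A minimal NumPy sketch of such a loss (SupCon-style; the function name, temperature value, and normalization choices are assumptions, not taken from the cited paper):

```python
import numpy as np

def label_contrastive_loss(embeddings, labels, temperature=0.5):
    """Hypothetical label contrastive loss: embeddings with the same
    label are pulled together, those with different labels pushed apart."""
    # L2-normalize so the dot product is cosine similarity
    z = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sim = z @ z.T / temperature
    n = len(labels)
    total, counted = 0.0, 0
    for i in range(n):
        pos = [j for j in range(n) if j != i and labels[j] == labels[i]]
        others = [j for j in range(n) if j != i]
        if not pos:
            continue  # an instance with no same-label partner contributes nothing
        num = np.sum(np.exp(sim[i, pos]))      # same-label similarities
        denom = np.sum(np.exp(sim[i, others])) # all pairwise similarities
        total += -np.log(num / denom)
        counted += 1
    return total / max(counted, 1)
```

Under this formulation, embeddings that cluster by label yield a lower loss than embeddings whose labels are interleaved, which is the intra-class separability effect the excerpt refers to.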
“…Existing works generally treat node representations or the graph summary as anchors (Velickovic et al. 2019; Zeng and Xie 2021; Ren, Bai, and Zhang 2021; Cao et al. 2021; Sun et al. 2019). For instance, DGI and MVGRL treat the graph summary as anchors, which is first convolved by GCN and then summarized by a readout function.…”
Section: Anchor and Negative Embedding Generation
confidence: 99%
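The excerpt describes the DGI/MVGRL pattern of building a graph-summary anchor: node features are first propagated by a GCN layer, then collapsed into a single graph-level vector by a readout function. A minimal NumPy sketch of that two-step pipeline (layer shapes, the mean-pool readout, and the tanh nonlinearities are illustrative assumptions):

```python
import numpy as np

def gcn_layer(adj, x, w):
    # One propagation step: add self-loops, symmetrically normalize,
    # then aggregate neighbor features and apply a nonlinearity.
    a = adj + np.eye(adj.shape[0])
    d = np.diag(1.0 / np.sqrt(a.sum(axis=1)))
    return np.tanh(d @ a @ d @ x @ w)

def graph_summary(node_emb):
    # Readout: mean-pool node embeddings into one graph-level anchor vector.
    return np.tanh(node_emb.mean(axis=0))
```

In the contrastive setup the excerpt describes, this summary vector serves as the anchor against which node-level (positive) and corrupted (negative) embeddings are scored.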
“…SUGAR [184] performed subgraph-to-graph contrast, which is to explore the interpretability and semantic connections between substructures and molecular graphs. In addition, graph-to-graph contrast methods [185], [186], [187] tried to learn semantics between the augmented graphs in the given dataset. On the other hand, MolCLR [98], MoCL [97], KCL [100], and MICRO-GRAPH [96] leveraged multi-level chemical knowledge where atoms, bond, subgraphs, or graphs can pose in developing chemical properties.…”
Section: Data Augmentation: How To Extend Our Knowledge On Chemical S...
confidence: 99%
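Graph-to-graph contrast, as the excerpt notes, compares augmented views of the same graph. A common augmentation is random edge dropping; a minimal NumPy sketch (the drop probability and the symmetric masking are assumptions, not specifics from the cited methods):

```python
import numpy as np

def drop_edges(adj, drop_prob=0.2, seed=0):
    """Hypothetical graph augmentation: randomly remove a fraction of
    edges while keeping the adjacency matrix symmetric."""
    rng = np.random.default_rng(seed)
    keep = rng.random(adj.shape) > drop_prob
    keep = np.triu(keep, 1)        # decide each undirected edge once
    keep = keep + keep.T           # mirror so the mask stays symmetric
    return adj * keep
```

Two calls with different seeds produce two views of the same graph, which a contrastive objective then pulls together in embedding space.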