2021 International Joint Conference on Neural Networks (IJCNN)
DOI: 10.1109/ijcnn52387.2021.9534441
Triplet Knowledge Distillation Networks for Model Compression

Cited by 1 publication (1 citation statement)
References 15 publications
“…Different from grid-structured and independently generated images, distribution shifts in graph-structured data can be more complicated and harder to address, and often require graph-specific technical originality. For instance, Yang et al. (2022c) propose to identify invariant substructures, i.e., subsets of nodes with causal effects on labels, in input graphs to learn stable predictive relations across environments, while Yang et al. (2022b) draws on an analogy to thermodynamic diffusion on graphs to build a principled knowledge distillation model for geometric knowledge transfer and generalization.…”
Section: Introduction
Confidence: 99%