Proceedings of the 45th International ACM SIGIR Conference on Research and Development in Information Retrieval 2022
DOI: 10.1145/3477495.3531757
Meta-Knowledge Transfer for Inductive Knowledge Graph Embedding

Cited by 46 publications (13 citation statements)
References 27 publications
“…As a representative work of spatial methods, GCN [17] further simplifies graph convolution in the spectral domain by using a first-order approximation, which enables graph convolution operations to be carried out in the spatial domain and greatly improves the computational efficiency of graph convolution models. Moreover, to speed up the training of graph neural networks, …”

[Table: Graph-based Knowledge Distillation methods]
- DKD methods, output layer: DKWISL [18], KTG [19], DGCN [20], SPG [21], GCLN [22]
- DKD methods, middle layer: IEP [23], HKD [24], MHGD [25], IRG [26], DOD [27], HKDIFM [28], KDExplainer [29], TDD [30], DualDE [31]
- DKD methods, constructed graph: CAG [32], GKD [33], MorsE [34], BAF [35], LAD [36], GD [37], GCMT [38], GraSSNet [39], LSN [40], IntRA-KD [41], RKD [42], CC [43], SPKD [44], KCAN [45]
- GKD methods …

Section: Graph Neural Network
confidence: 99%
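The first-order approximation the excerpt credits to GCN [17] reduces spectral graph convolution to the spatial propagation rule H' = σ(D̂^(-1/2) Â D̂^(-1/2) H W), where Â is the adjacency matrix with self-loops. A minimal NumPy sketch of one such layer (the toy graph, feature sizes, and random weights are made up for illustration):

```python
import numpy as np

# Toy 4-node undirected graph (adjacency matrix), made up for illustration.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 1],
              [0, 1, 0, 1],
              [0, 1, 1, 0]], dtype=float)

def gcn_layer(A, H, W):
    """One GCN layer: H' = ReLU(D^-1/2 (A + I) D^-1/2 H W)."""
    A_hat = A + np.eye(A.shape[0])            # add self-loops
    d = A_hat.sum(axis=1)                     # degrees of A_hat
    D_inv_sqrt = np.diag(d ** -0.5)
    A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt  # symmetric normalization
    return np.maximum(A_norm @ H @ W, 0.0)    # ReLU activation

rng = np.random.default_rng(0)
H0 = rng.standard_normal((4, 3))  # initial node features
W0 = rng.standard_normal((3, 2))  # layer weight matrix
H1 = gcn_layer(A, H0, W0)
print(H1.shape)  # (4, 2)
```

Because the normalized adjacency is sparse in practice, this spatial form avoids the eigendecomposition required by earlier spectral methods, which is the efficiency gain the excerpt refers to.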
“…GKD [33] uses the Frobenius norm to minimize the distribution difference between teacher and student and thereby compresses the student model. MorsE [34] employs an L2 loss to transfer meta-knowledge, improving the student model on link prediction and question-answering tasks.…”
Section: Graph-based Knowledge Distillation for Deep Neural Network
confidence: 99%
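The Frobenius-norm objective described above can be sketched as a simple distillation loss between teacher and student embedding matrices; the shapes below are illustrative assumptions, not the exact formulations of GKD [33] or MorsE [34]:

```python
import numpy as np

def frobenius_distill_loss(teacher_emb, student_emb):
    """Frobenius-norm distance ||T - S||_F between teacher and
    student embedding matrices of the same shape."""
    diff = teacher_emb - student_emb
    return np.sqrt((diff ** 2).sum())

rng = np.random.default_rng(1)
T = rng.standard_normal((5, 8))  # teacher node embeddings (toy sizes)
S = rng.standard_normal((5, 8))  # student node embeddings
loss = frobenius_distill_loss(T, S)
print(float(loss) >= 0.0)  # True
```

When teacher and student dimensions differ, a learned linear projection of the student embeddings is typically applied first; the L2 case attributed to MorsE would analogously penalize the squared distance between transferred meta-knowledge embeddings and the target embeddings.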
“…For further study of RCGNN, we are interested in continuing to work on the combination of RL and GNN optimization, and in learning how to better capture features from the raw data during representation learning. To improve the model, transfer learning [22] and meta-learning [6] can be used. We will keep enhancing the differentiation stability of the model and will finally try to extend our model to dynamic and temporal graphs.…”
Section: Table II and Table
confidence: 99%