Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
DOI: 10.18653/v1/d18-1032
Cross-lingual Knowledge Graph Alignment via Graph Convolutional Networks

Abstract: Multilingual knowledge graphs (KGs) such as DBpedia and YAGO contain structured knowledge of entities in several distinct languages, and they are useful resources for cross-lingual AI and NLP applications. Cross-lingual KG alignment is the task of matching entities with their counterparts in different languages, which is an important way to enrich the cross-lingual links in multilingual KGs. In this paper, we propose a novel approach for cross-lingual KG alignment via graph convolutional networks (GCNs). Given a…
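The abstract frames alignment as matching each entity with its counterpart in the other language's KG. A minimal sketch of that matching step, assuming a trained model has already produced entity embeddings in a shared space (the embeddings, names, and L1 distance here are illustrative, not the paper's exact setup):

```python
import numpy as np

def align_entities(emb_src, emb_tgt):
    """For each source entity, return the index of the nearest
    target entity under L1 (cityblock) distance."""
    # pairwise L1 distances: |src_i - tgt_j| summed over dimensions
    dists = np.abs(emb_src[:, None, :] - emb_tgt[None, :, :]).sum(axis=-1)
    return dists.argmin(axis=1)

# toy embeddings: 3 source entities, 3 target entities, 4-dim space
emb_src = np.array([[1., 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0]])
emb_tgt = np.array([[0., 1, 0, 0], [1, 0, 0, 0], [0, 0, 1, 0]])
print(align_entities(emb_src, emb_tgt))  # source 0 -> target 1, etc.
```

Once embeddings live in one space, alignment reduces to nearest-neighbour search, which is why seed alignments that tie the two embedding spaces together matter so much in this line of work.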

Cited by 489 publications (430 citation statements)
References 15 publications
“…Such methods only make use of one or two aspects of the aforementioned information. For example, Zhu et al (2017) relied only on topological features, while Wang et al (2018) exploited both topological and attribute features. Chen et al (2018) proposed a co-training algorithm to combine topological features and literal descriptions of entities.…”
Section: Descriptions
confidence: 99%
“…Graph convolutional networks (GCNs) (Kipf and Welling, 2017) are variants of convolutional networks that have proven effective in capturing information from graph structures, such as dependency graphs (Guo et al, 2019b), abstract meaning representation graphs (Guo et al, 2019a), and knowledge graphs (Wang et al, 2018). In practice, multi-layer GCNs are stacked to collect evidence from multi-hop neighbors.…”
Section: Cross-lingual Graph Embeddings
confidence: 99%
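The statement above notes that stacking multi-layer GCNs collects evidence from multi-hop neighbors. A minimal sketch of one propagation step in the Kipf and Welling (2017) formulation, H' = ReLU(D^{-1/2}(A + I)D^{-1/2} H W), on a toy graph (the identity features and weights are illustrative):

```python
import numpy as np

def gcn_layer(A, H, W):
    """One GCN propagation step: ReLU(D^{-1/2}(A + I)D^{-1/2} H W)."""
    A_hat = A + np.eye(A.shape[0])      # add self-loops
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(d ** -0.5)     # symmetric degree normalisation
    return np.maximum(D_inv_sqrt @ A_hat @ D_inv_sqrt @ H @ W, 0.0)

# toy 3-node path graph 0 - 1 - 2; stacking two layers lets node 0
# receive information from node 2, its 2-hop neighbour
A = np.array([[0., 1, 0], [1, 0, 1], [0, 1, 0]])
H = np.eye(3)   # one-hot input features
W = np.eye(3)   # identity weights, for illustration only
H2 = gcn_layer(A, gcn_layer(A, H, W), W)
print(H2[0, 2] > 0)  # True: the 2-hop signal reached node 0
```

After one layer, node 0's representation depends only on its 1-hop neighbourhood; the second layer is what propagates node 2's feature into node 0, which is the "multi-hop evidence" the citation refers to.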
“…Convolution Weights The first column of Table 5 corresponds to the weight usage and initialization settings used in the code for GCN-Align. We achieve slightly better results than published in [24], which we attribute to a more exhaustive parameter search. Interestingly, all best configurations use Adam optimizer instead of SGD.…”
Section: Methods
confidence: 49%
“…Method | Datasets | Metrics | Code
MTransE [5] | WK3l-15K, WK3l-120K, CN3l | H@10 (, MR) | yes
IPTransE [29] | DFB-{1,2,3} | H@{1,10}, MR | yes
JAPE [19] | DBP15K(JAPE) | H@{1,10,50}, MR | yes
KDCoE [4] | WK3l-60K | H@{1,10}, MR | yes
BootEA [20] | DBP15K(JAPE), DWY100K | H@{1,10}, MRR | yes
SEA [15] | WK3l-15K, WK3l-120K | H@{1,5,10}, MRR | yes
MultiKE [28] | DWY100K | H@{1,10}, MR, MRR | yes
AttrE [22] | DBP-LGD, DBP-GEO, DBP-YAGO | H@{1,10}, MR | yes
RSN [8] | custom DBP15K, DWY100K | H@{1,10}, MRR | yes
GCN-Align [24] | DBP15K(JAPE) | H@{1,10,50} | yes
CL-GNN [27] | DBP15K(JAPE) | H@{1,10} | yes
MuGNN [3] | DBP15K(JAPE), DWY100K | H@{1,10}, MRR | yes
NAEA [30] | DBP15K(JAPE), DWY100K | H@{1,10}, MRR | no…”
Section: Datasets, Metrics, Code
confidence: 99%
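The metrics surveyed above (H@k, MR, MRR) all derive from the rank of the true counterpart in each query's candidate list. A minimal sketch of how Hits@k and MRR are computed from a list of such ranks (the toy ranks are illustrative):

```python
def hits_at_k(ranks, k):
    """Fraction of queries whose true answer ranks in the top k."""
    return sum(r <= k for r in ranks) / len(ranks)

def mrr(ranks):
    """Mean reciprocal rank of the true answers (1-indexed ranks)."""
    return sum(1.0 / r for r in ranks) / len(ranks)

# toy ranks of the true counterpart for 4 query entities
ranks = [1, 2, 5, 10]
print(hits_at_k(ranks, 1))   # 0.25
print(hits_at_k(ranks, 10))  # 1.0
print(round(mrr(ranks), 3))  # 0.45
```

Mean rank (MR) is simply the average of the ranks; unlike MRR it is unbounded and dominated by a few badly ranked queries, which is one reason the table shows papers reporting different subsets of these metrics.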