Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, 2021
DOI: 10.18653/v1/2021.naacl-main.221
Edge: Enriching Knowledge Graph Embeddings with External Text

Abstract: Knowledge graphs suffer from sparsity, which degrades the quality of representations generated by various methods. While there is an abundance of textual information throughout the web and many existing knowledge bases, aligning information across these diverse data sources remains a challenge in the literature. Previous work has partially addressed this issue by enriching knowledge graph entities based on "hard" co-occurrence of words present in the entities of the knowledge graphs and external text, while we …

Cited by 21 publications (10 citation statements)
References 21 publications (16 reference statements)
“…The original and the augmented graphs are then aligned to suppress the noise and distil relevant information. In our work, we focus on adding extra edges to the KG rather than nodes as in Rezayi et al. (2021) and Wang et al. (2016).…”
Section: KGE Methods
confidence: 99%
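The statement above contrasts adding extra edges with adding extra nodes. A minimal sketch of the edge-only variant, assuming a KG stored as (head, relation, tail) triples — the entity names, the `text_similarity` callable, the 0.8 threshold, and the `related_to` relation label are all illustrative assumptions, not the cited systems' design:

```python
# Minimal sketch (not the cited papers' code) of augmenting a KG with
# extra *edges* between existing entities, rather than new entity nodes.

def augment_with_edges(triples, text_similarity, threshold=0.8):
    """Add a 'related_to' edge between every entity pair whose
    external-text similarity reaches the threshold."""
    entities = {h for h, _, _ in triples} | {t for _, _, t in triples}
    augmented = set(triples)
    for e1 in sorted(entities):
        for e2 in sorted(entities):
            if e1 < e2 and text_similarity(e1, e2) >= threshold:
                augmented.add((e1, "related_to", e2))
    return augmented

kg = {("Paris", "capital_of", "France"), ("Lyon", "located_in", "France")}
sim = lambda a, b: 0.9 if {a, b} == {"Paris", "Lyon"} else 0.1
augmented = augment_with_edges(kg, sim)  # adds ("Lyon", "related_to", "Paris")
```

No new entities are introduced: the entity set before and after augmentation is identical, which is the distinction the snippet draws.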
“…There are comprehensive survey articles on building KGs from relational databases [35], semi-structured data [33], and unstructured text [1,8,11,28,34]. Rezayi et al. [31] propose an approach to augment a KG with key phrases generated from textual content of entities. In our work, we augment our KG with semantically rich triples generated from textual content of each entity.…”
Section: Related Work
confidence: 99%
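The snippet above describes generating triples from an entity's textual content and merging them into the KG. A minimal sketch of that step, assuming toy regex patterns — real pipelines use OpenIE-style extractors, and the patterns and relation names here are made up for illustration:

```python
# Minimal sketch (illustrative assumption, not the cited system's
# extractor): generate triples from an entity description with toy
# patterns, then merge them into the KG triple set.
import re

def triples_from_text(entity, text):
    """Return (entity, relation, object) triples from simple patterns."""
    triples = []
    m = re.search(r"\bis an? (\w+)", text)
    if m:
        triples.append((entity, "instance_of", m.group(1)))
    m = re.search(r"\blocated in (\w+)", text)
    if m:
        triples.append((entity, "located_in", m.group(1)))
    return triples

kg = {("Paris", "capital_of", "France")}
kg |= set(triples_from_text("Paris", "Paris is a city located in France."))
# kg now also holds ("Paris", "instance_of", "city")
# and ("Paris", "located_in", "France")
```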
“…Models that integrate text information include DKRL [14] and Joint [32], which use a continuous bag of words and a CNN for text encoding; STKRL [33], which uses an RNN to obtain text sequence information; EDGE [34] and AATE [35], which are based on the bidirectional long short-term memory network; and TA-ConvKB [36], which uses a bidirectional long short-term memory network with attention to encode the text. The text- and structure-based representations are generally combined when calculating the score functions.…”
Section: Related Work
confidence: 99%
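The snippet's last sentence — combining text- and structure-based representations in the score function — can be sketched with a TransE-style energy. This is illustrative only, not DKRL's exact formulation; the dimension, toy word vectors, and additive combination are assumptions:

```python
# Minimal sketch (not DKRL's exact model): each entity combines a
# structure embedding with a text embedding built as a continuous bag
# of words (mean of word vectors); the triple is scored TransE-style.
import numpy as np

rng = np.random.default_rng(0)
dim = 8
word_vecs = {w: rng.normal(size=dim) for w in ("city", "capital", "french")}

def text_embedding(words):
    """Continuous bag of words: average the known word vectors."""
    vecs = [word_vecs[w] for w in words if w in word_vecs]
    return np.mean(vecs, axis=0) if vecs else np.zeros(dim)

def score(h_struct, h_words, r, t_struct, t_words):
    """TransE-style energy, lower is better:
    || (h_struct + h_text) + r - (t_struct + t_text) ||_1"""
    h = h_struct + text_embedding(h_words)
    t = t_struct + text_embedding(t_words)
    return float(np.abs(h + r - t).sum())

# A triple whose combined head plus relation equals the combined tail
# scores zero (a perfect translation).
zero = np.zeros(dim)
s = score(zero, ["city"], zero, zero, ["city"])
```

The additive combination is one common design choice; gated or concatenation-based fusions appear in the literature as well.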