2018
DOI: 10.1371/journal.pone.0193094

Jointly learning word embeddings using a corpus and a knowledge base

Abstract: Methods for representing the meaning of words in vector spaces purely using the information distributed in text corpora have proved to be very valuable in various text mining and natural language processing (NLP) tasks. However, these methods still disregard the valuable semantic relational structure between words in co-occurring contexts. These beneficial semantic relational structures are contained in manually-created knowledge bases (KBs) such as ontologies and semantic lexicons, where the meanings of words…
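For orientation, joint corpus-and-KB embedding methods of this kind are typically framed as a corpus objective plus a weighted KB term. The following is a sketch of that general form, assuming a GloVe-style corpus loss and a squared-distance KB penalty over a set of related pairs \mathcal{S}; the exact terms and notation in Alsuhaibani et al. (2018) may differ.

```latex
% Sketch of a joint corpus+KB objective (general form, not the paper's
% exact equation): a corpus co-occurrence loss J_C plus a KB term J_S
% that pulls together vectors of word pairs related in the KB.
\[
J(\Theta) \;=\;
\underbrace{\sum_{i,j} f(c_{ij})\,\bigl(\mathbf{w}_i^{\top}\tilde{\mathbf{w}}_j
  + b_i + \tilde{b}_j - \log c_{ij}\bigr)^{2}}_{J_C\ \text{(corpus term)}}
\;+\;
\lambda \underbrace{\sum_{(i,j)\in\mathcal{S}}
  \bigl\lVert \mathbf{w}_i - \mathbf{w}_j \bigr\rVert^{2}}_{J_S\ \text{(KB term)}}
\]
```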

Cited by 29 publications (22 citation statements)
References 27 publications
“…Besides, word embeddings are exploited in a great variety of works, such as for the detection of novel and emerging drug terms 19 or for simplification 11. It was also noted that the quality of word embeddings can be improved through the combination of corpora and knowledge bases 24…”
Section: Results
confidence: 99%
“…• Counter-fit (Mrkšić et al., 2016), a method for injecting both antonym and synonym constraints into word embeddings. • JointReps (Alsuhaibani et al., 2018), a joint word representation learning method that simultaneously utilizes the corpus and the KB.…”
Section: Baselines
confidence: 99%
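To make the Counter-fit baseline above concrete, here is a minimal numpy sketch in the spirit of Mrkšić et al. (2016): synonym pairs are pulled together and antonym pairs are pushed apart up to a margin. The function name and the hyperparameters (lr, margin) are illustrative assumptions, not the authors' implementation, which also includes a vector-space preservation term omitted here.

```python
import numpy as np

def counterfit_step(vectors, synonyms, antonyms, lr=0.05, margin=1.0):
    """One update in the spirit of counter-fitting (Mrkšić et al., 2016).

    `vectors` is an (n_words, dim) array; `synonyms` and `antonyms`
    are lists of (i, j) index pairs. Names and step sizes are
    illustrative, not the published implementation.
    """
    update = np.zeros_like(vectors)
    for i, j in synonyms:
        # Synonym attraction: shrink the distance between the pair.
        diff = vectors[i] - vectors[j]
        update[i] -= lr * diff
        update[j] += lr * diff
    for i, j in antonyms:
        # Antonym repulsion: push the pair apart until `margin` is met.
        diff = vectors[i] - vectors[j]
        dist = np.linalg.norm(diff)
        if 0.0 < dist < margin:
            update[i] += lr * diff / dist
            update[j] -= lr * diff / dist
    return vectors + update
```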
“…Incorporating relevant signals from semantic knowledge sources such as WordNet (Miller, 1995), FrameNet (Baker et al., 1998), and the Paraphrase Database (PPDB) (Pavlick et al., 2015) has been shown to improve the quality of word embeddings. Recent works utilize these by incorporating them in a neural language modeling objective function (Yu and Dredze, 2014; Alsuhaibani et al., 2018), or as a post-processing step (Faruqui et al., 2014; Mrkšić et al., 2016). Although existing approaches improve the quality of word embeddings, they require explicit modification for handling different types of semantic information.…”
Section: Introduction
confidence: 99%
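The post-processing route cited above (Faruqui et al., 2014) admits a compact sketch: each vector is iteratively moved toward the average of its lexicon neighbours while staying anchored to its original position. The parameter names (alpha, beta) and the lexicon format below are illustrative assumptions.

```python
import numpy as np

def retrofit(embeddings, lexicon, iters=10, alpha=1.0, beta=1.0):
    """Retrofitting-style post-processing (after Faruqui et al., 2014).

    `embeddings` is an (n_words, dim) array; `lexicon` maps a word
    index to the indices of its semantic-lexicon neighbours. alpha
    anchors each word to its original vector, beta weights neighbours;
    both names are illustrative.
    """
    new = embeddings.copy()
    for _ in range(iters):
        for i, neighbours in lexicon.items():
            if not neighbours:
                continue  # Words without lexicon links keep their vectors.
            # Closed-form update: weighted mean of the original vector
            # and the current vectors of the lexicon neighbours.
            total = alpha * embeddings[i] + beta * sum(new[j] for j in neighbours)
            new[i] = total / (alpha + beta * len(neighbours))
    return new
```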
“…Osborne et al. (2016) introduced an algorithm for learning word representations that encode prior knowledge in addition to the distributional information originating from raw text corpora. Alsuhaibani et al. (2018) consider a learning process in which word embeddings are learned jointly from a corpus and a knowledge base. The knowledge base is incorporated into the embeddings implicitly, by integrating it into the objective function (i.e., vectors of words that stand in a relation are encouraged to be close).…”
Section: Related Work
confidence: 99%
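As a concrete illustration of the objective-function integration described above, here is a minimal numpy sketch of a KB term that penalises the squared distance between vectors of related words, together with its gradient. The function, the pair format, and the weight `lam` are illustrative assumptions, not the paper's code.

```python
import numpy as np

def kb_regularizer(W, related_pairs, lam=0.1):
    """KB term of a joint objective: lam * sum ||w_i - w_j||^2 over
    word pairs (i, j) that stand in a knowledge-base relation.

    Returns the penalty and its gradient with respect to W; in joint
    training this gradient would be added to the corpus-loss gradient.
    """
    loss = 0.0
    grad = np.zeros_like(W)
    for i, j in related_pairs:
        diff = W[i] - W[j]
        loss += lam * float(diff @ diff)
        grad[i] += 2.0 * lam * diff   # pulls w_i toward w_j
        grad[j] -= 2.0 * lam * diff   # pulls w_j toward w_i
    return loss, grad
```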