RANLP 2017 - Recent Advances in Natural Language Processing Meet Deep Learning 2017
DOI: 10.26615/978-954-452-049-6_087

Towards Lexical Chains for Knowledge-Graph-based Word Embeddings

Abstract: Word vectors with varying dimensionalities and produced by different algorithms have been extensively used in NLP. The corpora that the algorithms are trained on can contain either natural language text (e.g. Wikipedia or newswire articles) or, due to natural data sparseness, artificially generated pseudo-corpora. We exploit lexical-chain-based templates over a knowledge graph to generate pseudo-corpora with controlled linguistic value. These corpora are then used for learning word embeddings. A number of experi…
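The abstract outlines a pipeline from lexical-chain templates over a knowledge graph to pseudo-corpora to trained word embeddings. The paper's actual templates and chain construction are not reproduced here; the following is a minimal illustrative sketch, assuming WordNet hypernym chains as the lexical chains and gensim's Word2Vec as the embedding learner. Function names and parameters are illustrative choices, not the authors' method.

```python
# Minimal sketch: build hypernym-based lexical chains from WordNet and use them
# as a pseudo-corpus for learning word embeddings. Illustrative only; the paper's
# own knowledge-graph templates are more elaborate.
# Requires: nltk (with the 'wordnet' corpus downloaded) and gensim.
from nltk.corpus import wordnet as wn
from gensim.models import Word2Vec

def hypernym_chain(synset, depth=3):
    """Follow hypernym links upward from a seed synset, collecting one lemma per step."""
    chain = [synset.lemmas()[0].name()]
    current = synset
    for _ in range(depth):
        parents = current.hypernyms()
        if not parents:
            break
        current = parents[0]
        chain.append(current.lemmas()[0].name())
    return chain

# Treat each chain as one pseudo-sentence of the generated corpus.
pseudo_corpus = [hypernym_chain(s) for s in wn.all_synsets('n')]
pseudo_corpus = [c for c in pseudo_corpus if len(c) > 1]

# Learn embeddings on the pseudo-corpus (skip-gram, small window).
model = Word2Vec(pseudo_corpus, vector_size=100, window=2, min_count=1, sg=1, epochs=5)
print(model.wv.most_similar('dog', topn=5))
```

Training on such chains places knowledge-graph neighbours in each other's contexts, which is the controlled linguistic signal the pseudo-corpora are meant to carry.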

Cited by 6 publications (4 citation statements)
References 13 publications
“…Concerning the automatic task of identifying the stakeholders, this one is based on graphical forms, but does not take into account pronominal anaphors and, more generally, lexical chains. Presently, this remains a challenge for NLP researchers, but recent advances based on deep learning and word embedding (e.g., Simov et al, 2017) can provide the means to make the identification of stakeholders and the computing of the SC score more accurate.…”
Section: Discussion and Study Limitations (mentioning)
confidence: 99%
“…this remains a challenge for NLP researchers, but recent advances based on deep learning and word embedding (e.g., Simov et al, 2017) can provide the means to make the identification of stakeholders and the computing of the SC score more accurate.…”
Section: Acknowledgment (mentioning)
confidence: 99%
“…Even though the applicability of lexical chains is diverse, we see little work exploring them with recent advances in NLP, more specifically with word embeddings. In (Simov et al, 2017), lexical chains are built using specific patterns found on WordNet Fellbaum (1998) and used for learning word embeddings. Their resulting vectors, as ours, are tested in the document similarity task.…”
Section: Related Work (mentioning)
confidence: 99%
“…We expect the proposed algorithms to produce a robust semantic representation through the use of lexical chains. Furthermore, we build our lexical chains using several synset objects 4 in the lexical database in addition to hypernyms and hyponyms, which are commonly used in the literature (Gonzales et al, 2017;Mascarell, 2017;Simov et al, 2017) (detailed in Section 3). The main idea is to bring the semantic relations of lexical chains to traditional word embeddings techniques, leveraging their vector representation, and improving the overall result in the document classification task.…”
Section: Related Work (mentioning)
confidence: 99%
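The last citation statement describes chains built from several synset relations beyond hypernyms and hyponyms; the details are in that paper's Section 3 and are not shown here. As a hedged sketch of the general idea, one way to gather multiple WordNet relation types per synset looks like this (the particular relation set is an assumption, not the cited authors' choice):

```python
# Illustrative sketch: collect lemmas reachable through several WordNet relations,
# not only hypernyms/hyponyms, as candidate members of a lexical chain.
from nltk.corpus import wordnet as wn

def related_lemmas(synset):
    """Collect lemma names reachable through several synset relations."""
    neighbours = (synset.hypernyms() + synset.hyponyms()
                  + synset.part_meronyms() + synset.part_holonyms()
                  + synset.also_sees())
    return sorted({lemma.name() for s in neighbours for lemma in s.lemmas()})

print(related_lemmas(wn.synset('car.n.01'))[:10])
```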