Proceedings of the 19th SIGBioMed Workshop on Biomedical Language Processing 2020
DOI: 10.18653/v1/2020.bionlp-1.4

Improving Biomedical Analogical Retrieval with Embedding of Structural Dependencies

Abstract: Inferring the nature of the relationships between biomedical entities from text is an important problem due to the difficulty of maintaining human-curated knowledge bases in rapidly evolving fields. Neural word embeddings have earned attention for an apparent ability to encode relational information. However, word embedding models that disregard syntax during training are limited in their ability to encode the structural relationships fundamental to cognitive theories of analogy. In this paper, we demonstrate …
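For context on the vector-offset baseline the abstract alludes to, the following is a minimal sketch of analogy completion over pre-trained word embeddings using gensim. The model file name and vocabulary terms are hypothetical placeholders, not artifacts from the paper.

```python
# Minimal sketch of analogy completion with the vector-offset method
# (a : b :: c : ?), the kind of baseline contrasted with syntax-aware
# embeddings. Model path and terms are illustrative only.
from gensim.models import KeyedVectors

# Load pre-trained biomedical word vectors (hypothetical file name).
kv = KeyedVectors.load_word2vec_format("biomedical_vectors.bin", binary=True)

# Solve "drug_a : target_a :: drug_b : ?" by adding the offset
# (target_a - drug_a) to drug_b and ranking candidates by cosine similarity.
candidates = kv.most_similar(positive=["target_a", "drug_b"],
                             negative=["drug_a"], topn=5)
for term, score in candidates:
    print(f"{term}\t{score:.3f}")
```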

Cited by 6 publications (7 citation statements)
References 36 publications
“…Vector cosine similarity in an unsupervised word embedding enabled the prediction of applications for materials years before their publication in the materials science literature ( 49 ). Several supervised analogy learning methods based on word embeddings have been successfully applied in a variety of natural language processing tasks ( 22 , 50 , 51 ). Our algorithm uses this approach to leverage information about cancer and kinases latent in the published literature.…”
Section: Discussion
“…Vector cosine similarity was taken recently to enable prediction of materials for applications years before their publication in the materials science literature by unsupervised word embedding (39). Several supervised analogy learning methods based on word embeddings have been successfully applied in a variety of natural language processing tasks (13,40,41). Our algorithm uses this approach to leverage information about cancer and kinases latent in the published literature.…”
Section: Discussion
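The statements above describe ranking candidate entities by vector cosine similarity in an unsupervised embedding space. A minimal sketch of that ranking step, using randomly generated stand-in vectors rather than trained biomedical embeddings, might look like this:

```python
# Rank candidate entities by cosine similarity to a query embedding,
# the unsupervised prediction strategy the cited work describes.
# Vectors here are random stand-ins, not trained embeddings.
import numpy as np

rng = np.random.default_rng(0)
query_vec = rng.normal(size=200)                       # e.g. a disease term
candidates = {f"candidate_{i}": rng.normal(size=200) for i in range(5)}

def cosine(u, v):
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

ranked = sorted(candidates.items(),
                key=lambda item: cosine(query_vec, item[1]), reverse=True)
for name, vec in ranked:
    print(name, round(cosine(query_vec, vec), 3))
```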
“…These analogies are also known to be solvable by addition and subtraction of the "neural" word embeddings of the corresponding concepts [274,318]. Following a similar approach, the authors in [316] proposed to improve the results by training shallow neural networks using a dependency path of relations between terms in sentences.…”
Section: Holistic Transformations
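The last statement refers to training shallow networks on dependency paths of relations between terms in sentences. As a rough illustration of that structural input (not the paper's actual pipeline), one could extract the dependency path between two entity mentions with spaCy; the sentence and entity choices below are invented for illustration, and the small English model is assumed to be installed.

```python
# Sketch: extract the dependency path between two entity mentions,
# the kind of structural signal fed to a shallow network in the
# dependency-based approach the reviewers describe.
import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("Imatinib inhibits the BCR-ABL kinase in leukemia cells.")

def path_to_root(token):
    chain = [token]
    while chain[-1].head is not chain[-1]:
        chain.append(chain[-1].head)
    return chain

def dependency_path(t1, t2):
    """Dependency relations from t1 up to the lowest common ancestor
    and back down to t2."""
    up1, up2 = path_to_root(t1), path_to_root(t2)
    pos2 = {tok.i: p for p, tok in enumerate(up2)}
    p1, lca = next((p, tok) for p, tok in enumerate(up1) if tok.i in pos2)
    ups = [f"{tok.dep_}/up" for tok in up1[:p1]]
    downs = [f"{tok.dep_}/down" for tok in reversed(up2[:pos2[lca.i]])]
    return ups + [lca.lemma_] + downs

drug = doc[0]                                    # "Imatinib"
kinase = [t for t in doc if t.text == "kinase"][0]
print(dependency_path(drug, kinase))  # e.g. ['nsubj/up', 'inhibit', 'dobj/down']
```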