2021
DOI: 10.20944/preprints202110.0184.v1
Preprint

Self-Attention Based Models for the Extraction of Molecular Interactions from Biological Texts

Abstract: For any molecule, network, or process of interest, keeping up with new publications is becoming increasingly difficult. For many cellular processes, the number of molecules and interactions that must be considered can be very large. Automated mining of publications can support the curation of large-scale molecular interaction maps and databases. Text mining and Natural Language Processing (NLP)-based techniques are finding applications in mining the biological literature, handling problems such as Named Ent…

Cited by 5 publications
(1 citation statement)
References 5 publications
“…In order to effectively extract the crucial information in the image, the results proved that this method is feasible. Srivastava et al introduced the attention mechanism to propose a special architecture based on neural networks, which was applied in biology [13]. Wang et al proposed a neural network model based on the attention mechanism, allowing important words to obtain higher weights, and then optimizing the objective function to significantly improve the effect [14].…”
Section: Introduction (classification: mentioning)
confidence: 99%
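The quoted passage describes attention models that let important words receive higher weights. A minimal sketch of how scaled dot-product self-attention produces such per-token weights is shown below, using NumPy; the toy sentence length, embedding dimension, and projection matrices are illustrative assumptions, not details taken from the cited papers.

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Return attended token representations and the attention weight matrix.

    Each row of the weight matrix is a softmax over all tokens and sums to 1;
    tokens judged more relevant to a given position receive higher weight.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])           # pairwise relevance scores
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # row-wise softmax
    return weights @ V, weights

# Hypothetical example: 4 tokens with 8-dimensional embeddings.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
out, w = self_attention(X, Wq, Wk, Wv)
print(out.shape, w.shape)                             # (4, 8) (4, 4)
```

In a trained model the projection matrices would be learned by optimizing an objective function, which is the step the citing passage refers to when it mentions improving the effect; here they are random purely to show the shapes and the row-normalized weights.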