2020
DOI: 10.1109/access.2020.2985685

Interactive Self-Attentive Siamese Network for Biomedical Sentence Similarity

Abstract: The determination of semantic similarity between sentences is an important component in natural language processing (NLP) tasks such as text retrieval and text summarization. Many approaches have been proposed for estimating sentence similarity, and Siamese neural networks (SNN) provide a better approach. However, the sentence semantic representation, generated by sharing weights in the SNN without any attention mechanism, ignores the different contributions of different words to the overall sentence semantics…
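
The point the abstract makes, namely a weight-sharing Siamese encoder whose pooling step weights each word by a learned attention score instead of treating all words equally, can be sketched as follows. This is a minimal illustration, not the authors' published architecture: the module names, the BiLSTM encoder, and all dimensions are assumptions made for the example.

import torch
import torch.nn as nn
import torch.nn.functional as F

class SelfAttentivePooling(nn.Module):
    """Collapse a sequence of hidden states into one vector using learned per-word weights."""
    def __init__(self, hidden_dim):
        super().__init__()
        self.scorer = nn.Linear(hidden_dim, 1)

    def forward(self, states):
        # states: (batch, seq_len, hidden_dim)
        weights = F.softmax(self.scorer(states), dim=1)   # per-word importance
        return (weights * states).sum(dim=1)              # attention-weighted sentence vector

class SiameseSentenceEncoder(nn.Module):
    def __init__(self, vocab_size, emb_dim=100, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.LSTM(emb_dim, hidden_dim, batch_first=True, bidirectional=True)
        self.pool = SelfAttentivePooling(2 * hidden_dim)

    def encode(self, tokens):
        states, _ = self.encoder(self.embed(tokens))
        return self.pool(states)

    def forward(self, sent_a, sent_b):
        # Shared weights: the same encoder and pooling layer process both sentences.
        return F.cosine_similarity(self.encode(sent_a), self.encode(sent_b))

# Toy usage: two batches of token ids (vocabulary 1000, batch 2, length 7).
model = SiameseSentenceEncoder(vocab_size=1000)
a = torch.randint(0, 1000, (2, 7))
b = torch.randint(0, 1000, (2, 7))
print(model(a, b))  # two similarity scores in [-1, 1]

Replacing the pooling layer with a plain mean over the hidden states recovers the unweighted representation the abstract argues against.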

Cited by 15 publications (4 citation statements) | References 31 publications
“…Considering the above problem, to make full use of the contribution of each feature to the case solution, we allow the SA mechanism [21] to redistribute the resources which are originally evenly distributed according to the importance of the object and capture the internal correlation of the data or features. It can both represent the contribution of the features in the case and represent the correlation between the features [23]. Therefore, the SA mechanism is introduced into the SNN metric learning process, and a SASNN weighted similarity depth measurement method is proposed.…”
Section: SASNN Weighted Similarity Depth Measurement Method (mentioning, confidence: 99%)
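
As a rough illustration of the mechanism the statement above describes, redistributing initially uniform weights over features via self-attention and then scoring similarity with those weights, the sketch below uses scaled dot-product self-attention over feature encodings. It is an assumption-laden toy, not the cited SASNN implementation; the function names and the exponential per-feature agreement term are invented for the example.

import torch
import torch.nn.functional as F

def self_attention_weights(features):
    # features: (n_features, dim) -- one encoded vector per case feature
    d = features.size(-1)
    scores = features @ features.t() / d ** 0.5    # pairwise feature correlation
    attn = F.softmax(scores, dim=-1)               # row-normalized attention map
    return attn.mean(dim=0)                        # overall importance of each feature

def weighted_similarity(case_a, case_b):
    # Replace the implicit uniform weighting with attention-derived weights.
    w = self_attention_weights(case_a)
    agreement = torch.exp(-torch.norm(case_a - case_b, dim=-1))  # per-feature match in (0, 1]
    return (w * agreement).sum()

a = torch.randn(5, 16)   # 5 case features, each encoded as a 16-dim vector
b = torch.randn(5, 16)
print(weighted_similarity(a, b))
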
“…attention [17] into Siamese neural network. • ISA-SNN: our previous work [18], fusing interactive attention and self-attention to implement semantic interaction between sentences and integrating it into Siamese neural network. • SA-SNN: introducing self-attention into our SNN.…”
Section: A. Baselines and Our Models (mentioning, confidence: 99%)
“…Even though these methods with interactive/cross attention mechanism show the effectiveness on non-biomedical datasets, their performance on biomedical corpora is unsatisfactory owing to long-range dependencies [15] and complex syntactical structure [16] in biomedical corpora. Inspired by self-attention [15] and interactive attention [17], interactive self-attention has been proposed in our previous work [18]. Other methods are proposed for this field [19][20][21][22][23][24].…”
Section: Introduction (mentioning, confidence: 99%)
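
The combination described in the Introduction excerpt, fusing self-attention (each sentence attending to its own tokens, which helps with long-range dependencies) with interactive attention (each sentence attending to the other sentence), can be sketched as below. The exact formulation in the cited work [18] may differ; the concatenation-based fusion and all names here are assumptions for illustration.

import torch
import torch.nn.functional as F

def attend(query, key_value):
    # Scaled dot-product attention: query (len_q, d), key_value (len_kv, d) -> (len_q, d)
    d = query.size(-1)
    scores = query @ key_value.t() / d ** 0.5
    return F.softmax(scores, dim=-1) @ key_value

def interactive_self_attention(sent_a, sent_b):
    # Self-attention: each sentence attends to its own tokens.
    self_a, self_b = attend(sent_a, sent_a), attend(sent_b, sent_b)
    # Interactive attention: each sentence attends to the other sentence's tokens.
    cross_a, cross_b = attend(sent_a, sent_b), attend(sent_b, sent_a)
    # Fuse the two views per sentence (simple concatenation; other fusions are possible).
    return torch.cat([self_a, cross_a], dim=-1), torch.cat([self_b, cross_b], dim=-1)

a = torch.randn(9, 64)    # sentence A: 9 token vectors of width 64
b = torch.randn(12, 64)   # sentence B: 12 token vectors of width 64
ra, rb = interactive_self_attention(a, b)
print(ra.shape, rb.shape)  # torch.Size([9, 128]) torch.Size([12, 128])
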
“…A text expansion and deep model-based approach for service recommendation is proposed, which can bridge the vocabulary gap between services and user queries with the collective semantic similarity of sentences and descriptions [40]. An interactive self-attentive Siamese neural network is used to verify the effectiveness of the interactive self-attention [41]. With the development of capsule networks, the text representation preprocessed by neural networks can achieve state-of-the-art results as the input of classification and machine translation.…”
Section: B. Sentence Similarity (mentioning, confidence: 99%)