2021 4th International Conference on Algorithms, Computing and Artificial Intelligence
DOI: 10.1145/3508546.3508632
Attention-Based Multi-level Network for Text Matching with Feature Fusion

Cited by 5 publications (1 citation statement)
References 17 publications
“…Niu et al. combined a Siamese network structure with deep learning techniques such as BiLSTM to effectively extract deep features of the text [4]. Yang and Zhang applied the attention mechanism to the Chinese semantic matching task to improve the representation of text by feature vectors [5]. The feature vector of the text extracted by the Chinese preprocessing model related to BERT [6] has stronger semantic expression ability, which effectively improves the performance of the Chinese semantic matching model, but lacks semantic information interaction between different texts.…”
Section: Introduction (mentioning)
confidence: 99%
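The Siamese matching idea referenced in the statement above — a shared encoder producing a vector for each text, then a similarity score — can be sketched as a toy illustration. This is a minimal stand-in, not the cited model: a bag-of-embeddings encoder replaces the BiLSTM, and the vocabulary, dimensions, and sentences are invented for the example.

```python
import numpy as np

# Toy Siamese text matcher: both texts pass through the SAME encoder
# (weight sharing is what makes the architecture "Siamese").
rng = np.random.default_rng(0)
VOCAB = {"how": 0, "old": 1, "are": 2, "you": 3,
         "what": 4, "is": 5, "your": 6, "age": 7}
EMB = rng.normal(size=(len(VOCAB), 16))  # shared embedding table

def encode(tokens):
    """Shared encoder: mean of token embeddings (a BiLSTM would go here)."""
    vecs = EMB[[VOCAB[t] for t in tokens]]
    return vecs.mean(axis=0)

def match_score(a, b):
    """Cosine similarity between the two sentence vectors."""
    va, vb = encode(a), encode(b)
    return float(va @ vb / (np.linalg.norm(va) * np.linalg.norm(vb)))

score = match_score(["how", "old", "are", "you"],
                    ["what", "is", "your", "age"])
```

In the cited work, the shared encoder is a BiLSTM (optionally with attention) rather than an embedding average, and the score head is trained on labeled sentence pairs; the weight-sharing and similarity structure are the same.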