2019
DOI: 10.1155/2019/9543490
A Stacked BiLSTM Neural Network Based on Coattention Mechanism for Question Answering

Abstract: Deep learning is a crucial technology in intelligent question answering research. Extensive studies on question answering have been conducted using deep learning methods. The challenge is that question answering requires not only an effective semantic-understanding model to generate textual representations but also simultaneous consideration of the semantic interaction between questions and answers. In this paper, we propose a stacked Bidirectional Long Short-Term Memory (BiLSTM) neural…
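The abstract pairs a stacked BiLSTM encoder with a coattention mechanism over questions and answers. As a rough illustration of the general coattention idea only (not the paper's exact formulation; all shapes and names below are assumptions), an affinity matrix between the two encoded sequences can be normalized in each direction to produce question-aware and answer-aware representations:

```python
# Sketch of a simple coattention step between question and answer
# encodings (illustrative; not the paper's exact formulation).
import torch
import torch.nn.functional as F

def coattention(Q, A):
    """Q: (batch, m, d) question states; A: (batch, n, d) answer states."""
    # Affinity matrix: similarity of every question/answer position pair.
    L = torch.bmm(Q, A.transpose(1, 2))           # (batch, m, n)
    attn_q = F.softmax(L, dim=1)                  # normalize over question positions
    attn_a = F.softmax(L, dim=2)                  # normalize over answer positions
    Q_ctx = torch.bmm(attn_q.transpose(1, 2), Q)  # (batch, n, d) question-aware answer context
    A_ctx = torch.bmm(attn_a, A)                  # (batch, m, d) answer-aware question context
    return Q_ctx, A_ctx

Q = torch.randn(2, 12, 64)        # toy question encodings
A = torch.randn(2, 30, 64)        # toy answer encodings
Q_ctx, A_ctx = coattention(Q, A)
print(Q_ctx.shape, A_ctx.shape)   # torch.Size([2, 30, 64]) torch.Size([2, 12, 64])
```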

Cited by 41 publications (20 citation statements)
References 32 publications
“…The Bidirectional LSTM is an extension of the traditional LSTM and is able to utilize related information from both the previous and future context. In the next section, we exploit the BiLSTM model for depression classification on a given input text (Cai et al 2019).…”
Section: Proposed Methodology
confidence: 99%
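To make the statement above concrete, here is a minimal sketch of a BiLSTM text classifier of the kind the citing paper describes, assuming PyTorch; the class name, dimensions, and two-class output are illustrative and not taken from either paper:

```python
# Minimal BiLSTM text classifier sketch (PyTorch assumed; all names and
# hyperparameters are illustrative).
import torch
import torch.nn as nn

class BiLSTMClassifier(nn.Module):
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=64, num_classes=2):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        # bidirectional=True lets the model use both the previous (left)
        # and future (right) context of each token, as described above.
        self.bilstm = nn.LSTM(embed_dim, hidden_dim,
                              batch_first=True, bidirectional=True)
        self.fc = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, token_ids):
        embedded = self.embedding(token_ids)           # (batch, seq, embed)
        _, (h_n, _) = self.bilstm(embedded)            # h_n: (2, batch, hidden)
        # Concatenate the final forward and backward hidden states.
        features = torch.cat([h_n[0], h_n[1]], dim=1)  # (batch, 2*hidden)
        return self.fc(features)

model = BiLSTMClassifier(vocab_size=10_000)
logits = model(torch.randint(0, 10_000, (4, 20)))      # toy batch of 4 sequences
```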
“…This study used different conventional classifiers for CVP, including Adaboost [25], support vector machine (SVM) [26], logistic regression (LR) [27], random forest (RF) [28,29], Gaussian naïve Bayes (GNB) [30], decision tree C4.5 [31], and classification and regression trees (CART) [32]. For NVP, since the input is a variable-length sequence, we used LSTM [33], bidirectional LSTM (biLSTM) [34], Stack-LSTM [35], Stack-biLSTM [36], and Attention-LSTM [37].…”
Section: Common Classifiers Used in This Study
confidence: 99%
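For reference, most of the conventional classifiers listed above are available behind scikit-learn's common fit/predict interface. A minimal sketch on synthetic data follows; note that scikit-learn's DecisionTreeClassifier implements CART and only approximates C4.5, so this is an illustration rather than a reproduction of the cited setup:

```python
# Fitting several of the listed conventional classifiers on synthetic
# data via scikit-learn's common interface (illustrative only).
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=20, random_state=0)

classifiers = {
    "AdaBoost": AdaBoostClassifier(),
    "SVM": SVC(),
    "LogisticRegression": LogisticRegression(max_iter=1000),
    "RandomForest": RandomForestClassifier(),
    "GaussianNB": GaussianNB(),
    "DecisionTree (CART)": DecisionTreeClassifier(),
}

for name, clf in classifiers.items():
    scores = cross_val_score(clf, X, y, cv=5)  # 5-fold cross-validation
    print(f"{name}: mean accuracy {scores.mean():.3f}")
```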
“…where W and b represent the weight vectors and bias terms corresponding to the three gates of the LSTM, respectively. In relevant experiments [43]-[46], reasonable stacking of the networks can effectively improve a model's classification and regression ability. Therefore, this paper constructs a stacked BiLSTM network based on the LSTM cell to fully achieve semantic parsing of a single statement.…”
Section: Semantic Analysis With the Fusion Structure of CNN and BiLSTM
confidence: 99%
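A stacked BiLSTM of the kind this citing paper describes can be sketched directly, assuming PyTorch: setting num_layers above 1 stacks recurrent layers so that each layer consumes the full output sequence of the one below it. All sizes here are illustrative:

```python
# Sketch of a stacked BiLSTM encoder (PyTorch assumed; sizes illustrative).
import torch
import torch.nn as nn

stacked_bilstm = nn.LSTM(
    input_size=128,    # dimension of the input word embeddings
    hidden_size=64,    # per-direction hidden size
    num_layers=3,      # depth of the stack
    batch_first=True,
    bidirectional=True,
    dropout=0.2,       # applied between stacked layers
)

x = torch.randn(4, 20, 128)             # (batch, seq_len, embed_dim)
outputs, (h_n, c_n) = stacked_bilstm(x)
print(outputs.shape)                    # torch.Size([4, 20, 128]): 2 directions * hidden_size
print(h_n.shape)                        # torch.Size([6, 4, 64]): num_layers * 2 directions
```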