2018
DOI: 10.1007/978-3-030-00794-2_29
Semantic Question Matching in Data Constrained Environment

Cited by 3 publications (2 citation statements)
References 13 publications
“…In recent times, there have been several studies on deep learning-based reading comprehension/QA (Hermann et al., 2015; Cui et al., 2017; Shen et al., 2017; Wang et al., 2017b; Gupta et al., 2018c; Wang and Jiang, 2016; Berant et al., 2014; Maitra et al., 2018; Cheng et al., 2016; Trischler et al., 2016). To the best of our knowledge, this is the very first attempt to automatically generate the code-mixed questions (i.e.…”
Section: Related Work
confidence: 98%
“…Specifically, we use two different encoders, namely a Convolutional Neural Network (CNN) and an attention-based Recurrent Neural Network (RNN). The effectiveness of CNN- and RNN-based encoders has been proven in other NLP tasks (Gupta et al., 2018d; Maitra et al., 2018; Gupta et al., 2018a,c). The CNN encoder uses multiple fully-connected layers over the convolution layer, while the RNN encoder uses an LSTM layer with attention (Raffel and Ellis, 2015) followed by multiple fully-connected layers.…”
Section: Proposed Deep Learning Model
confidence: 99%
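The attention cited in this excerpt (Raffel and Ellis, 2015) is feed-forward attention: each LSTM hidden state is scored by a small learned function, the scores are softmax-normalized over time, and the hidden states are averaged with those weights to produce a fixed-size sequence summary. A minimal numpy sketch of that pooling step, with random values standing in for the learned LSTM states and parameters (the names `attention_pool`, `w`, and `b` are illustrative, not taken from the paper):

```python
import numpy as np

def attention_pool(hidden_states, w, b):
    """Feed-forward attention pooling (Raffel & Ellis, 2015 style).

    hidden_states: (T, d) array of per-timestep RNN hidden states.
    w: (d,) learned scoring vector; b: scalar bias.
    Returns the (d,) context vector and the (T,) attention weights.
    """
    # Scalar score per time step: e_t = tanh(h_t . w + b)
    scores = np.tanh(hidden_states @ w + b)          # shape (T,)
    # Softmax over time steps (subtract max for numerical stability)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                         # shape (T,), sums to 1
    # Context vector: attention-weighted average of hidden states
    context = weights @ hidden_states                # shape (d,)
    return context, weights

# Demo with random stand-ins for LSTM outputs and learned parameters
rng = np.random.default_rng(0)
H = rng.normal(size=(5, 8))      # 5 time steps, hidden size 8
w = rng.normal(size=8)
context, alpha = attention_pool(H, w, 0.0)
```

In the model described above, the resulting fixed-size `context` vector would then be passed through the multiple fully-connected layers mentioned in the excerpt.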