2019
DOI: 10.1017/s1351324919000202
Sentence embeddings in NLI with iterative refinement encoders

Abstract: Sentence-level representations are necessary for various NLP tasks. Recurrent neural networks have proven to be very effective in learning distributed representations and can be trained efficiently on natural language inference tasks. We build on top of one such model and propose a hierarchy of BiLSTM and max pooling layers that implements an iterative refinement strategy and yields state-of-the-art results on the SciTail dataset as well as strong results for SNLI and MultiNLI. We can show that the sentence em…
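The abstract describes a hierarchy of BiLSTM layers with max pooling, where each layer is initialised from the previous layer's final state (the iterative refinement step) and the sentence embedding is the concatenation of the max-pooled outputs. The structure can be sketched as follows; note this is a toy stand-in, with `toy_bilstm` replacing a real learned BiLSTM layer, and `hbmp_encode` and the input vectors are illustrative names and values, not the authors' implementation.

```python
def max_pool(hidden_states):
    """Element-wise max over time steps: one fixed-size vector per layer."""
    dim = len(hidden_states[0])
    return [max(h[i] for h in hidden_states) for i in range(dim)]

def toy_bilstm(inputs, init_state):
    # Stand-in for a BiLSTM layer: a simple running average so the
    # structure (per-token outputs + final state hand-off) is visible.
    # A real implementation would use learned recurrent weights.
    state = init_state
    outputs = []
    for x in inputs:
        state = [0.5 * s + 0.5 * xi for s, xi in zip(state, x)]
        outputs.append(state)
    return outputs, state  # per-token outputs and final state

def hbmp_encode(token_vectors, num_layers=3):
    """Hierarchy of (toy) BiLSTM layers with max pooling.

    Each layer is initialised with the previous layer's final state
    (iterative refinement); the sentence embedding concatenates the
    max-pooled outputs of all layers.
    """
    dim = len(token_vectors[0])
    state = [0.0] * dim
    pooled = []
    for _ in range(num_layers):
        outputs, state = toy_bilstm(token_vectors, state)
        pooled.extend(max_pool(outputs))
    return pooled

# Three 2-dimensional token vectors -> one 6-dimensional sentence vector.
sentence = [[1.0, 0.0], [0.0, 2.0], [0.5, 0.5]]
embedding = hbmp_encode(sentence)
print(len(embedding))  # 3 layers x 2 dims = 6
```

The design point being illustrated is the state hand-off between layers: later layers re-read the sentence conditioned on what earlier layers extracted, which is what makes the refinement "iterative" rather than a plain stack.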

Cited by 37 publications (23 citation statements)
References 21 publications
“…where T_NLI can be any of the existing state-of-the-art text-based NLI models (Wang and Jiang 2016a; Talman, Yli-Jyrä, and Tiedemann 2019; Liu et al. 2019a).…”
Section: A Standard Text-based Model
confidence: 99%
“…The premise–hypothesis pairs were created from about 3,500 multiple-choice questions, with little or no annotator involvement in labelling the pairs. They investigated several techniques, ranging from traditional methods to the state of the art: word embedding methods including word2vec [10], fastText [11], ELMo [12], BERT [13], and LASER [14]; modelling approaches dedicated to Natural Language Inference such as DecompAtt [15], ESIM [16], HBMP [17], and ULMFiT [18]; and cross-lingual transfer approaches. Another study, by Mohammad Mosharaf Hossain et al. [19], dealt with handling negation in English sentences, as negations are ubiquitous in everyday English.…”
Section: Related Work
confidence: 99%
“…There are many methods for text or sentence representation. These methods generally first map words to word embeddings through a projection layer and then combine those embeddings with different architectures, such as Recurrent Neural Networks (RNNs) [32]–[34] or Convolutional Neural Networks (CNNs) [35], [36]. In this paper, to better encode the semantics of entity descriptions, we focus on the sentence embedding approach.…”
Section: B. Hierarchical BiLSTM Max Pooling Encoder
confidence: 99%
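The pipeline that citation describes (project words to embeddings, then combine them into one sentence vector) can be sketched minimally as follows; the lookup table `EMBED` and its values are made-up toy numbers standing in for a learned projection layer, and max pooling is used as the combiner because it is the operation the cited HBMP encoder relies on.

```python
# Hypothetical projection layer: word -> fixed-size vector.
# Real systems learn these values; here they are arbitrary.
EMBED = {
    "the": [0.1, 0.2],
    "cat": [0.9, 0.1],
    "sat": [0.3, 0.8],
}

def embed_sentence(tokens):
    """Look up each token's embedding, then max-pool element-wise."""
    vectors = [EMBED[t] for t in tokens]
    dim = len(vectors[0])
    # Element-wise max over tokens yields a fixed-size sentence vector
    # regardless of sentence length.
    return [max(v[i] for v in vectors) for i in range(dim)]

print(embed_sentence(["the", "cat", "sat"]))  # [0.9, 0.8]
```

Max pooling keeps, per dimension, the strongest activation across the sentence, which is why the pooled vector has the same size for sentences of any length.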