Proceedings of the 2nd Workshop on Evaluating Vector Space Representations for NLP 2017
DOI: 10.18653/v1/w17-5309
Character-level Intra Attention Network for Natural Language Inference

Abstract: Natural language inference (NLI) is a central problem in language understanding, and end-to-end artificial neural networks have recently reached state-of-the-art performance on it. In this paper, we propose the Character-level Intra Attention Network (CIAN) for the NLI task. In our model, we use a character-level convolutional network to replace the standard word embedding layer, and we use intra attention to capture intra-sentence semantics. The proposed CIAN model provides improved results based on a …
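The two components named in the abstract can be illustrated with a minimal NumPy sketch. This is not the authors' implementation; filter sizes, dimensions, and the scaled dot-product form of the attention are assumptions chosen for illustration. It shows (1) a character-level convolution with max-over-time pooling that builds a word vector from character embeddings, replacing a word-embedding lookup, and (2) intra (self-) attention, where each word of a sentence attends to every word of the same sentence:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def char_cnn_word(char_vecs, filters, width=3):
    """Build one word vector from its character embeddings.
    char_vecs: (num_chars, char_dim); filters: (num_filters, width*char_dim).
    Convolves width-sized character windows, applies ReLU, max-pools over time."""
    L, d = char_vecs.shape
    if L < width:  # pad short words so at least one window exists
        char_vecs = np.vstack([char_vecs, np.zeros((width - L, d))])
        L = width
    windows = np.stack([char_vecs[i:i + width].ravel()
                        for i in range(L - width + 1)])
    feats = windows @ filters.T          # (num_windows, num_filters)
    return np.maximum(feats, 0).max(axis=0)  # ReLU + max-over-time pooling

def intra_attention(H):
    """Self-attention within one sentence: rows of A are attention
    distributions of each word over all words of the same sentence."""
    d = H.shape[1]
    scores = H @ H.T / np.sqrt(d)        # (n_words, n_words)
    A = softmax(scores, axis=1)
    return A @ H, A                      # attended representations, weights

rng = np.random.default_rng(0)
# toy "sentence": 4 words of 5 characters each, char_dim=8, 16 conv filters
filters = rng.standard_normal((16, 3 * 8))
words = [char_cnn_word(rng.standard_normal((5, 8)), filters) for _ in range(4)]
H = np.stack(words)                      # (4, 16) word-level representations
attended, A = intra_attention(H)
print(attended.shape, np.allclose(A.sum(axis=1), 1.0))  # (4, 16) True
```

In an NLI encoder, the attended vectors would typically be combined with the original word representations and fed to a sentence encoder (e.g. a BiLSTM) before classification; those downstream details are outside this sketch.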

Cited by 2 publications (1 citation statement)
References 11 publications
“…RepEval 2017 Cha-level Intra-attention BiLSTM encoders (Yang et al., 2017): Experimental results of different models on MultiNLI data. SNLI Mix: use of SNLI training dataset.…”
mentioning
confidence: 99%