Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
DOI: 10.18653/v1/d18-1465

Deep Attentive Sentence Ordering Network

Abstract: In this paper, we propose a novel deep attentive sentence ordering network (referred to as ATTOrderNet) which integrates a self-attention mechanism with LSTMs in the encoding of input sentences. This enables us to capture global dependencies among sentences regardless of their input order and to obtain a reliable representation of the sentence set. With this representation, a pointer network is exploited to generate an ordered sequence. The proposed model is evaluated on the Sentence Ordering and Order Discrimination tasks…
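The abstract describes the architecture only at a high level, so the following is a minimal sketch of how such a model could be wired up in PyTorch. The module names, dimensions, attention settings, and the greedy decoding loop are illustrative assumptions, not the authors' released implementation.

```python
# Sketch of an ATTOrderNet-style model, assuming PyTorch.
# All names and hyperparameters here are illustrative assumptions.
import torch
import torch.nn as nn

class AttentiveOrderNet(nn.Module):
    def __init__(self, vocab_size, emb_dim=128, hidden_dim=256, num_heads=4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Word-level BiLSTM: one vector per input sentence.
        self.word_lstm = nn.LSTM(emb_dim, hidden_dim // 2,
                                 batch_first=True, bidirectional=True)
        # Self-attention over the *set* of sentence vectors; with no
        # positional encoding it is invariant to the sentences' input order.
        self.self_attn = nn.MultiheadAttention(hidden_dim, num_heads,
                                               batch_first=True)
        # Pointer-network decoder: attention scores over the sentence
        # vectors serve as the distribution over the next sentence.
        self.dec_cell = nn.LSTMCell(hidden_dim, hidden_dim)
        self.ptr_query = nn.Linear(hidden_dim, hidden_dim)

    def encode(self, sents):                      # sents: (batch, n_sent, n_word)
        b, n, w = sents.shape
        words = self.embed(sents.view(b * n, w))
        _, (h, _) = self.word_lstm(words)         # h: (2, b*n, hidden/2)
        s = h.transpose(0, 1).reshape(b, n, -1)   # (b, n, hidden)
        s, _ = self.self_attn(s, s, s)            # order-invariant mixing
        return s, s.mean(dim=1)                   # sentence vecs, set summary

    def forward(self, sents):
        s, summary = self.encode(sents)
        b, n, _ = s.shape
        hx, cx = summary, torch.zeros_like(summary)
        inp, logits = summary, []
        for _ in range(n):                        # greedy pointer decoding
            hx, cx = self.dec_cell(inp, (hx, cx))
            score = torch.bmm(s, self.ptr_query(hx).unsqueeze(2)).squeeze(2)
            logits.append(score)                  # (b, n) pointer scores
            inp = s[torch.arange(b), score.argmax(dim=1)]
        return torch.stack(logits, dim=1)         # (b, n, n)
```

A full implementation would mask already-selected sentences during decoding and train with teacher forcing against the gold order; the sketch omits both for brevity.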

Cited by 53 publications (110 citation statements)
References 12 publications
“…Table 3: Performance of predicting the correct head and tail sentences on arXiv and SIND. The results are directly taken from (Cui et al., 2018).…”
Section: Results (mentioning)
confidence: 99%
“…This section reports experimental results on the sentence ordering task, i.e., determining a coherent order for a given set of sentences. The proposed TGCM was compared with state-of-the-art baselines such as the Pairwise Model, Seq2seq (Logeswaran et al., 2018), RNN Decoder (Logeswaran et al., 2018), V-LSTM+PtrNet (Logeswaran et al., 2018), CNN+PtrNet (Gong et al., 2016), LSTM+PtrNet (Gong et al., 2016), and ATTOrderNet (Cui et al., 2018). Here, except for the random model, all of the baselines are based on neural networks, which are typically more competitive than traditional approaches (e.g., those utilizing handcrafted features).…”
Section: Results (mentioning)
confidence: 99%
“…propose training sentence ordering models to differentiate between the original order of a well-written text and a permuted sentence order. Cui et al. (2018) continue in this paradigm, training an encoder-decoder network to read a series of sentences and reorder them for better coherence. Our goal is not to reorder a student's sentences, but to provide more detailed feedback on whether the right structures (e.g., steps) are present in the methodology.…”
Section: Introduction (mentioning)
confidence: 99%
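To make the permutation-based training setup in the last excerpt concrete, here is a small data-construction sketch; the function name, binary labels, and example sentences are assumptions for illustration, not taken from any of the cited papers.

```python
# Illustrative sketch of building order-discrimination training pairs:
# label 1 for the original well-written order, 0 for a shuffled order.
import random

def make_order_discrimination_pairs(paragraph, num_permutations=1, seed=0):
    """Return (sentence_list, label) pairs from one paragraph."""
    rng = random.Random(seed)
    pairs = [(list(paragraph), 1)]               # original coherent order
    if len(paragraph) < 2:                        # nothing to permute
        return pairs
    for _ in range(num_permutations):
        perm = list(paragraph)
        while True:                               # ensure a genuinely new order
            rng.shuffle(perm)
            if perm != list(paragraph):
                break
        pairs.append((list(perm), 0))             # incoherent permutation
    return pairs

sents = ["He woke up late.", "He missed the bus.", "He walked to work."]
for example, label in make_order_discrimination_pairs(sents):
    print(label, example)
```

A discriminator trained on such pairs learns a coherence signal without any manual annotation, since the labels come for free from permuting naturally ordered text.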