2018
DOI: 10.1609/aaai.v32i1.11997
Sentence Ordering and Coherence Modeling using Recurrent Neural Networks

Abstract: Modeling the structure of coherent texts is a key NLP problem. The task of coherently organizing a given set of sentences has been commonly used to build and evaluate models that understand such structure. We propose an end-to-end unsupervised deep learning approach based on the set-to-sequence framework to address this problem. Our model strongly outperforms prior methods in the order discrimination task and a novel task of ordering abstracts from scientific articles. Furthermore, our work shows that useful t…


Cited by 53 publications (34 citation statements)
References 26 publications
“…A series of approaches had been invented to tackle this rearranging problem [89–92]. There the topological sort method was selected [93,94], a standard algorithm for linear ordering of the vertices of a directed graph. Precisely, we have $\mathcal{C}_n$, the set of constraints for this sub-trajectory.…”
Section: Methods
confidence: 99%
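The topological sort named in the excerpt above can be sketched with Kahn's algorithm. This is a generic illustration, not the cited papers' code; the function name and the `(u, v)` edge format are assumptions made for the example:

```python
from collections import deque

def topological_sort(n, edges):
    """Linear ordering of the vertices of a directed acyclic graph.

    n: number of vertices, labeled 0..n-1
    edges: list of (u, v) pairs meaning "u must come before v"
    """
    adj = [[] for _ in range(n)]
    indeg = [0] * n
    for u, v in edges:
        adj[u].append(v)
        indeg[v] += 1
    # Start from all vertices with no incoming constraints.
    queue = deque(i for i in range(n) if indeg[i] == 0)
    order = []
    while queue:
        u = queue.popleft()
        order.append(u)
        for v in adj[u]:
            indeg[v] -= 1
            if indeg[v] == 0:
                queue.append(v)
    if len(order) != n:
        raise ValueError("constraint graph has a cycle; no linear ordering exists")
    return order
```

In the sentence-ordering setting, each vertex would stand for one sentence and each edge for a pairwise "comes before" constraint in the set $\mathcal{C}_n$.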
“…Sentence encoders are trained to extract sentence-level features [49,58,74]. We use SBERT [74], which finetunes a pre-trained language model to allow it to better capture semantic similarity using feature cosine distance.…”
Section: Sampling Image Pairs Using Language
confidence: 99%
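The feature cosine distance that the excerpt above refers to can be shown with a minimal plain-Python sketch. The helper names are illustrative; in practice the vectors would come from an encoder such as SBERT via the `sentence-transformers` library:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def cosine_distance(a, b):
    """Distance in [0, 2]: 0 for identical directions, 1 for orthogonal."""
    return 1.0 - cosine_similarity(a, b)
```

Two semantically similar sentences should map to nearby directions in embedding space, giving a cosine distance close to 0.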
“…Neural language models that generate word and document embeddings have become increasingly popular in recent years. Numerous research efforts have suggested enhancing distributed representations of documents/sentences [Le and Mikolov, 2014, Dai et al., 2015, Kiros et al., 2015, Hill et al., 2016, Pagliardini et al., 2017, Logeswaran and Lee, 2018, Gupta et al., 2019, Cer et al., 2018, Sinoara et al., 2019]. These methods efficiently calculate the semantic similarity of words by capturing the similarity of the respective vector representations.…”
Section: Advanced Language Models
confidence: 99%