2020
DOI: 10.1609/aaai.v34i05.6492

Enhancing Pointer Network for Sentence Ordering with Pairwise Ordering Predictions

Abstract: Dominant sentence ordering models use a pointer network decoder to generate ordering sequences in a left-to-right fashion. However, such a decoder only exploits the noisy left-side encoded context, which is insufficient to ensure correct sentence ordering. To address this deficiency, we propose to enhance the pointer network decoder by using two pairwise ordering prediction modules: The FUTURE module predicts the relative orientations of other unordered sentences with respect to the candidate sentence, and the…
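The abstract describes augmenting each pointer-network decoding step with pairwise ordering predictions over the still-unordered sentences. A minimal sketch of that idea is below; the function names, the additive combination, and the weight `alpha` are illustrative assumptions, not the paper's actual formulation.

```python
# Hypothetical sketch: one pointer-network decoding step whose candidate
# scores are augmented by a FUTURE-style pairwise ordering term.
# All names and the scoring form are illustrative, not the paper's code.
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def decode_step(dec_state, cand_vecs, pairwise_logits, unordered, alpha=0.5):
    """Score each unordered candidate by pointer attention plus the mean
    logit (from a pairwise module) that it precedes the other unordered
    sentences; return the chosen index and the score distribution."""
    scores = []
    for i in unordered:
        ptr = dec_state @ cand_vecs[i]          # pointer attention logit
        others = [j for j in unordered if j != i]
        # pairwise_logits[i, j]: logit that sentence i should precede j
        fut = np.mean([pairwise_logits[i, j] for j in others]) if others else 0.0
        scores.append(ptr + alpha * fut)
    probs = softmax(np.array(scores))
    return unordered[int(np.argmax(probs))], probs
```

In this sketch the pairwise term only re-ranks candidates at each step; the left-to-right pointer decoding itself is unchanged.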


Cited by 28 publications (35 citation statements)
References 23 publications
“…In the future, we plan to design a fault-tolerant architecture to mitigate error propagation from external tools. Besides, incorporating more powerful decoding algorithms (Zhang et al, 2018a; Yin et al, 2020a) is a promising direction.…”
Section: Discussion
confidence: 99%
“…We compute such a pairwise relational representation for all the sentence pairs in the paragraph, and utilize a subset of them at each step of the decoder. Different from the previous method of using the learned sentence vectors to calculate the pairwise relationship between sentences (Yin et al, 2020), DRM employs the whole sequence of the sentence pair as the input to BERT. It allows us to directly relate words from different sentences, which makes it more straightforward to exploit the intrinsic relations and coherence between sentences.…”
Section: Deep Relational Module
confidence: 99%
“…It allows us to directly relate words from different sentences together, which is more straightforward to exploit the intrinsic relations and coherence between sentences. Further, instead of relying on the modules trained from scratch to control the pairwise ordering predictions (Yin et al, 2020), DRM adopts BERT as the main building block to obtain a pairwise relationship representation for the sentence pair. Intuitively, being pre-trained on the large corpus in BERT, this representation encodes more reliable and accurate relative ordering information, and thus is more effective to help determine the pairwise ordering predictions in the decoder.…”
Section: Deep Relational Module
confidence: 99%
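The citing work's point is that encoding the two sentences as one joint sequence lets token-level interactions across sentences inform the pairwise representation, whereas composing independently pooled sentence vectors cannot. A toy numpy illustration of the contrast is below; the vocabulary, mean pooling (a stand-in for BERT's [CLS] vector), and classifier weights are all hypothetical.

```python
# Toy illustration (not the DRM code): joint pair encoding vs. composing
# two independently pooled sentence vectors for an ordering prediction.
import numpy as np

rng = np.random.default_rng(0)
vocab = {w: rng.normal(size=8) for w in
         "he sat down . then she left [SEP]".split()}

def embed(tokens):
    return np.stack([vocab[t] for t in tokens])

s1 = "he sat down .".split()
s2 = "then she left".split()

# Joint encoding: one sequence "<s1> [SEP] <s2>", so any pooling or
# attention over it sees cross-sentence token pairs directly.
pair_repr = embed(s1 + ["[SEP]"] + s2).mean(axis=0)   # stand-in for [CLS]

# Separate encoding: each sentence pooled alone, then concatenated;
# no token from s1 ever interacts with a token from s2.
sep_repr = np.concatenate([embed(s1).mean(0), embed(s2).mean(0)])

w = rng.normal(size=pair_repr.shape)                  # toy ordering classifier
p_s1_before_s2 = 1 / (1 + np.exp(-(w @ pair_repr)))   # probability s1 precedes s2
```

With mean pooling the two routes happen to carry similar information; the argument in the citation statement is that with a full transformer encoder, cross-sentence attention in the joint route captures relations the separate route cannot.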