Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics 2019
DOI: 10.18653/v1/P19-1060
Multi-Task Learning for Coherence Modeling

Abstract: We address the task of assessing discourse coherence, an aspect of text quality that is essential for many NLP tasks, such as summarization and language assessment. We propose a hierarchical neural network trained in a multi-task fashion that learns to predict a document-level coherence score (at the network's top layers) along with word-level grammatical roles (at the bottom layers), taking advantage of inductive transfer between the two tasks. We assess the extent to which our framework generalizes to different…
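As a rough illustration of the architecture the abstract describes, below is a minimal PyTorch sketch of a hierarchical multi-task model: a word-level BiLSTM (bottom layers) feeds an auxiliary grammatical-role classifier, while pooled sentence vectors feed a sentence-level BiLSTM (top layers) that produces the document coherence score. Layer sizes, the mean pooling, and the sigmoid output are illustrative assumptions, not the authors' released configuration.

import torch
import torch.nn as nn

class MTLCoherenceModel(nn.Module):
    def __init__(self, vocab_size, n_roles, emb_dim=100, hidden=128):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        # Bottom layers: word-level encoder shared by both tasks.
        self.word_lstm = nn.LSTM(emb_dim, hidden, batch_first=True,
                                 bidirectional=True)
        # Auxiliary head: per-word grammatical-role logits.
        self.role_head = nn.Linear(2 * hidden, n_roles)
        # Top layers: sentence-level encoder over pooled sentence vectors.
        self.sent_lstm = nn.LSTM(2 * hidden, hidden, batch_first=True,
                                 bidirectional=True)
        # Main head: scalar document-level coherence score.
        self.score_head = nn.Linear(2 * hidden, 1)

    def forward(self, docs):
        # docs: (batch, n_sents, n_words) tensor of word ids.
        b, s, w = docs.shape
        words, _ = self.word_lstm(self.emb(docs.view(b * s, w)))
        role_logits = self.role_head(words)           # (b*s, w, n_roles)
        sent_vecs = words.mean(dim=1).view(b, s, -1)  # mean-pool each sentence
        sents, _ = self.sent_lstm(sent_vecs)
        score = torch.sigmoid(self.score_head(sents.mean(dim=1)))
        return score.squeeze(-1), role_logits.view(b, s, w, -1)

At training time the two objectives would be combined, e.g. a mean-squared error on the coherence score plus a weighted cross-entropy over the role logits; the weighting factor is an assumed hyperparameter, not a value taken from the paper.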

Cited by 19 publications (18 citation statements); citing publications range from 2020 to 2024. References 26 publications.
“…MTL with BERT embeddings (MTL_bert): We replicate the previous MTL model but now use BERT embeddings (Devlin et al., 2019) to initialize the input words. Single-task learning (STL; Farag and Yannakoudakis, 2019): This model has the same architecture as MTL but only performs the coherence prediction task, excluding the grammatical-role auxiliary objective. STL with BERT (STL_bert): This is the same as STL but uses BERT embeddings.…”
Section: Neural Coherence Models (mentioning)
confidence: 99%
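A hedged sketch of the "BERT embeddings" variants quoted above: the same coherence model, but with input word representations taken from a frozen pre-trained BERT encoder (via the Hugging Face transformers library) instead of a trainable embedding table. The model name and the choice to freeze the encoder are assumptions for illustration, not details confirmed by the citing paper.

import torch
from transformers import AutoModel, AutoTokenizer

tok = AutoTokenizer.from_pretrained("bert-base-uncased")
bert = AutoModel.from_pretrained("bert-base-uncased").eval()

@torch.no_grad()
def bert_word_vectors(sentence: str) -> torch.Tensor:
    # One contextual vector per word-piece token; these would replace
    # the nn.Embedding lookup in the MTL/STL models described above.
    enc = tok(sentence, return_tensors="pt")
    return bert(**enc).last_hidden_state.squeeze(0)  # (seq_len, 768)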
“…Some models utilize structured representations of text (e.g. Egrid representations; Tien Nguyen and Joty, 2017; Joty et al., 2018) and others operate on unstructured text, taking advantage of neural models' ability to learn useful representations for the task (Li and Jurafsky, 2017; Logeswaran et al., 2018; Farag and Yannakoudakis, 2019; Moon et al., 2019).…”
Section: Introduction (mentioning)
confidence: 99%
“…Coherence modeling is one of the essential aspects of natural language processing (Mesgar et al., 2019; Moon et al., 2019; Farag and Yannakoudakis, 2019). A coherent text can facilitate understanding and avoid confusion in reading comprehension.…”
Section: Introduction (mentioning)
confidence: 99%
“…Despite some successes, techniques explored so far mainly rely on the word sequence within a sentence. Farag and Yannakoudakis (2019) attempted to encode information about the types of grammatical roles in a sentence, such as clausal modifiers of nouns and coordinating conjunctions, obtained using the Stanford Dependency Parser (Chen and Manning, 2014), for coherence modeling.…”
Section: Introduction (mentioning)
confidence: 99%
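The passage above describes deriving per-word grammatical-role labels from a dependency parse as the auxiliary supervision signal. The cited work used the Stanford Dependency Parser; the sketch below swaps in spaCy purely to illustrate the label-extraction step, so the pipeline name and the exact label inventory are assumptions.

import spacy

nlp = spacy.load("en_core_web_sm")

def grammatical_roles(text):
    # Pair each token with its dependency label: a coordinating
    # conjunction surfaces as ('and', 'cc'), and clausal modifiers of
    # nouns as 'acl' or 'relcl' in spaCy's English scheme.
    return [(tok.text, tok.dep_) for tok in nlp(text)]

print(grammatical_roles("The model, which we trained jointly, works well."))

These per-token labels would serve as the targets for the word-level auxiliary head in a multi-task setup like the one sketched under the abstract.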