Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume 2021
DOI: 10.18653/v1/2021.eacl-main.100

Randomized Deep Structured Prediction for Discourse-Level Processing

Abstract: Expressive text encoders such as RNNs and Transformer Networks have been at the center of NLP models in recent work. Most of the effort has focused on sentence-level tasks, capturing the dependencies between words in a single sentence, or pairs of sentences. However, certain tasks, such as argumentation mining, require accounting for longer texts and complicated structural dependencies between them. Deep structured prediction is a general framework to combine the complementary strengths of expressive neural en…

Cited by 5 publications (2 citation statements)
References 23 publications
“…This hybrid modeling paradigm allows us to leverage expressive textual encoders, and to introduce contextualizing information and model different interdependent decisions. SRL methods have proven effective to model domains with limited supervision (Johnson and Goldwasser, 2018; Subramanian et al., 2018), and approaches that combine neural networks and SRL have shown consistent performance improvements (Widmoser et al., 2021; Roy et al., 2021).…”
Section: Inter-annotator Agreement
confidence: 99%
“…This hybrid modeling paradigm allows us to leverage expressive textual encoders, and to introduce contextualizing information and model different interdependent decisions. SRL methods have proven effective to model domains with limited supervision (Johnson and Goldwasser, 2018; Subramanian et al., 2018), and approaches that combine neural networks and SRL have shown consistent performance improvements (Widmoser et al., 2021; Roy et al., 2021). Following the conventions of statistical relational learning models, we use horn-clauses of the form p0 ∧ p1 ∧ ... ∧ pn ⇒ h to describe relational properties.…”
Section: Joint Probabilistic Model
confidence: 99%
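The horn-clause convention quoted above can be illustrated with a small sketch. This is not code from the cited papers: the soft-logic scoring (Gödel-style conjunction, Łukasiewicz-style implication) and all confidence values below are illustrative assumptions only.

```python
# Illustrative sketch: treat a horn clause p0 ∧ p1 ∧ ... ∧ pn ⇒ h as a soft
# rule over predicate confidences in [0, 1]. The scoring scheme here
# (Gödel conjunction, Łukasiewicz implication) is a common convention in
# statistical relational learning, not necessarily the one used by the paper.

def clause_body_score(premises):
    """Score the conjunction p0 ∧ ... ∧ pn as its weakest premise (Gödel t-norm)."""
    return min(premises)

def implication_score(premises, head):
    """Łukasiewicz-style implication: fully satisfied when head >= body score."""
    return min(1.0, 1.0 - clause_body_score(premises) + head)

# Hypothetical confidences for premises p0..p2 and head h.
premises = [0.9, 0.8, 0.95]
head = 0.7
print(round(implication_score(premises, head), 2))  # → 0.9
```

The rule's score degrades smoothly as the body becomes more confident than the head, which is what lets such clauses act as soft constraints inside a joint probabilistic model rather than hard logical filters.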