Proceedings of the 1st Workshop on Multilingual Representation Learning 2021
DOI: 10.18653/v1/2021.mrl-1.6
Learning Cross-lingual Representations for Event Coreference Resolution with Multi-view Alignment and Optimal Transport

Abstract: We study a new problem of cross-lingual transfer learning for event coreference resolution (ECR) where models trained on data from a source language are adapted for evaluations in different target languages. We introduce the first baseline model for this task based on XLM-RoBERTa, a state-of-the-art multilingual pre-trained language model. We also explore language adversarial neural networks (LANN) that present language discriminators to distinguish texts from the source and target languages to improve the lan…
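The language adversarial setup (LANN) mentioned in the abstract typically pits a language discriminator against the encoder, most often via a gradient reversal layer. Below is a minimal PyTorch sketch of gradient reversal only; the class name `GradReverse` and the scaling coefficient `lam` are illustrative choices, not details taken from the paper.

```python
import torch


class GradReverse(torch.autograd.Function):
    """Identity in the forward pass; negates (and scales) gradients in the
    backward pass, so the encoder is trained to *confuse* the discriminator."""

    @staticmethod
    def forward(ctx, x, lam):
        ctx.lam = lam
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        # Reverse the gradient flowing back into the encoder.
        return -ctx.lam * grad_output, None


def grad_reverse(x, lam=1.0):
    return GradReverse.apply(x, lam)


# Toy check: forward is the identity, backward flips the sign of the gradient.
x = torch.ones(3, requires_grad=True)
out = grad_reverse(x, 0.5).sum()
out.backward()
```

In a full LANN model, the discriminator would sit on top of `grad_reverse(encoder_output)`, so minimizing its language-classification loss pushes the encoder toward language-invariant representations.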

Cited by 2 publications (1 citation statement) · References 39 publications
“…Transfer learning focuses on transferring knowledge between domains, aiming to enhance model performance in the target domain by leveraging the knowledge gained from the source [28]. Several research works show that the effectiveness of transfer learning comes not only from learning good cross-domain feature representations [29,30], but also from its ability to learn high-level statistics from source-domain data [31]. Fine-tuning is perhaps the most widely used transfer learning technique in deep learning.…”
Section: Introduction
Confidence: 99%
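The quoted passage names fine-tuning as the most widely used transfer learning technique. One common variant freezes the pretrained weights and updates only a new task head; here is a minimal PyTorch sketch of that idea, where a tiny linear `encoder` is a hypothetical stand-in for a pretrained model such as XLM-RoBERTa (all names and sizes are illustrative, not from the cited works).

```python
import torch

torch.manual_seed(0)

# Hypothetical stand-in for a pretrained encoder, plus a fresh task head.
encoder = torch.nn.Linear(8, 8)
head = torch.nn.Linear(8, 2)

# Freeze the "pretrained" weights: only the head receives gradients.
for p in encoder.parameters():
    p.requires_grad = False

optimizer = torch.optim.SGD(head.parameters(), lr=0.1)

x = torch.randn(4, 8)               # toy batch of sentence representations
labels = torch.tensor([0, 1, 0, 1])  # toy binary labels

logits = head(encoder(x))
loss = torch.nn.functional.cross_entropy(logits, labels)
loss.backward()
optimizer.step()
```

Full fine-tuning would instead leave `requires_grad=True` on the encoder, usually with a smaller learning rate; the frozen variant shown here is cheaper and often a reasonable baseline when target-domain data is scarce.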