Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume 2021
DOI: 10.18653/v1/2021.eacl-main.254
PPT: Parsimonious Parser Transfer for Unsupervised Cross-Lingual Adaptation

Abstract: Cross-lingual transfer is a leading technique for parsing low-resource languages in the absence of explicit supervision. Simple 'direct transfer' of a learned model based on a multilingual input encoding has provided a strong benchmark. This paper presents a method for unsupervised cross-lingual transfer that improves over direct transfer systems by using their output as implicit supervision as part of self-training on unlabelled text in the target language. The method assumes minimal resources and provides ma…
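As a rough illustration of the self-training loop the abstract describes, here is a minimal Python sketch; the confidence-threshold filtering and the `parse_fn`/`retrain_fn` callables are illustrative assumptions, not the paper's actual procedure or interface.

```python
from typing import Callable, Iterable, List, Tuple

# Hypothetical sketch: the direct-transfer parser's own confident predictions
# on unlabelled target-language text serve as pseudo-labels (implicit
# supervision) for further training in the target language.

def self_train(
    parse_fn: Callable[[str], Tuple[List[int], float]],      # sentence -> (head indices, confidence)
    retrain_fn: Callable[[List[Tuple[str, List[int]]]], None],
    unlabelled_sentences: Iterable[str],
    confidence_threshold: float = 0.9,                        # assumed filtering heuristic
) -> None:
    pseudo_labelled = []
    for sentence in unlabelled_sentences:
        tree, confidence = parse_fn(sentence)      # direct-transfer parser output
        if confidence >= confidence_threshold:     # keep only confident parses
            pseudo_labelled.append((sentence, tree))
    retrain_fn(pseudo_labelled)                    # adapt the parser on its own outputs
```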

Cited by 7 publications (13 citation statements)
References 22 publications
“…Multilinguality is the key factor contributing to the success of PPTX (Kurniawan et al, 2021). Therefore, optimising the method to leverage this multilinguality provided by the source models is important.…”
Section: Proposed Methods
confidence: 99%
“…Model Architecture For parsing, we use the same architecture as was used by Kurniawan et al (2021), consisting of embedding layers, a Transformer encoder layer, and a biaffine output layer (Dozat and Manning, 2017). At test time, we run the MST algorithm (Chu and Liu, 1965; Edmonds, 1967) to find the highest scoring tree.…”
Section: Methods
confidence: 99%
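For context on the biaffine output layer mentioned in the quoted passage, here is a minimal PyTorch sketch in the spirit of Dozat and Manning (2017); the single bilinear layer without the original MLP projections, the class name, and the tensor shapes are simplifying assumptions, not the cited implementation.

```python
import torch
import torch.nn as nn

class BiaffineArcScorer(nn.Module):
    """Minimal biaffine arc scorer (simplified sketch, not the cited code)."""

    def __init__(self, hidden_dim: int):
        super().__init__()
        # Bilinear weight; the extra row acts as a bias term on the dependent side.
        self.U = nn.Parameter(torch.empty(hidden_dim + 1, hidden_dim))
        nn.init.xavier_uniform_(self.U)

    def forward(self, encoded: torch.Tensor) -> torch.Tensor:
        # encoded: (batch, seq_len, hidden_dim), e.g. Transformer encoder output.
        batch, seq_len, _ = encoded.shape
        ones = encoded.new_ones(batch, seq_len, 1)
        dep = torch.cat([encoded, ones], dim=-1)            # (batch, seq_len, hidden+1)
        # scores[b, i, j]: score of token j being the syntactic head of token i.
        scores = dep @ self.U @ encoded.transpose(1, 2)     # (batch, seq_len, seq_len)
        return scores

# At test time, a maximum spanning tree algorithm (Chu-Liu/Edmonds) would be
# run over the score matrix to extract the highest-scoring dependency tree.
```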