2023
DOI: 10.1109/taslp.2022.3224302
Curriculum-Style Fine-Grained Adaption for Unsupervised Cross-Lingual Dependency Transfer

Cited by 4 publications (1 citation statement)
References 44 publications
“…However, such direct transfer approaches are insufficient for target domains with large gaps compared to the source domain. Self-training (Yarowsky, 1995; McClosky et al., 2006; Yu et al., 2015; Ramponi and Plank, 2020; Guo et al., 2022) is a simple and early bootstrapping approach for domain adaptation, applied to tasks such as classification (Dong and Schäfer, 2011; Ye et al., 2020), sequence labeling (Wang et al., 2020, 2021), dependency parsing (Yu et al., 2015; Rotman and Reichart, 2019; Guo et al., 2022), sequence generation (He et al., 2019), and QA (Sachan and Xing, 2018). However, self-training has not been applied to cross-domain constituency parsing.…”
Section: Related Work
confidence: 99%