2023
DOI: 10.1109/taslp.2022.3212698
Cross-Lingual Named Entity Recognition for Heterogenous Languages

Cited by 2 publications (2 citation statements)
References 38 publications
“…Cross-Lingual NER: Various strategies have been investigated to tackle the zero-resource challenge for cross-lingual NER, such as translation-based (Xie et al. 2018; Liang et al. 2021), direct transfer-based (Wu and Dredze 2019; Wu et al. 2020c), and knowledge distillation-based approaches (Wu et al. 2020a; Liang et al. 2021; Chen et al. 2021; Zeng et al. 2022; Fu et al. 2022). The direct transfer-based methods yield inferior results, since the target-language data is not effectively utilized.…”
Section: Related Work
confidence: 99%
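The citation statement above contrasts direct transfer with the other strategy families. A minimal sketch of that direct-transfer baseline is shown below, assuming a multilingual encoder fine-tuned only on labeled source-language (English) NER data and then applied zero-shot to target-language text; the label set and the fine-tuning loop are illustrative assumptions, not part of the cited paper.

```python
# Sketch of zero-shot direct transfer for cross-lingual NER (assumed setup).
# The model sees no target-language data during training, which is the
# limitation the quoted statement points to.
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

LABELS = ["O", "B-PER", "I-PER", "B-ORG", "I-ORG", "B-LOC", "I-LOC"]  # example tagset

tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
model = AutoModelForTokenClassification.from_pretrained(
    "bert-base-multilingual-cased", num_labels=len(LABELS)
)
# ... fine-tune `model` on English CoNLL-style NER data here (omitted) ...

# Zero-shot inference on target-language text: the multilingual encoder is
# applied unchanged; subword/word alignment is glossed over in this sketch.
text = "Angela Merkel besuchte Madrid."
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
pred_ids = logits.argmax(dim=-1)[0].tolist()
print([LABELS[i] for i in pred_ids])
```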
“…Wu et al. (2020a) pioneered the use of knowledge distillation architectures for cross-lingual NER tasks and achieved surprisingly strong performance. Since then, researchers in this field have focused on knowledge distillation-based approaches and developed various improvements (Zeng et al. 2022; Fu et al. 2022). Following the previous works above, this paper employs the knowledge distillation architecture as the backbone for cross-lingual NER.…”
Section: Knowledge Distillation
confidence: 99%
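To make the distillation backbone described in this citation statement concrete, the following is a minimal sketch (not the cited paper's exact method): a teacher trained on labeled source-language data produces soft token-level label distributions on unlabeled target-language text, and a student is trained to match them. The model classes, data batches, and hyperparameters are assumptions for illustration.

```python
# Token-level knowledge distillation sketch for zero-resource cross-lingual NER.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """KL divergence between student and teacher token-label distributions.

    Both tensors have shape (batch, seq_len, num_labels).
    """
    t = temperature
    teacher_probs = F.softmax(teacher_logits / t, dim=-1)
    student_log_probs = F.log_softmax(student_logits / t, dim=-1)
    # batchmean KL, rescaled by t^2 as is conventional in distillation
    return F.kl_div(student_log_probs, teacher_probs,
                    reduction="batchmean") * (t * t)

def train_step(student, teacher, batch, optimizer):
    """One distillation step on unlabeled target-language tokens.

    `student` and `teacher` are hypothetical token classifiers that map
    (input_ids, attention_mask) to per-token logits.
    """
    teacher.eval()
    with torch.no_grad():
        teacher_logits = teacher(batch["input_ids"], batch["attention_mask"])
    student_logits = student(batch["input_ids"], batch["attention_mask"])
    loss = distillation_loss(student_logits, teacher_logits)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

Because only the teacher's soft predictions on target-language text are needed, this setup uses the unlabeled target data that direct transfer leaves untouched, which is the advantage the quoted statements attribute to the distillation-based line of work.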