Proceedings of the 4th Workshop on Representation Learning for NLP (RepL4NLP-2019) 2019
DOI: 10.18653/v1/w19-4320

Learning Multilingual Meta-Embeddings for Code-Switching Named Entity Recognition

Abstract: In this paper, we propose Multilingual Meta-Embeddings (MME), an effective method to learn multilingual representations by leveraging monolingual pre-trained embeddings. MME learns to utilize information from these embeddings via a self-attention mechanism without explicit language identification. We evaluate the proposed embedding method on the code-switching English-Spanish Named Entity Recognition dataset in a multilingual and cross-lingual setting. The experimental results show that our proposed method ach…
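The abstract describes MME as an attention-weighted combination of monolingual pre-trained embeddings, learned without any explicit language-ID input. A minimal sketch of that idea is below; the names, dimensions, and dot-product scoring function are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def meta_embed(monolingual_vecs, proj_mats, w):
    """Combine per-language embeddings of one token into a meta-embedding.

    monolingual_vecs: one vector per monolingual embedding space
    proj_mats: per-language projections into a shared space
    w: learned attention parameter vector in the shared space
    (All shapes and the scoring function are illustrative assumptions.)
    """
    projected = [P @ v for P, v in zip(proj_mats, monolingual_vecs)]
    scores = np.array([w @ p for p in projected])  # one scalar per language
    alpha = softmax(scores)                        # attention weights, sum to 1
    meta = sum(a * p for a, p in zip(alpha, projected))
    return meta, alpha

# Toy example: an English vector (dim 4) and a Spanish vector (dim 6),
# both projected into a shared 5-dimensional space.
vecs = [rng.normal(size=4), rng.normal(size=6)]
mats = [rng.normal(size=(5, 4)), rng.normal(size=(5, 6))]
w = rng.normal(size=5)
meta, alpha = meta_embed(vecs, mats, w)
print(meta.shape, alpha)  # meta-embedding in the shared space + per-language weights
```

Because the weights are computed from the token's own projected representations, the model can lean on whichever language's embedding is most informative for a code-switched token, with no language tag supplied.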


Cited by 29 publications (10 citation statements)
References 24 publications
“…Another method that uses multilingual meta-embeddings for code-switching NER was proposed by Genta Indra Winata et al. [48]. In this work, the authors proposed multilingual meta-embeddings, an effective method to learn multilingual representations by using monolingual pre-trained embeddings [48]. This paper claims that their proposed method achieves state-of-the-art performance in a multilingual setting and generalizes well.…”
Section: NER in English and Other Resource-Heavy Languages
confidence: 96%
“…Furthermore, they have constructed a baseline NER system using deep neural networks and word embeddings for Arabic-English code-mixed text and enhanced it using a pooling technique [47]. Another method that uses multilingual meta-embeddings for code-switching NER was proposed by Genta Indra Winata et al. [48]. In this work, the authors proposed multilingual meta-embeddings, an effective method to learn multilingual representations by using monolingual pre-trained embeddings [48].…”
Section: NER in English and Other Resource-Heavy Languages
confidence: 99%
“…Xiao et al. (2020) applied meta-learning to solve the multilingual low-resource speech recognition problem. Winata et al. (2019) also used MAML to adapt models to unseen accents in speech recognition. Indurthi et al. (2019) adopted meta-learning algorithms to perform speech translation on speech-transcript paired low-resource data.…”
Section: Related Work
confidence: 99%
“…Wang et al (2018) use a different attention method for NER, which is based on a gated cell that learns to choose appropriate monolingual embeddings according to the input text. Recently, Winata et al (2019) proposed multilingual meta embeddings (MME) combined with self-attention (Vaswani et al, 2017). Their method establishes a state of the art on Spanish-English NER by heavily relying on monolingual embeddings for every language in the code-switched text.…”
Section: Related Workmentioning
confidence: 99%