2021
DOI: 10.1007/s11390-020-9713-0

Language Adaptation for Entity Relation Classification via Adversarial Neural Networks

Abstract: Entity relation classification aims to classify the semantic relationship between two marked entities in a given sentence, and plays a vital role in various natural language processing applications. However, existing studies focus on exploiting mono-lingual data in English, due to the lack of labeled data in other languages. How to effectively benefit from a richly-labeled language to help a poorly-labeled language is still an open problem. In this paper, we come up with a language adaptation framework for cro…
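The abstract is truncated here, but it points to the gradient-reversal style of adversarial training that language-adaptation frameworks of this kind commonly build on (Ganin and Lempitsky, 2015). Below is a minimal PyTorch sketch of that idea; the class names, the two-layer discriminator, and all dimensions are illustrative assumptions, not the paper's reported architecture.

```python
import torch
import torch.nn as nn


class GradReverse(torch.autograd.Function):
    """Identity in the forward pass; multiplies the gradient by -lambda
    in the backward pass, so the shared encoder is trained to *fool*
    the language discriminator."""

    @staticmethod
    def forward(ctx, x, lambd):
        ctx.lambd = lambd
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        # One gradient per forward input: (x, lambd); lambd gets None.
        return -ctx.lambd * grad_output, None


class LanguageDiscriminator(nn.Module):
    """Classifies which language a sentence representation came from.
    Because of the reversed gradient, minimizing its loss pushes the
    encoder toward language-agnostic features (illustrative sketch)."""

    def __init__(self, hidden_dim: int, n_languages: int = 2, lambd: float = 1.0):
        super().__init__()
        self.lambd = lambd
        self.classifier = nn.Sequential(
            nn.Linear(hidden_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, n_languages),
        )

    def forward(self, sentence_repr: torch.Tensor) -> torch.Tensor:
        return self.classifier(GradReverse.apply(sentence_repr, self.lambd))
```

In such setups, the relation classifier's loss and the discriminator's loss are optimized jointly; the reversed gradient makes the shared encoder discard language-identifying features while the task head still learns the relation labels.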

Cited by 4 publications (5 citation statements)
References: 30 publications
“…Hence, its processing basis is the need for massive datasets for training. Nonlinearity, distributed and parallel computing, adaptivity, and self-organization are the main features of NN processing programs [15].…”
Section: Methods
confidence: 99%
“…There are numerous metaphors used to describe teacher education, including a large grassland with numerous paths to travel through it [26]. A virtual learning environment has been used in a number of studies to examine learning communities and the knowledge connotations of learning models and educational apps in great detail [27]. A methodology for creating virtual learning communities that emphasizes the growth of members' knowledge construction was laid out in the literature as a starting point [28].…”
Section: Introduction
confidence: 99%
“…Some works attempt to align the contextual word embeddings of different languages, such as by learning projection transformations (Aldarmaki and Diab, 2019) or by forcing the model to make similar predictions for parallel sentences (Yang et al., 2021; Gritta and Iacobacci, 2021; Pan et al., 2020; Dou and Neubig, 2021). There are also many works exploring adversarial training to align different languages (Chen et al., 2021, 2018b; Keung et al., 2019; Lee and Lee, 2019; Wang et al., 2021a; Zou et al., 2018). Their motivation is to learn language-agnostic representations that make the model focus more on understanding text semantics and generalize better on low-resource languages.…”
Section: Cross-lingual NLU
confidence: 99%
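For concreteness, here is a minimal sketch of the "projection transformation" family this excerpt mentions, using the classical closed-form orthogonal Procrustes alignment over embeddings of parallel pairs; whether Aldarmaki and Diab (2019) or the other cited works use exactly this form is an assumption.

```python
import torch


def procrustes_align(src: torch.Tensor, tgt: torch.Tensor) -> torch.Tensor:
    """Closed-form orthogonal Procrustes: find the orthogonal map W
    minimizing ||src @ W.T - tgt||_F, given row-aligned embeddings of
    parallel words/sentences (src, tgt: shape (n_pairs, dim)).
    The solution is W = U @ Vh where tgt.T @ src = U @ S @ Vh (SVD)."""
    u, _, vh = torch.linalg.svd(tgt.T @ src)
    return u @ vh  # (dim, dim); map source embeddings via src @ W.T
```

Keeping W orthogonal preserves distances and angles in the source embedding space, which is why this closed form is a common baseline for aligning monolingual embedding spaces.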
“…To enable the learned representation to be transferable among different languages, we hope the model can focus on understanding the language-agnostic semantics of text, following (Chen et al., 2021, 2018b; Keung et al., 2019; Wang et al., 2021a; Zou et al., 2018). In other words, the sentence representation should preserve semantic information while removing language-related information.…”
Section: Coding Rate-Distortion Maximization
confidence: 99%
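The section title refers to a coding rate-distortion objective. As background, here is a small sketch of the coding-rate function from Yu et al. (2020), which rate-distortion style representation objectives typically build on; that the citing paper maximizes exactly this quantity is an assumption.

```python
import torch


def coding_rate(z: torch.Tensor, eps: float = 0.5) -> torch.Tensor:
    """R(Z; eps) = 1/2 * logdet(I + d / (n * eps^2) * Z.T @ Z):
    roughly, the number of bits needed to encode the n d-dimensional
    rows of Z up to distortion eps (Yu et al., 2020). Maximizing it
    spreads the representations, preserving more (semantic) information."""
    n, d = z.shape
    ident = torch.eye(d, dtype=z.dtype, device=z.device)
    return 0.5 * torch.logdet(ident + (d / (n * eps ** 2)) * (z.T @ z))
```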