Proceedings of the Workshop on Deep Learning Approaches for Low-Resource NLP 2018
DOI: 10.18653/v1/w18-3406
Multi-Task Active Learning for Neural Semantic Role Labeling on Low Resource Conversational Corpus

Abstract: Most Semantic Role Labeling (SRL) approaches are supervised methods which require a significant amount of annotated corpus, and the annotation requires linguistic expertise. In this paper, we propose a Multi-Task Active Learning framework for Semantic Role Labeling with Entity Recognition (ER) as the auxiliary task to alleviate the need for extensive data and use additional information from ER to help SRL. We evaluate our approach on an Indonesian conversational dataset. Our experiments show that multi-task activ…

Cited by 11 publications (11 citation statements)
References 19 publications
“…To adapt active learning to a multi-task scenario, Reichart et al. (2008) proposed a multi-task active learning framework for linguistic annotations with CRF-based models. Leveraging the advantages of RNN models, a multi-task active learning framework was put forward for neural semantic role labeling on a low-resource conversational corpus (Ikhwantri et al., 2018). The above methods are uncertainty-based and do not explicitly demand diversity.…”
Section: Related Work
“…The effectiveness of active learning is influenced by the performance of the task models. Considering the relatedness of different tasks, some researchers have proposed multi-task active learning for linguistic annotations (Reichart et al., 2008; Ikhwantri et al., 2018). The model of Ikhwantri et al. (2018) exploits an encoder-decoder framework with soft-shared parameters.…”
Section: Introduction
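The "soft-shared parameters" mentioned in the excerpt above refer to soft parameter sharing: each task keeps its own encoder weights, and a regularizer pulls the two weight sets toward each other instead of forcing a single shared encoder. The following is a minimal NumPy sketch of that idea, not the cited authors' implementation; all names, sizes, and the penalty weight are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
W_srl = rng.normal(size=(4, 4))  # toy encoder weights for the SRL task
W_er = rng.normal(size=(4, 4))   # toy encoder weights for the ER task

def soft_sharing_penalty(W_a, W_b, lam=0.1):
    """Soft parameter sharing: an L2 penalty on the divergence between the
    two tasks' encoder weights, added to the joint training loss."""
    return lam * float(np.sum((W_a - W_b) ** 2))

def penalty_grad(W_a, W_b, lam=0.1):
    """Gradient of the penalty w.r.t. W_a; it pulls W_a toward W_b."""
    return 2 * lam * (W_a - W_b)

# One gradient step on the penalty alone shrinks the gap between the tasks.
before = soft_sharing_penalty(W_srl, W_er)
W_srl = W_srl - 0.5 * penalty_grad(W_srl, W_er)
after = soft_sharing_penalty(W_srl, W_er)
```

In a real model this penalty would be added to the sum of the per-task losses, so the encoders stay similar while remaining free to specialize.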
“…In the general domain, most works obtain the pattern set by mining it from corpora [21][22][23] 36 . When applying this to biomedical text, we observe that the SRL task is closely related to Named Entity Recognition (NER), because the entity type of an argument determines its semantic role (for example, a DNA entity can only fill the "agent" role of the predicate "encode", not the "product" role).…”
Section: Pattern-Matching Approach
“…Active Learning, Task Selection and Sampling: Our sampling technique is similar to those found in several active learning algorithms (Chen et al., 2006) that are based on Shannon entropy estimates. Reichart et al. (2008) and Ikhwantri et al. (2018) examined Multi-Task Active Learning (MTAL) in a two-task annotation scenario and showed performance gains while needing less labelled data. Our approach is a substantially different variant of MTAL, since it was developed for task selection.…”
Section: Related Work
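The Shannon-entropy-based uncertainty sampling that these excerpts refer to can be sketched as follows. This is a generic illustration rather than code from any of the cited papers; scoring a sentence by its mean per-token entropy is one common heuristic among several.

```python
import math

def token_entropy(dist):
    """Shannon entropy of one token's predicted label distribution."""
    return -sum(p * math.log(p) for p in dist if p > 0)

def select_batch(pool, k):
    """Pick the k sentences whose mean per-token entropy is highest.

    pool: list of (sentence_id, list of per-token probability
    distributions), as produced by the current task model.
    """
    scored = sorted(
        pool,
        key=lambda item: sum(token_entropy(d) for d in item[1]) / len(item[1]),
        reverse=True,
    )
    return [sid for sid, _ in scored[:k]]

pool = [
    ("s1", [[0.99, 0.01]]),      # model is confident: low entropy
    ("s2", [[0.5, 0.5]]),        # maximally uncertain over 2 labels
    ("s3", [[1/3, 1/3, 1/3]]),   # maximally uncertain over 3 labels
]
picked = select_batch(pool, 2)   # → ["s3", "s2"]
```

The selected sentences are the ones the annotator is asked to label next; in the multi-task setting, per-task entropies can be combined to score the same sentence for both SRL and ER.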