Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics (NAACL 2019)
DOI: 10.18653/v1/n19-1249

SC-LSTM: Learning Task-Specific Representations in Multi-Task Learning for Sequence Labeling (Lu et al., 2019)

Abstract: Multi-task learning (MTL) has been studied recently for sequence labeling. Typically, auxiliary tasks are selected specifically in order to improve the performance of a target task. Jointly learning multiple tasks in a way that benefits all of them simultaneously can increase the utility of MTL. To do so, we propose a new LSTM cell which contains both shared parameters that can learn from all tasks and task-specific parameters that can learn task-specific information. We name it a Shared-Cell Long-Short Term Memory (SC-LSTM). […]
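The abstract describes an LSTM cell that combines parameters shared across all tasks with parameters owned by a single task. The sketch below illustrates that general idea only, assuming a composition of two standard recurrent cells and a learned mixing gate; the class name, the gating scheme, and all sizes are hypothetical and do not reproduce the authors' actual SC-LSTM formulation.

```python
import torch
import torch.nn as nn

class SharedCellLSTM(nn.Module):
    """Illustrative sketch of a shared/task-specific recurrent cell
    (hypothetical design, not the SC-LSTM equations from the paper)."""

    def __init__(self, input_size, hidden_size, num_tasks):
        super().__init__()
        # Shared recurrent parameters: receive gradients from every task.
        self.shared_cell = nn.LSTMCell(input_size, hidden_size)
        # Task-specific recurrent parameters: each cell is updated only by its own task.
        self.task_cells = nn.ModuleList(
            nn.LSTMCell(input_size, hidden_size) for _ in range(num_tasks)
        )
        # Learned gate that mixes the shared and task-specific hidden states.
        self.mix_gate = nn.Linear(2 * hidden_size, hidden_size)

    def forward(self, x, task_id, shared_state, task_state):
        h_s, c_s = self.shared_cell(x, shared_state)        # shared path
        h_t, c_t = self.task_cells[task_id](x, task_state)  # task-specific path
        g = torch.sigmoid(self.mix_gate(torch.cat([h_s, h_t], dim=-1)))
        h = g * h_s + (1.0 - g) * h_t                       # gated combination
        return h, (h_s, c_s), (h_t, c_t)

# Example step on a batch of 4 token embeddings for task 1.
cell = SharedCellLSTM(input_size=100, hidden_size=128, num_tasks=3)
x = torch.randn(4, 100)
zeros = torch.zeros(4, 128)
h, shared_state, task_state = cell(x, task_id=1,
                                   shared_state=(zeros, zeros),
                                   task_state=(zeros, zeros))
```

Under this kind of layout, every task's loss updates `shared_cell`, while only task k's loss updates `task_cells[k]`, which loosely mirrors the sharing pattern the abstract alludes to.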

Cited by 7 publications (5 citation statements, 2020-2023) | References 35 publications
“…Similarly, understanding the relationships between tasks has been identified as crucial, as highlighted by studies focusing on the adaptive sharing of multi-level distributed representations. Furthermore, the advent of novel LSTM cells encapsulating both shared and task-specific parameters, as proposed in SC-LSTM (Lu et al., 2019), shows promise in improving the performance of a target task by judicious selection of auxiliary tasks, and hence, adds a new dimension to the ongoing discourse. Additionally, the reparameterization of convolutions for incremental multi-task learning has provided a pathway to manage task-specific parameters efficiently, especially when introducing new tasks to the MTL framework (Kanakis et al., 2020).…”
Section: Multi-task Learning
confidence: 99%
“…Many proposals in literature have focused on MTL for NER [33] or, POS [34], [35] or, both NER & POS [8], [9], [28], [36]-[39], but these solutions require huge RAM/ROM for on-device inferencing. Existing efforts towards employing MTL for tasks such as sequence labeling and semantic tasks have primarily focused on accuracy, leading to models that are huge for on-device use where resources are constrained [6], [40].…”
Section: Related Work
confidence: 99%
“…There exists literature on learning joint representations for both NER and POS using MTL and they have shown promising results in avoiding over-fitting but have been applied for server-side processing [8], [9]. Further, works on building on-device inference models for text classification [10], [11] highlight the advantages of on-device inferencing of neural models.…”
Section: Introduction
confidence: 99%
“…Sequence Labeling (SL) is a basic paradigm in the natural language processing (NLP) field, which assigns a predefined label to every meaningful unit (e.g., word or character) in a given sequence (He et al., 2020; Lu et al., 2019; Li et al., 2021). Many NLP tasks belong to this category, such as Named Entity Recognition (NER) (Zhu and Li, 2022; Shen et al., 2022; He and Tang, 2022; Zhou et al., 2022b), Chinese Word Segmentation (CWS) (Fu et al., 2020; Maimaiti et al., 2021), Part-Of-Speech (POS) tagging (Zhou et al., 2022a; Nguyen and Verspoor, 2018).…”
Section: Introduction
confidence: 99%
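As a concrete illustration of the sequence-labeling setup described in the quoted passage, the toy snippet below pairs each token with one predefined tag; the sentence and the BIO-style NER tags are invented for illustration.

```python
# Toy example of sequence labeling: one label per token (invented data).
tokens = ["Barack", "Obama", "visited", "Paris", "yesterday", "."]
labels = ["B-PER",  "I-PER", "O",       "B-LOC", "O",         "O"]

for token, label in zip(tokens, labels):
    print(f"{token}\t{label}")
```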