Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics 2019
DOI: 10.18653/v1/p19-1079

Multi-Task Networks with Universe, Group, and Task Feature Learning

Abstract: We present methods for multi-task learning that take advantage of natural groupings of related tasks. Task groups may be defined along known properties of the tasks, such as task domain or language. Such task groups represent supervised information at the inter-task level and can be encoded into the model. We investigate two variants of neural network architectures that accomplish this, learning different feature spaces at the levels of individual tasks, task groups, as well as the universe of all tasks: (1) pa…
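The abstract is truncated here, but the multi-level feature idea it describes can be illustrated concretely. Below is a minimal PyTorch sketch, not the authors' implementation: each input is encoded in parallel into a universe space shared by all tasks, a group space shared within a task group, and a task-private space, and the three feature vectors are concatenated for a task-specific output head. All names (GROUP_OF_TASK, the linear encoders, the dimensions) are illustrative assumptions.

```python
import torch
import torch.nn as nn

# Hypothetical grouping of four tasks into two language groups (names are
# illustrative only; the paper groups tasks by properties such as domain
# or language).
GROUP_OF_TASK = {
    "en_intent": "english", "en_slots": "english",
    "de_intent": "german",  "de_slots": "german",
}

class UniverseGroupTaskModel(nn.Module):
    """Parallel variant: encode each input simultaneously into a universe
    space (shared by all tasks), a group space (shared within a task group),
    and a task-private space; the concatenation feeds a task-specific head."""

    def __init__(self, input_dim, hidden_dim, num_labels_per_task):
        super().__init__()
        groups = sorted(set(GROUP_OF_TASK.values()))
        self.universe = nn.Linear(input_dim, hidden_dim)   # shared by all tasks
        self.group = nn.ModuleDict(
            {g: nn.Linear(input_dim, hidden_dim) for g in groups})
        self.task = nn.ModuleDict(
            {t: nn.Linear(input_dim, hidden_dim) for t in GROUP_OF_TASK})
        self.head = nn.ModuleDict(
            {t: nn.Linear(3 * hidden_dim, n)               # concat of 3 spaces
             for t, n in num_labels_per_task.items()})

    def forward(self, x, task):
        feats = torch.cat([torch.relu(self.universe(x)),
                           torch.relu(self.group[GROUP_OF_TASK[task]](x)),
                           torch.relu(self.task[task](x))], dim=-1)
        return self.head[task](feats)

model = UniverseGroupTaskModel(input_dim=128, hidden_dim=64,
                               num_labels_per_task={t: 5 for t in GROUP_OF_TASK})
logits = model(torch.randn(8, 128), task="en_intent")  # shape (8, 5)
```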

Cited by 17 publications (17 citation statements)
References 25 publications
“…Motivated by this problem, joint models [12,13,14] have been developed to solve the intent detection and slot filling tasks together. In addition, some work [15,5,16,17,18,19] tries to enhance performance via multi-task learning. These joint models and multi-task learning methods link the two tasks implicitly by applying a joint loss function.…”
Section: Related Work
confidence: 99%
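The "joint loss function" linkage these citing papers describe can be made concrete. Here is a minimal sketch, assuming a single shared encoder with an intent head over the final state and a per-token slot head; the sizes, names, and architecture are hypothetical and are not the specific models of [12-19]:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class JointIntentSlotModel(nn.Module):
    """One shared sentence encoder; intent classification uses the final
    hidden state, slot filling labels every token. The two tasks are linked
    implicitly through the summed (joint) loss below."""

    def __init__(self, vocab_size, hidden_dim, num_intents, num_slots):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden_dim)
        self.encoder = nn.GRU(hidden_dim, hidden_dim, batch_first=True)
        self.intent_head = nn.Linear(hidden_dim, num_intents)
        self.slot_head = nn.Linear(hidden_dim, num_slots)

    def forward(self, token_ids):
        states, last = self.encoder(self.embed(token_ids))
        return self.intent_head(last.squeeze(0)), self.slot_head(states)

model = JointIntentSlotModel(vocab_size=1000, hidden_dim=64,
                             num_intents=7, num_slots=12)
tokens = torch.randint(0, 1000, (8, 20))       # batch of 8 utterances, length 20
intent_gold = torch.randint(0, 7, (8,))        # one intent label per utterance
slot_gold = torch.randint(0, 12, (8, 20))      # one slot label per token

intent_logits, slot_logits = model(tokens)
# Joint loss: a plain sum; weighting the two terms is a common variation.
loss = (F.cross_entropy(intent_logits, intent_gold)
        + F.cross_entropy(slot_logits.transpose(1, 2), slot_gold))
loss.backward()
```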
“…In [24], each task has its own encoder and decoder, while all tasks share a representation learning layer and a joint encoding layer. [92] creates encoder modules at different levels: the task level, the task-group level, and the universal level.…”
Section: Modular Architectures
confidence: 99%
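The shared-plus-private pattern attributed to [24] above can also be sketched. This is a hedged illustration of only what that sentence states (per-task encoder and decoder plus a shared representation layer and joint encoding layer); task names, layer types, and dimensions are assumptions, not taken from either paper:

```python
import torch
import torch.nn as nn

TASKS = ["task_a", "task_b"]  # hypothetical task names

class SharedPrivateModel(nn.Module):
    """Each task has a private encoder and decoder; all tasks additionally
    pass through a shared representation layer and a joint encoding layer."""

    def __init__(self, input_dim, hidden_dim, output_dim):
        super().__init__()
        self.shared_repr = nn.Linear(input_dim, hidden_dim)      # shared by all
        self.joint_encoder = nn.Linear(hidden_dim, hidden_dim)   # shared by all
        self.encoders = nn.ModuleDict(
            {t: nn.Linear(input_dim, hidden_dim) for t in TASKS})
        self.decoders = nn.ModuleDict(
            {t: nn.Linear(2 * hidden_dim, output_dim) for t in TASKS})

    def forward(self, x, task):
        shared = torch.relu(self.joint_encoder(torch.relu(self.shared_repr(x))))
        private = torch.relu(self.encoders[task](x))
        return self.decoders[task](torch.cat([shared, private], dim=-1))

model = SharedPrivateModel(input_dim=128, hidden_dim=64, output_dim=5)
out = model(torch.randn(4, 128), task="task_a")  # shape (4, 5)
```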
“…Meanwhile, some work [10,22] tries to enhance performance via multi-task learning. These multi-task learning methods link the two tasks implicitly by applying a joint loss function.…”
Section: Related Work
confidence: 99%