2021
DOI: 10.1007/s41060-021-00274-0

Deep multi-task learning with flexible and compact architecture search

Cited by 8 publications (9 citation statements)
References 25 publications
“…The results showed that the deep learning model could perform better and bring valuable information to meet users' needs. Focusing on more fundamental deep learning techniques, Zhao et al. [54] developed a flexible approach to compact architecture search for deep multi-task learning (MTL) problems. Though sharing model architectures is a popular method for MTL problems, identifying the appropriate components to be shared by multiple tasks is still a challenge.…”
Section: Applied and Flexible Deep Learning
Citation type: mentioning (confidence: 99%)
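The shared-architecture approach this excerpt refers to is often implemented as hard parameter sharing: a common trunk feeds task-specific heads. A minimal PyTorch sketch is given below; the class name, layer sizes, and two-task setup are illustrative assumptions, not the architecture-search method proposed by Zhao et al.

```python
import torch
import torch.nn as nn

class HardSharingMTL(nn.Module):
    """Hard parameter sharing: one shared trunk, one private head per task."""

    def __init__(self, in_dim=32, hidden_dim=64, task_out_dims=(10, 1)):
        super().__init__()
        # Parameters in this trunk are reused by every task.
        self.shared = nn.Sequential(
            nn.Linear(in_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, hidden_dim), nn.ReLU(),
        )
        # Each task keeps its own output head; nothing here is shared.
        self.heads = nn.ModuleList(
            nn.Linear(hidden_dim, d) for d in task_out_dims
        )

    def forward(self, x):
        z = self.shared(x)
        return [head(z) for head in self.heads]

model = HardSharingMTL()
x = torch.randn(8, 32)
outputs = model(x)  # one output tensor per task
print([o.shape for o in outputs])
```

Deciding which layers belong in the shared trunk, rather than fixing them by hand as above, is exactly the component-selection challenge the excerpt points to.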
“…We now see more examples of transfer learning, where models trained on one (source) domain are applied in another (target) domain suffering from data scarcity. However, learning generalized models that perform well on multiple tasks could be a challenging process [54]. These models are often trained with self-supervision on large data and contain millions or billions of learned parameters, such as models for language processing (e.g., BERT, GPT-3, and XLNet) and image classification (ResNet, EfficientNet, Inception).…”
Section: New Trends From the Industry Perspective
Citation type: mentioning (confidence: 99%)
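The transfer-learning workflow mentioned here (reusing a large pretrained model in a data-scarce target domain) commonly amounts to freezing the pretrained backbone and training a new task head. A minimal sketch follows, assuming torchvision is available; the ResNet-18 backbone and the five-class target task are illustrative choices, not taken from the cited work.

```python
import torch
import torch.nn as nn
from torchvision import models

# Load an ImageNet-pretrained backbone and freeze all of its parameters.
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for p in backbone.parameters():
    p.requires_grad = False

# Replace the classifier head with one sized for the small target dataset.
num_target_classes = 5  # hypothetical target domain with 5 classes
backbone.fc = nn.Linear(backbone.fc.in_features, num_target_classes)

# Only the new head is optimized, which suits target domains with little data.
optimizer = torch.optim.Adam(backbone.fc.parameters(), lr=1e-3)
```

Unfreezing some of the deeper backbone layers is a common variant once more target data is available.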