2016
DOI: 10.1109/tpami.2015.2452911

Flexible Clustered Multi-Task Learning by Learning Representative Tasks

Abstract: Multi-task learning (MTL) methods have shown promising performance by learning multiple relevant tasks simultaneously, sharing useful information across them. Among various MTL methods, clustered multi-task learning (CMTL) assumes that all tasks can be clustered into groups and attempts to learn the underlying cluster structure from the training data. In this paper, we present a new approach for CMTL, called flexible clustered multi-task (FCMTL), in which the cluster structure is lear…

Cited by 74 publications (32 citation statements)
References 29 publications
“…More specifically, [10,12,26] assume that different tasks share a similar sparse feature-selection pattern. [8,11,16,17,37,38] assume that the weight vectors of different tasks share common structures. In a similar spirit, [3,13,14] directly assume that the weight matrix should be low-rank, which enforces different tasks to share the same low-dimensional feature transformation.…”
Section: Related Work
confidence: 99%
“…The principle of MTL is to leverage relationship assumptions among tasks through the model design, e.g., commonalities across tasks. Some well-known MTL design categories are feature selection [10,12,26], where tasks share a feature-wise sparsity structure; task structure [8,11,16,17,37,38], where model parameters of different tasks share common structures; low-rank structure of the tasks' model parameters [3,13,14] in linear models [35]; and parameter sharing [7,18,32] and information sharing [20,22,23] in neural network models [24]. Each of the above designs corresponds to an assumption about the task relationship.…”
confidence: 99%
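The low-rank category mentioned in the statement above (tasks sharing a low-dimensional feature transformation via a low-rank weight matrix) can be illustrated with a minimal sketch. This is a generic proximal-gradient solver for trace-norm-regularized multi-task regression, written for illustration only; it is not the algorithm of any specific cited paper, and all function names here are my own.

```python
import numpy as np

def prox_trace_norm(W, tau):
    # Singular-value soft-thresholding: proximal operator of tau * ||W||_*.
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def lowrank_mtl(Xs, ys, lam=0.1, lr=0.01, iters=500):
    """Multi-task least squares with a trace-norm penalty on the
    d x T weight matrix W (one column per task), encouraging the
    tasks to share a low-dimensional feature transformation."""
    d, T = Xs[0].shape[1], len(Xs)
    W = np.zeros((d, T))
    for _ in range(iters):
        G = np.zeros_like(W)
        for t in range(T):
            # Per-task least-squares gradient.
            G[:, t] = Xs[t].T @ (Xs[t] @ W[:, t] - ys[t]) / len(ys[t])
        # Gradient step followed by the trace-norm proximal step.
        W = prox_trace_norm(W - lr * G, lr * lam)
    return W
```

When all tasks are generated from the same underlying weight vector, the recovered W is driven toward rank one, which is exactly the shared-subspace effect the quoted statement describes.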
“…Zhou et al (Zhou, Chen, and Ye 2011a) use a similar idea for task clustering, but with a different optimization method. Zhou and Zhao (Zhou and Zhao 2016) propose to cluster tasks by identifying representative tasks. Another way of performing task clustering is through the decomposition of the weight matrix W (Kumar and Daume III 2012; Barzilai and Crammer 2015).…”
Section: Related Work
confidence: 99%
“…Recently, Zhou and Zhao (2016) propose flexible clustered multi-task (FCMTL), an improved version of clustered multi-task learning (CMTL). In order to explore inter-target correlation, also based on the cluster assumption, the cluster structure in FCMTL is learned by identifying representative tasks.…”
Section: Related Work
confidence: 99%
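The statements above describe FCMTL's core idea: each task cluster is summarized by an actual representative task rather than an abstract centroid. As a loose illustration of that idea (not the paper's optimization formulation, which is only partially visible in the abstract here), the sketch below selects representative tasks from a weight matrix with plain k-medoids; the function name and all details are assumptions for illustration.

```python
import numpy as np

def representative_tasks(W, k, iters=50):
    """Pick k representative task indices from a d x T weight matrix W
    (one column per task) via k-medoids, so every cluster is summarized
    by a real task. Illustrative stand-in, not the FCMTL algorithm."""
    T = W.shape[1]
    # Pairwise Euclidean distances between per-task weight vectors.
    D = np.linalg.norm(W[:, :, None] - W[:, None, :], axis=0)
    # Deterministic farthest-point initialization of the representatives.
    medoids = [0]
    while len(medoids) < k:
        medoids.append(int(np.argmax(D[medoids].min(axis=0))))
    medoids = np.array(medoids)
    for _ in range(iters):
        # Assign every task to its nearest representative task.
        labels = medoids[np.argmin(D[medoids], axis=0)]
        new_medoids = []
        for m in medoids:
            members = np.flatnonzero(labels == m)
            # New representative: the member task minimizing total
            # distance to the rest of its cluster.
            new_medoids.append(
                members[np.argmin(D[np.ix_(members, members)].sum(axis=1))])
        new_medoids = np.array(new_medoids)
        if set(new_medoids.tolist()) == set(medoids.tolist()):
            break
        medoids = new_medoids
    labels = medoids[np.argmin(D[medoids], axis=0)]
    return np.sort(medoids), labels
```

Returning task indices (rather than averaged centroids) mirrors the "representative task" framing: each cluster's model is anchored to a task that actually exists in the training data.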