2010
DOI: 10.1007/s10994-010-5231-6
Boosted multi-task learning

Abstract: In this paper we propose a novel algorithm for multi-task learning with boosted decision trees. We learn several different learning tasks with a joint model, explicitly addressing their commonalities through shared parameters and their differences with task-specific ones. This enables implicit data sharing and regularization. Our algorithm is derived using the relationship between ℓ1-regularization and boosting. We evaluate our learning method on web-search ranking data sets from several countries. Here, multi-…
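The abstract describes a joint model in which each round of boosting either adds a shared component (used by all tasks) or task-specific components. A minimal, hypothetical sketch of that idea with regression stumps follows; it is illustrative only and is not the authors' exact algorithm (the function names, the squared-error greedy criterion, and the per-round shared-vs-private choice are all assumptions made for this example):

```python
import numpy as np

def fit_stump(x, r):
    """Fit a 1-D regression stump to residuals r: pick the threshold
    minimizing squared error; return (threshold, left_value, right_value)."""
    best = None
    for thr in np.unique(x):
        left, right = r[x <= thr], r[x > thr]
        if len(left) == 0 or len(right) == 0:
            continue
        lv, rv = left.mean(), right.mean()
        err = ((left - lv) ** 2).sum() + ((right - rv) ** 2).sum()
        if best is None or err < best[0]:
            best = (err, thr, lv, rv)
    return best[1:]

def stump_predict(stump, x):
    thr, lv, rv = stump
    return np.where(x <= thr, lv, rv)

def boosted_mtl(tasks, rounds=50, lr=0.1):
    """tasks: list of (x, y) arrays, one pair per task.
    Each round fits (a) one shared stump on all tasks' residuals pooled
    and (b) one private stump per task, then keeps whichever update
    lowers total squared error more -- implicit data sharing via (a)."""
    preds = [np.zeros_like(y) for _, y in tasks]
    shared, private = [], [[] for _ in tasks]
    for _ in range(rounds):
        res = [y - p for (_, y), p in zip(tasks, preds)]
        # shared candidate: fit on the pooled residuals of every task
        x_all = np.concatenate([x for x, _ in tasks])
        s = fit_stump(x_all, np.concatenate(res))
        err_shared = sum(((r - lr * stump_predict(s, x)) ** 2).sum()
                         for (x, _), r in zip(tasks, res))
        # private candidates: one stump per task on its own residuals
        ps = [fit_stump(x, r) for (x, _), r in zip(tasks, res)]
        err_private = sum(((r - lr * stump_predict(p, x)) ** 2).sum()
                          for (x, _), r, p in zip(tasks, res, ps))
        if err_shared <= err_private:
            shared.append(s)
            for i, (x, _) in enumerate(tasks):
                preds[i] += lr * stump_predict(s, x)
        else:
            for i, ((x, _), p) in enumerate(zip(tasks, ps)):
                private[i].append(p)
                preds[i] += lr * stump_predict(p, x)
    return shared, private, preds
```

When the tasks mostly agree, the shared candidate wins most rounds, so data from every task effectively trains the common component; the private lists pick up only the task-specific residual structure.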

Cited by 52 publications (43 citation statements)
References 26 publications
“…We will show that our approach consistently outperforms recent multi-task learning techniques [8], [18], [11] across this wide range of applications. Our approach was first introduced in a conference paper [19].…”
Section: Introduction (mentioning)
Confidence: 77%
“…Current approaches to Domain Adaptation, and more generally Transfer or Multi-Task Learning [5], [6], [7], [8], treat classification in each domain as separate but related problems and exploit their relationship to learn from the supervised data available across all of them. Multi-task learning methods typically assume that the decision boundaries in each domain can be decomposed into a private and a shared term in a common feature space X, as illustrated by Fig.…”
Section: Introduction (mentioning)
Confidence: 99%