ICASSP 2020 - 2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
DOI: 10.1109/icassp40776.2020.9053557

Large Dimensional Asymptotics of Multi-Task Learning

Abstract: Multi-Task Learning (MTL) efficiently leverages useful information contained in multiple related tasks to help improve the generalization performance of all tasks. This article conducts a large-dimensional analysis of a simple but, when carefully tuned, extremely powerful Least Squares Support Vector Machine (LSSVM) version of MTL, in the regime where the dimension p of the data and their number n grow large at the same rate. Under mild assumptions on the input data, the theoretical analysis of…
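The abstract refers to an LSSVM-based classifier without spelling it out on this page. As a rough illustration of the classifier family being analyzed (not the paper's multi-task formulation), here is a minimal single-task, linear-kernel least-squares SVM sketch; the function names and the regularization parameter `gamma` are our own choices for illustration:

```python
import numpy as np

def lssvm_fit(X, y, gamma=1.0):
    """Linear-kernel LS-SVM training.

    Solves the standard LS-SVM KKT linear system
        [ 0    1^T          ] [b]     [0]
        [ 1    K + I/gamma  ] [alpha] [y]
    where K = X X^T is the linear kernel Gram matrix.
    """
    n = X.shape[0]
    K = X @ X.T
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    b, alpha = sol[0], sol[1:]
    return alpha, b

def lssvm_predict(X_train, alpha, b, X_test):
    # Decision function f(x) = sum_i alpha_i <x_i, x> + b, thresholded at 0.
    return np.sign(X_test @ X_train.T @ alpha + b)
```

In the large-dimensional regime the article studies, p and n grow at the same rate, and the behavior of this kind of classifier is then characterized through random-matrix analysis of the resolvent of K; the sketch above only shows the finite-sample training step.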

Cited by 1 publication (2 citation statements), published in 2023.
References 37 publications (45 reference statements).
“…Specifically, the article provides a theoretical analysis generalizing [4,5] to a data model with different covariance structures for each data class. As in [4,6], our analysis reveals the importance of optimizing the training data labels in order to combat negative transfer.…”

Section: Introduction
Confidence: 94%
“…LS-SVM classification performance in high dimensions heavily depends on the nature of the kernel function. This feature becomes adaptable to the given problem, as explored in [9] and [6] for a vanishing difference in means across classes.…”

Section: Assumptions and Non-Trivial Regime
Confidence: 99%