A new multi-task learning method with universum data
2020 · DOI: 10.1007/s10489-020-01954-3

Cited by 16 publications (4 citation statements) · References 44 publications
“…Typical methods include precision-based weighted ensemble methods [29] and weight-based update ensemble model methods [30]. In addition, algorithms that incorporate transfer learning [31], multi-task learning [32][33], and deep learning [34][35][36] into ensemble methods have been proposed. These passive adaptive strategies enable models to achieve high classification accuracy.…”
Section: Related Work
Mentioning confidence: 99%
“…Xie et al. [35] presented a novel multi-task twin pinball-loss SVM, which improves the noise insensitivity of the multi-task twin hinge-loss SVM. For more information on MTLSVM, see [36][37][38][39]. Although several attempts have been made to introduce MTL into SVMs, and they outperform STL in most cases, they do not address classifier sensitivity to noise or resampling instability.…”
Section: Introduction
Mentioning confidence: 99%
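The pinball-loss SVM mentioned in the excerpt above replaces the standard hinge loss with the pinball (quantile) loss, which also assigns a small penalty to points classified well beyond the margin; this trades some sparsity for insensitivity to noise near the decision boundary. A minimal sketch of the two losses as functions of the margin m = y·f(x) — this is the generic formulation, not necessarily the cited authors' exact one, and the `tau` parameter is illustrative:

```python
def hinge_loss(m):
    """Standard SVM hinge loss: zero once the margin m = y*f(x) reaches 1."""
    return max(0.0, 1.0 - m)

def pinball_loss(m, tau=0.5):
    """Pinball loss: behaves like hinge for m < 1, but keeps a small
    slope tau on the other side, so points far beyond the margin still
    contribute slightly. tau=0 recovers the hinge loss."""
    u = 1.0 - m
    return u if u >= 0 else -tau * u
```

Because even well-classified points incur a small pinball penalty, the learned boundary depends less on a few samples sitting exactly at the margin, which is where label noise hurts the hinge-loss SVM most.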
“…This model has been experimentally demonstrated to deliver better accuracy than methods that use only labeled data. Furthermore, many Universum models have recently been proposed that improve on their parent models by increasing classification accuracy [25,37,30].…”
Section: Introduction
Mentioning confidence: 99%
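For context on the Universum models discussed above: in Universum learning the training set is augmented with "universum" examples that belong to neither class, and the classifier is additionally penalized whenever its decision value strays far from zero on those points, pushing the boundary through the region they occupy. A minimal sketch of the two per-sample losses in the standard ε-insensitive formulation (function names and the `eps` value are illustrative, not from the cited works):

```python
def labeled_loss(y, fx):
    """Hinge loss on a labeled sample (y in {-1, +1}, fx = f(x)):
    zero once the margin y*f(x) reaches 1."""
    return max(0.0, 1.0 - y * fx)

def universum_loss(fx, eps=0.1):
    """Epsilon-insensitive loss on a universum sample: the decision
    value f(x) should stay inside [-eps, eps], i.e. the universum
    points should lie near the decision boundary."""
    return max(0.0, abs(fx) - eps)
```

The full objective then combines the usual regularizer with both loss sums, e.g. ½‖w‖² + C·Σᵢ labeled_loss(yᵢ, f(xᵢ)) + Cᵤ·Σⱼ universum_loss(f(xⱼ*)), where Cᵤ controls how strongly the universum data shapes the boundary.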