2014
DOI: 10.1016/j.patcog.2014.06.008

Probability estimation for multi-class classification using AdaBoost

Cited by 11 publications (3 citation statements)
References 15 publications
“…The optimised 2LP is a potential base unit for deep learning, like for boosting [11], [16], [20], [43], [44]. The numerical optimisation approach is universal, and it admits any distortion types or types of objects.…”
Section: Discussion
confidence: 99%
“…Models that work with problems having two classes are called binary classifiers [54]. A natural way to extend these models to multi-class classification problems, which have many different classes, is to combine multiple binary classifiers with techniques such as one-vs-one [55]. In one-vs-one, a binary classifier is built for each pair of classes.…”
Section: Methods
confidence: 99%
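The one-vs-one decomposition described in the statement above can be illustrated with a minimal sketch. The library (scikit-learn), the iris dataset, and the logistic-regression base classifier are illustrative assumptions, not choices taken from the cited works.

from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.multiclass import OneVsOneClassifier

# 3-class iris data as a stand-in multi-class problem.
X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# One binary classifier per pair of classes: k*(k-1)/2 models for k classes;
# the final label is decided by voting among the pairwise classifiers.
ovo = OneVsOneClassifier(LogisticRegression(max_iter=1000))
ovo.fit(X_tr, y_tr)

print(len(ovo.estimators_))   # 3 pairwise models for the 3 iris classes
print(ovo.score(X_te, y_te))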
“…Recently, AdaBoost has been successfully applied in many fields of study, including cost-sensitive classification, semi-supervised learning, tracking, and network intrusion detection [45,46]. Assaad et al. [47] predicted future values of time series using neural networks as base learners with the AdaBoost ensemble method.…”
Section: Ensemble Models
confidence: 99%
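As a concrete reference point for the AdaBoost usage mentioned above, the following is a minimal sketch of boosting for multi-class probability estimation, the topic of the cited paper. scikit-learn's AdaBoostClassifier with its default shallow decision-tree base learner is an assumption for illustration; it is not the setup of the citing works, which boost other base learners such as neural networks.

from sklearn.datasets import load_digits
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

# 10-class digit classification as a stand-in multi-class problem.
X, y = load_digits(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# AdaBoost over its default shallow decision-tree base learner.
clf = AdaBoostClassifier(n_estimators=200, random_state=0)
clf.fit(X_tr, y_tr)

# Per-class probability estimates, the quantity the cited paper studies.
proba = clf.predict_proba(X_te)
print(proba.shape)            # (n_test_samples, 10)
print(clf.score(X_te, y_te))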