Boosting and margin theory
2012
DOI: 10.1007/s11460-012-0188-9

Cited by 7 publications (2 citation statements) | References 9 publications
“…But as the number of iterations increases, the efficiency of the algorithm drops sharply. Because the adjustment to the sample weights is too small, researchers have proposed different methods to address this problem [15,16]. In this paper a new weight-adjusting method is proposed.…”
Section: AdaBoost.M1 Algorithm and Its Improvement
confidence: 99%
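The statement above concerns the AdaBoost.M1 reweighting step, in which correctly classified samples are scaled by beta = error / (1 - error) and weights are renormalized. A minimal sketch of that step (plain Python; the function name is a hypothetical label, not from the cited papers) illustrates why the adjustment becomes very small when the base learner's error is close to 0.5:

```python
def adaboost_m1_reweight(weights, correct, error):
    """One AdaBoost.M1 reweighting step (sketch).

    Correctly classified samples are multiplied by beta = error / (1 - error),
    then all weights are renormalized, which relatively up-weights mistakes.
    """
    beta = error / (1.0 - error)
    new_w = [w * (beta if ok else 1.0) for w, ok in zip(weights, correct)]
    z = sum(new_w)
    return [w / z for w in new_w]

# With a strong base learner (error = 0.25) the misclassified sample's
# weight doubles; with a learner barely better than chance (error = 0.49),
# beta is near 1 and the weights barely move -- the "adjustment ... too
# small" issue the citing paper addresses.
w = [0.25, 0.25, 0.25, 0.25]
strong = adaboost_m1_reweight(w, [True, True, True, False], error=0.25)
weak = adaboost_m1_reweight(w, [True, True, True, False], error=0.49)
```

Here the misclassified sample's weight goes from 0.25 to 0.5 when error = 0.25, but rises only to about 0.26 when error = 0.49.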
“…Since local accuracy is a key feature of the DES method, many algorithms use k-nearest neighbors as a framework [19, 26, 27]. Other methods for generating different homogeneous classifiers have been proposed, including random subspace [28], bagging [29], boosting [30], and clustering [31].…”
Section: Introduction
confidence: 99%
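The "local accuracy" criterion mentioned in this statement can be illustrated with a small sketch: a dynamic-selection step that, for each query point, picks the base classifier with the best accuracy on its k nearest validation samples. This is a toy 1-D illustration with hypothetical names, not an implementation of the cited DES algorithms:

```python
def select_by_local_accuracy(classifiers, X_val, y_val, x, k=3):
    """Pick the classifier most accurate on the k validation points nearest x."""
    # 1-D Euclidean neighborhood for simplicity; real DES methods use a
    # full k-NN search over feature vectors.
    nearest = sorted(range(len(X_val)), key=lambda i: abs(X_val[i] - x))[:k]

    def local_acc(clf):
        return sum(clf(X_val[i]) == y_val[i] for i in nearest) / k

    return max(classifiers, key=local_acc)

# Two toy threshold classifiers: clf_a matches the validation labels
# everywhere, while clf_b mislabels the point at 1.0, so clf_a wins
# in the neighborhood of x = 1.0.
clf_a = lambda v: int(v > 0.5)
clf_b = lambda v: int(v > 1.5)
X_val, y_val = [0.0, 1.0, 2.0, 3.0], [0, 1, 1, 1]
chosen = select_by_local_accuracy([clf_a, clf_b], X_val, y_val, x=1.0)
```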