2003
DOI: 10.1007/s10044-003-0192-z

New Applications of Ensembles of Classifiers

Cited by 252 publications (142 citation statements)
References 35 publications
“…The MCSs have been integrated by using four well-known resampling methods: random selection with no replacement [1], bagging [2], boosting [8], and Arc-x4 [3]. Only the result of the best technique on each database has been presented in Table 1.…”
Section: Experiments and Results (mentioning)
Confidence: 99%
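
The resampling schemes named in this excerpt all build a multiple classifier system by training each base learner on a differently sampled version of the training set. Below is a minimal sketch of one of them, bagging, assuming scikit-learn, a decision-tree base learner, and a synthetic dataset; these choices are illustrative and not the cited study's setup.

```python
# Minimal bagging sketch (illustrative assumptions: scikit-learn,
# decision trees, synthetic data; not the cited study's configuration).
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# Bagging: each tree is fit on a bootstrap replicate of the training set,
# and the ensemble predicts by majority vote over the trees.
ensemble = BaggingClassifier(DecisionTreeClassifier(), n_estimators=10,
                             bootstrap=True, random_state=0)
ensemble.fit(X_tr, y_tr)
print("bagging test accuracy:", ensemble.score(X_te, y_te))
```

Boosting and Arc-x4 differ from this sketch in that the sampling (or weighting) of the training set for each new learner depends on the errors of the previously trained learners, rather than being drawn independently.
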
“…Previous works demonstrated that a MES trained on balanced subsets of the original TS better recognises minority class patterns than IC [10,11,12], as discussed in section 2. Based on these observations, the proposed method works as follows.…”
Section: Methods (mentioning)
Confidence: 99%
“…The last approach coping with imbalanced TS is based on multi-experts system, also known as ensemble of classifiers, where each composing classifier is trained on a subset of the majority class and on the whole minority class [10,11,12]. The idea is based on the widely accepted result that a MES approach generally produces better results than those obtained by individual composing experts, since different features and recognition systems complement each other in classification performance.…”
Section: Techniques For Handling Imbalanced (mentioning)
Confidence: 99%
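
This excerpt describes the ensemble construction concretely enough to illustrate: every expert sees the whole minority class plus an equally sized random subset of the majority class, and the experts vote on test patterns. The sketch below assumes a synthetic imbalanced dataset, decision-tree experts, and a committee of five; all of these are illustrative assumptions, not the referenced works' exact setup.

```python
# Minimal sketch of a multi-experts system for an imbalanced training set:
# each expert is trained on all minority patterns plus a balanced sample of
# the majority class, and the experts are combined by majority vote.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, weights=[0.9, 0.1], random_state=0)
minority = np.where(y == 1)[0]
majority = np.where(y == 0)[0]

rng = np.random.default_rng(0)
experts = []
for _ in range(5):
    # Balanced training subset: whole minority class + a majority-class sample.
    maj_sample = rng.choice(majority, size=len(minority), replace=False)
    idx = np.concatenate([minority, maj_sample])
    experts.append(DecisionTreeClassifier(random_state=0).fit(X[idx], y[idx]))

# Combine the experts by majority vote.
votes = np.stack([clf.predict(X) for clf in experts])
y_pred = (votes.mean(axis=0) >= 0.5).astype(int)
print("minority-class recall:", (y_pred[y == 1] == 1).mean())
```
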