1992
DOI: 10.1109/21.155943
Methods of combining multiple classifiers and their applications to handwriting recognition

Abstract: The method of combining the classification powers of several classifiers is regarded as a general problem in various application areas of pattern recognition, and a systematic investigation has been made. Possible solutions to the problem can be divided into three categories according to the levels of information available from the various classifiers. Four approaches are proposed based on different methodologies for solving this problem. One is suitable for combining individual classifiers such as Bayesian, k-NN …

Cited by 1,838 publications (1,028 citation statements)
References 20 publications
“…Let θ j be an l × l matrix. Each element θ j (t′, t) describes the probability that rater j labels a voxel with t ′ when the true label is t. This matrix is similar to the normalized confusion matrix of a Bayesian classifier (Xu et al, 1992), and we will use this terminology for the remainder of the paper. Let θ= [θ 1 , …, θ r ] be the unknown set of all confusion matrices characterizing all r raters.…”
Section: Multi-label STAPLE Algorithm
confidence: 99%
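The per-rater confusion matrix described in the quote above can be sketched in a few lines. This is a hypothetical illustration, not code from the cited papers: `confusion_matrix`, the toy label lists, and the column-wise normalization convention (element `theta[t_prime, t]` estimating P(rater says t′ | true label is t)) are all assumptions made here for clarity.

```python
import numpy as np

def confusion_matrix(true_labels, rater_labels, n_labels):
    """Estimate a rater's normalized confusion matrix theta_j from
    paired (true label, rater label) observations."""
    theta = np.zeros((n_labels, n_labels))
    for t, t_prime in zip(true_labels, rater_labels):
        theta[t_prime, t] += 1  # count: rater said t_prime when truth was t
    col_sums = theta.sum(axis=0, keepdims=True)
    # normalize each true-label column so it forms a probability distribution
    return theta / np.maximum(col_sums, 1)

# toy data: one rater labeling six voxels with 3 possible labels
true_l  = [0, 0, 1, 1, 1, 2]
rater_l = [0, 1, 1, 1, 2, 2]
theta = confusion_matrix(true_l, rater_l, 3)
```

Each column of `theta` sums to one, matching the "normalized confusion matrix" terminology used in the quote; the set θ = [θ₁, …, θᵣ] would simply stack one such matrix per rater.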
“…This approach can be refined by assigning different weights to each classifier to optimize the performance of the combined classifier on the training set [86]; or, assuming mutual independence between classifiers, a Bayesian decision rule selects the class with the highest posterior probability, computed from the estimated class-conditional probabilities via Bayes' formula [130,122]. A Bayesian approach has also been used in consensus-based classification of multisource remote sensing data [10,9,19], outperforming conventional multivariate methods for classification.…”
Section: Non-generative Ensembles
confidence: 99%
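The independence-based Bayesian rule mentioned in the quote can be sketched as follows. This is a minimal sketch under stated assumptions, not the cited implementations: `bayes_combine`, the uniform prior, and the toy posteriors are hypothetical, and each classifier's posterior is converted back to a likelihood ratio before the product so the prior is counted only once.

```python
import numpy as np

def bayes_combine(posteriors, prior):
    """Fuse per-classifier posterior estimates P(c | x) assuming the
    classifiers are mutually independent given the class."""
    posteriors = np.asarray(posteriors, dtype=float)  # (n_classifiers, n_classes)
    prior = np.asarray(prior, dtype=float)
    # P(c|x) / P(c) is proportional to the class-conditional likelihood;
    # multiply those across classifiers, then reweight by the prior once
    combined = prior * np.prod(posteriors / prior, axis=0)
    return combined / combined.sum()  # renormalize to a distribution

p1 = [0.7, 0.2, 0.1]          # classifier 1's posterior over 3 classes
p2 = [0.5, 0.4, 0.1]          # classifier 2's posterior
prior = [1/3, 1/3, 1/3]
fused = bayes_combine([p1, p2], prior)
pred = int(np.argmax(fused))   # class with highest combined posterior
```

With both classifiers favoring class 0, the product rule sharpens that preference, which is the behavior the Bayesian decision rule in the quote relies on.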
“…Feature subspace methods, in which the set of features is partitioned and each subset is used by one classifier in the team, are proposed in [130,99,18]. Other methods for combining different feature sets using genetic algorithms are proposed in [81,79].…”
Section: Generative Ensembles
confidence: 99%
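A feature subspace ensemble of the kind the quote describes can be sketched quickly. This is an illustrative toy, not any of the cited algorithms: the nearest-mean base classifier, the three-way disjoint feature split, the synthetic Gaussian data, and the majority-vote combination are all assumptions made here.

```python
import numpy as np

rng = np.random.default_rng(0)
# synthetic data: 3 well-separated classes, 12 features
X = rng.normal(size=(60, 12))
y = np.repeat([0, 1, 2], 20)
X[y == 1] += 2.0
X[y == 2] -= 2.0

# partition the feature set: each disjoint subset feeds one team member
subsets = np.array_split(np.arange(X.shape[1]), 3)

def nearest_mean_predict(X_train, y_train, X_query):
    """Simple base classifier: assign each query to the nearest class mean."""
    classes = np.unique(y_train)
    means = np.stack([X_train[y_train == c].mean(axis=0) for c in classes])
    dists = ((X_query[:, None, :] - means[None, :, :]) ** 2).sum(axis=2)
    return dists.argmin(axis=1)

# one classifier per feature subset, combined by majority vote
votes = np.stack([nearest_mean_predict(X[:, s], y, X[:, s]) for s in subsets])
pred = np.array([np.bincount(col).argmax() for col in votes.T])
accuracy = (pred == y).mean()
```

Each team member sees only 4 of the 12 features, yet the vote over the team recovers the labels well on this separable toy problem, which is the intuition behind partition-based feature subspace ensembles.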
“…A large number of strategies have been proposed for this purpose, see for example [5][6][7][8][9]. Since the aggregation problem also occurs in all other decomposition methods and in ensemble methods, these research areas as well provide a large number of aggregation strategies (sometimes called classifier combination schemes); see for example [10] and references therein. However, since the semantics of these problems are different, we note that the aggregation strategies from different fields are not always interchangeable.…”
Section: Introduction
confidence: 99%
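One of the classical abstract-level aggregation strategies surveyed in work of this kind is plurality voting with a reject option. The sketch below is a hedged illustration, not the definition from any cited paper: the function name `vote` and the threshold `alpha` are assumptions, though the reject-if-no-clear-winner idea follows the combination schemes discussed in this literature.

```python
from collections import Counter

def vote(labels, alpha=0.5):
    """Plurality vote over classifier outputs; reject (return None)
    unless the winning label exceeds the support threshold alpha."""
    counts = Counter(labels)
    label, n = counts.most_common(1)[0]
    return label if n / len(labels) > alpha else None

vote([1, 1, 2, 1, 3])  # label 1 wins with 3/5 support
vote([1, 2, 3, 4])     # no label exceeds the threshold: rejected
```

The reject option matters because different decomposition and ensemble settings give the same vote pattern different semantics, which is precisely why the quote cautions that aggregation strategies from different fields are not always interchangeable.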