2003
DOI: 10.1007/3-540-45105-6_117

Evolutionary Multiobjective Optimization for Generating an Ensemble of Fuzzy Rule-Based Classifiers

Abstract: One advantage of evolutionary multiobjective optimization (EMO) algorithms over classical approaches is that many non-dominated solutions can be obtained simultaneously in a single run. In this paper, we propose an idea of using EMO algorithms for constructing an ensemble of fuzzy rule-based classifiers with high diversity. The classification of new patterns is performed based on the vote of multiple classifiers generated by a single run of an EMO algorithm. Even when the classification performance…
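The voting scheme the abstract describes can be sketched minimally: each classifier taken from the non-dominated set of a single EMO run votes on a new pattern, and the majority class wins. This is a hypothetical illustration, not the authors' implementation; the classifier stubs and function names here are assumptions for the sketch.

```python
from collections import Counter

def ensemble_vote(classifiers, pattern):
    """Classify a pattern by majority vote over an ensemble.

    `classifiers` is any list of callables mapping a pattern to a
    class label (e.g. fuzzy rule-based classifiers selected from the
    non-dominated solutions of one EMO run).
    """
    votes = Counter(clf(pattern) for clf in classifiers)
    label, _count = votes.most_common(1)[0]
    return label

# Toy usage: three stub "classifiers" voting on a pattern.
clfs = [lambda x: 0, lambda x: 1, lambda x: 1]
print(ensemble_vote(clfs, [0.5, 0.2]))  # majority class: 1
```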

Cited by 26 publications (7 citation statements)
References 22 publications
“…Table 5 demonstrates the average accuracy values of the classifiers found on test data. It also compares the performance of our method with the well-known approach, Ishibuchi et al.'s method [17]. From Table 5, we can see that the performance of our classifiers was much better than the results of the other algorithm.…”
Section: Results
confidence: 65%
“…Pareto-based generation of ensembles for radial basis function networks [60] and fuzzy rule systems [61] has also been reported.…”
Section: Diverse Ensemble Generation
confidence: 99%
“…This research was continued, for instance, by Perrone [20] and Fillipi et al. [21]. Using an ensemble (mixture of predictors) usually results in an improvement in the prediction accuracy [22]; this is especially true if the individual predictors are unstable, that is, different predictors are obtained in each instance of training. Several general and constructive methods have been proposed to create these ensembles: boosting, originally proposed by Schapire [23], which creates each classifier taking into account the errors of the previous one, and bagging, proposed by Breiman et al. [24], which uses different partitions of the original training set to train each predictor in the ensemble.…”
Section: Introduction
confidence: 97%
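The bagging procedure described in the quoted passage (each predictor trained on a bootstrap resample of the original training set, predictions aggregated by vote) can be sketched as follows. This is a minimal illustration under assumed names (`bagging_fit`, `fit_one`), not Breiman's original implementation.

```python
import random
from collections import Counter

def bagging_fit(train, fit_one, n_predictors=10, seed=0):
    """Train an ensemble: each predictor is fit on a bootstrap sample
    (drawn with replacement, same size as the original training set)."""
    rng = random.Random(seed)
    ensemble = []
    for _ in range(n_predictors):
        sample = [rng.choice(train) for _ in train]
        ensemble.append(fit_one(sample))
    return ensemble

def bagging_predict(ensemble, x):
    """Aggregate the individual predictions by majority vote."""
    return Counter(p(x) for p in ensemble).most_common(1)[0][0]

# Toy example: the "learner" returns a constant classifier that
# predicts the majority label of its bootstrap sample.
def fit_one(sample):
    label = Counter(y for _, y in sample).most_common(1)[0][0]
    return lambda x: label

train = [([0.1], 0), ([0.2], 0), ([0.9], 1)]
ens = bagging_fit(train, fit_one, n_predictors=5)
print(bagging_predict(ens, [0.5]))
```

Bagging helps precisely for the unstable predictors the passage mentions: because each bootstrap sample differs, the individual predictors disagree, and the vote averages out their individual errors.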