2020
DOI: 10.1109/access.2020.2980949

Unsupervised Fuzzy Measure Learning for Classifier Ensembles From Coalitions Performance

Abstract: In machine learning, an ensemble refers to the combination of several classifiers with the objective of improving the performance of every one of its counterparts. To design an ensemble, two main aspects must be considered: how to create a diverse set of classifiers and how to combine their outputs. This work focuses on the latter task. More specifically, we focus on the usage of aggregation functions based on fuzzy measures, such as the Sugeno and Choquet integrals, since they allow modeling the coalitions and …
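For intuition, the discrete Choquet integral the abstract refers to aggregates the sorted classifier scores, weighting each increment by the fuzzy measure of the coalition of classifiers that reach that score. Below is a minimal Python sketch under that standard definition; the function name and the dict-based representation of the measure are illustrative choices, not taken from the paper.

```python
import numpy as np

def choquet_integral(scores, mu):
    """Discrete Choquet integral of classifier scores w.r.t. a fuzzy measure.

    scores: 1-D array of n classifier outputs for one class, each in [0, 1].
    mu: dict mapping frozensets of classifier indices to measure values,
        with mu[frozenset()] == 0 and mu of the full index set == 1.
    """
    order = np.argsort(scores)  # indices sorted by ascending score
    result, prev = 0.0, 0.0
    for rank, idx in enumerate(order):
        # Coalition of classifiers whose score is at least scores[idx].
        coalition = frozenset(order[rank:].tolist())
        result += (scores[idx] - prev) * mu[coalition]
        prev = scores[idx]
    return result

# Two classifiers: mu encodes how much each coalition is trusted.
mu = {frozenset(): 0.0, frozenset({0}): 0.4,
      frozenset({1}): 0.5, frozenset({0, 1}): 1.0}
print(choquet_integral(np.array([0.8, 0.3]), mu))  # 0.3*1.0 + 0.5*0.4 = 0.5
```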

Cited by 9 publications (5 citation statements)
References 58 publications (74 reference statements)
“…Papers [20]-[22] applied the majority voting method to ensemble learning in different applications, and the results of those studies indicated that majority voting performs well. In recent years, fuzzy theory has been used to deal with uncertain data, and interval-valued aggregation functions based on fuzzy theory have been proposed for ensemble learning [23]. Papers [24]-[27] proposed interval-valued aggregation functions to capture the uncertainty of data and applied them to ensemble learning.…”
Section: Related Work
confidence: 99%
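As a generic illustration of the majority voting rule discussed in [20]-[22] (this sketch is not from any of the cited papers):

```python
import numpy as np

def majority_vote(predictions):
    """Combine hard labels from several classifiers by majority vote.

    predictions: (n_classifiers, n_samples) integer array of class labels.
    Ties are broken in favour of the smallest label, as argmax does.
    """
    n_classes = predictions.max() + 1
    # votes[c, s] = number of classifiers that assigned class c to sample s.
    votes = np.apply_along_axis(
        lambda col: np.bincount(col, minlength=n_classes), 0, predictions)
    return votes.argmax(axis=0)

preds = np.array([[0, 1, 2],
                  [0, 1, 1],
                  [1, 1, 2]])
print(majority_vote(preds))  # [0 1 2]
```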
“…The combination is usually performed by classical aggregation functions, such as the arithmetic mean. However, it has been proven that the use of more sophisticated aggregation functions, such as weighted means, OWA operators or fuzzy integrals, can lead to an improvement of the classification accuracy of the ensemble [39,40,41]. In the case of fuzzy integrals (Choquet or Sugeno), the underlying fuzzy measure is able to capture positive and negative interaction among the classifiers of the ensemble.…”
Section: Motivation
confidence: 99%
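To make the contrast with the arithmetic mean concrete, here is a generic OWA (Ordered Weighted Averaging) operator, whose weights attach to ranks rather than to particular classifiers; the weights below are only an example.

```python
import numpy as np

def owa(scores, weights):
    """Ordered Weighted Averaging of classifier scores.

    scores: 1-D array of classifier outputs for one class.
    weights: non-negative weights of the same length, summing to 1.
    """
    # Weights are applied to the scores sorted in descending order.
    return float(np.sort(scores)[::-1] @ np.asarray(weights))

# Weights (1, 0, 0) give the maximum; uniform weights give the mean.
print(owa([0.2, 0.9, 0.5], [0.5, 0.3, 0.2]))  # 0.9*0.5 + 0.5*0.3 + 0.2*0.2
```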
“…Although in this paper the proposed interval-valued Choquet integral is given with respect to an interval-valued fuzzy measure, for the sake of simplicity, we have considered a classical fuzzy measure whose codomain is [0, 1] instead of L([0, 1]). In this section, the estimation of the fuzzy measure will be done using the CPM construction method [39], which is specifically designed for classifier aggregation using the Choquet integral. This method estimates each coefficient of the fuzzy measure by considering the accuracy of each possible coalition of classifiers.…”
Section: Interval-valued Choquet Integral For Aggregating Classifiers
confidence: 99%
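The general idea of estimating a fuzzy measure from coalition performance can be sketched as follows. This is only an illustration in the spirit of the CPM method described above: averaging the members' scores and the final normalisation are assumptions, not a faithful reproduction of [39], and a real construction would also enforce monotonicity of the measure.

```python
from itertools import combinations
import numpy as np

def coalition_measure(probs, labels):
    """Fuzzy measure built from the accuracy of every classifier coalition.

    probs: (n_classifiers, n_samples, n_classes) array of soft outputs.
    labels: (n_samples,) array of true class labels.
    Returns a dict mapping frozensets of classifier indices to [0, 1].
    """
    n = probs.shape[0]
    mu = {frozenset(): 0.0}
    for size in range(1, n + 1):
        for coalition in combinations(range(n), size):
            # Accuracy of the coalition when its members' scores are averaged
            # (assumption: [39] may combine member outputs differently).
            avg = probs[list(coalition)].mean(axis=0)
            mu[frozenset(coalition)] = float(
                (avg.argmax(axis=1) == labels).mean())
    # Rescale so the grand coalition has measure 1 (assumption); clipping
    # keeps sub-coalitions that outperform the full ensemble inside [0, 1].
    top = mu[frozenset(range(n))]
    if top > 0:
        mu = {s: min(v / top, 1.0) for s, v in mu.items()}
    return mu
```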
“…In the literature, different ensemble classifiers based on fuzzy integrals have been presented, each proposing a different strategy to construct the fuzzy measure. As a review of the different approaches to constructing fuzzy measures in ensemble learning is out of the scope of this paper, readers are referred to papers [17] [18] for more information.…”
Section: Related Work
confidence: 99%
“…Bagging is a resampling strategy proposed by Breiman in 1996 [2] with the aim of improving the accuracy and stability of machine learning algorithms in classification and regression problems. It is a simple yet effective way to construct ensembles, helping to reduce variance and avoid overfitting in classification [18]. In this approach, to generate diversity among the classifiers, a new data set is constructed for each classifier.…”
Section: A Bagging (Bootstrap Aggregation)
confidence: 99%
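A minimal sketch of the bagging procedure just described, using a scikit-learn estimator; the base classifier and helper name are arbitrary choices for illustration.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def fit_bagging(X, y, n_estimators=10, seed=0):
    """Train an ensemble of trees on bootstrap resamples of (X, y)."""
    rng = np.random.default_rng(seed)
    models = []
    for _ in range(n_estimators):
        # Sample len(X) points with replacement: each resample leaves out
        # roughly 36.8% of the original data, which creates diversity.
        idx = rng.integers(0, len(X), size=len(X))
        models.append(DecisionTreeClassifier().fit(X[idx], y[idx]))
    return models
```

The resulting models' outputs can then be combined by majority voting or by a fuzzy-integral-based aggregation such as the Choquet integral sketched earlier.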