2017
DOI: 10.1016/j.ins.2017.06.039

A distributed approach to multi-objective evolutionary generation of fuzzy rule-based classifiers from big data

Cited by 36 publications (33 citation statements)
References 41 publications
Citation types: 0 supporting, 33 mentioning, 0 contrasting
“…Different strategies have been applied to obtain human-readable fuzzy models in Big Data classification problems, including fuzzy versions of decision trees (FDTs) [17], [22], subgroup discovery (SD) [20], associative classifiers (FACs) [13], [21], emerging patterns mining (EPM) [18], and rule-based classifiers (FRBCs) [14]-[17], [19]. In [17], a distributed version of C4.5 is used to extract a candidate rule base that is then optimized by an evolutionary algorithm. Segatori et al. proposed a distributed FDT that exploits the classical Decision Tree implementation in Spark MLlib, extending the learning scheme by employing a fuzzy information gain based on fuzzy entropy [22].…”
Section: Related Work (mentioning)
confidence: 99%
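The excerpt above mentions a fuzzy information gain based on fuzzy entropy without spelling it out. The snippet below is a rough, non-authoritative sketch in which class frequencies are weighted by fuzzy membership degrees and the gain compares parent and child fuzzy entropies; the function names, the triangular memberships, and the toy data are assumptions for illustration, not the implementation from the cited Spark-based FDT [22].

```python
# Hypothetical sketch of fuzzy entropy / fuzzy information gain.
# The membership functions and weighting scheme are assumptions for
# illustration; they are not taken from the cited distributed FDT.
import numpy as np

def fuzzy_entropy(memberships, labels, n_classes):
    """Entropy of a fuzzy node: class frequencies are membership-weighted."""
    total = memberships.sum()
    if total == 0.0:
        return 0.0
    probs = np.array([memberships[labels == c].sum() for c in range(n_classes)]) / total
    probs = probs[probs > 0]
    return float(-(probs * np.log2(probs)).sum())

def fuzzy_information_gain(parent_mu, children_mu, labels, n_classes):
    """Gain = parent fuzzy entropy minus cardinality-weighted child entropies."""
    parent_h = fuzzy_entropy(parent_mu, labels, n_classes)
    parent_card = parent_mu.sum()
    weighted = 0.0
    for child_mu in children_mu:      # one membership array per fuzzy set of the split
        card = child_mu.sum()
        if card > 0.0:
            weighted += (card / parent_card) * fuzzy_entropy(child_mu, labels, n_classes)
    return parent_h - weighted

# Toy usage: a strong partition with two triangular fuzzy sets on x in [0, 1].
x = np.array([0.05, 0.2, 0.4, 0.6, 0.8, 0.95])
labels = np.array([0, 0, 0, 1, 1, 1])
low = 1.0 - x                         # membership in "low"
high = x                              # membership in "high" (low + high = 1)
parent = np.ones_like(x)              # root node: full membership for every sample
print(fuzzy_information_gain(parent, [low, high], labels, n_classes=2))
```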
“…This is a very interesting topic to be analyzed in depth, as it comprises several perspectives. On the one hand, whether to carry out an a priori learning of the DB via strong fuzzy partitions that are properly adapted to the density related to each attribute of the dataset [21,22]. On the other hand, whether to carry out a post-processing stage for the tuning of the membership functions [23,24].…”
Section: Lessons Learned and Discussion (mentioning)
confidence: 99%
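One way to read the a priori DB learning mentioned in the excerpt above is a strong fuzzy partition (memberships summing to one at every point) whose cores are anchored on empirical quantiles, so that the fuzzy sets narrow where an attribute's samples are dense. The sketch below is a minimal illustration under that assumption; the quantile placement and triangular shapes are illustrative choices, not the method of [21,22].

```python
# Hypothetical sketch: strong triangular fuzzy partition with cores placed on
# empirical quantiles, so denser regions of the attribute get narrower fuzzy
# sets. Placement strategy and shapes are illustrative assumptions.
import numpy as np

def quantile_strong_partition(values, n_sets=5):
    """Return the triangular cores (break points) of a strong fuzzy partition."""
    qs = np.linspace(0.0, 1.0, n_sets)
    return np.quantile(values, qs)     # first and last cores sit at min and max

def memberships(x, cores):
    """Triangular memberships; adjacent sets overlap so they sum to one."""
    mu = np.zeros(len(cores))
    for i, c in enumerate(cores):
        left = cores[i - 1] if i > 0 else c
        right = cores[i + 1] if i < len(cores) - 1 else c
        if left < x <= c and c > left:
            mu[i] = (x - left) / (c - left)
        elif c <= x < right and right > c:
            mu[i] = (right - x) / (right - c)
        elif x == c:
            mu[i] = 1.0
    return mu

# Toy usage: a skewed attribute gets narrow sets where samples concentrate.
rng = np.random.default_rng(0)
attr = rng.exponential(scale=1.0, size=1000)
cores = quantile_strong_partition(attr, n_sets=5)
print(cores)
print(memberships(attr[0], cores), memberships(attr[0], cores).sum())  # sums to ~1.0
```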
“…Then, many researchers worked in this area and built various types of fuzzy sets and membership functions to illustrate the fuzzy decision tree [17]. A combination of decision trees with random vectors of samples is named a random forest in the work of Breiman. This fuzzy partition is easy to understand and useful in diagnosing potential transformers.…”
Section: Related Work (mentioning)
confidence: 99%
“…[11] Big data is also one of the research branches in the fuzzy decision tree, and many novel methods have been proposed, such as big data-driven smart energy management [12], indexing techniques [13], a semi-automatic fuzzy partition method [14], a modified Gini index fuzzy SLIQ decision tree algorithm [15], the Chi-FRBCS-BigData algorithm [16], and multi-objective evolutionary algorithms [17]. A combination of decision trees with random vectors of samples is named a random forest in the work of Breiman [18]. This paper employs fuzzy logic on the random forest and examines the error rates yielded by randomly selecting features.…”
Section: Related Work (mentioning)
confidence: 99%
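The excerpt above names a modified Gini index for a fuzzy SLIQ decision tree without giving detail. As a rough analogue only, a fuzzy Gini impurity can be obtained by replacing crisp counts with membership-weighted class frequencies, as in the sketch below; the weighting scheme is an assumption and may differ from the cited algorithm [15].

```python
# Hypothetical sketch of a membership-weighted (fuzzy) Gini impurity, in the
# spirit of the "modified Gini index" mentioned in the excerpt; the exact
# weighting used in the cited fuzzy SLIQ variant may differ.
import numpy as np

def fuzzy_gini(memberships, labels, n_classes):
    """Gini impurity with class frequencies weighted by fuzzy memberships."""
    total = memberships.sum()
    if total == 0.0:
        return 0.0
    probs = np.array([memberships[labels == c].sum() for c in range(n_classes)]) / total
    return float(1.0 - np.square(probs).sum())

# Toy usage: a fuzzy set that mostly covers class 0 has relatively low impurity.
labels = np.array([0, 0, 0, 1, 1, 1])
mu = np.array([0.9, 0.8, 0.7, 0.2, 0.1, 0.0])
print(fuzzy_gini(mu, labels, n_classes=2))
```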