2009
DOI: 10.1007/978-3-642-10439-8_37
CoXCS: A Coevolutionary Learning Classifier Based on Feature Space Partitioning

Cited by 14 publications (6 citation statements)
References 17 publications
“…In the literature, parallel Learning Classifier Systems are typically used to improve the algorithmic performance of a classifier system or to deal with distributed data [25]. For example, some authors apply multiple coevolving Michigan-style Learning Classifier Systems to decompositions of the problem space to increase solution quality on large problems [1,10,19,30]. Moreover, coevolving Michigan-style Learning Classifier Systems have been used for social modelling on the El Farol problem [13].…”
Section: Related Work
confidence: 98%
“…CoXCS is a coevolutionary learning classifier system based on feature space partitioning [2]. It extends the XCS model by introducing a coevolutionary approach.…”
Section: CoXCS
confidence: 99%
“…It was shown in Tumer and Oza (2003) that feature partitioning can help overcome the curse of dimensionality and avoids the drawbacks of feature selection. There are several popular strategies for creating feature subset-based ensembles, such as manual selection (Gershoff and Schulenburg 2007), feature bagging (also known as the random subspace method) (Ho 1998), equal linear division of the feature space (Abedini and Kirley 2009), or the use of feature selection algorithms (Debie et al 2013a). The random subspace approach in particular has been studied extensively in the literature, especially its application to the decision tree classifier (random forests) (Breiman 2001).…”
Section: A Generic Framework for Categorizing Ensembles
confidence: 99%
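The two mechanical strategies named in the statement above — random subspace and equal linear division of the feature space — can be sketched in a few lines. This is an illustrative sketch only; the function names are not taken from any of the cited works:

```python
import random

def random_subspaces(num_features, num_subsets, subset_size, seed=0):
    """Random subspace strategy (Ho 1998): each base classifier in the
    ensemble is assigned its own randomly drawn subset of features."""
    rng = random.Random(seed)
    return [sorted(rng.sample(range(num_features), subset_size))
            for _ in range(num_subsets)]

def equal_linear_partition(num_features, num_parts):
    """Equal linear division strategy: split the feature indices into
    contiguous, near-equal blocks, one block per classifier."""
    base, extra = divmod(num_features, num_parts)
    parts, start = [], 0
    for i in range(num_parts):
        size = base + (1 if i < extra else 0)
        parts.append(list(range(start, start + size)))
        start += size
    return parts
```

The random subspace draw samples features independently per subset (so subsets may overlap), whereas the linear division yields disjoint contiguous blocks that cover every feature exactly once.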
“…Individuals were used to classify the partially masked training data corresponding to the feature in focus. A co-evolutionary multi-population XCS was proposed (Abedini and Kirley). Co-evolutionary multi-population XCS is based on a collection of independent populations of classifiers that are trained on different partitions of the feature space within the training data set.…”
Section: Introduction
confidence: 99%
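As a rough illustration of the multi-population idea in the statement above, the sketch below trains one independent learner per feature partition and combines their outputs by majority vote. CoXCS evolves a full XCS classifier population per partition; the nearest-centroid learner here is only a hypothetical stand-in, and the class name is invented for this sketch:

```python
import math

class PartitionEnsemble:
    """Toy multi-population setup: one simple nearest-centroid learner
    per feature partition, with predictions combined by majority vote."""

    def __init__(self, partitions):
        self.partitions = partitions      # list of feature-index lists
        self.centroids = []               # one {label: centroid} per partition

    def fit(self, X, y):
        for part in self.partitions:
            sums, counts = {}, {}
            for row, label in zip(X, y):
                proj = [row[i] for i in part]          # project onto partition
                acc = sums.setdefault(label, [0.0] * len(part))
                for j, v in enumerate(proj):
                    acc[j] += v
                counts[label] = counts.get(label, 0) + 1
            self.centroids.append(
                {lab: [s / counts[lab] for s in vec]
                 for lab, vec in sums.items()})
        return self

    def predict(self, row):
        votes = {}
        for part, cents in zip(self.partitions, self.centroids):
            proj = [row[i] for i in part]
            best = min(cents, key=lambda lab: math.dist(proj, cents[lab]))
            votes[best] = votes.get(best, 0) + 1
        return max(votes, key=votes.get)   # majority vote across populations
```

Each per-partition learner only ever sees its own slice of the feature space, which is the property the coevolutionary multi-population XCS exploits on high-dimensional data.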