2016
DOI: 10.1080/18756891.2016.1146536

Hyperrectangles Selection for Monotonic Classification by Using Evolutionary Algorithms

Abstract: In supervised learning, some real problems require the response attribute to represent ordinal values that should increase with some of the explaining attributes. These are called classification problems with monotonicity constraints. Hyperrectangles can be viewed as objects stored in R^n that can be used to learn concepts, combining instance-based classification with the axis-parallel rectangles mainly used in rule induction systems. This hybrid paradigm is known as nested generalized exemplar learning. In thi…
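As a minimal sketch of the monotonicity constraint mentioned in the abstract (the function and the toy data are illustrative assumptions, not from the paper): if one example dominates another on every input attribute, its class label must not be lower.

```python
import numpy as np

def violates_monotonicity(X, y):
    """Return index pairs (i, j) where example i dominates example j
    on every attribute (X[i] >= X[j] componentwise) but has a
    strictly lower label, i.e. the monotonicity constraint is broken."""
    pairs = []
    n = len(X)
    for i in range(n):
        for j in range(n):
            if i != j and np.all(X[i] >= X[j]) and y[i] < y[j]:
                pairs.append((i, j))
    return pairs

# Hypothetical credit-rating style data: higher values on both
# attributes should never yield a lower class.
X = np.array([[3, 2], [1, 1], [2, 3]])
y = np.array([0, 1, 2])  # example 0 dominates example 1 but has a lower label
print(violates_monotonicity(X, y))  # [(0, 1)]
```

A training set with an empty violation list is "purely monotone", the precondition some of the classifiers cited below (such as MkNN) require.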

Cited by 18 publications (7 citation statements)
References 36 publications (51 reference statements)
“…• Monotonic Accuracy (MAcc [44]) … • Monotonicity Compliance (MCC [46]), defined as the proportion of the input space where the requested monotonicity constraints are not violated, weighted by the joint probability distribution of the input space.…”
Section: Predictive Assessment Metrics
confidence: 99%
“…• Monotonic Accuracy (MAcc [44]), computed as standard Accuracy, but only considering those examples that completely fulfill the monotonicity constraints in the test set. In other words, non-monotonic examples do not take part in the calculation of MAcc.…”
Section: Predictive Assessment Metrics
confidence: 99%
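The MAcc definition quoted above (standard accuracy restricted to test examples that fulfill the monotonicity constraints) can be sketched as follows. This is an illustrative reading of that definition, not the cited authors' implementation; all names are assumptions.

```python
import numpy as np

def monotonic_accuracy(X_test, y_true, y_pred):
    """MAcc sketch: accuracy computed only over test examples that are
    monotone with respect to every other test example."""
    n = len(X_test)
    monotone = []
    for i in range(n):
        ok = True
        for j in range(n):
            if i == j:
                continue
            # i must not dominate j with a lower label,
            # nor be dominated by j while holding a higher label
            if np.all(X_test[i] >= X_test[j]) and y_true[i] < y_true[j]:
                ok = False
                break
            if np.all(X_test[i] <= X_test[j]) and y_true[i] > y_true[j]:
                ok = False
                break
        if ok:
            monotone.append(i)
    if not monotone:
        return float("nan")  # MAcc is undefined when no example is monotone
    idx = np.array(monotone)
    return float(np.mean(y_true[idx] == y_pred[idx]))

# Toy test set: all three examples happen to be monotone here,
# so MAcc coincides with plain accuracy (2 of 3 correct).
X = np.array([[1, 1], [2, 2], [3, 1]])
y_true = np.array([0, 1, 1])
y_pred = np.array([0, 0, 1])
print(monotonic_accuracy(X, y_true, y_pred))  # 0.666...
```

Non-monotonic examples are simply excluded from the average, which is why MAcc can differ noticeably from standard accuracy on noisy ordinal data.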
“…Instance-based learning has proven to be a good approach for monotonic classification [2,15,31,18]. However, some of these methods, such as Monotonic k-Nearest Neighbors [15] (MkNN), need to learn from a fully monotonic set to ensure monotonic predictions.…”
Section: Introduction
confidence: 99%
“…Several monotonic classification approaches have been proposed in the specialized literature. They include classification trees and rule induction [9,10,11,12,13,14], neural networks [15,16], instance-based learning [4,17,18] and hybridizations [19,20]. Some of them require the training set to be purely monotone to work correctly, such as the MKNN classifier [18].…”
Section: Introduction
confidence: 99%