DOI: 10.1007/978-3-540-87479-9_38

Nearest Neighbour Classification with Monotonicity Constraints

Abstract: In many application areas of machine learning, prior knowledge concerning the monotonicity of relations between the response variable and predictor variables is readily available. Monotonicity may also be an important model requirement with a view toward explaining and justifying decisions, such as acceptance/rejection decisions. We propose a modified nearest neighbour algorithm for the construction of monotone classifiers from data. We start by making the training data monotone with as few label changes as possible. …
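To make the prediction step sketched in the abstract concrete, the following is a minimal Python sketch, not the authors' implementation: an ordinary 1-NN label is clamped to the interval permitted by the monotonicity constraint with respect to comparable training points. The function names (`dominates`, `monotone_nn_predict`) and the assumptions (ordinal labels 0..k-1, all attributes increasing in the label, training labels already relabelled to be monotone) are illustrative.

```python
import numpy as np

def dominates(a, b):
    """True if feature vector a is >= b in every attribute
    (all attributes are assumed to increase with the label)."""
    return bool(np.all(np.asarray(a) >= np.asarray(b)))

def monotone_nn_predict(X_train, y_train, x, n_labels):
    """Monotonicity-constrained 1-NN prediction (illustrative sketch).

    Assumes y_train holds ordinal labels 0..n_labels-1 and that the
    training set has already been made monotone.
    """
    X_train = np.asarray(X_train, dtype=float)
    y_train = np.asarray(y_train)
    x = np.asarray(x, dtype=float)

    # The constraint bounds the prediction: it may not fall below the label
    # of any training point that x dominates, nor exceed the label of any
    # training point that dominates x.
    lower, upper = 0, n_labels - 1
    for xi, yi in zip(X_train, y_train):
        if dominates(x, xi):
            lower = max(lower, int(yi))
        if dominates(xi, x):
            upper = min(upper, int(yi))

    # Unconstrained 1-NN prediction (Euclidean distance).
    nearest = int(np.argmin(np.linalg.norm(X_train - x, axis=1)))
    y_nn = int(y_train[nearest])

    # Clamp the nearest-neighbour label into the feasible interval; with a
    # monotone training set, lower <= upper always holds.
    return min(max(y_nn, lower), upper)

# Tiny usage example: two increasing attributes, labels {0, 1, 2}.
X = [[1, 1], [2, 2], [3, 3]]
y = [0, 1, 2]
print(monotone_nn_predict(X, y, [2.4, 2.4], n_labels=3))  # prints 1
```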

Cited by 76 publications (72 citation statements). References 19 publications.

“…However, if models violate these constraints they may not be accepted by experts as valid; conforming to monotonicity constraints improves model acceptance [7,10].…”
Section: Monotonicity (mentioning)
confidence: 99%
“…Meanwhile, several machine learning algorithms have been modified so as to guarantee monotonicity in attributes, including nearest neighbor classification (Duivesteijn and Feelders 2008), neural networks (Sill 1998), decision tree learning (Ben-David 1995; Potharst and Feelders 2002), rule induction (Dembczyński et al. 2009), as well as methods based on isotonic separation (Chandrasekaran et al. 2005) and piecewise linear models (Dembczyński et al. 2006).…”
Section: Related Work (mentioning)
confidence: 99%
“…Meanwhile, several machine learning algorithms have been modified so as to guarantee monotonicity in attributes, including nearest neighbor classification [15], decision tree learning [16] and rule induction [17]. Instead of modifying models and algorithms, one can also modify the data.…”
Section: Related Work (mentioning)
confidence: 99%