2018 IEEE International Conference on Fuzzy Systems (FUZZ-IEEE)
DOI: 10.1109/fuzz-ieee.2018.8491636
Learning fuzzy decision trees using integer programming



Cited by 6 publications (2 citation statements). References 20 publications.
“…Their approach is essentially a randomized optimal version of CART. (Rhuggenaath et al 2018) build a mixed integer linear program to learn fuzzy decision trees, where split decisions on the internal nodes of a tree are not crisp. (Firat et al 2018) apply column generation techniques in constructing univariate binary classification trees.…”
Section: Related Work
confidence: 99%
“…Several years later the first MILP formulations were proposed for the full problem (Bertsimas and Dunn 2017) and (Verwer and Zhang 2017). The latest methods improve these works using non-crisp decision boundaries (Rhuggenaath et al 2018), a binary encoding (Verwer and Zhang 2019), new analytical bounds and an improved tree representation translation (Hu, Rudin, and Seltzer 2019), by translating to CP (Verhaeghe et al 2020), using dynamic programming with search (Demirović et al 2020), by caching branch-and-bound (Aglin, Nijssen, and Schaus 2020), and optimized randomization (Blanquero et al 2021). In this work, we build on these works to create the first formulation for optimal learning of robust decision trees.…”
Section: Optimal Decision Trees
confidence: 99%
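Both citation statements describe the cited paper's key idea: split decisions at internal tree nodes that are not crisp, i.e. a sample near a threshold belongs partially to both branches. A minimal sketch of the distinction (a hypothetical sigmoid membership function for illustration only, not the mixed-integer formulation of Rhuggenaath et al.):

```python
import math

def crisp_split(x, threshold):
    """Crisp split: the sample goes entirely to one branch.
    Returns the membership degree of the left branch (0 or 1)."""
    return 1.0 if x <= threshold else 0.0

def fuzzy_split(x, threshold, slope=4.0):
    """Fuzzy (non-crisp) split: a sigmoid membership assigns the
    sample partially to both branches; `slope` controls how sharp
    the transition is (larger slope -> closer to a crisp split).
    Returns the membership degree of the left branch in [0, 1]."""
    return 1.0 / (1.0 + math.exp(slope * (x - threshold)))

# Far from the threshold the two splits agree; near it, the fuzzy
# membership approaches 0.5, i.e. the sample is shared by both branches.
for x in (0.0, 0.45, 0.5, 0.55, 1.0):
    print(x, crisp_split(x, 0.5), round(fuzzy_split(x, 0.5), 3))
```

The membership of the right branch is one minus the left membership, so a sample's weight is conserved as it flows down the tree.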