Proceedings of the 2020 Genetic and Evolutionary Computation Conference (GECCO 2020)
DOI: 10.1145/3377930.3389815
Multi-objective hyperparameter tuning and feature selection using filter ensembles

Citations: cited by 34 publications (28 citation statements)
References: 29 publications
“…The latter can either be a single value or an interval of values. We define a counterfactual explanation x′ for an observation x* as a data point fulfilling the following: (1) its prediction f(x′) is close to the desired outcome set Y′, (2) it is close to x* in the X space, (3) it differs from x* only in a few features, and (4) it is a plausible data point according to the probability distribution P_X. For classification models, we assume that f returns the probability for a user-selected class and Y′ has to be the desired probability (range).…”
Section: Multi-objective Counterfactuals (mentioning)
confidence: 99%
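The quoted definition lists four objectives. Below is a minimal sketch of how they could be evaluated for a single candidate counterfactual; the `predict_proba` callback, `y_target`, `x_train`, and the simple L1 / nearest-neighbour proxies for proximity and plausibility are illustrative assumptions, not the quoted paper's own implementation.

```python
import numpy as np

# Illustrative sketch (not the quoted paper's MOC implementation): the four
# objectives from the definition above, evaluated for one candidate
# counterfactual x_cf against the original observation x_star.
def counterfactual_objectives(x_cf, x_star, predict_proba, y_target, x_train):
    # (1) the prediction for x_cf should be close to the desired outcome
    o_target = abs(predict_proba(x_cf) - y_target)
    # (2) x_cf should stay close to x_star in feature space (L1 proxy here)
    o_proximity = float(np.sum(np.abs(x_cf - x_star)))
    # (3) x_cf should change as few features of x_star as possible
    o_sparsity = int(np.sum(x_cf != x_star))
    # (4) x_cf should be plausible, e.g. close to some observed training point
    o_plausibility = float(np.min(np.linalg.norm(x_train - x_cf, axis=1)))
    return o_target, o_proximity, o_sparsity, o_plausibility
```

A multi-objective optimizer such as NSGA-II can then search for counterfactuals that trade off these four values rather than collapsing them into a single weighted score.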
“…The complete code of the algorithm and the code to reproduce the experiments and results of this paper are available at https://github.com/susanne-207/moc. The implementation of MOC is based on our implementation of [19], which we also used for [3]. We will provide an open source R library with our implementation of the method based on the iml package [23].…”
Section: Electronic Submission (mentioning)
confidence: 99%
“…Hyperparameters were tuned using model-based optimization (MBO) within a nested spatial cross-validation (CV) [50][51][52]. In MBO, first, n hyperparameter settings are randomly chosen from a user-defined search space.…”
Section: Hyperparameter Optimization (mentioning)
confidence: 99%
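The quoted description of MBO (an initial random design followed by sequential, surrogate-guided proposals) can be sketched as below. This is a generic sketch, not the cited study's setup: the Gaussian-process surrogate, the expected-improvement criterion over a random candidate pool, and the `evaluate` callback (standing in for the inner spatial-CV performance estimate) are all assumptions made for illustration.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor

# Generic MBO loop (minimization): random initial design, fit a surrogate,
# then repeatedly evaluate the candidate with the highest expected improvement.
def mbo(evaluate, bounds, n_init=10, n_iter=20, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds[0]), np.asarray(bounds[1])
    X = rng.uniform(lo, hi, size=(n_init, len(lo)))      # initial random design
    y = np.array([evaluate(x) for x in X])                # e.g. inner-CV error
    for _ in range(n_iter):
        gp = GaussianProcessRegressor(normalize_y=True).fit(X, y)
        cand = rng.uniform(lo, hi, size=(1000, len(lo)))  # random candidate pool
        mu, sd = gp.predict(cand, return_std=True)
        best = y.min()
        z = (best - mu) / np.maximum(sd, 1e-12)
        ei = (best - mu) * norm.cdf(z) + sd * norm.pdf(z)  # expected improvement
        x_new = cand[np.argmax(ei)]
        X = np.vstack([X, x_new])
        y = np.append(y, evaluate(x_new))
    return X[np.argmin(y)], y.min()
```

In the nested setting described in the quote, `evaluate` would run the inner spatial CV for a given hyperparameter setting, and the whole loop would be repeated inside each outer fold.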
“…To optimize the number of features used for model fitting, the percentage of features was added as a hyperparameter during the optimization stage [51]. For PCA, the number of principal components was tuned.…”
Section: Hyperparameter Optimization (mentioning)
confidence: 99%
“…To optimize the number of features used for model fitting, the percentage of features was added as a hyperparameter during the optimization stage [39].…”
Section: Benchmarking Design, 1) Algorithms (mentioning)
confidence: 99%
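Both of the preceding statements describe exposing the percentage of (filter-ranked) features as a tunable hyperparameter. A small illustrative sketch using scikit-learn follows; the SelectPercentile filter with an ANOVA F-score and the SVC learner are assumed stand-ins, not the cited papers' own tooling, but they show the kept percentage being tuned jointly with the learner's parameters.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import SelectPercentile, f_classif
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)

# A univariate filter ranks the features; the percentage kept is itself a
# hyperparameter tuned alongside the classifier's own parameters.
pipe = Pipeline([
    ("filter", SelectPercentile(score_func=f_classif)),
    ("clf", SVC()),
])
grid = {
    "filter__percentile": [10, 25, 50, 75, 100],  # % of features to keep
    "clf__C": [0.1, 1, 10],
}
search = GridSearchCV(pipe, grid, cv=5).fit(X, y)
print(search.best_params_)
```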