2019
DOI: 10.1142/s0219530519400086

Robust randomized optimization with k nearest neighbors

Abstract: Modern applications of machine learning typically require the tuning of a multitude of hyperparameters. With this motivation in mind, we consider the problem of optimization given a set of noisy function evaluations. We focus on robust optimization, in which the goal is to find a point in the input space such that the function remains high when perturbed by an adversary within a given radius. Here we identify the minimax optimal rate for this problem, which turns out to be of order [Formula: see text], where [F…
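The robust objective described in the abstract can be made concrete with a small sketch: instead of the plain value f(x), the adversary gets to perturb the input anywhere within a ball of a given radius, so the quantity of interest is the worst-case (minimum) value over that ball. The sampling-based estimator below is purely illustrative and is not the paper's k-NN procedure; the function `f` and the Monte Carlo approximation are assumptions for the example.

```python
import numpy as np

def robust_value(f, x, radius, n_samples=200, rng=None):
    """Estimate min over ||d|| <= radius of f(x + d) by random sampling.

    An illustrative Monte Carlo approximation of the robust objective
    from the abstract; the paper's k-NN-based estimator and its minimax
    rate are not reproduced here.
    """
    rng = np.random.default_rng(rng)
    d = rng.normal(size=(n_samples, x.size))
    d /= np.linalg.norm(d, axis=1, keepdims=True)      # unit directions
    d *= rng.uniform(0, radius, size=(n_samples, 1))   # radii in [0, radius]
    return min(f(x + di) for di in d)

# Toy example: f peaks at the origin, so the robust value at the peak is
# strictly worse than the plain value once an adversary can perturb x.
f = lambda z: -np.sum(z**2)
x = np.zeros(2)
plain = f(x)                       # 0.0 at the unperturbed peak
robust = robust_value(f, x, 0.5)   # negative: perturbation hurts
```

A robust optimizer would then maximize `robust_value` over candidate points x rather than f itself.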

Cited by 2 publications (1 citation statement) · References 17 publications
“…The k-NN algorithm was developed to classify unknown data based on existing data with known labels.[32] The essence of k-NN is analogy learning, that is, learning by comparing a given test sample with the training samples. Suppose that, as in the last section, l PCs are selected from the feature vector of IMF energy ratios as the inputs of the ML model, which indicates that all training samples lie in this l-dimensional space.…”
Section: k-NN
Confidence: 99%
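The "analogy learning" idea quoted above — compare a test sample to the training samples and vote among the closest — can be sketched in a few lines. The feature space (e.g. the l principal components of IMF energy ratios) is specific to the citing paper and is stood in for here by plain NumPy arrays; the toy data is hypothetical.

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=3):
    """Classify x by majority vote among its k nearest training samples.

    Minimal sketch of the k-NN classifier described in the citation
    statement: Euclidean distance, no weighting, ties broken arbitrarily.
    """
    dists = np.linalg.norm(X_train - x, axis=1)   # distance to every sample
    nearest = np.argsort(dists)[:k]               # indices of the k closest
    labels, counts = np.unique(y_train[nearest], return_counts=True)
    return labels[np.argmax(counts)]              # majority label

# Toy 2-D data: class 0 clusters near the origin, class 1 near (5, 5).
X = np.array([[0.0, 0.0], [0.2, 0.1], [0.1, 0.3],
              [5.0, 5.0], [4.9, 5.1], [5.2, 4.8]])
y = np.array([0, 0, 0, 1, 1, 1])
print(knn_predict(X, y, np.array([0.15, 0.2])))  # -> 0
print(knn_predict(X, y, np.array([5.0, 4.9])))   # -> 1
```

Choosing k odd avoids ties in binary problems; larger k smooths the decision boundary at the cost of blurring small clusters.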