We consider a β-smooth (satisfying the generalized Hölder condition with parameter β > 2) stochastic convex optimization problem with a zero-order one-point oracle. The best known bound [1] for the γ-strongly convex case, where n is the dimension of the problem, is improved in this paper. This work is based on results presented at the 63rd MIPT Conference held in November 2020.
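To make the setting concrete, a zero-order one-point oracle returns a single (possibly noisy) function value per query, from which a gradient estimate is built by random perturbation. Below is a minimal sketch of a standard one-point estimator; the function name, the noisy quadratic test objective, and all parameter values are illustrative assumptions, not the paper's algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)

def one_point_grad_estimate(f, x, tau=1e-2, rng=rng):
    """Standard one-point zeroth-order gradient estimate:
    g = (d / tau) * f(x + tau * u) * u, with u uniform on the unit sphere.
    Only ONE function value is queried per estimate."""
    d = x.size
    u = rng.standard_normal(d)
    u /= np.linalg.norm(u)          # uniform direction on the sphere
    return (d / tau) * f(x + tau * u) * u

# usage: a noisy quadratic, mimicking a stochastic zero-order oracle
f = lambda x: x @ x + 1e-3 * rng.standard_normal()
x = np.ones(3)
g = one_point_grad_estimate(f, x)   # one oracle call, one estimate
```

In expectation (up to the oracle noise and a smoothing bias of order tau), this estimator approximates the gradient of a smoothed version of f, which is what makes one-point methods analyzable under Hölder smoothness.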
Gradient-free/zeroth-order methods for black-box convex optimization have been extensively studied in the last decade, with the main focus on oracle-call complexity. In this paper, besides the oracle complexity, we also focus on iteration complexity and propose a generic approach that, building on optimal first-order methods, allows one to obtain in a black-box fashion new zeroth-order algorithms for non-smooth convex optimization problems. Our approach not only achieves optimal oracle complexity, but also yields iteration complexity similar to that of first-order methods, which, in turn, makes it possible to exploit parallel computations to accelerate the convergence of our algorithms. We also elaborate on extensions for stochastic optimization problems, saddle-point problems, and distributed optimization. Footnote 1: Note that, for most of the algorithms in this paper, we can make these assumptions only on the intersection of $Q_\gamma$ and the ball $x_0 + B_p^d(R)$ for some $p \in [1, 2]$, where $x_0$ is the starting point of the algorithm and $R = O\left(\|x_0 - x_*\|_p \ln d\right)$ with $x_*$ being a solution of (1) closest to $x_0$ [32].
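A common ingredient of such black-box reductions is to estimate a gradient of a randomized smoothing of the non-smooth objective and feed it to a first-order method. The sketch below illustrates this idea with a two-point smoothing estimator and plain gradient descent; the function names, the $\ell_1$ test objective, and the step sizes are illustrative assumptions, not the specific scheme of the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

def smoothed_two_point_grad(f, x, tau=1e-3, rng=rng):
    """Two-point estimate of the gradient of the randomized smoothing
    f_tau(x) = E_u[ f(x + tau * u) ], u uniform on the unit sphere:
    g = (d / (2 tau)) * (f(x + tau u) - f(x - tau u)) * u."""
    d = x.size
    u = rng.standard_normal(d)
    u /= np.linalg.norm(u)
    return (d / (2 * tau)) * (f(x + tau * u) - f(x - tau * u)) * u

# plug the zeroth-order estimate into plain gradient descent
# on a non-smooth convex objective, here f(x) = ||x||_1
f = lambda x: np.abs(x).sum()
x = np.full(4, 2.0)
for _ in range(2000):
    x -= 0.01 * smoothed_two_point_grad(f, x)
```

The smoothing makes $f_\tau$ differentiable even when $f$ is not, so any first-order method (and its iteration-complexity guarantees) can be run on $f_\tau$ using only zeroth-order queries; moreover, the per-iteration estimates are independent and can be averaged in parallel.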