Document Type
Article
Publication Title
arXiv
Abstract
Gradient-free/zeroth-order methods for black-box convex optimization have been extensively studied over the last decade, with the main focus on oracle call complexity. In this paper, in addition to oracle complexity, we also focus on iteration complexity and propose a generic approach that, based on optimal first-order methods, allows one to obtain, in a black-box fashion, new zeroth-order algorithms for non-smooth convex optimization problems. Our approach not only leads to optimal oracle complexity, but also yields iteration complexity similar to that of first-order methods, which in turn makes it possible to exploit parallel computations to accelerate the convergence of our algorithms. We also elaborate on extensions to stochastic optimization problems, saddle-point problems, and distributed optimization.
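The flavor of such a construction can be conveyed by a minimal Python sketch, assuming the standard recipe of randomized smoothing with a two-point function-value gradient estimator plugged into a Nesterov-type accelerated loop. The names and parameters below (two_point_grad_estimate, zo_accelerated_method, the smoothing radius gamma, the surrogate smoothness constant L, the batch size) are hypothetical illustrations, not the paper's notation, and the loop is a generic accelerated scheme rather than the authors' exact algorithm; the batch of directions stands in for the parallel oracle calls mentioned in the abstract.

import numpy as np

def two_point_grad_estimate(f, x, gamma, rng, batch=1):
    # Two-point zeroth-order estimator for the gradient of the randomized
    # smoothing f_gamma(x) = E_e[f(x + gamma * e)], e uniform on the unit sphere.
    # Averaging over `batch` independent directions mimics parallel oracle calls.
    # (Names and constants are illustrative, not taken from the paper.)
    d = x.size
    g = np.zeros(d)
    for _ in range(batch):
        e = rng.standard_normal(d)
        e /= np.linalg.norm(e)                     # random direction on the unit sphere
        g += (d / (2.0 * gamma)) * (f(x + gamma * e) - f(x - gamma * e)) * e
    return g / batch

def zo_accelerated_method(f, x0, L, n_iters=200, gamma=1e-3, batch=8, seed=0):
    # Generic Nesterov-type accelerated loop driven only by function values.
    # L is a (hypothetical) smoothness constant of the smoothed surrogate.
    rng = np.random.default_rng(seed)
    x, z = x0.copy(), x0.copy()
    for k in range(n_iters):
        alpha = 2.0 / (k + 2)
        y = (1 - alpha) * x + alpha * z            # coupling of the two sequences
        g = two_point_grad_estimate(f, y, gamma, rng, batch)
        x = y - g / L                              # gradient-type step on the surrogate
        z = z - (k + 2) / (2.0 * L) * g            # aggressive averaging step
    return x

if __name__ == "__main__":
    f = lambda x: np.abs(x).sum()                  # simple non-smooth test objective
    x_hat = zo_accelerated_method(f, x0=np.ones(20), L=50.0)
    print(f"f(x_hat) = {f(x_hat):.4f}")

Increasing `batch` reduces the variance of each gradient estimate without changing the number of iterations, which is the sense in which parallel oracle calls can accelerate convergence in this sketch.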
DOI
10.48550/arXiv.2201.12289
Publication Date
1-28-2022
Keywords
Black boxes; Convex optimization; Convex optimization problems; First order; First-order methods; Generic approach; Non-smooth convex optimization; Optimization; Ordering algorithms; Power
Recommended Citation
A. Gasnikov et al., "The power of first-order smooth optimization for black-box non-smooth problems," 2022, arXiv:2201.12289
Included in
Applied Mathematics Commons, Computer Sciences Commons, Statistics and Probability Commons
Comments
Preprint: arXiv