Black-Box Reductions for Zeroth-Order Gradient Algorithms to Achieve Lower Query Complexity

Document Type

Article

Publication Title

Journal of Machine Learning Research

Abstract

Zeroth-order (ZO) optimization has become a key technique for various machine learning applications, especially black-box adversarial attacks, where models must be learned in a gradient-free manner. Although many ZO algorithms have been proposed, their high function query complexities seriously hinder their application. To address this challenging problem, we propose two stagewise black-box reduction frameworks for ZO algorithms, one for the convex setting and one for the non-convex setting, which lower the function query complexities of ZO algorithms. Moreover, given convergence results under the strongly convex setting, our frameworks directly yield convergence results under the convex and non-convex settings without extra analysis. To illustrate these advantages, we further study ZO-SVRG, ZO-SAGA, and ZO-Varag under the strongly convex setting and use our frameworks to directly derive their convergence results under the convex and non-convex settings. The function query complexities derived by our frameworks are lower than those of the vanilla counterparts without the frameworks, and in some cases lower than those of state-of-the-art algorithms. Finally, we conduct numerical experiments to illustrate the superiority of our frameworks.
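For context, below is a minimal Python sketch of the two-point zeroth-order gradient estimator on which algorithms of this kind are built; it illustrates why each optimization step costs function queries rather than gradient evaluations. This is not the paper's reduction framework, and the names and parameters (zo_gradient, zo_gd, mu, num_dirs) are illustrative assumptions.

    import numpy as np

    def zo_gradient(f, x, mu=1e-4, num_dirs=10):
        # Two-point ZO gradient estimate: average the finite-difference
        # quotient (f(x + mu*u) - f(x - mu*u)) / (2*mu) along random
        # Gaussian directions u. Uses only function queries, no gradients.
        d = x.shape[0]
        g = np.zeros(d)
        for _ in range(num_dirs):
            u = np.random.randn(d)
            g += (f(x + mu * u) - f(x - mu * u)) / (2 * mu) * u
        return g / num_dirs

    def zo_gd(f, x0, lr=0.1, steps=100):
        # Plain ZO gradient descent: each step costs 2 * num_dirs
        # function queries, which is the complexity measure the
        # paper's frameworks aim to lower.
        x = x0.copy()
        for _ in range(steps):
            x -= lr * zo_gradient(f, x)
        return x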

First Page

1

Last Page

47

Publication Date

6-1-2021

Keywords

Black-box reduction, Convex optimization, Non-convex optimization, Stagewise training, Zeroth-order optimization

Comments

IR deposit conditions: deposit of the accepted version is not permitted.

Available as open access on JMLR
