Random-Reshuffled SARAH Does Not Need a Full Gradient Computation

Document Type

Article

Publication Title

arXiv

Abstract

The StochAstic Recursive grAdient algoritHm (SARAH) is a variance-reduced variant of the Stochastic Gradient Descent (SGD) algorithm that requires a full gradient of the objective function from time to time. In this paper, we remove the necessity of a full gradient computation. This is achieved by using a randomized reshuffling strategy and aggregating the stochastic gradients obtained in each epoch. The aggregated stochastic gradients serve as an estimate of a full gradient in the SARAH algorithm. We provide a theoretical analysis of the proposed approach and conclude the paper with numerical experiments that demonstrate its efficiency. Copyright © 2021, The Authors. All rights reserved.
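
The abstract only sketches the mechanism at a high level. The snippet below is a minimal, hypothetical Python illustration of the idea, assuming a simple least-squares objective; the names (grad_i, eta, agg), the step size, and the choice of initial anchor are illustrative assumptions and not the authors' reference implementation.

```python
import numpy as np

# Toy problem: f(w) = (1/n) * sum_i 0.5 * (a_i^T w - b_i)^2  (an assumption for illustration)
rng = np.random.default_rng(0)
n, d = 200, 10
A = rng.standard_normal((n, d))
b = rng.standard_normal(n)

def grad_i(w, i):
    """Stochastic gradient of the i-th component f_i(w) = 0.5 * (a_i^T w - b_i)^2."""
    return (A[i] @ w - b[i]) * A[i]

eta, epochs = 0.01, 30
w = np.zeros(d)

# Cheap initial anchor (an assumption): a single stochastic gradient instead of
# the full gradient that vanilla SARAH would compute at this point.
g_est = grad_i(w, 0)

for _ in range(epochs):
    perm = rng.permutation(n)   # random reshuffling of the component indices
    v = g_est                   # recursive estimate seeded by last epoch's aggregate
    w_prev = w.copy()
    w = w - eta * v
    agg = np.zeros(d)           # aggregates the stochastic gradients of this epoch
    for i in perm:
        g_new = grad_i(w, i)
        agg += g_new
        v = g_new - grad_i(w_prev, i) + v   # SARAH recursive update
        w_prev = w.copy()
        w = w - eta * v
    # The averaged stochastic gradients replace the full-gradient pass
    # as the anchor for the next epoch.
    g_est = agg / n

print("final objective:", 0.5 * np.mean((A @ w - b) ** 2))
```

In this sketch the only place a full gradient would normally appear, the start of each epoch, is filled by the average of the stochastic gradients collected while sweeping the reshuffled data in the previous epoch.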

DOI

10.48550/arXiv.2111.13322

Publication Date

11-26-2021

Keywords

Gradient algorithm; Gradients computation; Numerical experiments; Objective functions; Stochastic gradient; Stochastic gradient descent algorithm; Stochastics

Comments

Preprint: arXiv
