An Accelerated Doubly Stochastic Gradient Method with Faster Explicit Model Identification
Document Type
Conference Proceeding
Publication Title
CIKM '22: Proceedings of the 31st ACM International Conference on Information & Knowledge Management
Abstract
Sparsity-regularized loss minimization problems play an important role in various fields, including machine learning, data mining, and modern statistics. The proximal gradient descent method and the coordinate descent method are the most popular approaches to solving such problems. Although existing methods can achieve implicit model identification, also known as support set identification, in a finite number of iterations, they still suffer from large computational costs and memory burdens in high-dimensional scenarios. The reason is that the support set identification in these methods is implicit and thus cannot explicitly identify the low-complexity structure in practice; that is, they cannot discard useless coefficients of the associated features to achieve algorithmic acceleration via dimension reduction. To address this challenge, we propose a novel accelerated doubly stochastic gradient descent (ADSGD) method for sparsity-regularized loss minimization problems, which reduces the number of block iterations by eliminating inactive coefficients during the optimization process, thereby achieving faster explicit model identification and improving algorithmic efficiency. Theoretically, we first prove that ADSGD achieves a linear convergence rate with lower overall computational complexity. More importantly, we prove that ADSGD achieves a linear rate of explicit model identification. Numerically, experimental results on benchmark datasets confirm the efficiency of the proposed method.
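The abstract describes a doubly stochastic scheme that samples both examples and coordinate blocks while pruning inactive coefficients during optimization. Below is a minimal illustrative sketch of that general idea for the lasso objective. It is not the authors' ADSGD algorithm: the function names, hyperparameters, and in particular the heuristic screening rule (screen_tol) are assumptions made here purely for illustration.

```python
# A minimal sketch (assumptions throughout; not the paper's implementation)
# of a doubly stochastic proximal step for the lasso:
#     min_w  (1/2n) * ||X w - y||^2  +  lam * ||w||_1
# Each iteration samples a mini-batch of examples AND a block of active
# coordinates; zero coordinates whose partial gradients sit safely inside
# (-lam, lam) are dropped from future sampling ("explicit" identification).
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of t * ||.||_1 (elementwise soft thresholding)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def adsgd_sketch(X, y, lam, lr=0.1, n_iters=500,
                 batch_size=32, block_size=16, screen_tol=0.9, seed=0):
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    active = np.arange(d)  # coordinates still being optimized
    for _ in range(n_iters):
        if active.size == 0:  # everything has been screened out
            break
        # Doubly stochastic sampling: rows (examples) and a coordinate block.
        rows = rng.choice(n, size=min(batch_size, n), replace=False)
        block = rng.choice(active, size=min(block_size, active.size),
                           replace=False)
        # Stochastic partial gradient of the smooth part on the block.
        resid = X[rows] @ w - y[rows]
        g = X[rows][:, block].T @ resid / rows.size
        # Proximal (soft-thresholded) update on the sampled block only.
        w[block] = soft_threshold(w[block] - lr * g, lr * lam)
        # Heuristic elimination: a coordinate at zero whose partial gradient
        # is well inside (-lam, lam) is unlikely to re-enter the support, so
        # stop sampling it. (screen_tol is an illustrative knob, not a rule
        # from the paper.)
        dead = block[(w[block] == 0.0) & (np.abs(g) < screen_tol * lam)]
        if dead.size:
            active = np.setdiff1d(active, dead)
    return w, active
```

In the paper, elimination is backed by a proven linear identification rate; the thresholding rule above is only a plausible stand-in, and a principled safe-screening test would be needed to guarantee that discarded coordinates never belong to the final support.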
First Page
57
Last Page
66
DOI
10.1145/3511808.3557234
Publication Date
October 17, 2022
Keywords
Data mining, Efficiency, Gradient methods, Machine learning, Parallel processing systems, Stochastic models
Recommended Citation
R. Bao, B. Gu, and H. Huang, "An Accelerated Doubly Stochastic Gradient Method with Faster Explicit Model Identification," in Proceedings of the 31st ACM International Conference on Information & Knowledge Management (CIKM '22), pp. 57-66, Oct. 2022, doi: 10.1145/3511808.3557234
Comments
IR Deposit conditions: not described