Preconditioning meets biased compression for efficient distributed optimization
Document Type
Article
Publication Title
Computational Management Science
Abstract
Methods with preconditioned updates perform well on badly scaled and/or ill-conditioned convex optimization problems. However, a theoretical analysis of these methods in the distributed setting has not yet been provided. We close this gap by studying a preconditioned version of the Error Feedback (EF) method, a popular convergence stabilization mechanism for distributed learning with biased compression. We combine the EF and EF21 algorithms with a preconditioner based on Hutchinson's approximation to the diagonal of the Hessian. An experimental comparison of the algorithms on the ResNet computer vision model is provided.
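The abstract combines two ingredients: Hutchinson's diagonal Hessian estimator (diag(H) ≈ E[z ⊙ Hz] for Rademacher probes z) as a preconditioner, and an EF21-style update with a biased compressor. The following is a minimal single-node sketch of that combination, not the authors' implementation: the Top-K compressor, the quadratic test problem, and the step sizes gamma and alpha are illustrative assumptions.

```python
import numpy as np

def top_k(v, k):
    """Biased Top-K compressor: keep the k largest-magnitude entries, zero the rest."""
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-k:]
    out[idx] = v[idx]
    return out

def hutchinson_diag(hvp, dim, n_samples, rng):
    """Hutchinson estimate of diag(H): average of z * (H z) over Rademacher probes z."""
    est = np.zeros(dim)
    for _ in range(n_samples):
        z = rng.choice([-1.0, 1.0], size=dim)
        est += z * hvp(z)
    return est / n_samples

# Toy badly scaled quadratic f(x) = 0.5 x^T A x, so grad(x) = A x and hvp(z) = A z.
rng = np.random.default_rng(0)
A = np.diag([1.0, 10.0, 100.0])
grad = lambda x: A @ x
hvp = lambda z: A @ z

x = np.ones(3)
g = np.zeros(3)                 # EF21 gradient estimator state
D = np.ones(3)                  # diagonal preconditioner estimate
gamma, alpha, eps = 0.2, 0.1, 1e-8   # assumed step size, EMA rate, stabilizer

for t in range(300):
    # Refresh the preconditioner with an exponential moving average of Hutchinson samples.
    D = (1 - alpha) * D + alpha * np.abs(hutchinson_diag(hvp, 3, n_samples=1, rng=rng))
    # EF21-style update: shift the estimator by the compressed gradient difference.
    g = g + top_k(grad(x) - g, k=1)
    # Preconditioned step: scale each coordinate by the estimated inverse curvature.
    x = x - gamma * g / (D + eps)

print(x)  # approaches the minimizer at the origin despite the 100:1 scaling
```

Here D tracks diag(A), so the badly scaled coordinates take comparably sized effective steps; in the distributed setting of the paper, each worker would additionally transmit only the compressed difference top_k(grad - g).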
DOI
10.1007/s10287-023-00496-6
Publication Date
12-24-2023
Keywords
Accelerated methods, First-order methods, Non-convex minimization
Recommended Citation
V. Pirau, A. Beznosikov, M. Takáč, V. Matyukhin and A. Gasnikov, "Preconditioning meets biased compression for efficient distributed optimization," Computational Management Science, vol. 21, no. 1, Dec 2023. doi: 10.1007/s10287-023-00496-6