
Works by Peter Richtárik in Physical Sciences and Mathematics

2024

A Damped Newton Method Achieves Global O(1/k²) and Local Quadratic Convergence Rate, Slavomír Hanzely, Dmitry Kamzolov, Dmitry Pasechnyuk, Alexander Gasnikov, Peter Richtárik, Martin Takáč
Martin Takáč

2023

Stochastic distributed learning with gradient quantization and double-variance reduction, Samuel Horváth, Dmitry Kovalev, Konstantin Mishchenko, Peter Richtárik, Sebastian Stich
Samuel Horváth

2022


Natural Compression for Distributed Deep Learning, Samuel Horváth, Chen-Yu Ho, Ľudovít Horváth, Atal Narayan Sahu, Marco Canini, Peter Richtárik
Machine Learning Faculty Publications

Adaptive Learning Rates for Faster Stochastic Gradient Methods, Samuel Horváth, Konstantin Mishchenko, Peter Richtárik
Machine Learning Faculty Publications

AI-SARAH: Adaptive and Implicit Stochastic Recursive Gradient Methods, Zheng Shi, Abdurakhmon Sadiev, Nicolas Loizou, Peter Richtárik, Martin Takáč
Martin Takáč

FedShuffle: Recipes for Better Use of Local Work in Federated Learning, Samuel Horváth, Maziar Sanjabi, Lin Xiao, Peter Richtárik, Michael Rabbat
Machine Learning Faculty Publications

Quasi-Newton Methods for Machine Learning: Forget the Past, Just Sample, Albert S. Berahas, Majid Jahani, Peter Richtárik, Martin Takáč
Martin Takáč

Doubly Adaptive Scaled Algorithm for Machine Learning Using Second-Order Information, Majid Jahani, Sergey Rusakov, Zheng Shi, Peter Richtárik, Michael W. Mahoney, Martin Takáč
Martin Takáč
