2024
A Damped Newton Method Achieves Global O(1/k²) and Local Quadratic Convergence Rate, Slavomír Hanzely, Dmitry Kamzolov, Dmitry Pasechnyuk, Alexander Gasnikov, Peter Richtárik, Martin Takáč
2023
Decentralized personalized federated learning: Lower bounds and optimal algorithm for all personalization modes, Abdurakhmon Sadiev, Ekaterina Borodich, Aleksandr Beznosikov, Darina Dvinskikh, Saveliy Chezhegov, Rachael Tappenden, Martin Takáč, Alexander Gasnikov
2022
Machine Learning Faculty Publications
Exploiting higher-order derivatives in convex optimization methods, Dmitry Kamzolov, Alexander Gasnikov, Pavel Dvurechensky, Artem Agafonov, Martin Takáč
On Scaled Methods for Saddle Point Problems, Aleksandr Beznosikov, Aibek Alanov, Dmitry Kovalev, Martin Takáč, Alexander Gasnikov
Algorithm for Constrained Markov Decision Process with Linear Convergence, Egor Gladin, Maksim Lavrik-Karmazin, Karina Zainullina, Varvara Rudenko, Alexander Gasnikov, Martin Takáč
Recent Theoretical Advances in Non-Convex Optimization, Marina Danilova, Pavel Dvurechensky, Alexander Gasnikov, Eduard Gorbunov, Sergey Guminov, Dmitry Kamzolov, Innokentiy Shibaev
2020