
Works by Martin Takáč in Statistics and Probability

2022

The Power Of First-Order Smooth Optimization for Black-Box Non-Smooth Problems, Alexander V. Gasnikov, Anton Novitskii, Vasilii Novitskii, Farshed Abdukhakimov, Dmitry Kamzolov, Aleksandr Beznosikov, Martin Takáč, Pavel Dvurechensky, Bin Gu

Quasi-Newton Methods for Machine Learning: Forget the Past, Just Sample, Albert S. Berahas, Majid Jahani, Peter Richtárik, Martin Takáč


Doubly Adaptive Scaled Algorithm for Machine Learning Using Second-Order Information, Majid Jahani, Sergey Rusakov, Zheng Shi, Peter Richtárik, Michael W. Mahoney, Martin Takáč

Random-Reshuffled SARAH Does Not Need a Full Gradient Computations, Aleksandr Beznosikov, Martin Takáč



2021

Random-Reshuffled SARAH Does Not Need a Full Gradient Computations, Aleksandr Beznosikov, Martin Takáč
Machine Learning Faculty Publications

Quasi-Newton Methods for Machine Learning: Forget the Past, Just Sample, Albert S. Berahas, Majid Jahani, Peter Richtárik, Martin Takáč
Machine Learning Faculty Publications

Doubly Adaptive Scaled Algorithm for Machine Learning Using Second-Order Information, Majid Jahani, Sergey Rusakov, Zheng Shi, Peter Richtárik, Michael W. Mahoney, Martin Takáč
Machine Learning Faculty Publications