Document Type
Conference Proceeding
Publication Title
Proceedings of Machine Learning Research
Abstract
Algorithms for min-max optimization and variational inequalities are often studied under monotonicity assumptions. Motivated by non-monotone machine learning applications, we follow the line of works (Diakonikolas et al., 2021; Lee & Kim, 2021; Pethick et al., 2022; Böhm, 2022) aiming at going beyond monotonicity by considering the weaker negative comonotonicity assumption. In this work, we provide tight complexity analyses for the Proximal Point (PP), Extragradient (EG), and Optimistic Gradient (OG) methods in this setup, closing several questions on their working guarantees beyond monotonicity. In particular, we derive the first non-asymptotic convergence rates for PP under negative comonotonicity and star-negative comonotonicity and show their tightness via constructing worst-case examples; we also relax the assumptions for the last-iterate convergence guarantees for EG and OG and prove the tightness of the existing best-iterate guarantees for EG and OG via constructing counter-examples.
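For context, the three methods named in the abstract are usually written with the following update rules; this is a standard textbook sketch (with operator F and step size gamma > 0), not the paper's own presentation:

\begin{align*}
  \text{(PP)}\quad & x_{k+1} = x_k - \gamma F(x_{k+1}), && \text{implicit (proximal) step} \\
  \text{(EG)}\quad & \tilde{x}_k = x_k - \gamma F(x_k), \quad x_{k+1} = x_k - \gamma F(\tilde{x}_k), && \text{extrapolation then update} \\
  \text{(OG)}\quad & x_{k+1} = x_k - \gamma\bigl(2F(x_k) - F(x_{k-1})\bigr). && \text{past-gradient correction}
\end{align*}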
First Page
11614
Last Page
11641
Publication Date
7-2023
Keywords
Comonotonicity; Complexity analysis; Extragradient; Gradient methods; Machine learning applications; Min-max optimization; Monotonicity; Optimistic gradient; Proximal point; Variational inequalities
Recommended Citation
E. Gorbunov, A. Taylor, S. Horvath and G. Gidel, "Convergence of Proximal Point and Extragradient-Based Methods Beyond Monotonicity: The Case of Negative Comonotonicity," Proceedings of Machine Learning Research, vol. 202, pp. 11614-11641, Jul. 2023.
Comments
Open Access version from PMLR
Uploaded on May 29, 2024