Document Type
Conference Proceeding
Publication Title
Proceedings of Machine Learning Research
Abstract
We revisit the acceleration of the noise-tolerant power method, for which, despite previous studies, the results remain unsatisfactory: they are either incorrect or suboptimal, and they lack generality. In this work, we present a simple yet general and optimal analysis via noise-corrupted Chebyshev polynomials. The analysis allows an iteration rank p larger than the target rank k, requires weaker noise conditions in a new form, and achieves the optimal iteration complexity (Equation presented) for some q satisfying k ≤ q ≤ p in a certain regime of the momentum parameter. Interestingly, it reveals a dynamic dependence of the noise tolerance on the spectral gap: linear at the beginning and square-root near convergence, while remaining commensurate with previous results in overall tolerance. We relate our new form of noise-norm conditions to the existing trigonometric one, which enables an improved analysis of generalized eigenspace computation and canonical correlation analysis. We conduct an extensive experimental study to showcase the strong performance of the considered algorithm with an iteration rank p > k across different applications.
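The accelerated power method discussed in the abstract augments plain power iteration with a heavy-ball momentum term, whose iterates follow Chebyshev polynomials of the matrix. Below is a minimal single-vector sketch of this momentum recurrence, not the paper's block algorithm: the test matrix, the classical momentum choice beta = lambda_2^2 / 4, and the joint rescaling are illustrative assumptions.

```python
import numpy as np

def momentum_power_method(A, beta, num_iters=100, seed=0):
    """Power iteration with heavy-ball momentum.

    Recurrence: x_{t+1} = A x_t - beta * x_{t-1}, followed by a joint
    rescaling of (x_t, x_{t+1}) so the iterates stay bounded. For a
    symmetric A, choosing beta near lambda_2^2 / 4 (lambda_2 = second
    eigenvalue) accelerates convergence to the top eigenvector.
    """
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    x_prev = np.zeros(n)
    x = rng.standard_normal(n)
    x /= np.linalg.norm(x)
    for _ in range(num_iters):
        x_next = A @ x - beta * x_prev
        # Rescale both iterates by the same factor: this preserves the
        # direction generated by the linear recurrence.
        scale = np.linalg.norm(x_next)
        x_prev = x / scale
        x = x_next / scale
    return x / np.linalg.norm(x)
```

For example, on a diagonal matrix with eigenvalues (3, 1, 0.5), running with beta = 1^2 / 4 = 0.25 rapidly aligns the iterate with the first coordinate axis, the top eigenvector.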
First Page
7147
Last Page
7175
Publication Date
4-2023
Keywords
Acceleration, Iterative methods, Machine learning, Polynomials
Recommended Citation
Z. Xu, "On the Accelerated Noise-Tolerant Power Method," in Proceedings of the 26th International Conference on Artificial Intelligence and Statistics (AISTATS 2023), PMLR, vol. 206, pp. 7147-7175, Apr. 2023. Available at: https://proceedings.mlr.press/v206/xu23g.html