Global Convergence of Iteratively Reweighted Least Squares for Robust Subspace Recovery
Preprint

Gilad Lerman, Kang Li, Tyler Maunu and Teng Zhang
arXiv (Cornell University)
06/29/2025
Handle:
https://hdl.handle.net/10192/74635

Abstract

Subjects: Computer Science - Learning; Mathematics - Optimization and Control; Statistics - Machine Learning
Robust subspace estimation is fundamental to many machine learning and data analysis tasks. Iteratively Reweighted Least Squares (IRLS) is an elegant and empirically effective approach to this problem, yet its theoretical properties remain poorly understood. This paper establishes that, under deterministic conditions, a variant of IRLS with dynamic smoothing regularization converges linearly to the underlying subspace from any initialization. We extend these guarantees to affine subspace estimation, a setting that lacks prior recovery theory. Additionally, we illustrate the practical benefits of IRLS through an application to low-dimensional neural network training. Our results provide the first global convergence guarantees for IRLS in robust subspace recovery and, more broadly, for nonconvex IRLS on a Riemannian manifold.
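To make the abstract's description concrete, the following is a minimal sketch of an IRLS iteration for robust subspace recovery with dynamic smoothing, assuming a standard formulation: each point is weighted inversely to its (smoothed) distance from the current subspace estimate, the subspace is re-fit by weighted PCA, and the smoothing parameter shrinks over iterations. The function name, parameters, and schedule below are illustrative choices, not the paper's exact algorithm.

```python
import numpy as np

def irls_subspace(X, d, n_iter=100, delta0=1e-1, decay=0.9, seed=0):
    """Illustrative IRLS sketch for robust subspace recovery.

    X : (N, D) data matrix; d : target subspace dimension.
    Points far from the current subspace are downweighted; the
    smoothing parameter delta shrinks each iteration, a simple
    stand-in for the paper's dynamic smoothing regularization.
    """
    rng = np.random.default_rng(seed)
    N, D = X.shape
    # random orthonormal initialization (global convergence from any
    # initialization is the property the paper establishes)
    V = np.linalg.qr(rng.standard_normal((D, d)))[0]
    delta = delta0
    for _ in range(n_iter):
        resid = X - (X @ V) @ V.T              # component off the subspace
        dist = np.linalg.norm(resid, axis=1)   # distance of each point
        w = 1.0 / np.maximum(dist, delta)      # smoothed inverse-distance weights
        # weighted PCA step: top-d eigenvectors of the weighted covariance
        C = (X * w[:, None]).T @ X
        _, eigvecs = np.linalg.eigh(C)         # eigenvalues in ascending order
        V = eigvecs[:, -d:]
        delta *= decay                         # dynamic smoothing schedule
    return V
```

Each iteration solves a weighted least-squares subspace fit, which is what makes the scheme "iteratively reweighted least squares"; the smoothing floor `delta` keeps the weights bounded when points lie exactly on the current subspace.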
