A Smoothing Newton Method for Rank-one Matrix Recovery
Preprint

Tyler Maunu and Gabriel Abreu
arXiv (Cornell University)
07/30/2025
Handle: https://hdl.handle.net/10192/70072

Abstract

Subjects: Computer Science - Learning; Mathematics - Optimization and Control; Statistics - Machine Learning
We consider the phase retrieval problem, which involves recovering a rank-one positive semidefinite matrix from rank-one measurements. A recently proposed algorithm based on Bures-Wasserstein gradient descent (BWGD) exhibits superlinear convergence, but it is unstable, and existing theory can only prove local linear convergence for higher-rank matrix recovery. We resolve this gap by revealing that BWGD implements Newton's method on a nonsmooth and nonconvex objective. We develop a smoothing framework that regularizes the objective, enabling a stable method with rigorous superlinear convergence guarantees. Experiments on synthetic data demonstrate the method's superior stability while maintaining fast convergence.
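The phase retrieval setup described in the abstract can be illustrated with a minimal sketch: measurements are squared inner products y_i = (a_i^T x)^2, i.e., rank-one probes of the rank-one matrix x x^T. The sketch below uses plain gradient descent on the factorized least-squares objective as a simple stand-in; it does not reproduce the paper's BWGD or smoothing Newton method, and all variable names, dimensions, and the step size are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 10, 80
x_star = rng.standard_normal(n)      # ground-truth signal (x_star @ x_star.T is the rank-one PSD target)
A = rng.standard_normal((m, n))      # rows are the measurement vectors a_i
y = (A @ x_star) ** 2                # phaseless rank-one measurements y_i = (a_i^T x)^2

def loss(u):
    # factorized least-squares objective (a common surrogate, not the paper's objective)
    return np.mean(((A @ u) ** 2 - y) ** 2) / 4.0

def grad(u):
    r = (A @ u) ** 2 - y
    return (A.T @ (r * (A @ u))) / m

u0 = x_star + 0.1 * rng.standard_normal(n)  # start near the truth (local regime)
u = u0.copy()
for _ in range(200):
    u -= 0.01 * grad(u)              # fixed step size, chosen for this toy scale

# recovery is only possible up to a global sign flip
err = min(np.linalg.norm(u - x_star), np.linalg.norm(u + x_star))
```

Note that the sign ambiguity in the last line is intrinsic: x and -x yield identical measurements, which is one source of the nonconvexity the abstract refers to.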


