Scholarship list
Conference presentation
Bures-Wasserstein Barycenters and Low-Rank Matrix Recovery
Date presented 01/04/2023
Joint Mathematics Meeting, 01/04/2023–01/07/2023, Boston, MA
We revisit the problem of recovering a low-rank positive semidefinite matrix from rank-one projections using tools from optimal transport. More specifically, we show that a variational formulation of this problem is equivalent to computing a Wasserstein barycenter. In turn, this new perspective enables the development of new geometric first-order methods with strong convergence guarantees in Bures-Wasserstein distance. Experiments on simulated data demonstrate the advantages of our new methodology over existing methods.
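The barycenter connection above can be illustrated concretely. The following is a minimal sketch, not the paper's algorithm: it implements the standard fixed-point iteration for the Bures-Wasserstein barycenter of symmetric positive definite matrices (viewed as covariances of centered Gaussians), using NumPy and SciPy. The function name and initialization are illustrative choices.

```python
import numpy as np
from scipy.linalg import sqrtm

def bw_barycenter(covs, weights=None, iters=100, tol=1e-10):
    """Fixed-point iteration for the Bures-Wasserstein barycenter of
    SPD matrices: S <- S^{-1/2} (sum_i w_i (S^{1/2} A_i S^{1/2})^{1/2})^2 S^{-1/2}.
    Illustrative sketch only, not the method from the presentation."""
    n = len(covs)
    if weights is None:
        weights = np.full(n, 1.0 / n)
    S = np.mean(covs, axis=0)  # initialize at the Euclidean mean
    for _ in range(iters):
        root = np.real(sqrtm(S))          # S^{1/2}
        inv_root = np.linalg.inv(root)    # S^{-1/2}
        # T = sum_i w_i (S^{1/2} A_i S^{1/2})^{1/2}
        T = sum(w * np.real(sqrtm(root @ A @ root))
                for w, A in zip(weights, covs))
        S_new = inv_root @ T @ T @ inv_root
        if np.linalg.norm(S_new - S) < tol:
            return S_new
        S = S_new
    return S
```

For commuting (e.g. diagonal) inputs the iteration reproduces the closed-form barycenter of one-dimensional Gaussians coordinatewise, which gives a quick sanity check.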
Conference presentation
The Geometry of Graph Projection, Interpolation, and Sketching
Date presented 09/28/2022
SIAM Conference on Mathematics of Data Science, 09/25/2022–09/30/2022
Conference presentation
Stochastic and Private Nonconvex Outlier-Robust PCA
Date presented 2022
Mathematical and Scientific Machine Learning, 08/15/2022–08/17/2022, Beijing
We develop theoretically guaranteed stochastic methods for outlier-robust PCA, which seeks an underlying low-dimensional linear subspace in a dataset corrupted by outliers. Through a novel convergence analysis, we show that our methods, which are variants of stochastic geodesic gradient descent over the Grassmannian manifold, converge and recover an underlying subspace in various regimes. The main application is an effective differentially private algorithm for outlier-robust PCA that applies a Gaussian noise mechanism within the stochastic gradient method. Our results highlight the advantages of these nonconvex methods over a convex approach to outlier-robust PCA in the differentially private setting. Experiments on synthetic and stylized data verify these results.
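The kind of update described above can be sketched as follows. This is a hypothetical illustration, not the presented algorithm: one step of geodesic-style gradient descent for least-absolute-deviations subspace recovery, where the Euclidean gradient is projected onto the tangent (horizontal) space, Gaussian noise is added as in a Gaussian mechanism, and a QR retraction returns the iterate to an orthonormal basis. The function name, step size, and noise scale are all illustrative.

```python
import numpy as np

def dp_robust_pca_step(V, X, lr=0.1, noise_scale=0.0, rng=None):
    """One noisy gradient step for outlier-robust subspace recovery.
    V: (D, d) orthonormal basis of the current subspace.
    X: (n, D) data matrix. Hypothetical sketch, not the paper's method."""
    rng = np.random.default_rng() if rng is None else rng
    D, d = V.shape
    # Residuals of each point from the subspace span(V)
    R = X - X @ V @ V.T                       # (n, D)
    norms = np.maximum(np.linalg.norm(R, axis=1), 1e-12)
    # Gradient of the least-absolute-deviations cost sum_i ||x_i - VV^T x_i||
    G = -(R / norms[:, None]).T @ (X @ V)     # (D, d)
    # Project onto the horizontal space (tangent directions orthogonal to V)
    G = (np.eye(D) - V @ V.T) @ G
    # Gaussian noise mechanism for differential privacy
    G += noise_scale * rng.standard_normal(G.shape)
    # Retract back to an orthonormal basis via QR
    Q, _ = np.linalg.qr(V - lr * G)
    return Q
```

The QR retraction is a common cheaper stand-in for an exact geodesic step on the Grassmannian; it preserves orthonormality of the basis after the noisy update.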