Paper ID: MLSP-18.2
Paper Title: Exact Linear Convergence Rate Analysis for Low-Rank Symmetric Matrix Completion via Gradient Descent
Authors: Trung Vu, Raviv Raich, Oregon State University, United States
Session: MLSP-18: Matrix Factorization and Applications
Location: Gather.Town
Session Time: Wednesday, 09 June, 14:00 - 14:45
Presentation Time: Wednesday, 09 June, 14:00 - 14:45
Presentation: Poster
Topic: Machine Learning for Signal Processing: [MLR-MFC] Matrix factorizations/completion
IEEE Xplore Open Preview: available in IEEE Xplore
Virtual Presentation: available in the Virtual Conference
Abstract:
Factorization-based gradient descent is a scalable and efficient algorithm for solving low-rank matrix completion. Recent progress in structured non-convex optimization has provided global convergence guarantees for gradient descent under certain statistical assumptions on the low-rank matrix and the sampling set. However, while the theory suggests that gradient descent enjoys fast linear convergence to a global solution, the universal nature of the bounding technique prevents it from yielding an accurate estimate of the rate of convergence. This paper performs a local analysis of the exact linear convergence rate of gradient descent for factorization-based symmetric matrix completion. Without any additional assumptions on the underlying model, we identify a deterministic condition, depending only on the solution matrix and the sampling set, that guarantees local convergence of gradient descent. More crucially, our analysis provides a closed-form expression for the asymptotic rate of convergence that matches exactly the linear convergence observed in practice. To the best of our knowledge, this is the first result to offer the exact linear convergence rate of gradient descent for factorization-based matrix completion in Euclidean space.
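The setting analyzed in the abstract can be illustrated with a small numerical sketch. The objective below is the standard factorization-based formulation f(X) = ½‖P_Ω(XXᵀ − M)‖_F², where Ω is a symmetric sampling set; the dimensions, sampling probability, step size, and initialization scale are illustrative assumptions and are not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 50, 2  # matrix size and rank (illustrative choices)

# Ground-truth low-rank symmetric matrix M = Z Z^T
Z = rng.standard_normal((n, r))
M = Z @ Z.T

# Symmetric sampling set Omega: each entry observed with probability p
p = 0.5
upper = np.triu(rng.random((n, n)) < p)
mask = upper | upper.T

def grad(X):
    """Gradient of f(X) = 0.5 * ||P_Omega(X X^T - M)||_F^2 (symmetric case)."""
    R = mask * (X @ X.T - M)  # residual on the observed entries
    return 2.0 * R @ X

# Gradient descent started near the solution (the local-analysis setting)
X = Z + 0.1 * rng.standard_normal((n, r))
eta = 0.25 / np.linalg.norm(M, 2)  # conservative step-size heuristic
errs = []
for _ in range(200):
    errs.append(np.linalg.norm(mask * (X @ X.T - M), 'fro'))
    X = X - eta * grad(X)

# Linear convergence: the per-iteration error ratio settles to a constant < 1
rate = errs[60] / errs[59]
print(f"initial error {errs[0]:.3e}, final error {errs[-1]:.3e}, "
      f"empirical rate ~ {rate:.4f}")
```

Plotting errs on a log scale would show the straight line characteristic of linear convergence; the paper's contribution is a closed-form expression for the slope of that line, whereas the empirical ratio above only estimates it numerically.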