**"Randomly Initialized Alternating Minimization: Fast Convergence in Low-Rank Matrix Recovery"**
#### Stöger, Dominik

In this talk, we will focus on the task of reconstructing rank-one matrices from random linear measurements, a problem that is ubiquitous in signal processing and machine learning.
We will focus on the Alternating Least Squares (ALS) method, which is popular among practitioners.
Although the ALS method has been studied in the literature for this non-convex problem, most works only show convergence of ALS from an initialization close to the true solution and thus require a carefully designed initialization scheme.
However, practitioners often prefer random initialization as it is model-agnostic.
In this talk, we will demonstrate that ALS with random initialization converges to the true rank-one solution with $\varepsilon$-accuracy in $O(\log n + \log(1/\varepsilon))$ iterations using only a near-optimal number of measurements. Here we assume that the measurement matrices have i.i.d. Gaussian entries, and we denote the ambient dimension by $n$.
We will establish that the convergence can be separated into two distinct phases.
In the first phase, starting from a near-orthogonal random initialization, the iterate gradually becomes more aligned with the true signal.
In the second phase, ALS converges linearly to the ground truth matrix.
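To make the setting concrete, the following is a minimal sketch of ALS with random initialization for rank-one matrix recovery, not the speaker's implementation: the ground truth is $M = u_\star v_\star^\top$, the measurements are $y_k = \langle A_k, M\rangle$ with i.i.d. Gaussian $A_k$, and each half-step solves a least-squares problem in one factor with the other held fixed. The dimensions and iteration count below are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 20, 400  # ambient dimension and number of measurements (illustrative values)

# Hypothetical rank-one ground truth M = u* v*^T with unit-norm factors.
u_star = rng.standard_normal(n); u_star /= np.linalg.norm(u_star)
v_star = rng.standard_normal(n); v_star /= np.linalg.norm(v_star)

# Measurement matrices with i.i.d. Gaussian entries; y_k = <A_k, u* v*^T> = u*^T A_k v*.
A = rng.standard_normal((m, n, n))
y = np.einsum('kij,i,j->k', A, u_star, v_star)

# ALS with random (model-agnostic) initialization of the right factor.
v = rng.standard_normal(n); v /= np.linalg.norm(v)
for _ in range(50):
    # Fix v, solve least squares for u: y_k ~ (A_k v)^T u.
    B = A @ v                                   # shape (m, n)
    u, *_ = np.linalg.lstsq(B, y, rcond=None)
    # Fix u, solve least squares for v: y_k ~ (A_k^T u)^T v.
    C = np.einsum('kij,i->kj', A, u)            # shape (m, n)
    v, *_ = np.linalg.lstsq(C, y, rcond=None)

# Error is measured on the outer product, since (cu, v/c) is the same matrix.
err = np.linalg.norm(np.outer(u, v) - np.outer(u_star, v_star))
```

Note that each factor is only identified up to scale, so the recovery error is evaluated on the reconstructed matrix $uv^\top$ rather than on the factors themselves.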
We will support our theory by numerical experiments, which confirm our predictions.