WNCG Seminar: Convex Shmonvex: Dropping Convexity for Faster Matrix Estimation
Fitting a low-rank matrix to data is a fundamental and widely used primitive in machine learning. For most problems beyond very basic PCA, theoretically sound methods have overwhelmingly combined statistical models of the data with convex optimization. As the size and dimensionality of data increase, this approach becomes computationally wasteful, not least because it represents an nr-dimensional object (a rank-r matrix with n rows) using n^2 parameters.
In this talk we present several of our recent results on understanding and designing much faster non-convex algorithms, and on characterizing their statistical performance.
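To make the contrast concrete, here is a minimal sketch (not taken from the talk itself) of the factored, non-convex approach the abstract alludes to: instead of optimizing over a full n x n matrix with n^2 parameters under a convex relaxation, one writes the low-rank matrix as U V^T with factors of size n x r (only 2nr parameters) and runs gradient descent directly on the resulting non-convex least-squares objective. All names and constants below (step size, iteration count, problem sizes) are illustrative choices, not the speaker's.

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 100, 5

# Synthetic ground-truth rank-r matrix to recover (an assumed toy setup).
U_star = rng.standard_normal((n, r))
V_star = rng.standard_normal((n, r))
M = U_star @ V_star.T

# Small random initialization of the two factors.
U = 0.1 * rng.standard_normal((n, r))
V = 0.1 * rng.standard_normal((n, r))

# Step size scaled by the spectral norm of M, a common heuristic.
eta = 0.5 / np.linalg.norm(M, 2)

# Gradient descent on f(U, V) = 0.5 * ||U V^T - M||_F^2,
# a non-convex objective in (U, V).
for _ in range(1000):
    R = U @ V.T - M  # residual
    U, V = U - eta * (R @ V), V - eta * (R.T @ U)

rel_err = np.linalg.norm(U @ V.T - M) / np.linalg.norm(M)
print(f"relative error: {rel_err:.2e}")
```

Each iteration costs O(n^2 r) time and the iterate is stored in O(nr) memory, versus the O(n^2) storage and typically much heavier per-iteration cost (e.g. full SVDs) of convex nuclear-norm solvers; the talk's results concern when and how fast such non-convex iterations provably succeed.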
Watch the presentation online on the WNCG YouTube Channel.