Applications of randomized sketching to subsampling, robust regression and linear algebra

Abstract: The fast Johnson-Lindenstrauss transform has triggered a large amount of research into fast randomized transforms that reduce data dimensionality while approximately preserving geometry. We discuss uses of these fast transforms in three situations. In the first, we use the transform to precondition a data matrix before subsampling, and show how, for huge data sets, this leads to substantial acceleration of algorithms such as PCA and k-means clustering. The second situation reconsiders the common problem of sketching for regression. We argue that it is more natural to switch to a robust model, and we further introduce a "partially sketched" variant. Finally, we present recent empirical work on using sketching to reduce the need for pivoting in the QR and LU decompositions.
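As a rough illustration of the sketch-and-solve idea for regression mentioned in the abstract, the following is a minimal Python sketch, not the speaker's implementation: it compresses a tall least-squares problem with a random projection and solves the smaller system. The matrix sizes, the use of a dense Gaussian sketch in place of a fast Johnson-Lindenstrauss transform, and all variable names are illustrative assumptions.

```python
import numpy as np

# Minimal sketch-and-solve least squares (illustrative sizes and names).
rng = np.random.default_rng(0)
n, d, m = 20000, 50, 500          # tall n x d problem, sketch size m << n

A = rng.standard_normal((n, d))
x_true = rng.standard_normal(d)
b = A @ x_true + 0.01 * rng.standard_normal(n)

# Dense Gaussian sketch; a fast transform (e.g. SRHT or CountSketch)
# would replace this multiply in practice to cut the projection cost.
S = rng.standard_normal((m, n)) / np.sqrt(m)

x_exact, *_ = np.linalg.lstsq(A, b, rcond=None)
x_sketch, *_ = np.linalg.lstsq(S @ A, S @ b, rcond=None)

print("relative error of sketched vs. exact solution:",
      np.linalg.norm(x_sketch - x_exact) / np.linalg.norm(x_exact))
```

In practice the dense Gaussian multiply would be replaced by a structured fast transform, such as a subsampled randomized Hadamard transform or CountSketch, so that forming the sketched problem costs roughly O(nd log n) or O(nnz(A)) rather than O(mnd).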
Event Status: Scheduled
Date and Time: Nov. 20, 2015, All Day