Speaker — Nicolas Keriven (Ecole Normale Supérieure)
Abstract — Many problems in machine learning and imaging can be framed as an infinite-dimensional Lasso problem to estimate a sparse measure. This includes, for instance, regression with a continuously parameterized dictionary, mixture model estimation, and super-resolution of images. To make the problem tractable, one typically sketches the observations using randomized projections (often called compressive sensing in imaging). In this work, we provide a comprehensive treatment of the recovery performance of this class of approaches, proving that (up to log factors) a number of sketches proportional to the sparsity is enough to identify the sought-after measure, with robustness to noise. We prove both exact support stability (the number of recovered atoms matches that of the measure of interest) and approximate stability (localization of the atoms) by extending two classical proof techniques (the minimal-norm dual certificate and the golfing-scheme certificate).
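For concreteness, the infinite-dimensional Lasso over measures (sometimes called the Beurling Lasso or BLASSO) with sketched observations can be written as below; the notation ($\mu$, $\mathcal{A}$, $\varphi_j$, $\lambda$, $m$) is illustrative and may differ from that used in the talk:

    % Sketched BLASSO: recover a sparse measure \mu on a parameter space \mathcal{X}
    % from m randomized linear measurements (sketches) of the observations.
    \min_{\mu \in \mathcal{M}(\mathcal{X})} \; \frac{1}{2} \left\| y - \mathcal{A}\mu \right\|_2^2 \;+\; \lambda \, |\mu|(\mathcal{X}),
    \qquad (\mathcal{A}\mu)_j = \int_{\mathcal{X}} \varphi_j(x) \, \mathrm{d}\mu(x), \quad j = 1, \dots, m,

where $|\mu|(\mathcal{X})$ is the total-variation norm of the measure (the continuous analogue of the $\ell_1$ norm) and the $\varphi_j$ are randomly drawn features. In this notation, the abstract's claim is that $m$ proportional to the sparsity of the target measure, up to log factors, suffices for stable recovery.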
Biography — Nicolas Keriven is currently a postdoctoral researcher at Ecole Normale Supérieure, in the CFM-ENS “Laplace” chair on data science. He organizes the Laplace reading group, and his research interests include compressive sensing, dimensionality reduction, learning, big data, and small data. He graduated from Ecole polytechnique (Palaiseau, France) and obtained the “Mathématiques, Vision, Apprentissage” (MVA) Master’s degree from Ecole Normale Supérieure de Cachan in 2014. He prepared his PhD thesis at IRISA, Rennes, France, under the supervision of Rémi Gribonval, and defended it in October 2017. He received the Best Student Paper Award at SPARS 2017 in Lisbon, Portugal. He plays the piano.