Dictionary learning for parsimonious representations.

Authors
  • GRIBONVAL Rémi
  • JENATTON Rodolphe
  • BACH Francis
  • KLEINSTEUBER Martin
  • SEIBERT Matthias
Publication date
2014
Publication type
Proceedings Article
Summary
A popular approach within the signal processing and machine learning communities consists in modelling high-dimensional data as sparse linear combinations of atoms selected from a dictionary. Given the importance of the choice of the dictionary for the operational deployment of these tools, a growing interest in learned dictionaries has emerged. The most popular dictionary learning techniques, which are expressed as large-scale matrix factorizations through the optimization of a non-convex cost function, have been widely disseminated thanks to extensive empirical evidence of their success and steady algorithmic progress. Yet, until recently they remained essentially heuristic. We will present recent work on statistical aspects of sparse dictionary learning, contributing to the characterization of the excess risk as a function of the number of training samples. The results cover not only sparse dictionary learning but also a much larger class of constrained matrix factorization problems.
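
As an illustration (generic textbook notation, not necessarily the exact formulation or notation of the paper), sparse dictionary learning is commonly posed as the l1-penalized matrix factorization

  \min_{D \in \mathcal{D}} \; \frac{1}{n} \sum_{i=1}^{n} \min_{\alpha_i} \left( \frac{1}{2} \| x_i - D \alpha_i \|_2^2 + \lambda \| \alpha_i \|_1 \right),

and, writing f_x(D) = \min_{\alpha} \frac{1}{2} \| x - D \alpha \|_2^2 + \lambda \| \alpha \|_1 for the per-sample cost, the excess risk of a dictionary \hat{D}_n learned from n training samples is

  \mathbb{E}_x\big[ f_x(\hat{D}_n) \big] \;-\; \min_{D \in \mathcal{D}} \mathbb{E}_x\big[ f_x(D) \big],

where \mathcal{D} denotes the constraint set (for instance, dictionaries with unit-norm atoms) and \lambda is an assumed sparsity-inducing regularization parameter.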