PEYRÉ Gabriel

Affiliations
  • 2014 - 2021
    Département de mathématiques et applications de l'ENS
  • 2015 - 2021
    Centre national de la recherche scientifique
  • 2013 - 2019
    Avancées en calcul numérique des variations
  • 2014 - 2019
    Centre de recherche Inria de Paris
  • 2004 - 2018
    Centre de recherches en mathématiques de la décision
  • 2012 - 2016
    Université Paris-Dauphine
  • 2004 - 2005
    École Polytechnique
  • Momentum Residual Neural Networks.

    Michael E. SANDER, Pierre ABLIN, Mathieu BLONDEL, Gabriel PEYRÉ
    2021
    The training of deep residual neural networks (ResNets) with backpropagation has a memory cost that increases linearly with respect to the depth of the network. A simple way to circumvent this issue is to use reversible architectures. In this paper, we propose to change the forward rule of a ResNet by adding a momentum term. The resulting networks, momentum residual neural networks (MomentumNets), are invertible. Unlike previous invertible architectures, they can be used as a drop-in replacement for any existing ResNet block. We show that MomentumNets can be interpreted in the infinitesimal step size regime as second-order ordinary differential equations (ODEs) and exactly characterize how adding momentum progressively increases the representation capabilities of MomentumNets. Our analysis reveals that MomentumNets can learn any linear mapping up to a multiplicative factor, while ResNets cannot. In a learning to optimize setting, where convergence to a fixed point is required, we show theoretically and empirically that our method succeeds while existing invertible architectures fail. We show on CIFAR and ImageNet that MomentumNets have the same accuracy as ResNets, while having a much smaller memory footprint, and show that pre-trained MomentumNets are promising for fine-tuning models.
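The key idea — adding a momentum (velocity) term to the ResNet forward rule so that each block becomes exactly invertible, letting activations be recomputed during backpropagation instead of stored — can be illustrated with a minimal NumPy sketch. This is an assumption-laden toy, not the paper's implementation: the residual function `f`, the momentum coefficient `gamma`, and the specific update `v' = gamma*v + (1-gamma)*f(x)`, `x' = x + v'` are illustrative choices here.

```python
import numpy as np

GAMMA = 0.9  # illustrative momentum coefficient in (0, 1]

def f(x):
    # Hypothetical residual function; stands in for a learned ResNet block.
    return np.tanh(x)

def momentum_forward(x, v, gamma=GAMMA):
    # Forward rule with a momentum term: update the velocity, then the state.
    v_new = gamma * v + (1 - gamma) * f(x)
    x_new = x + v_new
    return x_new, v_new

def momentum_inverse(x_new, v_new, gamma=GAMMA):
    # Exact inversion: recover (x, v) from (x_new, v_new).
    # This is what removes the need to store activations during training.
    x = x_new - v_new
    v = (v_new - (1 - gamma) * f(x)) / gamma
    return x, v

x0 = np.array([0.5, -1.0, 2.0])
v0 = np.zeros_like(x0)
x1, v1 = momentum_forward(x0, v0)
x0_rec, v0_rec = momentum_inverse(x1, v1)
assert np.allclose(x0, x0_rec) and np.allclose(v0, v0_rec)
```

Because the inverse reconstructs the input of each block from its output, the memory cost no longer grows linearly with depth, which is the property the abstract highlights.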
Affiliations are detected from the signatures of publications identified in scanR. An author can therefore appear to be affiliated with several structures or supervisors according to these signatures. The dates displayed correspond only to the dates of the publications found. For more information, see https://scanr.enseignementsup-recherche.gouv.fr