KERDREUX Thomas

Affiliations
  • 2019 - 2021
    Département d'Informatique de l'Ecole Normale Supérieure
  • 2019 - 2020
    Ecole normale supérieure Paris
  • 2020 - 2021
    Technical University of Berlin
  • 2019 - 2020
    Apprentissage statistique et parcimonie
  • 2019 - 2020
    Sciences Mathématiques de Paris Centre
Publications
  • Linear Bandits on Uniformly Convex Sets.

    Thomas KERDREUX, Christophe ROUX, Alexandre D'ASPREMONT, Sebastian POKUTTA
    2021
    Linear bandit algorithms yield $\tilde{\mathcal{O}}(n\sqrt{T})$ pseudo-regret bounds on compact convex action sets $\mathcal{K}\subset\mathbb{R}^n$, and two types of structural assumptions lead to better pseudo-regret bounds. When $\mathcal{K}$ is the simplex or an $\ell_p$ ball with $p\in]1,2]$, there exist bandit algorithms with $\tilde{\mathcal{O}}(\sqrt{nT})$ pseudo-regret bounds. Here, we derive bandit algorithms for some strongly convex sets beyond $\ell_p$ balls that enjoy pseudo-regret bounds of $\tilde{\mathcal{O}}(\sqrt{nT})$, which answers an open question from [BCB12, § 5.5]. Interestingly, when the action set is uniformly convex but not necessarily strongly convex, we obtain pseudo-regret bounds with a dimension dependency smaller than $\mathcal{O}(\sqrt{n})$. However, this comes at the expense of asymptotic rates in $T$ varying between $\tilde{\mathcal{O}}(\sqrt{T})$ and $\tilde{\mathcal{O}}(T)$.
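    For orientation, the pseudo-regret in this setting is commonly defined, following the convention of [BCB12] rather than any formulation specific to this paper, as $\bar{R}_T=\max_{a\in\mathcal{K}}\mathbb{E}\big[\sum_{t=1}^{T}\langle \ell_t, a_t-a\rangle\big]$, where $a_t\in\mathcal{K}$ is the action played at round $t$ and $\ell_t\in\mathbb{R}^n$ the loss vector. A standard way to state the structural assumption is that $\mathcal{K}$ is $(\alpha,q)$-uniformly convex with respect to a norm $\|\cdot\|$ when $\gamma x+(1-\gamma)y+\alpha\,\gamma(1-\gamma)\|x-y\|^{q}\,z\in\mathcal{K}$ for all $x,y\in\mathcal{K}$, $\gamma\in[0,1]$ and $\|z\|\le 1$; the case $q=2$ is strong convexity of the set.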
  • Local and Global Uniform Convexity Conditions.

    Thomas KERDREUX, Alexandre D'ASPREMONT, Sebastian POKUTTA
    2021
    We review various characterizations of uniform convexity and smoothness on norm balls in finite-dimensional spaces and connect results stemming from the geometry of Banach spaces with the scaling inequalities used in analysing the convergence of optimization methods. In particular, we establish local versions of these conditions to provide sharper insights on a recent body of complexity results in learning theory, online learning, and offline optimization, which rely on the strong convexity of the feasible set. While they have a significant impact on complexity, these strong convexity or uniform convexity properties of feasible sets are not exploited as thoroughly as their functional counterparts, and this work is an effort to correct this imbalance. We conclude with some practical examples in optimization and machine learning where leveraging these conditions and localized assumptions leads to new complexity results.
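    As background on the Banach-space notion the abstract relies on (stated here in its textbook form, which may differ in constants from the paper's), the modulus of convexity of a norm $\|\cdot\|$ is $\delta(\epsilon)=\inf\{1-\|\tfrac{x+y}{2}\| : \|x\|\le 1,\ \|y\|\le 1,\ \|x-y\|\ge\epsilon\}$; the unit ball is uniformly convex when $\delta(\epsilon)>0$ for every $\epsilon\in(0,2]$, and $q$-uniformly convex when $\delta(\epsilon)\ge C\epsilon^{q}$ for some constant $C>0$. For instance, $\ell_p$ norms are $2$-uniformly convex for $p\in(1,2]$ and $p$-uniformly convex for $p\ge 2$.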
  • Accelerating conditional gradient methods.

    Thomas KERDREUX, Alexandre D'ASPREMONT, Francis BACH, Mikael JOHANSSON, Zaid HARCHAOUI, Sebastian POKUTTA, Martin JAGGI
    2020
    Frank-Wolfe algorithms are methods for solving constrained optimization problems. They decompose a non-linear problem into a series of linear problems, which makes them the methods of choice for high-dimensional optimization and explains their use in many applied fields. Here we propose new Frank-Wolfe algorithms that converge more quickly to the solution of the optimization problem under certain fairly generic structural assumptions. In particular, we show that, unlike other types of algorithms, this family adapts to these assumptions without having to specify the parameters that control them.
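    As a point of reference for the linear subproblems mentioned above, the following is a minimal sketch of the vanilla Frank-Wolfe iteration over an $\ell_1$ ball (a generic textbook variant written in Python, not one of the accelerated schemes proposed in this work; the function names and the least-squares example are illustrative only).

        import numpy as np

        def lmo_l1_ball(grad, radius=1.0):
            """Linear minimization oracle over the l1 ball:
            argmin_{||v||_1 <= radius} <grad, v> is attained at a signed vertex."""
            i = np.argmax(np.abs(grad))
            v = np.zeros_like(grad)
            v[i] = -radius * np.sign(grad[i])
            return v

        def frank_wolfe(grad_f, x0, lmo, n_iters=200):
            """Vanilla Frank-Wolfe with the classical 2/(t+2) step size."""
            x = x0.copy()
            for t in range(n_iters):
                g = grad_f(x)
                v = lmo(g)                       # linear subproblem on the constraint set
                gamma = 2.0 / (t + 2.0)          # agnostic step size; adaptive rules also exist
                x = (1 - gamma) * x + gamma * v  # convex combination stays feasible
            return x

        # Example: least squares min ||Ax - b||^2 over the l1 ball of radius 1.
        rng = np.random.default_rng(0)
        A, b = rng.standard_normal((20, 5)), rng.standard_normal(20)
        x_hat = frank_wolfe(lambda x: 2 * A.T @ (A @ x - b), np.zeros(5), lmo_l1_ball)

    Each iteration only calls the linear minimization oracle and takes a convex combination, which is why the method never requires a projection onto the constraint set.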
  • Accelerating conditional gradient methods.

    Thomas KERDREUX
    2020
    The Frank-Wolfe algorithms, also known as conditional gradient algorithms, solve constrained optimization problems. They break down a non-linear problem into a series of linear minimizations over the constraint set. This contributes to their recent revival in many applied domains, in particular those involving large-scale optimization problems. In this dissertation, we design or analyze versions of the Frank-Wolfe algorithms. We notably show that, contrary to other types of algorithms, this family is adaptive to a broad spectrum of structural assumptions, without the need to know and specify the parameters controlling these assumptions.
Affiliations are detected from the signatures of publications identified in scanR. An author can therefore appear to be affiliated with several structures or supervisors according to these signatures. The dates displayed correspond only to the dates of the publications found. For more information, see https://scanr.enseignementsup-recherche.gouv.fr