Accelerating conditional gradient methods.

Publication date
2020
Publication type
Thesis
Summary Frank-Wolfe algorithms are methods for solving constrained optimization problems. They decompose a non-linear constrained problem into a sequence of linear subproblems. This makes them methods of choice for high-dimensional optimization and explains their use in many applied fields. Here we propose new Frank-Wolfe algorithms that converge more quickly to the solution of the optimization problem under certain fairly generic structural assumptions. In particular, we show that, unlike other types of algorithms, this family adapts to these assumptions without requiring the parameters that control them to be specified.
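As a point of reference for the decomposition the summary describes, below is a minimal sketch of the classical Frank-Wolfe (conditional gradient) iteration, in which each step solves a linear subproblem over the constraint set via a linear minimization oracle. This is not the accelerated variants proposed in the thesis; the objective (least squares) and the constraint set (probability simplex), as well as the step-size rule, are illustrative assumptions.

```python
import numpy as np

def frank_wolfe(grad_f, lmo, x0, max_iter=200):
    """Classical Frank-Wolfe (conditional gradient) iteration.

    grad_f : callable returning the gradient of the objective at x
    lmo    : linear minimization oracle; returns argmin_{s in C} <g, s>
    x0     : feasible starting point in the constraint set C
    """
    x = x0.copy()
    for t in range(max_iter):
        g = grad_f(x)
        s = lmo(g)                          # solve the linear subproblem over C
        gamma = 2.0 / (t + 2.0)             # standard open-loop step size
        x = (1.0 - gamma) * x + gamma * s   # convex combination stays feasible
    return x

# Illustrative instance (assumption): least squares over the probability simplex.
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 10))
b = rng.standard_normal(30)

grad = lambda x: A.T @ (A @ x - b)          # gradient of 0.5 * ||Ax - b||^2

def simplex_lmo(g):
    # Minimizing a linear function over the simplex selects a vertex:
    # the coordinate with the smallest gradient entry.
    s = np.zeros_like(g)
    s[np.argmin(g)] = 1.0
    return s

x0 = np.full(10, 0.1)                       # uniform point in the simplex
x_hat = frank_wolfe(grad, simplex_lmo, x0)
print("objective:", 0.5 * np.linalg.norm(A @ x_hat - b) ** 2)
```

The linear minimization oracle is what makes the method attractive in high dimension: for structured sets such as simplices or norm balls it returns a sparse vertex in closed form, so no projection step is ever needed.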