Efficient variable selection by coordinate ascent with theoretical guarantees.

Authors
Publication date
2019
Publication type
report
Summary
Despite the advent of representation learning, mainly through deep learning, feature selection remains a key element of many machine learning scenarios. This paper presents a new, theoretically motivated feature selection method. The approach treats feature selection as a coordinate optimization problem that takes the dependencies between variables into account, materializing these dependencies as blocks. The small number of iterations needed for the method to converge attests to the efficiency of gradient boosting methods (e.g. the XGBoost algorithm) on these supervised learning problems. For convex and smooth objectives, we prove that the convergence rate is polynomial in the dimension of the complete feature set. We compare the results with two state-of-the-art feature selection methods, Recursive Feature Elimination (RFE) and Binary Coordinate Ascent (BCA), and show that the new method is competitive.
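
The report itself is not excerpted here, so the following is only a minimal sketch of the general idea, not the authors' algorithm: a greedy block coordinate ascent over a binary feature-inclusion mask, assuming a cross-validated logistic-regression score as the objective and contiguous index blocks standing in for the paper's dependency-based blocks. The function block_coordinate_ascent and all of its parameters are hypothetical.

import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

def block_coordinate_ascent(X, y, n_blocks=5, max_sweeps=10, seed=0):
    """Greedy block coordinate ascent over a binary feature-inclusion mask."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Placeholder blocks: contiguous index groups stand in for the
    # dependency-based blocks described in the abstract.
    blocks = np.array_split(np.arange(d), n_blocks)
    mask = np.ones(d, dtype=bool)  # start from the complete feature set

    def score(m):
        # Objective: mean cross-validated accuracy of a simple model
        # trained on the currently selected features.
        if not m.any():
            return -np.inf
        model = LogisticRegression(max_iter=1000)
        return cross_val_score(model, X[:, m], y, cv=3).mean()

    best = score(mask)
    for _ in range(max_sweeps):
        improved = False
        for block in blocks:
            # Coordinate updates within a block: flip each feature's bit
            # and keep the flip only if the score improves.
            for j in rng.permutation(block):
                mask[j] = ~mask[j]
                s = score(mask)
                if s > best:
                    best, improved = s, True
                else:
                    mask[j] = ~mask[j]  # revert the flip
        if not improved:  # a full sweep with no gain: the ascent has converged
            break
    return mask, best

# Example on synthetic data.
X, y = make_classification(n_samples=300, n_features=20, n_informative=5,
                           random_state=0)
mask, cv_score = block_coordinate_ascent(X, y)
print(f"selected {mask.sum()} of {mask.size} features, CV accuracy {cv_score:.3f}")

Exploring coordinates block by block is what distinguishes this scheme from plain coordinate ascent such as BCA, which visits all features in a single flat order.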