Accelerated Gradient Boosting

Abstract: Gradient tree boosting is a prediction algorithm that sequentially produces a model in the form of linear combinations of decision trees, by solving an infinite-dimensional optimization problem. We combine gradient boosting and Nesterov's accelerated descent to design a new algorithm, which we call AGB (for Accelerated Gradient Boosting). Substantial numerical evidence is provided on both synthetic and real-life data sets to assess the excellent performance of the method in a large variety of prediction problems. It is empirically shown that AGB is much less sensitive to the shrinkage parameter and outputs predictors that are considerably more sparse in the number of trees, while retaining the exceptional performance of gradient boosting.
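The abstract describes AGB as ordinary gradient tree boosting in which each boosting step is taken from a Nesterov-style momentum sequence rather than from the current model. The sketch below illustrates that idea for squared-error loss only; it is not the authors' reference implementation. The function names (agb_fit, agb_predict), the FISTA-style momentum schedule, and all default parameter values are assumptions made for this example.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor


def agb_fit(X, y, n_rounds=200, shrinkage=0.1, max_depth=3):
    """Accelerated gradient boosting sketch for squared-error loss.

    Returns the initial constant, the fitted trees, and the weight of each
    tree in the final model.  Names and defaults are illustrative, not the
    authors' reference implementation.
    """
    n = len(y)
    f0 = float(np.mean(y))
    F = np.full(n, f0)        # training predictions of the main sequence F_t
    G = F.copy()              # training predictions of the momentum sequence G_t
    trees = []
    wF = np.zeros(n_rounds)   # weight of each tree in F_t
    wG = np.zeros(n_rounds)   # weight of each tree in G_t
    lam = 1.0                 # Nesterov/FISTA-style momentum schedule (assumed)
    for t in range(n_rounds):
        # Fit a tree to the negative gradient of the loss evaluated at G_t,
        # i.e. the residuals y - G_t for squared error.
        tree = DecisionTreeRegressor(max_depth=max_depth)
        tree.fit(X, y - G)
        trees.append(tree)
        h = tree.predict(X)

        # Ordinary gradient-boosting step, but taken from the momentum iterate G_t.
        F_new = G + shrinkage * h
        wF_new = wG.copy()
        wF_new[t] += shrinkage

        # Nesterov extrapolation of the last two F iterates.
        lam_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * lam * lam))
        gamma = (lam - 1.0) / lam_new
        G = F_new + gamma * (F_new - F)
        wG = wF_new + gamma * (wF_new - wF)

        F, wF, lam = F_new, wF_new, lam_new
    return f0, trees, wF


def agb_predict(model, X):
    """Evaluate the final model (a weighted sum of the trees) on new data."""
    f0, trees, weights = model
    H = np.column_stack([tree.predict(X) for tree in trees])
    return f0 + H @ weights
```

For instance, model = agb_fit(X_train, y_train) followed by agb_predict(model, X_test) gives the fit after n_rounds trees; in practice the number of rounds (or a stopping rule) would be chosen by validation, which is where the reported insensitivity to the shrinkage parameter and the sparser ensembles matter.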
Document type:
Journal article
Machine Learning, Springer Verlag, in press. DOI: 10.1007/s10994-019-05787-1

https://hal.sorbonne-universite.fr/hal-01723843
Contributor: Laurent Rouvière
Submitted on: Tuesday, March 5, 2019 - 16:01:26
Last modified on: Monday, March 18, 2019 - 16:01:31

Files

biau-cadre-rouviere-rev.pdf
Files produced by the author(s)

Citation

Gérard Biau, Benoît Cadre, Laurent Rouvière. Accelerated Gradient Boosting. Machine Learning, Springer Verlag, in press. DOI: 10.1007/s10994-019-05787-1. HAL Id: hal-01723843v2.
