Infinitesimal gradient boosting - Archive ouverte HAL
Preprints, Working Papers, ... Year: 2021

Infinitesimal gradient boosting

Abstract

We define infinitesimal gradient boosting as a limit of the popular tree-based gradient boosting algorithm from machine learning. The limit is considered in the vanishing-learning-rate asymptotic, that is, when the learning rate tends to zero and the number of gradient trees is rescaled accordingly. For this purpose, we introduce a new class of randomized regression trees, bridging totally randomized trees and Extra Trees, that uses a softmax distribution for binary splitting. Our main result is the convergence of the associated stochastic algorithm and the characterization of the limiting procedure as the unique solution of a nonlinear ordinary differential equation in an infinite-dimensional function space. Infinitesimal gradient boosting defines a smooth path in the space of continuous functions along which the training error decreases, the residuals remain centered, and the total variation is well controlled.
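To make the regime described in the abstract concrete, here is a minimal, illustrative Python sketch (not the authors' code): squared-loss gradient boosting run for round(T / nu) trees, so that shrinking the learning rate nu rescales the number of trees and the ensemble tracks the boosting path up to "time" T. The softmax split rule below (temperature parameter beta, candidate splits scored by variance reduction, depth-1 trees on a single feature) is an assumption chosen for illustration, not the paper's exact construction; all names (softmax_split, fit_stump, boost, beta, nu, T) are hypothetical. Setting beta = 0 recovers uniformly random splits (totally randomized trees), while large beta concentrates on the best-scoring candidate split, as in Extra Trees.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax_split(x, r, beta, n_candidates=10):
    """Sample a binary split threshold from a softmax distribution over
    candidate splits, scored by the variance reduction of the residuals r.
    beta = 0 gives a uniformly random split (totally randomized trees);
    large beta concentrates on the best candidate (Extra-Trees-like).
    Assumes x takes at least two distinct values."""
    thresholds = rng.uniform(x.min(), x.max(), size=n_candidates)
    scores = np.empty(n_candidates)
    for i, t in enumerate(thresholds):
        left, right = r[x <= t], r[x > t]
        # impurity decrease: variance of r minus weighted child variances
        scores[i] = np.var(r) - (left.size * np.var(left)
                                 + right.size * np.var(right)) / r.size
    probs = np.exp(beta * (scores - scores.max()))  # softmax over scores
    probs /= probs.sum()
    return thresholds[rng.choice(n_candidates, p=probs)]

def fit_stump(x, r, beta):
    """Fit a depth-1 regression tree (stump) to the residuals r."""
    t = softmax_split(x, r, beta)
    left = x <= t
    return t, r[left].mean(), r[~left].mean()

def predict_stump(stump, x):
    t, value_left, value_right = stump
    return np.where(x <= t, value_left, value_right)

def boost(x, y, nu, T, beta):
    """Squared-loss gradient boosting run up to 'time' T: the number of
    trees is round(T / nu), so shrinking nu rescales the tree count."""
    F = np.zeros_like(y)
    for _ in range(int(round(T / nu))):
        r = y - F  # negative gradient of the squared loss
        F += nu * predict_stump(fit_stump(x, r, beta), x)
    return F

# Toy illustration: as nu decreases the fitted predictor stabilizes.
x = rng.uniform(0.0, 1.0, size=200)
y = np.sin(2 * np.pi * x) + 0.1 * rng.normal(size=200)
for nu in (0.1, 0.01, 0.001):
    F = boost(x, y, nu=nu, T=1.0, beta=5.0)
    print(f"nu = {nu:5.3f}   train MSE = {np.mean((y - F) ** 2):.4f}")
```

Schematically, as nu tends to zero each unit of "time" averages over the randomness of the fitted trees, so one expects the path t ↦ F_t to approach the solution of a nonlinear ODE of the form dF_t/dt = 𝒯(F_t), where 𝒯(F) denotes the expected tree fitted to the current residuals y − F. This is a heuristic rendering of the limit characterized in the paper, not its precise statement.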
Main file
ms_hal.pdf (657.05 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-03209503, version 1 (27-04-2021)

Identifiers

  • HAL Id: hal-03209503, version 1

Cite

Clément Dombry, Jean-Jil Duchamps. Infinitesimal gradient boosting. 2021. ⟨hal-03209503⟩