Preprint, working paper. Year: 2023

When Analytic Calculus Cracks AdaBoost Code

Abstract

The principle of boosting in supervised learning consists in combining several weak classifiers to obtain a stronger one. AdaBoost is reputed to be a perfect example of this approach. We have previously shown that AdaBoost is not truly an optimization algorithm. This paper shows that AdaBoost is an algorithm in name only, as the resulting combination of weak classifiers can be calculated explicitly from a truth table. The study considers a two-class problem, is illustrated on the particular case of three binary classifiers, and compares its results with those obtained from the AdaBoost implementation in the Python library scikit-learn.
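
For illustration, here is a minimal sketch (in Python, assuming scikit-learn >= 1.2 for the estimator keyword) of the comparison mentioned above: three depth-1 decision stumps are boosted with scikit-learn's AdaBoostClassifier on a toy two-class problem, and the resulting strong classifier is tabulated as a function of the three weak outputs alone, i.e. read off a truth table. The dataset and parameter choices are illustrative, not the paper's exact setup.

    # Illustrative sketch: boost three decision stumps with scikit-learn's
    # AdaBoostClassifier (discrete "SAMME" variant) on a toy two-class problem,
    # then read the combined classifier off a truth table of the stumps' outputs.
    import numpy as np
    from sklearn.ensemble import AdaBoostClassifier
    from sklearn.tree import DecisionTreeClassifier

    # Toy two-class data in one dimension (illustrative, not the paper's data).
    rng = np.random.default_rng(0)
    X = rng.uniform(-1.0, 1.0, size=(200, 1))
    y = np.where(X[:, 0] + 0.1 * rng.normal(size=200) > 0.0, 1, -1)

    # Three binary weak classifiers: depth-1 trees (stumps).
    ada = AdaBoostClassifier(
        estimator=DecisionTreeClassifier(max_depth=1),  # base_estimator in scikit-learn < 1.2
        n_estimators=3,
        algorithm="SAMME",
    )
    ada.fit(X, y)

    # Tabulate the ensemble decision against the triple of weak outputs:
    # identical triples always yield the same strong prediction, so the
    # combination is fully described by a truth table.
    weak_outputs = np.column_stack([h.predict(X) for h in ada.estimators_])
    strong_pred = ada.predict(X)
    table = {}
    for pattern, pred in zip(map(tuple, weak_outputs), strong_pred):
        assert table.setdefault(pattern, pred) == pred
    for pattern, pred in sorted(table.items()):
        print(pattern, "->", pred)

With n_estimators=3, the printed table has at most 2^3 = 8 rows (only patterns realized by the data appear), matching the three-classifier case considered in the paper.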

Main file
2308.01070.pdf (754.16 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-04177234, version 1 (04-08-2023)

Cite

Jean-Marc Brossier, Olivier Lafitte, Lenny Réthoré. When Analytic Calculus Cracks AdaBoost Code. 2023. ⟨hal-04177234⟩