Combining weak classifiers: a logical analysis
Abstract
A classical idea in supervised learning is to build a strong classifier by combining weak classifiers; one of its most famous implementations is the AdaBoost algorithm. We study the cost (objective) function associated with this problem in the particular case of three classifiers and two classes. To this end, we formulate an original representation of the cost, based on a truth table, as the cornerstone of this study, leading to an optimization algorithm. We prove that, under reasonable hypotheses, this cost function has a unique minimum. We study the properties of this minimum and the validity of the hypotheses. We identify the resulting classifier directly through a mathematical analysis, without the need to run a numerical algorithm. AdaBoost is not an optimization algorithm, but we propose here to use a closely related classical minimization algorithm, the relaxation algorithm, and we present a simulation example in which relaxation yields a more convincing result.
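For orientation, a minimal sketch of the cost typically minimized in this setting, assuming the standard exponential loss underlying AdaBoost (the paper's exact formulation, built from the truth-table representation, may differ); here $h_1, h_2, h_3$ denote the weak classifiers, $\alpha_j$ their weights, and $(x_i, y_i)$ the training pairs, all illustrative notation rather than the paper's own:
\[
C(\alpha_1, \alpha_2, \alpha_3) \;=\; \sum_{i=1}^{n} \exp\!\Bigl(-\,y_i \sum_{j=1}^{3} \alpha_j\, h_j(x_i)\Bigr),
\qquad y_i,\, h_j(x_i) \in \{-1, +1\}.
\]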