Conference Paper, 2022

Why do tree-based models still outperform deep learning on typical tabular data?

Abstract

While deep learning has enabled tremendous progress on text and image datasets, its superiority on tabular data is not clear. We contribute extensive benchmarks of standard and novel deep learning methods as well as tree-based models such as XGBoost and Random Forests, across a large number of datasets and hyperparameter combinations. We define a standard set of 45 datasets from varied domains with clear characteristics of tabular data and a benchmarking methodology accounting for both fitting models and finding good hyperparameters. Results show that tree-based models remain state-of-the-art on medium-sized data (∼10K samples) even without accounting for their superior speed. To understand this gap, we conduct an empirical investigation into the differing inductive biases of tree-based models and neural networks. This leads to a series of challenges which should guide researchers aiming to build tabular-specific neural networks: 1. be robust to uninformative features, 2. preserve the orientation of the data, and 3. be able to easily learn irregular functions. To stimulate research on tabular architectures, we contribute a standard benchmark and raw data for baselines: every point of a 20,000 compute-hour hyperparameter search for each learner.
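To make the benchmarking methodology concrete, here is a minimal sketch (not the authors' benchmark code) that mirrors its two ingredients, fitting both model families and searching their hyperparameters, on a synthetic medium-sized tabular task with many uninformative features. It uses scikit-learn's RandomForestClassifier and MLPClassifier in place of the paper's full model set; the dataset, search spaces, and budgets are illustrative assumptions.

```python
# A minimal sketch comparing a tree-based model with a simple neural
# network on synthetic tabular data, using a small random
# hyperparameter search. All settings below are illustrative.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import RandomizedSearchCV, train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Medium-sized task (~10K samples); only 10 of 50 features are
# informative, mimicking the "uninformative features" challenge.
X, y = make_classification(
    n_samples=10_000, n_features=50, n_informative=10, random_state=0
)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

searches = {
    "random_forest": RandomizedSearchCV(
        RandomForestClassifier(random_state=0),
        {"n_estimators": [100, 300], "max_depth": [None, 8, 16]},
        n_iter=5,
        random_state=0,
    ),
    "mlp": RandomizedSearchCV(
        # Neural networks need feature scaling; trees do not.
        make_pipeline(StandardScaler(), MLPClassifier(max_iter=500, random_state=0)),
        {
            "mlpclassifier__hidden_layer_sizes": [(64,), (128, 64)],
            "mlpclassifier__alpha": [1e-4, 1e-2],
        },
        n_iter=4,
        random_state=0,
    ),
}

for name, search in searches.items():
    search.fit(X_train, y_train)
    print(name, search.score(X_test, y_test))
```

On data of this kind, the tree-based model typically reaches higher test accuracy with far less tuning, which is the gap the paper quantifies at scale.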

Dates and versions

hal-03723551 , version 1 (15-07-2022)
hal-03723551 , version 2 (19-10-2022)
hal-03723551 , version 3 (27-06-2023)

Identifiers

HAL Id: hal-03723551

Cite

Léo Grinsztajn, Edouard Oyallon, Gaël Varoquaux. Why do tree-based models still outperform deep learning on typical tabular data?. NeurIPS 2022 Datasets and Benchmarks Track, Nov 2022, New Orleans, United States. ⟨hal-03723551v2⟩