Conference paper. Year: 2023

Linear TreeShap

Abstract

Decision trees are well known for their ease of interpretation. To improve accuracy, we need to grow deep trees or ensembles of trees. These are hard to interpret, offsetting their original benefits. Shapley values have recently become a popular way to explain the predictions of tree-based machine learning models. They provide a linear weighting of features that is independent of the tree structure. The rise in popularity is mainly due to TreeShap, which solves a general exponential-complexity problem in polynomial time. Following extensive adoption in industry, more efficient algorithms are required. This paper presents a more efficient and straightforward algorithm: Linear TreeShap. Like TreeShap, Linear TreeShap is exact and requires the same amount of memory.
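For context, the sketch below shows the kind of computation Linear TreeShap accelerates: exact per-feature Shapley values for a single decision tree. It uses the separate `shap` package's TreeExplainer (the original TreeShap implementation), not the Linear TreeShap algorithm presented in this paper, and the data and model are purely illustrative.

    # Illustrative only: exact Shapley values for a decision tree via the
    # `shap` package's TreeExplainer (original TreeShap), not the
    # Linear TreeShap algorithm described in this paper.
    import numpy as np
    import shap
    from sklearn.tree import DecisionTreeRegressor

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 4))                    # 200 samples, 4 features
    y = X[:, 0] + 2.0 * X[:, 1] * (X[:, 2] > 0)      # synthetic target

    tree = DecisionTreeRegressor(max_depth=4).fit(X, y)

    explainer = shap.TreeExplainer(tree)
    shap_values = explainer.shap_values(X)           # shape (200, 4): one weight per feature

    # Shapley values decompose each prediction additively around the expected value:
    # prediction(x) = expected_value + sum of that sample's per-feature Shapley values.
    base = np.atleast_1d(explainer.expected_value)[0]
    print(np.isclose(tree.predict(X[:1])[0], base + shap_values[0].sum()))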
Main file
2209.08192.pdf (314.87 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-04446067, version 1 (08-02-2024)

Identifiers

  • HAL Id: hal-04446067, version 1

Cite

Peng Yu, Chao Xu, Albert Bifet, Jesse Read. Linear TreeShap. Advances in Neural Information Processing Systems 35 (NeurIPS 2022), New Orleans, LA, USA, November 28 - December 9, 2022. ⟨hal-04446067⟩
