Research Report, 2008

ESFS: A new embedded feature selection method based on SFS

Abstract

Feature subset selection is an important subject when training classifiers in Machine Learning (ML) problems. Too many input features in an ML problem may lead to the so-called "curse of dimensionality", which describes the fact that the complexity of adjusting the classifier parameters during training increases exponentially with the number of features. ML algorithms are therefore known to suffer from an important decrease in prediction accuracy when faced with many unnecessary features. In this paper, we introduce a novel embedded feature selection method, called ESFS, which is inspired by the wrapper method SFS (Sequential Forward Selection) since it relies on the simple principle of incrementally adding the most relevant features. Its originality lies in the use of mass functions from evidence theory, which allow the information carried by the features to be merged elegantly, in an embedded way, leading to a lower computational cost than the original SFS. This approach has been successfully applied to the emerging domain of emotion classification in audio signals.
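The sketch below illustrates only the SFS principle the abstract refers to (start from an empty subset and greedily add, at each step, the feature that most improves a relevance score); it does not reproduce the authors' ESFS itself, whose originality is to score candidate subsets by merging per-feature mass functions from evidence theory rather than by retraining a wrapper classifier. The function name `sfs` and the use of a cross-validated logistic-regression score as the relevance measure are assumptions made here to keep the example self-contained and runnable.

```python
# Minimal sketch of Sequential Forward Selection (SFS), the wrapper scheme
# ESFS is inspired by. NOTE: the cross-validated classifier score below is a
# stand-in; the actual ESFS replaces it with an evidence-theory (mass function)
# based evaluation, which is what lowers its computational cost.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score


def sfs(X, y, n_features, estimator):
    """Greedy forward selection: return the indices of the selected features."""
    remaining = list(range(X.shape[1]))
    selected = []
    while remaining and len(selected) < n_features:
        # Score every candidate subset obtained by adding one more feature.
        scores = [
            cross_val_score(estimator, X[:, selected + [f]], y, cv=5).mean()
            for f in remaining
        ]
        best = remaining[int(np.argmax(scores))]
        selected.append(best)
        remaining.remove(best)
    return selected


if __name__ == "__main__":
    X, y = load_iris(return_X_y=True)
    picked = sfs(X, y, n_features=2, estimator=LogisticRegression(max_iter=1000))
    print("Selected feature indices:", picked)
```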
Main file: 10.1.1.870.2176.pdf (209.3 KB). Origin: files produced by the author(s).

Dates and versions

hal-01984705, version 1 (17-01-2019)

Identifiers

  • HAL Id: hal-01984705, version 1

Cite

Zhongzhe Xiao, Emmanuel Dellandrea, Weibei Dou, Liming Chen. ESFS: A new embedded feature selection method based on SFS. [Research Report] Ecole Centrale Lyon; Université de Lyon; LIRIS UMR 5205 CNRS/INSA de Lyon/Université Claude Bernard Lyon 1/Université Lumière Lyon 2/École Centrale de Lyon; Tsinghua University, Beijing, China. 2008. ⟨hal-01984705⟩
