Feature selection in possibilistic modeling - Archive ouverte HAL
Journal article in Pattern Recognition, 2015

Feature selection in possibilistic modeling

Abstract

Feature selection is becoming increasingly important for reducing computational complexity. In this context, conventional approaches show inconsistent performance: they succeed in some contexts and fail in others. Possibilistic modeling is a powerful paradigm able to handle data imperfection or redundancy, and it is not affected by data variability. In this paper, we therefore propose a new feature selection strategy for possibilistic modeling. The proposed approach relies on two criteria to extract relevant features: a measure of feature importance and the uncertainty degree of the possibility distribution. The importance of a feature is considered under two aspects: the first relates to the scattering within one class, and the second reflects the feature's power for class discrimination. We therefore apply the Shapley index paradigm, which selects features that minimize the intra-class distance and maximize the inter-class distance. This process is then refined using the possibility distribution uncertainty degree in order to resolve conflicts between features' importance values.
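The abstract describes a Shapley-based importance measure that rewards inter-class separation and penalizes intra-class scattering. Below is a minimal illustrative sketch of that idea, assuming a simple between-class to within-class distance ratio as the characteristic function and an exact (exponential-time) Shapley value computation; the paper's actual Shapley index formulation and the possibility-distribution refinement step are not reproduced here, and all function names are hypothetical.

from itertools import combinations
from math import factorial
import numpy as np

def separability(X, y, subset):
    """Characteristic function v(S): ratio of between-class to within-class
    scatter computed on the feature subset S (illustrative choice only)."""
    if not subset:
        return 0.0
    Xs = X[:, list(subset)]
    overall_mean = Xs.mean(axis=0)
    between, within = 0.0, 0.0
    for c in np.unique(y):
        Xc = Xs[y == c]
        mu_c = Xc.mean(axis=0)
        between += len(Xc) * np.sum((mu_c - overall_mean) ** 2)
        within += np.sum((Xc - mu_c) ** 2)
    return between / within if within > 0 else 0.0

def shapley_importance(X, y):
    """Exact Shapley value of each feature with respect to the separability
    'game' above. Exponential in the number of features, so only practical
    for small feature sets."""
    d = X.shape[1]
    phi = np.zeros(d)
    for i in range(d):
        others = [j for j in range(d) if j != i]
        for r in range(len(others) + 1):
            for S in combinations(others, r):
                # Standard Shapley weight |S|! (d - |S| - 1)! / d!
                w = factorial(len(S)) * factorial(d - len(S) - 1) / factorial(d)
                phi[i] += w * (separability(X, y, S + (i,)) - separability(X, y, S))
    return phi

For d features this enumerates 2^(d-1) subsets per feature, so a sampling-based approximation of the Shapley values would be needed for larger feature sets.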
File not deposited

Dates and versions

hal-01474065, version 1 (22-02-2017)

Identifiers

Cite

Ammar Bouhamed, Imene Khanfir, Dorra Sellami Masmoudi, Basel Solaiman. Feature selection in possibilistic modeling. Pattern Recognition, 2015, 48 (11), pp. 3627-3640. ⟨10.1016/j.patcog.2015.03.015⟩. ⟨hal-01474065⟩