Journal article in Machine Learning, 2015

Unsupervised Feature Selection with Ensemble Learning

Haytham Elghazel
Alex Aussem

Abstract

In this paper, we show that the way internal estimates are used to measure variable importance in Random Forests is also applicable to feature selection in unsupervised learning. We propose a new method called Random Cluster Ensemble (RCE for short) that estimates the out-of-bag feature importance from an ensemble of partitions. Each partition is constructed using a different bootstrap sample and a random subset of the features. We provide empirical results on nineteen benchmark data sets indicating that RCE, boosted with a recursive feature elimination scheme (RFE), can lead to significant improvements in clustering accuracy over several state-of-the-art supervised and unsupervised algorithms, with a very limited subset of features. The method shows promise for dealing with very large domains. All results, datasets and algorithms are available online.
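The abstract describes the procedure only at a high level. The sketch below illustrates one plausible reading of it, using k-means as the base clusterer and a permutation-based out-of-bag agreement score as the importance measure; the function name rce_importance, the choice of k-means, and all parameter values are illustrative assumptions, not the paper's implementation.

```python
# Minimal sketch of an RCE-style out-of-bag feature importance estimate.
# Assumptions (not from the paper): k-means base clusterer, importance
# measured as the drop in OOB assignment agreement after permuting a feature.
import numpy as np
from sklearn.cluster import KMeans

def rce_importance(X, n_clusters=3, n_estimators=50, subset_size=None, seed=0):
    rng = np.random.default_rng(seed)
    n, d = X.shape
    subset_size = subset_size or max(1, int(np.sqrt(d)))
    importance = np.zeros(d)
    counts = np.zeros(d)

    for _ in range(n_estimators):
        # Each ensemble member sees a bootstrap sample and a random feature subset.
        boot = rng.integers(0, n, size=n)
        oob = np.setdiff1d(np.arange(n), boot)
        feats = rng.choice(d, size=subset_size, replace=False)
        if len(oob) == 0:
            continue

        km = KMeans(n_clusters=n_clusters, n_init=5,
                    random_state=int(rng.integers(1 << 31)))
        km.fit(X[np.ix_(boot, feats)])

        # Baseline cluster assignments on the out-of-bag points.
        base = km.predict(X[np.ix_(oob, feats)])

        # Permute each selected feature on the OOB sample; the loss of
        # agreement with the baseline assignment is credited as importance.
        for j_local, j in enumerate(feats):
            X_perm = X[np.ix_(oob, feats)].copy()
            X_perm[:, j_local] = rng.permutation(X_perm[:, j_local])
            perm = km.predict(X_perm)
            importance[j] += 1.0 - np.mean(perm == base)
            counts[j] += 1

    # Average over the ensemble members in which each feature appeared.
    return importance / np.maximum(counts, 1)
```

A recursive feature elimination wrapper, as mentioned in the abstract, would then repeatedly drop the lowest-ranked features and recompute the scores on the reduced feature set.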
No file deposited

Dates and versions

hal-01339161, version 1 (29-06-2016)

Identifiers

Cite

Haytham Elghazel, Alex Aussem. Unsupervised Feature Selection with Ensemble Learning. Machine Learning, 2015, 98 (1-2), pp.157-180. ⟨10.1007/s10994-013-5337-8⟩. ⟨hal-01339161⟩
208 views
0 downloads
