Journal article, Machine Learning, 2015

Unsupervised Feature Selection with Ensemble Learning

Haytham Elghazel
Alex Aussem

Abstract

In this paper, we show that the way internal estimates are used to measure variable importance in Random Forests is also applicable to feature selection in unsupervised learning. We propose a new method, called Random Cluster Ensemble (RCE for short), that estimates the out-of-bag feature importance from an ensemble of partitions. Each partition is constructed using a different bootstrap sample and a random subset of the features. We provide empirical results on nineteen benchmark data sets indicating that RCE, boosted with a recursive feature elimination scheme (RFE), can lead to significant improvements in clustering accuracy over several state-of-the-art supervised and unsupervised algorithms, while using a very limited subset of features. The method shows promise for dealing with very large domains. All results, datasets and algorithms are available online.
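The abstract describes estimating out-of-bag feature importance from an ensemble of clusterings, each built on a bootstrap sample and a random subset of features. The sketch below is only an illustration of that general idea, not the paper's exact procedure: it assumes k-means as the base clusterer and uses permutation-based assignment instability on the out-of-bag points as the importance signal. All names (rce_importance, n_partitions, subset_size) are hypothetical.

```python
# Illustrative sketch of out-of-bag permutation importance for an
# ensemble of bootstrap / random-subspace clusterings (assumed design,
# not the published RCE algorithm).
import numpy as np
from sklearn.cluster import KMeans

def rce_importance(X, n_clusters=3, n_partitions=50, subset_size=None, seed=0):
    """Return one importance score per feature of X (n_samples x n_features)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    subset_size = subset_size or max(1, int(np.sqrt(d)))
    importance = np.zeros(d)
    counts = np.zeros(d)

    for _ in range(n_partitions):
        feats = rng.choice(d, size=subset_size, replace=False)   # random subspace
        boot = rng.integers(0, n, size=n)                        # bootstrap sample
        oob = np.setdiff1d(np.arange(n), np.unique(boot))        # out-of-bag points
        if len(oob) == 0:
            continue
        km = KMeans(n_clusters=n_clusters, n_init=10,
                    random_state=int(rng.integers(1 << 31)))
        km.fit(X[boot][:, feats])
        base = km.predict(X[oob][:, feats])                      # reference OOB assignments
        for j_idx, j in enumerate(feats):
            X_perm = X[oob][:, feats].copy()
            X_perm[:, j_idx] = rng.permutation(X_perm[:, j_idx]) # break feature j
            changed = np.mean(km.predict(X_perm) != base)        # assignment instability
            importance[j] += changed
            counts[j] += 1

    return importance / np.maximum(counts, 1)
```

With such a per-feature score, a recursive feature elimination loop could repeatedly drop the lowest-scoring features and recompute the scores on the reduced feature set, in the spirit of the RFE scheme mentioned in the abstract.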

Dates and versions

hal-01339161, version 1 (29-06-2016)

Identifiers

HAL Id: hal-01339161
DOI: 10.1007/s10994-013-5337-8

Cite

Haytham Elghazel, Alex Aussem. Unsupervised Feature Selection with Ensemble Learning. Machine Learning, 2015, 98 (1-2), pp.157-180. ⟨10.1007/s10994-013-5337-8⟩. ⟨hal-01339161⟩