PIGMMES: Partial Incremental Gaussian Mixture Model with Efficient Storage
Abstract
The central focus of this work is the Gaussian Mixture Model (GMM), a machine learning model widely used for density estimation and cluster analysis. The Expectation-Maximization (EM) algorithm is commonly used to train GMMs. One of the main challenges facing this algorithm, given the shift towards embedded systems, is severe memory constraints. Indeed, EM is an iterative algorithm that requires several scans of the data, and thus several I/O operations. Hence, when the dataset cannot be fully stored in main memory, execution is slowed down by the heavy data movement (I/O) between main memory and secondary storage. In this work, we present an I/O optimization of the EM algorithm for GMMs that relies on two main contributions: (1) a divide-and-conquer strategy that splits the dataset into chunks, learns a GMM separately on each chunk, and combines the results incrementally; (2) a strategy that restricts GMM training to a subset of the data while achieving sufficiently good accuracy, by exploiting the information learned from the first chunk.
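To make the divide-and-conquer idea concrete, the sketch below fits a GMM per chunk with scikit-learn and pools the per-chunk components into a single mixture by rescaling each chunk's mixture weights by that chunk's share of the data. This is only a minimal illustration of chunked GMM training under that simple pooling assumption; the function name `fit_chunked_gmm` and the pooling rule are hypothetical and are not the paper's actual combination procedure.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def fit_chunked_gmm(chunks, n_components=2, seed=0):
    """Fit a GMM on each chunk, then pool all components into one mixture.

    Each chunk's mixture weights are rescaled by the chunk's share of the
    total data, so the pooled weights still sum to 1. (Illustrative pooling
    rule, not the paper's incremental combination method.)
    """
    sizes = np.array([len(c) for c in chunks], dtype=float)
    shares = sizes / sizes.sum()
    weights, means, covs = [], [], []
    for chunk, share in zip(chunks, shares):
        gmm = GaussianMixture(n_components=n_components, random_state=seed).fit(chunk)
        weights.append(share * gmm.weights_)  # rescale per-chunk weights
        means.append(gmm.means_)
        covs.append(gmm.covariances_)
    return np.concatenate(weights), np.vstack(means), np.concatenate(covs)

# Usage: split a dataset into chunks small enough to fit in main memory.
rng = np.random.default_rng(0)
data = np.vstack([rng.normal(0, 1, (500, 2)), rng.normal(5, 1, (500, 2))])
chunks = np.array_split(data, 4)
w, mu, sigma = fit_chunked_gmm(chunks)
print(w.sum())  # pooled mixture weights sum to 1.0
```

Pooling by concatenating components inflates the component count with the number of chunks; an incremental scheme like the one described above would instead merge or prune redundant components as chunks are processed.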