A two-step proximal-point algorithm for the calculus of divergence-based estimators in finite mixture models
Abstract
Estimators derived from the expectation-maximization (EM) algorithm are not robust, since they are based on the maximization of the likelihood function. We propose an iterative proximal-point algorithm, based on the EM algorithm, which aims to minimize a divergence criterion between a mixture model and the unknown distribution generating the data. At each iteration, the algorithm estimates the proportions and the parameters of the mixture components in two separate steps. The resulting estimators are generally robust against outliers and misspecification. Convergence properties of our algorithm are studied. The convergence of the introduced algorithm is discussed on a two-component Weibull mixture, for which a condition on the initialization of the EM algorithm is required in order for the latter to converge. Simulations on Gaussian and Weibull mixture models, using different statistical divergences, confirm the validity of our work and the robustness of the resulting estimators against outliers in comparison to the EM algorithm.
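The following Python sketch illustrates the two-step structure described in the abstract on a two-component Gaussian mixture. All concrete choices here are assumptions made for illustration, not the paper's exact construction: the divergence criterion is a squared-Hellinger estimate against a kernel density estimate of the data, the proximal term is the Kullback-Leibler divergence between posterior class probabilities at successive iterates (the classical proximal reading of EM), and the proximal weight beta, the optimizers, and the stopping rule are all hypothetical.

```python
# Minimal sketch of a two-step proximal-point iteration for a two-component
# Gaussian mixture.  Hypothetical choices: Hellinger criterion vs. a KDE of
# the data, KL proximal term on posterior class probabilities, fixed beta.
import numpy as np
from scipy import stats, optimize

def mixture_pdf(x, w, mu, sigma):
    """Density of a two-component Gaussian mixture."""
    return (w * stats.norm.pdf(x, mu[0], sigma[0])
            + (1 - w) * stats.norm.pdf(x, mu[1], sigma[1]))

def posteriors(x, w, mu, sigma):
    """Posterior probability of component 1 given each observation."""
    p1 = w * stats.norm.pdf(x, mu[0], sigma[0])
    return p1 / (p1 + (1 - w) * stats.norm.pdf(x, mu[1], sigma[1]))

def hellinger(x, w, mu, sigma, kde):
    """Empirical squared-Hellinger criterion against a KDE of the data."""
    f = mixture_pdf(x, w, mu, sigma)
    g = kde(x)
    # Importance weighting by the KDE estimates the integral from the sample.
    return np.mean((np.sqrt(f) - np.sqrt(g)) ** 2 / g)

def prox(x, w, mu, sigma, post_old, eps=1e-12):
    """KL proximal penalty between old and new posterior class probabilities."""
    t_old = np.clip(post_old, eps, 1 - eps)
    t_new = np.clip(posteriors(x, w, mu, sigma), eps, 1 - eps)
    return np.mean(t_old * np.log(t_old / t_new)
                   + (1 - t_old) * np.log((1 - t_old) / (1 - t_new)))

def two_step_proximal(x, w, mu, sigma, beta=1.0, n_iter=50):
    kde = stats.gaussian_kde(x)
    for _ in range(n_iter):
        post_old = posteriors(x, w, mu, sigma)
        # Step 1: update the proportion, component parameters held fixed.
        res = optimize.minimize_scalar(
            lambda w_: hellinger(x, w_, mu, sigma, kde)
                       + beta * prox(x, w_, mu, sigma, post_old),
            bounds=(1e-3, 1 - 1e-3), method="bounded")
        w = res.x
        # Step 2: update means and scales, proportion held fixed.
        def objective(theta):
            m, s = theta[:2], np.abs(theta[2:]) + 1e-3
            return (hellinger(x, w, m, s, kde)
                    + beta * prox(x, w, m, s, post_old))
        theta = optimize.minimize(objective, np.concatenate([mu, sigma]),
                                  method="Nelder-Mead").x
        mu, sigma = theta[:2], np.abs(theta[2:]) + 1e-3
    return w, mu, sigma

rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(0, 1, 300), rng.normal(4, 1, 200)])
print(two_step_proximal(data, 0.5, np.array([-1.0, 5.0]), np.array([1.0, 1.0])))
```

Because the proximal penalty vanishes when the new posteriors equal the old ones, each iteration can only decrease the divergence criterion, which is the key property behind the convergence analysis mentioned in the abstract.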
Keywords
EM algorithm
mixture model
proximal-point algorithm
robustness
statistical divergence
MSC 2010: Primary 62F35; Secondary 62F10, 65K10, 65C60
Domains
Statistics [math.ST]