Proximal approaches for matrix optimization problems: Application to robust precision matrix estimation
Abstract
In recent years, there has been a growing interest in mathematical models leading to the minimization, over a space of symmetric matrices, of a Bregman divergence coupled with a regularization term. We address problems of this type within a general framework where the regularization term is split into two parts, one being a spectral function and the other arbitrary. A Douglas-Rachford approach is proposed to solve such problems, and a list of proximity operators is provided, allowing various choices for the fit-to-data functional and for the regularization term. Based on these theoretical results, two novel approaches are proposed for the noisy graphical lasso problem, in which a covariance or precision matrix has to be statistically estimated in the presence of noise. The Douglas-Rachford approach applies directly to the estimation of the covariance matrix. When the precision matrix is sought, the resulting optimization problem is non-convex. More precisely, we propose a majorization-minimization approach that builds a sequence of convex surrogates and solves the inner optimization subproblems via the aforementioned Douglas-Rachford procedure. We establish conditions for the convergence of this iterative scheme. We illustrate the good numerical performance of the proposed methods, in comparison with state-of-the-art approaches, on synthetic and real-world datasets.
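To make the algorithmic structure described in the abstract concrete, the sketch below applies a standard Douglas-Rachford iteration to the classical (noise-free) graphical lasso, i.e. minimizing -log det(X) + <S, X> + lam*||X||_1 over symmetric positive definite X. This is only an illustration of the generic splitting scheme, not the paper's exact formulation (it omits the noisy-data model, the spectral/arbitrary regularizer split, and the majorization-minimization outer loop); the function names `prox_neg_logdet`, `prox_l1`, `douglas_rachford_glasso` and all parameter values are assumptions made for this example.

```python
import numpy as np

def prox_neg_logdet(Y, S, gamma):
    """Prox of gamma * (-log det(X) + <S, X>) at Y, computed via eigendecomposition."""
    d, U = np.linalg.eigh(Y - gamma * S)
    # Positive root of x_i - gamma / x_i = d_i, so the result is positive definite.
    x = (d + np.sqrt(d ** 2 + 4.0 * gamma)) / 2.0
    return (U * x) @ U.T

def prox_l1(Y, tau):
    """Prox of tau * ||X||_1 (entrywise soft-thresholding)."""
    return np.sign(Y) * np.maximum(np.abs(Y) - tau, 0.0)

def douglas_rachford_glasso(S, lam, gamma=1.0, relax=1.0, n_iter=500):
    """Illustrative Douglas-Rachford iteration for
       min_X  -log det(X) + <S, X> + lam * ||X||_1  over symmetric X > 0."""
    Y = np.eye(S.shape[0])
    for _ in range(n_iter):
        X = prox_neg_logdet(Y, S, gamma)        # prox of the log-likelihood term
        Z = prox_l1(2.0 * X - Y, gamma * lam)   # prox of the l1 regularizer at the reflected point
        Y = Y + relax * (Z - X)                 # update of the governing sequence
    return X
```

In this generic scheme the two proximity operators are evaluated alternately on a governing sequence, and the iterate returned by the first prox converges to a solution of the composite problem; the paper's contribution includes supplying such proximity operators for a broader family of Bregman fit-to-data terms and spectral regularizers.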