A Penalized Subspace Strategy for Solving Large-Scale Constrained Optimization Problems
Abstract
Many data science problems can be efficiently addressed by minimizing a cost function subject to various constraints. In this paper, a new method for solving large-scale constrained differentiable optimization problems is proposed. To account efficiently for a wide range of constraints, our approach embeds a subspace algorithm into an exterior penalty framework. The subspace strategy, combined with a Majorization-Minimization stepsize search, takes full advantage of the smoothness of the penalized cost function. Assuming that the latter is convex, the convergence of our algorithm to a solution of the constrained optimization problem is proved. Numerical experiments carried out on a large-scale image restoration application show that the proposed method outperforms state-of-the-art algorithms in terms of computational time.
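As a rough illustration of the idea (a generic sketch, not necessarily the exact formulation used in the paper; the symbols $\eta_k$, $D_j$, $u_j$ and the squared-distance penalty are illustrative choices), an exterior penalty scheme replaces the constrained problem by a sequence of smooth penalized problems,
\[
\min_{x\in\mathbb{R}^N} f(x)\ \ \text{s.t.}\ x\in C
\qquad\longrightarrow\qquad
\min_{x\in\mathbb{R}^N} F_{\eta_k}(x) = f(x) + \eta_k\,\mathrm{dist}_C^2(x), \qquad \eta_k \uparrow +\infty,
\]
each of which can be handled by subspace iterations of the form
\[
x_{j+1} = x_j + D_j u_j,
\]
where the columns of $D_j$ span a low-dimensional search subspace (for instance the current gradient $-\nabla F_{\eta_k}(x_j)$ and the previous direction $x_j - x_{j-1}$), and the multidimensional stepsize $u_j$ is computed through a majorization-minimization search on $F_{\eta_k}$.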