Tikhonov Regularization for Stochastic Non-Smooth Convex Optimization in Hilbert Spaces
Abstract
To solve non-smooth convex optimization problems with a noisy gradient input, we analyze the global behavior of
subgradient-like flows under stochastic errors. The objective function is composite: the sum of two
convex functions, one differentiable and the other potentially non-smooth. We then use stochastic differential
inclusions where the drift term is minus the subgradient of the objective function,
and the diffusion term is either bounded or square-integrable. In this context, under Lipschitz continuity of the differentiable
term and a growth condition on the non-smooth term, our first main result shows almost sure weak convergence
of the trajectory process towards a minimizer of the objective function. Then, using Tikhonov regularization with a properly tuned vanishing parameter, we can obtain almost sure strong convergence
of the trajectory towards the minimum norm solution. We find an explicit tuning of this parameter when our objective
function satisfies a local error-bound inequality. We also provide a comprehensive complexity analysis by establishing
several new pointwise and ergodic convergence rates in expectation for the convex and strongly convex cases.
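
As an illustrative sketch (the notation below is ours, not taken from the paper): write the composite objective as $F = f + g$, with $f$ convex differentiable and $g$ convex and possibly non-smooth. The dynamics described above then take the schematic form
\[
  dX(t) \in -\partial F\bigl(X(t)\bigr)\,dt + \sigma\bigl(t,X(t)\bigr)\,dW(t),
\]
where $W$ is a Brownian motion and the diffusion coefficient $\sigma$ is bounded or square-integrable, while the Tikhonov-regularized flow adds a vanishing term $\varepsilon(t)\,X(t)$, with $\varepsilon(t)\to 0$, to the drift:
\[
  dX(t) \in -\bigl[\partial F\bigl(X(t)\bigr) + \varepsilon(t)\,X(t)\bigr]\,dt + \sigma\bigl(t,X(t)\bigr)\,dW(t).
\]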
Domains
Mathematics [math]