Preprint, Working paper. Year: 2024

Optimization with First Order Algorithms

Abstract

These notes focus on the minimization of convex functionals using first-order optimization methods, which are fundamental in many areas of applied mathematics and engineering. The primary goal of this document is to introduce and analyze the most classical first-order optimization algorithms. We aim to provide readers with both a practical and theoretical understanding of how and why these algorithms converge to minimizers of convex functions. The main algorithms covered in these notes include gradient descent, Forward-Backward splitting, Douglas-Rachford splitting, the Alternating Direction Method of Multipliers (ADMM), and Primal-Dual algorithms. All these algorithms fall into the class of first-order methods, as they only involve gradients and subdifferentials, which are the first-order derivatives of the functions to be optimized. For each method, we provide convergence theorems, with precise assumptions and conditions under which the convergence holds, accompanied by complete proofs. Beyond convex optimization, the final part of this manuscript extends the analysis to nonconvex problems, where we discuss the convergence behavior of these same first-order methods under broader assumptions. To contextualize the theory, we also include a selection of practical examples illustrating how these algorithms are applied to different image processing problems.
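
As a point of reference for the simplest method in this class, the following is a minimal gradient descent sketch in Python. It is a hypothetical illustration, not code from the notes: it minimizes the smooth convex quadratic f(x) = (1/2)‖Ax − b‖² with the standard fixed step size 1/L, where L is the Lipschitz constant of the gradient.

import numpy as np

def gradient_descent(A, b, n_iter=500):
    """Gradient descent on f(x) = 0.5 * ||A x - b||^2 with step size 1/L."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of grad f (spectral norm squared)
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)           # only first-order information is used
        x = x - grad / L                   # x_{k+1} = x_k - (1/L) grad f(x_k)
    return x

# Example usage on a small random instance (illustrative only):
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)
x_min = gradient_descent(A, b)

The other algorithms mentioned in the abstract (Forward-Backward, Douglas-Rachford, ADMM, Primal-Dual) follow the same first-order pattern but replace or combine the gradient step with proximal steps on the nonsmooth parts of the objective.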
Main file

Optimization.pdf (629.14 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-04755998, version 1 (28-10-2024)

Identifiers

Cite

Charles Dossal, Samuel Hurault, Nicolas Papadakis. Optimization with First Order Algorithms. 2024. ⟨hal-04755998⟩