Relaxed-inertial proximal point type algorithms for problems involving strongly quasiconvex functions
Abstract
Introduced in the 1970s by Martinet for minimizing convex functions and extended shortly afterwards by Rockafellar to monotone inclusion problems, the proximal point algorithm has turned out to be a viable computational method for solving various classes of optimization problems, even beyond the convex framework.
In this talk we propose a relaxed-inertial proximal point type algorithm for solving optimization problems that consist of minimizing strongly quasiconvex functions whose variables lie in finite-dimensional linear subspaces. The method is then extended to equilibrium problems involving strongly quasiconvex functions. Computational results confirm the theoretical advances.
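For orientation, a generic relaxed-inertial proximal point step follows the template below (a sketch of the standard scheme, not necessarily the exact method analyzed in the talk), with objective $f$, step size $\lambda_k > 0$, inertial parameter $\alpha_k \ge 0$, and relaxation parameter $\rho_k \in (0,2)$:
\[
y_k = x_k + \alpha_k (x_k - x_{k-1}), \qquad
x_{k+1} = (1 - \rho_k)\, y_k + \rho_k \operatorname{prox}_{\lambda_k f}(y_k),
\]
where $\operatorname{prox}_{\lambda f}(y) = \operatorname*{argmin}_{x} \big\{ f(x) + \tfrac{1}{2\lambda}\, \|x - y\|^2 \big\}$, the inertial term extrapolates along the previous direction of movement, and the relaxation averages the proximal step with the extrapolated point.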