An asymptotically optimal gradient algorithm for quadratic optimization with low computational cost
Abstract
We consider gradient algorithms for minimizing a quadratic function in R^n with large n. We propose a particular sequence of step-lengths and demonstrate that the resulting gradient algorithm has a convergence rate comparable with that of the Conjugate Gradient method and other methods based on the use of Krylov spaces. When the problem is large and sparse, the proposed algorithm can be more efficient than the Conjugate Gradient algorithm in terms of computational cost, since k iterations of the proposed algorithm require the computation of only O(log k) inner products.
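To illustrate the class of methods discussed, here is a minimal sketch of a gradient iteration on a quadratic f(x) = (1/2) x^T A x - b^T x with a prescribed step-length schedule. The abstract does not give the paper's actual sequence, so the Chebyshev schedule below (a classical choice for Richardson iteration, requiring bounds m and M on the spectrum of A) is only a stand-in; the function names and parameters are illustrative, not the authors' implementation.

```python
import numpy as np

def chebyshev_steps(m, M, K):
    """Classical Chebyshev step-length schedule for K Richardson iterations.

    Requires 0 < m <= lambda_min(A) and M >= lambda_max(A). Shown as a
    stand-in; the paper proposes its own step-length sequence.
    """
    j = np.arange(1, K + 1)
    # Roots of the degree-K Chebyshev polynomial mapped to [m, M]
    roots = 0.5 * (M + m) + 0.5 * (M - m) * np.cos((2 * j - 1) * np.pi / (2 * K))
    return 1.0 / roots

def gradient_quadratic(A, b, x0, steps):
    """Minimize f(x) = 0.5 x^T A x - b^T x with prescribed step-lengths.

    Because the step-lengths are fixed in advance, each iteration costs one
    matrix-vector product and no inner products, which is the source of the
    cost advantage claimed in the abstract.
    """
    x = x0.copy()
    for gamma in steps:
        x = x - gamma * (A @ x - b)  # x_{k+1} = x_k - gamma_k * grad f(x_k)
    return x

# Example on a random symmetric positive definite problem
rng = np.random.default_rng(0)
Q = np.linalg.qr(rng.standard_normal((50, 50)))[0]
A = Q @ np.diag(np.linspace(1.0, 100.0, 50)) @ Q.T
b = rng.standard_normal(50)
x = gradient_quadratic(A, b, np.zeros(50), chebyshev_steps(1.0, 100.0, 64))
```

One caveat worth noting: applying Chebyshev steps in their natural order is numerically unstable for large K, so stable reorderings of the step sequence are used in practice.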