Incremental Without Replacement Sampling in Nonconvex Optimization
Abstract
Minibatch decomposition methods for empirical risk minimization are commonly analysed in a stochastic approximation setting, also known as sampling with replacement. On the other hand, modern implementations of such techniques are incremental: they rely on sampling without replacement. We reduce this gap between theory and common usage by analysing a versatile incremental gradient scheme. We consider constant, decreasing or adaptive step sizes. In the smooth setting we obtain explicit rates, and in the nonsmooth setting we prove that the sequence is attracted by solutions of the optimality conditions of the problem.
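As an illustration of the without-replacement sampling pattern the abstract refers to, the following minimal Python sketch performs incremental gradient descent on a finite sum: each epoch draws a random permutation of the indices and takes one gradient step per summand. The least-squares objective, the decreasing step-size schedule, and all function names are hypothetical choices for the example, not the exact scheme analysed in the paper.

```python
# Minimal sketch: incremental gradient descent with without-replacement
# (reshuffled) sampling on f(w) = (1/n) * sum_i f_i(w).
import numpy as np

def incremental_gd(grad_i, w0, n, epochs=50, step=lambda k: 0.1 / (1 + k)):
    """grad_i(w, i) returns the gradient of the i-th summand at w."""
    w = np.asarray(w0, dtype=float)
    for epoch in range(epochs):
        perm = np.random.permutation(n)   # indices sampled without replacement
        for i in perm:                    # one full pass over the data per epoch
            w = w - step(epoch) * grad_i(w, i)
    return w

# Usage example on a synthetic least-squares problem (assumed setup).
rng = np.random.default_rng(0)
n, d = 100, 5
A, b = rng.normal(size=(n, d)), rng.normal(size=n)
grad_i = lambda w, i: (A[i] @ w - b[i]) * A[i]   # gradient of 0.5*(a_i^T w - b_i)^2
w_hat = incremental_gd(grad_i, np.zeros(d), n)
```

Replacing `np.random.permutation(n)` with `rng.integers(0, n, size=n)` would recover the with-replacement (stochastic approximation) setting that the abstract contrasts with.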