Toward Fast Transform Learning
Abstract
The dictionary learning problem aims to find a dictionary of atoms that best represents an image according to a given objective. The most common objective is to represent an image, or a class of images, sparsely. Most dictionary learning algorithms iteratively estimate the dictionary and a sparse representation of the images in this dictionary. Dictionary learning has led to many state-of-the-art algorithms in image processing. However, its numerical complexity restricts its use to atoms with small support, since computations with the learned dictionaries require too many resources to be deployed in large-scale applications. To alleviate these issues, this paper introduces a new strategy for learning dictionaries composed of atoms obtained as a composition of $K$ convolutions with $S$-sparse kernels. The dictionary update step associated with this strategy is a non-convex optimization problem. We reformulate the problem to reduce the number of its irrelevant stationary points and introduce a Gauss-Seidel-type algorithm, referred to as the Alternative Least Square Algorithm, for its resolution. The search space of the considered optimization problem has dimension $KS$, which is typically smaller than the size of the target atom and much smaller than the size of the image. The complexity of the algorithm is linear in the size of the image. Our experiments show that, when $K$ is large (say $K=10$), we can approximate many atoms, such as modified DCT atoms, curvelets, sinc functions, or cosines, with very high accuracy. We also argue empirically that, perhaps surprisingly, the algorithm generally converges to a global minimum for large values of $K$ and $S$.
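To make the construction concrete, here is a minimal NumPy sketch of the two ingredients described above: an atom built as the composition of $K$ convolutions with $S$-sparse kernels, and one Gauss-Seidel-type least-squares update of a single kernel. It is an illustration under assumptions, not the paper's implementation: the sizes `K`, `S`, and `L`, the random initial kernels, the toy cosine target, and the choice to keep the kernel supports fixed are all made up for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes for illustration only.
K, S = 10, 4            # number of composed kernels, nonzeros per kernel
L = 2 * S + 1           # assumed support length of each kernel

def sparse_kernel(length, nnz, rng):
    """Draw a kernel with `nnz` randomly placed nonzero coefficients."""
    h = np.zeros(length)
    idx = rng.choice(length, size=nnz, replace=False)
    h[idx] = rng.standard_normal(nnz)
    return h

kernels = [sparse_kernel(L, S, rng) for _ in range(K)]

def compose(kernels):
    """Atom obtained as the composition (successive convolution) of the kernels."""
    atom = np.array([1.0])
    for h in kernels:
        atom = np.convolve(atom, h)   # support grows by L - 1 at each step
    return atom

atom = compose(kernels)               # length 1 + K * (L - 1)

def als_update(kernels, k, support, target):
    """One Gauss-Seidel-type step: with every kernel but the k-th held fixed,
    the atom is linear in the S free coefficients of kernel k, so matching a
    target atom reduces to an S-dimensional least-squares problem."""
    others = compose([h for i, h in enumerate(kernels) if i != k])
    # Column j of A: the fixed part convolved with a unit impulse at support[j].
    A = np.stack([np.convolve(others, np.eye(L)[j]) for j in support], axis=1)
    coeffs, *_ = np.linalg.lstsq(A, target, rcond=None)
    h_new = np.zeros(L)
    h_new[support] = coeffs
    return h_new

# Toy usage: refit kernel 0 on its current support to better match a cosine atom.
target = np.cos(np.linspace(0.0, 4.0 * np.pi, atom.size))
kernels[0] = als_update(kernels, 0, np.flatnonzero(kernels[0]), target)
```

Applying such an operator to an image amounts to $K$ convolutions with $S$-sparse kernels, i.e. on the order of $KS$ operations per pixel, which is consistent with the linear complexity in the image size stated above.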