Analyzing sparse dictionaries for online learning with kernels
Abstract
Many signal processing and machine learning methods share essentially the same linear-in-the-parameters model, with as many parameters as there are available samples, as in kernel-based machines. Sparse approximation is therefore essential in many disciplines, and new challenges arise in online learning with kernels. To this end, several sparsity measures have been proposed in the literature to quantify the sparsity of dictionaries and to construct relevant ones, the most prominent being the distance, the approximation, the coherence and the Babel measures. In this paper, we analyze sparse dictionaries based on these measures. By conducting an eigenvalue analysis, we show that these sparsity measures share many properties, including the linear-independence condition and the inducement of a well-posed optimization problem. Furthermore, we prove that there exists a quasi-isometry between the parameter (i.e., dual) space and the dictionary's induced feature space.
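To make the dictionary-construction idea concrete, the sketch below illustrates a coherence-based sparsification rule of the kind discussed in the abstract, together with an eigenvalue check on the resulting Gram matrix that underlies the quasi-isometry statement. This is a minimal illustration, not the paper's exact algorithm: the Gaussian kernel, the threshold `mu0`, and all function names are assumptions introduced here for the example.

```python
import numpy as np

# Hypothetical Gaussian kernel; the analysis applies to any reproducing kernel.
def gaussian_kernel(x, y, sigma=1.0):
    return np.exp(-np.linalg.norm(x - y) ** 2 / (2 * sigma ** 2))

def coherence_sparsification(samples, mu0=0.5, kernel=gaussian_kernel):
    """Build a sparse dictionary online: a new sample is admitted only if its
    largest kernel value (coherence) with the current atoms stays below mu0."""
    dictionary = []
    for x in samples:
        if not dictionary:
            dictionary.append(x)
            continue
        coherence = max(abs(kernel(x, d)) for d in dictionary)
        if coherence <= mu0:
            dictionary.append(x)
    return dictionary

def gram_eigenvalue_bounds(dictionary, kernel=gaussian_kernel):
    """Extreme eigenvalues of the Gram matrix K: they bound the ratio between
    the parameter (dual) norm ||alpha|| and the induced feature-space norm
    sqrt(alpha^T K alpha), i.e. the quasi-isometry constants."""
    K = np.array([[kernel(xi, xj) for xj in dictionary] for xi in dictionary])
    eigvals = np.linalg.eigvalsh(K)  # ascending order
    return eigvals[0], eigvals[-1]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    stream = rng.normal(size=(200, 2))            # toy data stream
    atoms = coherence_sparsification(stream, mu0=0.3)
    lam_min, lam_max = gram_eigenvalue_bounds(atoms)
    print(f"{len(atoms)} atoms kept, lambda_min={lam_min:.3f}, lambda_max={lam_max:.3f}")
```

In this reading, a smaller coherence threshold yields fewer, more mutually distinct atoms, which keeps the Gram matrix well conditioned and the associated least-squares problem well posed.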
Keywords
sparsity
adaptive
filtering
approximation theory
eigenvalues and eigenfunctions
learning (artificial intelligence)
signal processing
sparse dictionary
online learning
kernel-based learning
machine learning method
linear-in-the-parameter model
sparse approximation
Babel measure
eigenvalue analysis
linear independence condition
quasi-isometry
Kernel
Dictionaries
Optimization
Signal processing algorithms
Least squares approximations
Atomic measurements
Adaptive filtering
Gram matrix
kernel-based methods
machine learning
pattern recognition
Origin: Files produced by the author(s)