Regularization in Relevance Learning Vector Quantization Using l1-Norms - Archive ouverte HAL
Conference Paper, Year: 2013

Regularization in Relevance Learning Vector Quantization Using l1-Norms

Abstract

In this contribution we propose a method for $l_1$ regularization in prototype-based relevance learning vector quantization (LVQ) to obtain sparse relevance profiles. In hyperspectral data analysis, sparse relevance profiles fade out those spectral bands that are not necessary for classification. In particular, we consider sparsity in the relevance profile enforced by LASSO optimization. The latter is obtained by a gradient learning scheme using a differentiable parametrized approximation of the $l_{1}$-norm, which has an upper error bound. We also extend this regularization idea to the matrix learning variant of LVQ as the natural generalization of relevance learning.
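For illustration only, the following minimal sketch (not the authors' implementation) shows how a smooth, parametrized surrogate of the $l_1$-norm can regularize the relevance profile in a GRLVQ-style gradient update. The surrogate sqrt(r^2 + eps), the penalty weight alpha, and the learning rates are assumptions made for this sketch; the paper derives its own approximation with an explicit upper error bound.

    import numpy as np

    def smooth_l1(r, eps=1e-6):
        # Differentiable surrogate of ||r||_1 (assumption: sqrt-based smoothing).
        return np.sum(np.sqrt(r ** 2 + eps))

    def smooth_l1_grad(r, eps=1e-6):
        # Gradient of the surrogate with respect to the relevance profile r.
        return r / np.sqrt(r ** 2 + eps)

    def weighted_sq_dist(x, w, r):
        # Relevance-weighted squared distance d_r(x, w) = sum_j r_j (x_j - w_j)^2.
        return np.sum(r * (x - w) ** 2)

    def grlvq_step(x, y, prototypes, labels, r, lr_w=0.05, lr_r=0.01, alpha=0.1):
        # One simplified GRLVQ update with an approximate l1 penalty on r.
        d = np.array([weighted_sq_dist(x, w, r) for w in prototypes])
        correct = np.where(labels == y)[0]
        wrong = np.where(labels != y)[0]
        jp = correct[np.argmin(d[correct])]   # closest prototype with the correct label
        jm = wrong[np.argmin(d[wrong])]       # closest prototype with a wrong label
        dp, dm = d[jp], d[jm]
        # Attract the correct prototype, repel the wrong one (simplified updates).
        prototypes[jp] += lr_w * (x - prototypes[jp])
        prototypes[jm] -= lr_w * (x - prototypes[jm])
        # Gradient of the GLVQ cost mu = (dp - dm) / (dp + dm) with respect to r,
        # plus the smooth l1 penalty that drives irrelevant bands toward zero.
        g_cost = 2.0 * ((x - prototypes[jp]) ** 2 * dm
                        - (x - prototypes[jm]) ** 2 * dp) / (dp + dm) ** 2
        r = r - lr_r * (g_cost + alpha * smooth_l1_grad(r))
        r = np.clip(r, 0.0, None)
        r /= r.sum()                          # keep the relevance profile normalized
        return prototypes, r

Iterated over many training samples, the penalty pushes the relevances of uninformative spectral bands toward zero, yielding the sparse profile described above; the matrix learning variant proceeds analogously, with the relevance vector replaced by a full relevance matrix.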
Main file: L1_regularization_3_ESANN.pdf (1.42 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-00874854, version 1 (18-10-2013)

Cite

Martin Riedel, Marika Kästner, Fabrice Rossi, Thomas Villmann. Regularization in Relevance Learning Vector Quantization Using l1-Norms. 21st European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning (ESANN 2013), Apr 2013, Bruges, Belgium. pp.17-22. ⟨hal-00874854⟩