Conference paper. Year: 2021

Deep transform and metric learning networks

Abstract

Owing to their successes in inference and denoising tasks, Dictionary Learning (DL) and its related sparse optimization formulations have garnered considerable research interest. While most solutions have focused on single-layer dictionaries, the recently proposed Deep DL methods still fall short on a number of issues. We hence propose a novel Deep DL approach in which each DL layer can be formulated and solved as the combination of one linear layer and a Recurrent Neural Network (RNN), where the RNN is flexibly regarded as a layer-associated learned metric. Our work unveils new insights into the connections between Neural Networks and Deep DL, and provides a novel, efficient and competitive approach to jointly learning the deep transforms and metrics. Extensive experiments demonstrate that the proposed method not only outperforms existing Deep DL methods, but also state-of-the-art generic Convolutional Neural Networks.
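To make the layer structure described above concrete, here is a minimal sketch, not the authors' implementation, of how one such layer might look: a linear transform followed by a small unrolled recurrent block that plays the role of a layer-associated learned metric. All names and parameters (TransformMetricLayer, n_steps, the ReLU nonlinearity, the layer sizes) are illustrative assumptions and do not come from the paper.

```python
# Hypothetical sketch of a "deep transform and metric" layer, assuming a
# linear transform followed by an unrolled RNN acting as a learned metric.
import torch
import torch.nn as nn

class TransformMetricLayer(nn.Module):
    def __init__(self, in_dim, code_dim, n_steps=3):
        super().__init__()
        self.transform = nn.Linear(in_dim, code_dim)  # linear (transform) layer
        self.metric = nn.Linear(code_dim, code_dim)   # recurrent weight viewed as a learned metric
        self.n_steps = n_steps                        # number of unrolled recurrent iterations

    def forward(self, x):
        z = self.transform(x)              # apply the linear transform once
        h = torch.relu(z)
        for _ in range(self.n_steps):      # unrolled RNN: shared weights across iterations
            h = torch.relu(z + self.metric(h))
        return h

# Stacking several such layers yields a deep transform/metric network
# (layer sizes and the classification head are arbitrary here).
model = nn.Sequential(
    TransformMetricLayer(784, 256),
    TransformMetricLayer(256, 128),
    nn.Linear(128, 10),
)
```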
Main file

ICASSP2021_wen_final.pdf (276.95 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-03281571, version 1 (08-07-2021)

Identifiers

  • HAL Id: hal-03281571, version 1

Cite

Wen Tang, Emilie Chouzenoux, Jean-Christophe Pesquet, Hamid Krim. Deep transform and metric learning networks. ICASSP 2021 - IEEE International Conference on Acoustics, Speech and Signal Processing, Jun 2021, Toronto Virtual, Canada. ⟨hal-03281571⟩
58 views
87 downloads
