Journal article, IEEE Transactions on Image Processing, 2023

Joint Learning of Fully Connected Network Models in Lifting Based Image Coders

Abstract

The optimization of prediction and update operators plays a prominent role in lifting-based image coding schemes. In this paper, we focus on learning the prediction and update models involved in a recent Fully Connected Neural Network (FCNN)-based lifting structure. While a straightforward approach consists in separately learning the different FCNN models by optimizing appropriate loss functions, jointly learning those models is a more challenging problem. To address this problem, we first consider a statistical model-based entropy loss function that yields a good approximation of the coding rate. Then, we develop a multi-scale optimization technique to learn all the FCNN models simultaneously. For this purpose, two loss functions defined across the different resolution levels of the proposed representation are investigated. While the first function combines standard prediction and update loss functions, the second one aims to obtain a good approximation of the rate-distortion criterion. Experimental results carried out on two standard image datasets show the benefits of the proposed approaches in the context of lossy and lossless compression.
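To give a concrete, simplified illustration of the kind of structure the abstract describes, the sketch below implements a single 1D lifting step in which the prediction and update operators are small fully connected networks trained jointly. This is not the authors' 2D image coder: the network sizes, the neighborhood handling, and the training loss (a simple detail-energy plus smoothness proxy rather than the paper's entropy or rate-distortion losses) are all assumptions made for illustration, and the names `FCNN` and `lifting_step` are hypothetical.

```python
# Minimal illustrative sketch (assumed, not the authors' code): one 1D lifting step
# whose prediction and update operators are small fully connected networks,
# optimized jointly with a simplified surrogate loss.
import torch
import torch.nn as nn

class FCNN(nn.Module):
    """Small fully connected network mapping a neighborhood of samples to one output."""
    def __init__(self, in_size, hidden=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_size, hidden), nn.Tanh(),
            nn.Linear(hidden, 1),
        )
    def forward(self, x):
        return self.net(x)

def lifting_step(x, predictor, updater, context=2):
    """One lifting step on a batch of 1D signals x with even length.
    Split into even/odd polyphase samples, predict odd samples from even
    neighbors, then update even samples from the resulting details."""
    even, odd = x[:, 0::2], x[:, 1::2]                    # lazy wavelet split
    # Sliding neighborhoods of the even samples (circular padding for simplicity).
    pad_e = torch.cat([even[:, -context:], even, even[:, :context]], dim=1)
    nbh_e = pad_e.unfold(1, 2 * context + 1, 1)           # (B, N/2, 2*context+1)
    detail = odd - predictor(nbh_e).squeeze(-1)           # prediction step
    pad_d = torch.cat([detail[:, -context:], detail, detail[:, :context]], dim=1)
    nbh_d = pad_d.unfold(1, 2 * context + 1, 1)
    approx = even + updater(nbh_d).squeeze(-1)            # update step
    return approx, detail

# Hypothetical joint training loop: both FCNNs share one optimizer, and the loss
# penalizes detail energy (prediction quality) and approximation roughness (update quality).
B, N = 8, 64
predictor, updater = FCNN(5), FCNN(5)
opt = torch.optim.Adam(list(predictor.parameters()) + list(updater.parameters()), lr=1e-3)
for _ in range(100):
    x = torch.randn(B, N)
    approx, detail = lifting_step(x, predictor, updater)
    loss = detail.pow(2).mean() + 0.1 * (approx[:, 1:] - approx[:, :-1]).pow(2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
```

In the paper's setting, this surrogate loss would be replaced by the entropy-based or rate-distortion-oriented losses described in the abstract, evaluated across the multiple resolution levels of the lifting decomposition.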
Main file: TIP2023_Joint_Learning_of_FCNN_in_Lifting_Based_Image_Coders.pdf (3.65 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-04372168, version 1 (04-01-2024)

Identifiers

HAL Id: hal-04372168
DOI: 10.1109/TIP.2023.3333279

Cite

Tassnim Dardouri, Mounir Kaaniche, Amel Benazza-Benyahia, Gabriel Dauphin, Jean-Christophe Pesquet. Joint Learning of Fully Connected Network Models in Lifting Based Image Coders. IEEE Transactions on Image Processing, 2023, 33, pp.134-148. ⟨10.1109/TIP.2023.3333279⟩. ⟨hal-04372168⟩