Journal article in Pattern Recognition Letters, 2024

Channel-spatial knowledge distillation for efficient semantic segmentation

Ayoub Karine
Thibault Napoléon
Maher Jridi

Abstract

In this paper, we propose a new lightweight Channel-Spatial Knowledge Distillation (CSKD) method for efficient image semantic segmentation. More precisely, we investigate the KD approach, which trains a compressed neural network, called the student, under the supervision of a heavy one, called the teacher. In this context, we improve the distillation mechanism by capturing contextual dependencies along the spatial and channel dimensions through a self-attention principle. In addition, to quantify the difference between the teacher and student knowledge, we adopt the Centered Kernel Alignment (CKA) metric, which spares the student from adding extra learning layers to match the teacher feature size. Experimental results on the Cityscapes, CamVid and Pascal VOC datasets demonstrate that our method achieves outstanding performance. The code is available at https://github.com/ayoubkarine/CSKD
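To make these two ingredients concrete, the sketch below is a minimal PyTorch reading of the abstract, not the authors' released code: it builds channel-wise and spatial-wise self-attention maps from a feature tensor and compares the teacher and student maps with linear CKA. The function names, the linear (rather than kernel) CKA variant, and the 1-minus-CKA loss form are all assumptions for illustration.

import torch

def spatial_attention(f):
    # Hypothetical sketch: softmax-normalized affinity between the
    # spatial positions of a (B, C, H, W) feature map.
    b, c, h, w = f.shape
    x = f.view(b, c, h * w)                               # (B, C, HW)
    return torch.softmax(x.transpose(1, 2) @ x, dim=-1)  # (B, HW, HW)

def channel_attention(f):
    # Hypothetical sketch: softmax-normalized affinity between channels.
    b, c, h, w = f.shape
    x = f.view(b, c, h * w)                               # (B, C, HW)
    return torch.softmax(x @ x.transpose(1, 2), dim=-1)  # (B, C, C)

def linear_cka(x, y):
    # Linear Centered Kernel Alignment between feature matrices
    # x: (n, d1) and y: (n, d2). Since d1 and d2 may differ, no extra
    # learnable projection is needed to match feature sizes.
    x = x - x.mean(dim=0, keepdim=True)
    y = y - y.mean(dim=0, keepdim=True)
    num = torch.linalg.norm(y.t() @ x) ** 2               # ||Y^T X||_F^2
    den = torch.linalg.norm(x.t() @ x) * torch.linalg.norm(y.t() @ y)
    return num / den

def cskd_distill_loss(f_teacher, f_student):
    # Hypothetical loss: push the student's attention maps toward
    # CKA similarity 1 with the teacher's, over the batch dimension.
    b = f_teacher.shape[0]
    loss = 0.0
    for attn in (spatial_attention, channel_attention):
        t = attn(f_teacher).reshape(b, -1)
        s = attn(f_student).reshape(b, -1)
        loss = loss + (1.0 - linear_cka(t, s))
    return loss

Because CKA tolerates mismatched feature dimensions, the channel-attention maps of a wide teacher and a narrow student can be compared directly, which is the property the abstract highlights.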
No file deposited

Dates and versions

hal-04488459, version 1 (04-03-2024)

Identifiers

Cite

Ayoub Karine, Thibault Napoléon, Maher Jridi. Channel-spatial knowledge distillation for efficient semantic segmentation. Pattern Recognition Letters, 2024, 180, pp.48-54. ⟨10.1016/j.patrec.2024.02.027⟩. ⟨hal-04488459⟩