Journal article in IEEE Transactions on Neural Networks and Learning Systems, Year: 2024

SinKD: Sinkhorn Distance Minimization for Knowledge Distillation

Abstract

Knowledge distillation (KD) has been widely adopted to compress large language models (LLMs). Existing KD methods investigate various divergence measures, including the Kullback-Leibler (KL), reverse Kullback-Leibler (RKL), and Jensen-Shannon (JS) divergences. However, due to limitations inherent in their assumptions and definitions, these measures fail to deliver effective supervision when there is little distribution overlap between the teacher and the student. In this paper, we show that the KL, RKL, and JS divergences respectively suffer from mode-averaging, mode-collapsing, and mode-underestimation, which deteriorate logits-based KD for diverse NLP tasks. We propose Sinkhorn Knowledge Distillation (SinKD), which exploits the Sinkhorn distance to ensure a nuanced and precise assessment of the disparity between the teacher and student distributions. Moreover, thanks to the properties of the Sinkhorn metric, we dispense with sample-wise KD, which restricts the perception of divergence to each teacher-student sample pair. Instead, we propose a batch-wise reformulation that captures the geometric intricacies of distributions across samples in the high-dimensional space. Comprehensive evaluation on GLUE and SuperGLUE, in terms of comparability, validity, and generalizability, highlights our superiority over state-of-the-art methods on all kinds of LLMs with encoder-only, encoder-decoder, and decoder-only architectures. Codes and models are available at https://github.com/2018cx/SinKD.
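To make the batch-wise Sinkhorn objective described in the abstract more concrete, the sketch below computes an entropy-regularized optimal transport (Sinkhorn) distance between the softened output distributions of a teacher and a student across a batch. This is a minimal illustrative PyTorch sketch only: the function name, the cost choice (pairwise L1 distance between softmax outputs), the temperature, the regularization strength, and the iteration count are assumptions for exposition, not the released SinKD implementation (see the GitHub repository above for the authors' code).

```python
import torch
import torch.nn.functional as F

def batch_sinkhorn_distance(teacher_logits, student_logits,
                            epsilon=0.1, n_iters=20, temperature=2.0):
    """Batch-wise Sinkhorn distance between teacher and student outputs.

    Illustrative sketch: softened softmax outputs are treated as discrete
    distributions, a pairwise cost matrix is built across the batch, and
    Sinkhorn-Knopp iterations yield an entropy-regularized OT loss.
    All hyperparameter values here are placeholders, not the paper's.
    """
    # Softened probability distributions over classes, shape (B, C)
    t_prob = F.softmax(teacher_logits / temperature, dim=-1)
    s_prob = F.softmax(student_logits / temperature, dim=-1)

    # Cost matrix across the batch: L1 distance between teacher sample i
    # and student sample j, shape (B, B)
    cost = torch.cdist(t_prob, s_prob, p=1)

    # Gibbs kernel induced by the cost matrix
    K = torch.exp(-cost / epsilon)

    # Uniform marginals over the batch
    B = t_prob.size(0)
    a = torch.full((B,), 1.0 / B, device=t_prob.device)
    b = torch.full((B,), 1.0 / B, device=t_prob.device)

    # Sinkhorn-Knopp iterations for the scaling vectors u, v
    u = torch.ones_like(a)
    for _ in range(n_iters):
        v = b / (K.t() @ u + 1e-9)
        u = a / (K @ v + 1e-9)

    # Transport plan and the resulting Sinkhorn distance
    plan = torch.diag(u) @ K @ torch.diag(v)
    return torch.sum(plan * cost)
```

In a training loop, such a loss would typically be combined with the standard task loss, e.g. `loss = ce_loss + lam * batch_sinkhorn_distance(t_logits, s_logits)`, where `lam` is a tunable weight; this combination is an assumption for illustration rather than the paper's exact formulation.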
Main file: TNNLS-preview.pdf (3.02 MB)
Origin: Files produced by the author(s)
Comment: Accepted by IEEE TNNLS

Dates and versions

hal-04803835, version 1 (26-11-2024)

Identifiers

  • HAL Id: hal-04803835, version 1

Cite

Xiao Cui, Yulei Qin, Yuting Gao, Enwei Zhang, Zihan Xu, et al.. SinKD: Sinkhorn Distance Minimization for Knowledge Distillation. IEEE Transactions on Neural Networks and Learning Systems, In press. ⟨hal-04803835⟩