Conference Paper, Year: 2023

Network Memory Footprint Compression Through Jointly Learnable Codebooks and Mappings

Abstract

The massive interest in deep neural networks (DNNs) for both computer vision and natural language processing has been sparked by the growth in computational power. However, this growth has also increased the memory footprint of models, to the point where simply loading one can be challenging on commodity devices such as mobile phones. To address this limitation, quantization is a favored solution, as it maps high-precision tensors to a low-precision, memory-efficient format. In terms of memory footprint reduction, its most effective variants are based on codebooks. These methods, however, suffer from two limitations. First, they either define a single codebook for each tensor or use a memory-expensive mapping to multiple codebooks. Second, gradient descent optimization of the mapping favors jumps toward extreme values, hence not defining a proximal search. In this work, we propose to address these two limitations. First, we group similarly distributed neurons and leverage the re-ordered structure to either apply different scale factors to the different groups or map the weights that fall in these groups to several codebooks, without any mapping overhead. Second, stemming from this initialization, we propose a joint learning of the codebooks and weight mappings that bears similarities with recent gradient-based post-training quantization techniques. Third, drawing inspiration from straight-through estimation techniques, we introduce a novel gradient update definition to enable a proximal search of the codebooks and their mappings. The proposed jointly learnable codebooks and mappings (JLCM) method allows a very efficient approximation of any DNN; for instance, a Llama 7B model can be compressed down to 2 GB and loaded on a 5-year-old smartphone.
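For illustration, the sketch below shows how grouping and per-group codebooks of this kind can be wired together in PyTorch. It is not the authors' JLCM implementation: the grouping heuristic (sorting neurons by the standard deviation of their weights), the codebook size, and the function names are assumptions made for this example. Gradients reach both the weights and the codebook entries through a straight-through estimator, echoing the joint learning described above.

import torch

def group_rows(weight: torch.Tensor, num_groups: int):
    # Sort output neurons (rows) by the spread of their weights so that
    # similarly distributed neurons become contiguous, then split them
    # into equally sized groups. Because groups follow the sorted order,
    # no per-weight codebook mapping needs to be stored.
    order = weight.std(dim=1).argsort()
    return order.chunk(num_groups)

def quantize_group(rows: torch.Tensor, codebook: torch.Tensor):
    # Map each weight to its nearest codebook entry.
    dist = (rows.reshape(-1, 1) - codebook.reshape(1, -1)).abs()
    idx = dist.argmin(dim=1)
    hard = codebook[idx].reshape(rows.shape)
    # Straight-through estimator: the forward pass uses the quantized
    # values, while the backward pass treats quantization as the identity
    # for `rows` and still propagates gradients to the codebook entries.
    return hard + rows - rows.detach()

# Toy usage: one small learnable codebook per group of similar neurons.
weight = torch.randn(64, 128, requires_grad=True)
codebooks = [torch.nn.Parameter(torch.linspace(-1.0, 1.0, 16)) for _ in range(4)]
groups = group_rows(weight.detach(), num_groups=4)
quantized = weight.clone()
for rows, cb in zip(groups, codebooks):
    quantized[rows] = quantize_group(weight[rows], cb)

In this toy setting, storing four 16-entry codebooks plus per-weight 4-bit indices is far cheaper than the original 32-bit weights, which is the memory saving the abstract refers to.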
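The proximal search mentioned in the abstract can be pictured as follows, continuing the PyTorch sketch above: instead of letting a gradient step send a weight's assignment to a far-away codebook entry, each mapping moves by at most one slot per update. This is only a schematic reading of the idea, assuming a codebook sorted in ascending order; the paper's actual gradient update definition may differ.

def proximal_reassign(idx: torch.Tensor, grad: torch.Tensor, codebook_size: int):
    # Hedged sketch: move each weight's codebook index by at most one
    # position, in the direction that locally decreases the loss
    # (valid when codebook entries are sorted in ascending order),
    # rather than jumping to an extreme entry after a full gradient step.
    step = -grad.sign().long()
    return (idx + step).clamp(0, codebook_size - 1)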

Dates and versions

hal-04425825, version 1 (30-01-2024)

Identifiers

Cite

Edouard Yvinec, Arnaud Dapogny, Kevin Bailly. Network Memory Footprint Compression Through Jointly Learnable Codebooks and Mappings. The International Conference on Learning Representations (ICLR'24), May 2024, Vienna, Austria. ⟨hal-04425825⟩