Conference proceedings: The Eleventh International Conference on Learning Representations (ICLR 2023). Year: 2023

PowerQuant: Automorphism Search for Non-Uniform Quantization

Abstract

Deep neural networks (DNNs) are nowadays ubiquitous in many domains such as computer vision. However, due to their high latency, the deployment of DNNs hinges on the development of compression techniques such as quantization, which consists in lowering the number of bits used to encode the weights and activations. Growing concerns for privacy and security have motivated the development of data-free techniques, at the expense of accuracy. In this paper, we identify the uniformity of the quantization operator as a limitation of existing approaches, and propose a data-free non-uniform method. More specifically, we argue that, to be readily usable without dedicated hardware and implementation, non-uniform quantization should not change the nature of the mathematical operations performed by the DNN. This leads us to search among the continuous automorphisms of $(\mathbb{R}_+^*,\times)$, which boil down to the power functions defined by their exponent. To find this parameter, we propose to optimize the reconstruction error of each layer: in particular, we show that this procedure is locally convex and admits a unique solution. At inference time, we show that our approach, dubbed PowerQuant, only requires simple modifications of the quantized DNN's activation functions. As such, with only negligible overhead, it significantly outperforms existing methods in a variety of configurations.
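
To make the idea concrete, below is a minimal sketch (not the authors' implementation) of quantizing a weight tensor through a power automorphism x → x^a and picking the exponent by minimizing the per-layer reconstruction error. The function names (`power_quantize`, `search_exponent`), the per-tensor scale, and the grid search are illustrative assumptions; the paper shows the reconstruction objective is locally convex with a unique solution, so a proper solver would replace the grid here.

```python
import numpy as np

def power_quantize(w, a, n_bits=4):
    """Sketch: map |w| through the power automorphism x -> x**a,
    quantize uniformly in the transformed domain, then invert.
    Sign is handled separately since the automorphism acts on R_+^*."""
    s = np.sign(w)
    t = np.abs(w) ** a                      # apply the automorphism
    levels = 2 ** (n_bits - 1) - 1
    scale = t.max() / levels                # per-tensor scale (assumption)
    q = np.round(t / scale)                 # uniform quantization of t
    return s * (q * scale) ** (1.0 / a)     # dequantize, invert the power

def search_exponent(w, n_bits=4, grid=np.linspace(0.1, 2.0, 40)):
    """Pick the exponent a minimizing the L2 reconstruction error of w.
    A grid search stands in for the locally convex optimization
    described in the paper (illustrative only)."""
    errors = [np.linalg.norm(w - power_quantize(w, a, n_bits)) for a in grid]
    return grid[int(np.argmin(errors))]

# Usage on a random weight tensor standing in for one DNN layer
w = np.random.randn(256, 256).astype(np.float32)
a = search_exponent(w, n_bits=4)
w_q = power_quantize(w, a, n_bits=4)
print(f"best exponent a = {a:.3f}, "
      f"rel. error = {np.linalg.norm(w - w_q) / np.linalg.norm(w):.4f}")
```

Because the power map and its inverse bracket an otherwise standard uniform quantizer, at inference time the extra work can be absorbed into the surrounding activation functions, which is the "simple modification" the abstract refers to.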

Dates and versions

hal-04201009, version 1 (08-09-2023)

Identifiers

Cite

Edouard Yvinec, Arnaud Dapogny, Matthieu Cord, Kevin Bailly. PowerQuant: Automorphism Search for Non-Uniform Quantization. The Eleventh International Conference on Learning Representations (ICLR 2023), 2023. ⟨hal-04201009⟩