Using Random Codebooks for Audio Neural AutoEncoders
Abstract
Latent representation learning has been an active field of study for decades across numerous applications. Inspired, among other things, by tokenization in Natural Language Processing and motivated by the search for simple data representations, recent works have introduced a quantization step into the feature extraction. In this work, we propose a novel strategy for building neural discrete representations by means of random codebooks. These codebooks are obtained by randomly sampling a large, predefined fixed codebook. We experimentally show the merits and potential of our approach on an audio compression and reconstruction task.
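The abstract only sketches the method at a high level. As an illustration of the general idea, and not the authors' exact procedure, the following PyTorch snippet shows one possible way to quantize encoder latents against a codebook drawn at random from a large, predefined fixed codebook. The module name, codebook sizes, distance metric, and the straight-through gradient trick are assumptions made for the sake of the example.

```python
import torch
import torch.nn as nn


class RandomCodebookQuantizer(nn.Module):
    """Illustrative sketch: a large fixed codebook is created once; at each
    forward pass a random subset is drawn and used for nearest-neighbour
    quantization with a straight-through gradient (assumed, not from the paper)."""

    def __init__(self, dim=64, master_size=65536, subset_size=1024, seed=0):
        super().__init__()
        g = torch.Generator().manual_seed(seed)
        # Large, predefined, non-trainable codebook.
        self.register_buffer("master_codebook", torch.randn(master_size, dim, generator=g))
        self.subset_size = subset_size

    def forward(self, z):
        # z: (batch, time, dim) latent vectors from the encoder.
        idx = torch.randint(0, self.master_codebook.shape[0],
                            (self.subset_size,), device=z.device)
        codebook = self.master_codebook[idx]                      # (subset_size, dim)
        # Nearest-neighbour assignment in Euclidean distance.
        dists = torch.cdist(z.reshape(-1, z.shape[-1]), codebook)  # (B*T, subset_size)
        codes = dists.argmin(dim=-1)
        z_q = codebook[codes].reshape(z.shape)
        # Straight-through estimator so gradients reach the encoder.
        z_q = z + (z_q - z).detach()
        return z_q, codes
```

In such a setup the quantizer would sit between the encoder and decoder of the autoencoder, e.g. `z_q, codes = quantizer(encoder(x))`, with the `codes` indices serving as the discrete representation to be entropy-coded for compression.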