Conference paper — Year: 2020

Extending Deep Rhythm for Tempo and Genre Estimation Using Complex Convolutions, Multitask Learning and Multi-input Network

Abstract

Tempo and genre are two intertwined aspects of music: genres are often associated with rhythm patterns that are played in specific tempo ranges. In this paper, we focus on the recent Deep Rhythm system, which is based on a harmonic representation of rhythm used as input to a convolutional neural network. To take into account the relationships between frequency bands, we process complex-valued inputs through complex convolutions. We also study the joint estimation of tempo and genre using a multitask learning approach. Finally, we study the addition of a second input branch to the system, based on a VGG-like architecture applied to a mel-spectrogram input. This multi-input approach improves performance for both tempo and genre estimation.
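The paper itself gives the architectural details; as a rough illustration only, the PyTorch sketch below (not the authors' code) shows how the three ideas named in the abstract could be wired together: a complex-valued convolution implemented with paired real convolutions, a second VGG-like branch over a mel-spectrogram, and two classification heads (tempo, genre) trained jointly. All layer sizes, class counts, input shapes and loss weighting are illustrative assumptions.

```python
# Illustrative sketch only: complex convolution + multi-input fusion + multitask heads.
import torch
import torch.nn as nn

class ComplexConv2d(nn.Module):
    """Complex convolution: (x_r + i*x_i) * (w_r + i*w_i), using two real conv layers."""
    def __init__(self, in_ch, out_ch, kernel_size, padding=0):
        super().__init__()
        self.conv_r = nn.Conv2d(in_ch, out_ch, kernel_size, padding=padding)
        self.conv_i = nn.Conv2d(in_ch, out_ch, kernel_size, padding=padding)

    def forward(self, x_r, x_i):
        real = self.conv_r(x_r) - self.conv_i(x_i)
        imag = self.conv_r(x_i) + self.conv_i(x_r)
        return real, imag

class MultiInputMultitaskNet(nn.Module):
    def __init__(self, n_tempo_classes=256, n_genre_classes=10):
        super().__init__()
        # Branch 1: complex convolution over a complex-valued rhythm representation.
        self.cconv = ComplexConv2d(1, 8, kernel_size=3, padding=1)
        self.rhythm_pool = nn.AdaptiveAvgPool2d((4, 4))
        # Branch 2: small VGG-like stack over a mel-spectrogram.
        self.mel_branch = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d((4, 4)),
        )
        fused_dim = 8 * 4 * 4 + 32 * 4 * 4
        # Multitask heads: tempo classes and genre classes predicted jointly.
        self.tempo_head = nn.Linear(fused_dim, n_tempo_classes)
        self.genre_head = nn.Linear(fused_dim, n_genre_classes)

    def forward(self, rhythm_real, rhythm_imag, mel):
        r, i = self.cconv(rhythm_real, rhythm_imag)
        rhythm_feat = self.rhythm_pool(torch.sqrt(r**2 + i**2 + 1e-8)).flatten(1)
        mel_feat = self.mel_branch(mel).flatten(1)
        h = torch.cat([rhythm_feat, mel_feat], dim=1)
        return self.tempo_head(h), self.genre_head(h)

# Joint training signal: sum of the two classification losses (dummy data for shape check).
model = MultiInputMultitaskNet()
tempo_logits, genre_logits = model(torch.randn(2, 1, 64, 64),
                                   torch.randn(2, 1, 64, 64),
                                   torch.randn(2, 1, 128, 256))
loss = (nn.functional.cross_entropy(tempo_logits, torch.randint(0, 256, (2,)))
        + nn.functional.cross_entropy(genre_logits, torch.randint(0, 10, (2,))))
```

In such a setup, the shared trunk is optimized against both losses at once, which is the usual way a multitask objective lets tempo and genre estimation inform each other; the actual representation, layer configuration and loss weighting used in the paper may differ.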
Main file: 2020_AIMUSIC.pdf (346.25 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-03127155, version 1 (01-02-2021)

License

Copyright (All rights reserved)

Identifiers

  • HAL Id: hal-03127155, version 1

Cite

Hadrien Foroughmand, Geoffroy Peeters. Extending Deep Rhythm for Tempo and Genre Estimation Using Complex Convolutions, Multitask Learning and Multi-input Network. The 2020 Joint Conference on AI Music Creativity, Bob Sturm, Oct 2020, Stockholm, Sweden. ⟨hal-03127155⟩