Conference Papers, Year: 2020

AUDIO-BASED AUTO-TAGGING WITH CONTEXTUAL TAGS FOR MUSIC

Karim M Ibrahim
Jimena Royo-Letelier
Elena V. Epure
Geoffroy Peeters
Gael Richard

Abstract

Music listening context, such as location or activity, has been shown to greatly influence users' musical tastes. In this work, we study the relationship between user context and audio content in order to enable context-aware music recommendation that is agnostic to user data. To this end, we propose a semi-automatic procedure for collecting track sets that leverages playlist titles as a proxy for context labelling. Using this procedure, we create and release a dataset of ∼50k tracks labelled with 15 different contexts. We then present benchmark classification results on the created dataset using an audio auto-tagging model. Since the training and evaluation of such models are affected by missing negative labels due to incomplete annotations, we propose a sample-level weighted cross-entropy loss that accounts for the confidence in missing labels, and we show improved context prediction results.
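The sample-level weighting idea can be sketched in code. The following is a minimal illustration rather than the authors' implementation: it assumes a PyTorch multi-label setup with 15 context tags, and the confidence value of 0.3 assigned to unobserved (assumed negative) labels is a hypothetical placeholder, not a value from the paper.

```python
import torch
import torch.nn as nn

class WeightedBCELoss(nn.Module):
    """Binary cross-entropy with per-sample, per-label confidence weights.

    Unobserved (assumed negative) labels receive a lower weight so the model
    is penalized less for predicting a context that may simply be missing
    from the annotations (illustrative scheme, not the paper's exact one).
    """
    def __init__(self):
        super().__init__()
        self.bce = nn.BCEWithLogitsLoss(reduction="none")

    def forward(self, logits, targets, confidence):
        # logits, targets, confidence: (batch, n_contexts)
        loss = self.bce(logits, targets)   # element-wise BCE loss
        loss = loss * confidence           # damp uncertain negative labels
        return loss.mean()

# Example usage with 15 context tags (hypothetical numbers):
n_contexts = 15
logits = torch.randn(8, n_contexts)                     # model outputs
targets = torch.randint(0, 2, (8, n_contexts)).float()  # observed tags
# Trust observed positives fully; trust missing negatives only at 0.3.
confidence = torch.where(targets == 1.0,
                         torch.ones_like(targets),
                         torch.full_like(targets, 0.3))
criterion = WeightedBCELoss()
print(criterion(logits, targets, confidence))
```

Down-weighting the element-wise loss on uncertain negatives keeps incomplete annotations from dominating training of the auto-tagging model.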
Main file

ICASSP2020_context_v5.1.pdf (237.35 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-02481374, version 1 (17-02-2020)

Identifiers

HAL Id: hal-02481374
DOI: 10.5281/zenodo.3648287

Cite

Karim M Ibrahim, Jimena Royo-Letelier, Elena V. Epure, Geoffroy Peeters, Gael Richard. AUDIO-BASED AUTO-TAGGING WITH CONTEXTUAL TAGS FOR MUSIC. International Conference on Acoustics, Speech, and Signal Processing (ICASSP), May 2020, Barcelona, Spain. ⟨10.5281/zenodo.3648287⟩. ⟨hal-02481374⟩