Rendering embodied experience into multimodal data: concepts, tools and applications for Xenakis' piano performance
Conference paper, Year: 2022


Abstract

The core of the workshop will be the presentation of the GesTCom system (acronym for Gesture Cutting Through Textual Complexity), developed since 2014 in collaboration with IRCAM (interaction-son-musique-mouvement team). GesTCom is a sensor-based environment for visualizing, analyzing and following the pianist's gestures in relation to the notation. It comprises four modules, implemented as Max/MSP patches featuring the MuBu toolbox and connected to INScore scripts: a) a module for the synchronized recording of multimodal performance data; b) a module for the playback and analysis of these data; c) a module for processing the notation on the basis of the data; d) a module for real-time gestural interaction with the notation.
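As an illustration of the kind of coupling the abstract describes between the Max/MSP modules and INScore scripts (not the authors' actual GesTCom patches), here is a minimal Python sketch that sends OSC messages to a locally running INScore viewer. The use of the python-osc package, the default UDP port 7000, the object name "score" and the notation fragment are all assumptions made for the example.

```python
# Minimal sketch (assumptions, not the authors' GesTCom implementation):
# drive a locally running INScore viewer over OSC, the way a gesture-driven
# module could update the displayed notation during performance.
from pythonosc.udp_client import SimpleUDPClient

INSCORE_HOST = "127.0.0.1"
INSCORE_PORT = 7000  # assumed INScore default input port

client = SimpleUDPClient(INSCORE_HOST, INSCORE_PORT)

# Display a short Guido Music Notation fragment as a scene object named "score"
# (hypothetical object name), then shift it horizontally.
client.send_message("/ITL/scene/score", ["set", "gmn", "[ c d e f g ]"])
client.send_message("/ITL/scene/score", ["x", -0.5])
```

In GesTCom itself this messaging is handled from within the Max/MSP patches; the sketch only shows the message flow from a sensing/analysis process toward the INScore-rendered notation.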
Main file
2022_Xenakis22_workshop.pdf (1.5 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-04420668, version 1 (29-01-2024)

License

Attribution - NonCommercial - NoDerivatives (CC BY-NC-ND)

Identifiers

  • HAL Id: hal-04420668, version 1

Cite

Pavlos Antoniadis, Jean-François Jego, Aurélien Duval, Frédéric Bevilacqua, Stella Paschalidou. Rendering embodied experience into multimodal data: concepts, tools and applications for Xenakis' piano performance. Xenakis 22: Centenary International Symposium, May 2022, Athens / Nafplio, Greece. ⟨hal-04420668⟩