Conference paper, Year: 2023

Touch Interaction for Corpus-based Audio-Visual Synthesis

Abstract

Audiovisual corpus-based synthesis extends the principle of concatenative sound synthesis to the visual domain. In addition to the sound corpus (a collection of segments of recorded sound, each annotated with a perceptual description of its sound character), the artist uses a corpus of still images annotated with visual perceptual descriptors (colour, texture, brightness). An audiovisual musical performance is created by navigating in real time through both descriptor spaces at once: moving through the collection of sound grains in the space of perceptual audio descriptors while simultaneously selecting images from the visual corpus for rendering, so that both corpora are traversed in parallel under interactive gestural control via touch sensing. The artistic-scientific question explored here is how to control the navigation through the audio and the image descriptor spaces simultaneously with gestures, in other words, how to link the touch input to both descriptor spaces in order to create a multi-modal, embodied audiovisual experience.
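The abstract does not specify an implementation, so the following Python sketch is an illustration only of the kind of linkage it describes: one normalised touch position is used as a query point in both descriptor spaces, and the nearest sound grain and nearest image are selected for playback and rendering. The descriptor choices, corpus sizes, and nearest-neighbour mapping are assumptions made for this sketch, not the system presented in the paper.

    import numpy as np
    from scipy.spatial import cKDTree

    # Hypothetical corpora: each row is one unit (sound grain or image),
    # described by two perceptual descriptors normalised to [0, 1].
    # Audio could be e.g. (spectral centroid, loudness); visual could be
    # e.g. (brightness, texture granularity). Random data stands in for
    # the analysed corpora here.
    audio_descriptors = np.random.rand(5000, 2)   # placeholder sound corpus
    image_descriptors = np.random.rand(2000, 2)   # placeholder image corpus

    audio_tree = cKDTree(audio_descriptors)
    image_tree = cKDTree(image_descriptors)

    def on_touch(x, y):
        """Map one normalised touch position (x, y in [0, 1]) to both corpora.

        The same 2D position is interpreted as a target point in the audio
        descriptor space and in the visual descriptor space, so a single
        gesture drives the parallel navigation through both corpora.
        """
        target = np.array([x, y])
        _, grain_index = audio_tree.query(target)   # nearest sound grain
        _, image_index = image_tree.query(target)   # nearest image
        return grain_index, image_index

    # Example: a finger at the centre of the touch surface selects the grain
    # and image whose descriptors lie closest to (0.5, 0.5).
    grain, image = on_touch(0.5, 0.5)
    print(grain, image)

In practice the two spaces need not share one projection: additional touch dimensions (pressure, multiple fingers) could be mapped differently to the audio and visual descriptors, which is precisely the linkage question the paper raises.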
Main file: schwarz-nime2023-cocavs.pdf (1.26 MB)
Origin: Files produced by the author(s)
Dates and versions

hal-04355949, version 1 (20-12-2023)

Identifiers

  • HAL Id: hal-04355949, version 1

Cite

Diemo Schwarz. Touch Interaction for Corpus-based Audio-Visual Synthesis. New Interfaces for Musical Expression (NIME), May 2023, Mexico City, Mexico. ⟨hal-04355949⟩
