Conference paper · Year: 2023

Toward a human-like sound perception for reactive virtual agents

Audrey Pichard
Gauthier Couzon
Eliott Zimmermann
Pierre Raimbaud

Abstract

Human social interactions rely on multisensory cues. In this regard, visual and auditory cues are paramount during the initiation of an interaction. In this preliminary work, we propose an approach that lets Intelligent Virtual Agents (IVAs) simulate sound perception capabilities. Our model aims to control IVAs' reactive behaviour through the analysis of sounds emitted by other agents. To this end, we explore auditory features close to those of the human auditory system.

CCS CONCEPTS • Computing methodologies → Simulation environments; Motion processing; Procedural animation.
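The abstract only outlines the approach, so the sketch below is a minimal, hypothetical illustration of how an IVA could turn a perceived sound into a reactive behaviour. It assumes a distance-attenuated sound level and a hearing threshold as the auditory features; the paper's actual feature set, class names, and parameter values are not given here, so everything in the code is an assumption.

```python
import math
from dataclasses import dataclass

# Hypothetical sketch: one plausible pipeline (inverse-square distance
# attenuation plus a perception threshold) for driving an IVA's reaction.
# The paper's actual auditory features may differ.

@dataclass
class SoundEvent:
    position: tuple[float, float]   # emitter position (x, y) in metres
    power_db: float                 # sound level at 1 m, in dB

@dataclass
class VirtualAgent:
    position: tuple[float, float]
    heading: float                       # current facing direction, radians
    hearing_threshold_db: float = 20.0   # assumed minimal perceivable level

    def perceived_level(self, event: SoundEvent) -> float:
        """Attenuate the emitted level with distance (inverse-square law)."""
        dx = event.position[0] - self.position[0]
        dy = event.position[1] - self.position[1]
        distance = max(math.hypot(dx, dy), 1.0)
        return event.power_db - 20.0 * math.log10(distance)

    def react_to(self, event: SoundEvent) -> bool:
        """Turn toward the sound source if it is loud enough to be perceived."""
        if self.perceived_level(event) < self.hearing_threshold_db:
            return False
        dx = event.position[0] - self.position[0]
        dy = event.position[1] - self.position[1]
        self.heading = math.atan2(dy, dx)  # reactive orientation change
        return True

# Example: an agent 10 m away from a 60 dB source perceives 40 dB and reacts.
agent = VirtualAgent(position=(0.0, 0.0), heading=0.0)
footsteps = SoundEvent(position=(10.0, 0.0), power_db=60.0)
print(agent.react_to(footsteps))  # True
```

In this sketch the reactive behaviour is simply an orientation change toward the source; a richer model could select among several behaviours depending on the perceived loudness or direction.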
Main file
Toward_a_human_like_sound_perception_for_reactive_virtual_agents (1).pdf (1.38 MB)
Origin: Files produced by the author(s)
Licence: CC BY-NC-SA - Attribution - NonCommercial - ShareAlike

Dates and versions

hal-04418559 , version 1 (30-01-2024)

Identifiers

HAL Id: hal-04418559
DOI: 10.1145/3570945.3607346

Cite

Audrey Pichard, Gauthier Couzon, Eliott Zimmermann, Pierre Raimbaud. Toward a human-like sound perception for reactive virtual agents. IVA '23: ACM International Conference on Intelligent Virtual Agents, Sep 2023, Würzburg, Germany. pp. 1-4, ⟨10.1145/3570945.3607346⟩. ⟨hal-04418559⟩