Sonifying tactile interactions and their underlying emotions: experimental studies and applications in virtual reality.
Abstract
With the rise of social isolation and remote communication, it appears timely to enable socio-affective interactions at a distance. To this end, we developed a methodology combining a novel sonification technique, which treats the skin as a singular sonic texture, with prototypical social touch gestures. Three studies explored the feasibility of this technique, its core components, and its potential application in virtual reality. In the first study, the vibratory signals produced by prototypical skin-to-skin touches were recorded with a violin microphone and converted into sounds. The resulting sonified signals were presented to participants, who were able to accurately categorize both the gestures (stroking, rubbing, tapping, hitting) and their underlying emotional intentions (love, empathy, joy, impatience, fear, anger). The second study investigated the respective roles of rhythm and textural properties in participants’ ability to recognize social touch through sound. The same tactile gestures as in the first study were reproduced under two surface conditions: skin-to-skin and object-to-object. The results revealed that the dynamics of the surfaces involved are crucial, and hence that skin-to-skin interactions carry information that sets them apart from object-to-object movements. The third study was conducted with Greta, a virtual-agent platform that allows 3D touch gestures to be simulated. We quantified differences in how participants perceived the virtual agent during a game in which the agent did or did not use social touch, conveyed through our sonified signals, to express emotional intentions. These results pave the way for social touch at a distance, both with humans and with virtual agents.
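To illustrate the general idea behind the sonification step described above (recording the vibratory signal of a touch and rendering it as audible sound), the following minimal Python sketch reads a hypothetical contact-microphone recording, band-pass filters it to an audible range, normalizes it, and writes it back as an audio file. The file names, filter band, and normalization are illustrative assumptions; they are not the processing pipeline used in the studies.

# Minimal sketch (not the thesis pipeline): rendering a recorded vibratory
# signal as an audible sound file. File names and processing choices
# (band-pass range, normalization) are illustrative assumptions.
import numpy as np
from scipy.io import wavfile
from scipy.signal import butter, sosfiltfilt

# Hypothetical recording captured with a violin (contact) microphone.
rate, vib = wavfile.read("skin_stroke_recording.wav")
vib = vib.astype(np.float64)
if vib.ndim > 1:                      # mix down to mono if needed
    vib = vib.mean(axis=1)

# Keep an audible band; cutoff values are arbitrary and assume rate > 16 kHz.
sos = butter(4, [40, 8000], btype="bandpass", fs=rate, output="sos")
audible = sosfiltfilt(sos, vib)

# Normalize to avoid clipping and export as 16-bit PCM audio.
audible /= np.max(np.abs(audible)) + 1e-12
wavfile.write("skin_stroke_sonified.wav", rate, (audible * 32767).astype(np.int16))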