Audio, visual, and audio-visual egocentric distance perception by moving participants in virtual environments
Abstract
A study on audio, visual, and audio-visual egocentric distance perception by moving participants in virtual environments is presented. Audio-visual rendering is provided using tracked passive visual stereoscopy and acoustic wave field synthesis (WFS). Distances are estimated using indirect blind walking (triangulation) under each rendering condition. Experimental results show that distances perceived in the virtual environment are accurately estimated or overestimated for rendered distances closer than the position of the audio-visual rendering system, and underestimated for farther distances. Interestingly, participants perceived each virtual object at a modality-independent distance, whether using the audio modality, the visual modality, or the combination of both. The results show that WFS is capable of synthesizing perceptually meaningful sound fields in terms of distance. Participants used dynamic audio-visual cues when estimating distances in the virtual world. Moving may have given participants better visual perception of close distances than they would have had when static. No correlation was found between the feeling of presence and visual distance underestimation. To explain the observed perceptual distance compression, it is proposed that, due to conflicting distance cues, the audio-visual rendering system physically anchors the virtual world to the real world. Virtual objects are thus attracted toward the physical audio-visual rendering system.
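As a point of reference for the triangulation procedure mentioned above, the following minimal sketch illustrates how a perceived distance is typically recovered in indirect blind walking. It assumes the standard geometry (the paper's exact protocol and variable names are not given here, so all names are hypothetical): the participant views or hears the target on the line of sight, then walks blindly to a turning point and faces the remembered target location; the perceived distance is taken where that facing ray crosses the original line of sight.

```python
import numpy as np

def triangulated_distance(turn_point, facing_deg):
    """Recover a perceived egocentric distance from triangulated blind walking.

    Assumed geometry (illustrative only): the participant starts at the origin
    facing the target along the +z axis; after the stimulus is removed, they
    walk blindly to turn_point = (x, z) and turn to face the remembered target.
    facing_deg is the facing azimuth in degrees measured from the +z axis
    (positive toward +x). The perceived target lies where the facing ray
    intersects the original line of sight (the z axis, x = 0).
    """
    px, pz = turn_point
    ux = np.sin(np.radians(facing_deg))  # x component of the facing direction
    uz = np.cos(np.radians(facing_deg))  # z component of the facing direction
    t = -px / ux                         # ray parameter at which x = 0
    return pz + t * uz                   # z coordinate of the intersection

# Example: walk 1 m to the right of the line of sight and 2 m forward, then
# face the remembered target at -45 degrees; the ray crosses x = 0 at z = 3 m.
print(triangulated_distance((1.0, 2.0), -45.0))  # -> 3.0
```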