Virtual Objects Look Farther on the Sides: The Anisotropy of Distance Perception in Virtual Reality
Abstract
The topic of distance perception has been widely investigated in Virtual Reality (VR). However, the vast majority of previous work focused on distance perception of objects placed in front of the observer. But what happens when the observer looks to the side? In this paper, we study differences in distance estimation when comparing objects placed in front of the observer with objects placed to their side. Through a series of four experiments (n=85), we assessed participants' distance estimates and ruled out potential biases. In particular, we considered the placement of visual stimuli in the field of view, users' exploration behavior, and the presence of depth cues. All experiments used a standardized two-alternative forced choice (2AFC) psychophysical protocol in which the main task was to report which of two stimuli seemed farther away. In summary, our results showed that the orientation of virtual stimuli with respect to the user introduces a distance perception bias: objects placed to the sides are systematically perceived as farther away than objects in front. In addition, we observed that this bias increases with the angle and appears to be independent of both the position of the object in the field of view and the quality of the virtual scene. This work sheds new light on one of the specificities of VR environments within the wider subject of visual space theory. Our study paves the way for future experiments evaluating the anisotropy of distance perception in real and virtual environments.
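To make the 2AFC paradigm concrete, the sketch below simulates a block of distance-comparison trials in Python. All parameter values (reference distance, probe distances, eccentricities, bias and noise magnitudes) and the simulated observer are illustrative assumptions, not the authors' protocol or data; it only shows the shape of the data such a protocol produces, i.e. per-condition proportions of "probe judged farther" responses to which a psychometric function would be fitted.

```python
import numpy as np

# Minimal 2AFC distance-comparison sketch with a simulated observer.
# All numeric values below are hypothetical, chosen only for illustration.

rng = np.random.default_rng(0)

REFERENCE_DISTANCE = 2.0                    # m, reference placed in front (assumed value)
PROBE_DISTANCES = np.linspace(1.6, 2.4, 9)  # m, probe comparison levels (assumed)
ECCENTRICITIES = [0, 45, 90]                # deg from straight ahead (assumed)
BIAS_PER_DEGREE = 0.002                     # m of overestimation per degree, illustrative only
NOISE_SD = 0.15                             # judgment noise (m), illustrative only

def simulated_response(reference, probe, eccentricity):
    """Return True if the probe is judged farther than the reference.
    A lateral probe is perceived slightly farther than it really is,
    mimicking the anisotropy reported in the paper (magnitudes made up)."""
    perceived_probe = probe + BIAS_PER_DEGREE * eccentricity + rng.normal(0, NOISE_SD)
    perceived_reference = reference + rng.normal(0, NOISE_SD)
    return perceived_probe > perceived_reference

def run_block(n_repeats=20):
    """Collect 'probe judged farther' proportions per condition: the raw
    2AFC data a psychometric function (and its point of subjective
    equality) would be fitted to."""
    results = {}
    for ecc in ECCENTRICITIES:
        for probe in PROBE_DISTANCES:
            judgments = [simulated_response(REFERENCE_DISTANCE, probe, ecc)
                         for _ in range(n_repeats)]
            results[(ecc, round(probe, 2))] = np.mean(judgments)
    return results

if __name__ == "__main__":
    for (ecc, probe), p in sorted(run_block().items()):
        print(f"eccentricity {ecc:>2} deg, probe {probe:.2f} m -> P(probe farther) = {p:.2f}")
```

In this toy version, a sideways shift of the psychometric function's 50% point at larger eccentricities would correspond to the lateral overestimation bias the abstract describes.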