Visuo-Haptic Rendering of the Hand during 3D Manipulation in Augmented Reality
Abstract
Manipulating virtual objects with bare hands is a key interaction in Augmented Reality (AR) applications. However, several limitations still affect this manipulation, including the lack of mutual visual occlusion between virtual and real content and the lack of haptic sensations. To address these two issues, we investigate the role of visuo-haptic rendering of the hand as sensory feedback. The first experiment explores the effect of showing the user's hand as seen by the AR system through an avatar, comparing six visual hand renderings. The second experiment explores the effect of visuo-haptic hand rendering by comparing two vibrotactile contact techniques delivered at four delocalized positions on the hand, each combined with the two most representative visual hand renderings from the first experiment. Results show that delocalized vibrotactile haptic hand rendering improved perceived effectiveness, realism, and usefulness when provided close to the contact point. However, the farthest rendering position, i.e., on the contralateral hand, yielded the best performance even though it was largely disliked. The visual hand rendering was perceived as less necessary for manipulation when the haptic hand rendering was available, but it still provided useful feedback on the hand tracking.