Lecture. Year: 2023

Hap2Gest: An Eyes-Free Interaction Concept with Smartphones Using Gestures and Haptic Feedback

Abstract

Smartphones are used in many contexts, including situations where the visual and auditory modalities are limited (e.g., walking or driving). For such situations, we introduce a new interaction concept, called Hap2Gest, that lets users issue commands and retrieve information, both eyes-free. A command is first invoked with an input gesture; the output information is then retrieved through haptic feedback perceived while the user draws an output gesture. We conducted an elicitation study with 12 participants to determine users' preferences for these input and output gestures and for the vibration patterns across 25 referents. Our findings indicate that users tend to use the same gesture for input and output, and that there is a clear relationship between the type of gestures and vibration patterns users suggest and the type of output information. We show that the agreement rate on the gesture's speed profile is significantly higher than the agreement rate on its shape, so the speed profile can be used by the recognizer when the shape agreement rate is low. Finally, we present a complete set of user-defined gestures and vibration patterns and address the gesture recognition problem.
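The abstract compares agreement rates for the gesture's shape and for its speed profile. As a point of reference only, the sketch below computes a per-referent agreement rate in the standard Vatavu and Wobbrock (2015) formulation; this formula is an assumption about the metric used, and the function name and example data are hypothetical, not taken from the paper.

```python
from collections import Counter

def agreement_rate(proposals):
    """Agreement rate AR(r) for a single referent, assuming the
    Vatavu & Wobbrock (2015) formula: the share of participant
    pairs that proposed the same sign (e.g., gesture shape,
    speed profile, or vibration pattern) for that referent."""
    n = len(proposals)
    if n < 2:
        return 1.0  # a single proposal trivially agrees with itself
    counts = Counter(proposals)
    return sum(c * (c - 1) for c in counts.values()) / (n * (n - 1))

# Hypothetical data: 12 participants each propose a gesture shape for one referent.
shapes = ["circle"] * 7 + ["swipe_right"] * 3 + ["zigzag"] * 2
print(round(agreement_rate(shapes), 3))  # 0.379
```

A rate close to 1 means most participants proposed the same sign for the referent; the paper's observation that speed-profile agreement exceeds shape agreement is what allows the recognizer to fall back on the speed profile when shapes diverge.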

Dates and versions

hal-04204271, version 1 (12-09-2023)


Cite

Milad Jamalzadeh, Yosra Rekik, Alexandru Dancu, Laurent Grisoni. Hap2Gest: An Eyes-Free Interaction Concept with Smartphones Using Gestures and Haptic Feedback. Doctoral. York, United Kingdom. 2023. ⟨hal-04204271⟩