Eyes-Free Fingertip Guidance Based on Tactile Cues, an Extension of the Steering Law
Abstract
The use of a modern human-machine interface involves a large number of possible interactions. To let users navigate through the many available operations, interface designers often use drop-down menus, which offer many options within a constrained area. This kind of menu performs well for quickly selecting among a large number of choices. However, it demands a high level of visual attention, which users cannot always provide. Here, we investigate whether users can navigate paths made of orthogonal tunnels, simulating drop-down menus, while relying only on tactile cues on a haptic touchscreen. We found that subjects were able to follow the path with a success rate of ∼90% for one tunnel, decreasing linearly to ∼40% for five tunnels. Four types of friction-modulated haptic feedback were tested and showed no major differences in success rate; nevertheless, participants were slightly faster with slipping path feedback. The user trajectories exhibited robust regularities that were well described by the steering law model. Hence, we propose a novel definition of path difficulty for non-visual conditions based on path width, length, and number of orthogonal tunnels. These findings pave the way toward eyes-free guidance on surface haptic interfaces.
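For context, the steering law model referred to above is the classical Accot-Zhai formulation, which predicts the movement time T needed to steer through a tunnel C of local width W(s); the constants a and b below are empirically fitted. The extended index of difficulty proposed in the paper additionally depends on the number of orthogonal tunnels, but its exact form is not reproduced in this abstract.

% Classical steering law (Accot & Zhai): movement time T along tunnel C
T = a + b \cdot \mathit{ID}, \qquad \mathit{ID} = \int_{C} \frac{\mathrm{d}s}{W(s)}

% For a straight tunnel of length L and constant width W, this reduces to
\mathit{ID} = \frac{L}{W}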