Assisted Music Score Reading Using Fixed-Gaze Head Movement: Empirical Experiment and Design Implications
Abstract
Eye-tracking has strong potential as an input modality in human-computer interaction (HCI), particularly in mobile situations, but it lacks convenient methods for triggering actions. In our research, we investigate the combination of eye-tracking and fixed-gaze head movement, which allows users to trigger various commands without using their hands or changing gaze direction. We propose a new algorithm for fixed-gaze head movement detection that uses only the images captured by the scene camera mounted at the front of the head-mounted eye-tracker, in order to reduce computation time. To evaluate the performance of this detection algorithm and the acceptance of triggering commands by head movements when the user's hands are occupied by another task, we designed and developed an experimental application called EyeMusic. EyeMusic is a music score reading system that plays the notes of any measure in a score that the user does not understand. By making a voluntary head movement while keeping his or her gaze fixed on the same point of the score, the user obtains the desired audio feedback. This paper presents the design, development, and usability testing of the first prototype of this application.
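To illustrate the principle behind such detection, the sketch below shows one plausible way to recognize fixed-gaze head movements from scene-camera frames alone. It is not the paper's actual algorithm: the OpenCV-based feature tracking, the `detect_head_movement` helper, and both thresholds are illustrative assumptions. The underlying idea is that when the gaze stays on the same scene point while the head moves, the entire scene image shifts, so the dominant inter-frame image motion approximates the head movement.

```python
import cv2
import numpy as np

# Illustrative thresholds (assumed, not from the paper).
GAZE_TOLERANCE = 20.0   # px: max gaze drift still treated as a fixed gaze
MOTION_THRESHOLD = 5.0  # px/frame: min global shift treated as a head move

def detect_head_movement(prev_frame, curr_frame, prev_gaze, curr_gaze):
    """Return 'left'/'right'/'up'/'down' if the head moved while the gaze
    stayed fixed, else None. Gaze points are (x, y) in image coordinates."""
    # The gaze must remain on (roughly) the same scene point.
    if np.hypot(curr_gaze[0] - prev_gaze[0],
                curr_gaze[1] - prev_gaze[1]) > GAZE_TOLERANCE:
        return None

    prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
    curr_gray = cv2.cvtColor(curr_frame, cv2.COLOR_BGR2GRAY)

    # Track corner features between frames to estimate global image motion.
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                  qualityLevel=0.01, minDistance=10)
    if pts is None:
        return None
    new_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray, pts, None)
    good = status.ravel() == 1
    if good.sum() < 10:
        return None

    # Median flow vector approximates camera (head) motion and is robust
    # to small moving objects in the scene.
    flow = (new_pts[good] - pts[good]).reshape(-1, 2)
    dx, dy = np.median(flow, axis=0)
    if max(abs(dx), abs(dy)) < MOTION_THRESHOLD:
        return None

    # The image shifts opposite to the head: a rightward head turn moves
    # the scene leftward in the frame, hence the sign inversion.
    if abs(dx) >= abs(dy):
        return 'right' if dx < 0 else 'left'
    return 'down' if dy < 0 else 'up'
```

Working only on the scene images, as described above, avoids any additional head-mounted sensors; in this sketch, robustness and cost both hinge on how the global motion is estimated, and sparse feature tracking with a median vote is one inexpensive choice among several.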
The experimental results confirm the usability of our application: 85% of participants were able to use all the head movements implemented in the prototype. The application's average success rate is 70%, a figure partly limited by the performance of the eye-tracker we used. Our fixed-gaze head movement detection algorithm achieves a success rate of 85%, with no significant performance differences among the individual head movements.