Can Synthetic Data Handle Unconstrained Gaze Estimation?
Abstract
In this article, we aim to solve the unconstrained gaze estimation problem using an appearance-based approach. Unlike previous methods that operate in relatively constrained environments, we propose an approach that allows free head motion and significant user-sensor distances using an RGB-D sensor. Our paper presents the following contributions: (i) a direct estimation scheme that infers gaze information from RGB eye and depth face appearances; (ii) a channel selection strategy applied during learning to evaluate the contribution of each channel to the final prediction; (iii) the adaptation of a 3D morphable face model, augmented with a parametric gaze model, to render a large synthetic RGB-D training set. We also collect real labeled samples with a Kinect sensor, which allows us to evaluate how well synthetic learning handles real configurations and to establish an objective comparison with learning on real data. Results on several users demonstrate the strong potential of our approach.
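To make the channel selection idea concrete, the sketch below shows one plausible way to fuse an RGB eye branch and a depth face branch with learnable per-channel weights. This is a hypothetical illustration, not the authors' implementation: the network sizes, input resolutions, and the softmax gating formulation are all assumptions introduced here.

```python
# Hypothetical sketch of channel-weighted RGB-D gaze regression.
# Layer sizes, input resolutions, and the gating formulation are illustrative assumptions,
# not the architecture published in the paper.
import torch
import torch.nn as nn


class Branch(nn.Module):
    """Small convolutional encoder for one input channel (RGB eye or depth face)."""

    def __init__(self, in_channels: int, feat_dim: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_channels, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, feat_dim), nn.ReLU(),
        )

    def forward(self, x):
        return self.net(x)


class GazeNet(nn.Module):
    """RGB eye and depth face branches fused with learnable channel weights."""

    def __init__(self, feat_dim: int = 64):
        super().__init__()
        self.rgb_eye = Branch(3, feat_dim)
        self.depth_face = Branch(1, feat_dim)
        # One logit per channel; a softmax turns them into channel involvement weights.
        self.channel_logits = nn.Parameter(torch.zeros(2))
        self.head = nn.Linear(feat_dim, 2)  # gaze yaw and pitch

    def forward(self, rgb_eye, depth_face):
        feats = torch.stack(
            [self.rgb_eye(rgb_eye), self.depth_face(depth_face)], dim=1
        )                                                   # (batch, 2, feat_dim)
        w = torch.softmax(self.channel_logits, dim=0)       # (2,) channel weights
        fused = (w.view(1, 2, 1) * feats).sum(dim=1)        # weighted sum over channels
        return self.head(fused), w                          # prediction + weights for inspection


# Example forward pass on dummy data (batch of 4); patch sizes are placeholders.
model = GazeNet()
gaze, weights = model(torch.rand(4, 3, 36, 60), torch.rand(4, 1, 64, 64))
print(gaze.shape, weights)  # torch.Size([4, 2]) and the two learned channel weights
```

Inspecting the learned weights after training on synthetic and real data would give one way to quantify, as the abstract describes, how much each channel contributes to the final prediction.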