A Framework for Recognizing Industrial Actions via Joint Angles
Abstract
This paper proposes a novel framework for recognizing industrial actions in the context of human-robot collaboration. Given a one-second-long measurement of the human's motion, the framework determines his/her action. The originality lies in the use of joint angles instead of Cartesian coordinates. This design choice makes the framework sensor-agnostic and invariant to affine transformations and to anthropometric differences. On the AnDy dataset, we outperform the state-of-the-art classifier. Furthermore, we show that our framework is effective with limited training data, that it is subject-independent, and that it is compatible with robotic real-time constraints. In terms of methodology, the framework is an original synergy of two antithetical schools of thought: model-based and data-driven algorithms. Indeed, it is the cascade of an inverse kinematics estimator compliant with the International Society of Biomechanics recommendations, followed by a deep learning architecture based on Bidirectional Long Short-Term Memory. We believe our work may pave the way to successful and fast action recognition with standard depth cameras embedded on moving collaborative robots.
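The data-driven stage of the cascade could look like the following minimal PyTorch sketch: a BiLSTM classifier operating on the joint-angle sequences produced by the inverse kinematics stage. All names and dimensions here (`JointAngleBiLSTM`, 66 angles, 50 frames per one-second window, 7 action classes) are illustrative assumptions, not values taken from the paper.

```python
# Minimal sketch of the BiLSTM classification stage, assuming joint angles
# have already been computed by the inverse kinematics estimator.
# Dimensions (66 angles, 50-frame windows, 7 classes) are placeholders.
import torch
import torch.nn as nn

class JointAngleBiLSTM(nn.Module):
    """BiLSTM classifier mapping a window of joint angles to an action label."""

    def __init__(self, num_angles=66, hidden_size=128, num_classes=7):
        super().__init__()
        self.bilstm = nn.LSTM(
            input_size=num_angles,
            hidden_size=hidden_size,
            batch_first=True,
            bidirectional=True,
        )
        # Bidirectional LSTM doubles the feature dimension
        self.head = nn.Linear(2 * hidden_size, num_classes)

    def forward(self, angles):
        # angles: (batch, frames, num_angles), e.g. one second at 50 Hz
        features, _ = self.bilstm(angles)
        # Mean-pool over the time axis, then classify
        return self.head(features.mean(dim=1))

# Example: classify a batch of four 1-second windows (50 frames, 66 angles)
model = JointAngleBiLSTM()
window = torch.randn(4, 50, 66)
logits = model(window)          # (4, 7) action scores
action = logits.argmax(dim=1)   # predicted action index per window
```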