Tracking HOG Descriptors for Gesture Recognition
Abstract
We introduce a new HoG (Histogram of Oriented Gradients) tracker for gesture recognition. Our main contribution is to build HoG trajectory descriptors (representing local motion), which are then used for gesture recognition. First, we select, for each individual in the scene, a set of corner points to locate textured regions in which to compute 2D HoG descriptors. Second, we track these 2D HoG descriptors over time to build temporal HoG descriptors; lost descriptors are replaced by newly detected ones. Finally, we extract the local motion descriptors to learn a set of given gestures offline. A new video can then be classified according to the gesture it contains. Results show that the tracker performs well compared to the KLT tracker [1]. The generated local motion descriptors are validated through gesture learning and classification on the KTH action database [2].
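
The corner-seeding and descriptor-tracking steps can be illustrated with a short sketch. This is a minimal, hypothetical reconstruction in Python/OpenCV, not the authors' implementation: corner points seed fixed-size windows in which 2D HoG descriptors are computed, and each descriptor is re-located in the next frame by an exhaustive search of a small neighbourhood. The patch size, search radius, HoG geometry, loss threshold, and input file name are all illustrative assumptions.

```python
import cv2
import numpy as np

# Hypothetical parameters -- the paper does not specify patch size,
# search radius, or HoG geometry; these values are illustrative only.
PATCH = 32    # side of the square window around each corner point
SEARCH = 8    # search radius (pixels) when re-locating a descriptor
hog = cv2.HOGDescriptor((PATCH, PATCH), (16, 16), (8, 8), (8, 8), 9)

def hog_at(gray, x, y):
    """2D HoG descriptor of the PATCH x PATCH window centred on (x, y)."""
    h = PATCH // 2
    patch = gray[y - h:y + h, x - h:x + h]
    if patch.shape != (PATCH, PATCH):
        return None                      # window falls outside the image
    return hog.compute(patch).ravel()

def track_descriptor(gray, x, y, ref):
    """Re-locate a descriptor in the next frame by exhaustive search of a
    small neighbourhood, keeping the position whose HoG descriptor is
    closest to the reference (a stand-in for the paper's tracking step)."""
    best, best_pos = None, None
    for dy in range(-SEARCH, SEARCH + 1):
        for dx in range(-SEARCH, SEARCH + 1):
            d = hog_at(gray, x + dx, y + dy)
            if d is None:
                continue
            dist = np.linalg.norm(d - ref)
            if best is None or dist < best:
                best, best_pos = dist, (x + dx, y + dy)
    return best_pos, best

# Seed corner points on the first frame, then extend each point's
# temporal HoG descriptor frame by frame, dropping lost tracks.
cap = cv2.VideoCapture("gesture.avi")    # hypothetical input video
ok, frame = cap.read()
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
corners = cv2.goodFeaturesToTrack(gray, maxCorners=50,
                                  qualityLevel=0.01, minDistance=10)
tracks = []
for cx, cy in corners.reshape(-1, 2).astype(int):
    d = hog_at(gray, cx, cy)
    if d is not None:
        tracks.append({"pos": (cx, cy), "desc": [d]})

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for t in tracks:
        pos, dist = track_descriptor(gray, *t["pos"], t["desc"][-1])
        if pos is not None and dist < 1.0:   # hypothetical loss threshold
            t["pos"] = pos
            t["desc"].append(hog_at(gray, *pos))
        # a lost track would here be replaced by a newly detected corner
```

Each surviving track accumulates a list of per-frame HoG vectors, i.e. the temporal HoG descriptor that the paper feeds to offline gesture learning; the classification stage itself is not sketched here.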