Conference paper, Year: 2013

Pose-configurable generic tracking of elongated objects

Abstract

Elongated objects have various shapes and can shift, rotate, change scale, and be rigid or deform by flexing, articulating, and vibrating, with examples as varied as a glass bottle, a robotic arm, a surgical suture, a finger pair, a tram, and a guitar string. This variety generally makes tracking the pose of elongated objects very challenging. We describe a unified, configurable framework for tracking the pose of elongated objects that move in the image plane and extend over the image region. Our method strives for simplicity, versatility, and efficiency. The object is decomposed into a chained assembly of segments of multiple parts that are arranged under a hierarchy of tailored spatio-temporal constraints. In this hierarchy, segments can rescale independently while their elasticity is controlled with global orientations and local distances. While the trend in tracking is to design complex, structure-free algorithms that update object appearance on-line, we show that our tracker, with its novel but remarkably simple, structured organization of parts with constant appearance, matches or improves on state-of-the-art performance. Most importantly, our model can be easily configured to track the exact pose of arbitrary elongated objects in the image plane. The tracker runs at up to 100 fps on a desktop PC, and its computation time scales linearly with the number of object parts. To our knowledge, this is the first approach to generic tracking of elongated objects.
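The abstract describes the model only at a high level. As a rough illustration of the kind of chain structure it mentions, the sketch below scores candidate segment placements with a fixed appearance term plus pairwise constraints on global orientation and local inter-segment distance, and selects the best chain with dynamic programming so the runtime stays linear in the number of segments. All names, weights, and the inference step are illustrative assumptions, not the paper's actual implementation.

```python
# Minimal sketch (not the authors' code) of tracking a chained assembly of
# segments. Each segment has candidate placements; pairwise terms penalize
# orientation drift from a global reference and stretching/compression of the
# local link (elasticity). A Viterbi-style pass over the chain keeps the cost
# linear in the number of segments. Weights and helpers are assumptions.

import math
from dataclasses import dataclass


@dataclass
class Candidate:
    x: float           # segment center, image x
    y: float           # segment center, image y
    angle: float       # segment orientation (radians)
    scale: float       # per-segment scale (segments may rescale independently)
    appearance: float  # score from a constant appearance template (higher = better)


def pairwise_cost(a: Candidate, b: Candidate,
                  ref_angle: float, nominal_dist: float,
                  w_orient: float = 1.0, w_dist: float = 1.0) -> float:
    """Constraint between consecutive segments: penalize deviation from a
    global reference orientation and from a nominal local link length."""
    d_orient = abs(math.atan2(math.sin(b.angle - ref_angle),
                              math.cos(b.angle - ref_angle)))
    link_len = math.hypot(b.x - a.x, b.y - a.y)
    d_dist = abs(link_len - nominal_dist * 0.5 * (a.scale + b.scale))
    return w_orient * d_orient + w_dist * d_dist


def track_chain(candidates_per_segment, ref_angle, nominal_dist):
    """Dynamic programming over the chain: returns the lowest-cost sequence of
    candidates, one per segment. Runtime is O(num_segments * K^2) for K
    candidates per segment, i.e. linear in the number of segments."""
    n = len(candidates_per_segment)
    costs = [[-c.appearance for c in candidates_per_segment[0]]]
    back = []
    for t in range(1, n):
        prev, cur = candidates_per_segment[t - 1], candidates_per_segment[t]
        row, ptr = [], []
        for cj in cur:
            best_i, best = min(
                ((i, costs[-1][i] + pairwise_cost(ci, cj, ref_angle, nominal_dist))
                 for i, ci in enumerate(prev)),
                key=lambda p: p[1])
            row.append(best - cj.appearance)
            ptr.append(best_i)
        costs.append(row)
        back.append(ptr)
    j = min(range(len(costs[-1])), key=lambda k: costs[-1][k])
    path = [j]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    path.reverse()
    return [candidates_per_segment[t][k] for t, k in enumerate(path)]
```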

Dates and versions

hal-01060141, version 1 (03-09-2014)

Identifiers

Cite

Daniel Wesierski, Patrick Horain. Pose-configurable generic tracking of elongated objects. ICCV 2013: IEEE International Conference on Computer Vision, Dec 2013, Sydney, Australia. pp. 2920-2927, ⟨10.1109/ICCV.2013.363⟩. ⟨hal-01060141⟩