Conference paper, Year: 2017

ATLAS: adaptive single object tracking using offline learned motion and visual patterns

Abstract

In this paper we introduce ATLAS, a novel generic single-object tracker based on two convolutional neural networks (CNNs) trained offline. The key principle consists in alternating between tracking based on motion information and predicting the object location over time based on visual similarity. The proposed tracker uses a regression-based approach to learn, offline, generic relationships between object appearances and their associated motion patterns. Then, by continuously updating the target appearance model, the system adaptively modifies the position, size, and shape of the object bounding box. Starting from the initial candidate location estimated from the motion patterns, the object's position is successively shifted within the context search area according to a patch similarity function that does not require any manually designed features. The final track location corresponds to the instance that yields the maximum similarity value. The experimental evaluation, performed on the challenging datasets used in the 2016 edition of the Visual Object Tracking (VOT) international contest (http://www.votchallenge.net/), demonstrates the performance of our technique when compared with state-of-the-art methods. Our tracker runs at more than 20 fps using generic motion and visual patterns.
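
The abstract describes an alternating scheme: a motion model proposes an initial candidate box, which is then refined by shifting it within a context search area and keeping the position with the highest appearance similarity. The following is a minimal, illustrative sketch of that loop only; `predict_motion` and `appearance_similarity` are hypothetical stand-ins for the two offline-trained CNNs, and all names, parameters, and defaults are assumptions made for illustration, not the authors' implementation.

```python
# Illustrative sketch of the alternating tracking loop outlined in the abstract.
# The two stub functions stand in for the offline-trained CNNs (assumed names).
import numpy as np

def predict_motion(prev_box, prev_frame, frame):
    """Placeholder for the offline-learned motion regressor (hypothetical).

    Returns an initial candidate box (x, y, w, h) for the current frame;
    here it simply reuses the previous box, i.e. a zero-motion prior.
    """
    return prev_box

def appearance_similarity(frame, box, template):
    """Placeholder for the CNN patch-similarity function (hypothetical).

    Compares the patch under `box` with the current appearance template
    using normalized correlation as a simple stand-in score.
    """
    x, y, w, h = box
    patch = frame[y:y + h, x:x + w].astype(np.float64)
    if patch.shape != template.shape:
        return -np.inf
    p = patch - patch.mean()
    t = template - template.mean()
    denom = np.linalg.norm(p) * np.linalg.norm(t) + 1e-8
    return float((p * t).sum() / denom)

def track_step(prev_box, prev_frame, frame, template, search_radius=8, step=2):
    """One tracking step: motion-based seed, then similarity-based refinement."""
    x0, y0, w, h = predict_motion(prev_box, prev_frame, frame)
    best_box, best_score = (x0, y0, w, h), -np.inf
    H, W = frame.shape[:2]
    # Shift the candidate box inside the context search area and keep
    # the position that yields the maximum similarity value.
    for dy in range(-search_radius, search_radius + 1, step):
        for dx in range(-search_radius, search_radius + 1, step):
            x, y = x0 + dx, y0 + dy
            if x < 0 or y < 0 or x + w > W or y + h > H:
                continue
            score = appearance_similarity(frame, (x, y, w, h), template)
            if score > best_score:
                best_box, best_score = (x, y, w, h), score
    return best_box

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    prev_frame = rng.random((120, 160))
    frame = prev_frame.copy()
    box = (40, 30, 24, 24)                      # (x, y, w, h)
    template = prev_frame[30:54, 40:64].copy()  # appearance model of the target
    print(track_step(box, prev_frame, frame, template))
```

The paper computes the similarity with a learned CNN and also adapts the box size and shape; this sketch fixes the box dimensions and uses correlation only to keep the loop structure, which is the part the abstract actually specifies, easy to follow.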
No file deposited

Dates and versions

hal-01691133, version 1 (23-01-2018)

Identifiers

  • HAL Id: hal-01691133, version 1

Cite

Ruxandra Tapu, Bogdan Mocanu, Titus Zaharia. ATLAS: adaptive single object tracking using offline learned motion and visual patterns. BMVC 2017: British Machine Vision Conference Workshops, Sep 2017, London, United Kingdom. pp.1-12. ⟨hal-01691133⟩
