Conference paper, Year: 2016

OnionNet: Sharing Features in Cascaded Deep Classifiers

Abstract

The focus of our work is speeding up the evaluation of deep neural networks in retrieval scenarios, where conventional architectures may spend too much time on negative examples. We propose to replace a monolithic network with our novel cascade of feature-sharing deep classifiers, called OnionNet, in which subsequent stages may add both new layers and new feature channels to the previous ones. Importantly, intermediate feature maps are shared among classifiers, so they need not be recomputed. To accomplish this, the model is trained end-to-end in a principled way under a joint loss. We validate our approach in theory and on a synthetic benchmark. As demonstrated in three applications (patch matching, object detection, and image retrieval), our cascade can operate significantly faster than both monolithic networks and traditional cascades without feature sharing, at the cost of a marginal decrease in precision.
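For illustration only, the following is a minimal PyTorch sketch of a two-stage feature-sharing cascade in the spirit of the abstract; it is not the authors' code, and all names, channel sizes, loss weights, and the rejection threshold are assumptions. Stage 1 computes cheap shared feature maps and an early score; stage 2 reuses those maps, adds new channels and layers, and refines the decision; both heads are trained under a joint loss, and at test time examples rejected by stage 1 skip stage 2 entirely.

```python
# Hypothetical sketch of a feature-sharing two-stage cascade (not the OnionNet implementation).
import torch
import torch.nn as nn

class CascadeSketch(nn.Module):
    def __init__(self, reject_threshold=0.2):
        super().__init__()
        self.reject_threshold = reject_threshold  # assumed early-rejection threshold
        # Stage 1: a narrow trunk whose feature maps are shared with stage 2.
        self.trunk1 = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
        )
        self.head1 = nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(16, 1))
        # Stage 2: extra feature channels plus a deeper layer and a stronger classifier,
        # computed on top of (not instead of) the stage-1 maps.
        self.extra2 = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU())
        self.trunk2 = nn.Sequential(nn.Conv2d(32, 32, 3, padding=1), nn.ReLU())
        self.head2 = nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, 1))

    def forward(self, x):
        f1 = self.trunk1(x)   # shared feature maps, computed once
        s1 = self.head1(f1)   # cheap early score
        f2 = self.trunk2(torch.cat([f1, self.extra2(x)], dim=1))  # reuse f1, add channels
        s2 = self.head2(f2)   # refined score
        return s1, s2

    @torch.no_grad()
    def predict(self, x):
        # At test time, only examples passing the stage-1 test reach stage 2,
        # and stage 2 reuses the already-computed stage-1 feature maps.
        f1 = self.trunk1(x)
        p1 = torch.sigmoid(self.head1(f1))
        keep = p1.squeeze(1) > self.reject_threshold
        out = p1.clone()
        if keep.any():
            xk, f1k = x[keep], f1[keep]
            f2 = self.trunk2(torch.cat([f1k, self.extra2(xk)], dim=1))
            out[keep] = torch.sigmoid(self.head2(f2))
        return out

def joint_loss(s1, s2, y, w1=0.5, w2=1.0):
    # The same binary targets supervise both stages; weights w1, w2 are assumptions.
    bce = nn.functional.binary_cross_entropy_with_logits
    return w1 * bce(s1, y) + w2 * bce(s2, y)
```

The point of the sketch is that the stage-1 feature maps f1 are computed once and concatenated into stage 2, rather than being recomputed by an independent second network, which is where the cascade saves time on negative examples.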
Main file
1608.02728ON.pdf (342.67 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-01576911, version 1 (24-08-2017)

Identifiers

Cite

Martin Simonovsky, Nikos Komodakis. OnionNet: Sharing Features in Cascaded Deep Classifiers. 27th British Machine Vision Conference (BMVC), Sep 2016, York, United Kingdom. ⟨hal-01576911⟩
106 Views
262 Downloads
