" Look At This One " Detection sharing between modality-independent classifiers for robotic discovery of people - Archive ouverte HAL Access content directly
Conference Papers, Year: 2017

" Look At This One " Detection sharing between modality-independent classifiers for robotic discovery of people

Abstract

With the advent of low-cost RGBD sensors, many solutions have been proposed for extracting and fusing colour and depth information. In this paper, we propose several new approaches for fusing these multimodal sources for people detection. We are especially concerned with a scenario where a robot operates in a changing environment. (i) We extend the Faster RCNN framework proposed by Girshick et al. [1] to this use case; (ii) we significantly improve people detection performance on the InOutDoor RGBD People dataset [2] and the RGBD People dataset [3]; (iii) we show that these fusion approaches efficiently handle sensor defects such as the complete loss of a modality. (iv) Furthermore, we propose a new dataset for people detection in difficult conditions: ONERA.ROOM.
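The title's "detection sharing between modality-independent classifiers" suggests that detections produced by separate RGB and depth classifiers are combined at the bounding-box level. The Python sketch below illustrates one simple late-fusion reading of that idea, not the paper's actual method: candidate boxes from both modalities are pooled and merged with greedy non-maximum suppression, so that if one sensor fails and returns nothing, the other modality's detections still pass through unchanged. All names (Detection, iou, fuse_detections) and thresholds are illustrative assumptions.

    # Hypothetical late fusion of detections from two modality-independent
    # people detectors (RGB and depth). Not the authors' implementation.
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class Detection:
        x1: float
        y1: float
        x2: float
        y2: float
        score: float
        modality: str  # "rgb" or "depth"

    def iou(a: Detection, b: Detection) -> float:
        """Intersection-over-union of two axis-aligned boxes."""
        ix1, iy1 = max(a.x1, b.x1), max(a.y1, b.y1)
        ix2, iy2 = min(a.x2, b.x2), min(a.y2, b.y2)
        inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
        area_a = (a.x2 - a.x1) * (a.y2 - a.y1)
        area_b = (b.x2 - b.x1) * (b.y2 - b.y1)
        union = area_a + area_b - inter
        return inter / union if union > 0 else 0.0

    def fuse_detections(rgb: List[Detection], depth: List[Detection],
                        iou_thresh: float = 0.5) -> List[Detection]:
        """Pool detections from both modalities and apply greedy NMS.
        If either list is empty (e.g. a lost modality), the remaining
        modality's detections are kept as-is."""
        pooled = sorted(rgb + depth, key=lambda d: d.score, reverse=True)
        kept: List[Detection] = []
        for det in pooled:
            if all(iou(det, k) < iou_thresh for k in kept):
                kept.append(det)
        return kept

    if __name__ == "__main__":
        rgb_dets = [Detection(10, 20, 60, 120, 0.92, "rgb")]
        depth_dets = [Detection(12, 22, 58, 118, 0.80, "depth"),
                      Detection(200, 40, 240, 130, 0.70, "depth")]
        for d in fuse_detections(rgb_dets, depth_dets):
            print(d)

In this reading, robustness to sensor defects comes for free: fusing with an empty list simply reduces to the surviving modality's detections. The paper's actual fusion strategies may differ (e.g. feature-level fusion inside Faster RCNN), so treat this only as an illustration of detection-level sharing.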
Main file: 2017_ECMR_LookAtThisOne.pdf (689.99 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-01628762, version 1 (04-11-2017)

Identifiers

  • HAL Id: hal-01628762, version 1

Cite

Joris Guerry, Bertrand Le Saux, David Filliat. "Look At This One" Detection sharing between modality-independent classifiers for robotic discovery of people. ECMR 2017 - European Conference on Mobile Robotics, Sep 2017, Paris, France. pp.1-6. ⟨hal-01628762⟩