Conference Poster, Year: 2022

Neuromorphic Event-Based Spatio-temporal Attention using Adaptive Mechanisms

Abstract

Contrary to RGB cameras, Dynamic Vision Sensors (DVS) output visual data in the form of an asynchronous event stream by recording pixel-wise luminance changes at microsecond resolution. While conventional computer vision approaches rely on frame-based input data and thus fail to take full advantage of this high temporal resolution, novel approaches use Spiking Neural Networks (SNNs), which are better suited to handling event-based data since these bio-inspired neural models intrinsically encode information in a sparse manner using spike trains. This paper presents an attentional mechanism which detects regions with higher event density by using inherent SNN dynamics combined with online weight and threshold adaptation. We implement the network directly on Intel's Loihi research neuromorphic chip and evaluate our proposed method on the open DVS128 Gesture Dataset. Our system is able to process 1 ms of event data in 6 ms and to reject more than 50% of incoming unwanted events starting only 20 ms after activity onset.
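As an illustration of the mechanism summarised above, the sketch below (plain NumPy, not the authors' on-chip Loihi implementation) shows one way such an attention signal can emerge: a grid of leaky integrate-and-fire neurons, one per spatial region, with thresholds that adapt online, so that regions receiving dense event input cross threshold first and emit spikes marking them as salient. All names and parameter values (AdaptiveLIFGrid, tau, theta0, beta, decay) are illustrative assumptions, not taken from the poster.

# Illustrative sketch only: a per-region leaky integrate-and-fire (LIF)
# population with online threshold adaptation. Not the authors' Loihi code.
import numpy as np

class AdaptiveLIFGrid:
    """One LIF neuron per spatial region of the DVS array (hypothetical layout)."""

    def __init__(self, grid_shape, tau=0.9, theta0=1.0, beta=0.5, decay=0.1):
        self.v = np.zeros(grid_shape)             # membrane potentials
        self.theta = np.full(grid_shape, theta0)  # per-region firing thresholds
        self.tau, self.theta0, self.beta, self.decay = tau, theta0, beta, decay

    def step(self, event_counts):
        """Integrate one time bin (e.g. 1 ms) of per-region event counts."""
        self.v = self.tau * self.v + event_counts      # leaky integration of events
        spikes = self.v >= self.theta                  # dense regions cross threshold first
        self.v[spikes] = 0.0                           # reset neurons that fired
        self.theta[spikes] += self.beta                # adaptive threshold: harder to refire...
        self.theta += self.decay * (self.theta0 - self.theta)  # ...then relax back to rest
        return spikes                                  # boolean attention mask over regions

# Usage: a sustained burst of events in one corner of a 4x4 region grid.
attention = AdaptiveLIFGrid((4, 4))
for _ in range(5):
    counts = np.zeros((4, 4))
    counts[0, 0] = 3.0                                 # high event density in region (0, 0)
    mask = attention.step(counts)
print(mask.astype(int))                                # region (0, 0) is flagged as salient

In this toy setting the spiking output acts as the attention mask: events falling in regions whose neuron is silent would be discarded, which is the behaviour the abstract quantifies (rejecting more than 50% of unwanted events shortly after activity onset).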
Main file: AICAS2022_Poster_241_Amelie Gruel.pdf (1.46 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-03760704, version 1 (25-08-2022)

Identifiers

  • HAL Id: hal-03760704, version 1

Cite

Amélie Gruel, Antonio Vitale, Jean Martinet, Michele Magno. Neuromorphic Event-Based Spatio-temporal Attention using Adaptive Mechanisms. International Conference on Artificial Intelligence Circuits and Systems (AICAS), Jun 2022, Incheon, South Korea. ⟨hal-03760704⟩