Conference Papers, Year: 2020

Unsupervised Learning of Spatio-Temporal Receptive Fields from an Event-Based Vision Sensor

Thomas Barbier
Céline Teulière
Jochen Triesch

Abstract

Neuromorphic vision sensors exhibit several advantages compared to conventional frame-based cameras, including low latencies, high dynamic range, and low data rates. However, how efficient visual representations can be learned from the output of such sensors in an unsupervised fashion is still an open problem. Here we present a spiking neural network that learns spatio-temporal receptive fields in an unsupervised way from the output of a neuromorphic event-based vision sensor. Learning relies on the combination of spike timing-dependent plasticity with different synaptic delays, homeostatic regulation of synaptic weights and firing thresholds, and fast inhibition among neurons to decorrelate their responses. Our network develops biologically plausible spatio-temporal receptive fields when trained on real-world input and is suited for implementation on neuromorphic hardware.
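
The abstract only summarizes the learning mechanism at a high level. As a purely illustrative aid, the sketch below shows one way such a scheme could look for a single neuron: a leaky integrate-and-fire unit driven by sensor events through synapses with several candidate delays, a simplified pair-based exponential STDP rule, and homeostatic adaptation of the firing threshold. This is not the authors' implementation; all parameter values, names (`N_PIXELS`, `DELAYS_MS`, `step`, ...) and the exact rule forms are assumptions, and the fast inhibition between neurons mentioned in the abstract is omitted.

```python
# Illustrative single-neuron sketch (not the paper's code): leaky integrate-and-fire
# neuron, synapses with multiple delays, simplified pair-based STDP, and a
# homeostatic firing threshold. All constants are placeholder values.
import numpy as np

rng = np.random.default_rng(0)

N_PIXELS = 64                  # flattened input patch from the event sensor (assumed size)
DELAYS_MS = [0.0, 5.0, 10.0]   # candidate synaptic delays in ms (assumed values)
TAU_M = 20.0                   # membrane time constant (ms)
TAU_TRACE = 20.0               # STDP trace time constant (ms)
A_PLUS, A_MINUS = 0.01, 0.012  # potentiation / depression amplitudes
ETA_THRESH = 0.05              # homeostatic threshold learning rate
TARGET_RATE = 0.02             # target spikes per ms (placeholder)

# one weight per (pixel, delay) pair
w = rng.uniform(0.0, 0.3, size=(N_PIXELS, len(DELAYS_MS)))
threshold = 1.0
v = 0.0                              # membrane potential
pre_trace = np.zeros_like(w)         # presynaptic eligibility traces
post_trace = 0.0                     # postsynaptic trace
pending = [[] for _ in DELAYS_MS]    # events waiting out their synaptic delay


def step(t_ms, active_pixels, dt=1.0):
    """Advance the neuron by dt ms; `active_pixels` lists pixels that emitted an event."""
    global v, threshold, post_trace

    # queue each incoming event once per delay line
    for d_idx, d in enumerate(DELAYS_MS):
        pending[d_idx].extend((t_ms + d, px) for px in active_pixels)

    # collect events whose delay has elapsed
    arrived = np.zeros_like(w)
    for d_idx, buf in enumerate(pending):
        pending[d_idx] = [(t, px) for (t, px) in buf if t > t_ms]
        for (t, px) in buf:
            if t <= t_ms:
                arrived[px, d_idx] = 1.0

    # leaky integration of delayed, weighted input
    v += dt * (-v / TAU_M) + float(np.sum(w * arrived))

    # decay traces, then register presynaptic arrivals
    pre_trace += -dt * pre_trace / TAU_TRACE + arrived
    post_trace += -dt * post_trace / TAU_TRACE

    # pair-based STDP, depression branch: pre arrival shortly after a post spike
    w -= A_MINUS * post_trace * arrived

    fired = v >= threshold
    if fired:
        # potentiation branch: post spike shortly after pre arrivals
        w += A_PLUS * pre_trace
        post_trace += 1.0
        v = 0.0
    np.clip(w, 0.0, 1.0, out=w)

    # homeostasis: nudge the threshold toward the target firing rate
    threshold += ETA_THRESH * ((1.0 if fired else 0.0) - TARGET_RATE * dt)
    return fired
```

In the full model described in the abstract, several such neurons would share the same input and rapidly inhibit each other after firing so that their receptive fields decorrelate; only the single-neuron learning loop is sketched here.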

Dates and versions

hal-03049596, version 1 (11-01-2021)

Identifiers

HAL Id: hal-03049596
DOI: 10.1007/978-3-030-61616-8_50

Cite

Thomas Barbier, Céline Teulière, Jochen Triesch. Unsupervised Learning of Spatio-Temporal Receptive Fields from an Event-Based Vision Sensor. 29th International Conference on Artificial Neural Networks, 2020, Bratislava, Slovakia. pp. 622-633, ⟨10.1007/978-3-030-61616-8_50⟩. ⟨hal-03049596⟩