ETIS, Neurocybernetics team
Journal article in Frontiers in Neurorobotics, 2022

Continual Sequence Modeling With Predictive Coding

Abstract

Recurrent neural networks (RNNs) have proven very successful at modeling sequential data such as language or motion. However, these successes rely on the backpropagation through time (BPTT) algorithm, batch training, and the assumption that all training data are available at the same time. In contrast, the field of developmental robotics aims at uncovering lifelong learning mechanisms that could allow embodied machines to learn and stabilize knowledge in continuously evolving environments. In this article, we investigate different RNN designs and learning methods, which we evaluate in a continual learning setting. The generative modeling task consists of learning to generate 20 continuous trajectories that are presented sequentially to the learning algorithms. Each method is evaluated according to the average prediction error over the 20 trajectories obtained after complete training. This study focuses on learning algorithms with low memory requirements that do not need to store past information to update their parameters. Our experiments identify two approaches especially well-suited to this task: conceptors and predictive coding. We propose combining these two mechanisms into a new model, which we call PC-Conceptors, that outperforms the other methods presented in this study.
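As a point of reference for the conceptor mechanism mentioned in the abstract, the sketch below shows the standard conceptor computation for a reservoir network (Jaeger's formulation, C = R (R + alpha^-2 I)^-1 from the state correlation matrix R), not the exact PC-Conceptors model proposed in the paper; the function name, aperture value, and state dimensions are illustrative assumptions.

```python
import numpy as np

def compute_conceptor(states, aperture=10.0):
    """Compute a conceptor matrix from reservoir states (standard formulation).

    states:   (T, N) array of N-dimensional recurrent states collected while
              the network is driven by one trajectory.
    aperture: scalar controlling how tightly the conceptor constrains the
              state space (the alpha parameter; 10.0 is an arbitrary choice).
    """
    T, N = states.shape
    R = states.T @ states / T                              # state correlation matrix
    C = R @ np.linalg.inv(R + aperture ** -2 * np.eye(N))  # C = R (R + alpha^-2 I)^-1
    return C

# Usage sketch: gate the reservoir state with the conceptor of trajectory k,
# x <- C_k @ x, so that regenerating trajectory k interferes less with the
# patterns stored for the other trajectories.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    states_k = rng.standard_normal((200, 50))   # placeholder reservoir states
    C_k = compute_conceptor(states_k)
    x = rng.standard_normal(50)
    x_gated = C_k @ x
    print(C_k.shape, x_gated.shape)
```

In a continual setting such as the one described above, one conceptor per trajectory can be stored and used to confine the recurrent dynamics to the subspace associated with that trajectory.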
Main file
front-neur-robotics-2022.pdf (1.61 MB)
Origin: publisher files authorized on an open archive

Dates and versions

hal-03689147, version 1 (07-06-2022)

Identifiers

HAL Id: hal-03689147
DOI: 10.3389/fnbot.2022.845955

Cite

Louis Annabi, Alexandre Pitti, Mathias Quoy. Continual Sequence Modeling With Predictive Coding. Frontiers in Neurorobotics, 2022, 16, ⟨10.3389/fnbot.2022.845955⟩. ⟨hal-03689147⟩