Journal article in Computers in Human Behavior, 2015

A Multi-Componential Analysis of Emotions during Complex Learning with an Intelligent Multi-agent System

Abstract

This paper presents an evaluation of the synchronization of three emotion measurement methods (automatic facial expression recognition, self-report, and electrodermal activity) and of their agreement regarding learners' emotions. Data were collected from 67 undergraduates enrolled at a North American university who learned about a complex science topic while interacting with MetaTutor, a multi-agent computerized learning environment. Videos of learners' facial expressions captured with a webcam were analyzed using automatic facial expression recognition software (FaceReader 5.0). Learners' physiological arousal was recorded using Affectiva's Q-Sensor 2.0 electrodermal activity measurement bracelet. Learners self-reported their experience of 19 different emotional states on five occasions during the learning session, and these occasions were used as markers to synchronize the FaceReader and Q-Sensor data. We found high agreement between the facial expression and self-report data (75.6%), but low agreement between each of these and the Q-Sensor data, suggesting that a tightly coupled relationship does not always exist between emotional response components.
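The agreement analysis summarized above pairs, at each self-report occasion, the emotion label inferred from facial expressions with the learner's self-reported emotion and measures how often the two channels match. The sketch below illustrates one way such a percent-agreement figure could be computed; the data layout, label names, and the percent_agreement helper are illustrative assumptions, not the authors' actual analysis pipeline.

```python
# Minimal sketch of a percent-agreement computation between two emotion
# measurement channels aligned at self-report markers.
# Assumption (not from the paper): emotion labels are plain strings and each
# learner contributes one (facial, self-report) pair per marker.

from typing import List, Tuple


def percent_agreement(pairs: List[Tuple[str, str]]) -> float:
    """Proportion of marker-aligned pairs in which both channels agree."""
    if not pairs:
        return 0.0
    matches = sum(1 for facial, reported in pairs if facial == reported)
    return matches / len(pairs)


# Hypothetical example: labels from facial expression software vs. self-report
# at five self-report occasions for one learner.
example_pairs = [
    ("neutral", "neutral"),
    ("happy", "happy"),
    ("neutral", "bored"),
    ("surprised", "surprised"),
    ("neutral", "neutral"),
]

print(f"Agreement: {percent_agreement(example_pairs):.1%}")  # Agreement: 80.0%
```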
Main file
Harley_et_al_2015.pdf (655.38 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-01340608, version 1 (31-05-2024)

Identifiers

Cite

Jason M. Harley, François Bouchet, M. Sazzad Hussain, Roger Azevedo, Rafael A. Calvo. A Multi-Componential Analysis of Emotions during Complex Learning with an Intelligent Multi-agent System. Computers in Human Behavior, 2015, 48, pp. 615-625. ⟨10.1016/j.chb.2015.02.013⟩. ⟨hal-01340608⟩
276 views
82 downloads
