Conference Papers, Year: 2018

Online Human Activity Recognition for Ergonomics Assessment

Abstract

We address the problem of recognizing the current activity performed by a human worker, providing information useful for the automatic ergonomic evaluation of workstations in industrial applications. Traditional ergonomic assessment methods rely on pen-and-paper worksheets, such as the Ergonomic Assessment Worksheet (EAWS). To date, no tool exists to automatically estimate the ergonomic score from sensors (external cameras or wearable sensors). Since the ergonomic evaluation depends on the activity being performed, the first step towards a fully automatic ergonomic assessment is to automatically identify the different activities within an industrial task.

To address this problem, we propose a method based on wearable sensors and supervised learning with a Hidden Markov Model (HMM). The activity recognition module works in two steps: first, the parameters of the model are learned offline from observations from both sensors; then, in a second stage, the model can be used to recognize the activity offline and online.

We apply our method to recognize the current activity of a worker during a series of tasks typical of the manufacturing industry. We recorded 6 participants performing a sequence of tasks using wearable sensors. Two systems were used: the MVN Link suit from Xsens and the e-glove from Emphasis Telematics (see Fig. 1). The first consists of 17 wireless inertial sensors embedded in a lycra suit and is used to track the whole-body motion. The second is a glove that includes pressure sensors on the fingertips and finger flexion sensors. The motion capture data are combined with the glove data and fed to our activity recognition model. The tasks were designed to involve elements of the EAWS such as load handling, screwing, and manipulating objects while in different static postures. The data are labeled following EAWS categories such as "standing bent forward", "overhead work" or "kneeling". In terms of performance, the model recognizes the EAWS-related activities with 91% precision using a small subset of features: the vertical position of the center of mass, the velocity of the center of mass, and the angle of the L5S1 joint.
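The offline-learning / online-recognition scheme described in the abstract can be illustrated with a minimal sketch (this is not the authors' implementation): a supervised Gaussian HMM in which each hidden state corresponds to one EAWS activity label, the Gaussian emission parameters and the transition matrix are estimated offline from labeled frames, and online recognition is performed by recursive forward filtering over the incoming feature stream. The SupervisedGaussianHMM class, the exact feature layout, and all numerical details below are illustrative assumptions; only the choice of features (center-of-mass height and velocity, L5S1 angle) and the offline/online split come from the abstract.

import numpy as np
from scipy.stats import multivariate_normal


class SupervisedGaussianHMM:
    """One hidden state per activity label, Gaussian emissions per state."""

    def __init__(self, n_states):
        self.n_states = n_states

    def fit(self, X, y):
        """Offline step: estimate parameters from labelled frames.

        X: (T, D) array of per-frame features (e.g. CoM height,
           CoM velocity, L5S1 angle -- hypothetical layout).
        y: (T,) array of integer activity labels in [0, n_states).
        """
        K, D = self.n_states, X.shape[1]
        self.means = np.zeros((K, D))
        self.covs = np.zeros((K, D, D))
        self.trans = np.ones((K, K))                  # add-one smoothing
        self.prior = np.bincount(y, minlength=K) + 1.0
        self.prior /= self.prior.sum()
        for k in range(K):                            # per-activity Gaussians
            Xk = X[y == k]
            self.means[k] = Xk.mean(axis=0)
            self.covs[k] = np.cov(Xk.T) + 1e-6 * np.eye(D)  # regularised
        for t in range(len(y) - 1):                   # transition counts
            self.trans[y[t], y[t + 1]] += 1.0
        self.trans /= self.trans.sum(axis=1, keepdims=True)
        self.belief = self.prior.copy()               # filtering state
        return self

    def update(self, x):
        """Online step: fold one new feature frame into the belief
        (forward filtering) and return the most likely activity."""
        lik = np.array([multivariate_normal.pdf(x, self.means[k], self.covs[k])
                        for k in range(self.n_states)])
        self.belief = lik * (self.trans.T @ self.belief)
        self.belief /= self.belief.sum()
        return int(np.argmax(self.belief))

In use, fit would be called once on the labelled recordings (e.g. the sequences from the 6 participants), and update would be called once per incoming sensor frame during online recognition; offline recognition over a full recording could instead use Viterbi decoding, which this sketch omits.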

Dates and versions

hal-01808832, version 1 (08-10-2018)

Identifiers

  • HAL Id: hal-01808832, version 1

Cite

Adrien Malaisé, Pauline Maurice, Francis Colas, Serena Ivaldi. Online Human Activity Recognition for Ergonomics Assessment. SIAS 2018 - 9th International Conference on the Safety of Industrial Automated Systems, Oct 2018, Nancy, France. ⟨hal-01808832⟩