An Unsupervised Framework for Online Spatiotemporal Detection of Activities of Daily Living by Hierarchical Activity Models
Abstract
Automatic detection and analysis of human activities captured by various sensors (e.g., sequences of images captured by an RGB camera) play an essential role in many research fields, as they help to understand the semantic content of a captured scene. Earlier studies have focused largely on the supervised classification problem, where a label is assigned to a given short clip. However, in real-world scenarios such as Activities of Daily Living (ADL), the challenge is to automatically browse long-term (days and weeks) video streams to identify segments whose semantics correspond to the modeled activities, along with their temporal boundaries. This paper proposes an unsupervised solution to this problem by generating hierarchical models that combine global trajectory information with local dynamics of the human body. The global information helps in modeling the spatiotemporal evolution of long-term activities and hence their spatial and temporal localization. Moreover, the local dynamic information incorporates complex local motion patterns of daily activities into the models. Our proposed method is evaluated on realistic datasets captured in observation rooms in hospitals and nursing homes. The experimental data from a variety of monitoring scenarios in hospital settings reveal how this framework can be exploited to provide timely diagnosis and medical interventions for cognitive disorders such as Alzheimer's disease. The obtained results show that our framework is a promising approach capable of generating activity models without any supervision.