Perception Assistance for the Visually Impaired Through Smart Objects: Concept, Implementation, and Experiment Scenario
Abstract
The last few years have seen substantial progress in the field of smart objects (SOs): their number, diversity, performance and pervasiveness have all been increasing rapidly, and this evolution is expected to continue. To the best of our knowledge, little work has been done to leverage this abundance of resources to develop assistive devices for Visually Impaired People (VIP). However, we believe that SOs can both enhance traditional assistive functions (e.g., obstacle detection and navigation) and offer new ways of interacting with the environment. After describing spatial and non-spatial perceptive functions enabled by SOs, this article presents the SO2SEES, a system designed to be an interface between its user and neighboring SOs. The SO2SEES allows VIP to query surrounding SOs in an intuitive manner, relying on knowledge bases distributed on Internet of Things (IoT) cloud platforms and on the SO2SEES's own back-end. To evaluate and validate the proposed concepts, we have developed a simple working implementation of the SO2SEES system using semantic web standards. A controlled-environment test scenario has been built around this early SO2SEES system to demonstrate its feasibility. As future work, we plan to conduct field experiments of this first prototype with VIP end users.
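The abstract describes querying neighboring smart objects through knowledge bases and semantic web standards, but gives no concrete vocabulary or endpoint. The sketch below is purely illustrative: it uses Python with rdflib to build a toy smart-object knowledge base and answer a simple user-style query ("what is going on in the kitchen?") with SPARQL. The namespace, class, and property names (`SmartObject`, `hasState`, `locatedIn`) are assumptions, not the ontology used by the SO2SEES.

```python
# Illustrative sketch only: a toy smart-object knowledge base queried with SPARQL.
# The ontology terms below are hypothetical placeholders, not the SO2SEES vocabulary.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF

EX = Namespace("http://example.org/so2sees#")  # assumed namespace

g = Graph()
g.bind("ex", EX)

# A few smart objects reporting their state and location as RDF triples.
g.add((EX.kettle, RDF.type, EX.SmartObject))
g.add((EX.kettle, EX.locatedIn, EX.kitchen))
g.add((EX.kettle, EX.hasState, Literal("boiling")))
g.add((EX.frontDoor, RDF.type, EX.SmartObject))
g.add((EX.frontDoor, EX.locatedIn, EX.hallway))
g.add((EX.frontDoor, EX.hasState, Literal("open")))

# A user question such as "what is happening in the kitchen?" could be mapped
# to a SPARQL query over the knowledge base.
results = g.query("""
    PREFIX ex: <http://example.org/so2sees#>
    SELECT ?obj ?state WHERE {
        ?obj a ex:SmartObject ;
             ex:locatedIn ex:kitchen ;
             ex:hasState ?state .
    }
""")

for obj, state in results:
    print(f"{obj.split('#')[-1]} is {state}")  # e.g. "kettle is boiling"
```

In a deployed system, the triples would come from IoT cloud platforms and the SO2SEES back-end rather than being hard-coded, and the query would be generated from the user's spoken or tactile input; this snippet only shows the general shape of a knowledge-base lookup.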