When ultrasonic sensors and computer vision join forces for efficient obstacle detection and recognition
Abstract
In its most recent report on people with visual disabilities, the World Health Organization highlights that by the year 2020, worldwide, the number of completely blind people will reach 75 million, while the number of visually impaired (VI) people will rise to 250 million. For this reason, the development of electronic travel aid (ETA) systems able to increase the safe mobility of VI people in indoor/outdoor spaces, while providing additional awareness of the environment, is essential. In this paper, we introduce a novel wearable assistive device designed to facilitate autonomous navigation in highly dynamic urban scenes. By combining two independent sources of information, ultrasonic sensors and the video camera embedded in a regular smartphone, the system can identify with high confidence static or highly dynamic objects present in the scene, regardless of their location, size, or shape. In addition, the proposed system is able to acquire information about the environment, semantically interpret it, and alert users to potentially dangerous situations through acoustic feedback. To determine the performance of the proposed methodology, we conducted an extensive objective and subjective experimental evaluation with the help of 21 VI subjects from two blind associations. At the end of the testing phase, users pointed out that our prototype is very helpful in increasing their mobility, while being user-friendly and easy to learn.
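To make the fusion idea concrete, the following is a minimal sketch of how ultrasonic range readings and camera-based detections could be combined to trigger an acoustic alert. The abstract does not specify the actual implementation, so all names, data types, and thresholds below (e.g. `fuse_and_alert`, the 1.5 m danger distance, the bearing tolerance) are illustrative assumptions, not the authors' method.

```python
from dataclasses import dataclass
from typing import List, Optional

# Hypothetical data types; all names and thresholds are illustrative assumptions.

@dataclass
class VisionDetection:
    label: str          # semantic class predicted from the smartphone camera
    confidence: float   # detector score in [0, 1]
    bearing_deg: float  # horizontal angle of the object relative to the user

@dataclass
class UltrasonicReading:
    distance_m: float   # range measured by one ultrasonic sensor
    bearing_deg: float  # orientation of that sensor on the wearable frame

def fuse_and_alert(detections: List[VisionDetection],
                   readings: List[UltrasonicReading],
                   danger_distance_m: float = 1.5,
                   min_confidence: float = 0.6,
                   bearing_tolerance_deg: float = 20.0) -> Optional[str]:
    """Return an acoustic alert message when an ultrasonic sensor reports an
    obstacle within the danger range; name the object if the camera detects
    something in roughly the same direction."""
    for reading in readings:
        if reading.distance_m > danger_distance_m:
            continue  # nothing close enough on this sensor to matter
        # Look for a camera detection pointing in roughly the same direction.
        for det in detections:
            if (det.confidence >= min_confidence and
                    abs(det.bearing_deg - reading.bearing_deg) <= bearing_tolerance_deg):
                return (f"{det.label} ahead at about "
                        f"{reading.distance_m:.1f} meters")
        # Ultrasonic-only fallback: something is close but unrecognized.
        return f"obstacle at about {reading.distance_m:.1f} meters"
    return None

if __name__ == "__main__":
    alert = fuse_and_alert(
        [VisionDetection("pedestrian", 0.82, bearing_deg=-5.0)],
        [UltrasonicReading(distance_m=1.2, bearing_deg=0.0)],
    )
    print(alert)  # -> "pedestrian ahead at about 1.2 meters"
```

In this sketch the ultrasonic readings gate the alert (they are reliable for range regardless of object size or shape), while the vision detections add the semantic label used in the spoken feedback; the real system's fusion logic is described in the body of the paper.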