Adaptable multimodal interfaces in pervasive environments
Abstract
In the context of pervasive environments, multimodal interaction plays a pivotal role because multimodality provides flexibility and naturalness of interaction. The challenge for multimodal interfaces in pervasive environments is therefore to build reliable and autonomic processing systems able to analyze and understand multiple interaction modalities and to reconfigure themselves in real time. Addressing this issue, we have developed an autonomic framework called DynaMo (Dynamic multiMOdality) for the development and runtime management of multimodal interaction in pervasive environments. DynaMo is composed of a specification language dedicated to the multimodality domain and a runtime machine that instantiates specifications written in this language. In this paper, we present the overall architecture of DynaMo, which is based on partial interaction models, and show how these models are completed at runtime to build multimodal interfaces adapted to the local execution environment.