ISCA Archive Interspeech 2006

Developing speech dialogs for multimodal HMIs using finite state machines

Silke Goronzy, Raquel Mochales, Nicole Beringer

We present a tool for model-based development of multimodal interfaces. The HMI model captures all involved modalities, thus ensuring highly consistent interfaces. In this paper we focus on the development of speech dialogs. These are specified using state machines, in contrast to the traditional approach of using flow charts. State machines allow the HMI to be specified completely, so that it contains enough information both to be fully simulated without connecting any target applications and to serve as the basis for automatic target code generation. Thanks to these extensive simulation capabilities, usability evaluations can be conducted at very early design stages. We further explain how different dialog strategies for different user types can be developed with the help of the user-modelling plug-in. The tool thus supports the entire development chain, from design studies through specification, development, testing and usability studies to target implementation.
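To illustrate the idea of specifying a speech dialog as a state machine rather than a flow chart, the following minimal Python sketch models a dialog as explicit (state, event) transitions with associated prompts. The state names, events, and prompts are illustrative assumptions and are not taken from the tool described in the paper; an actual HMI model would carry far richer annotations for simulation and code generation.

```python
# Minimal sketch: a speech dialog modelled as a finite state machine.
# All state names, events, and prompts are hypothetical examples.

from dataclasses import dataclass, field


@dataclass
class DialogFSM:
    state: str = "idle"
    # (current state, event) -> (next state, system prompt)
    transitions: dict = field(default_factory=lambda: {
        ("idle", "user_speaks"): ("collect_destination", "Where do you want to go?"),
        ("collect_destination", "destination_given"): ("confirm", "Did you say {slot}?"),
        ("confirm", "yes"): ("done", "Starting route guidance."),
        ("confirm", "no"): ("collect_destination", "Please repeat the destination."),
    })

    def step(self, event: str, slot: str = "") -> str:
        key = (self.state, event)
        if key not in self.transitions:
            # Unknown event: stay in the current state and re-prompt.
            return "Sorry, I did not understand."
        self.state, prompt = self.transitions[key]
        return prompt.format(slot=slot)


if __name__ == "__main__":
    fsm = DialogFSM()
    print(fsm.step("user_speaks"))                       # Where do you want to go?
    print(fsm.step("destination_given", slot="Munich"))  # Did you say Munich?
    print(fsm.step("yes"))                                # Starting route guidance.
```

Because every transition is enumerated explicitly, such a model can be stepped through in simulation without any connected target application, which is the property the abstract highlights as enabling early usability evaluation and automatic code generation.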