Wednesday, April 22, 15:00 - 17:30, Location: Show and Tell Area B
Ralf Kirchherr, Joachim Stegmann, Felix Burkhardt
The proposed demonstrator is a multimodal application specifically designed for smartphones, which illustrates the benefits of a graphical user interface enriched by voice and motion control. The multimodal concept is shown by example in a message box application that gives mobile access to the user's voice, email, and fax messages.
Based on their personal preferences, users can freely choose the input modality most appropriate to their current situation, whether touch, voice, or motion. Unlike most mobile applications, which are often restricted to a single input modality or support other modalities only in a very limited way, the multimodal interaction concept presented here is fully implemented across all dialog steps.
Interaction by touch, voice, or motion is enriched by graphical, acoustic, and haptic feedback.
Touch control is well known as an intuitive way to navigate large lists by sliding and scrolling. However, voice control can be far better suited when the user has to search or sort large lists, because several sorting attributes can easily be combined in a single utterance (e.g. "Show me all unread emails from Peter from last week!").
The speech grammar also supports complex voice commands that serve as shortcuts independent of the given menu structure; these are not limited to simple menu options but also cover more sophisticated search functions, as sketched below.
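The demonstrator's actual grammar and parser are not described here; the following minimal Python sketch only illustrates the underlying idea of mapping one utterance to several attributes at once. All attribute names and matching patterns are illustrative assumptions, not the demo's API.

```python
import re
from datetime import date, timedelta

def parse_command(utterance):
    """Hypothetical sketch: extract several filter attributes from one utterance."""
    filters = {}
    text = utterance.lower()
    # read state of the message
    if "unread" in text:
        filters["status"] = "unread"
    # message type (the message box holds voice, email, and fax messages)
    for msg_type in ("email", "voice", "fax"):
        if msg_type in text:
            filters["type"] = msg_type
    # sender name, taken from the first "from <name>" phrase
    sender = re.search(r"from (\w+)", text)
    if sender:
        filters["sender"] = sender.group(1).capitalize()
    # coarse time constraint
    if "last week" in text:
        filters["since"] = date.today() - timedelta(weeks=1)
    return filters

print(parse_command("Show me all unread emails from Peter from last week!"))
# -> {'status': 'unread', 'type': 'email', 'sender': 'Peter', 'since': ...}
```

A single spoken command thus replaces several separate touch operations (select type, select sender, select time range), which is the shortcut effect described above.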
The underlying ASR is realized as an efficient embedded software solution, so the voice control functionality can also be used in offline mode.
Motion control is offered as a further input modality alongside touch and voice. In situations where one-handed operation is desirable, the entire application can be controlled by intuitive, context-sensitive tilt gestures that trigger the operations available in the current dialog step. Motion control is based on the phone's built-in acceleration sensor and is activated simply by pushing the central cursor button. Beyond one-handed operation, motion control is just as practical for speeding up navigation in large lists, because it is implemented dynamically: the speed of interaction is derived directly from the intensity of the gesture, as sketched below.
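As a rough illustration of this dynamic behavior, the following Python sketch maps a normalized tilt value to a scroll speed. The sensor interface, thresholds, and the quadratic mapping are assumptions for illustration; the demonstrator's actual parameters are not published here.

```python
# Hypothetical sketch of dynamic, tilt-controlled list scrolling: the scroll
# speed grows with the tilt intensity reported by the acceleration sensor.

DEAD_ZONE = 0.1    # ignore small tilts so the list does not drift
MAX_TILT = 1.0     # normalized tilt magnitude treated as full deflection
MAX_SPEED = 20.0   # maximum scroll speed in list items per second

def scroll_speed(tilt):
    """Map a signed, normalized tilt value in [-1, 1] to a scroll speed."""
    if abs(tilt) < DEAD_ZONE:
        return 0.0
    magnitude = min(abs(tilt), MAX_TILT)
    # quadratic mapping: gentle tilts scroll slowly, strong tilts scroll fast
    speed = MAX_SPEED * (magnitude / MAX_TILT) ** 2
    return speed if tilt > 0 else -speed

for t in (0.05, 0.3, 0.7, -1.0):
    print(f"tilt {t:+.2f} -> {scroll_speed(t):+.1f} items/s")
```

A dead zone and a non-linear mapping of this kind are common design choices for tilt input, since they keep the list stable while the phone is held normally yet still allow fast traversal of long lists with a pronounced gesture.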
The integrated acoustic and especially the haptic feedback support interaction in busy environments, since the user can 'feel' the system's reaction.
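The feedback concept can be summarized as follows: every confirmed user action triggers graphical, acoustic, and haptic feedback together. The Python sketch below shows this dispatch pattern only; the three backend functions stand in for platform-specific APIs and are purely hypothetical.

```python
def show_highlight(action):   # graphical feedback (placeholder)
    print(f"[screen] highlight '{action}'")

def play_click(action):       # acoustic feedback (placeholder)
    print(f"[audio]  click for '{action}'")

def vibrate(action):          # haptic feedback (placeholder)
    print(f"[motor]  short vibration for '{action}'")

def confirm_action(action):
    # fire all feedback channels so at least one is noticed in any environment
    for feedback in (show_highlight, play_click, vibrate):
        feedback(action)

confirm_action("open message")
```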
The proposed application is realized as a pure software solution on a state-of-the-art smartphone running Windows Mobile. The demonstrator is characterized by a consistent user interface that allows the entire application to be controlled by touch, voice, or motion, according to the personal preferences of the user.