Digital Library

Title:      MULTIMODAL INTERACTION AND ACCESS TO COMPLEX DATA
Author(s):      Vladislav Nemec, Pavel Zikovsky, Pavel Slavik
ISBN:      972-98947-1-X
Editors:      Pedro Isaías and Nitya Karmakar
Year:      2003
Edition:      2
Keywords:      Multimodal interaction, data adaptation, speech, user context.
Type:      Short Paper
First Page:      767
Last Page:      770
Language:      English
Paper Abstract:      Today's users want to access their data anywhere and at any time, in various environments and situations. The data itself can be very complex; the problem then lies in delivering such complex data to a user subject to the interaction limitations imposed by the current working environment (for example, noise preventing the user from receiving audio information), by the capabilities of the end device (e.g. a PDA unable to display large images), by the specific needs of a user group (e.g. visually impaired users), etc. One possible solution is to allow interaction and deliver data in different modalities: audio (speech or "abstract" sounds), visual, tactile, etc. Such a multimodal system must also cope with dynamically changing conditions (e.g. the working environment, and the user context in general). Our project focuses on defining an architecture that allows users to access complex data through different input/output methods, such as images from a digital camera as input or speech and haptic feedback as output from the system. We propose a plug-in oriented modular architecture providing means for multimodal interaction and automatic data adaptation and transformation. Our research also addresses the problem of the modalities themselves: we have performed a number of preliminary tests inspecting the "validity" of several modality concepts. These tests and their results are described in the second part of this paper.
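The plug-in oriented selection of an output modality based on the current user context, as described in the abstract, can be sketched as follows. This is a minimal illustrative sketch only, not the authors' implementation; all class and parameter names (`ModalityPlugin`, `suitable`, the `noisy`/`has_display` context keys, etc.) are assumptions chosen for this example.

```python
# Hypothetical sketch of a plug-in based multimodal output system:
# each modality is a plug-in, and the system picks one that suits
# the current user context (noise, device capabilities, user needs).

class ModalityPlugin:
    """Base class for an output-modality plug-in (illustrative)."""
    name = "base"

    def suitable(self, context):
        raise NotImplementedError

    def render(self, data):
        raise NotImplementedError


class SpeechOutput(ModalityPlugin):
    name = "speech"

    def suitable(self, context):
        # Speech output is unusable in a noisy environment.
        return not context.get("noisy", False)

    def render(self, data):
        return f"[spoken] {data}"


class VisualOutput(ModalityPlugin):
    name = "visual"

    def suitable(self, context):
        # Visual output needs a display and a sighted user.
        return (context.get("has_display", True)
                and not context.get("visually_impaired", False))

    def render(self, data):
        return f"[shown] {data}"


class MultimodalSystem:
    def __init__(self):
        self.plugins = []

    def register(self, plugin):
        self.plugins.append(plugin)

    def present(self, data, context):
        # Pick the first registered modality suitable for the context.
        for plugin in self.plugins:
            if plugin.suitable(context):
                return plugin.render(data)
        raise RuntimeError("no suitable modality for this context")


system = MultimodalSystem()
system.register(SpeechOutput())
system.register(VisualOutput())

# In a noisy environment speech is rejected, so visual output is chosen.
print(system.present("train departs at 9:40", {"noisy": True}))
```

In this sketch, registration order doubles as a preference order; a fuller system along the lines the paper proposes would also adapt and transform the data itself for the chosen modality, not just select the renderer.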
