User Interface design has been evolving for years to keep pace with emerging technologies that affect human-computer interaction. Formal techniques for command-line interfaces, drag-and-drop User Interface creation for Graphical User Interfaces, and Model-Based Interface Design for multi-device User Interfaces are a few examples of this evolution. Communication channels between devices are becoming faster and more reliable, and will soon be omnipresent. Human interaction will no longer revolve around a single device, but will involve a network of devices cooperating as a set of interaction resources. Based on well-known concepts from distributed systems and model-based User Interface design, we develop a model for distributed user interfaces. One missing piece is support for distributed interfaces in the models used during user interface design. We discuss how Dygimes, a testbed for model-based user interface development, provides this support. User mobility is a key issue, since it is mostly ubiquitous environments that will benefit from distributed User Interfaces. Location transparency therefore plays an important role in both human-device and device-device interaction.
Abstract. In this paper we present an approach that allows web interfaces to be dynamically distributed among several interconnected heterogeneous devices in an environment, in order to support the tasks and activities the user performs. The approach uses a lightweight HTTP-based daemon as a distribution manager and RelaxNG schemas to describe the user interfaces of services offered by native applications. From these service descriptions, the XHTML-based user interface is generated.
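As a hedged sketch of how such a distribution manager might work, the following Python program serves a generated XHTML interface from an XML service description. The class names, port, and description format are our own illustrative assumptions rather than the paper's actual implementation, and the RelaxNG validation step is omitted.

# Minimal sketch of an HTTP-based distribution manager (illustrative only).
# It loads a hypothetical XML service description for a native application
# and serves a generated XHTML interface to any device that requests it.
from http.server import BaseHTTPRequestHandler, HTTPServer
import xml.etree.ElementTree as ET

# Hypothetical service description; the paper validates such descriptions
# against RelaxNG schemas, which we do not reproduce here.
SERVICE_DESCRIPTION = """
<service name="lights">
  <action id="toggle" label="Toggle lights"/>
  <action id="dim" label="Dim lights"/>
</service>
"""

def generate_xhtml(description: str) -> str:
    """Turn a service description into a simple XHTML page of controls."""
    root = ET.fromstring(description)
    buttons = "".join(
        f'<button name="{a.get("id")}">{a.get("label")}</button>'
        for a in root.findall("action")
    )
    return (
        '<html xmlns="http://www.w3.org/1999/xhtml"><body>'
        f'<h1>{root.get("name")}</h1>{buttons}</body></html>'
    )

class DistributionManager(BaseHTTPRequestHandler):
    def do_GET(self):
        # Any connected device fetching this URL receives the generated UI.
        body = generate_xhtml(SERVICE_DESCRIPTION).encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "application/xhtml+xml")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("", 8080), DistributionManager).serve_forever()

Each device in the environment would fetch its share of the interface from such a daemon; in the sketch every device receives the full interface, whereas a real distribution manager would partition it per device.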
Abstract. This paper describes an approach that uses task modelling for the development of distributed and multimodal user interfaces. We propose to enrich tasks with possible interaction modalities so that the user can perform these tasks using an appropriate modality. The information in the augmented task model can then be used in a generic runtime architecture, which we have extended to support runtime decisions on distributing the user interface among several devices based on the specified interaction modalities. The approach was tested in the implementation of several case studies, one of which is presented in this paper to clarify the approach.

Keywords: Task-based development, model-based user interface development, distributed user interfaces, multimodal user interfaces.
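To make the idea of modality-enriched tasks concrete, here is a small hypothetical sketch in Python: a task model whose leaf tasks carry allowed interaction modalities, and a runtime step that assigns each leaf task to a connected device supporting one of those modalities. All class, device, and task names are our own illustrative assumptions, not the authors' architecture.

# Illustrative sketch of a modality-annotated task model and a runtime
# distribution step (names and structure are assumptions).
from dataclasses import dataclass, field

@dataclass
class Task:
    name: str
    modalities: set[str]              # modalities the task may be performed with
    subtasks: list["Task"] = field(default_factory=list)

@dataclass
class Device:
    name: str
    supported: set[str]               # modalities the device can render

def distribute(task: Task, devices: list[Device]) -> dict[str, str]:
    """Assign each leaf task to the first device sharing a modality with it."""
    assignment: dict[str, str] = {}
    if not task.subtasks:
        for device in devices:
            if task.modalities & device.supported:
                assignment[task.name] = device.name
                break
    for sub in task.subtasks:
        assignment.update(distribute(sub, devices))
    return assignment

# Example: a "control music" task split over a headset (voice) and a phone (GUI).
model = Task("control music", set(), [
    Task("browse playlist", {"gui"}),
    Task("issue playback command", {"voice", "gui"}),
])
devices = [Device("headset", {"voice"}), Device("phone", {"gui"})]
print(distribute(model, devices))
# {'browse playlist': 'phone', 'issue playback command': 'headset'}

The sketch shows only the matching step; the paper's runtime architecture would additionally react to devices joining or leaving and to context changes when deciding where each part of the interface goes.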
This paper presents a modular runtime architecture supporting our model-based approach for designing context-aware, distributable user interfaces for ambient intelligent environments.