2009 IEEE International Conference on Systems, Man, and Cybernetics
Abstract
Development of graphical user interfaces (GUIs) for multiple devices is still a time-consuming and error-prone task. Each class of physical devices, and indeed each application-tailored set of physical devices, has different properties and thus needs a specifically tailored GUI. Current model-driven GUI generation approaches take only a few properties into account, such as screen size. Additional device properties, especially pointing granularity and the available input devices, allow generating GUIs suited to certain classes of devices, such as touch screens. This paper builds on a model-driven UI development approach for multiple devices that is based on a discourse model providing an interaction design. The approach generates UIs from an extended device specification by applying model-transformation rules that take these additional properties into account. In particular, we show how to semi-automatically generate finger-based touch screen UIs and compare them with usual UIs for use with a mouse that have also been generated semi-automatically.
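To make the idea of an extended device specification concrete, the following is a minimal, hypothetical sketch (not the paper's actual model or tooling): a device description carrying pointing granularity and input devices, and a simple transformation rule that derives a minimum widget size from it. All names (`DeviceSpec`, `choose_button_size`) and the specific numbers are illustrative assumptions.

```python
# Hypothetical sketch of an extended device specification driving a
# model-transformation rule. Names and values are illustrative only.
from dataclasses import dataclass


@dataclass
class DeviceSpec:
    screen_width_px: int
    screen_height_px: int
    pointing_granularity_mm: float   # e.g. ~1 mm for a mouse pointer, ~9 mm for a fingertip
    input_devices: tuple             # e.g. ("mouse", "keyboard") or ("finger",)


def choose_button_size(spec: DeviceSpec, dpi: int = 96) -> int:
    """Return a minimum button edge length in pixels so that the target
    is at least as large as the device's pointing granularity."""
    px_per_mm = dpi / 25.4                                 # 25.4 mm per inch
    min_px = int(round(spec.pointing_granularity_mm * px_per_mm))
    return max(min_px, 16)                                 # baseline widget size


mouse = DeviceSpec(1280, 1024, 1.0, ("mouse", "keyboard"))
touch = DeviceSpec(1024, 768, 9.0, ("finger",))

print(choose_button_size(mouse))  # small targets suffice for a mouse
print(choose_button_size(touch))  # finger-based touch screens need larger targets
```

In this sketch, the same abstract interaction design would be rendered with larger, more widely spaced widgets on a finger-operated touch screen than on a mouse-operated desktop, which mirrors the kind of device-dependent tailoring the abstract describes.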