Abstract-With touch-based interfaces becoming commonplace on personal computing devices ranging from phones and slates to notebook and desktop PCs, a number of common tasks that were once performed using mouse or keyboard input must now be performed with fingers on the touch surface. Finger-drawn gestures offer a viable alternative to desktop icons and keyboard shortcuts for common tasks such as launching applications and navigating large media collections; for example, tracing the gesture 'O' may launch MS Outlook. To be truly effective, the interface for defining, managing, and invoking gestures should be highly intuitive and optimized for the device. In particular, the process of invoking gestures should be seamless and natural, and recognition needs to be robust for the specific user. In this paper, we describe GeCCo (Gesture Command and Control), a system for personalized finger-gesture shortcuts on touch-enabled desktops and trackpad-enabled notebook PCs. A key issue addressed in the design of GeCCo is mode switching in the context of notebook PCs; we describe a user study to decide between different interactions for mode switching, designed so that the mode switch and the gesture can be indicated simultaneously. Since new gestures may be defined by the user at any time, statistical pattern-classification techniques that require large numbers of training samples per gesture are not practical. Instead, we use nearest-neighbor classification with Dynamic Time Warping (DTW) distance, together with a writer-adaptation scheme for improving accuracy to the desired level. We conclude the paper with experimental results and some thoughts on next steps.

Index Terms-gesture recognition; touch gestures; mode switching; adaptation
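As a concrete illustration of the recognition approach named in the abstract, the following is a minimal sketch of nearest-neighbor gesture classification under Dynamic Time Warping (DTW) distance. The gesture traces, template names, and function names here are our own illustrative assumptions, not the GeCCo implementation.

```python
def dtw_distance(a, b):
    """Classic O(len(a)*len(b)) DTW between two sequences of (x, y) points,
    with Euclidean point-to-point cost."""
    n, m = len(a), len(b)
    INF = float("inf")
    # d[i][j] = DTW cost of aligning a[:i] with b[:j]
    d = [[INF] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = ((a[i-1][0] - b[j-1][0]) ** 2 +
                    (a[i-1][1] - b[j-1][1]) ** 2) ** 0.5
            d[i][j] = cost + min(d[i-1][j], d[i][j-1], d[i-1][j-1])
    return d[n][m]

def classify(trace, templates):
    """Return the label of the template nearest to `trace` under DTW."""
    return min(templates, key=lambda label: dtw_distance(trace, templates[label]))

# Illustrative user-defined templates: a rough 'O' loop and a rough 'L' stroke.
templates = {
    "O": [(1, 0), (0, 1), (-1, 0), (0, -1), (1, 0)],
    "L": [(0, 2), (0, 1), (0, 0), (1, 0), (2, 0)],
}
print(classify([(0, 2), (0, 0.9), (0, 0), (1.1, 0), (2, 0)], templates))  # → L
```

Because each user-defined gesture needs only one or a few stored templates, this scheme fits the paper's setting where gestures are added at any time; writer adaptation can then be realized by adding or reweighting a user's own samples as templates.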