Chair: Hrvoje Benko, Microsoft Research, USA
Gesture Coder: A Tool for Programming Multi-Touch Gestures by Demonstration
Contribution & Benefit: We present Gesture Coder, a tool for programming multi-touch gestures by demonstration. It significantly lowers the threshold of programming multi-touch gestures.
Abstract » Multi-touch gestures have become popular on a wide range of touchscreen devices, but the programming of these gestures remains an art. It is time-consuming and error-prone for a developer to handle the complicated touch state transitions that result from multiple fingers and their simultaneous movements. In this paper, we present Gesture Coder, which, by learning from a few examples given by the developer, automatically generates code that recognizes multi-touch gestures, tracks their state changes and invokes corresponding application actions. Developers can easily test the generated code in Gesture Coder, refine it by adding more examples, and, once satisfied with its performance, integrate the code into their applications. We evaluated our learning algorithm exhaustively under various conditions over a large set of noisy data. Our results show that it is sufficient for rapid prototyping and can be improved with higher-quality and more training data. We also evaluated Gesture Coder's usability through a within-subject study in which we asked participants to implement a set of multi-touch interactions with and without Gesture Coder. The results show overwhelmingly that Gesture Coder significantly lowers the threshold of programming multi-touch gestures.
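To illustrate the kind of state-transition handling the abstract describes, here is a minimal sketch of a hand-written two-finger pinch recognizer driven by raw touch events. This is our own hypothetical example, not code generated by Gesture Coder; all names (`PinchRecognizer`, `on_touch_down`, etc.) are assumptions made for illustration.

```python
import math

class PinchRecognizer:
    """Hypothetical two-finger pinch recognizer: the kind of touch
    state-tracking code a developer would otherwise write by hand."""

    def __init__(self, threshold=20.0):
        self.touches = {}          # touch id -> (x, y)
        self.start_dist = None     # finger distance when the 2nd finger lands
        self.threshold = threshold # movement (px) needed to report a pinch
        self.state = "possible"    # possible -> changed, reset on touch up

    def _dist(self):
        # Distance between the two tracked fingers.
        (x1, y1), (x2, y2) = self.touches.values()
        return math.hypot(x2 - x1, y2 - y1)

    def on_touch_down(self, tid, x, y):
        self.touches[tid] = (x, y)
        if len(self.touches) == 2:
            self.start_dist = self._dist()

    def on_touch_move(self, tid, x, y):
        self.touches[tid] = (x, y)
        if len(self.touches) == 2:
            delta = self._dist() - self.start_dist
            if abs(delta) > self.threshold:
                self.state = "changed"
                return "zoom-in" if delta > 0 else "zoom-out"
        return None  # not (yet) a pinch

    def on_touch_up(self, tid):
        self.touches.pop(tid, None)
        self.state = "possible"
```

Even this single gesture requires bookkeeping across three callbacks; handling several gestures at once multiplies the state transitions, which is the burden Gesture Coder's generated code takes over.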
Proton: Multitouch Gestures as Regular Expressions
Contribution & Benefit: Describes a framework that allows developers to declaratively specify multitouch gestures as regular expressions. Supports static analysis of gesture conflicts and the creation of gestures via a graphical editor.
Abstract » Current multitouch frameworks require application developers to write recognition code for custom gestures; this code is split across multiple event-handling callbacks. As the number of custom gestures grows it becomes increasingly difficult to 1) know if new gestures will conflict with existing gestures, and 2) know how to extend existing code to reliably recognize the complete gesture set. Proton is a novel framework that addresses both of these problems. Using Proton, the application developer declaratively specifies each gesture as a regular expression over a stream of touch events. Proton statically analyzes the set of gestures to report conflicts, and it automatically creates gesture recognizers for the entire set. To simplify the creation of complex multitouch gestures, Proton introduces gesture tablature, a graphical notation that concisely describes the sequencing of multiple interleaved touch actions over time. Proton contributes a graphical editor for authoring tablatures and automatically compiles tablatures into regular expressions. We present the architecture and implementation of Proton, along with three proof-of-concept applications. These applications demonstrate the expressiveness of the framework and show how Proton simplifies gesture definition and conflict resolution.
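The core idea of gestures as regular expressions over a touch-event stream can be sketched in a few lines. This is an illustrative toy, not Proton's actual notation or API: here each event is encoded as a symbol (D/M/U for down/move/up, followed by a touch id), and a gesture matches when its expression matches the full event string. Matching the same stream against every gesture also shows how overlap conflicts surface.

```python
import re

# Toy gesture expressions over an event string (not Proton's real syntax).
TAP        = re.compile(r"D1(M1){0,2}U1")      # down, little movement, up
DRAG       = re.compile(r"D1(M1)+U1")          # down, sustained movement, up
TWO_FINGER = re.compile(r"D1D2(M1|M2)+U1U2")   # two-finger pan/zoom

GESTURES = {"tap": TAP, "drag": DRAG, "two-finger": TWO_FINGER}

def matching_gestures(event_stream, gestures=GESTURES):
    """Return the names of all gestures whose expression matches the
    whole event stream; multiple matches signal a gesture conflict."""
    return [name for name, rx in gestures.items()
            if rx.fullmatch(event_stream)]
```

For example, the stream `"D1M1U1"` matches both `tap` and `drag`: exactly the kind of conflict Proton's static analysis is designed to report before the application ships.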
Bootstrapping Personal Gesture Shortcuts with the Wisdom of the Crowd and Handwriting Recognition
Contribution & Benefit: Presents a novel approach for bootstrapping personal gesture shortcuts, using a combination of crowdsourcing and handwriting recognition. Makes gesture-based interaction more scalable by alleviating the effort of defining gesture shortcuts beforehand.
Abstract » Personal user-defined gesture shortcuts have shown great potential for accessing the ever-growing amount of data and computing power on touchscreen mobile devices. However, their lack of scalability is a major challenge for their wide adoption. In this paper, we present Gesture Marks, a novel approach to touch-gesture interaction that allows a user to access applications and websites using gestures without having to define them first. It offers two distinctive solutions to address the problem of scalability. First, it leverages the "wisdom of the crowd", a continually evolving library of gesture shortcuts collected from the user population, to infer the meaning of gestures that a user never defined themselves. Second, it combines an extensible template-based gesture recognizer with a specialized handwriting recognizer to better handle handwriting-based gestures, which are a common form of gesture shortcut. These approaches effectively bootstrap a user's personal gesture library, alleviating the need to define most gestures manually. Our work was motivated and validated via a series of user studies, and the findings from these studies add to the body of knowledge on gesture-based interaction.
Self-Revealing Gestures: Teaching New Touch Interactions in Windows 8
- Long Case Study
Contribution & Benefit: Case study describing a design process for a teaching method for new touch gestures in Windows 8. Can assist designers in understanding how touch interactions can be taught during interaction.
Abstract » The touch language we use to interact with computers and devices is still developing. How can we teach users of our systems new touch gestures without interfering with their user experience? A team of user experience designers and researchers went through an iterative process to design a teaching method for two new touch interactions. This case study describes the designs they created, their insights from user studies, and the final design that will be implemented in Windows 8.