The Tools of the Trade


May 8, 2012 @ 14:30, Room: 12AB

Chair: Jennifer Thom-Santelli, IBM Research, USA
A Hybrid Mass Participation Approach to Mobile Software Trials - Paper
Contribution & Benefit: Describes a methodology that combines a simultaneous 'app store'-style mobile software trial with a local deployment. Allows observed behaviour to be explained, findings to be verified to prevent misleading results, and more solid ethical practice.
Abstract » User trials of mobile applications have followed a steady march out of the lab, and progressively further ‘into the wild’, recently involving ‘app store’-style releases of software to the general public. Yet from our experiences on these mass participation systems and a survey of the literature, we identify a number of reported difficulties. We propose a hybrid methodology that aims to address these, by combining a global software release with a concurrent local trial. A phone-based game, created to explore the uptake and use of ad hoc peer-to-peer networking, was evaluated using this new hybrid trial method, combining a small-scale local trial (11 users) with a ‘mass participation’ trial (over 10,000 users). Our hybrid method offers many benefits, allowing locally observed findings to be verified, patterns in globally collected data to be explained, and ethical issues raised by the mass participation approach to be addressed. We note trends in the local trial that did not appear in the larger scale deployment, and which would therefore have led to misleading results were the application trialled using ‘traditional’ methods alone. Based on this study and previous experience, we provide a set of guidelines to researchers working in this area.
"Yours is Better!" Participant Response Bias in HCI - Paper
Contribution & Benefit: Interviewer demand characteristics can lead to serious experimental biases in HCI. Our study in Bangalore, India shows that researchers should expect significant response biases, especially when interacting with underprivileged populations.
Abstract » Although HCI researchers and practitioners frequently work with groups of people that differ significantly from themselves, little attention has been paid to the effects these differences have on the evaluation of HCI systems. Via 450 interviews in Bangalore, India, we measure participant response bias due to interviewer demand characteristics and the role of social and demographic factors in influencing that bias. We find that respondents are about 2.5x more likely to prefer a technological artifact they believe to be developed by the interviewer, even when the alternative is identical. When the interviewer is a foreign researcher requiring a translator, the bias towards the interviewer's artifact increases to 5x. In fact, the interviewer's artifact is preferred even when it is degraded to be obviously inferior to the alternative. We conclude that participant response bias should receive more attention within the CHI community, especially when designing for underprivileged populations.
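The "2.5x more likely" and "5x" figures in the abstract are simple preference ratios between the two artifact conditions. A minimal sketch of that calculation, using hypothetical counts (not the study's actual data), might look like:

```python
def preference_ratio(prefers_interviewer: int, prefers_alternative: int) -> float:
    """Ratio of respondents preferring the interviewer's artifact to those
    preferring the (identical) alternative. A ratio of 1.0 means no bias."""
    if prefers_alternative == 0:
        raise ValueError("no preferences for the alternative; ratio undefined")
    return prefers_interviewer / prefers_alternative

# Hypothetical counts for illustration only: 125 of 175 respondents
# say they prefer the artifact attributed to the interviewer.
print(preference_ratio(125, 50))  # → 2.5
```

The same ratio computed per condition (local interviewer vs. foreign interviewer with translator) would reproduce the 2.5x vs. 5x comparison the abstract reports.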
Digital Pen and Paper Practices in Observational Research - Paper
Community: user experience
Contribution & Benefit: We present digital pen and paper practices and their integration with ChronoViz, documenting the co-evolution of notetaking and system features as participants used the tool during an 18-month field deployment.
Abstract » Researchers from many disciplines are taking advantage of increasingly inexpensive digital video to capture extensive records of human activity in real-world settings. The ability to record and share such data has created a critical moment in the practice and scope of behavioral research. While recent work is beginning to develop techniques for visualizing and interacting with integrated multimodal information collected during field research, navigating and analyzing these large datasets remains challenging and tools are especially needed to support the early stages of data exploration.

In this paper we describe digital pen and paper practices in observational research and their integration with ChronoViz, a tool for annotating, visualizing, and analyzing multimodal data. The goal is to better support researchers both in the field, while collecting data, and later in the lab, during analysis. We document the co-evolution of notetaking practices and system features as 28 participants used the tool during an 18-month deployment.
User See, User Point: Gaze and Cursor Alignment in Web Search - Paper
Community: user experience
Contribution & Benefit: Describes a lab study of alignment in eye-gaze and mouse cursor positions in Web search. Studies when gaze and cursor are aligned, and presents a model for predicting visual attention.
Abstract » Past studies of user behavior in Web search have correlated eye-gaze and mouse cursor positions, and other lines of research have found cursor interactions to be useful in determining user intent and relevant parts of Web pages. However, cursor interactions are not all the same; different types of cursor behavior patterns exist, such as reading, hesitating, scrolling, and clicking, each of which has a different meaning. We conduct a search study with 36 subjects and 32 search tasks to determine when gaze and cursor are aligned, and thus when the cursor position is a good proxy for gaze position. We study the effect of time, behavior patterns, user, and search task on gaze-cursor alignment; our findings lead us to question the maxim that "gaze is well approximated by cursor." These lessons inform an experiment in which we predict the gaze position with better accuracy than simply using the cursor position, improving the state-of-the-art technique for approximating visual attention with the cursor. Our new technique can help make better use of large-scale cursor data in identifying how users examine Web search pages.
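Gaze-cursor alignment of the kind this abstract studies is typically quantified as the distance between time-aligned gaze and cursor samples. A minimal illustrative sketch (the coordinate samples are invented, not from the study, and this is not the authors' actual metric or model):

```python
import math

def alignment_distances(gaze, cursor):
    """Per-sample Euclidean distance (pixels) between gaze and cursor
    positions sampled at the same timestamps. Smaller distances mean the
    cursor is a better proxy for gaze at that moment."""
    return [math.dist(g, c) for g, c in zip(gaze, cursor)]

def mean_alignment(gaze, cursor):
    """Average gaze-cursor distance over a session segment."""
    d = alignment_distances(gaze, cursor)
    return sum(d) / len(d)

# Hypothetical samples in screen pixels, for illustration only.
gaze = [(100, 200), (120, 210), (400, 300)]
cursor = [(103, 204), (120, 210), (150, 290)]
print(round(mean_alignment(gaze, cursor), 1))
```

Segmenting such distances by the behavior patterns the abstract names (reading, hesitating, scrolling, clicking) is one way alignment could vary by pattern, which is the kind of effect the study examines.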