Tuesday, November 15, 2011

Paper Reading #28: Experimental analysis of touch-screen gesture designs in mobile environments

References
Experimental analysis of touch-screen gesture designs in mobile environments by Andrew Bragdon, Eugene Nelson, Yang Li, and Ken Hinckley.  Published in CHI '11: Proceedings of the 2011 Annual Conference on Human Factors in Computing Systems.



Author Bios

  • Andrew Bragdon is currently a PhD student at Brown University.
  • Eugene Nelson is currently a PhD student at Brown University.
  • Yang Li is a researcher at Google and holds a PhD from the Chinese Academy of Sciences.
  • Ken Hinckley is a Principal Researcher at Microsoft Research and holds a PhD from the University of Virginia.

Summary

  • Hypothesis
    • Bezel- and mark-based gestures can offer faster, more accurate performance for mobile touch-screen interaction while demanding less of the user's attention.
  • Methods 
    • 15 participants performed a series of tasks designed to model varying levels of distraction and to measure their interaction with the mobile device.  The study crossed two motor activities, sitting and walking, with three levels of distraction, ranging from no distraction at all to an attention-saturating task.  Participants completed a pre-study questionnaire and received instruction on how to complete the tasks, along with a demonstration.
  • Results
    • Bezel marks had the lowest mean completion time, though there was no significant difference in mean completion time between soft buttons and hard-button marks.  There was also no significant difference between soft-button paths and bezel paths, but hard-button paths showed a noticeable increase in mean completion time over bezel paths.  Under direct attention (no distraction), bezel marks and soft buttons performed similarly, while across the various distraction types bezel marks significantly and consistently outperformed soft buttons.
  • Contents
    • This paper examines user interaction with soft buttons, hard buttons, and gestures, and observes how distractions affect these interactions.  The results of the experiments indicate that direct touch gestures can match the performance and accuracy of soft buttons when the user's attention is focused, and actually improve performance in the presence of distractions.  The authors found that bezel-initiated gestures were the fastest and most preferred by users, and that mark-based gestures were faster and more accurate to perform than free-form path gestures (a rough sketch of the bezel idea follows this list).
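
To make the bezel idea concrete: a bezel-initiated gesture begins with a touch at the physical edge of the screen and continues inward, and a mark-based gesture is then classified by the dominant direction of the stroke rather than by matching a free-form path.  The snippet below is only a minimal, hypothetical sketch of that idea; the 16-pixel margin, the function names, and the point type are my own assumptions for illustration, not the authors' implementation.

    // Hypothetical sketch: treating a touch stroke as a bezel-initiated mark.
    interface Point {
      x: number;
      y: number;
    }

    const BEZEL_MARGIN_PX = 16; // assumed width of the bezel activation zone

    // A stroke counts as bezel-initiated if its first point lies within the
    // margin of any screen edge.
    function isBezelInitiated(start: Point, screenW: number, screenH: number): boolean {
      return (
        start.x <= BEZEL_MARGIN_PX ||
        start.y <= BEZEL_MARGIN_PX ||
        start.x >= screenW - BEZEL_MARGIN_PX ||
        start.y >= screenH - BEZEL_MARGIN_PX
      );
    }

    // A mark is classified by the dominant axis of its overall displacement
    // (e.g. "right" or "up") instead of matching a free-form path.
    function classifyMark(start: Point, end: Point): "left" | "right" | "up" | "down" {
      const dx = end.x - start.x;
      const dy = end.y - start.y;
      if (Math.abs(dx) >= Math.abs(dy)) {
        return dx >= 0 ? "right" : "left";
      }
      return dy >= 0 ? "down" : "up";
    }

    // Example: a stroke starting at the left bezel and moving right.
    const start = { x: 4, y: 300 };
    const end = { x: 180, y: 310 };
    console.log(isBezelInitiated(start, 480, 800), classifyMark(start, end)); // true "right"

Because the gesture is anchored at the screen edge and resolved by coarse direction, it can be performed with little or no visual attention, which is consistent with the advantage the authors report under distraction.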

Discussion
I believe the authors accomplished their goal of understanding how distractions affect the way users prefer to interact with their devices, and I think they did a good job of covering all the bases and exploring a wide range of possibilities.  Their methodology was thorough and sound, and I have nothing to criticize about this paper.
