Wednesday, September 14, 2011

Paper Reading #8: Gesture Search: A tool for fast mobile data access

Reference Information
Gesture Search: A tool for fast mobile data access by Yang Li. Published in UIST '10: Proceedings of the 23rd Annual ACM Symposium on User Interface Software and Technology.


Author Bio: Yang Li is currently a Senior Research Scientist at Google. He previously spent time at the University of Washington as a research associate in computer science and engineering, and he holds a PhD in Computer Science from the Chinese Academy of Sciences.


Summary
Hypothesis
Yang Li presents several individual hypotheses in support of the primary claim that Gesture Search is a superior tool for accessing data on a mobile device. One specific hypothesis is that GUI-oriented touch input should show less variation in its trajectories than gestures do.
Another hypothesis is that Gesture Search provides a quick and less stressful way for users to access mobile data.
Methods
To test the first hypothesis, he collected GUI events on touch-screen devices and compared them against gesture data. Specifically, he asked participants to perform a set of GUI interaction tasks on an Android phone, instructing them to interact just as they normally would.
For the second hypothesis, he made Gesture Search available for download through the company's internal website and asked Android users to try it out and provide feedback after using it for a while. This was not a controlled laboratory study, and users were encouraged to use Gesture Search in whatever capacity fit their daily lives.
Results
From the first study, the results were as expected: GUI touch input had trajectories with much less variation than gestures. The one exception was GUI manipulations such as scrolling, flicking, and panning, which produced larger bounding boxes than most other touch input.
The second study provided data for over 5,000 search sessions and showed that 84% of searches involved only one gesture, while 98% involved two gestures or fewer.
Contents
Yang Li introduces an application called Gesture Search, which is designed to give users easier and faster access to items on their mobile device by reading gestures drawn on the screen.  In particular, he focuses on its application in areas such as searching for a contact or tool.  The application is meant to recognize gestures that take the form of letters and search the device for information matching them.  He notes that there is some ambiguity in distinguishing gestures from the taps and commands associated with preexisting GUI controls, particularly those that involve any sort of dragging motion.  He addresses this problem with two techniques: allowing a short time window to determine whether a tap is actually part of a larger letter, and a test of the general area and shape of each stroke.
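Since the summary above only names these two techniques, here is a minimal sketch, in Android-style Java, of how such a disambiguation heuristic might look. The class name, thresholds, and the exact bounding-box rule are my own assumptions for illustration; the paper's actual recognizer is more involved.

```java
// A sketch of the two disambiguation techniques described above:
// a grace period for ambiguous taps, plus a bounding-box shape test.
// All names and thresholds here are illustrative assumptions, not
// values taken from the paper.
public class StrokeClassifier {

    enum StrokeKind { PENDING_TAP, GUI_MANIPULATION, GESTURE }

    // Hypothetical thresholds (the paper does not publish these).
    private static final float TAP_SIZE_PX = 20f;    // taps stay inside a tiny box
    private static final float GESTURE_MIN_PX = 60f; // letters spread in BOTH dimensions

    /**
     * Classify a completed stroke from the width and height of its
     * axis-aligned bounding box.
     */
    StrokeKind classify(float width, float height) {
        if (width < TAP_SIZE_PX && height < TAP_SIZE_PX) {
            // Could be a GUI tap, or a small mark like the dot of an "i".
            // Hold it briefly; dispatch it to the GUI only if no gesture
            // strokes follow within the grace window.
            return StrokeKind.PENDING_TAP;
        }
        if (width > GESTURE_MIN_PX && height > GESTURE_MIN_PX) {
            // A stroke that is large in both dimensions looks like a
            // handwritten letter, so send it to the search engine.
            return StrokeKind.GESTURE;
        }
        // Long, thin trajectory: treat it as scrolling, flicking, or
        // panning and pass it through to the underlying GUI.
        return StrokeKind.GUI_MANIPULATION;
    }
}
```

The intuition behind the shape test is that scrolls and flicks trace long, thin trajectories, while letter gestures spread out in both dimensions; the grace period keeps a tiny stroke from being sent to the GUI before the system knows whether more strokes of a letter are coming.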
Having developed an application that performed well in lab testing, he deployed it to a number of Android users and studied their interactions.  He also collected feedback after a period of time to determine how well it was received.  Overall, user reactions were positive.


Discussion
I would like to start by saying that I don't have a "smart" phone, such as an Android or iPhone, so I am not particularly well qualified to comment on the potential usefulness of this application.  Having said that, while I think he did a good job developing and presenting his product, I also feel that he is trying to solve a problem that isn't really a problem.  From what I have seen, the user interfaces and controls on these phones are already quite intuitive and easy to use, and it might be more work for users to remember to use the gesture search in the first place.  One point in the study that I didn't like was that data from users who barely used the application at all was thrown out; I think it would have been better if he had at least made a point of finding out why they didn't use it.  Perhaps in the future this sort of "gesture searching" technology could be combined with the swipe keyboard idea from the previous paper for an entirely new, completely finger-squiggle-driven interface.
