References: "Sensing Foot Gestures from the Pocket" by Jeremy Scott, David Dearman, Koji Yatani, and Khai N. Truong. Published in UIST '10: Proceedings of the 23rd Annual ACM Symposium on User Interface Software and Technology.
Author Bios: Jeremy Scott received his B.Sc., M.Sc., and Ph.D. in Pharmacology & Toxicology from the University of Western Ontario. Dr. Scott is currently part of the Faculty of Medicine at the University of Toronto. David Dearman is a PhD student at the University of Toronto in the Department of Computer Science; his research bridges HCI, ubiquitous computing, and mobile computing. Koji Yatani is a PhD candidate at the University of Toronto under Professor Khai N. Truong. He is interested in HCI and ubiquitous computing, with an emphasis on hardware and sensing technology. Khai N. Truong is an assistant professor of computer science at the University of Toronto. He holds a Bachelor of Science degree in Computer Engineering from the School of Electrical and Computer Engineering at the Georgia Institute of Technology.
Summary
Hypothesis
Foot motion can be used as an effective alternative to traditional hand motions for computer input.
Methods
The authors first studied how people use their feet in selection tasks, to determine which motions are easiest and most effective for users. Participants performed a target selection task with their foot using four motions: dorsiflexion, plantar flexion, heel rotation, and toe rotation. Six motion-capture cameras logged the movements of the foot, and participants used a wireless mouse to indicate the start and end of each selection and to respond to prompts from the experiment software.
Results
Participants selected targets more quickly when they were located near the center of the range of motion than when they were at its outer edge. In every motion except dorsiflexion, participants tended to overshoot small angular targets more than large ones; in the dorsiflexion trials, by contrast, they tended to significantly undershoot targets. Participants identified heel rotation as the most comfortable motion.
Contents
The paper begins with an overview of the authors' goal: to find out whether feet can really serve as an acceptable and useful alternative to hands. It then describes the experiment used to determine a comfortable range of motion. After summarizing the results above, the authors discuss limitations such as distinguishing between deliberate and accidental motions. In future work they plan to implement a classifier on a mobile device, build a real-time foot gesture recognition system, and examine the performance of foot gesture recognition and the acceptability of these gestures in more naturalistic settings.
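To make the recognition idea concrete, here is a rough sketch of what such a pipeline might look like. This is only an illustration under my own assumptions (windowed accelerometer data, simple statistical features, and an off-the-shelf nearest-neighbor classifier); the paper does not describe its actual features or classifier in these terms.

# Illustrative only: windowed accelerometer features + a generic classifier.
# The gesture labels match the motions studied in the paper; everything else
# (window size, features, classifier choice) is my own assumption.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def extract_features(window):
    """Summarize one window of (N, 3) accelerometer samples as a feature vector."""
    mean = window.mean(axis=0)           # average acceleration per axis
    std = window.std(axis=0)             # variability per axis
    energy = (window ** 2).sum(axis=0)   # signal energy per axis
    return np.concatenate([mean, std, energy])

# Hypothetical training data standing in for recorded gesture windows.
rng = np.random.default_rng(0)
labels = ["dorsiflexion", "plantar_flexion", "heel_rotation", "toe_rotation"]
train_windows = [rng.normal(loc=i, size=(50, 3))
                 for i, _ in enumerate(labels) for _ in range(20)]
train_labels = [lbl for lbl in labels for _ in range(20)]

X = np.array([extract_features(w) for w in train_windows])
clf = KNeighborsClassifier(n_neighbors=3).fit(X, train_labels)

# At runtime, each new window of sensor samples is classified the same way.
new_window = rng.normal(loc=2, size=(50, 3))
print(clf.predict([extract_features(new_window)]))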
Discussion
This was an intriguing topic, and I appreciate the out-of-the-box thinking. The authors did a good job of studying how people use their feet and determining how feasible this kind of physical motion really is. On the other hand, I think something like this would be difficult to market, for two reasons. First, people are already accustomed to using their hands, and it can be very hard to get them to accept a new input method. Second, this method of input offers little that cannot already be done by hand, except perhaps for people with special needs.