New Study Shows How Hand Gestures Can Control Our Computers

While touchscreen devices have extended the range of interfaces available to us, much of the way we interact with our digital devices has remained the same for years. Research from the University of Waterloo aims to change that with a system capable of recognizing hand gestures to control our computers.

“It started with a simple idea about new ways to use a webcam,” the researchers explain. “The webcam is pointed at your face, but most of the interaction happening on a computer is around your hands. So, we thought, what could we do if the webcam could pick up hand gestures?”

Gesture-based controls

To do this, the team developed a small attachment that redirects the webcam towards the user’s hands. They then built software capable of recognizing distinct hand gestures in a variety of conditions, training it with the help of a machine learning system.
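
The article doesn’t describe the researchers’ software itself, but the pipeline it outlines (a webcam redirected at the hands, feeding frames to a gesture classifier that triggers computer commands) can be sketched roughly as follows. This is a minimal illustration only, assuming OpenCV for frame capture; the `classify_gesture` stub and the gesture-to-action mapping are hypothetical stand-ins, not part of the Typealike project.

```python
import cv2

# Hypothetical stand-in for the trained gesture classifier; the real system
# uses a neural network trained on examples collected from many volunteers.
def classify_gesture(frame) -> str:
    # A real implementation would run the frame through the trained network.
    return "none"

# Illustrative mapping from recognized gestures to computer commands.
GESTURE_ACTIONS = {
    "swipe_left": "previous_tab",
    "swipe_right": "next_tab",
    "fist": "mute_audio",
}

def run_gesture_loop(camera_index: int = 0) -> None:
    """Read frames from the (redirected) webcam and act on recognized gestures."""
    cap = cv2.VideoCapture(camera_index)
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            gesture = classify_gesture(frame)
            action = GESTURE_ACTIONS.get(gesture)
            if action is not None:
                print(f"Gesture '{gesture}' -> action '{action}'")
            cv2.imshow("redirected webcam", frame)
            if cv2.waitKey(1) & 0xFF == ord("q"):  # press 'q' to quit
                break
    finally:
        cap.release()
        cv2.destroyAllWindows()

if __name__ == "__main__":
    run_gesture_loop()
```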

“It’s a neural network, so you need to show the algorithm examples of what you’re trying to detect,” the researchers continue. “Some people will make gestures a little bit differently, and hands vary in size, so you have to collect a lot of data from different people with different lighting conditions.”
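
The researchers’ training code isn’t published here, but the idea they describe, showing a network labeled examples of each gesture gathered from many people under varied lighting, looks roughly like the sketch below. It assumes PyTorch and an ImageFolder-style dataset of labeled gesture photos; the folder path, network architecture, and augmentation settings are illustrative assumptions, not details from the study.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

# Augmentations approximate the variation the researchers mention:
# different hands, slightly different gestures, different lighting.
train_transforms = transforms.Compose([
    transforms.Resize((96, 96)),
    transforms.ColorJitter(brightness=0.4, contrast=0.4),          # lighting variation
    transforms.RandomAffine(degrees=10, translate=(0.05, 0.05)),   # hand/gesture variation
    transforms.ToTensor(),
])

# Hypothetical dataset layout: gestures/<gesture_name>/<image>.jpg
dataset = datasets.ImageFolder("gestures/", transform=train_transforms)
loader = DataLoader(dataset, batch_size=32, shuffle=True)

# A small convolutional classifier; the actual network used in the study is not specified.
model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(32 * 24 * 24, len(dataset.classes)),
)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Standard supervised training loop: show the network labeled gesture examples.
for epoch in range(10):
    for images, labels in loader:
        logits = model(images)
        loss = loss_fn(logits, labels)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
    print(f"epoch {epoch}: loss {loss.item():.3f}")
```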

A database of hand gestures was collated with the help of dozens of volunteers, who were also surveyed to help inform the ultimate design and functionality of the program.

“We’re always setting out to make things people can easily use,” the researchers explain. “People look at something like Typealike, or other new tech in the field of human-computer interaction, and they say it just makes sense. That’s what we want. We want to make technology that’s intuitive and straightforward, but sometimes doing that takes a lot of complex research and sophisticated software.”
