One of the projects on display at the Microsoft Research TechFest event in Redmond this week is an upcoming addition to the Kinect for Windows software development kit that gives the sensor the ability to recognize whether a person’s hand is open or closed. Developers will be able to use the new SDK feature to add capabilities to their Kinect for Windows apps, such as a mid-air “mouse click” for selecting content on screen.
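In practice, a feature like this would reduce to watching for transitions in a per-hand state value. The sketch below is a hypothetical illustration in Python, not the actual Kinect for Windows API: `get_hand_state` and `on_click` are assumed stand-ins for whatever the updated SDK ultimately exposes.

```python
# Hypothetical sketch: turning open/closed hand state into a "mid-air click".
# get_hand_state() and on_click() are assumptions for illustration only,
# not functions from the actual Kinect for Windows SDK.

def run_click_loop(get_hand_state, on_click):
    """Fire on_click() each time the tracked hand transitions open -> closed."""
    previous = "open"
    while True:
        state = get_hand_state()  # returns "open" or "closed" for the tracked hand
        if previous == "open" and state == "closed":
            on_click()  # treat the closing gesture as a mouse-button press
        previous = state
```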

What’s interesting is that the Kinect sensor alone wouldn’t normally be able to recognize such fine-grained changes. Microsoft researchers used large amounts of training data and machine learning to teach the system to tell when a person’s hand is open or closed.
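As a rough illustration of this kind of approach (not Microsoft’s actual system), one could train an off-the-shelf classifier on labeled depth-image crops of hands. The sketch below uses scikit-learn; the dataset files named here are hypothetical placeholders.

```python
# Minimal sketch of the general idea, assuming a labeled dataset of
# depth-image hand crops. This is illustrative, not Microsoft's method.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# X: flattened depth crops (n_samples, n_pixels); y: 0 = open, 1 = closed.
# The .npy filenames are hypothetical stand-ins for a real training set.
X = np.load("hand_depth_crops.npy")
y = np.load("hand_labels.npy")

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)
clf = RandomForestClassifier(n_estimators=100)
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```

The point of the example is that with enough labeled examples, a generic classifier can recover a distinction (open vs. closed fist) that the raw sensor resolution doesn’t directly expose.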

“The resolution of the sensor is such that even just recognizing open and closed fist is beyond the capabilities of the sensor,” explained Peter Lee, managing director of Microsoft Research’s Redmond lab. “However, with enough data, what we’re finding is that you can learn and get more than good enough approximations.”

He continued, “People will find usefulness out of open and closed hands. What’s more important is that underneath this is a machine learning system that can now, as a general tool, be trained for more and more gestures. Exactly how far we can go, relative to the limitations of the sensor, we don’t know. But we’re already seeing that we can go further than the sensor was designed to go.”

Previously on GeekWire: “Microsoft’s new vision: A big screen on every wall, and a Kinect sensor in every bezel”
