A team at Microsoft Research and Carnegie Mellon University has an idea. The researchers have unveiled a project in the works that uses a Kinect-like camera to transform any surface into a touchscreen.
The proof-of-concept prototype — called OmniTouch — is a wearable camera/projection system that "allows the wearer to use their hands, arms and legs as graphical, interactive surfaces," according to a write-up on the Microsoft Research website.
"Today’s mobile computers provide omnipresent access to information ... It is undeniable that they have forever changed the way we work, play and interact," the team writes in the OmniTouch research paper. "However, mobile interaction is far from solved. Diminutive screens and buttons mar the user experience, and otherwise prevent us from realizing their full potential."
Their solution: The OmniTouch device combines a small, laser-based projector and a depth-sensing camera. The camera tracks the movements of the user's fingers and works on a principle similar to Microsoft's motion-sensing Kinect controller for the Xbox 360. In fact, their original prototype used a Kinect camera. But the PrimeSense camera they settled on has been customized to work at short range.
"We wanted to capitalize on the tremendous surface area the real world provides," writes Hrvoje Benko, a researcher in the Natural Interaction Research group at Microsoft. "The surface area of one hand alone exceeds that of typical smart phones. Tables are an order of magnitude larger than a tablet computer. If we could appropriate these ad hoc surfaces in an on-demand way, we could deliver all of the benefits of mobility while expanding the user’s interactive capability."
To see how OmniTouch transforms everyday surfaces into interactive screens, check out the video: