Microsoft turns walls, desks and hands into interactive displays

Mobile computing takes a leap forward as a Microsoft Research project paves the way for users to access device interfaces on any surface: walls, desks, notepads and even your own hand. Researchers have combined a pico projector and a Microsoft Kinect sensor into a shoulder-worn device. A video shows a researcher dialling a telephone number on a wall and navigating a map in the palm of his hand.

Microsoft Research joined forces with Carnegie Mellon University to create the depth-sensing and projection system, which works on multiple surfaces without calibration. The interaction possibilities are numerous: the depth sensor can track gestures in 3D space, or the user can manipulate a display projected on a surface in 2D using traditional touch commands. The system also supports simultaneous multitouch.

At the moment the shoulder-worn system is ridiculously bulky, but Hrvoje Benko, a researcher at Microsoft Research who worked on the project, is confident the prototype can be refined into a small, manageable device.

This is the third iteration of a project that started out under the name Skinput. In October 2011 Microsoft released a detailed video showing how multiple surfaces can be used for different tasks, calling the system OmniTouch. The research body now seems to have ditched the OmniTouch name and is simply calling the system the Wearable Multitouch Projector.

Whatever you call the system, the most recent announcement hints at making it an extension of a Windows Phone device, which probably means Microsoft are starting to think about commercialising the product. However, before that can happen the shoulder-worn unit will need to become smaller and more manageable.