Microsoft gesture tracking looks to the hand
Microsoft researchers have revealed a gesture-based control system called Handpose, built for a future where people can easily and intuitively interact with virtual reality. The team is working to create a precise and accurate system that uses a minimal amount of processing power, and will present its findings at academic research conferences this summer.
Because the hand is so complex in the range of gestures and shapes it can make, precise hand tracking is hard to achieve.
The researchers have combined a number of new methods with an algorithm that dates back to the 1940s to deliver a system that allows a user wearing a virtual reality headset to perform actions such as poking a stuffed bunny, turning a knob, moving dials and typing on a keyboard.
In a blog post, Jamie Shotton, a principal researcher in computer vision at Microsoft’s Cambridge, UK, research lab, said: “How do we interact with things in the real world? Well, we pick them up, we touch them with our fingers, we manipulate them. We should be able to do exactly the same thing with virtual objects. We should be able to reach out and touch them.”