Camera tracks hand movements in VR

Researchers at Purdue University, Indiana have introduced a system that can track physical hand movements in virtual reality (VR) using a depth-sensing camera.

With the visual side of VR maturing and audio catching up thanks to the advent of VR microphones, the missing piece of the experience is still accurate tracking of body movements. Hoping to offer a new solution to the market is Purdue University, which has developed a system called ‘DeepHand’ that captures hand movements with depth-sensing cameras.

The Leap Motion-style system works from a deep-learning "convolutional neural network" trained to understand 2.5 million possible hand positions and display them in the virtual world.

The camera reads the position and angle of different points of the hands and runs them through a specialised algorithm, which scans a hand pose database to locate the best match and represents it in VR.
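The article does not describe the matching step in detail, but the idea of scanning a pose database for the closest match to a camera reading can be sketched as a nearest-neighbour search. Everything below (the function name, the four-angle pose vectors, the toy database) is illustrative, not DeepHand's actual implementation:

```python
import math

def best_match(reading, pose_db):
    """Return the index of the database pose closest to the camera reading."""
    def dist(pose):
        # Euclidean distance between the reading and a stored pose vector
        return math.sqrt(sum((p - r) ** 2 for p, r in zip(pose, reading)))
    return min(range(len(pose_db)), key=lambda i: dist(pose_db[i]))

# Toy database: three poses, each described by four joint angles (radians).
pose_db = [
    [0.0, 0.1, 0.2, 0.3],  # open hand
    [1.2, 1.3, 1.1, 1.0],  # fist
    [0.0, 1.2, 1.3, 1.1],  # pointing
]
reading = [1.1, 1.25, 1.05, 0.95]
print(best_match(reading, pose_db))  # closest entry: 1 (fist)
```

In the real system the database holds millions of entries, so a brute-force scan like this would be replaced by an indexed lookup; the principle is the same.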

The program is also able to predict the configurations hands are most likely to adopt next, having been programmed to identify the "spatial nearest neighbours" within the database for flow of movement. Based on the orientation of adjacent areas, it is also able to recognise and display parts of the hand the camera cannot see.
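One way to picture the "spatial nearest neighbours" idea is that a partially observed pose (with some joints hidden from the camera) can be completed by averaging the nearest database poses on the joints that are visible. This is a hedged sketch under that assumption; the function, the use of `None` for occluded joints, and the toy data are all made up for illustration:

```python
def fill_occluded(partial, pose_db, k=2):
    """Fill None (occluded) entries of a pose from its k nearest database poses."""
    visible = [i for i, v in enumerate(partial) if v is not None]
    def dist(pose):
        # compare only on the joints the camera can actually see
        return sum((pose[i] - partial[i]) ** 2 for i in visible)
    neighbours = sorted(pose_db, key=dist)[:k]  # k spatially nearest poses
    return [v if v is not None else sum(n[i] for n in neighbours) / k
            for i, v in enumerate(partial)]

pose_db = [
    [0.0, 0.1, 0.2, 0.3],  # open hand
    [1.2, 1.3, 1.1, 1.0],  # fist
    [0.0, 1.2, 1.3, 1.1],  # pointing
]
# Second joint angle is occluded; estimate it from the two nearest poses.
print(fill_occluded([1.1, None, 1.05, 0.95], pose_db))
```

Tracking which neighbours a pose sits between also hints at where the hand is heading next, which is the "flow of movement" the article describes.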

"We identify key angles in the hand, and we look at how these angles change, and these configurations are represented by a set of numbers," says doctorate student and paper author, Ayan Sinha.

The research team say that, once this processing is complete, the system can run on a standard computer.

Source: Purdue University