3Gear, a technology start-up in San Francisco, has developed an SDK (software development kit) that uses two 3D cameras taking top-down images to accurately track the movements of a user's hand, allowing for much more precise gesture-control applications. The current demo system uses a pair of Kinect cameras mounted above a workstation on a metal frame, but it is anticipated that in future they could be integrated into a computer display or mounted on it.
The SDK is being released in the hope that third-party developers will extend the capabilities of the company's hand-tracking algorithms, which are an essential part of accurate gesture control.
3Gear's system uses two depth cameras (the same type used with Kinect) that capture 30 frames per second. The positions of a user's hands and fingers are matched to a database of 30,000 potential hand and finger configurations. The process of identifying and matching to the database, a well-known approach in the gesture-recognition field, occurs within 33 milliseconds, co-founder Robert Wang says, so it feels as if the computer can see and respond to even a millimeter of finger movement almost instantly.
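The matching step can be pictured as a nearest-neighbor lookup: each incoming depth frame is reduced to a descriptor and compared against the precomputed database of hand configurations, with the closest entry taken as the current pose. The sketch below is a minimal illustration of that idea only; the 30,000-pose database size and the roughly 33-millisecond frame budget come from the article, while the feature representation, array sizes, and brute-force distance search are assumptions for illustration, not 3Gear's actual algorithm.

```python
import numpy as np

N_POSES = 30_000       # size of the precomputed hand-configuration database (from the article)
FEATURE_DIM = 64       # assumed length of a per-frame hand descriptor (illustrative)

# Stand-in database: one feature vector per known hand configuration.
# In a real system these would be descriptors computed offline from sample poses.
pose_database = np.random.rand(N_POSES, FEATURE_DIM).astype(np.float32)


def match_hand_pose(frame_features: np.ndarray) -> int:
    """Return the index of the database pose closest to the observed frame.

    `frame_features` is an assumed descriptor extracted from the two depth
    cameras' current frame (for example, a downsampled depth patch).
    """
    # Euclidean distance from the observed frame to every stored configuration,
    # then pick the nearest one (brute force, purely for illustration).
    distances = np.linalg.norm(pose_database - frame_features, axis=1)
    return int(np.argmin(distances))


# Simulate one frame arriving at 30 fps: the lookup must finish well inside
# the ~33 ms per-frame budget mentioned in the article.
observed = np.random.rand(FEATURE_DIM).astype(np.float32)
best_match = match_hand_pose(observed)
print(f"Closest hand configuration: index {best_match}")
```

A production system would use a faster index (for example, a spatial search structure) rather than scanning all 30,000 entries per frame, but the sketch conveys why precomputing the configuration database keeps per-frame work small enough to stay within the frame budget.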