Two universities in the US have developed a wearable system that uses thermal sensors to predict hand and finger positions. Called FingerTrak, the wristband has potential applications in both VR and robotics.
Developed by Cornell University and the University of Wisconsin-Madison, FingerTrak is a small form-fitting wristband that enables continuous 3D finger tracking and hand pose estimation with four miniature thermal cameras.
According to the research report, published in full here, FingerTrak explores the feasibility of continuously reconstructing the entire hand posture (the positions of 20 finger joints) without needing to see all of the fingers.
"We demonstrate that our system is able to estimate the entire hand posture by observing only the outline of the hand, i.e., hand silhouettes from the wrist using low-resolution (32 x 24) thermal cameras. A customised deep neural network is developed to learn to "stitch" these multi-view images and estimate 20 joints positions in 3D space. Our user study with 11 participants shows that the system can achieve an average angular error of 6.46° when tested under the same background, and 8.06° when tested under a different background. FingerTrak also shows encouraging results with the re-mounting of the device and has the potential to reconstruct some of the complicated poses."