MIT uses Kinect for real-time mapping in rescue applications
So far the Kinect has allowed you to be a Jedi, participate in video conferences and explore a virtual Nissan car, but now the geniuses at MIT have turned it to a much less frivolous purpose – real-time mapping of environments to aid search and rescue operations. By combining the Kinect with a laser range finder and a laptop, and with funding from the US Air Force and the Office of Naval Research, the team has built a wearable device that performs SLAM (Simultaneous Localization and Mapping).
The on-board laser range finder scans the building in a 270-degree arc, and the range data it collects is combined with depth and visual data gathered by the Kinect, then sent to the laptop, which builds the map in real time. Because the rig is worn by a person rather than mounted on a robot, an inertial sensor is needed to account for the wearer's gait. And because the system can detect the user's motion, it can create multi-floor maps when it senses activity on a staircase or in an elevator.
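To make the mapping step concrete, here is a toy sketch (my own illustration, not MIT's actual code) of how a single 270-degree laser scan from an estimated pose can be ray-cast into a 2D occupancy grid – cells along each beam are marked free and the beam's endpoint is marked as an obstacle. The grid size, cell resolution and function names are all assumptions for the example.

```python
import math

GRID_SIZE = 100          # toy map: 100 x 100 cells
CELL_M = 0.1             # each cell covers 10 cm

def update_grid(grid, pose, ranges, fov_deg=270.0, max_range=10.0):
    """Fuse one laser scan into the grid.

    pose   = (x, y, heading_rad) estimated position of the wearer
    ranges = one distance reading per beam, in metres
    """
    x, y, heading = pose
    n = len(ranges)
    for i, r in enumerate(ranges):
        if r <= 0 or r > max_range:
            continue  # discard invalid or out-of-range returns
        # spread the beams evenly across the 270-degree field of view
        angle = heading + math.radians(-fov_deg / 2 + fov_deg * i / (n - 1))
        # walk along the beam, marking the cells it passes through as free
        steps = int(round(r / CELL_M))
        for s in range(steps):
            cx = int(round((x + s * CELL_M * math.cos(angle)) / CELL_M))
            cy = int(round((y + s * CELL_M * math.sin(angle)) / CELL_M))
            if 0 <= cx < GRID_SIZE and 0 <= cy < GRID_SIZE:
                grid[cy][cx] = 0          # free space
        # the beam's endpoint is where it hit something
        ex = int(round((x + r * math.cos(angle)) / CELL_M))
        ey = int(round((y + r * math.sin(angle)) / CELL_M))
        if 0 <= ex < GRID_SIZE and 0 <= ey < GRID_SIZE:
            grid[ey][ex] = 1              # occupied
    return grid
```

A real SLAM system would also correct the pose estimate against the map on every scan (that is the "localization" half of the problem); this sketch only shows the mapping half.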
The system is also capable of recognising whether you have previously visited a location, based on its dimensions and characteristics. The prototype uses a laptop, but the final system is envisaged as a handheld device.
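A revisit check along those lines could be sketched like this (a deliberately simplified illustration of the idea, not MIT's algorithm): keep a signature per visited place – here just assumed room dimensions and a feature count – and flag a match when a new place's signature falls within tolerance.

```python
def make_signature(width_m, depth_m, n_features):
    """Summarise a place by its dimensions and how many visual features it had."""
    return (width_m, depth_m, n_features)

def seen_before(signature, visited, dim_tol=0.5, feat_tol=5):
    """Return True if any previously stored place matches within tolerance."""
    w, d, f = signature
    for vw, vd, vf in visited:
        if (abs(w - vw) <= dim_tol and
                abs(d - vd) <= dim_tol and
                abs(f - vf) <= feat_tol):
            return True
    return False
```

Recognising a revisited place is what lets a SLAM system close loops and correct the accumulated drift in its map.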
Neat, but I wonder if they’ve considered some form of connectivity that would allow the data to be sent back to home base, with the environment recreated in some kind of CAVE-style virtual environment so a support team could advise and assist rescuers based on the same real-time information.