Two fog-screen projected-display prototypes that allow interaction with three-dimensional images have been developed at the City University of Hong Kong. The systems support mixed-reality settings in which physical objects coexist and interact with the 3D imagery in real time, with a depth camera deployed for object recognition and hand tracking.
City University researchers Bin Chen, Yaozhun Huang and Miu-Ling Lam say they have applied projection mapping techniques to a non-planar, reconfigurable fog screen.
Two prototype units were developed. The first uses a calibrated projector and fog emitter modules mounted on four linear motion platforms, which travel in the direction of the projector’s principal axis. The movement of each platform is synchronised with the image content.
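The synchronisation idea can be illustrated with a minimal sketch. This is a hypothetical illustration, not the authors' implementation: it assumes a triangle-wave sweep along the projector axis and a 3D scene pre-sliced into depth layers, so the frame projected at any instant matches the platform's current depth.

```python
# Hypothetical sketch: keep the projected depth slice in step with a
# platform sweeping along the projector's principal (z) axis.
# All names, ranges and the sweep shape are assumptions for illustration.
from dataclasses import dataclass

@dataclass
class Platform:
    z_min: float  # nearest position to the projector (metres, assumed)
    z_max: float  # farthest position (metres, assumed)

    def depth_at(self, phase: float) -> float:
        """Platform depth for a normalised sweep phase in [0, 1)."""
        # Triangle-wave sweep: outward for phase < 0.5, back afterwards.
        tri = 2 * phase if phase < 0.5 else 2 * (1 - phase)
        return self.z_min + tri * (self.z_max - self.z_min)

def slice_index(depth: float, platform: Platform, n_slices: int) -> int:
    """Pick which depth slice of the 3D content to project at this depth."""
    frac = (depth - platform.z_min) / (platform.z_max - platform.z_min)
    return min(int(frac * n_slices), n_slices - 1)

platform = Platform(z_min=0.5, z_max=1.5)
# At quarter sweep the platform sits halfway out, so the middle slice shows.
depth = platform.depth_at(0.25)
slice_no = slice_index(depth, platform, 10)
```

In a real system the phase would come from the motion controller's encoder feedback rather than a clock, so rendering stays locked to the physical screen position.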
In the second, a 2D array of fog emitters is individually switchable, with the switching pattern controlled by a microcontroller and custom electronics. Columns of laminar fog form a non-planar screen which scatters light projected onto it.
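A switching pattern of this kind can be sketched as choosing, for each column of the grid, the emitter whose position best matches a desired screen profile. The function and profile below are assumptions for illustration, not the researchers' control code.

```python
# Hypothetical sketch: pick one active emitter per column of a 2D emitter
# grid so the resulting fog columns trace a non-planar screen profile.
import math

def switching_pattern(width: int, depth: int, profile) -> list[int]:
    """For each of `width` columns, return the emitter row (depth index)
    closest to profile(x), where x and the profile value are in [0, 1]."""
    rows = []
    for x in range(width):
        target = profile(x / max(width - 1, 1))  # desired depth, 0..1
        row = min(depth - 1, max(0, round(target * (depth - 1))))
        rows.append(row)
    return rows

# Example: a shallow sine-curved screen across 8 columns and 4 depth rows.
pattern = switching_pattern(8, 4, lambda x: 0.5 + 0.5 * math.sin(math.pi * x))
```

The returned list could then be written to the driver electronics (for instance via shift registers) to open the selected emitters, reshaping the screen on the fly.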
The researchers say the systems could find uses in a variety of applications including computer-aided design, architectural and landscape planning, training, simulation, telepresence, remote operation, scientific visualisation, medical imaging and creative art.