An augmented reality user interface developed by a group of researchers in the MIT Media Lab has brought a whole new dimension to the control of everyday objects. The Fluid Interfaces Group has been exploring a new way of using a smart phone or tablet to interact with things like the dials on a radio, or a light switch, by associating them with virtual objects.
At a time when tablets are increasingly being deployed for control in AV installations, this new technology suggests that the range of devices that could come under this control may be nearly limitless.
The interface for these ‘Smarter Objects’ can then be developed far beyond the ‘on-off’ switch of a simple electronic device. The AR interface offers an easy means of modifying not only the interface but also the behavior of the physical object and its interactions with other Smarter Objects. Making an ordinary device potentially ‘smart’ requires only a small processor and a Wi-Fi connection to a server.
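As a rough illustration of that requirement, the sketch below shows how an object with a small processor might report the state of a physical control to a server over Wi-Fi. The server address, object identifier, and JSON payload are assumptions made for illustration, not the Media Lab's actual protocol.

```python
# Hypothetical sketch: a "smarter object" (here, a radio knob) with a small
# processor reports its state to a server over Wi-Fi. The URL, object ID and
# JSON schema are illustrative assumptions only.
import json
import urllib.request

SERVER_URL = "http://192.168.1.10:8000/objects"  # assumed local server address
OBJECT_ID = "radio-01"                           # assumed object identifier

def report_state(knob_position: float) -> None:
    """Send the current knob position (0.0-1.0) to the server as JSON."""
    payload = json.dumps({"id": OBJECT_ID, "knob": knob_position}).encode("utf-8")
    request = urllib.request.Request(
        SERVER_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request, timeout=2) as response:
        response.read()  # server acknowledges the update

if __name__ == "__main__":
    report_state(0.75)  # e.g. the physical knob is currently at 75%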
As the user points their smart phone or tablet at a physical object, the AR application recognises the object and offers an intuitive graphical interface for programming the object's behavior and its interactions with other objects. Once reprogrammed, the Smarter Object can then be operated with a simple tangible interface such as its physical knobs and dials.
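One plausible way to picture that reprogramming step is a behavior profile held on the server, which the AR app edits and the object's physical controls then drive. The profile structure and function names below are assumptions for illustration only.

```python
# Illustrative sketch: the AR app rewrites a behavior profile on the server,
# and raw readings from the object's knob are interpreted through that profile.
from typing import Dict

# Behavior profiles keyed by object ID; the AR app edits these entries.
behavior_profiles: Dict[str, Dict] = {
    "radio-01": {
        "knob_controls": "volume",        # could be remapped to "station" or "playlist"
        "linked_outputs": ["speaker-living-room"],
    }
}

def handle_knob_update(object_id: str, knob_position: float) -> Dict:
    """Translate a raw knob reading into an action using the current profile."""
    profile = behavior_profiles[object_id]
    return {
        "action": profile["knob_controls"],
        "value": knob_position,
        "targets": profile["linked_outputs"],
    }

# Example: after the user links the radio to a kitchen speaker in AR,
# the same physical knob drives the new configuration.
behavior_profiles["radio-01"]["linked_outputs"].append("speaker-kitchen")
print(handle_knob_update("radio-01", 0.4))
```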
In the case of a radio, the interface could allow the user to create playlists and easily link speakers to it.