Pattie Maes’s Fluid Interfaces Group at the MIT Media Lab has a prototype ‘wearable gestural interface’ that looks most interesting!
Although the miniaturization of computing devices allows us to carry computers in our pockets, keeping us continually connected to the digital world, there is no link between our digital devices and our interactions with the physical world. Information is traditionally confined to paper, or digitally to a screen. SixthSense bridges this gap, bringing intangible, digital information out into the tangible world and allowing us to interact with this information via natural hand gestures. ‘SixthSense’ frees information from its confines by seamlessly integrating it with reality, thus making the entire world your computer.
The SixthSense system consists of a digital camera, a portable projector and some colour markers. The projector projects an image onto any surface (preferably flat) directly in front of the user. The camera watches the scene and streams its images to a mobile computing device in the user’s pocket, which tracks the colour markers (typically placed on the user’s fingertips) and drives the projector. In this way the gestures traced out by the user are analysed, and the appropriate action is taken or image displayed.
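To get a feel for the marker-tracking step, here is a minimal sketch of how fingertip markers might be located in a video frame by colour thresholding and centroid computation. This is purely illustrative; the actual SixthSense tracking algorithm hasn’t been published, and the function, tolerance value and synthetic test frame below are all my own assumptions.

```python
import numpy as np


def track_marker(frame, target_rgb, tol=30):
    """Return the (row, col) centroid of pixels close to target_rgb,
    or None if no such pixels exist.

    frame: H x W x 3 uint8 RGB image.
    A crude stand-in for SixthSense's colour-marker tracking
    (hypothetical -- the real algorithm is not described in the post).
    """
    # Sum of absolute per-channel differences from the target colour
    diff = np.abs(frame.astype(int) - np.array(target_rgb)).sum(axis=2)
    mask = diff < tol
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    return (rows.mean(), cols.mean())


# Synthetic 100x100 black frame with a small red "marker" blob
frame = np.zeros((100, 100, 3), dtype=np.uint8)
frame[18:23, 28:33] = (255, 0, 0)
print(track_marker(frame, (255, 0, 0)))  # centroid near row 20, col 30
```

In a real system each fingertip would carry a distinctly coloured cap, and the tracker would run once per colour per frame, feeding the resulting fingertip trajectories to the gesture recogniser.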
For example, the SixthSense system implements a gestural camera that takes a photo of the scene the user is looking at when it detects the ‘framing’ gesture. The user can then walk up to any surface or wall and flick through the photos he/she has taken.
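A framing gesture could plausibly be recognised by checking whether the four tracked fingertips (two thumbs, two index fingers) sit near the corners of a sufficiently large rectangle. The heuristic below is a hypothetical sketch of that idea, not the actual SixthSense classifier; the thresholds are invented for illustration.

```python
def detect_framing(points, min_size=50, corner_tol=20):
    """points: four (x, y) fingertip positions.

    Returns the framed region (x0, y0, x1, y1) if each fingertip lies
    near a corner of its bounding box and the box is big enough to be a
    deliberate frame; otherwise None. A hypothetical heuristic.
    """
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    x0, x1, y0, y1 = min(xs), max(xs), min(ys), max(ys)
    # Reject tiny boxes -- likely noise rather than a framing pose
    if x1 - x0 < min_size or y1 - y0 < min_size:
        return None
    corners = [(x0, y0), (x0, y1), (x1, y0), (x1, y1)]
    # Every fingertip must sit close to some corner of the box
    for px, py in points:
        if not any(abs(px - cx) <= corner_tol and abs(py - cy) <= corner_tol
                   for cx, cy in corners):
            return None
    return (x0, y0, x1, y1)


# Fingertips roughly at four corners: a framing gesture
print(detect_framing([(0, 0), (0, 100), (100, 0), (100, 100)]))
# → (0, 0, 100, 100)
```

When the gesture is detected, the system would crop the camera’s view to the returned region and save it as the photo.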
Check out this really interesting video for an overview of the system, as well as some application examples:
And you should also check out the website of Pranav Mistry (the grad student working with Pattie Maes) for a longer description, as well as more pictures and video of the system in action.
SixthSense – Pranav Mistry [via Core77]