The wunderkinds at MIT’s Media Lab (Fluid Interfaces Group) have developed a gesture-controlled wearable computing device that feeds you relevant information and turns any surface into an interactive display.
Called the Sixth Sense, the gadget relies on hand gestures and object recognition to call up virtual gadgets and Web-based information, in a way that conjures up the movie Minority Report.
The team built the $350 Sixth Sense prototype from off-the-shelf components: a simple webcam and a battery-powered portable projector with a small mirror, fashioned into a pendant-style necklace that communicates with a cell phone.
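For readers who want a feel for how a webcam-and-projector rig like this might track hand movements, here is a minimal sketch in Python with OpenCV. It assumes the wearer's fingertips carry brightly colored markers that the camera can isolate by color, a common shortcut in prototypes of this kind; the color thresholds and function names are illustrative stand-ins, not the team's actual implementation.

```python
# Minimal fingertip-tracking sketch (illustrative only, not the Sixth Sense code).
# Assumes the wearer's index fingers and thumbs carry brightly colored markers
# that a webcam can isolate by color.
import cv2
import numpy as np

# Hypothetical HSV range for a red fingertip marker.
MARKER_LOW = np.array([0, 120, 120])
MARKER_HIGH = np.array([10, 255, 255])

def find_marker_centers(frame):
    """Return (x, y) centers of colored fingertip markers in one camera frame."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, MARKER_LOW, MARKER_HIGH)
    # OpenCV 4.x return signature: (contours, hierarchy)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    centers = []
    for c in contours:
        if cv2.contourArea(c) < 50:      # ignore tiny noise blobs
            continue
        x, y, w, h = cv2.boundingRect(c)
        centers.append((x + w // 2, y + h // 2))
    return centers

cap = cv2.VideoCapture(0)                # the pendant's webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break
    fingertips = find_marker_centers(frame)
    # A real system would feed these positions to a gesture recognizer
    # and drive the projector; here we just highlight what the camera sees.
    for (x, y) in fingertips:
        cv2.circle(frame, (x, y), 8, (0, 255, 0), 2)
    cv2.imshow("fingertips", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```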
When might Sixth Sense hit retail shelves? There’s no release date, and Pattie Maes, the MIT associate professor who founded the school’s Fluid Interfaces Group, calls it “very much a work in progress.” (Perfecting the image recognition, for example, is an ongoing challenge.) Still, the MIT team says it could be made available today in a limited form.
Developed by Maes and MIT grad student Pranav Mistry (whom Maes describes as the genius behind the gadget), with help from other MIT students, Sixth Sense aims to integrate online information and tech more seamlessly into everyday life. By supplying information for decision-making beyond what our five senses give us access to, it effectively gives users a sixth sense, Maes says.
Plus, it just looks fun to use.
So just what can you do with the Sixth Sense? Here’s a sampling (a rough code sketch of how such gesture-to-action mappings might be wired up follows the list):
Make a call. You can use the Sixth Sense to project a keypad onto your hand, then use that virtual keypad to make a call.
Call up a map. With the map application you can call up the map of your choosing, project it onto a nearby surface, and then use your thumbs and index fingers to navigate it: zooming in and out, for example, or performing other controls.
Take pictures. If you fashion your index fingers and thumbs into a square (the typical “framing” gesture), the system will snap a photo. After taking the desired number of photos, you can project them onto a surface, then use gestures to sort through, organize, and resize them.
Create multimedia reading experiences. Sixth Sense can be programmed to project related videos onto newspaper articles you are reading.
Call up e-mail. By drawing an @ symbol in the air, you can call up and use e-mail.
Get flight updates. The system will recognize your boarding pass and let you know whether your flight is on time and if the gate has changed.
Check the time. Who needs a Rolex? Mistry says that with Sixth Sense, all you have to do is draw a circle on your wrist to get a virtual watch that gives you the correct time.
Get product information. Maes says Sixth Sense uses image recognition or marker technology to recognize products you pick up, then feeds you information on those products. For example, if you’re trying to shop “green” and are looking for paper towels with the least amount of bleach in them, the system will scan the product you pick up off the shelf and give you guidance on whether this product is a good choice for you. Similarly, if you pick up a book, the system can project Amazon ratings on that book, as well as reviews and other relevant information.
Feed you information on people. The team says Sixth Sense is also capable of “a more controversial use”: projecting relevant information about the people standing with you, such as what they do, where they work, and so on.
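As promised above, here is a rough sketch of how gesture-to-action mappings like these might be wired together in software. The gesture labels and handler functions are hypothetical placeholders rather than the team's code; they simply show the dispatch idea.

```python
# Illustrative gesture-to-action dispatch table (hypothetical names, not Sixth Sense code).
# A recognizer would emit a gesture label; the table maps it to an application action.

def snap_photo():       # "framing" gesture: index fingers and thumbs form a rectangle
    print("Capturing photo...")

def show_keypad():      # open palm: project a phone keypad onto the hand
    print("Projecting keypad onto palm...")

def open_email():       # "@" symbol drawn in the air
    print("Opening e-mail...")

def show_watch():       # circle drawn on the wrist
    print("Projecting a watch face on the wrist...")

GESTURE_ACTIONS = {
    "framing": snap_photo,
    "open_palm": show_keypad,
    "at_sign": open_email,
    "wrist_circle": show_watch,
}

def handle_gesture(label):
    """Dispatch one recognized gesture label to its application action."""
    action = GESTURE_ACTIONS.get(label)
    if action is not None:
        action()

# Example: the recognizer reports that the wearer drew an "@" in the air.
handle_gesture("at_sign")
```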
Mistry sees huge potential for Sixth Sense, including gaming applications that let gamers take their fun outside and interact more fully with the physical world. He’s also excited about its potential to translate a deaf person’s sign language into audio, and about other ways it could enhance human abilities.