Thursday, July 7, 2011

MIT Develops Electronic Glasses That Read Emotions

“What is the one ability you wish you possessed?” a friend once asked me. “I wish I could read people’s minds,” I replied. Well, it seems the MIT Media Lab has brought me a little closer to my wish. With the new electronic spectacles they have developed, you can’t read anybody’s mind, but you can definitely read their emotions. These “social X-ray specs” have a built-in camera connected to software that analyzes facial expressions and reveals the corresponding emotion. This, I believe, is all set to revolutionize the way we interact with each other.

With this, I know what you feel!

These glasses were developed by Rosalind Picard, Rana el Kaliouby and Simon Baron-Cohen. Initially, the research was aimed at helping autistic individuals, who may lack the social instincts needed to read a person’s emotions during conversations.

The prototype glasses are built with a rice-grain-size camera cabled to a computer the size of a deck of cards. The camera tracks 24 feature points on the subject’s face. The attached computer then analyzes the resulting micro-expressions, gauging how often they appear and for how long, and compares them against a database of similar expressions performed by actors and labeled by volunteers. The glasses relay a summarized version of this emotional information to the wearer through an earpiece, indicating the subject’s emotional state, such as confused or in disagreement. A built-in light signals agreement or disagreement by turning green or red.
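To make that pipeline a little more concrete, here is a minimal toy sketch in Python of the flow described above: capture feature points, match each frame against a labeled reference set, then summarize the result as a spoken cue plus a green/red light. Every name, value and the nearest-prototype matching below is an illustrative assumption on my part, not the actual MIT/Affectiva implementation.

# Toy sketch of the described pipeline, NOT the MIT/Affectiva code.
# The reference vectors, thresholds and matching rule are illustrative
# assumptions; only the "24 feature points" figure comes from the article.

import random
from collections import Counter

NUM_FEATURE_POINTS = 24  # the article says 24 facial feature points are tracked

# Hypothetical reference database: expression label -> prototype feature vector
# (in the real system, built from actor-performed, volunteer-labeled expressions).
REFERENCE_EXPRESSIONS = {
    "agreeing":    [0.8] * NUM_FEATURE_POINTS,
    "disagreeing": [0.2] * NUM_FEATURE_POINTS,
    "confused":    [0.5] * NUM_FEATURE_POINTS,
}

def capture_feature_points():
    """Stand-in for the camera: one frame of 24 noisy feature-point values."""
    return [random.random() for _ in range(NUM_FEATURE_POINTS)]

def classify_frame(points):
    """Match a frame to the closest reference expression (nearest prototype)."""
    def distance(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(REFERENCE_EXPRESSIONS,
               key=lambda label: distance(points, REFERENCE_EXPRESSIONS[label]))

def summarize(frames):
    """Count how often each micro-expression appears and report the dominant one."""
    counts = Counter(classify_frame(f) for f in frames)
    label, _ = counts.most_common(1)[0]
    light = "green" if label == "agreeing" else "red" if label == "disagreeing" else "off"
    return label, light

if __name__ == "__main__":
    frames = [capture_feature_points() for _ in range(30)]  # roughly a second of video
    state, light = summarize(frames)
    print(f"Earpiece: subject appears {state}; indicator light: {light}")

In the real device the "earpiece" line would be spoken audio and the light would be the green/red indicator mentioned above; the sketch just prints both to show how the per-frame classifications get condensed into a single cue for the wearer.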

Picard and el Kaliouby also found that the average person could correctly interpret expressions only 54 percent of the time, whereas the glasses identified them correctly 64 percent of the time. So far, the device has been very effective at getting autistic individuals more involved in conversations. “They approached people and tested out new facial expressions on themselves and tried to elicit facial expressions from other people,” Picard says. Eventually, she thinks the system could be integrated into a pair of augmented-reality glasses, which would overlay computer graphics onto the scene in front of the wearer. The glasses are effective, but Picard says they are not foolproof: they can be tricked, though doing so requires a lot of concentration.

This technology is currently in use at Affectiva, a company co-founded by Picard and el Kaliouby, to provide deeper market testing for advertisements and movies. In the meantime, another researcher, Mohammed Hoque, has been fine-tuning the algorithms to detect subtle differences between expressions, such as smiles of delight versus smiles of frustration, along with 10 different types of Japanese smiles.
