London: Computers that could read our faces may not be far from reality.
Scientists at the Massachusetts Institute of Technology's Media Lab have developed software that can read the feelings behind facial expressions, New Scientist reported.
The software could lead to empathetic devices and is being used to evaluate and develop better adverts.
It's getting so good that it could collate millions of people's reactions to an event as they sit watching it at home, potentially replacing opinion polls, influencing elections and perhaps fuelling revolutions.
“I feel like this technology can enable us to give everybody a non-verbal voice, leverage the power of the crowd,” said Rana el Kaliouby, a member of the Media Lab's Affective Computing group.
The program, called MindReader and developed by el Kaliouby and her colleagues, can interpret expressions on the basis of a few seconds of video.
The software tracks 22 points around the mouth, eyes and nose, and notes the texture, colour, shape and movement of facial features.
The researchers used machine-learning techniques to train the software to tell the difference between happiness and sadness, boredom and interest, disgust and contempt.
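For readers curious what landmark-based expression classification looks like in practice, here is a minimal illustrative sketch, not the MIT team's MindReader code: it assumes 22 tracked facial points flattened into a feature vector and trains a generic classifier on placeholder data.

```python
# Illustrative sketch only: a generic landmark-based expression classifier,
# not the MIT Media Lab's actual software. Feature layout, labels and data
# are assumptions made for demonstration.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

# Each sample: 22 tracked facial points (x, y) flattened into 44 features,
# mirroring the article's description of points around the mouth, eyes and nose.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 44))       # placeholder landmark coordinates
y = rng.integers(0, 2, size=200)     # 0 = "joyful smile", 1 = "frustrated smile"

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# A support vector machine is one common choice for classifying
# small, fixed-length feature vectors like these.
clf = SVC(kernel="rbf").fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```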
In tests, the software proved to be better than humans at telling joyful smiles from frustrated smiles. A commercial version of the system, called Affdex, is now being used to test adverts.
ANI