Washington: Scientists have for the first time used brain scans to identify emotions such as happiness, anger, sadness or even envy that a person may be experiencing.
The study combines functional magnetic resonance imaging (fMRI) and machine learning to measure brain signals and accurately read emotions in individuals.
The findings, from Carnegie Mellon University researchers, illustrate how the brain categorises feelings, giving researchers the first reliable process for analysing emotions.
"This research introduces a new method with potential to identify emotions without relying on people's ability to self-report," said Karim Kassam, assistant professor of social and decision sciences and lead author of the study.
"It could be used to assess an individual's emotional response to almost any kind of stimulus, for example, a flag, a brand name or a political candidate," said Kassam.
For the study, 10 drama actors were scanned while viewing words naming nine emotions: anger, disgust, envy, fear, happiness, lust, pride, sadness and shame.
While inside the fMRI scanner, the actors were instructed to enter each of these emotional states multiple times, in random order.
The computer model, built by statistically analysing the fMRI activation patterns gathered for 18 emotional words, learned the emotion patterns from these self-induced emotions.
It was then able to correctly identify the emotional content of photos being viewed, using only the brain activity of the viewers.
To identify emotions within the brain, the researchers first used the participants' neural activation patterns from early scans to identify the emotions the same participants experienced in later scans.
The team then applied the machine-learning analysis of the self-induced emotions to predict which emotion the subjects were experiencing when they were shown disgusting photographs.
The computer model achieved a rank accuracy of 0.91, meaning the correct emotion was, on average, ranked very near the top of the model's ordered guesses. With nine emotions to choose from, the model listed disgust as the most likely emotion 60 per cent of the time and as one of its top two guesses 80 per cent of the time.
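Rank accuracy of this kind can be sketched in a few lines. This is a minimal illustration assuming the common normalized-rank definition (1.0 when the correct emotion is ranked first, 0.0 when it is ranked last); the scores below are hypothetical model outputs, not data from the study:

```python
import numpy as np

def rank_accuracy(scores, true_idx):
    """Normalized rank of the true label among n candidates:
    1.0 = ranked first, 0.0 = ranked last."""
    n = len(scores)
    # Position of the true label when scores are sorted high to low
    rank = list(np.argsort(-np.asarray(scores))).index(true_idx)
    return 1.0 - rank / (n - 1)

# The nine candidate emotions named in the study
emotions = ["anger", "disgust", "envy", "fear", "happiness",
            "lust", "pride", "sadness", "shame"]
# Hypothetical classifier scores for one trial
scores = [0.1, 0.9, 0.05, 0.2, 0.1, 0.0, 0.3, 0.4, 0.2]
print(rank_accuracy(scores, emotions.index("disgust")))  # disgust ranked first -> 1.0
```

Averaging this quantity over all test trials gives the overall figure reported, so 0.91 indicates the correct emotion typically sat at or near the top of the nine-way ranking.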
Finally, they applied a machine-learning analysis of neural activation patterns from all but one of the participants to predict the emotions experienced by the held-out participant.
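This leave-one-subject-out procedure can be sketched as follows. The data layout and the nearest-centroid classifier here are illustrative assumptions standing in for the study's actual model; the point is only the cross-validation structure, where one participant's scans are withheld from training and predicted from everyone else's patterns:

```python
import numpy as np

def loso_predict(X_by_subject, y_by_subject, held_out):
    """Leave-one-subject-out prediction with a nearest-centroid classifier.

    X_by_subject: dict mapping subject -> (n_trials, n_features) activation patterns
    y_by_subject: dict mapping subject -> (n_trials,) emotion labels
    held_out:     the subject excluded from training and used as the test set
    """
    # Pool training data from every subject except the held-out one
    train_X = np.vstack([X for s, X in X_by_subject.items() if s != held_out])
    train_y = np.concatenate([y for s, y in y_by_subject.items() if s != held_out])

    # One mean activation pattern (centroid) per emotion label
    centroids = {l: train_X[train_y == l].mean(axis=0) for l in np.unique(train_y)}

    # Label each held-out pattern with its nearest centroid
    return np.array([min(centroids, key=lambda l: np.linalg.norm(x - centroids[l]))
                     for x in X_by_subject[held_out]])
```

Repeating this with each participant as the held-out subject, and scoring the predictions, tests how far emotion encoding generalises across people.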
"Despite manifest differences between people's psychology, different people tend to neurally encode emotions in remarkably similar ways," noted Amanda Markey, a graduate student in the Department of Social and Decision Sciences.
A surprising finding from the research was that the computer model achieved nearly equivalent accuracy even when it used activation patterns from only one of several subsections of the brain.