This novel brain imaging technology can read minds and identify complex thoughts
The study indicates that the mind's building blocks for constructing complex thoughts are formed by the brain's various sub-systems and are not word-based.
New York: A team of US researchers has developed a novel brain imaging technology that can "read minds" and identify complex thoughts with 87 per cent accuracy, with the help of a machine learning algorithm.
The new technology can tell us what types of thoughts are being contemplated by measuring the activation in each brain system.
Marcel Just, Professor at Carnegie Mellon University, Pennsylvania, said, "We have finally developed a way to see thoughts of such complexity in the fMRI signal. The discovery of this correspondence between thoughts and brain activation patterns tells us what the thoughts are built of."
For the study, the team included seven participants and used a computational model to assess how the brain activation patterns for 239 sentences corresponded to the neurally plausible semantic features that characterised each sentence.
The programme was then able to decode the features of the 240th, left-out sentence. The researchers repeated this, leaving out each of the 240 sentences in turn, in a procedure called cross-validation.
The model was able to predict the features of the left-out sentence with 87 per cent accuracy, despite never having been exposed to its activation before.
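The leave-one-out procedure described above can be sketched in code. The following is a minimal illustration only, assuming synthetic data and a simple least-squares decoder; the study's actual fMRI data, semantic features, and model are not reproduced here, and all names and values in this sketch are hypothetical.

```python
# A minimal sketch of leave-one-out cross-validation for decoding.
# All data here are synthetic stand-ins, not the study's fMRI data:
# each "sentence" gets a hypothetical semantic feature vector and a
# simulated activation pattern that is a noisy linear image of it.
import numpy as np

rng = np.random.default_rng(0)

n_sentences = 240           # one activation pattern per sentence
n_features = 5              # hypothetical semantic features per sentence
n_voxels = 50               # hypothetical activation dimensions

features = rng.normal(size=(n_sentences, n_features))
mixing = rng.normal(size=(n_features, n_voxels))
activations = features @ mixing + 0.1 * rng.normal(size=(n_sentences, n_voxels))

def decode(train_X, train_y, test_x):
    """Least-squares decoder mapping activation -> semantic features."""
    W, *_ = np.linalg.lstsq(train_X, train_y, rcond=None)
    return test_x @ W

correct = 0
for i in range(n_sentences):
    mask = np.arange(n_sentences) != i      # leave sentence i out
    pred = decode(activations[mask], features[mask], activations[i])
    # Identification: match the predicted features to the nearest
    # true feature vector among all 240 sentences.
    dists = np.linalg.norm(features - pred, axis=1)
    if dists.argmin() == i:
        correct += 1

accuracy = correct / n_sentences
print(f"leave-one-out identification accuracy: {accuracy:.2f}")
```

Because each sentence is scored by a model that never saw its activation during training, the resulting accuracy reflects genuine generalisation rather than memorisation, which is the point of the cross-validation design the article describes.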
"Our method overcomes the unfortunate property of fMRI to smear together the signals emanating from brain events that occur close together in time, like the reading of two successive words in a sentence," Just explained.
The professor added, "This advance makes it possible for the first time to decode thoughts containing several concepts. That's what most human thoughts are composed of."
The study was published in the journal Human Brain Mapping.
(With IANS inputs)