Scientists create brain imaging system that predicts what a person is reading

A team of neuroscientists at the University of California, Berkeley has developed a brain imaging system that helps identify a person’s inner thoughts and brain narratives while they read stories. According to the experts, when a person reads, specific emotional and cognitive parts of the brain are activated.

Furthermore, the scientists explain that whether people are reading classics or listening to audiobooks or podcasts, they process the same semantic information, i.e. word meaning, which is also an important source of a person’s inner thoughts and narratives.

For the study, the participants listened to stories from the popular podcast series ‘The Moth Radio Hour’ and later read the same stories. Using functional MRI, the team scanned the participants’ brains under both the reading and listening conditions and compared the listening-versus-reading brain activity data. This allowed them to create maps for both datasets, which turned out to be virtually identical.
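To give a sense of what comparing such maps can look like, here is a minimal Python sketch using made-up data; the variable names, map sizes and correlation measure are illustrative assumptions, not the study’s actual pipeline:

```python
import numpy as np

# Hypothetical voxelwise semantic maps, one value per cortical voxel,
# estimated separately from the listening and reading sessions.
rng = np.random.default_rng(1)
listening_map = rng.standard_normal(1000)
reading_map = listening_map + 0.1 * rng.standard_normal(1000)  # nearly identical

# Pearson correlation as a simple similarity score between the two maps;
# a value near 1 means the maps agree voxel for voxel.
r = np.corrcoef(listening_map, reading_map)[0, 1]
print(f"listening-vs-reading map similarity: r = {r:.3f}")
```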

Reports suggest that nine volunteers spent hours listening to and then reading stories from ‘The Moth Radio Hour’ inside functional MRI scanners while the researchers measured their cerebral blood flow.

The brain activity data from both the listening and reading conditions were then fed into a computer program that scored words based on their relationships to one another. Using statistical modeling, the researchers arranged thousands of words on the maps according to their semantic relationships. With the help of the maps, which covered at least a third of the cerebral cortex, the researchers could tell which words would activate which parts of the brain.
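The study’s exact modeling pipeline is not detailed here, but voxelwise semantic maps of this kind are commonly estimated with regularized linear regression from word features to brain responses. The Python sketch below illustrates that general idea on synthetic data; the feature dimensions, array shapes and regularization setting are assumptions for illustration only:

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

# Hypothetical stand-ins for the study's data: each fMRI time point is
# paired with a semantic feature vector summarizing the words heard or
# read, and with the measured response of many cortical voxels.
n_timepoints, n_features, n_voxels = 600, 50, 1000
word_features = rng.standard_normal((n_timepoints, n_features))   # stimulus
voxel_responses = rng.standard_normal((n_timepoints, n_voxels))   # fMRI data

# Ridge regression learns, for every voxel, a weight per semantic
# feature; the weights describe which kinds of word meaning drive
# that voxel, yielding one semantic map per condition.
model = Ridge(alpha=1.0)
model.fit(word_features, voxel_responses)

weights = model.coef_   # shape: (n_voxels, n_features)
print(weights.shape)
```

Fitting one such model on the listening data and another on the reading data would give two sets of voxel weights that can then be compared, as in the earlier sketch.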

Surprisingly, the scientists found virtually no difference between the semantic information processed by readers and listeners.

The results of the study could thus be viewed on a 3D, color-coded map. Reports suggest the words could be grouped into categories such as numeric, violent, mental, tactile, locational, visual, emotional and social, presented in the form of vibrant butterflies on flattened cortices. According to the source, the interactive 3D brain viewer will be available online this month.

Moreover, the maps could also be used to compare language processing between healthy people and those affected by epilepsy, stroke and brain injuries that impair speech. Studying such differences could aid recovery efforts, according to Fatma Deniz, a postdoctoral researcher in neuroscience in the Gallant Lab at UC Berkeley.

“If, in the future, we find that the dyslexic brain has rich semantic language representation when listening to an audiobook or other recording, that could bring more audio materials into the classroom,” added Deniz.

The findings of the study will be published in the Journal of Neuroscience.