When people hear a sound, there’s electrical activity in the brain’s temporal lobe.
Researchers at the University of California, Berkeley have decoded those electrical signals, using the recorded brain activity to predict the words a person heard.
It’s a technique that could help thousands of people whose speech has been impaired by stroke or paralysis. In theory, even imagined conversations could be reconstructed by monitoring brain activity.
“There is some evidence that hearing the sound and imagining the sound activate similar areas of the brain,” said Brian N. Pasley, a post-doctoral researcher. “If you can understand the relationship well enough between the brain recordings and sound, you could either synthesize the actual sound a person is thinking, or just write out the words with a type of interface device.”
Pasley worked with 15 neurosurgical patients, recording the brain activity picked up by implanted electrodes as they heard 5 to 10 minutes of conversation. He then used those data to reconstruct and play back the sounds the patients had heard.
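The core idea behind this kind of reconstruction is a decoding model: fit a mapping from recorded neural activity to features of the sound (such as a spectrogram), then invert that mapping on new recordings. The sketch below illustrates the general approach with synthetic data and a ridge-regularized linear decoder; all sizes, names, and parameters are illustrative assumptions, not details from the Berkeley study.

```python
import numpy as np

rng = np.random.default_rng(0)

n_samples = 2000     # time points of training data (synthetic)
n_electrodes = 64    # simulated recording channels
n_freq_bands = 16    # spectrogram frequency bands to reconstruct

# A true (unknown) linear mapping from neural activity to the spectrogram,
# used here only to generate synthetic data.
W_true = rng.normal(size=(n_electrodes, n_freq_bands))

X = rng.normal(size=(n_samples, n_electrodes))                     # neural features
Y = X @ W_true + 0.1 * rng.normal(size=(n_samples, n_freq_bands))  # noisy spectrogram

# Fit a ridge-regularized linear decoder on the first half of the data.
lam = 1.0
train = slice(0, n_samples // 2)
test = slice(n_samples // 2, n_samples)
XtX = X[train].T @ X[train] + lam * np.eye(n_electrodes)
W_hat = np.linalg.solve(XtX, X[train].T @ Y[train])

# Reconstruct the held-out spectrogram and score it by correlation.
Y_pred = X[test] @ W_hat
r = np.corrcoef(Y[test].ravel(), Y_pred.ravel())[0, 1]
print(f"held-out reconstruction correlation: {r:.2f}")
```

In a real experiment, the reconstructed spectrogram would then be converted back into audio; in this toy setting, the held-out correlation simply confirms that the decoder recovered the linear mapping.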
The researchers expect this success with audible sound can be extended to imagined, internal verbalizations, since previous studies have shown that similar brain regions are activated when a person imagines a word and when it is actually spoken.
The findings are reported in the journal PLoS Biology.