News Release 16-050
Scientists map brain's 'thesaurus' to help decode inner thoughts
Neuroimaging reveals detailed semantic maps across human cerebral cortex
April 27, 2016
What if a map of the brain could help us decode people's inner thoughts?
Scientists at the University of California, Berkeley, have taken a step in that direction by building a "semantic atlas" that shows in vivid colors and multiple dimensions how the human brain organizes language. The atlas identifies brain areas that respond to words that have similar meanings.
The findings, published in the journal Nature and funded by the National Science Foundation (NSF), are based on a brain imaging study that recorded neural activity while study volunteers listened to stories from "The Moth Radio Hour." They show that at least one-third of the brain's cerebral cortex -- including areas dedicated to high-level cognition -- is involved in language processing.
Notably, the study found that different people share similar language maps.
"The similarity in semantic topography across different subjects is really surprising," said study lead author Alex Huth, a postdoctoral researcher in neuroscience at UC Berkeley.
When spoken words fail
Detailed maps showing how the brain organizes different words by their meanings could eventually help give voice to those who cannot speak, such as people who have had a stroke, brain damage or motor neuron diseases such as ALS. While mind-reading technology remains far off, charting language organization in the brain brings decoding inner dialogue a step closer to reality, the researchers said.
"This discovery paves the way for brain-machine interfaces that can interpret the meaning of what people want to express," Huth said. "Imagine a brain-machine interface that doesn't just figure out what sounds you want to make, but what you want to say."
For example, clinicians could track the brain activity of patients who have difficulty communicating and then match that data to semantic language maps to determine what their patients are trying to express. Another potential application is a decoder that translates what you say into another language as you speak.
"To be able to map out semantic representations at this level of detail is a stunning accomplishment," said Kenneth Whang, a program director in the NSF Information and Intelligent Systems division. "In addition, they are showing how data-driven computational methods can help us understand the brain at the level of richness and complexity that we associate with human cognitive processes."
Huth and six other native English speakers participated in the experiment, which required volunteers to remain still inside a functional magnetic resonance imaging (fMRI) scanner for hours at a time.
Each study participant's brain blood flow was measured as they listened, with eyes closed and headphones on, to more than two hours of stories from The Moth Radio Hour, a public radio show in which people recount humorous and poignant autobiographical experiences.
The participants' brain imaging data were then matched against time-coded, phonemic transcriptions of the stories. Phonemes are units of sound that distinguish one word from another.
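The matching step described above amounts to grouping the time-stamped words of the transcript by the brain volume during which they were heard. The sketch below illustrates the idea; the word timings, the two-second repetition time (TR), and the function name are illustrative assumptions, not details from the study.

```python
# Illustrative sketch (not the authors' code): aligning a time-coded
# transcript with fMRI volumes acquired once every TR seconds.
TR = 2.0  # assumed repetition time in seconds (one brain volume per TR)

# (word, onset-in-seconds) pairs from a hypothetical time-coded transcript
transcript = [
    ("the", 0.3), ("dog", 0.7), ("ran", 1.1),
    ("across", 2.4), ("the", 2.9), ("street", 3.3),
]

def words_per_volume(transcript, tr):
    """Group words by the index of the fMRI volume during which they occur."""
    volumes = {}
    for word, onset in transcript:
        idx = int(onset // tr)  # volume k covers the interval [k*tr, (k+1)*tr)
        volumes.setdefault(idx, []).append(word)
    return volumes

print(words_per_volume(transcript, TR))
# volume 0 holds words spoken in [0, 2) s; volume 1 holds [2, 4) s
```

Once words are binned this way, each brain volume can be paired with the language the listener heard while it was recorded.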
The researchers then fed that information into a word-embedding algorithm that scored words according to how closely they are related semantically.
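The core idea behind such embedding algorithms is that words appearing in similar contexts get similar vectors, so their closeness can be scored numerically. The sketch below shows one simple version of this idea using co-occurrence counts and cosine similarity; the toy corpus, window size, and function names are invented for illustration and are not the study's actual model.

```python
# Illustrative sketch (assumption, not the study's model): co-occurrence
# word vectors scored with cosine similarity.
from collections import Counter
from math import sqrt

corpus = "a dog ran fast a cat ran fast a truck drove slowly".split()
window = 2  # assumed co-occurrence window size

def embed(word):
    """Count context words within `window` positions of each occurrence."""
    counts = Counter()
    for i, w in enumerate(corpus):
        if w == word:
            lo, hi = max(0, i - window), min(len(corpus), i + window + 1)
            for j in range(lo, hi):
                if j != i:
                    counts[corpus[j]] += 1
    return counts

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[k] * b[k] for k in a if k in b)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# "dog" and "cat" share contexts ("ran fast"), so their similarity
# comes out higher than the similarity between "dog" and "truck".
print(cosine(embed("dog"), embed("cat")))
print(cosine(embed("dog"), embed("truck")))
```

Scores like these, computed for every word in the stories, are the kind of semantic features that can then be related to the recorded brain activity.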
Charting language across the brain
The results were converted into a thesaurus-like map that arranged words on images of the flattened cortices of the left and right hemispheres of the brain. Words were grouped under various headings: visual, tactile, numeric, locational, abstract, temporal, professional, violent, communal, mental, emotional and social.
Not surprisingly, the maps show that many areas of the human brain represent language that describes people and social relations, rather than abstract concepts.
"Our semantic models are good at predicting responses to language in several big swaths of cortex," Huth said. "But we also get the fine-grained information that tells us what kind of information is represented in each brain area. That's why these maps are so exciting and hold so much potential."
Senior author Jack Gallant, a UC Berkeley neuroscientist, said that although the maps are broadly consistent across individuals, "There are also substantial individual differences. We will need to conduct further studies across a larger, more diverse sample of people before we will be able to map these individual differences in detail."
In addition to Huth and Gallant, co-authors of the paper are Wendy de Heer, Frederic Theunissen and Thomas Griffiths, all at UC Berkeley.
This NSF-funded project is an example of how NSF invests in the frontiers of brain research.
Researchers from UC Berkeley aim to systematically map the semantic system of the human brain.
Jack Gallant, University of California, Berkeley, email: email@example.com
Alex Huth, University of California, Berkeley, email: firstname.lastname@example.org
The U.S. National Science Foundation propels the nation forward by advancing fundamental research in all fields of science and engineering. NSF supports research and people by providing facilities, instruments and funding to support their ingenuity and sustain the U.S. as a global leader in research and innovation. With a fiscal year 2021 budget of $8.5 billion, NSF funds reach all 50 states through grants to nearly 2,000 colleges, universities and institutions. Each year, NSF receives more than 40,000 competitive proposals and makes about 11,000 new awards. Those awards include support for cooperative research with industry, Arctic and Antarctic research and operations, and U.S. participation in international scientific efforts.