Language has interested brain investigators for centuries. In the 1860s, Pierre Paul Broca’s post-mortem examinations of stroke patients with language deficits identified an area of the left frontal lobe involved in speech production. At around the same time, Karl Wernicke identified another region located in the temporal lobe that was damaged in stroke patients who had lost the ability to understand spoken language.
In the latter part of the 20th century, the development of brain scanning techniques such as functional magnetic resonance imaging (fMRI) enabled researchers to understand the functions of these language centers in greater detail, divide them further into distinct subregions, and see how they recover from stroke damage. Brain scan studies have also shown that a network of regions in the left frontal and temporal lobes map the form of words to their meanings and put them together to form phrases and sentences. Yet, we still know little about how individual neurons represent the meaning of words.
Now, researchers have created a cellular map of semantic information, revealing how cells in the frontal cortex encode word meanings and how they are organized. Ziv Williams, a neurosurgeon at Massachusetts General Hospital, and his colleagues had the rare opportunity to examine these processes in 10 patients with drug-resistant epilepsy as they were being evaluated for neurosurgery.
With the patients’ consent, the researchers implanted arrays of microelectrodes into an area of the left prefrontal cortex known to encode information about word meanings. They then recorded activity from about 300 individual neurons while the patients listened to various sentences and stories.
The researchers found that many of the neurons responded selectively to related words with specific meanings. For example, some cells fired when the participants heard words describing foods such as carrot, cake, and salad. Others responded to words denoting objects, such as broom, car, and lampshade.
Verbs elicited sweeping changes in activity for the largest number of cells, whereas words describing spatiotemporal relationships — such as up, down, and behind — elicited limited changes. And while most of the neurons fired only in response to words within one category, several responded to words in two. Significantly, the neurons responded weakly to these same meanings when words were presented in a random order, suggesting that the selectivity of their responses is strongly influenced by context.
“One of the surprising findings is that even small focal cortical areas may be potentially able to represent complex meanings largely in their entirety,” Williams told Big Think. “Areas such as the one we recorded may contain broad mixtures of different cells, each responding to different word meanings to provide a rich and detailed representation of the linguistic information communicated through speech.”
In addition to revealing details of how individual neurons encode word meanings, the findings may lend themselves to designing neural prostheses that help paralyzed patients communicate. Currently, such devices read the activity of neurons that control speech production.
“We were able to decode the meaning of words from a relatively small number of cells, suggesting that it may be possible to read out meanings and concepts from the activity patterns of neurons during natural speech processing,” Williams says. “Such a capacity could be useful for the future development of speech prostheses that aim to decode the meaning of words from neuronal activity rather than their motor representations.”
He adds, “We are interested in studying whether similar semantic representations may be observed across languages or in bilingual speakers and whether accessing word meanings in language comprehension and production elicits similar responses.
“We are also interested in determining how meanings are represented in other parts of the brain outside the prefrontal cortex, and how these may be used for the development of speech prostheses.”