Wednesday, January 29, 2014

Speech Uses Both Sides of Brain


Folk wisdom has it that language is a left-brain function; however, new research using direct brain recordings shows that speech is a bilateral function.

Full Citation:
Gregory B. Cogan, Thomas Thesen, Chad Carlson, Werner Doyle, Orrin Devinsky, and Bijan Pesaran. (2014, Jan 15). Sensory-motor transformations for speech occur bilaterally. Nature, advance online publication. DOI: 10.1038/nature12935

Speech uses both sides of brain

Friday 17 January 2014

Many scientists believe we only use one side of our brain for speech and language. Now, a new study from the US shows that as far as speech is concerned, we use both sides.

The study poses a significant challenge to current thinking about brain activity and could have important implications for developing treatment and rehabilitation to help people recover speech after stroke or injury, say the researchers from New York University (NYU) and NYU Langone Medical Center.

Senior author Bijan Pesaran, an associate professor in the Center for Neural Science at NYU, says:
"Our findings upend what has been universally accepted in the scientific community - that we use only one side of our brains for speech. In addition, now that we have a firmer understanding of how speech is generated, our work toward finding remedies for speech afflictions is much better informed."
Speech rarely studied separately from language

As well as listening and speaking, speech involves language for constructing and understanding sentences. As a result, most studies have examined speech and language together and concluded that speech, like language, happens on one side of the brain. Moreover, these studies rely on indirect measurements of brain activity, the researchers explain.

For their study, they took a different approach and examined the link between speech and brain processes directly.

The data for the study came from a group of patients being treated for epilepsy. As part of their treatment, they had electrodes implanted directly inside and on the surface of their brains, and recordings were made while they performed sensory and cognitive tasks.

Co-author Thomas Thesen, director of the NYU ECog Center where the data was collected, and an assistant professor at NYU Langone, says:
"Recordings directly from the human brain are a rare opportunity. As such, they offer unparalleled spatial and temporal resolution over other imaging technologies to help us achieve a better understanding of complex and uniquely human brain functions, such as language."
To record what happens in the brain during speech alone, separate from language, the patients were asked to repeat "non-words" such as "kig" and "pob."
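
The study's central comparison is between task-evoked electrode responses on the two sides of the brain while patients listen and while they speak. Purely as an illustrative sketch of that kind of comparison, the Python snippet below uses synthetic numbers in place of real ECoG recordings; the trial counts, electrode counts, and significance threshold are invented, not values from the study.

```python
# Illustrative only: synthetic numbers stand in for ECoG high-gamma power.
# Trial counts, electrode counts and the alpha level are assumptions,
# not values reported in the study.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

n_trials = 60          # hypothetical number of non-word repetition trials
n_electrodes = 20      # hypothetical electrodes per hemisphere

def simulate(mean):
    """Fake baseline-normalised high-gamma power, one value per trial and electrode."""
    return rng.normal(loc=mean, scale=1.0, size=(n_trials, n_electrodes))

phases = {
    "perception (listen)": {"left": simulate(0.8), "right": simulate(0.7)},
    "production (speak)":  {"left": simulate(0.9), "right": simulate(0.8)},
}

# For each task phase and hemisphere, count electrodes whose mean response
# across trials is reliably above the pre-stimulus baseline (one-sample t-test).
for phase, hemis in phases.items():
    for hemi, power in hemis.items():
        t, p = stats.ttest_1samp(power, popmean=0.0, axis=0)
        n_responsive = int(np.sum((p < 0.05) & (t > 0)))
        print(f"{phase:22s} {hemi:5s}: {n_responsive}/{n_electrodes} responsive electrodes")
```

If comparable numbers of electrodes respond in both hemispheres during both listening and speaking, that is the sort of pattern the researchers describe as a bilateral speech response.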

Study shows speech is a bilateral brain process

The recordings showed that both sides of the brain were involved during speech - suggesting that speech is a "bilateral" brain process, as the researchers conclude in their study report:

"Using a non-word transformation task, we show that bilateral sensory-motor responses can perform transformations between speech-perception- and speech-production-based representations. These results establish a bilateral sublexical speech sensory-motor system."

Prof. Pesaran adds:
"Now that we have greater insights into the connection between the brain and speech, we can begin to develop new ways to aid those trying to regain the ability to speak after a stroke or injuries resulting in brain damage. With this greater understanding of the speech process, we can retool rehabilitation methods in ways that isolate speech recovery and that don't involve language."
Meanwhile, US researchers - who discovered that how accurately we respond to a beat is tied to how effectively our brains respond to speech - suggest in The Journal of Neuroscience that musical training could improve our brains' response to language.

Written by Catharine Paddock PhD

Here is the abstract from Nature:

Sensory–motor transformations for speech occur bilaterally

Gregory B. Cogan, Thomas Thesen, Chad Carlson, Werner Doyle, Orrin Devinsky & Bijan Pesaran 

Historically, the study of speech processing has emphasized a strong link between auditory perceptual input and motor production output [1-4]. A kind of ‘parity’ is essential, as both perception- and production-based representations must form a unified interface to facilitate access to higher-order language processes such as syntax and semantics, believed to be computed in the dominant, typically left hemisphere [5, 6]. Although various theories have been proposed to unite perception and production [2, 7], the underlying neural mechanisms are unclear. Early models of speech and language processing proposed that perceptual processing occurred in the left posterior superior temporal gyrus (Wernicke’s area) and motor production processes occurred in the left inferior frontal gyrus (Broca’s area) [8, 9]. Sensory activity was proposed to link to production activity through connecting fibre tracts, forming the left lateralized speech sensory–motor system [10]. Although recent evidence indicates that speech perception occurs bilaterally [11-13], prevailing models maintain that the speech sensory–motor system is left lateralized [11, 14-18] and facilitates the transformation from sensory-based auditory representations to motor-based production representations [11, 15, 16]. However, evidence for the lateralized computation of sensory–motor speech transformations is indirect and primarily comes from stroke patients that have speech repetition deficits (conduction aphasia) and studies using covert speech and haemodynamic functional imaging [16, 19]. Whether the speech sensory–motor system is lateralized, like higher-order language processes, or bilateral, like speech perception, is controversial. Here we use direct neural recordings in subjects performing sensory–motor tasks involving overt speech production to show that sensory–motor transformations occur bilaterally. We demonstrate that electrodes over bilateral inferior frontal, inferior parietal, superior temporal, premotor and somatosensory cortices exhibit robust sensory–motor neural responses during both perception and production in an overt word-repetition task. Using a non-word transformation task, we show that bilateral sensory–motor responses can perform transformations between speech-perception- and speech-production-based representations. These results establish a bilateral sublexical speech sensory–motor system.
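
The abstract's key claim is that electrodes on both sides carry representations that bridge hearing a non-word and saying it back. One way to make that idea concrete is a cross-phase decoding analysis: fit a classifier on neural features recorded while subjects listen, then ask whether it still identifies the non-word from features recorded while they speak. The sketch below simulates that logic with fake data; the feature dimensions, the non-words beyond "kig" and "pob", and the classifier choice are my assumptions, not the analysis pipeline reported in the Nature paper.

```python
# Conceptual sketch only: a cross-phase decoding analysis of the kind one
# could run on sensory-motor electrodes. Features, extra non-words and the
# classifier are assumptions, not the authors' method.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

non_words = ["kig", "pob", "fem", "tud"]   # "kig" and "pob" are from the article
n_trials_per_word = 30
n_features = 40                            # e.g. high-gamma power per electrode

def simulate_phase(offset):
    """Fake neural features with a weak word-specific pattern shared across phases."""
    X, y = [], []
    for w, word in enumerate(non_words):
        pattern = np.zeros(n_features)
        pattern[w * 10:(w + 1) * 10] = 1.0  # shared word-specific signature
        X.append(pattern + offset + rng.normal(0, 1.0, size=(n_trials_per_word, n_features)))
        y += [word] * n_trials_per_word
    return np.vstack(X), np.array(y)

# Train on perception (listening) epochs, test on production (speaking) epochs.
X_listen, y_listen = simulate_phase(offset=0.0)
X_speak, y_speak = simulate_phase(offset=0.5)

clf = LogisticRegression(max_iter=1000).fit(X_listen, y_listen)
transfer_accuracy = clf.score(X_speak, y_speak)
print(f"perception-to-production decoding accuracy: {transfer_accuracy:.2f} (chance = 0.25)")
```

Above-chance transfer from listening to speaking, obtained separately for left- and right-hemisphere electrodes, would be one signature of the bilateral sensory-motor transformation the study describes.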
