This article is more than five years old. Autism research — and science in general — is constantly evolving, so older articles may contain information or theories that have been reevaluated since their original publication date.
People with autism show overly synchronized activity between brain regions while conversing with others, according to a new imaging study. The unpublished findings, presented yesterday at the 2015 Society for Neuroscience annual meeting in Chicago, suggest that excess simultaneous activity in the brain may make everyday social interactions overwhelming for these individuals.
Many studies have examined brain connectivity in people with autism, and a popular theory holds that altered connections between different brain regions underlie the disorder. However, connectivity studies reflect the physical wiring of the brain. Synchronization, as measured in this study, is a different concept, says Kyle Jasmin, who presented the work. Jasmin is a graduate student at University College London and a fellow in Alex Martin’s lab at the National Institute of Mental Health in Bethesda, Maryland.
During a complex task, “just because two regions are active at the same time doesn’t mean they are necessarily talking to each other,” Jasmin says. Instead, they might represent two independent channels of neural activity.
Scientists have been working to design imaging studies that are more relevant to the way social interactions occur during everyday life. Many studies have scanned the brains of people with autism while they view faces or listen to voices. “But that sort of ignores the main venue where you see these deficits in people with autism — during face-to-face conversation, when people have to interact on the fly,” Jasmin says.
In the new study, the researchers scanned the brains of 19 men and boys with autism and 20 age-matched controls in a functional magnetic resonance imaging (fMRI) scanner.
Via a screen in the scanner, the participants engaged in a series of video chats with the researchers: three six-minute conversations about their work, school and hobbies, and three more six-minute periods during which a researcher and a participant took turns reciting lines from nursery rhymes.
The researchers assessed which regions of the brain were active at the same time during the conversation phase of the scan in the two groups of participants. They found 19 pairs of regions that are more in sync in people with autism than in controls.
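Synchronization between two brain regions is commonly quantified as the correlation between their activity time courses over the scan. The presentation does not specify the exact method used here, so the following is only an illustrative sketch: it assumes a simple Pearson correlation, and the `simulate_pair` helper and its `coupling` parameter are hypothetical stand-ins for real fMRI signals.

```python
import math
import random

def pearson(x, y):
    """Pearson correlation between two equal-length time series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def simulate_pair(coupling, n_timepoints=180, rng=None):
    """Two toy 'region' time series that share a common signal.

    Larger `coupling` means the shared signal dominates the
    region-specific noise, producing higher synchronization.
    """
    rng = rng or random.Random(0)
    shared = [rng.gauss(0, 1) for _ in range(n_timepoints)]
    a = [s * coupling + rng.gauss(0, 1) for s in shared]
    b = [s * coupling + rng.gauss(0, 1) for s in shared]
    return a, b

# Strongly coupled regions show high synchrony; weakly coupled ones do not.
strong = pearson(*simulate_pair(coupling=2.0, rng=random.Random(1)))
weak = pearson(*simulate_pair(coupling=0.2, rng=random.Random(2)))
print(f"strong coupling: r = {strong:.2f}, weak coupling: r = {weak:.2f}")
```

In a study like this one, such a correlation would be computed for every pair of regions in each participant, and the group comparison would ask which pairs correlate more strongly, on average, in one group than in the other.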
Many of these regions are involved in movement and in processing sensory information. For example, in people with autism, activity in the left postcentral gyrus, a sensory and motor region, is over-synchronized with activity in the right precuneus, which is involved in memory and self-reflection. The left postcentral gyrus is also over-synchronized with the right inferior temporal gyrus, another sensory and motor area. “The general pattern is everything is over-synchronized,” Jasmin says.
In fact, there are no regions that are less synchronized with each other in people with autism than in controls, “which was sort of surprising,” says Jasmin.
In a second analysis, the researchers compared brain activity during conversation with activity during nursery rhyme recitation in both groups, in order to zero in on regions that are specifically involved in keeping up the social volley of conversation. They found 14 pairs of these regions that are more synchronized in people with autism than in controls while conversing and, again, none that are less synchronized. “Lots of social brain regions are over-synchronized,” Jasmin says.
The results suggest that people with autism have trouble fine-tuning the various channels of brain activity related to different aspects of a conversation: hearing another person’s words, registering gestures and body language, interpreting thoughts and intentions, and thinking about what to say next. “They either get it all or none,” Jasmin says. “This may be why they find conversation so difficult.”
The researchers are also analyzing eye-tracking data collected during the study. And they plan to relate brain activity to different features of the conversation transcripts, such as choice of words or topics. “This is a really rich dataset,” Jasmin says.