A new way of analyzing the data gathered from electroencephalography (EEG) ― a non-invasive technique that measures brain waves through the scalp ― provides much more information about how brain regions coordinate with one another than standard EEG analysis.
The new approach may be particularly useful for researchers who propose that autism is a consequence of poor temporal coordination among brain regions.
“The which brain areas, the where, has been a powerful focus” of research on autism and other psychiatric diseases, says Scott Kelso, professor of complex systems and brain sciences at Florida Atlantic University. “But if you don’t have a theory of how things are coordinated in time, you’re going to be missing something very essential.”
In 2007, Kelso and colleagues recorded EEG waves from two people in the same room who had been instructed to move one finger up and down. The researchers observed a specific wave frequency ― or ‘neural signature’ ― when the pair’s finger movements were synchronized1. They concluded that social interaction is at least partly encoded by precise temporal patterns in the brain.
Subtle coordination between different brain regions can be difficult to detect with EEG, however, because scientists normally average brain-wave data from many trials and participants across a relatively large time window.
What’s more, once a brain wave leaves its origin, it meets electrical resistance ― from other brain tissue, cerebrospinal fluid and the skull ― before reaching the electrodes at the scalp. The electrodes may therefore be sensing distorted signals or ‘false’ instances of synchrony.
Scientists who use standard EEG analyses “cannot do fine discriminations because they have [included] a lot of things which are unrelated to this pure mechanism of interactions between brain areas,” says Emmanuelle Tognoli, research assistant professor at Florida Atlantic University, who collaborated with Kelso on both studies.
In contrast, Tognoli and Kelso’s new ‘4D colorimetric’ method of EEG analysis, published in the January issue of Progress in Neurobiology2, first uses physical laws to map the signals recorded at the skull back onto brain space, and only then looks for instances of synchrony between brain regions.
Also, rather than averaging waves of the same frequency from many participants, the method looks at how individual brain waves change over time. Finally, the method plots the waves in different colors, making it easier for the researchers to visualize how their frequencies coordinate with one another.
Distinguishing between true and false synchronization is one of the most important problems facing EEG analysis, notes Michael Murias, research assistant professor of psychiatry at the University of Washington in Seattle. Murias has used simpler EEG analyses to study how brain regions interact in people with autism. The new approach is “something that I would be interested in applying myself,” he adds.
In 1949, Canadian psychologist D.O. Hebb proposed a ‘neural cell assembly’ theory of how the brain works. Hebb argued that information does not flow linearly from the outside world to a sensory organ, and then from one particular brain region to another.
Instead, he said, large groups of neurons in different regions process information simultaneously. The brain ultimately completes a task ― whether perceiving an image, moving a limb, or taking part in a social interaction ― by synchronizing messages from specialized brain areas.
Just right: Neither too little (left) nor too much (right) synchronization is optimal for brain performance. Efficient processing needs a subtle blend of integration (areas working together) and segregation (areas avoiding each other’s influence).
With the advent of sophisticated imaging techniques in the 1990s, researchers were able to make maps of these functionally specialized areas. “This job is now almost complete,” says William Penny, senior research fellow at the Functional Imaging Lab at University College London. “The next task of the brain imaging community, and neuroscience in general, is to find out how these areas work together.”
When millions of neurons fire at the same time, the combined electrical activity forms a rippling wave: the voltage rises, then drops, then rises again, and the cycle repeats at a steady rate. Waves are measured by their oscillatory frequency, or how many times they cycle per second.
Scientists can ‘see’ brain waves using EEG, in which dozens or hundreds of electrodes are fitted to the scalp and record the aggregated electrical activity of these millions of neurons. EEG can record changes over fractions of a second, unlike brain imaging methods that require seconds or minutes.
Certain physiological or psychological states produce EEG waves of specific frequencies. For instance, in the early stages of sleep, theta waves (3 to 6 cycles per second) replace alpha waves (8 to 12 cycles per second).
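The idea of sorting waves into bands by their cycles per second can be illustrated with a short sketch. The band boundaries below are the theta (3 to 6 cycles per second) and alpha (8 to 12 cycles per second) ranges quoted above; counting zero crossings is a deliberate simplification of the spectral methods real EEG software would use, and all names here are illustrative.

```python
import math

def dominant_band(samples, sample_rate):
    """Classify a signal as theta or alpha by counting upward zero
    crossings (a crude stand-in for proper spectral analysis)."""
    crossings = sum(
        1 for a, b in zip(samples, samples[1:]) if a < 0 <= b
    )  # each upward zero crossing marks roughly one full cycle
    duration = len(samples) / sample_rate      # recording length in seconds
    freq = crossings / duration                # estimated cycles per second
    if 3 <= freq <= 6:
        return "theta"
    if 8 <= freq <= 12:
        return "alpha"
    return "other"

# A 10-cycle-per-second sine sampled at 250 Hz for 2 seconds
# falls in the alpha band.
rate = 250
signal = [math.sin(2 * math.pi * 10 * t / rate) for t in range(2 * rate)]
print(dominant_band(signal, rate))  # alpha
```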
One of the biggest challenges for scientists is determining whether and when brain waves of different frequencies are signatures of the same cognitive task.
Using EEG electrodes to detect coordination between two brain regions is like using seismographs on the surface of the earth to detect earthquake activity hundreds of feet below. If two seismographs spaced closely together both record a vibration at the same time, then they are probably sensing the same earthquake. But if the seismographs are far apart and record vibrations at the same time, then it is much more likely that they are sensing two different earthquakes.
In the same way, when EEG electrodes spaced far apart on the scalp both record brain waves at the same time, then scientists may reasonably assume that two distinct brain regions are firing at the same time ― and thus, may be synchronized in the same cognitive task.
But when the electrodes are neither close together nor far apart, researchers have much more difficulty in determining coordination. “It’s a very big field of investigation,” says Tognoli.
Adding to the complexity of EEG analysis is the fact that sometimes, depending on the orientation of the neurons relative to the scalp, the brain waves recorded by a specific electrode do not originate from the cortex directly below it.
Rather than analyze the dynamics of the raw sensor data, Tognoli and Kelso have derived complex formulas to project that data back into physical brain space ― a step called ‘source reconstruction’ ― and only then analyze the dynamics of the signals.
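Tognoli and Kelso’s exact formulas are not reproduced here, but the general idea of source reconstruction can be sketched with a generic, standard inverse method (a regularized minimum-norm estimate), assuming a toy ‘forward model’ that says how strongly each underlying source is seen at each scalp sensor. The forward model and numbers below are invented for illustration only.

```python
def minimum_norm_sources(L, x, lam=1e-6):
    """Minimum-norm source estimate s = Lᵀ(LLᵀ + λI)⁻¹x for 2 sensors.
    L: 2×n forward model (gain of each source at each sensor);
    x: the 2 sensor readings. A generic inverse method, not the
    authors' own formulas."""
    # G = L·Lᵀ + λI, the regularized 2×2 sensor-space matrix
    g00 = sum(v * v for v in L[0]) + lam
    g11 = sum(v * v for v in L[1]) + lam
    g01 = sum(p * q for p, q in zip(L[0], L[1]))
    det = g00 * g11 - g01 * g01
    # y = G⁻¹·x via the closed-form 2×2 inverse
    y0 = (g11 * x[0] - g01 * x[1]) / det
    y1 = (g00 * x[1] - g01 * x[0]) / det
    # s = Lᵀ·y maps the sensor data back into source space
    return [L[0][j] * y0 + L[1][j] * y1 for j in range(len(L[0]))]

# Toy forward model: sensor 0 sees mostly source 0, sensor 1 mostly
# source 2, and both see source 1 weakly.
L = [[1.0, 0.3, 0.0],
     [0.0, 0.3, 1.0]]
x = [1.0, 0.0]          # only sensor 0 picks up a strong signal
s = minimum_norm_sources(L, x)
print(max(range(3), key=lambda j: abs(s[j])))  # strongest estimated source: 0
```

The design choice here mirrors the order of operations the article describes: first the sensor data are mapped back onto sources, and any synchrony analysis would run on the reconstructed source signals rather than the raw electrode traces.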
“This should become a standard part of brain imaging analysis over the next few years,” says Penny.
Some researchers have proposed that autism results from specialized parts of the brain working too independently and failing to coordinate with other regions. “Some people have theorized that in autism and schizophrenia, there are not sufficient opportunities for information exchange,” Kelso explains.
This idea is supported by results from a 2007 study by Murias and his colleagues. The researchers measured EEG waves from 18 adults with autism and 18 healthy controls while the participants were in a resting state. Compared with controls, people with autism have reduced brain-wave synchronization between the frontal lobe and the rest of the brain, the study found3.
Some imaging studies have also found that people with autism have larger amounts of white matter ― the tissue connections between neurons ― in their frontal lobe than in other brain regions4.
Picking apart EEG data with the 4D colorimetric method will give a finer temporal picture of the synchronization impairments and perhaps clarify these varied results, the researchers say.
“It’s extremely important for the kinds of questions that are confronting the investigators of autism,” Kelso says. “The ‘autistic brain’, the ‘smart brain’, whatever brain it is you’re talking about, it’s going to have a coordination dynamic.”
The approach must first be validated in a large study. In the meantime, other groups are developing other kinds of sophisticated EEG analysis techniques. “There is no consensus on the right way to do it,” Murias says.