At University Hospital Bonn in Germany over the span of two years, a video camera recorded 14 people with chronic epilepsy, each with 12 electrodes implanted in their brains, as they waited to have a seizure. The goal was to identify the source of the seizures and treat them with surgery.
These people had all agreed to also participate in an unusual experiment to determine how the brain perceives where someone else is looking, a social skill that people with autism often have trouble with. One likely player in this task, the researchers hypothesized, was the brain’s emotion hub, the amygdala, an almond-shaped structure buried deep in the brain.
Inserting eight hair-thin wires through each of the 12 electrodes, neurophysiologist Florian Mormann monitored the activity of more than 900 individual neurons in the participants1. What he found calls into question the longstanding assumption that the amygdala is involved in eye contact.
Some imaging studies have suggested that the amygdala helps an individual perceive the direction of someone’s gaze2. Where someone is looking is valuable information when interpreting their emotions: An angry person who is looking straight at you, for example, may pose more of a threat than one looking elsewhere.
The new results do not support this idea. Mormann and his colleagues did not find any neurons in the amygdala that seem to play a role in detecting eye contact or determining the direction of someone’s gaze. They published the results in the November issue of Nature Neuroscience.
“The finding was quite a surprise,” says Ralph Adolphs, professor of psychology and neuroscience at the California Institute of Technology in Pasadena, a collaborator on the study. “The way we’re thinking about what the amygdala is doing may be fundamentally wrong.”
The study suggests that neurons in the amygdala do respond to faces, but not in a way that differentiates where someone is looking. “It does something much more abstract than what we’re currently thinking about,” Adolphs says.
In a 2013 study, Adolphs and his team recorded neurons firing in the amygdalae of 10 people, including 2 with autism, using implanted electrodes to monitor seizures. The participants looked at pictures of faces that revealed only certain features, such as the mouth or eyes.
In the controls, these neurons responded most strongly to images of eyes. But in the individuals with autism, they seemed primarily attuned to mouths. These findings bolstered the theory that the amygdala helps people recognize eye contact — an important part of social interactions. Some people with autism have trouble making eye contact. Then, in 2014, another team found neurons in the amygdala of monkeys that fire only when another monkey, which appears in a video, looks straight at them3.
In the new study, the researchers sought to pin down the amygdala’s role in gaze perception. They recorded the activity of more than 900 neurons across the 14 individuals, including 223 in the amygdala, as the participants looked at 42 photos of each of five people. The faces in the photos were either looking directly into the camera or had their heads or eyes turned in one of eight directions.
“The method is pretty spectacularly cool,” says Mayada Elsabbagh, assistant professor of psychiatry at McGill University in Montreal, Canada. Several research groups have used single-neuron recordings in animal models, such as rats or non-human primates. But this study is one of the largest amygdala studies in people. “The ability to do it in humans is pretty neat,” Elsabbagh says.
Certain neurons in the amygdala fired only when a participant saw a particular person among the five in the photos, suggesting that these neurons are involved in recognizing faces. But none of the neurons seemed to respond to the direction of the gaze.
Surprised by this result, Mormann visited the participants in their rooms to see whether a live interaction would yield a different result. He sat on their beds and looked at the participants directly, with averted eyes, or with his eyes closed. None of these scenarios provoked increased activity in any of the amygdala neurons.
“I think anybody in the field would have thought, ‘Well if you’re going to use a live person as a stimulus, you’re really going to see a lot of amygdala activity,’ ” says Adolphs. “In fact, we saw the opposite.”
The findings don’t rule out the presence of ‘eye-contact’ neurons in the amygdala because the researchers may simply have missed recording from those neurons. But the results hint that scientists should look elsewhere in the brain for this task — perhaps in the brain’s outer shell, the cerebral cortex. “One thing that people should do is stop being so focused on the amygdala and maybe redirect their attention to cortical regions,” Adolphs says.
One candidate region in the cortex is the superior temporal sulcus, a groove at the side of the head that is known to play a role in social recognition. Peter Thier at Eberhard Karls University in Tübingen, Germany, has used functional magnetic resonance imaging (fMRI) to identify a patch in this region that lights up when an individual follows someone’s eyes.
It makes sense that gaze would be encoded by the cortex, which plays a critical role in decision-making, says Thier, professor of cognitive neurology at the university. Choosing where to look is more likely to involve decision-making than the recognition of emotions, he says.
One caveat of the study is that the researchers used photos of faces with neutral expressions. It’s possible that neurons in the amygdala respond to gaze direction only in faces showing emotions rather than neutral ones, says Nouchine Hadjikhani, associate professor of radiology at Harvard Medical School. “A neutral face is a bizarre thing that no one really understands,” she says.
Adolphs and his team are working on alternatives to the face task that may reflect more natural scenarios. In a study published 22 October in Neuron, they showed 20 people with autism and 19 controls pictures of scenes, such as a soccer game or animals on an African savanna, as eye-tracking technology mapped the focus of their gaze4.
The findings confirmed earlier results suggesting that people with autism are attracted to different aspects of a scene than controls are. For example, the participants with autism tended to look at the center of the picture, even if it held nothing of particular interest. They also looked less at faces than the controls did, and often chose different elements of a scene to focus on.
Seeing the elephant:
The researchers intend to track the activity of individual neurons as people look at these images. They also plan to map brain responses to this task using fMRI. The aim is to find convergent results across the three methods — eye-tracking, neuron recording and brain imaging — and use them to piece together the role of the amygdala, Adolphs says.
Results from individuals with autism may help explain why they are attracted to what some may consider unusual features. “Right now, we’re like a bunch of blind scientists looking at the elephant. We’re all looking at pieces, and from that we’re trying to stitch together a story of what the amygdala might do,” he says.
So far, recording neural activity in people with autism has proven challenging, because it depends on a rare coincidence: an individual with autism who is also a candidate for epilepsy surgery. Instead, Adolphs and his colleagues have been giving all the participants in their neuron-recording studies a questionnaire that rates the level of their autism-like traits. The researchers hope to eventually use these data to link patterns of neuron activity to features of autism.
“It will just take some patience until we have a big enough sample that we can really see something across the dimension [of autism traits],” says Adolphs.