Spectrum: Autism Research News

Automated analyses may improve study of social deficits

5 October 2012

Double vision: Two cameras capture the interaction between the experimenter and the child. Specialized software then detects moments of eye contact (blue) between the two.

Analyzing the results of screening tests for autism, which are used to diagnose the disorder and to assess the success of new therapies, can be a major chore.

Behavioral psychologists — or, more often, their graduate students — spend hours watching taped interactions between the experimenter and the child, laboriously scoring specific behaviors, such as instances of eye contact or pointing gestures.

But that is beginning to change: sophisticated speech and vision tools, which use computer algorithms to analyze behavior, are making it easier to record and interpret these interactions automatically.

Researchers debuted some of these tools last week at the Engineering and Autism conference at the University of Southern California (USC) in Los Angeles.

Surprisingly, the most valuable information is coming from the experimenter rather than the child, whether by having the experimenter wear a recording device or by analyzing the experimenter's behavior during a social interaction.

Lack of eye contact is a common, though variable, feature of autism, and one that researchers often assess in screening tests for the disorder. It may even be an early sign of autism.

“But measuring eye contact in a naturalistic setting is challenging, and we want to avoid putting instruments on the child,” says Yin Li, a graduate student in James Rehg’s lab at the Georgia Institute of Technology in Atlanta.

Double vision:

Li is using special glasses, worn by the experimenter, that automatically detect eye contact. The glasses have two cameras, one facing inward that records the direction of the experimenter’s gaze, and one facing outward that captures high-definition video of the child.

Specialized software then tracks the child’s face, locates his eyes and calculates the direction of his gaze. An algorithm can then detect moments of eye contact between the child and the experimenter.
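Conceptually, once each person's gaze direction has been estimated, detecting eye contact reduces to checking whether the two gaze vectors point at each other within some tolerance. The Python sketch below illustrates that final step only; the function names, the shared coordinate frame and the 10-degree threshold are assumptions for illustration, not details of the Georgia Tech software.

import numpy as np

def gaze_angle_deg(v1, v2):
    """Angle in degrees between two 3-D gaze direction vectors."""
    cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

def detect_eye_contact(exp_gaze, child_gaze, threshold_deg=10.0):
    """Flag frames where two gaze vectors point roughly at each other.

    exp_gaze, child_gaze: (n_frames, 3) arrays of gaze vectors for the
    experimenter and the child, expressed in one shared coordinate
    frame. Eye contact is approximated as the vectors being
    anti-parallel (about 180 degrees apart) within the tolerance.
    """
    contact = []
    for ve, vc in zip(exp_gaze, child_gaze):
        contact.append(abs(gaze_angle_deg(ve, vc) - 180.0) < threshold_deg)
    return np.array(contact)

In a real system, the hard work happens upstream of this check: the inward camera must be calibrated to yield the experimenter's gaze vector, and the child's gaze must be inferred from face and eye detections in the outward camera's video.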

One of the benefits of this approach is that the child doesn’t have to wear any special equipment, making it easier to study children with autism who have sensory sensitivities.

“It is an interesting solution to see what the child is exactly looking at, and that can help monitor joint attention,” says Connie Kasari, professor of education at the University of California, Los Angeles, who was not involved in the research.

Problems with joint attention, the shared focus of two people on an object or event, are a common deficit in children with autism. Kasari says she wonders whether children look more often at people wearing the glasses, which could bias the results.

In a second project, researchers are developing ways to automatically analyze prosody — the stress and intonation patterns in speech.

Daniel Bone, a graduate student in the lab of USC engineering professor Shrikanth Narayanan, said at the conference that this approach may be able to capture dynamic aspects of prosody not measured in standard tests.

The researchers analyzed data from 28 children who had taken the Autism Diagnostic Observation Schedule (ADOS), a diagnostic test for autism. They focused in particular on the part of the test in which the psychologist talks with the child, and used algorithms to analyze prosody patterns in speech.

They found preliminary evidence linking atypical prosody to symptom severity in autism. Children with more severe symptoms have speech that is more monotone, more variable in volume, breathier in voice quality and slower than that of children with milder symptoms.
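Each of these reported qualities (monotony, volume variability, breathiness, speaking rate) has a rough acoustic correlate that can be computed from a recording. The sketch below shows one way to do so with the open-source librosa library; it is a minimal illustration, not the USC team's pipeline, and the specific proxies, such as spectral flatness standing in for breathiness, are assumptions.

import numpy as np
import librosa

def prosody_features(wav_path):
    """Rough prosodic summary of one utterance (illustrative only)."""
    y, sr = librosa.load(wav_path, sr=16000)

    # Pitch contour: low variability suggests monotone speech.
    f0, voiced_flag, voiced_probs = librosa.pyin(y, fmin=75, fmax=500, sr=sr)
    pitch_var = np.nanstd(f0)  # NaN at unvoiced frames, hence nanstd

    # Frame-level energy: high variability suggests uneven volume.
    rms = librosa.feature.rms(y=y)[0]
    volume_var = np.std(rms)

    # Spectral flatness as a crude stand-in for breathy voice quality.
    breathiness = np.mean(librosa.feature.spectral_flatness(y=y))

    # Syllable-like onsets per second as a crude speaking-rate proxy.
    onsets = librosa.onset.onset_detect(y=y, sr=sr)
    rate = len(onsets) / (len(y) / sr)

    return {"pitch_var": pitch_var, "volume_var": volume_var,
            "breathiness": breathiness, "rate": rate}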

The study also analyzed the experimenters' speech, reasoning that they might change their speech patterns in response to the child. When interviewing children with autism, the study found, the experimenters raised their voices more at the ends of sentences, as at the end of a question. They also varied the speed of their speech, perhaps adding more emotion to their voices in an effort to engage the child.

When the researchers compared the children's and the experimenters' speech patterns, they found that the experimenters' patterns were more predictive of whether the child has autism.
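One way to frame such a comparison is to train the same classifier twice, once on features from the children's speech and once on features from the experimenters', and compare cross-validated accuracy. The sketch below shows only the shape of that experiment, using placeholder random data; it is not the USC team's model, and the feature matrices are stand-ins for prosodic features like those above.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Hypothetical feature matrices: one row per child, with prosodic
# features computed separately from the child's and the
# experimenter's speech. Random values here, so accuracy will hover
# near chance; real features would replace these.
rng = np.random.default_rng(0)
X_child = rng.normal(size=(28, 4))
X_exp = rng.normal(size=(28, 4))
y = rng.integers(0, 2, size=28)  # 1 = autism diagnosis

clf = LogisticRegression(max_iter=1000)
for name, X in [("child speech", X_child), ("experimenter speech", X_exp)]:
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name}: cross-validated accuracy = {acc:.2f}")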

“What we find intriguing is the nature of the behavior patterns that the expert seems to adopt ‘in tune’ with the behavior patterns of the child,” says Narayanan. “What we don’t know yet is how these behavior strategies would vary across a larger set of experts, and if similar patterning would emerge.”

The researchers aim to use the findings to better understand the communication and interaction deficits in children with autism, and perhaps to group cases of autism based on these problems.