This article is more than five years old. Autism research — and science in general — is constantly evolving, so older articles may contain information or theories that have been reevaluated since their original publication date.
A rapid screen combining a parent questionnaire and analysis of a five-minute home video can detect autism with more than 90 percent accuracy, according to unpublished research presented Thursday at the 2013 International Meeting for Autism Research in San Sebastián, Spain.
The researchers recruited more than 100 families who were waiting at a doctor’s office for a full clinical evaluation for autism. Their goal is to enroll 200 children.
Parents filled out a five-minute questionnaire on an iPad and uploaded home videos from their smartphones. Preliminary results suggest that the technique is highly specific, meaning it can distinguish autism from other developmental disorders, such as attention deficit hyperactivity disorder.
To further validate the test, the researchers launched a website yesterday where parents of children with autism can fill out the questionnaire and upload a home video for analysis.
Dennis Wall and his team developed the screen using machine learning, an automated approach to looking for patterns in data. The researchers applied the technique to information collected using the Autism Diagnostic Interview-Revised and the Autism Diagnostic Observation Schedule, the gold-standard tools for diagnosing autism.
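The article does not describe the actual algorithm Wall's team used, only that machine learning searched diagnostic-instrument data for predictive patterns. As a rough illustration of that idea, the sketch below trains a simple decision stump: it scans hypothetical item-level scores for the single item and threshold that best separate diagnoses. The learner, feature layout and all data are invented for illustration.

```python
# Hypothetical sketch: a decision-stump learner standing in for the (unpublished)
# feature-selection approach described in the article. All scores and labels
# below are made up; real ADI-R/ADOS items and data are not reproduced here.

def train_stump(samples):
    """Find the (item, threshold) pair that best separates the labels.

    samples: list of (scores, label), where scores is a list of item scores
    and label is 1 for an autism diagnosis, 0 otherwise.
    """
    n_items = len(samples[0][0])
    best = None  # (accuracy, item, threshold)
    for item in range(n_items):
        for threshold in {s[item] for s, _ in samples}:
            # Predict "autism" whenever this item's score meets the threshold.
            correct = sum(
                (scores[item] >= threshold) == bool(label)
                for scores, label in samples
            )
            acc = correct / len(samples)
            if best is None or acc > best[0]:
                best = (acc, item, threshold)
    return best

# Toy training set: four questionnaire item scores per child (hypothetical).
training = [
    ([3, 1, 2, 0], 1),
    ([2, 2, 3, 1], 1),
    ([0, 1, 1, 0], 0),
    ([1, 0, 0, 1], 0),
]
acc, item, threshold = train_stump(training)
print(f"rule: item {item} >= {threshold}, training accuracy {acc:.0%}")
```

A production screen would use many more items, a richer model and held-out validation, but the core loop is the same: automatically searching the data for a separating pattern rather than hand-crafting one.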
Both tests are lengthy and require well-trained experts, prompting researchers to seek more efficient alternatives. The average waiting time for a full clinical evaluation is 13 months. The new test could help prioritize the children who should be referred for intensive clinical evaluations, says Wall, director of the Computational Biology Initiative at Harvard Medical School.
“It would open up the bottleneck to enable longer, more appropriate work on kids who are clinically challenging,” Wall says.
Wall had previously reported that his short screen has high sensitivity, meaning it rarely misses children who have autism. But some researchers questioned its specificity, or how well it rules out children who do not.
The children in the new study provide a good testing ground for that criticism.
Wall says about 60 percent of the children were subsequently diagnosed with autism, and most of the remaining 40 percent likely have some kind of learning disability that brought them to the clinic. The short screen matches the result of the full clinical evaluation about 90 percent of the time.
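The study's actual confusion-matrix counts are not reported, but the relationship between sensitivity, specificity and overall agreement can be made concrete with invented numbers consistent with the figures above (about 60 of 100 children with autism, roughly 90 percent agreement):

```python
# Illustrative only: these counts are hypothetical, chosen to be consistent
# with the article's rough figures, not taken from the study itself.
true_pos, false_neg = 55, 5    # of 60 children later diagnosed with autism
true_neg, false_pos = 35, 5    # of 40 children without autism

sensitivity = true_pos / (true_pos + false_neg)   # detects autism when present
specificity = true_neg / (true_neg + false_pos)   # rules it out when absent
accuracy = (true_pos + true_neg) / 100            # agreement with clinicians

print(f"sensitivity {sensitivity:.0%}, specificity {specificity:.0%}, "
      f"accuracy {accuracy:.0%}")
```

The point of the arithmetic is that overall accuracy blends both error types; a screen can look accurate on a mostly-autism clinical sample even with middling specificity, which is why the critics' question matters.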
For the evaluation, the researchers asked parents for a video of their child at play, such as in a park or at a birthday party. College students with minimal training score each video, which lasts five minutes or less, for eight behaviors, including social play, motor function, repetitive movements and eye contact.
Preliminary results suggest high reliability among the raters, even when the students score the behaviors as they watch the video rather than after viewing it multiple times.
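The article does not say which reliability statistic the team used. One standard measure of agreement between two raters, beyond what chance would produce, is Cohen's kappa; the sketch below computes it for two invented sets of ratings:

```python
# Hypothetical sketch: Cohen's kappa for two raters. The rating scale and
# all scores below are invented; the study's actual statistic is not named.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    n = len(rater_a)
    # Observed agreement: fraction of items both raters scored identically.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected agreement by chance, from each rater's marginal frequencies.
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(
        (counts_a[c] / n) * (counts_b[c] / n)
        for c in set(rater_a) | set(rater_b)
    )
    return (observed - expected) / (1 - expected)

# Two students scoring the same video for one behavior on a 0-2 scale,
# across ten short segments (made-up numbers).
a = [2, 1, 2, 0, 1, 2, 2, 0, 1, 2]
b = [2, 1, 2, 0, 1, 2, 1, 0, 1, 2]
print(f"kappa = {cohens_kappa(a, b):.2f}")
```

A kappa near 1 indicates agreement well beyond chance; values around 0.8 and above are conventionally read as strong reliability.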
“For me, the biggest piece from this study, aside from specificity, is that we are testing parents’ willingness to submit what might be considered highly sensitive information: a video of their child,” says Wall.
The parent questionnaire evaluates seven behaviors, including group and imaginative play and reciprocal conversation.
One limitation of the study is that it focuses on a clinical group. The researchers haven’t yet tested their screen in a more general population, such as children visiting a pediatrician’s office for an annual check-up. “Ideally, it would be implemented earlier in the diagnostic process,” says Wall.
His team is testing online versions of the screen, recruiting parents from the research and advocacy organization Autism Speaks’ Facebook page and other social media sources. He says 11,000 parents have responded to the survey in just six months.