This article is more than five years old. Autism research — and science in general — is constantly evolving, so older articles may contain information or theories that have been reevaluated since their original publication date.
After being neglected for decades, motor development is becoming a hot topic of conversation in the autism research community.
Part of the difficulty in studying early motor skills — such as sitting up, reaching and grasping — is that infants acquire them in the first few months of life, long before autism emerges. But at a meeting of the High Risk Baby Siblings Research Consortium last week, I heard about a fascinating project that’s measuring the precise movements of infants as they interact with objects and people. The researchers are using the data to learn about infant development and build a ‘social’ robot.
Daniel Messinger, a psychology professor at the University of Miami, and his students have videotaped eight babies between 2.5 and 5 months old while they play with their mothers in a soundproof room full of toys. The babies wear handmade onesies with a small light attached near each joint. The researchers then use software called PhaseSpace to get a quantitative picture of the babies’ movements.
At the meeting, Messinger showed a video clip of a baby hanging in a sling and looking at its mother as she holds a ball and moves it closer to, and then farther from, the baby’s face. Taping this seemingly simple interaction has already shown the researchers many things they didn’t know before.
For instance, as the mom brings the ball closer, the baby suddenly begins moving both arms and both legs with vigor. But when the ball gets very close to the baby’s face, the limb movement slows.
“Babies aren’t just moving their limbs randomly,” Messinger said. “There’s an optimal distance for limb movement: not too close, not too far.”
The experiment is a high-tech demonstration of the theories of Russian psychologist Lev Vygotsky.
Vygotsky proposed that when an object is nearby, a baby can reach for it using only motor skills. But when the object is far away, the baby must learn to enlist the help of someone else to get it. Reaching is, in that situation, a social task.
Messinger is sending the tapes and movement data to collaborator Javier Movellan at the University of California, San Diego, who is using them to build a baby robot that can learn to interact with objects and people without explicit programming of each movement.
What Messinger is most interested in, though, is finding a way to use this experimental set-up to study ‘baby sibs,’ the younger siblings of children with autism, who are at a higher risk of developing the disorder than the general population.
“I’m thinking about how to apply this to kids developing an autism spectrum disorder,” Messinger said. “Where are the problems? Are they motor, are they social, or are they both?”