
Spectrum: Autism Research News


The Experts:
Rosalind Picard, professor, Massachusetts Institute of Technology
Ognjen Rudovic, postdoctoral research fellow, Massachusetts Institute of Technology

In science-fiction movies such as “Star Wars” and “WALL-E,” futuristic robots engage in smooth social interaction and even fall in love. These Hollywood depictions remain a fantasy, but a growing number of robots with limited skills, from Roomba vacuum cleaners to Amazon Echo personal assistants, exist in the real world. And more intelligent robots may someday help therapists who work with autistic children do their jobs.

In particular, robots could ease therapists’ heavy workloads by taking on time-consuming tasks such as crunching numbers and analyzing large amounts of data.

Therapists juggle a lot of tasks at once. For example, when a therapist teaches a child with autism to read social cues, such as facial expressions and tone of voice, she must simultaneously track many possible signs of improvement. These may include the direction a child is looking, the tilt of his head, whether he is taking turns with others and how he is engaging with objects or people.

To document this progress, many therapists record their sessions with a digital camera and pore over the footage for the presence or absence of important behavioral cues, as well as the interactions preceding them. This process can take hours as the therapist tries to answer such questions as: How many times did the child show socially important behaviors when I tried this new approach?

Most therapies span several months, sometimes requiring two to three hours per day. Analyzing the cumulative footage is daunting and labor-intensive — and ideally suited to a machine1.

Analyzing behavior:

On the outside, most therapy robots today are designed to appear or act friendly, resembling a pet or a human friend. On the inside, they carry hardware for observing and recording therapy sessions, such as cameras and microphones, along with speakers for producing sound. They also have motors to move their limbs and one or more computers that orchestrate their behaviors and help analyze data.

The computers run software based on research in an area of artificial intelligence known as machine learning, as well as in speech recognition, computer vision, affective computing and human-robot interaction. These areas of research, built on the combined efforts of thousands of scientists over the past several decades, are making it possible for computers to automatically analyze video and audio.

Some algorithms can tell the difference between a car and a person; others recognize patterns of movements and can distinguish a smile from a frown. Robots and computers can also receive and analyze information, such as heart rate or skin conductance, from wearable devices. This information can reveal a person’s emotions — for example, that she is anxious or overwhelmed — even if she appears calm.
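To make that concrete, here is a minimal Python sketch of how a computer might reduce a short window of wearable data to a few simple arousal cues. The function, the feature choices and the threshold are illustrative assumptions for this article, not the method of any particular device or study.

```python
import numpy as np

def arousal_features(skin_conductance, heart_rate):
    """Summarize one window of wearable data as rough arousal cues.

    Illustrative sketch only: real affective-computing pipelines use
    richer features and models validated against labeled data.
    """
    sc = np.asarray(skin_conductance, dtype=float)
    hr = np.asarray(heart_rate, dtype=float)

    return {
        # Overall skin-conductance level and the count of small upward
        # jumps are common rough proxies for physiological arousal.
        "sc_level": float(sc.mean()),
        "sc_rises": int(np.sum(np.diff(sc) > 0.05)),  # threshold chosen arbitrarily
        # Mean heart rate and its variability over the window.
        "hr_mean": float(hr.mean()),
        "hr_std": float(hr.std()),
    }

# Example with made-up readings from a one-minute window.
print(arousal_features(skin_conductance=[2.1, 2.2, 2.6, 2.5, 2.9],
                       heart_rate=[88, 92, 95, 97, 93]))
```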

Our team at the MIT Media Lab has been studying technology-assisted autism therapy since 1999. Earlier this year, we published work in Science Robotics suggesting that robots can use video recordings of children diagnosed with autism to estimate their levels of engagement and whether they appear excited or calm, or show positive or negative emotions2. In other words, the robots can determine some aspects of how the children are acting and feeling from behavioral cues.

Our robots had to learn from human examples. We first asked experts to rate recordings of therapy sessions on a child’s level of excitement and engagement in the tasks, as well as on how positive or negative the child’s emotional expressions seemed. We then developed personalized machine-learning algorithms that taught a robot to produce those same ratings, using samples of the recordings paired with the experts’ scores. In this way, the robot learned to estimate the same behaviors as those rated by human experts in therapy sessions with particular children, even for recordings it was seeing for the first time.
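The sketch below shows the general shape of this recipe in Python, under simplifying assumptions: randomly generated stand-ins replace the real precomputed features and expert ratings, and a generic off-the-shelf regressor replaces the personalized models used in the actual study. The point is only the workflow: fit a model for one child on expert-rated segments, then estimate ratings for segments the model has never seen.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Stand-ins for one child's data: each row summarizes a short segment of a
# recorded session (pose, facial expression, audio, physiology features),
# and each target is an expert's engagement rating for that segment.
X = rng.normal(size=(600, 32))                                   # hypothetical features
y = np.tanh(X[:, :4].sum(axis=1)) + 0.1 * rng.normal(size=600)   # hypothetical ratings

# Hold out some segments to mimic recordings the robot sees for the first time.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

# A generic regressor stands in for the personalized model; training one
# model per child is what makes the approach "personalized".
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

predictions = model.predict(X_test)
print("mean absolute error on unseen segments:",
      round(mean_absolute_error(y_test, predictions), 3))
```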

In rating videos of children with autism, the robots agreed with the human experts on the therapy session ratings about as often as the experts agreed with each other2.
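One way to sanity-check such a claim is to score the model’s agreement with an expert the same way one scores agreement between two experts. The Python sketch below uses Pearson correlation purely as a stand-in for the agreement statistics a real study would report, and the ratings are made up for illustration.

```python
import numpy as np

def agreement(a, b):
    """Pearson correlation as a rough agreement score between two raters."""
    return float(np.corrcoef(np.asarray(a, float), np.asarray(b, float))[0, 1])

# Hypothetical engagement ratings of the same six session segments.
expert_1 = [0.2, 0.8, 0.5, -0.3, 0.9, 0.1]
expert_2 = [0.3, 0.7, 0.4, -0.2, 0.8, 0.2]
model    = [0.25, 0.75, 0.55, -0.25, 0.85, 0.05]

print("expert-expert agreement:", round(agreement(expert_1, expert_2), 2))
print("model-expert agreement: ", round(agreement(model, expert_1), 2))
```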

Therapy robots may be especially good at analyzing data collected over many sessions with the same child. In this scenario, the robot can get to know the nuances of that child’s behavior and document the behaviors in the therapy sessions faster and more consistently than human therapists can.

Such results may help guide therapists, especially those new to the field, toward behaviors that show significant improvement or room for improvement. For example, the computer analysis could help answer questions such as: What approach worked best for this child? What might work best for other children who have similar behavior patterns? And how can we use this knowledge to personalize the therapy for each child?

Social robots:

The robots of the future might also be able to help improve the social skills of children with autism. In a study published in August, researchers gave 12 families of children with autism a set of social games on a tablet computer that parents could play with their children at home. They also supplied a commercial robot named Jibo, which they programmed to provide suggestions to a child as the child played the games. The robot also modeled good social behavior3.

The researchers analyzed more than one month of 30-minute daily sessions between the child and the robot and found that the robot’s presence helped to improve the children’s ability to use gestures and gaze to share attention with their caregivers. In particular, it helped the children redirect their visual attention to the task or the caregiver3. More importantly, the study showed that the children were able to use their new social skills even when the robot wasn’t there.

Therapists might also use social robots to design engaging and personalized material for each child. For instance, a therapist could make an interactive game to practice eye contact, conversation and nonverbal communication. Practicing with a robot may, at least at first, be easier for a child with autism than interacting with a person because of the robot’s predictable and accepting nature.

Games that use robots are like training wheels on a bicycle: The child feels stable as she pedals through tough situations. Once she gets her balance and confidence with a social robot, she will want to drop the training wheels and go farther and faster — with real human interaction.

Although therapy robots aren’t yet widely available, they may one day be on the shelves alongside a Roomba or Amazon Echo. Still, we envision a future in which human life continues to be valued above that of any intelligent machine, and in which machines with social-emotional skills are crafted to offload the tasks that are hard for people to do.

Rosalind Picard is professor of media arts and sciences at the Massachusetts Institute of Technology and director of the Affective Computing Research Group at the MIT Media Lab. Ognjen Rudovic is a postdoctoral Marie Curie fellow in the group.


References:
  1. Wetherby A.M. et al. Pediatrics 134, 1084-1093 (2014) PubMed
  2. Rudovic O. et al. Sci. Robot. 3, eaao6760 (2018) Full text
  3. Scassellati B. et al. Sci. Robot. 3, eaat7544 (2018) Full text