As the philosopher and psychologist William James described it, to a baby the world is “one great blooming, buzzing confusion.” Even for adults, this statement captures the essence of our sensory experiences, and highlights the complex and multisensory character of the world around us.
At any moment, a mélange of information bombards our senses of sight, sound, touch, taste, smell and balance. One of the most important tasks for us — or, more accurately, for our brains — is to make sense of the incoming signals. Some of this information belongs to the same object or event — think of the sight and sound of a bouncing ball — and must be integrated or ‘bound,’ to be understood. Other pieces belong to different objects or events and need to be segregated.
Without appropriate integration and segregation of sensory cues, the world becomes the blooming and buzzing confusion to which James referred.
We gain a faster and more accurate perception of the world when we use information from multiple senses1. Picture yourself at a boisterous party. You are far more likely to ‘hear’ what a friend is saying several tables away if you watch her mouth movements and combine this visual cue with the weak auditory one2. Adept social communication requires the ability to grasp such multifaceted sensory input.
It has long been known that up to 90 percent of individuals with autism have unique challenges in the way they process sensory information3. They may be either insensitive or oversensitive to sensory stimuli. They may also be ‘sensory seeking,’ stimulating their senses through repetitive behaviors such as twirling or hand flapping. These traits, which may encompass any number of the senses, are among the diagnostic features of autism spectrum disorder listed in the DSM-5, the latest revision of the “Diagnostic and Statistical Manual of Mental Disorders.”
Window of sensitivity:
Several laboratories — including our own — have embarked on a series of studies to better understand how people with autism process sensory information. We have found a telling oddity in how these individuals integrate sight and sound information to make sense of an event.
This characteristic involves timing and is a key component of the larger social and cognitive difficulties that people with autism face. The sensory difference also might provide a path to treatment.
Think again of the bouncing ball. When the ball hits the floor, the visual and auditory signals associated with that collision happen in the same place and at the same time. The brain is clever: It uses spatial and temporal information to make judgments about whether those visual and auditory signals came from the same event or from different events. If the two sets of signals come from the same place at around the same time, the brain decides that they belong to the same event, and binds them together.
Of course, light travels faster than sound. So the time that these signals arrive at the eyes and ears is always slightly different. The brain solves this problem by creating a so-called window of time over which it will still bind sights and sounds and consider them connected. This window spans several hundred milliseconds, allowing us to integrate visual input with sound not only from events happening in front of our noses, but also from those occurring farther away — for example, when a friend yells to us across a parking lot.
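The arithmetic behind this window is easy to check. The sketch below is illustrative, not from the article: the speed-of-sound figure is standard, but the 300-millisecond window width is a round number standing in for “several hundred milliseconds.” It computes how much a sound lags its light at a given distance, and whether that lag still fits inside the window.

```python
# Illustrative sketch: how far away can an event be before its sound
# lags its light by more than a ~300 ms binding window?
# The window width is an assumed round number for illustration.

SPEED_OF_SOUND_M_PER_S = 343.0   # in dry air at about 20 °C
BINDING_WINDOW_S = 0.3           # "several hundred milliseconds"

def audio_lag_seconds(distance_m: float) -> float:
    """Delay of the sound relative to the (effectively instantaneous) light."""
    return distance_m / SPEED_OF_SOUND_M_PER_S

def within_binding_window(distance_m: float) -> bool:
    """Would the brain still bind the sight and the sound together?"""
    return audio_lag_seconds(distance_m) <= BINDING_WINDOW_S

print(f"{audio_lag_seconds(30):.3f} s")   # friend across a 30 m parking lot
print(within_binding_window(30))          # lag of ~0.087 s fits the window
print(within_binding_window(150))         # ~0.44 s lag falls outside it
```

At 30 meters the sound arrives only about 87 milliseconds late, comfortably inside a several-hundred-millisecond window, which is why the shout and the waving arm across the parking lot still feel like one event.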
People with autism tend to integrate auditory and visual information over longer windows of time than most of us do. In a 2014 study, we asked children with and without autism to report whether a sight and a sound presented in close succession were simultaneous4. The children with autism reported sights and sounds as simultaneous even when they were separated by long intervals of time.
A longer window may seem advantageous, as it should allow the brain to bind more information than a shorter one does. But it can in fact create substantial confusion.
In an ordinary conversation with a friend, for example, the brain must rapidly and accurately bind together the sound of your friend’s voice with the images of her mouth moving, as well as any other visual and auditory cues from her body.
The auditory information arrives as ‘phonemes,’ the units of sound that make up a word. The visual information comes in the form of ‘visemes,’ the mouth movements (as in lip reading) that correspond to phonemes. This matching is one-to-one in someone with good speech comprehension, but if the binding window is too long, it becomes one-to-several, and the quality of the communication deteriorates.
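The shift from one-to-one to one-to-several matching can be made concrete with a toy example. In this hypothetical sketch, the onset times and syllable labels are invented; it simply shows that widening the binding window lets several phonemes fall within range of a single viseme.

```python
# Hypothetical illustration (timings invented): with a short binding window,
# each viseme pairs with exactly one phoneme; with a long window, several
# phonemes fall inside the same viseme's window.

viseme_onsets_ms = {"b": 0, "a": 180, "t": 360}    # mouth shapes for "bat"
phoneme_onsets_ms = {"b": 30, "a": 210, "t": 390}  # sounds, slightly delayed

def matches(window_ms: float) -> dict:
    """Map each viseme to every phoneme whose onset falls within the window."""
    return {
        v: [p for p, pt in phoneme_onsets_ms.items()
            if abs(pt - vt) <= window_ms]
        for v, vt in viseme_onsets_ms.items()
    }

print(matches(100))  # short window: one phoneme per viseme
print(matches(300))  # long window: several phonemes per viseme
```

With the 100-millisecond window each mouth shape picks out exactly one sound; with the 300-millisecond window the middle viseme matches all three phonemes, and the mapping becomes ambiguous.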
One example involves the ‘McGurk effect,’ a speech illusion in which the pairing of an auditory ‘ba’ and the sight of lips forming ‘ga’ results in the perception of ‘da,’ because the brain combines the auditory and visual cues. The wider a person’s binding window, the less susceptible he or she is to the illusion, suggesting difficulty integrating auditory and visual speech information.
These results show that sensory processing differences in autism extend beyond the individual senses to the integration of multisensory information. These altered multisensory abilities may create difficulties not only with social communication, but also with cognition.
A child learning to read needs to bind the words they see on the page with a mental replay of how those words sound. If a child cannot do this, reading is likely to be difficult.
What’s more, multisensory function plays a critical role in our general understanding of our surroundings. Problems in this area may fragment perception, which can cascade into problems in cognitive abilities such as executive function (planning and decision-making), language and memory.
We plan to extend our investigation of multisensory processing into realms such as touch. Many children with autism have unusual tactile sensitivities and perceptions, yet how touch merges with the other senses remains largely unexplored. Indeed, differences in the perception of touch may play an important role in the social differences seen in autism, and we believe these social challenges, too, stem at least in part from multisensory problems.
Perception of peripersonal space, the area immediately surrounding our bodies, is built upon the integration of auditory, proprioceptive, tactile and visual inputs. People with autism may differ from typical individuals in their perception of this space. We are investigating whether the unusual integration of multisensory information in autism contributes to this perceptual difference, which could, in turn, underpin some of the social weaknesses in autism.
Studies on the integration of sensory input also provide important clues to the brain networks altered in autism. They suggest that the major nodes for the convergence of sensory information — for instance, the superior temporal cortex and intraparietal cortex — may be critical to making sense of one’s surroundings on a moment-to-moment basis.
Finally, we believe that these studies provide a foundation for autism treatments. Brain regions underlying multisensory processing are highly malleable. In a 2009 study, we showed that we can narrow the window of time over which neurotypical adults bind auditory and visual cues by providing feedback6. Each time a participant told us that two stimuli were simultaneous (or sequential), we told them whether that judgment was correct. Such training proved highly effective in changing multisensory temporal perception.
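As a rough illustration of how trial-by-trial feedback might narrow the window, here is a hypothetical simulation. The starting window, shrink rate and trial structure are all invented for the sketch; this is not the procedure from the 2009 study.

```python
import random

# Hypothetical sketch of feedback training on a simultaneity judgment.
# All parameters (starting window, shrink factor, trial counts) are invented.

random.seed(0)

window_ms = 400.0      # simulated participant's starting binding window
NARROW_FACTOR = 0.98   # shrink the window slightly after corrective feedback

for trial in range(200):
    # Stimulus-onset asynchrony between a flash and a beep, 0-500 ms
    soa_ms = random.uniform(0, 500)
    truly_simultaneous = soa_ms < 50          # experimenter's ground truth
    judged_simultaneous = soa_ms < window_ms  # participant's judgment

    # Feedback after a false "simultaneous" response is what drives
    # the window to narrow over the course of training.
    if judged_simultaneous and not truly_simultaneous:
        window_ms *= NARROW_FACTOR

print(f"window after training: {window_ms:.0f} ms")
```

In this toy version, each corrective signal nudges the window inward, so over many trials it contracts toward the true simultaneity boundary; the real training effect is, of course, a property of the participant’s perception, not a multiplication.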
If we could use the same approach in people with autism, we might sharpen their perception of the world. Following conversations, reading facial expressions and other aspects of social interactions are likely to be less burdensome when the sensory pieces fit into a coherent whole. In this scenario, the world would be a bit more rational and reassuring, and less blooming and buzzing.