Senses of sight and sound separated in children with autism

Like watching a foreign movie that was badly dubbed, children with autism spectrum disorders (ASD) have trouble integrating simultaneous information from their eyes and their ears, according to a Vanderbilt study published in The Journal of Neuroscience.
The study, led by Mark Wallace, Ph.D., director of the Vanderbilt Brain Institute, is the first to illustrate the link and strongly suggests that deficits in the sensory building blocks for language and communication can ultimately hamper social and communication skills in children with autism.
“There is a huge amount of effort and energy going into the treatment of children with autism. Virtually none of it is based on a strong empirical foundation tied to sensory function,” Wallace said. “If we can fix this deficit in early sensory function then maybe we can see benefits in language and communication and social interactions.”
The findings could have much broader applications, Wallace said, because sensory functioning is also altered in other conditions such as dyslexia and schizophrenia.
In the study, Vanderbilt researchers compared 32 typically developing children ages 6 to 18 with 32 high-functioning children with autism, matching the groups on virtually every possible measure, including IQ.
Study participants worked through a battery of tasks, most of them computer-generated. Researchers presented different types of audiovisual stimuli, from simple flashes and beeps, to more complex environmental events like a hammer hitting a nail, to speech, and asked the participants to report whether the visual and auditory events happened at the same time.
The study found that children with autism have an enlarged temporal binding window (TBW), meaning their brains bind visual and auditory events together even when the events occur relatively far apart in time, making it harder to tell which sights and sounds belong together.
A second part of the study found that children with autism also showed weaknesses in how strongly they “bound” or associated audiovisual speech stimuli.
The research was supported by National Institutes of Health grants DC010927 and DC011993, a Simons Foundation Explorer award, a Vanderbilt Kennedy Center MARI/Hobbs award, and the Vanderbilt Brain Institute.