By Hilary Hurd Anyaso
Imagine the brain’s delight when experiencing the sounds of Beethoven’s Moonlight Sonata while simultaneously taking in a light show produced by a visualizer. A new Northwestern University (Evanston, Ill) study did more than imagine that experience.
To understand how the brain responds to highly complex auditory-visual stimuli like music and moving images, the study tracked parts of the auditory system involved in the perceptual processing of Moonlight Sonata while it was synchronized with the light show made by the iTunes Jelly visualizer.
The study shows how and when the auditory system encodes auditory-visual synchrony between complex and changing sounds and images.
Much of the related research looks at how the brain processes simple sounds and images. Locating a woodpecker in a tree, for example, is made easier when your brain combines the auditory (pecking) and visual (movement of the bird) streams and judges that they are synchronous. If they are, the brain decides that the two sensory inputs probably came from a single source.
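That synchrony judgment can be thought of, very loosely, as a correlation between two signals over time. The Python sketch below is purely illustrative and is not the study’s method: it correlates a hypothetical audio amplitude envelope against a hypothetical visual motion-energy trace across a range of time lags, and treats a strong correlation peak near zero lag as evidence that the two streams share a common source. All function and variable names here are invented for the example.

```python
import numpy as np

def av_synchrony(audio_env, visual_motion, fs, max_lag_s=0.5):
    """Correlate an audio amplitude envelope with a visual motion-energy
    trace (both sampled at fs Hz); return the peak correlation and the
    lag (in seconds) at which it occurs."""
    # z-score both signals so the dot product behaves like a correlation
    a = (audio_env - audio_env.mean()) / audio_env.std()
    v = (visual_motion - visual_motion.mean()) / visual_motion.std()
    n = len(a)
    max_lag = int(max_lag_s * fs)
    corrs = []
    for lag in range(-max_lag, max_lag + 1):
        # overlap the two signals at this relative shift
        if lag < 0:
            c = np.dot(a[:lag], v[-lag:]) / (n + lag)
        elif lag > 0:
            c = np.dot(a[lag:], v[:-lag]) / (n - lag)
        else:
            c = np.dot(a, v) / n
        corrs.append(c)
    best = int(np.argmax(corrs))
    return corrs[best], (best - max_lag) / fs

# Toy demo: a shared 2 Hz modulation drives both streams, plus noise.
fs = 100
t = np.arange(0, 10, 1 / fs)
drive = np.sin(2 * np.pi * 2 * t)
audio_env = drive + 0.5 * np.random.randn(len(t))
visual_motion = drive + 0.5 * np.random.randn(len(t))
r, lag = av_synchrony(audio_env, visual_motion, fs)
print(f"peak correlation {r:.2f} at lag {lag * 1000:.0f} ms")
# A strong peak near zero lag suggests the sound and image came
# from the same source; a weak or distant peak suggests they did not.
```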
While that research is important, Julia Mossbridge, lead author of the study and research associate in psychology at Northwestern, said it also is critical to expand investigations to highly complex stimuli like music and movies.
“These kinds of things are closer to what the brain actually has to manage to process in every moment of the day,” said Mossbridge. “Further, it’s important to determine how and when sensory systems choose to combine stimuli across their boundaries.
“If someone’s brain is mis-wired, sensory information could combine when it’s not appropriate,” she said. “For example, when that person is listening to a teacher talk while looking out a window at kids playing, and the auditory and visual streams are integrated instead of separated, this could result in confusion and misunderstanding about which sensory inputs go with what experience.”
It was already known that the left auditory cortex is specialized to process sounds with precise, complex, and rapid timing; this gift for auditory timing may be one reason that, in most people, the left auditory cortex is used to process speech, for which timing is critical. The results of this study show that this specialization for timing applies not just to sounds, but to the timing of complex and dynamic sounds and images.
Previous research indicates that there are multisensory areas in the brain that link sounds and images when they change in similar ways, but much of this research focuses on speech signals (eg, lips moving as vowels and consonants are heard). Consequently, it hasn’t been clear which areas of the brain process more general auditory-visual synchrony, or how this processing differs when sounds and images should not be combined.
“It appears that the brain is exploiting the left auditory cortex’s gift for processing auditory timing, and is using similar mechanisms to encode auditory-visual synchrony, but only in certain situations, seemingly only when combining the sounds and images is appropriate,” Mossbridge said.
In addition to Mossbridge, co-authors include Marcia Grabowecky and Satoru Suzuki of Northwestern. The article, “Seeing the song: Left auditory structures may track auditory-visual dynamic alignment,” was published October 23 in PLOS ONE.
Source: Northwestern University