Watching a speaker’s lips helps us listen in noisy environments because it helps our brains amplify the sounds that arrive in time with what we’re seeing, finds a new University College London (UCL)-led study announced on the university’s website.
The researchers say their findings, published in Neuron, could be relevant to people with hearing aids or cochlear implants, who tend to struggle to hear conversations in noisy places such as a pub or restaurant.
The researchers found that visual information is integrated with auditory information at an earlier, more basic level than previously believed, independent of any conscious or attention-driven processes. When information from the eyes and ears is temporally coherent, the auditory cortex (the part of the brain responsible for interpreting what we hear) boosts the sounds that tie in with what we’re looking at.
“While the auditory cortex is focused on processing sounds, roughly a quarter of its neurons respond to light—we helped discover that a decade ago, and we’ve been trying to figure out why that’s the case ever since,” said the study’s senior author, Dr Jennifer Bizley of the UCL Ear Institute.
In a 2015 study, she and her team found that people can pick apart two different sounds more easily if the one they’re trying to focus on happens in time with a visual cue. For this latest study, the researchers presented the same auditory and visual stimuli to ferrets while recording their neural activity. When one of the auditory streams changed in amplitude in conjunction with changes in luminance of the visual stimulus, more of the neurons in the auditory cortex reacted to that sound.
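To make the idea of temporal coherence concrete, here is a minimal sketch, in Python, of the kind of stimulus design the paragraph above describes: two competing amplitude-modulated sound streams, with a visual luminance signal that follows the envelope of only one of them. This is an illustration, not the study’s actual stimulus code; the carrier frequencies, envelope rate, duration, and 60 fps frame rate are arbitrary choices made here for demonstration.

```python
# Illustrative sketch of temporally coherent audio-visual stimuli
# (not the published study's code): two amplitude-modulated tone streams,
# with a visual luminance trace that tracks the envelope of stream A only.
import numpy as np

rng = np.random.default_rng(0)
fs = 44_100            # audio sample rate (Hz), assumed for this example
dur = 2.0              # stimulus duration (s), arbitrary
t = np.arange(int(fs * dur)) / fs

def slow_envelope(t, rng, rate_hz=7.0):
    """Random, slowly varying amplitude envelope in [0, 1]."""
    n_knots = int(rate_hz * t[-1]) + 2
    knots = rng.uniform(0.0, 1.0, n_knots)
    knot_times = np.linspace(0.0, t[-1], n_knots)
    return np.interp(t, knot_times, knots)

# Two competing auditory streams with independent envelopes
env_a = slow_envelope(t, rng)
env_b = slow_envelope(t, rng)
stream_a = env_a * np.sin(2 * np.pi * 440.0 * t)   # 440 Hz carrier (arbitrary)
stream_b = env_b * np.sin(2 * np.pi * 950.0 * t)   # 950 Hz carrier (arbitrary)
mixture = stream_a + stream_b

# Visual stimulus: luminance follows the envelope of stream A
# (temporal coherence), sampled at a typical display frame rate.
frame_rate = 60
frame_t = np.arange(int(frame_rate * dur)) / frame_rate
luminance = np.interp(frame_t, t, env_a)

# Sanity check: luminance correlates with envelope A, not with envelope B.
env_a_frames = np.interp(frame_t, t, env_a)
env_b_frames = np.interp(frame_t, t, env_b)
print("corr(luminance, env A):", np.corrcoef(luminance, env_a_frames)[0, 1])
print("corr(luminance, env B):", np.corrcoef(luminance, env_b_frames)[0, 1])
```

Under this setup, the listener (or, in the study, the ferret) hears the mixture while watching the luminance-modulated visual stimulus; because only stream A shares its temporal envelope with the visual signal, it is the stream that the auditory cortex would be expected to represent more strongly.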
“Looking at someone when they’re speaking doesn’t just help us hear because of our ability to recognize lip movements—we’ve shown it’s beneficial at a lower level than that, as the timing of the movements aligned with the timing of the sounds tells our auditory neurons which sounds to represent more strongly. If you’re trying to pick someone’s voice out of background noise, that could be really helpful,” said Bizley.
The researchers say their findings could help develop training strategies for people with hearing loss, as the team has had early success in helping people tap into the brain’s ability to link sound and sight. The findings could also help hearing aid and cochlear implant manufacturers develop smarter ways to amplify sound by linking amplification to the wearer’s gaze direction.
The paper adds to evidence that people who are having trouble hearing should get their eyes tested as well.
The study was led by Bizley and PhD student Huriye Atilgan (UCL Ear Institute), alongside researchers from UCL, the University of Rochester, and the University of Washington, and was funded by Wellcome, the Royal Society, the Biotechnology and Biological Sciences Research Council (BBSRC), Action on Hearing Loss, the National Institutes of Health (NIH), and the Hearing Health Foundation.
Original Paper: Atilgan H, Town SM, Wood KC, et al. Integration of visual information in auditory cortex promotes auditory scene analysis through multisensory binding. Neuron. 2018;97(3):640–655.e4. doi: 10.1016/j.neuron.2017.12.034
Source: University College London, Neuron
Image: Dreamstime.com