As Baby Boomers age, many experience difficulty hearing and understanding conversations in noisy environments such as restaurants. People who are hearing-impaired and who wear hearing aids or cochlear implants are even more severely affected. Researchers know that the ability to locate the source of a sound with ease is vital to hearing well in these situations, but much more must be understood about how hearing works before devices can be designed that perform better in noisy environments.

Researchers from the Eaton-Peabody Laboratories of the Massachusetts Eye and Ear, Harvard Medical School, and the Research Laboratory of Electronics, Massachusetts Institute of Technology, have gained new insight into how the brain localizes sound. Their research paper, “Decoding Sound Source Location and Separation Using Neural Population Activity Patterns,” is published in the October 2, 2013 issue of the Journal of Neuroscience.

Mitchell L. Day, PhD (photo credit:  
Garyfallia Pagonis)

“Most people are able to locate the source of a sound with ease, for example, a snapping twig on the left, or a honking horn on the right. However, this is actually a difficult problem for the brain to solve,” said Mitchell L. Day, PhD, investigator in the Eaton-Peabody Laboratories at Massachusetts Eye and Ear and instructor of Otology and Laryngology at Harvard Medical School. “The higher levels of the brain that decide the direction a sound is coming from do not have access to the actual sound, but only to the representation of that sound in the electrical activity of neurons at lower levels in the brain. How higher levels of the brain use information contained in the electrical activity of these lower-level neurons to create the perception of sound location is not known.”

In the experiment, researchers recorded the electrical activity of individual neurons in an essential lower-level auditory brain area called the inferior colliculus (IC) while an animal listened to sounds coming from different directions. They found that the location of a sound source could be accurately predicted from the pattern of activation across a population of fewer than 100 IC neurons—in other words, a particular pattern of IC activation indicated a particular location in space. Researchers further found that the pattern of IC activation could correctly distinguish whether a single sound source was present or two sources coming from different directions; that is, the pattern of IC activation could segregate concurrent sources.
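The idea of decoding location from a population activation pattern can be illustrated with a toy simulation. This is a hypothetical sketch, not the authors' analysis: the neuron count, tuning curves, candidate azimuths, and the nearest-template classifier are all illustrative assumptions, chosen only to show how a pattern across a small population can indicate a direction in space.

```python
# Toy sketch (illustrative assumptions, not the study's actual method):
# decode sound-source azimuth from the firing-rate pattern of a small
# simulated population of IC-like neurons via nearest-template matching.
import random
random.seed(0)

N_NEURONS = 80                      # fewer than 100, as in the study
AZIMUTHS = [-60, -30, 0, 30, 60]    # candidate source directions (degrees)

# Each simulated neuron gets a preferred azimuth; its firing rate falls
# off linearly with distance between the stimulus and that preference.
preferred = [random.choice(AZIMUTHS) for _ in range(N_NEURONS)]

def population_pattern(azimuth, noise=2.0):
    """Noisy firing-rate pattern across the population for one stimulus."""
    return [max(0.0, 20.0 - 0.2 * abs(azimuth - p) + random.gauss(0, noise))
            for p in preferred]

def template(azimuth, n_trials=20):
    """Mean activation pattern for one direction, averaged over trials."""
    trials = [population_pattern(azimuth) for _ in range(n_trials)]
    return [sum(t[i] for t in trials) / n_trials for i in range(N_NEURONS)]

templates = {az: template(az) for az in AZIMUTHS}

def decode(pattern):
    """Report the azimuth whose template is closest (squared distance)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(AZIMUTHS, key=lambda az: dist(pattern, templates[az]))

# Decode 100 fresh noisy trials and count correct classifications.
correct = sum(decode(population_pattern(az)) == az
              for az in AZIMUTHS for _ in range(20))
print(f"decoded {correct}/100 trials correctly")
```

Even with trial-to-trial noise, the classifier recovers the source direction reliably, because the population pattern as a whole carries far more information than any single neuron's rate.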

“Our results show that higher levels of the brain may be able to accurately segregate and localize sound sources based on the detection of patterns in a relatively small population of IC neurons,” said Dr. Day. “We hope to learn more so that someday we can design devices that work better in noisy environments.”

The research was funded by National Institute on Deafness and Other Communication Disorders grants R01 DC002258 and P30 DC005209. The paper was co-authored by Mitchell L. Day and Bertrand Delgutte.

Source: Mass Eye and Ear Institute. HR thanks Mary Leach of MEEI/Harvard for her assistance.