Professor Sid P. Bacon, dean of natural sciences in the Arizona State University College of Liberal Arts and Sciences, was recently awarded a three-year, $1.1 million National Institutes of Health (NIH) grant to support his ongoing research into electric-acoustic stimulation, or EAS. The technology combines electric stimulation of the mid- to high-frequency (or mid- to high-pitch) region, delivered through a cochlear implant, with normal acoustic stimulation of the low-frequency, or low-pitch, region.

Electric stimulation and acoustic stimulation separately might improve hearing a small amount, Bacon said, "but the sum is much greater than its parts. There is a synergistic effect when electric and acoustic stimulation are combined." The long-term goal of the research is to gain a better understanding of the cues and processes underlying the benefits of EAS.

"The combination of electric and acoustic stimulation enables individuals to do quite well, even in environments where there is background noise; these are typically very difficult listening situations, especially for people with hearing loss," he said.

Bacon’s research focuses on why the improvement is so dramatic, testing the acoustic cues in speech that he suspects account for it.

"When people talk, their voice has a pitch that varies up and down during the course of speaking. Men tend to have a lower overall pitch than women and children. That voice pitch is known as the fundamental frequency, and it tends to be below 400 Hz for all speakers" Bacon said. "It stands to reason that one of the cues from the low-frequency region is this fundamental frequency, this low voice pitch.

"I decided to test this cue directly," he said.

Bacon and his research team, including Christopher Brown and several doctoral students, use software to extract from the low-frequency region of speech the cues he believes may be important for EAS. The team then replaces that speech with a tone carrying the cues, either alone or in combination. They have found that voice pitch is an important cue: for some cochlear implant patients, a tone carrying this cue provides as much benefit as the speech itself.
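The article does not describe the team's software, but the core manipulation it reports (tracking a talker's fundamental frequency and re-synthesizing it as a bare tone) can be sketched in a few lines of Python. This is an illustrative approximation, not Bacon's actual processing chain; the librosa pyin pitch tracker and the 60–400 Hz search range are assumptions:

    import numpy as np
    import librosa

    def f0_carrier_tone(wav_path, fmin=60.0, fmax=400.0):
        """Track a talker's voice pitch (F0) and synthesize a pure tone that
        follows it, as a stand-in for the low-frequency speech band.
        Illustrative only; not the processing used in Bacon's lab."""
        y, sr = librosa.load(wav_path, sr=None)
        # Probabilistic YIN pitch tracking; speech F0 lies below roughly 400 Hz.
        f0, voiced_flag, voiced_prob = librosa.pyin(y, fmin=fmin, fmax=fmax, sr=sr)
        hop = 512  # librosa's default hop length for pyin
        # Upsample the frame-rate F0 track to one value per audio sample.
        frame_times = librosa.frames_to_time(np.arange(len(f0)), sr=sr, hop_length=hop)
        f0 = np.where(np.isnan(f0), 0.0, f0)  # treat unvoiced frames as silent
        f0_per_sample = np.interp(np.arange(len(y)) / sr, frame_times, f0)
        # Integrate instantaneous frequency to get the tone's phase.
        phase = 2 * np.pi * np.cumsum(f0_per_sample) / sr
        return np.sin(phase) * (f0_per_sample > 0), sr  # gate unvoiced stretches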

"On a theoretical level, it tells us what cues are important for EAS, which is what drove me to look at this initially," Bacon said.

Bacon’s research includes evaluating EAS in people with cochlear implants, as well as simulating EAS in normal-hearing individuals. He has another NIH grant to study various aspects of normal hearing.

"We have access to a lot more individuals with normal hearing," Bacon said, explaining that researchers are able to simulate the experience of electric hearing in normal hearers, enabling them to experiment with new technology and sound processing before extending that technology to implant patients.  "We often see the same pattern of results in the two groups of listeners. Thus, we can experiment extensively in normal-hearing people, and then apply our most promising findings to people with an implant."

Bacon’s findings could potentially expand the range of people who are able to benefit from EAS technology. "Right now people are candidates for EAS only if they have hearing up to a frequency of at least 500 Hz," Bacon said. His research with simulations of EAS in normal-hearing individuals indicates that the tone carrying the acoustic cues can be shifted to a very low frequency without loss of benefit.

"If we can show similar results in individuals with a cochlear implant, it would suggest that you might only need to have hearing up to 100 Hz."

Sources: Medical News Today; Arizona State University College of Liberal Arts and Sciences