According to an April 1, 2015 article in the Journal of Neuroscience, researchers in the School of Medicine at the University of Connecticut (UConn Health) have shown that echoes and fluctuations in volume (amplitude modulation) are what we use to determine the distance between us and the source of a noise. Although researchers have long understood how we can tell a sound’s direction—whether it’s to our left or right, front or back, above or below us—the way we tell how far away a sound is has been poorly understood.

“This opens up a new horizon,” says Duck O. Kim, DSc, a neuroscience researcher at UConn Health. “The third dimension of sound location was pretty much unknown.”

As reported by UConn Health, all sounds have amplitude modulation, and Kim and neuroscience colleague Shigeyuki Kuwada, PhD, hypothesized that amplitude modulation, as altered by echoes, was the key to how we perceive a sound’s distance. Almost any environment has echoes, produced when sounds bounce off objects like walls, trees, and the ground. The farther the source of a sound is from a listener, the more echoes there are, and the more degraded the depth of amplitude modulation becomes.

To test their hypothesis, Kim and Kuwada studied rabbits, using tiny microphones placed inside the rabbits’ ears to record sounds played from different locations. These recordings simulated modulated or unmodulated noises at different distances from the rabbit. The researchers then played the simulated sounds back to the rabbits and measured the responses of neurons in the inferior colliculus (IC), a region of the midbrain known to be important for sound perception. Certain types of IC neurons fired more strongly when there was a greater difference between the sound’s maximum and minimum amplitude, that is, when the modulation depth was greater.

The researchers found that echoes (reverberation) tend to degrade amplitude modulation, smoothing out the amplitude’s peaks and valleys. In their experiments, the neurons fired less as the simulated sound source moved farther away and the depth of amplitude modulation became more degraded.
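Neither the study’s stimuli nor its analysis code appears in the article, but the basic idea can be sketched numerically. The short Python example below (every parameter is an invented stand-in, not taken from the study) builds an amplitude-modulated noise, convolves it with progressively longer and relatively stronger synthetic echo tails to mimic an increasingly distant source, and estimates modulation depth from the smoothed envelope as (max - min) / (max + min); the estimated depth shrinks as the simulated reverberation grows.

```python
# Minimal illustration (not the study's actual stimuli or analysis): reverberation
# smears an amplitude-modulated noise and reduces its modulation depth.
import numpy as np

fs = 16000                        # sample rate (Hz)
t = np.arange(0, 1.0, 1 / fs)     # 1 second of signal
rng = np.random.default_rng(0)

# Amplitude-modulated noise: a noise carrier whose volume rises and falls 32 times per second.
carrier = rng.standard_normal(t.size)
envelope = 1 + np.sin(2 * np.pi * 32 * t)     # fully modulated envelope
dry = carrier * envelope

def modulation_depth(signal, fs, mod_hz=32):
    """Crude depth estimate: (max - min) / (max + min) of the smoothed envelope."""
    env = np.abs(signal)
    win = int(fs / (mod_hz * 8))                      # window much shorter than one modulation cycle
    env = np.convolve(env, np.ones(win) / win, mode="same")
    env = env[fs // 10 : -fs // 10]                   # trim edge effects
    return (env.max() - env.min()) / (env.max() + env.min())

print(f"dry signal: depth ~ {modulation_depth(dry, fs):.2f}")

# Longer, relatively stronger echo tails stand in for a more distant source.
for tail_ms, direct_gain in [(100, 1.0), (300, 0.5), (600, 0.25)]:
    tail = rng.standard_normal(int(fs * tail_ms / 1000))
    tail *= np.exp(-np.arange(tail.size) / (fs * tail_ms / 3000))  # exponentially decaying echoes
    ir = np.concatenate(([direct_gain], 0.02 * tail))              # direct sound followed by echoes
    wet = np.convolve(dry, ir)[: dry.size]
    print(f"{tail_ms} ms tail: depth ~ {modulation_depth(wet, fs):.2f}")
```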

UConn Health reports that Pavel Zahorik, PhD, an auditory researcher at the University of Louisville School of Medicine, tested the same amplitude-modulated noise with human volunteers and got the same results: people need both amplitude modulation and reverberation (echoes) to judge how far away a sound is. Without amplitude modulation, a listener cannot tell how far away a noise is.

Kim and Kuwada explain that while reverberation is usually considered detrimental to hearing clearly, it is necessary for perceiving the distance of a sound source, and thereby for judging safety, among other things. They suggest that a better understanding of the acoustics and neuroscience of distance perception could contribute to better hearing aids and prostheses.

Kim and Kuwada plan to conduct additional studies on distance perception, examining how we perceive a sound’s distance together with its horizontal and vertical direction.

Source: UConn Health 

Photo credit: © Kertis | Dreamstime.com