A team of researchers at the University of Oregon has shown how the brain captures sound and transforms its rhythmic structure as the signal travels through the auditory system. As described in an article published April 8, 2015 in the journal Neuron, when we hear the sound of footsteps or the drilling of a woodpecker, neurons in lower subcortical regions of the brain fire in sync with the sound’s rhythm, encoding its original temporal structure in the timing of their spikes.

“Even when the temporal structure of a sound is less obvious, as with human speech, the timing still conveys a variety of important information,” said study co-author Michael Wehr, PhD, a professor of psychology at the University of Oregon. Wehr said this information is transformed as it moves through our auditory system.

The new study documents this transformation of information in the auditory system of rats. According to an announcement from the University of Oregon, the research team’s findings are similar to those previously reported in primates, suggesting that the process is a general feature of the auditory systems of all mammals.

The researchers explain that neurons in the brain use two different “languages” to encode information: temporal coding and rate coding. Neurons in the auditory thalamus, the part of the brain that relays information from the ears to the auditory cortex, use temporal coding: they fire in sync with the original sound, reproducing its structure in time. In the auditory cortex, however, about half the neurons use rate coding, conveying the sound’s structure through the density and rate of their spiking rather than through exact spike timing.
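To make the distinction concrete, here is a minimal Python sketch of the two codes. It is not taken from the study; the jitter, gain, and Poisson-like firing are illustrative assumptions. A temporal-coding neuron fires one spike locked to each click, so the stimulus rhythm can be read directly from its spike times, while a rate-coding neuron’s spike count tracks the click rate even though its spike times are not locked to the clicks.

```python
import numpy as np

rng = np.random.default_rng(0)

def click_train(rate_hz, duration_s=1.0):
    """Stimulus: evenly spaced click times (in seconds) at the given rate."""
    return np.arange(0.0, duration_s, 1.0 / rate_hz)

def temporal_code(clicks, jitter_s=0.002):
    """Temporal coding: one spike time-locked to each click, with slight jitter.
    The stimulus rhythm can be read directly from the spike times."""
    return clicks + rng.normal(0.0, jitter_s, size=clicks.size)

def rate_code(clicks, duration_s=1.0, gain=2.0):
    """Rate coding: the spike COUNT scales with the click rate, but individual
    spike times are not locked to the clicks (Poisson-like firing)."""
    firing_rate = gain * clicks.size / duration_s        # spikes per second
    n_spikes = rng.poisson(firing_rate * duration_s)
    return np.sort(rng.uniform(0.0, duration_s, size=n_spikes))

for stim_rate in (5, 40):                                # slow vs. fast click train
    clicks = click_train(stim_rate)
    t_spikes = temporal_code(clicks)
    r_spikes = rate_code(clicks)
    print(f"{stim_rate:>3} Hz clicks | temporal code: {t_spikes.size} spikes, locked to clicks"
          f" | rate code: {r_spikes.size} spikes, timing unlocked")
```

In this toy picture, translating from one code to the other amounts to replacing spike timing with spike density, which is the kind of conversion the study traces inside individual cortical neurons.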

The research team explored how sounds might be transformed from one coding system to another. They used a technique known as “whole-cell recording” to capture the thousands of interactions that take place within a single neuron each time it responds to a sound. The team observed how individual cells responded to a steady stream of rhythmic clicks, and noted that individual rate-coding neurons received up to 82% of their inputs from temporal-coding neurons.

“This means that these neurons are acting like translators, converting a sound from one language to another,” Wehr said in the university’s announcement. “By peering inside a neuron, we can see the mechanism for how the translation is taking place.”

The University of Oregon researchers say this study provides a glimpse into how circuits deep within the brain give rise to our perception of the world. Prior to the study, neuroscientists had speculated that the transformation from temporal coding to rate coding might explain the perceptual boundary we experience between rhythm and pitch. Slow trains of clicks sound rhythmic, but fast trains of clicks sound like a buzzy tone. It could be that these two very different experiences of sound are produced by the two different kinds of neurons.
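That perceptual boundary is easy to demonstrate for yourself. The sketch below is only an illustration of the claim above, not part of the study: the 4 Hz and 200 Hz rates, click length, and file names are arbitrary choices. It writes two WAV files; the slow train is heard as a rhythm, while the fast train fuses into a buzzy tone.

```python
import wave
import numpy as np

SR = 44100  # audio sample rate in Hz

def click_train_audio(click_rate_hz, duration_s=2.0, click_ms=1.0):
    """Render a train of brief clicks as a mono signal with values in [0, 1]."""
    signal = np.zeros(int(SR * duration_s))
    click_len = int(SR * click_ms / 1000)
    for t in np.arange(0.0, duration_s, 1.0 / click_rate_hz):
        start = int(t * SR)
        signal[start:start + click_len] = 1.0
    return signal

def save_wav(path, signal):
    """Write a mono float signal as a 16-bit WAV file."""
    pcm = (signal * 32767).astype(np.int16)
    with wave.open(path, "wb") as f:
        f.setnchannels(1)
        f.setsampwidth(2)      # 16-bit samples
        f.setframerate(SR)
        f.writeframes(pcm.tobytes())

save_wav("slow_rhythm.wav", click_train_audio(4))    # heard as a rhythm
save_wav("fast_buzz.wav", click_train_audio(200))    # heard as a buzzy tone
```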

Wehr reported that the transformation in the auditory system is similar to what has been observed in the visual system. “Except that in the auditory system, neurons are encoding information about time instead of about space,” he said, adding that neuroscientists believe rate codes could support multi-sensory integration in the higher cortical areas. A rate code could be a “universal language” that helps us make decisions by putting together what we see and what we hear.

Source: University of Oregon

Photo credit: © Arkela | Dreamstime.com