October 30, 2007

A new study by neuroscientists from Northwestern University focuses on how an individual’s brain processes sensory information about another’s gender to examine whether hearing fundamentally changes visual experience.

The study concludes that it does, adding to the provocative evidence for multi-sensory processing of our world that has emerged in recent years.

“Auditory-Visual Cross-Modal Integration in Perception of Face Gender” was published in a recent issue of Current Biology. The study’s co-authors are investigators at Northwestern’s Visual Perception, Cognition and Neuroscience Laboratory: lead author Eric Smith, a graduate student; Marcia Grabowecky, research assistant professor of psychology; and Satoru Suzuki, associate professor of psychology in the Weinberg College of Arts and Sciences at Northwestern.

“Researchers have long thought that one part of the brain does vision and another does auditory processing and that the two really don’t communicate with each other,” said Grabowecky. “But emerging research suggests that rich information from the different senses comes together quickly, and the senses influence each other, so that we don’t experience the world one sense at a time.”

The Northwestern study suggests that these sensory interactions occur at a very early stage of processing and that vocal tones do indeed fundamentally change visual processing.

“For our study, we used simple tones with no explicit gender information to get a window into how vision and audition work together to process gender information,” Grabowecky said. “Unlike stereotypical voices, the tones only hinted at male and female characteristics, and by coupling them with ambiguous faces, we were able to see how processing of various pitches affected vision very early in the sensory process.”

The study adds to the scarce scientific evidence that sounds can alter how masculine or feminine a person looks.

“Our vision can bias our experience of other senses, such as hearing,” said Smith. “We hear, for example, the ventriloquist’s voice coming from the dummy. In this study we wanted to see if hearing could change our visual experience.”

“We learn early on what auditory and visual characteristics accompany female and male voices, starting with our earliest experiences with our mothers and fathers,” said Grabowecky. “The question from the neuroscience perspective is when in the processing of perceptual information do auditory and visual senses interact with each other? How does the brain do this?”

To test whether a sound can influence perception of a face’s gender, the researchers digitally morphed male and female faces to create androgynous faces not easily categorized as male or female. Study participants were asked to look at the faces while listening to brief auditory tones, which fell within the fundamental speaking frequency range of either male or female voices.

In the initial stage of auditory processing, sounds are decomposed into basic frequency components, the lowest of which is called the fundamental frequency and the higher ones the harmonics. The fundamental frequency of the human voice typically falls between about 100 and 150 Hz for males and between about 160 and 300 Hz for females. Roughly speaking, the fundamental frequency determines the perceived pitch (lower for men and higher for women), and the harmonics add timbre (the quality of the human voice).
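To make those numbers concrete, the brief Python sketch below generates the kind of pure tone, or “beep,” described above. The frequency ranges follow the figures quoted in this article, while the sample rate, tone duration, and example frequencies are arbitrary assumptions rather than the study’s actual stimulus parameters.

    import numpy as np

    # Assumed parameters for illustration only; the article does not give
    # the study's actual stimulus settings.
    SAMPLE_RATE = 44100   # samples per second
    DURATION_S = 0.5      # length of the "beep" in seconds

    def pure_tone(freq_hz, duration_s=DURATION_S, sample_rate=SAMPLE_RATE):
        """Return a sine wave at a single frequency, i.e., a pure tone."""
        t = np.linspace(0.0, duration_s, int(sample_rate * duration_s),
                        endpoint=False)
        return np.sin(2 * np.pi * freq_hz * t)

    # Approximate fundamental-frequency ranges quoted in the article.
    MALE_RANGE = (100, 150)    # Hz
    FEMALE_RANGE = (160, 300)  # Hz

    male_range_tone = pure_tone(125)      # falls inside the male range
    female_range_tone = pure_tone(220)    # falls inside the female range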

In higher auditory brain areas, these frequencies are put back together to be coded as a human voice. The researchers took advantage of the fact that pure tones can be used to deliver individual frequency components that are registered in early auditory brain areas.

The findings showed that when an androgynous face was paired with a pure tone that fell within the female fundamental-frequency range, people were more likely to report that the ambiguous face was that of a female. But when the same face was paired with a pure tone in the male fundamental-frequency range, people were more likely to see a male face. (The bias did not occur when a face was paired with a pure tone that was too low or too high to be in the typical speaking range.)
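Schematically, the pattern just described amounts to a simple mapping from a tone’s frequency to the direction of the reported bias. The sketch below is only an illustration of that summary, using the approximate ranges quoted earlier, not the authors’ analysis code.

    def expected_bias(freq_hz):
        """Direction of the reported bias when an androgynous face is
        paired with a pure tone at freq_hz (illustrative summary only)."""
        if 100 <= freq_hz <= 150:     # male fundamental-frequency range
            return "face more likely judged male"
        if 160 <= freq_hz <= 300:     # female fundamental-frequency range
            return "face more likely judged female"
        return "no reliable bias"     # outside the typical speaking range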

“The strength of the study is that pure tones sound like beeps, and they primarily activate early stages of auditory processing,” Grabowecky said. “We think that the effect demonstrates a direct input from early auditory processing to visual perception.”

When people were forced to guess whether the tones were in the male range, the female range, or outside the typical speaking frequency range, their guesses were inaccurate and relative. In other words, when people heard a pair of pure tones, they tended to hear the higher tone as feminine and the lower tone as masculine, regardless of the actual frequencies of the tones.
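The relative character of these explicit judgments can be illustrated in the same schematic way. In the hypothetical sketch below, any pair of tones produces the same labeling pattern, whatever the absolute frequencies involved.

    def relative_labels(freq_a_hz, freq_b_hz):
        """Schematic of the reported pattern for explicit judgments of a
        pair of tones: the higher tone tends to be heard as feminine and
        the lower as masculine, regardless of absolute frequency."""
        lower, higher = sorted((freq_a_hz, freq_b_hz))
        return {lower: "heard as masculine", higher: "heard as feminine"}

    # Both pairs yield the same relative pattern, even though the second
    # pair lies entirely above the typical male speaking range.
    print(relative_labels(120, 200))
    print(relative_labels(400, 600))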

“Such relativity is not surprising, because our auditory experience depends on relative, rather than absolute, frequencies; most useful and entertaining auditory information, such as speech and music, is carried by how sound frequencies change over time,” Grabowecky said.

Absolute frequencies do not matter much, as we readily understand speech spoken by people with low and high voices and enjoy songs regardless of the keys in which they are played. In contrast, it is the “neglected” absolute-frequency information that influences visual perception of gender.

“A conscious impression of your voice is not what enhances your look of masculinity or femininity,” said Suzuki. “Sounds seem to influence visual gender in a much more fundamental way on the basis of their absolute frequencies processed in early auditory brain areas.”

The researchers focused on gender perception, because people have such a strong need to categorize people as male or female. “We all know the feeling of meeting a person who is very androgynous,” said Smith. “We simply need to know and will use any information at our disposal to identify a person’s gender. It is probably quite evolutionarily adaptive to be able to accurately tell males from females, as far as propagation of one’s genes is concerned.”

What is on the horizon?

“If sound can implicitly bias visual gender perception, then we need to consider whether other senses, such as smell, might yield similar effects,” said Smith. “Future studies might use masculine and feminine colognes, or even human pheromones to bias people to see androgynous faces as either male or female. With the possibility of other senses biasing the way that we see the world, our visual experience of gender might turn out to be much more than meets the eye.”

Source: Northwestern University