Making sense of Groucho’s words and Harpo’s pantomimes in an old Marx Brothers movie engages the same regions of your brain, according to new research funded by the National Institute on Deafness and Other Communication Disorders (NIDCD), Bethesda, Md., one of the National Institutes of Health.

In a study published in the Proceedings of the National Academy of Sciences (PNAS), researchers have shown that the brain regions long recognized as centers in which spoken or written words are decoded are also important in interpreting wordless gestures. The findings suggest that these regions may play a much broader role in interpreting symbols than researchers previously thought and, for that reason, could be the evolutionary starting point from which language originated.

“In babies, the ability to communicate through gestures precedes spoken language, and you can predict a child’s language skills based on the repertoire of his or her gestures during those early months,” said James F. Battey Jr, MD, PhD, director of the NIDCD. “These findings not only provide compelling evidence regarding where language may have come from, they help explain the interplay that exists between language and gesture as children develop their language skills.”

Scientists have known that sign language is largely processed in the same regions of the brain as spoken language. These regions include the inferior frontal gyrus, or Broca’s area, in the front left side of the brain, and the posterior temporal region, commonly referred to as Wernicke’s area, toward the back left side of the brain. It isn’t surprising that signed and spoken language activate the same brain regions, because sign language operates in the same way as spoken language does—with its own vocabulary and rules of grammar.

In this study, NIDCD researchers, in collaboration with scientists from Hofstra University School of Medicine, Hempstead, NY, and San Diego State University, wanted to find out whether non-language-related gestures (the hand and body movements that convey meaning on their own, without having to be translated into specific words or phrases) are processed in the same regions of the brain as language. Two types of gestures were considered for the study: pantomimes, which mimic objects or actions, such as unscrewing a jar or juggling balls, and emblems, which are commonly used in social interactions and which signify abstract, usually more emotionally charged concepts than pantomimes do. Examples include a hand sweeping across the forehead to indicate “it’s hot in here!” or a finger to the lips to signify “be quiet.”

While inside a functional MRI machine, 20 healthy, English-speaking volunteers (nine males and 11 females) watched video clips of a person either acting out one of the two gesture types or voicing the phrases that the gestures represent. As controls, volunteers also watched clips of the person using meaningless gestures or speaking pseudowords that had been chopped up and randomly reorganized so the brain would not interpret them as language. Volunteers watched 60 video clips for each of the six stimuli, with the clips presented in 45-second blocks at a rate of 15 clips per block, or three seconds per clip. A mirror attached to the head coil enabled each volunteer to watch the video projected on the scanner room wall. The scientists then measured brain activity for each of the stimuli and looked for similarities and differences, as well as any communication occurring between individual parts of the brain.
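For readers who want the arithmetic, the short Python sketch below works out the timing implied by those figures. The breakdown of the six stimulus types and the assumption that blocks ran back to back with no rest periods are ours; the release does not specify either.

```python
# Back-of-the-envelope timing for the fMRI protocol described above.
# Assumption: blocks run back to back with no rest intervals (not stated
# in the release), so the total is a lower bound on scan time.

CLIPS_PER_STIMULUS = 60   # video clips shown for each stimulus type
STIMULUS_TYPES = 6        # six conditions per the release; the exact
                          # breakdown (pantomimes, emblems, their spoken
                          # equivalents, meaningless gestures, pseudowords)
                          # is our assumption
BLOCK_SECONDS = 45        # duration of one presentation block
CLIPS_PER_BLOCK = 15      # clips shown within each block

seconds_per_clip = BLOCK_SECONDS / CLIPS_PER_BLOCK           # 3.0 s per clip
blocks_per_stimulus = CLIPS_PER_STIMULUS // CLIPS_PER_BLOCK  # 4 blocks
total_blocks = blocks_per_stimulus * STIMULUS_TYPES          # 24 blocks
total_minutes = total_blocks * BLOCK_SECONDS / 60            # 18 minutes

print(f"{seconds_per_clip:.1f} s per clip, {total_blocks} blocks, "
      f"at least {total_minutes:.0f} min of stimulus presentation")
```

At three seconds per clip and 24 blocks in all, stimulus presentation alone would occupy roughly 18 minutes of scanning, before any rest periods.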

The researchers found that for the gesture and spoken language stimuli, the brain was highly activated in the inferior frontal and posterior temporal areas, the long-recognized language regions of the brain.

“If gesture and language were not processed by the same system, you’d have spoken language activating the inferior frontal and posterior temporal areas, and gestures activating other parts of the brain,” said Allen Braun, MD, senior author on the paper. “But in fact we found virtual overlap.”

Current thinking in the study of language is that, like a smart search engine that pops up the most suitable Web site at the top of its search results, the posterior temporal region serves as a storehouse of words from which the inferior frontal gyrus selects the most appropriate match. The researchers suggest that, rather than being limited to deciphering words alone, these regions may be able to apply meaning to any incoming symbols, be they words, gestures, images, sounds, or objects. According to Braun, these regions may also offer a clue to how language evolved.
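The search-engine analogy can be made concrete with a toy sketch. The Python below is purely illustrative of the division of labor described above (a storehouse that retrieves candidate meanings, a selector that picks the best fit); the symbol inventory and the scoring heuristic are invented for the example and say nothing about how the brain actually ranks meanings.

```python
# Toy illustration of the storehouse/selector analogy described above.
# The candidate store and scoring heuristic are invented for this sketch;
# they model the analogy, not any neural mechanism.

# "Posterior temporal" role: a storehouse mapping incoming symbols
# (spoken words or wordless gestures alike) to candidate meanings.
STOREHOUSE = {
    "finger_to_lips": ["be quiet", "secret", "thinking"],
    "hand_across_forehead": ["it's hot in here", "relief", "fatigue"],
    "quiet": ["be quiet", "calm", "low volume"],
}

def select_meaning(symbol: str, context: set[str]) -> str | None:
    """'Inferior frontal' role: pick the candidate meaning that best
    fits the current context, like a search engine ranking results."""
    candidates = STOREHOUSE.get(symbol)
    if not candidates:
        return None
    # Score each candidate by word overlap with the context (toy heuristic).
    def score(meaning: str) -> int:
        return len(set(meaning.split()) & context)
    return max(candidates, key=score)

# The same selector handles a spoken word and an emblem gesture alike.
context = {"library", "be", "quiet"}
print(select_meaning("quiet", context))           # -> "be quiet"
print(select_meaning("finger_to_lips", context))  # -> "be quiet"
```

The point of the sketch is only that a single retrieve-and-select pipeline can assign meaning to a word and a gesture through the same machinery, which is the pattern the fMRI overlap suggests.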

“Our results fit a longstanding theory which says that the common ancestor of humans and apes communicated through meaningful gestures and, over time, the brain regions that processed gestures became adapted for using words,” he said. “If the theory is correct, our language areas may actually be the remnant of this ancient communication system, one that continues to process gesture as well as language in the human brain.”

Braun added that developing a better understanding of the brain systems that support gestures and words may help in the treatment of some patients with aphasia, a disorder that hinders a person’s ability to produce or understand language.

NIDCD supports and conducts research and research training on the normal and disordered processes of hearing, balance, smell, taste, voice, speech, and language. 

[Source: NIDCD]