Researchers Identify Gene in Age-related Hearing Loss
Research Roundup updates HR readers on some of the latest research and clinical findings related to hearing health care. Where appropriate, sources and original citations are provided, and readers are encouraged to refer to the primary literature for more detailed information. Additionally, related articles can be found and keywords can be searched in the HR Online Archives.
Presbycusis, or age-related hearing loss, accounts for 30% of all hearing loss. So why do some people lose their hearing as they get older, while others can still hear a pin drop? The answer may lie in a study released online in the journal Human Molecular Genetics.
“This is the first ever and largest genome-wide association study for age-related hearing loss,” says Rick Friedman, MD, PhD, lead author and House Ear Institute (HEI) principal investigator and surgeon at the House Clinic, Los Angeles.
The study was conducted in collaboration with colleagues at the Phoenix-based Translational Genomics Research Institute (TGen), Affymetrix in Santa Clara, Calif, and the University of Antwerp, Belgium. It uncovered several genes, but one—the grm7 gene—may be associated with susceptibility to glutamate excitotoxicity and hearing loss. Excess glutamate damages the inner and outer hair cells of the inner ear, leading to age-related hearing loss.
“Finding the genetic causes of age-related hearing loss could lead to treatments that would bring relief to millions of people worldwide who now suffer from social isolation, depression, and even cognitive impairment as a result of not being able to properly understand what others are saying,” says Dr Matthew Huentelman, an investigator in TGen’s Neurogenomics Division and one of the lead authors.
The researchers believe this paper’s findings represent important and significant progress in the efforts to discover the origins of presbycusis.
“We have known for a long time that genes play an important role in presbycusis, but until now genetic research has lagged behind compared to other important diseases,” says Guy Van Camp, director of the Hereditary Deafness Laboratory and professor, University of Antwerp. “The identification of grm7 is a very exciting result, as it may provide insights into the development of the disease.”
The study participants were Caucasian, ages 53 to 67, and the samples were collected at eight centers in six nations throughout Europe from population registries or audiological consultations. The team of investigators analyzed the samples and identified genetic risks. In the lab, the research team used the Affymetrix GeneChip® Human Mapping 500K array to score markers across the entire genome in more than 2,000 samples.
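The marker-scoring step described above can be illustrated with a toy per-marker association test. Everything here is hypothetical—made-up allele counts and marker labels, and a bare chi-square statistic rather than the study's actual methods:

```python
# Toy sketch of per-marker scoring in a genome-wide association study:
# for each marker, compare allele counts between affected and unaffected
# groups. Counts below are invented for illustration only.

def allelic_chi_square(case_alleles, control_alleles):
    """2x2 chi-square statistic over (allele A, allele a) counts
    in cases (hearing-impaired) versus controls."""
    observed = [case_alleles, control_alleles]
    row_totals = [sum(r) for r in observed]
    col_totals = [observed[0][j] + observed[1][j] for j in range(2)]
    grand = sum(row_totals)
    chi2 = 0.0
    for i in range(2):
        for j in range(2):
            expected = row_totals[i] * col_totals[j] / grand
            chi2 += (observed[i][j] - expected) ** 2 / expected
    return chi2

# Hypothetical marker where one allele is enriched in cases...
associated = allelic_chi_square((620, 380), (500, 500))
# ...and a marker with essentially identical allele frequencies.
unassociated = allelic_chi_square((505, 495), (500, 500))
print(round(associated, 1), round(unassociated, 2))
```

In a real genome-wide scan a statistic like this (with appropriate corrections for multiple testing across ~500,000 markers) is what separates candidate loci from background noise.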
Friedman says the next step is developing a laboratory model to test pharmaceuticals for possible future treatment of presbycusis. Source: House Ear Institute.
Friedman RA, Van Laer L, Huentelman MJ, et al. grm7 variants confer susceptibility to age-related hearing impairment [published online ahead of print December 1, 2008]. Hum Mol Genet. doi:10.1093/hmg/ddn402.
New Study Offers Insights Into Balance and Hearing Development
A UCLA study shows for the first time how microscopic crystals form sound and gravity sensors inside the inner ear. Located at the ends of cilia—the tiny cellular hairs in the ear that move and transmit signals—these crystals play an important role in detecting sound, maintaining balance, and regulating movement.
Dislodged ear crystals are to blame for the most common form of vertigo. Known as benign paroxysmal positional vertigo (BPPV), the disorder plagues up to 10% of people older than 60 and accounts for 20% of patients’ dizziness complaints.
The researchers’ findings, published November 30 in the advance online edition of the journal Nature, suggest a potential gene target for the treatment of people suffering from common hearing and balance problems related to cilia disorders.
“People have known for a long time about the importance of cilia for propelling sperm up the uterus and moving mucus out of the lungs,” says Kent Hill, associate professor of microbiology, immunology, and molecular genetics at the David Geffen School of Medicine at UCLA and the UCLA College of Letters and Science. “Our study illustrates that cilia perform many additional jobs that are essential to how our bodies develop and work.”
Hill’s team employed high-speed, high-definition video imaging to watch cilia moving in real time inside the developing ears of embryonic zebrafish. These small, bony fish undergo stages of development similar to those of humans and other vertebrates, making them useful models for research.
The researchers labeled cilia in the fish with fluorescent probes and used video microscopy to visualize the cilia and other inner ear structures. In the control group of fish, long cilia beat like tiny oars, driving minute particles to circle in a vortex around them. This tornado of whirling particles accumulated at the proper location to form the inner ear’s crystalline sensors.
“We next blocked expression of a gene that controls dynein—a tiny molecular motor that drives cilia movement,” says Hill. “When we examined the embryos, we saw that cilia movement came to a halt. As a result, the particles did not assemble in the correct site. So not only did ear crystals form in the wrong place, but they were misshapen and abnormally sized.
“While it’s been suggested that cilia movement contributes to the formation of ear crystals, this idea had never been tested before,” continues Hill. “Our findings show that cilia in the ear do move and demonstrate that cilia movement is needed for ear crystals to assemble in the right place.”
According to Hill, the findings offer promise for the treatment of patients with hearing disorders and people with ciliopathies—disorders marked by poor cilia function. These conditions include sperm-related infertility, polycystic kidney disease, lung and respiratory disorders, swelling of the brain, and reversal of the internal organs’ sites from one side of the body to the other.
“The idea that physical movement can influence vertebrate development is very provocative,” says Hill. “Scientists typically look at whether a particular gene is switched on or off, or if a particular protein is activated that determines if a tissue develops normally. In this case, microscopic currents in the fluid surrounding developing tissue are affecting its development. We need to understand more details of this process and determine how common it is during development.” Source: UCLA.
Colantonio JR, Vermot J, Wu D, et al. The dynein regulatory complex is required for ciliary motility and otolith biogenesis in the inner ear [published online ahead of print November 30, 2008]. Nature. doi:10.1038/nature07520.
HR Science & Tech Thursday Podcast: Watching the Brain Wrestle with Words
Scientists at the University of Rochester in New York have shown for the first time that our brains automatically consider many possible words and their meanings before we’ve even heard the final sound of the word. Previous theories have proposed that listeners can keep pace with the rapid rate of spoken language—up to five syllables per second—only by anticipating a small subset of all words known by the listener, much like a Google search anticipates words and phrases as you type. This subset consists of all words that begin with the same sounds, such as “candle,” “candy,” and “cantaloupe,” and makes the task of understanding the specific word more efficient than waiting until all the sounds of the word have been presented.
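The prefix-based candidate set described above can be sketched in a few lines. The mini-lexicon and the assumption that matching reduces to simple string-prefix filtering are illustrative only, not the researchers' model:

```python
# Toy sketch of the "cohort" idea: as each sound of a spoken word
# arrives, only the words that still match the prefix heard so far
# remain candidates. Lexicon is invented for illustration.

LEXICON = ["candle", "candy", "cantaloupe", "camera", "table"]

def cohort_after(prefix, lexicon=LEXICON):
    """Return the candidate words consistent with the sounds heard so far."""
    return [w for w in lexicon if w.startswith(prefix)]

for heard in ["c", "can", "cand", "candl"]:
    print(heard, "->", cohort_after(heard))
```

The candidate set shrinks with each incoming sound, which is why matching against a small cohort is so much faster than waiting for the whole word and searching the entire vocabulary.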
But, until now, researchers had no way to know if the brain also considers the meanings of these possible words. The new findings are the first time that scientists, using an MRI scanner, have been able to actually see this split-second brain activity. The study was a team effort among former Rochester graduate student Kathleen Pirog Revill, now a postdoctoral researcher at Georgia Tech, and three faculty members in the Department of Brain and Cognitive Sciences at the University of Rochester.
With Michael Tanenhaus, Richard Aslin, and Daphne Bavelier, Pirog Revill used fMRIs to focus on a tiny part of the brain called “V5,” which is known to be activated when a person sees motion. The idea was to teach undergraduates a set of invented words, some of which meant “movement,” and then to watch and see if the V5 area became activated when the subjects heard words that sounded similar to the ones that meant “movement.”
The team created a set of words with similar beginning syllables, but with different ending syllables and distinct meanings, as well as a computer program that showed irregular shapes and gave the shapes specific names. After a number of students learned the new words, the team tested them as they lay in an fMRI scanner. The students would see one of the shapes on a monitor and hear “biduko” or “biduka.” Only one of the words actually meant “motion”; the other named a color. Yet the V5 area of the brain activated for both, although less strongly for the color word than for the motion word. The presence of some activation to the color word shows that the brain, for a split second, considered the motion meaning of both possible words before it heard the final discriminating syllable, /ka/ rather than /ko/.
“Frankly, we’re amazed we could detect something so subtle,” says Aslin. “But it just makes sense that your brain would do it this way…. Choosing from a little subset is much faster than trying to match a finished word against every word in your vocabulary.” Source: University of Rochester. Listen to the HR Podcast at www.hearingreview.com/sciencetech.
Pirog Revill K, Aslin RN, Tanenhaus MK, Bavelier D. Neural correlates of partial lexical activation. Proc Natl Acad Sci U S A. 2008;105(35):13111-13115. Available at: www.pnas.org/content/105/35/13111.full.
Signal Transmission in VOR Confirmed as Linear
Life exists at the edge of chaos, where small changes can have striking and unanticipated effects, and major stimuli may go unnoticed. But there is no room for ambiguity when the brain needs to transform head motion into precise eye, head, and body movements that rapidly stabilize our posture and gaze; otherwise, we would stumble helplessly through the world, and our vision would resemble an indecipherable blur.
In their latest study, published in the current issue of the journal Neuron, researchers at the Salk Institute for Biological Studies explain how the vestibulo-ocular reflex (VOR), which keeps us and the world around us stable, achieves the accuracy it is famous for. Unlike most signals in the brain, whose transmission is frequency dependent, signals from the vestibular system of the inner ear, which detects motion, are relayed in a linear fashion no matter how fast the neurons are firing.
“Most of what we know about signal transmission between neurons comes from studying special cortical or hippocampal neurons, but many vital functions, such as balance and breathing, are controlled by neurons in the brain stem, which, as we discovered, work very differently,” says Howard Hughes Medical Institute investigator Sascha du Lac, PhD, an associate professor in the Systems Neurobiology Laboratory. “Pursuing the mechanisms that control neurons in the brain stem is important for developing new classes of biotherapeutic agents.”
Du Lac and her team focus on a simple type of learning: How does the brain learn to stabilize an image on the retina, using eye movements to compensate for a moving head? This so-called vestibulo-ocular reflex needs to be fast; for clear vision, head movements must be compensated for almost immediately. To achieve the necessary speed, the VOR circuit involves only three types of neurons: 1) sensory neurons, which detect head movement; 2) motor neurons, which direct eye muscles to relax or contract; and 3) so-called vestibular nucleus neurons in the brainstem, which link the two.
While the brevity of this circuit keeps reflex times short, it was less clear what qualities of the circuit ensure that eye velocity is precisely matched to head velocity. Since the VOR operates accurately no matter how fast we move our head, scientists long expected that the signal transmission at the synapses—specialized points of contact between nerve cells—that connect the sensory neurons to the vestibular nucleus neurons would be linear.
However, transmission at most synapses is nonlinear. Brain cells signal by sending electrical impulses along axons—the long, hair-like extensions that reach out to neighboring nerve cells. When an electrical signal reaches the end of an axon, the voltage change triggers release of neurotransmitters, the brain’s chemical messengers. These neurotransmitter molecules then travel across the space between neurons at a synapse and trigger an electrical signal in the adjacent cell—or not.
“Most known synapses act as information filters, and both the probability and the extent of neurotransmitter release as well as the efficacy of the postsynaptic response depend heavily on the recent history of the synapse,” says first author Martha W. Bagnall, PhD, a former graduate student in du Lac’s lab and now a postdoctoral researcher at the University of California, San Diego. “But no matter whether you go jogging or watch TV on your couch, the VOR needs to accurately match sensory input with motor output,” she adds.
When Bagnall and her colleagues took a closer look at the first synapse in the VOR circuit, they found that, no matter how fast the sensory neuron was firing, the same amount of neurotransmitter was released per impulse. And instead of filtering the signal, the postsynaptic neuron took the information and transmitted it faithfully. Source: Salk Institute.
Bagnall MW, McElvain LE, Faulstich M, du Lac S. Frequency-independent synaptic transmission supports a linear vestibular behavior. Neuron. 2008;60(2):343-352.
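The contrast between a history-dependent ("filtering") synapse and the frequency-independent transmission reported here can be shown with a toy numerical model. The depression dynamics below follow a standard Tsodyks-Markram-style resource scheme with made-up parameters; this is an illustration of the general principle, not the authors' model or data:

```python
# Toy comparison: a linear synapse releases the same amount per spike
# at any firing rate, while a depressing synapse runs down its
# releasable resources at high rates. All parameters are illustrative.
import math

def linear_release(rate_hz, n_spikes=10, q=1.0):
    """Per-spike drive at a linear synapse: constant q, rate-independent."""
    return [q] * n_spikes

def depressing_release(rate_hz, n_spikes=10, q=1.0, tau=0.2, u=0.5):
    """Tsodyks-Markram-style depression: each spike releases a fraction u
    of the remaining resources x, which recover with time constant tau."""
    dt = 1.0 / rate_hz          # inter-spike interval in seconds
    x = 1.0                     # available resources (fully recovered)
    out = []
    for _ in range(n_spikes):
        out.append(q * u * x)   # amount released by this spike
        x -= u * x              # depletion from release
        x += (1.0 - x) * (1.0 - math.exp(-dt / tau))  # partial recovery
    return out

# At 100 Hz the depressing synapse's later spikes transmit far less,
# while the linear synapse is unchanged at any rate.
print(linear_release(100)[-1], round(depressing_release(100)[-1], 3))
print(linear_release(1)[-1], round(depressing_release(1)[-1], 3))
```

A synapse behaving like `linear_release` is what the Salk team observed at the vestibular afferent synapse: the same per-impulse output whether the head (and the firing rate) is nearly still or moving fast.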