Early screening and treatment for infants with hearing problems, and the ability to computer-generate musical scores, are two very different possible outcomes of some “off-the-wall” research.
Until recently, little was known about the perceptions humans have when they enter the world. Although adult perception has been extensively researched, how, or even whether, the brains of newborn babies perceive patterns in the world remained a mystery.
That mystery has been at least partially solved by an EU-funded research project, EmCAP, which brought together what many would consider an unlikely consortium, comprising neuroscientists and music technologists.
What project coordinator Susan Denham describes as “blue-sky thinking” by her and her colleagues when they initially proposed the project led to experiments in which music was played to newborn babies.
In the experiments, sleeping babies were hooked up to an electroencephalograph (EEG), an instrument that measures brain activity through electrodes placed on the scalp.
The babies were then played music—to be more exact, simplified tone sequences—to test what sort of patterns they were sensitive to and whether they would predict what was coming next based on what had gone before.
“The babies were presented with sequences of sounds of different tone colour—different musical instruments, if you like—but all of the same pitch,” Denham says. “Occasionally, you play a sound of a different pitch and watch the EEG to see if they produce a distinctive reaction to this deviant sound. Similar tests were done to see if babies were sensitive to rhythmic and melodic patterns, too.”
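This “oddball” design is straightforward to picture in code. The sketch below is purely illustrative (the frequencies, tone lengths and deviant probability are assumptions of mine, not the project’s actual stimulus parameters): it builds a sequence of standard-pitch tones with occasional deviants, along with the event markers an EEG analysis would be time-locked to.

```python
import numpy as np

SAMPLE_RATE = 44100  # audio samples per second

def tone(freq_hz, duration_s=0.2, amplitude=0.5):
    """Synthesise one sine tone."""
    t = np.arange(int(SAMPLE_RATE * duration_s)) / SAMPLE_RATE
    return amplitude * np.sin(2 * np.pi * freq_hz * t)

def oddball_sequence(n_tones=100, standard_hz=440.0, deviant_hz=494.0,
                     deviant_prob=0.1, seed=0):
    """Build mostly-standard tones with occasional pitch deviants.

    Returns the concatenated waveform plus a marker list recording
    which positions were deviants (the events an EEG analysis would
    be time-locked to).
    """
    rng = np.random.default_rng(seed)
    chunks, is_deviant = [], []
    gap = np.zeros(int(SAMPLE_RATE * 0.3))  # silence between tones
    for _ in range(n_tones):
        deviant = rng.random() < deviant_prob
        chunks.append(tone(deviant_hz if deviant else standard_hz))
        chunks.append(gap)
        is_deviant.append(deviant)
    return np.concatenate(chunks), is_deviant
```

In the analysis, the EEG responses averaged over the deviant events are compared with those to the standards; a reliable difference indicates that the brain registered the unexpected sound.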
Denham says that while this sort of technique has been used for many years on adults to measure preconscious detection of unexpected events, it has seldom been used with newborns. Its big advantage is that it works even when the subject is unconscious, so the fact that the babies were asleep was not a problem.
The results were exciting, demonstrating that newborns have a sense of pitch from birth rather than learning it through experience, as had previously been thought. The experiments showed they are even sensitive to the beat in music.
“The bottom line is we come into the world with brains that are continually looking for patterns, and telling us when there is something unexpected we should learn about,” Denham says.
István Winkler, who conducted the baby research, concludes this capability allows babies to learn about their environment and the important actors within it.
The discoveries may be applied to developing early screening techniques and treatments for cognitive hearing problems. The screening currently in use simply measures the degree of hearing loss, not the nuances of what people actually perceive.
“Research is needed to determine the norm—and how much variation there is from it—to prevent false diagnoses when a baby is simply developing slowly,” Denham says. Once that norm is established, it should be possible to spot defects at a very early stage and treat them while the brain is still malleable.
The research has also thrown new light on music cognition and brought practical benefits to the music technologists involved in the project.
“While it remains unclear whether a capacity for music is rooted in nature rather than nurture, it is clear that musical competence is a special human capacity, shared across ages and cultures,” says project partner Henkjan Honing.
Although the ability to detect musical patterns is present from birth, music cognition develops throughout life, and it is shaped not so much by formal musical expertise as by listening experience. “Frequent listening to a certain musical genre allows listeners without formal musical training to become experts in that musical style,” Honing says.
Details revealed by the experiments about the way the brain checks and adjusts its expectations made it possible to develop computer programs that mimic these processes.
Researchers in EmCAP developed a generic algorithm, basically a bit of smart software, able to detect violations of expected pitch and rhythmic structure, with tonality soon to be added to the list.
“We did the modelling at two levels, one trying to emulate brain function and perception in a simplified but still fairly detailed way, and the other tailored more for practical use in music processing systems,” Denham says.
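EmCAP’s actual models are not spelled out here, but the flavour of the simpler, practical level can be sketched in a few lines. In this illustrative Python fragment (the window size and threshold are assumptions of mine, not EmCAP parameters), the listener’s “expectation” is reduced to a running statistical summary of recent pitches, and a note is flagged when it falls far outside that summary:

```python
import statistics
from collections import deque

def detect_pitch_violations(pitches, window=8, threshold=2.0):
    """Flag notes whose pitch deviates strongly from recent context.

    pitches: a sequence of MIDI note numbers.
    A sliding window of recent notes stands in for the listener's
    'expectation'; a note is flagged as a violation if it lies more
    than `threshold` standard deviations from the window mean.
    """
    recent = deque(maxlen=window)
    violations = []
    for i, pitch in enumerate(pitches):
        if len(recent) >= 2:
            mean = statistics.fmean(recent)
            spread = statistics.stdev(recent) or 1.0  # avoid division by zero
            if abs(pitch - mean) / spread > threshold:
                violations.append(i)
        recent.append(pitch)
    return violations
```

For example, detect_pitch_violations([60] * 10 + [72] + [60] * 5) flags position 10, the out-of-context note. A real system would replace the running average with a learned predictive model and handle the rhythmic and tonal dimensions as well.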
What this will mean in practice is the future development of artificial cognitive music systems able to “listen” to music and produce a score in real time showing which instruments play which notes. Project partner Xavier Serra suggests that the next generation of music processors will be based on algorithms that imitate how humans process music.
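A first building block of any such system is tracking pitch from the audio signal. The following is a minimal sketch using autocorrelation, a standard baseline for monophonic pitch tracking; it is not claimed to be the method Serra’s group uses, and a full transcription system would add onset detection, instrument identification and polyphony.

```python
import numpy as np

def estimate_pitch(frame, sample_rate=44100, fmin=80.0, fmax=1000.0):
    """Estimate the fundamental frequency of a short audio frame by
    autocorrelation, then pick the strongest plausible period."""
    frame = frame - frame.mean()                       # remove DC offset
    autocorr = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    min_lag = int(sample_rate / fmax)                  # shortest period allowed
    max_lag = min(int(sample_rate / fmin), len(autocorr) - 1)
    lag = min_lag + int(np.argmax(autocorr[min_lag:max_lag]))
    return sample_rate / lag

def to_midi_note(freq_hz):
    """Map a frequency in Hz to the nearest MIDI note number (A4 = 69)."""
    return int(round(69 + 12 * np.log2(freq_hz / 440.0)))

# A 440 Hz test tone should come out as MIDI note 69 (A4).
t = np.arange(2048) / 44100
print(to_midi_note(estimate_pitch(np.sin(2 * np.pi * 440 * t))))
```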
Further projects are planned on the back of EmCAP, including one starting in March that will use sounds to detect behavioural patterns of living creatures.
Source: ICT Results