While your baby may not be able to appreciate Mozart from the womb, it may be able to start picking up on speech patterns earlier than previously thought.
A University of Washington (UW) scientist, working with teams at Pacific Lutheran University (PLU) and the Karolinska Institute in Sweden, is publishing a study on just how closely babies in the womb are paying attention.
Patricia Kuhl, professor and co-director of the Institute for Learning and Brain Sciences, co-authored a study that found babies just a few hours old were able to distinguish vowel sounds from their native language and a foreign language.
“They’re recognizing what is a familiar and recurring voice,” said Christine Moon, lead author and professor of psychology at PLU. “They’re learning a rhythm, melody, and difference in loudness that’s coming in as patterned and are able to notice what’s different from that. And now we know they’re grouping sounds together that are similar.”
Scientists measured the reactions of newborns at hospitals in Tacoma and in Sweden. The babies listened to the vowel sounds through headphones while sucking on a pacifier that controlled the playback.
“The sucking was recorded using a pacifier with an air sensor,” said Hugo Lagercrantz, another co-author and professor at the Karolinska Institute. “When the baby heard the sound and sucked, the sound was presented again, so the [babies] controlled the audio presentation of native or non-native vowels.”
Babies were more likely to use this control to hear the unfamiliar foreign vowels than their own native ones. According to Kuhl, the infants are eager to learn and “like novelty” after long exposure to their native language. The findings indicate that babies pick up these speech patterns during the last 10 weeks of pregnancy, learning previously thought not to occur until six months of age.
“We’ve pushed back the moment at which we can measure the first effects of language experience on the human brain to before birth, and that’s amazing,” Kuhl said. “We thought from previous work that with regard to phonetic perception — that is, perception of the sounds that make up words — infants were naive at birth, still unshaped by experience. Now we know that experience in the womb has already begun to change the human brain by the time the infant is born.”
Kuhl designed the experiment as a follow-up to her earlier work on how early language experience shapes the infant brain. The two sets of vowels were the same ones she had used in a previous, similar experiment with 6-month-olds. Vowel sounds are generally louder, longer, and more prominent than consonants.
The study continues a 30-year line of research on infant learning. Scientists already knew babies responded to running speech in their native language, but this is the first demonstration that newborns can distinguish specific speech sounds.
“They are operating on a few different levels of acoustics they’re receiving, which is pretty phenomenal because this is disembodied sound,” Moon said. “They can’t see a mouth moving, which adds a huge amount of context. All they have is outside noise, digestive and heartbeat sounds, and the voice of their mother coming through bone conduction. That’s quite a rich sound environment, yet their brain is working on the acoustic signal. It’s pretty dazzling, what the young brain is capable of.”
Reach reporter Zosha Millman at firstname.lastname@example.org. Twitter: @zosham