“It has to be social.” That’s the advice Patricia Kuhl gave to me and another CNS 2015 attendee following her riveting talk about language development. It doesn’t matter exactly when you introduce a new language to a child under 7, she said, as much as it matters that the learning is in a social setting.
Co-director of the University of Washington’s Institute for Learning and Brain Sciences, Kuhl is an expert in baby talk – having pioneered the use of new tools for measuring and understanding how language unfolds in a child’s mind. “We’re doing studies now we couldn’t have dreamed of 10-15 years ago,” she said in today’s talk, and those studies are yielding a concrete understanding of how learning works in the mind of a child.
On the occasion of her receiving the George A. Miller Award, Kuhl took us through a brief history of the field and then brought us quickly up to the present, with insights into not only how children learn language but also how that learning translates into other skills and behaviors in their future. She and colleagues are now using neuroimaging and genetics to better understand individual differences in learning.
Here are some highlights from her talk today (hint: it’s mostly about the social brain):
Most of us know, many firsthand, how hard it is to learn a language past childhood. Kuhl and colleagues have discovered that the “critical period” for learning language runs from birth until 7 years old. At age 7, language learning skills systematically decline, leading to a dramatic drop at puberty. From 6 to 12 months in particular, babies have an incredible ability to distinguish different sounds no matter what the language. Their listening skills make them citizens of the world. Over time, they become more culture-bound to the language they grew up around, just as adults are.
The early change in phonetic ability, starting around 12 months old, comes down to two types of learning, Kuhl explained: computational and social. Studies have found that babies implicitly learn statistics, simply by listening to the people around them. Even when researchers expose babies to a new language they have never heard, the babies can pick up on subtle changes in sound. This computational ability is unlike anything a computer can do.
Even how a mother speaks to a child helps in this computational learning. “Motherese” – exaggerating sounds when talking to babies – actually stretches acoustic differences that babies are sensitive to and helps the infant brain code various sounds.
Einstein baby? No: Babies are social and learn language from live interaction, not TV – Pat Kuhl #CNS2015
— CNS News (@CogNeuroNews) March 30, 2015
Statistics, however, are not enough for language learning. The second component, the social aspect, is also crucial.
Kuhl and colleagues have found that babies learn language best from live interaction. Babies watching TV or listening to audio alone did not learn to distinguish sounds the way those engaged with a live tutor did. In other studies, Kuhl has found that babies learn language best in social settings, even in pairs.
In new work coming out of her lab, her team found that babies who are able to track movement from their language tutor to a toy (they get a toy for correctly distinguishing sounds) show event-related potentials (ERPs) – electrical brain activity signals – that indicate more learning.
Babies who took a one-month, 12-session music class had greater activity in the prefrontal cortex than those who were in a social class with no music. Kuhl said it appears that music trains attention in babies, leading to better auditory precision.
Even though children under age 7 have an easier time learning a second language than adults do, some of us are better than others at learning a new language. Why is that?
Kuhl and her postdoc Ping Chao Mamiya are exploring that question through a combination of neuroscience and genetics. Mamiya, who presented the neurogenetics work earlier in the day at the Data Blitz, explained that executive function appears to play a prominent role.
Executive function, the ability to initiate and pursue goals, is something neuroscientists know is important in bilinguals. So Mamiya and Kuhl have been trying to determine what drives individual differences in executive function.
The answer, they have found, may lie in a gene known as COMT. In a study of Chinese students entering the University of Washington with no English training, they found that variations in this gene were linked to white matter connectivity in the brain and to varying abilities to learn English. The white matter connections appear to strengthen executive function, perhaps easing second language acquisition in certain individuals.
Kuhl: Finally, all this data have implications for autism #CNS2015
— Vukovic Nikola (@vukovicnikola) March 30, 2015
Kuhl and colleagues have found that how the brain of someone with autism spectrum disorder responds to words at age 2 predicts learning outcomes at age 6. This means, she says, that language serves as a broader signal of how people learn socially. (Earlier in the talk, she pointed to another predictive capability: babies’ ability at 7 months to distinguish sounds from the people around them predicted their reading readiness at age 5. Powerful stuff!) Kuhl is hopeful that psychology, neuroscience, education, and machine learning will together lead to a new science of learning.
-Lisa M.P. Munoz
For more on Kuhl’s work, read this 2013 Q&A with her.