Study: Touch influences how infants learn language
April 22, 2014
Amanda Seidl (standing), a Purdue associate professor of speech, language and hearing sciences who studies language acquisition, found that touch can influence how infants learn language. Her research is published in Developmental Science. Seidl is working with graduate student Rana Abu-Zhaya to continue studying the role touch plays. (Purdue University photo/Mark Simons)
WEST LAFAYETTE, Ind. — Tickling a baby's toes may be cute, but it's also possible that those touches could help babies learn the words of their language. Research from Purdue University shows that a caregiver's touch could help babies find words in the continuous stream of speech.
"We found that infants treat touches as if they are related to what they hear and thus these touches could have an impact on their word learning," said Amanda Seidl, an associate professor of speech, language and hearing sciences who studies language acquisition. "We think of touch as conveying affection, but our recent research shows that infants can relate touches to their incoming speech signal. Others have looked at the role of touch with respect to babies forming an attachment and physical development. But until now the impact of touch on language learning has not been explored."
The findings are published in Developmental Science and a video of Seidl explaining the research is available online. Seidl's research was supported by the National Science Foundation. She is interested in the multitude of cues or sources of information that babies may combine to learn their language. Learning words presents a challenge for infants since most of the words they hear are presented in a continuous stream of speech, rather than isolated words, by their caregivers.
"Parents may pause before saying an infant's name, but they almost never do so for other words. This research explored whether touches could help infants find where words begin and end in the continuous stream of speech. They need to find words before they can attach real meaning to them," Seidl said. "Because names of body parts are often among the first words that babies learn, and touching is often involved when caregivers talk about body parts, we speculated that touch could act as a cue to word edges."
A total of 48 English-learning 4-month-olds were tested at Purdue's Infant Speech Lab in two groups as they sat on a parent's lap facing an experimenter while listening to a pre-recorded continuous stream of speech made up of nonsense words. In the first experiment, every time one nonsense word, such as "dobita," was spoken, the experimenter touched the baby's knee. This occurred 24 times. Another word, "lepoga," was also played 24 times, but the infant was touched on the elbow only once during the playing of this word; the other 23 elbow touches occurred on other syllable sequences. Following this listening phase, the babies participated in a language preference study, and almost all showed that they had pulled "dobita" — the word reinforced by aligned touching — out of the continuous stream of speech.
In the second experiment, the same format of continuous speech and new words was played, but the experimenter touched his or her eyebrow or chin instead of the baby. The children in this experiment did not show that they had pulled out any words.
"It didn't matter how much time the infant spent looking at the experimenter's face; the babies were not able to use these cues in the same way as they were when their own body was touched," said Seidl, who is now looking at individual differences in how parents speak to and touch their babies.
"I am interested in whether we can predict babies' language later on from early measures of speech perception," Seidl said. "If we look at speech perception and learning in a 6-month-old can we predict their language ability at 3 years? If we can find out what kinds of learners young children are, we could target their learning environment to their learning style."
Also part of the research team are Ruth Tincoff, an assistant professor at Bucknell University, and former Purdue undergraduate student Christopher Baker and former Purdue graduate student Alejandrina Cristia.
Writer: Amy Patterson Neubert, 765-494-9723, email@example.com
Source: Amanda Seidl, firstname.lastname@example.org
College of Health and Human Sciences
Department of Speech, Language and Hearing Sciences
Parents interested in their infants participating in a language study at Purdue's Infant Speech Lab can find more information online.
Note to Journalists: Journalists interested in a copy of the journal article can contact Amy Patterson Neubert, Purdue News Service, at 765-494-9723, email@example.com
Why the body comes first: Effects of experimenter touch on infants' word finding
Amanda Seidl, Ruth Tincoff, Christopher Baker and Alejandrina Cristia
The lexicon of 6-month-olds comprises names and body part words. Unlike names, body part words do not often occur in isolation in the input. This presents a puzzle: How have infants been able to pull out these words from the continuous stream of speech at such a young age? We hypothesize that caregivers' interactions directed at and on the infant's body may be at the root of their early acquisition of body part words. An artificial language segmentation study shows that experimenter-provided synchronous tactile cues help 4-month-olds to find words in continuous speech. A follow-up study suggests that this facilitation cannot be reduced to the highly social situation in which the directed interaction occurs. Taken together, these studies suggest that direct caregiver-infant interaction, exemplified in this study by touch cues, may play a key role in infants' ability to find word boundaries, and that words linked with caregiver touches may comprise early vocabulary items.