Purdue News

 

AUDIO
George Hollich, an assistant professor of psychological sciences, talks about how background noise can affect infants' language development. (42 seconds)
Hollich explains how parents can help infants "hear" better with visual cues in noisy environments. (20 seconds)

RELATED INFO
Purdue Department of Psychological Sciences

June 15, 2005

Research: Noise, visual cues affect infants' language development

WEST LAFAYETTE, Ind. – Even moderate background noise can affect how infants learn language at an early and crucial time of their development, according to new research from Purdue University.

Photo: George Hollich tests an infant (caption and download link below)

"This research reaffirms how important it is for a child to see the face of a person while hearing him or her speak," says George Hollich, an assistant professor of psychological sciences. "This is the first study to show how children are easily distracted when the background noise is at the same loudness as the person talking to the child. We found that even soft noise can be a problem."

Hollich, who is director of Purdue's Infant Language Laboratory, teamed with Rochelle S. Newman, assistant professor of speech-language pathology at the University of Maryland, and the late Peter W. Jusczyk, a former professor at Johns Hopkins University. Their paper is published in the May/June issue of the journal Child Development.

Background noise in the average household – such as other children playing or watching television – can pose the same problem for children that an older adult with hearing loss encounters at a cocktail party.

"Older adults who are hard of hearing use their other senses, such as vision, to better understand speech," Hollich says. "We thought this might be what infants do when they are in a noisy environment. Struggling to hear can be annoying for adults, so just imagine how distracting it is for infants who are trying to learn a language.

"Unlike the printed word, speech doesn't use commas, spaces or periods to separate words and concepts. If there is more than one source of speech, it's especially hard for the infant to know when one word ends and another begins. That is why infants need to match what they hear with the movements of the speaker's face."

Hollich's four studies, conducted in 2002 at Johns Hopkins University, where he previously worked, analyzed how environmental noise affected 7-month-old infants during this stage of language development. The 116 infants were shown one of four videos of a woman talking while emphasizing a specific word, such as "cup." As the woman talked, a man spoke about an unrelated topic in the background.

In the first video, the audio matched what the woman was saying. In the second, it did not. In the third, a still frame of the speaker accompanied the audio. In the fourth, the audio was synchronized with an oscilloscope pattern that fluctuated in relation to the sound.

The infants then listened to two sets of words played from both their left and right sides. One set repeated the word "cup" from the earlier passage, and the other emphasized an irrelevant word, such as "dog." The length of time an infant directed its attention to the source of each sound was measured in seconds. Infants who watched the original video, in which they could see the woman's face while she spoke, focused on the word she had emphasized (in this case, "cup") an average of two seconds longer than on the irrelevant word.

"Two seconds is a long time when it comes to measuring an infant's attention span," Hollich says.

The infants who watched the second and third videos did not grasp the new word as well. However, in the fourth experiment with the synchronized oscilloscope, infants were able to pick out the emphasized word just as they did when looking at the speaker's face.

"Even though this last experiment did not show a person's face, the results emphasize that children learn best when they are gathering information with their eyes and ears," Hollich says. "The second and third experiments show that a second source of sound can interfere with that method, and now we know being able to see something along with the audio in a noisy situation plays an important part in infants' language development. Now, more research is needed to evaluate how much infants rely on their vision to supplement their hearing when learning a new language. For example, I am interested in knowing if infants use visual cues even when there is no background noise."

The studies were funded by the National Institute of Child Health and Human Development and the National Institute of Mental Health. Hollich's next studies will continue to look at variations of these experiments by testing how older infants and toddlers learn new words.

Writer: Amy Patterson Neubert, (765) 494-9723, apatterson@purdue.edu

Source: George Hollich, (765) 494-2224, ghollich@mac.com

Purdue News Service: (765) 494-2096; purduenews@purdue.edu

 

PHOTO CAPTION:
George Hollich, an assistant professor of psychological sciences, demonstrates the method used to test how infants learn language using both sound and visual cues. Ethan Harrington, 6 months old, is held by his mother, Kristie, while watching a video and listening to different sounds. Hollich found that infants hear better in noisy environments when they can see the face of the person talking to them. (Purdue News Service photo/David Umberger)

A publication-quality photo is available at https://www.purdue.edu/uns/images/+2005/hollich-infant.jpg

 


ABSTRACT

Infants' use of synchronized visual information
to separate streams of speech

George Hollich, Rochelle S. Newman, Peter W. Jusczyk

In four studies, 7.5-month-olds used synchronized visual/auditory correlations to separate a target speech stream when a distractor passage was presented at equal loudness. Infants succeeded in a segmentation task (using the headturn preference procedure with video familiarization) when a video of the talker's face was synchronized with the target passage (Experiment 1, N=30). Infants did not succeed in this task when an unsynchronized (Experiment 2, N=30) or static face (Experiment 3, N=30) was presented during familiarization. Infants also succeeded when viewing a synchronized oscilloscope pattern (Experiment 4, N=26), suggesting that their ability to make use of visual information is related to domain-general sensitivities to any synchronized auditory/visual correspondence.

 


 
