Purdue study shines light on auditory attention, cognition of aphasia patients

Written by: Tim Brouk, tbrouk@purdue.edu


Arianna LaCroix, assistant professor of speech, language, and hearing sciences, and Emily (Sebranek) Lenz, PhD student. (Purdue University Photo/Tim Brouk)

The Purdue University Department of Speech, Language, and Hearing Sciences (SLHS) is a leader in understanding and helping patients recover from stroke-related aphasia. The disorder is best known for affecting communication, often impairing a patient's ability to speak, write and understand language.

Arianna LaCroix, assistant professor in SLHS, collaborated with PhD student Emily (Sebranek) Lenz to develop an “auditory attention task” (AAT) that can measure how people pay attention to sounds — both the ones they’re trying to listen to and the ones that distract them. The goal of this work is to shine light on a neurological deficit that may be quietly affecting how people with aphasia listen, understand and express themselves.

“What is often left out of aphasia studies is the idea that cognition may also be impacted by the stroke,” LaCroix explained. “Typically, people are like, ‘Oh, well, aphasia is just a language disorder. They don’t have any difficulty paying attention or remembering information or planning and reasoning.’ And the literature and patient experience tell us that’s not true.”

“There’s a lot of evidence that suggests that in addition to having difficulty speaking and listening, people with aphasia also have impairments in attention, memory and executive function. We focus on attention in the lab because it’s the most foundational aspect of cognition. For example, to be able to maintain and manipulate information — a skill we call working memory — you have to first be able to pay attention to it.”

The attention task

The AAT is a listening test of sorts that helps measure different components of attention: alerting, the ability to become mentally ready when something is about to happen; orienting, the ability to focus on the sound you want to attend to; and executive control, the ability to stay focused and make a decision even when something is distracting.

In the AAT, participants listen to two tones and decide whether the first tone is short or long compared to the second. This decision-making taps executive control. The tones can also differ in pitch (frequency): when one tone is higher and the other lower, the pitch change involuntarily grabs the listener's attention, and they must quickly redirect their focus back to the first tone to judge its length. The tones were pitched within the frequency range of the human voice.
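
Based on the description above, a single trial can be pictured roughly as in the Python sketch below. The class, field names, and the specific durations and frequencies are illustrative assumptions, not the study's actual parameters.

```python
# Hypothetical sketch of one AAT trial's structure; values are illustrative.
from dataclasses import dataclass

@dataclass
class Trial:
    tone1_ms: int      # duration of the first (target) tone
    tone2_ms: int      # duration of the second (reference) tone
    tone1_hz: float    # pitch of the first tone
    tone2_hz: float    # pitch of the second tone

    @property
    def pitch_differs(self) -> bool:
        # A pitch change between the tones involuntarily engages orienting.
        return self.tone1_hz != self.tone2_hz

    def correct_answer(self) -> str:
        # The participant's judgment: is the first tone short or long
        # relative to the second? This comparison taps executive control.
        return "short" if self.tone1_ms < self.tone2_ms else "long"

# Example: a trial where a pitch change should grab the listener's attention.
trial = Trial(tone1_ms=150, tone2_ms=400, tone1_hz=220.0, tone2_hz=330.0)
print(trial.correct_answer(), trial.pitch_differs)  # -> short True
```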

“Slight pitch cues let listeners know if the speaker is asking a question or making a statement. They also help us contrast information to highlight what’s important in a sentence. So, paying attention to pitch is important for language,” LaCroix said. “That’s why we ultimately tried to cue the orienting system with pitch because we’re thinking about trying to use this to predict auditory language abilities.”

Half of the tone sequences administered were accompanied by a short burst of white noise. This noise acts like a warning signal, telling the brain to "get ready." Comparing performance on trials with and without this warning shows how well a person can prepare their attention in advance, providing a measure of alerting. As expected, participants performed better on the tone task when it followed the white noise.
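
One plausible way to score this comparison, sketched below with a hypothetical function name and toy data, is to subtract the mean reaction time on cued trials from the mean on uncued trials, so a positive value reflects the benefit of the warning.

```python
# Minimal sketch of scoring an alerting effect; names and data are hypothetical.
from statistics import mean

def alerting_effect(rt_no_cue_ms: list[float], rt_cued_ms: list[float]) -> float:
    """Positive values mean the white-noise warning sped responses up."""
    return mean(rt_no_cue_ms) - mean(rt_cued_ms)

# Toy data: cued trials a bit faster, in the direction the study found.
print(alerting_effect([612.0, 655.0, 630.0], [571.0, 598.0, 585.0]))
```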

Participants completed a total of 160 AAT trials, indicating each response with a keyboard button press. Because the task is self-paced, each button press started the next trial, and participants were instructed to be both fast and accurate. LaCroix and Lenz experimented with the task's timing parameters: how long the tones should last, how much time should separate them and how long the white noise alert should play. Because the brain responds on a timescale of milliseconds, varying a tone's duration from 150 milliseconds to 800 milliseconds is a significant change in LaCroix's research.
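
As a rough illustration of how such a schedule might be assembled, the sketch below crosses the cue and pitch conditions across 160 trials. The particular duration values are assumptions drawn from the 150-800 millisecond range mentioned above, not the study's actual design.

```python
# Hypothetical sketch of a 160-trial, self-paced AAT schedule.
import random

DURATIONS_MS = [150, 300, 500, 800]  # illustrative sampling of the range

def build_schedule(n_trials: int = 160, seed: int = 0) -> list:
    rng = random.Random(seed)
    trials = []
    for i in range(n_trials):
        tone1 = rng.choice(DURATIONS_MS)
        # Ensure the two tones differ so "short or long" has a correct answer.
        tone2 = rng.choice([d for d in DURATIONS_MS if d != tone1])
        trials.append({
            "cue": i % 2 == 0,                    # white-noise alert on half the trials
            "pitch_differs": rng.random() < 0.5,  # pitch change engages orienting
            "tone1_ms": tone1,
            "tone2_ms": tone2,
        })
    rng.shuffle(trials)  # self-paced order; each response launches the next trial
    return trials

schedule = build_schedule()
assert sum(t["cue"] for t in schedule) == 80  # exactly half are cued
```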

This first study was conducted with 48 young adults and was published in the Journal of Speech, Language, and Hearing Research. The goal of this first paper was to establish that the AAT was working as expected in a sample without attention difficulties. Subsequent testing of the AAT in older adults and people with aphasia is currently ongoing in the lab.

Tasked to help aphasia patients

LaCroix and Lenz hope their AAT can eventually be implemented in the clinic to better capture the multifaceted effects of aphasia.

“There’s really not a lot of work looking at auditory attention in aphasia,” LaCroix said. “I think it’s important to study auditory attention because we work with people that have had a left hemisphere stroke and some research suggests that the brain systems used for focusing on what we hear may rely more heavily on the left side of the brain than the systems used for visual attention. If that’s true, then damage from a left hemisphere stroke could make it harder for someone to pay attention to sounds — even before they try to understand or produce speech. Studying auditory attention may therefore help us uncover a hidden contributor to the communication difficulties many stroke survivors experience.”

