The Effect of Socioeconomic Status on How the Brain Processes Sound

A UConn researcher has found that children of parents with low levels of education hear differently than their peers.

Erika Skoe, assistant professor of speech, language, and hearing sciences, has found that children of parents with low levels of education hear differently than their peers, which could affect their performance in school. (Peter Morenus/UConn Photo)

A UConn professor and her colleagues at Northwestern University have found that adolescents from families that are low on the socioeconomic ladder process some sounds less accurately than their peers. The finding could have wide-ranging implications for learning, memory and reading comprehension.

“If your brain is treating a sound differently every time you hear it, then how will you learn the meaning of that sound? How will you learn to read effectively?” says assistant professor of speech, language and hearing sciences Erika Skoe.

Skoe and co-authors Nina Kraus, the study’s principal investigator, and Jennifer Krizman worked with teenagers in three Chicago-area public schools that serve students with low socioeconomic status. The researchers used the teens’ mothers’ self-reported education level, a common proxy for socioeconomic status, to separate the students into two groups: those whose mothers had a high school education or less, and those whose mothers had any post-secondary education.

Studies have shown that children whose parents have little education hear fewer, simpler words in their homes, have fewer interactive discussions with their family members, and experience more general ambient noise. Skoe wondered if this dearth of what she calls “auditory enrichment” would have an effect on how children’s brains work.

“We want to tap into how one’s experience with sound has wired or rewired the brain,” she says.


Sixty-six students in the two groups listened to auditory stimuli – single syllables that are easy to understand – while hooked up to electroencephalography (EEG) equipment. The EEG measured responses from the auditory brainstem, the lowest part of the brain where, Skoe says, responses are automatic.

“This part of your brain would respond whether you were awake, asleep, or even anesthetized,” says Skoe. Getting this response was important in understanding how the most fundamental brain processes are affected by one’s auditory landscape.

The brain waves produced by students in the low-maternal-education group turned out to be “noisier” than those of their higher-maternal-education peers: their brain signals represented sounds less accurately and were less consistent in their representations of the same sound over time.

In other words, students from low-maternal-education backgrounds were hearing sounds less accurately and with more variability.

Skoe says these findings have major implications for how students learn and process information.

“If your brain is creating a different signal each time you hear a sound, or if that signal is noisy, you might be losing some of the details of the sound,” she says. These details could make all the difference in the classroom.

But in some cases, says Skoe, auditory skills are honed in other ways. Studies have shown that being bilingual or playing a musical instrument can give students a leg up. Both are forms of auditory enrichment that may potentially mitigate the effects of socioeconomic status.

Acoustically enhancing the academic environment, through the use of assistive listening devices in the classroom, is another promising option, says Skoe. In a recent Northwestern study, teachers wore a microphone that piped their voice directly into an in-ear receiver used by dyslexic children in their class. The system helped to overcome background noise and classroom acoustics. After one year of using this system, the children’s brain responses improved.

“Modifying the auditory world for a particular student, even if just for a portion of the day, can improve academic performance and fine-tune how sound is automatically encoded in the brain,” says Skoe.

The biggest take-home message of her study, she says, is that the power of education can’t be overstated: it affects not only the person who is educated, but their children as well.

“If a parent’s education level can be improved just a little bit, it could make a big difference in her children’s lives,” she says. Even if educational advancement isn’t an option, Skoe encourages parents to engage in conversations with their children, and she points out that in some communities, parents can also take courses on how to better communicate with their children.

Skoe, who joined UConn this fall, is especially interested in working with schools in the Connecticut area on similar questions. If scientists can produce concrete data showing the origins of learning and reading problems, she hopes, state and national funding can be allocated to address them.

“We’re doing this science to understand what we’ve already been observing in the real world,” says Skoe.

The study appears online today in the Journal of Neuroscience.