In UConn’s Music Dynamics Lab in the College of Liberal Arts and Sciences, psychology professor Edward Large and his research team are exploring how music communicates emotion inside the brain.
Some scientists have theorized that music evokes emotion by tapping into deep-rooted psychological constructs that developed in the human psyche over the course of evolution.
But advances in neural imaging are revealing a much more complicated process behind music perception, cognition, and emotion. Neuroscientists like Large believe that music, rather than mimicking some other form of social or primal communication, speaks to the brain in a language all its own.
To understand what Large means, we need to step back for a moment and understand that today many scientists believe our minds don’t operate in a steady “stream” of consciousness so much as in oscillating, rhythmic “pulses.” Those pulses are created by clusters of neurons firing together, driving the brain waves that ripple across our minds at different frequencies as we think, feel, remember, and process information coming in through our senses.
“Our hypothesis is that music, because of its unique structure, oscillations, rhythm, and tempo, is somehow able to directly couple into these oscillating neural systems that are responsible for emotion,” Large says.
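To make the coupling idea concrete, here is a minimal toy sketch – an illustration of oscillator entrainment in general, not the lab’s actual model, and with invented parameter values – of a single phase oscillator whose phase is nudged toward that of a rhythmic stimulus at each time step:

```python
import numpy as np

# Toy entrainment sketch (illustrative only; all parameter values invented).
# A phase oscillator with natural frequency f0 is driven by a periodic
# stimulus at f_stim. The sin() coupling term pulls the oscillator's phase
# toward the stimulus phase (an Adler-type forced oscillator).

f0 = 2.2       # oscillator's natural frequency in Hz (hypothetical)
f_stim = 2.0   # stimulus frequency in Hz, e.g. a musical beat at 120 BPM
k = 1.5        # coupling strength in rad/s (hypothetical)
dt = 0.001     # integration time step in seconds
steps = 20000  # simulate 20 seconds

phase_osc = 0.0
phase_stim = 0.0
for _ in range(steps):
    dphi = 2 * np.pi * f0 + k * np.sin(phase_stim - phase_osc)
    phase_osc += dphi * dt
    phase_stim += 2 * np.pi * f_stim * dt

# If the oscillator has locked to the stimulus, this difference settles near
# a constant value instead of drifting without bound.
print(f"final phase difference: {(phase_osc - phase_stim) % (2 * np.pi):.3f} rad")
```

Here the coupling strength (1.5 rad/s) slightly exceeds the frequency mismatch (about 1.26 rad/s), so the printed phase difference converges to a constant: the oscillator locks to the beat. Weaken k below the mismatch and the two rhythms drift apart – a toy analogue of a stimulus failing to resonate.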
Nicole Flaig, a master’s student conducting research on music and emotion in Large’s lab, says: “It’s as if music speaks at the same level as the brain. You have frequencies coming from either the tone or the rhythm in music and those frequencies can, we believe, influence the frequencies of the brain. If those frequencies sound good to someone, it means they are resonating more with the parts of the brain that control emotion.”
“So if we have music that is doing this,” Large continues, “then it is literally going to resonate with your happy place and you are going to feel that feeling.”
Of course, Large will tell you the whole process is much, much more complicated than that. Both he and Flaig are eager to gain access to UConn’s new functional magnetic resonance imaging (fMRI) scanner to probe exactly how the brain’s various underlying neural systems react and connect with one another as we perceive and process musical stimuli.
The fMRI scanner allows researchers to see – and possibly find associations between – different areas of the brain responding to a stimulus at the same time.
Large believes that the research could have implications far beyond the music world. Music, as a highly structured, temporal means of communication, has much in common with language, he says. By studying the neural processes underlying the perception of musical pitch, rhythm, and tonality, he believes we can gain greater insight into how our minds process language and speech patterns.
“What we’re after is meaning, and how meaning is communicated,” says Large. “We believe music communicates meaning much more directly than speech. Speech has a lot of these abstract symbols called words, and that gets kind of complicated. Music doesn’t have that, which is why we believe we are going to understand music and emotion long before we understand speech and emotion.”
Large is already working with associate professor of psychology Inge-Marie Eigsti to see how his research into music communication and emotion may help individuals on the autism spectrum.
“One of the biggest problems for individuals with autism spectrum disorder is communication, especially emotional communication and understanding the emotions of the person they are talking to,” he says. “If we could better understand how emotion works in our minds, maybe we could understand what is going wrong in someone with autism spectrum disorder. Maybe there is a deficit in the rhythmicity of their brain, for instance.”
Another area of focus for the lab – and for Flaig in particular – is the role expectation plays in music communication. Flaig is exploring a theory that it is not just the tempo of music (slow or fast) that influences emotion, but the moments when that tempo changes that convey it.
In an earlier test, Large and Flaig asked people to listen to a piece of classical music performed by a pianist who inserted expressive nuances into the score, playing louder in some parts and softer in others. Then they asked the same individuals to listen to the piece performed by a machine that never varied the performance. The human performance elicited a greater emotional response, which the researchers attribute to the pianist’s deviations from listeners’ expectations – even though none of the participants knew the piece beforehand.
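For a rough sense of how such deviations could be quantified, here is a hypothetical back-of-the-envelope sketch (not the lab’s actual analysis; the onset times are invented) comparing the note-onset times of an expressive performance against a perfectly even, metronomic rendition of the same passage:

```python
import numpy as np

# Hypothetical sketch: quantify expressive timing by comparing note-onset
# times (in seconds) of a human performance against a metronomic one.
# All values here are invented for illustration.

bpm = 90
beat = 60.0 / bpm                  # nominal interval between onsets
machine = np.arange(8) * beat      # deadpan rendition: perfectly even onsets
human = machine + np.array(        # expressive rendition: slight rubato
    [0.00, 0.02, 0.05, 0.01, -0.03, -0.06, 0.00, 0.04])

deviation = human - machine        # signed timing deviation per note
print(f"mean absolute deviation: {1000 * np.abs(deviation).mean():.0f} ms")
print(f"largest slowdown:        {1000 * deviation.max():.0f} ms")
```

On this view, the emotional signal would live in the deviation profile itself – where the performance stretches or rushes relative to the listener’s metronomic expectation – rather than in the overall tempo.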
“We could see in the fMRI images greater activation of the limbic and paralimbic areas during the deviations – and those are the emotion processing areas,” says Large.
Flaig is interested in extending the research to look more closely at speech patterns. Previous research has centered mainly on the roles pitch and amplitude play in triggering emotional reactions to rhetorical speech. Flaig wants to see what role tempo – and, more specifically, changes in tempo – may play in shaping people’s emotional responses to an oratorical speech.