From There to Hear: Locating Sound Distance

Researchers at UConn Health have identified the mechanism by which rabbits and humans judge the distance between a listener and the source of a sound.


Mammals are good at figuring out which direction a sound is coming from, whether it’s a predator breathing down our necks or a baby crying for its mother. But how we judge the distance to that sound was a mystery until now.

Researchers from UConn Health report in the April 1 issue of the Journal of Neuroscience that echoes and fluctuations in volume (amplitude modulation) are the cues we use to figure out the distance between us and the source of a noise.

“This opens up a new horizon,” says Duck O. Kim, a neuroscientist at UConn Health.

Researchers have long understood how we can tell a sound’s direction – whether it’s to our left or right, front or back, and above or below us. But how we gauge its distance had remained unexplained.

“The third dimension of sound location was pretty much unknown,” says Kim.

All natural sounds, including speech, have amplitude modulation. Kim and his colleague Shigeyuki Kuwada, also a neuroscience researcher, suspected that amplitude modulation, and the way echoes muddy it, were together the key to perceiving a sound’s distance. To explore the idea, they used tiny microphones to record the sounds inside rabbits’ ears while sounds were played from different locations. They used these recordings to simulate modulated or unmodulated noise coming from different distances from the rabbit. Then Kim and Kuwada played the simulated sounds back to the rabbit and measured the responses of neurons in the rabbit’s inferior colliculus (IC), a region of the midbrain known to be important for sound perception.
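The stimuli themselves are simple to picture. The sketch below, in Python with NumPy, generates sinusoidally amplitude-modulated noise of the general kind described; the sampling rate, modulation rate, and depth are illustrative values, not the ones used in the study.

```python
import numpy as np

# Illustrative parameters only -- not the values used in the study.
fs = 44100            # sampling rate (Hz)
duration = 1.0        # seconds
mod_freq = 40.0       # modulation frequency (Hz)
mod_depth = 1.0       # depth of modulation: 0 = unmodulated, 1 = fully modulated

t = np.arange(int(fs * duration)) / fs
carrier = np.random.randn(len(t))        # wideband noise carrier

# Sinusoidally amplitude-modulated noise: the envelope rises and falls mod_freq times per second.
modulated = (1.0 + mod_depth * np.sin(2 * np.pi * mod_freq * t)) * carrier
unmodulated = carrier                    # comparison stimulus with a flat envelope
```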

When the rabbit heard the simulated sounds, certain types of IC neurons fired more when the sound was closer and the depth of modulation was higher – that is, when there was a bigger difference between the sound’s maximum and minimum amplitude.
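One common way to quantify that depth is the ratio of the envelope’s swing to its overall level, (max − min) / (max + min), which runs from 0 for a flat envelope to 1 for an envelope that dips all the way to silence. A minimal sketch of that calculation follows; it is illustrative only, not the study’s own analysis code.

```python
import numpy as np

def modulation_depth(envelope: np.ndarray) -> float:
    """Depth of modulation: 0 for a flat envelope, 1 when the envelope dips to silence."""
    a_max, a_min = envelope.max(), envelope.min()
    return (a_max - a_min) / (a_max + a_min)

# An envelope swinging between 0.2 and 1.8 has a depth of (1.8 - 0.2) / (1.8 + 0.2) = 0.8.
t = np.arange(0, 1, 1 / 44100)
print(modulation_depth(1.0 + 0.8 * np.sin(2 * np.pi * 40.0 * t)))   # prints roughly 0.8
```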

A human subject in an anechoic (echo-free) chamber at UConn Health. (Photo courtesy of Duck Kim)

Reverberations, or echoes, tend to degrade amplitude modulation, smoothing out the amplitude’s peaks and valleys. Almost any environment has echoes, as sounds bounce off walls, trees, the ground, and other objects. The farther a sound source is from the listener, the weaker the direct sound becomes relative to the echoes reaching the ear, and the more the depth of amplitude modulation is degraded.
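The sketch below illustrates that degradation under stated assumptions: a simple exponentially decaying noise “tail” stands in for a real room’s impulse response, and the direct-to-reverberant ratio is used as a rough proxy for distance. It is not the researchers’ simulation, just a toy version of the effect.

```python
import numpy as np

fs = 44100                                              # sampling rate (Hz); all values illustrative
t = np.arange(fs) / fs                                  # one second of samples
am_noise = (1.0 + np.sin(2 * np.pi * 40.0 * t)) * np.random.randn(fs)   # fully modulated noise

# Crude stand-in for a room's echoes: an exponentially decaying noise tail,
# normalized so the reverberant path carries roughly the same energy as the direct one.
tail = np.random.randn(fs // 2) * np.exp(-np.arange(fs // 2) / (0.1 * fs))
tail /= np.linalg.norm(tail)

def depth_after_reverb(direct_to_reverb_ratio: float) -> float:
    """Mix the direct sound with its reverberation and return the surviving modulation depth."""
    reverberant = np.convolve(am_noise, tail)[:len(am_noise)]
    mixed = direct_to_reverb_ratio * am_noise + reverberant
    envelope = np.convolve(np.abs(mixed), np.ones(441) / 441, mode="same")   # crude envelope estimate
    envelope = envelope[4410:-4410]                     # trim edges where the estimate is unreliable
    return (envelope.max() - envelope.min()) / (envelope.max() + envelope.min())

# A nearer source (more direct sound relative to its echoes) keeps more of its modulation.
for ratio in (10.0, 1.0, 0.1):
    print(f"direct-to-reverb ratio {ratio:>4}: modulation depth ~ {depth_after_reverb(ratio):.2f}")
```

Run as written, the printed depth should drop from near 1 when the direct sound dominates to a small fraction of that when the echoes dominate, mirroring the degradation the researchers describe.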

As expected, in the experiment the neurons fired progressively less as the sound moved farther away and the depth of amplitude modulation became more degraded.

Pavel Zahorik, an auditory researcher at the University of Louisville School of Medicine, tested the same amplitude-modulated noise on human volunteers and got the same results: people need both amplitude modulation and reverberation to figure out how far away a sound is. Without amplitude modulation, a person can’t tell how far away a noise is. Neither can she do it in an anechoic (echo-free) room.

“Reverberation is usually considered a bad thing,” detrimental to hearing clearly, says Kuwada. “But it is necessary and beneficial in order to recognize distance.”

Judging sound distance is a crucial survival skill, whether you’re a bunny or a human – is that monster breathing down my neck, or huffing and puffing 20 yards behind me? Do I have time to cross the street before that car I hear in the distance pulls around the bend?


Kim and Kuwada suggest that getting a better understanding of the acoustics and neuroscience of distance perception could contribute to making better hearing aids and prostheses, and perhaps reveal more subtle aspects of sound perception.

How the auditory system processes amplitude modulation is still not fully understood. Laurel Carney, a colleague at the University of Rochester, modeled the neural circuitry of the ear and inferior colliculus and replicated the neural firing patterns recorded by Kim and Kuwada. The researchers hope that tweaking the model will give them more insight into the neurons’ responses.

Kim and Kuwada’s next step will be a binaural (two-eared) study, tying the perception of distance together with the horizontal and vertical directions of sound.

This research was supported by several grants from the NIH to researchers at UConn Health, the University of Rochester, and the University of Louisville.