Behind the Scenes of Our Senses: Part One, Perception

By Lydia Denworth | November 11, 2014 | Psychology Today | Topics: Hearing and Sound, Science and Health

Perceiving the world looks, sounds, and feels easy. It isn’t.

One of the best parts of reporting and writing about science is the gee-whiz factor. As a regular part of my day, I stumble across facts and stories that make me say, “Wow, I didn’t know that.” Sometimes I am surprised by how much of what I learn has been right under my nose all along.

Take the question of perception. Our eyes, ears, skin, nose and mouth are all receptors. Everything that comes into the brain enters through one of these doors. Because most of us take the world in through our senses effortlessly, we don’t give much thought or attention to how we do this.

Even scientists were guilty of underappreciating the complexity of the senses. Back in the 1950s and 1960s, when computers were in their infancy, the thinking was that it would take a decade or so to build “perceiving machines” that could respond to sight, sound, touch and so on as well as a human being. Such a machine still doesn’t exist.

Lose a sense, however, and you will quickly appreciate what is missing. I know because that’s what happened to me when I found out my son was deaf. There was so much to learn about the way hearing works and the role of sound in the brain that I wrote a whole book about it. That was the long version.

I thought it would be interesting to do something similar in short form for all five senses with a series of posts taking a tour behind the scenes of hearing, vision, touch, taste, and smell. What has to happen to put on the show that is our awareness of our environment? The short answer is an awful lot.

Before I get to the individual senses, I’m going to look at the big picture. Neuroscientists have recently done some radical rethinking about the very nature of perception.

“Historically, the way we intuitively think about all perception is that we’re like a passive recording device with detectors that are specialized for certain things, like a retina for seeing, a cochlea for hearing, and so forth,” says David Poeppel, a professor of psychology and neural science at New York University and a director of the newly established Max Planck Institute for Empirical Aesthetics. “We’re kind of a camera or microphone that gets encoded somehow and then magically makes contact with the stuff in your head.”

At the same time, many of the big thinkers who pondered perception, as far back as the 19th-century German physician Hermann von Helmholtz, knew that couldn’t be quite right. If we reached for a glass or listened to a sentence, didn’t it help to be able to anticipate what might come next?

In the mid-to-late twentieth century, a handful of prominent researchers proposed models of perception that suggested that we engaged in “active sensing,” seeking out what was possible as we went along. Such ideas didn’t gain much traction until the past decade, when they suddenly became a hot topic in the study of cognition. What everyone is talking about today is the brain’s power of prediction.

On one level, prediction is just common sense, which may be one reason it didn’t get much scientific respect for so long. If you see your doctor in the doctor’s office, you recognize her quickly. If you see her in the grocery store dressed in jeans, you’ll be slower to realize you know her.

Predictable events are easy for the brain; unpredictable events require more effort. “Our expectations for what we’re going to perceive seem to be a critical part of the process,” says Greg Hickok, a neuroscientist at the University of California, Irvine. “It allows the system to make guesses as to what it might be seeing and to use computational shortcuts.”

In the old view of perception, a cascade of responses flows from the ear or the eye through the brain and ends with the ability to follow a complicated sentence or pick out the one person you are looking for in a crowded theater. That is known as bottom-up processing. It starts with basic input to any sense—raw data—and ends with such higher-level skills as reasoning, judgment, and critical thinking—in other words, our expectations and knowledge.

But that is only half the story. Neuroscientists now believe that the process is also happening in reverse, that the cascade flows both ways, with information being prepared, treated, and converted in both directions simultaneously, from the bottom up and the top down.

This holds for simple responses as well as for complex thinking about philosophy or physics. If a sound is uncomfortably loud, for instance, it is the cortex that registers that fact and sends a message all the way back to the cochlea to stiffen hair cells as a protective measure. The same is true of the retina, adjusting for the amount of light available. It’s not your eye or ear doing that, it’s your brain.

Imagine someone beating rhythmically on a table with a pencil: tap, tap, tap, tap. By the third beat, you have anticipated the timing. By the fourth, scientists like Poeppel and Hickok could see activity in the brain that represents that prediction.

Perception, then, is an active process of constructing a reality, a conversation between the senses and the cortex that balances new information from the outside world with predictions from the interior world of our brain.

Did you know that? I didn’t.

Coming Next: Hearing (predictably)


Parts of this post originally appeared in I Can Hear You Whisper: An Intimate Journey through the Science of Sound and Language (Dutton 2014).