SUNY Researchers Tease Apart the Processes that Govern Visual Attention

June 21, 2016


Look around you. What do you see? Perhaps a cluttered desk, a living room full of household items, a busy streetscape? Visualize yourself searching that space for your car keys, the television remote control or a speed-limit sign. Now imagine that you have suffered a head injury. You search and search for the object but cannot find it. Your brain simply refuses to cooperate.

Dr. Robert McPeek, associate professor of biological sciences at SUNY Optometry

SUNY Optometry’s Dr. Robert McPeek and Stony Brook University’s Dr. Greg Zelinsky are collaborating on a project to tease apart what happens in a normal brain when a person searches his or her environment for an object. The information will be useful in helping people who have suffered brain damage.

“People who’ve had head injuries often have problems with eye movement that can affect their ability to read, drive, or even just walk around,” said Dr. McPeek. “We’re trying to understand how eye movements work in the normal system first and once we understand the normal system we can start to think about ways to help people who have damage.”


A Model of the Brain

The first step in understanding a normal brain, Dr. McPeek said, is to figure out how people decide where to look. “If you’re out there in the world there are a million things you could choose to look at, but somehow our eye movements are very precise,” he said. “They take our eyes to the right place at the right time so we can pick up the information that we need to do whatever it is we’re trying to do. It’s not very well understood how that happens.”

To unravel this mystery, Drs. McPeek and Zelinsky are zeroing in on the part of the brain that is responsible for eye movement—the superior colliculus. Their goal is to create a model that will predict where activity should occur in the brain during the process of searching a visually complex scene for an object.

“Basically, we’re using knowledge about a part of the brain important for eye movement to better predict where attention is directed in everyday tasks,” said Dr. Zelinsky.

According to Dr. Zelinsky, who is spearheading the model’s development, the model will draw on existing knowledge about the neurophysiology of the superior colliculus to better predict where fixations will land when a viewer looks at visually complex displays of common objects. The model will also be used to predict the degree and distribution of activity in the superior colliculus as a subject views these displays while performing a task. These predictions will then be tested by Dr. McPeek, who runs the neuroscience laboratory that will make the neural recordings.
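The article does not describe the model’s internals, but a common way to formalize this kind of prediction is a “priority map”: a map over the scene that combines bottom-up salience with top-down similarity to the search target, smoothed to reflect the coarse population coding of the superior colliculus, whose peak marks the predicted next fixation. The sketch below is an illustration only; the function names, the weighting scheme, and the smoothing parameter are all assumptions, not the team’s actual model.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def predict_fixation(salience, target_similarity, w_target=0.7, sigma=2.0):
    """Hypothetical priority-map sketch: blend bottom-up salience with
    top-down target similarity, smooth the result to stand in for the
    coarse population coding of the superior colliculus, and return the
    peak location as the predicted next fixation.

    salience, target_similarity: 2-D arrays over image locations.
    w_target: assumed weight on target guidance vs. raw salience.
    sigma: assumed smoothing width mimicking collicular receptive fields.
    """
    priority = w_target * target_similarity + (1 - w_target) * salience
    population_map = gaussian_filter(priority, sigma=sigma)
    y, x = np.unravel_index(np.argmax(population_map), population_map.shape)
    return (y, x), population_map

# Toy 20x20 scene with one salient distractor and one target-like
# region; the predicted fixation should land on the target region.
rng = np.random.default_rng(0)
salience = rng.random((20, 20))
salience[5, 5] += 2.0                 # salient but irrelevant distractor
target_similarity = np.zeros((20, 20))
target_similarity[14, 12] = 3.0       # region that resembles the target
(fy, fx), _ = predict_fixation(salience, target_similarity)
print(f"Predicted fixation: ({fy}, {fx})")
```

In a sketch like this, raising `w_target` makes search more goal-driven, while lowering it lets salient distractors capture fixations, which is one way a model can generate testable predictions about both behavior and neural activity.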

Generating Real-World Data

The experiment begins with a rhesus macaque monkey. Dr. McPeek is currently training the monkey to search for an object, such as a teddy bear, in a realistic visual scene. “It doesn’t know exactly what the teddy bear looks like; there could be 10 different teddy bears,” he said. “The situation is similar to a person searching for a pencil on a cluttered desk.”

Next, Dr. McPeek will use information generated by Dr. Zelinsky’s model to strategically place electrodes in the monkey’s brain and record how frequently nearby neurons fire when the monkey is asked to search for a particular item. He will also measure the monkey’s eye movements using an infrared camera, which precisely tracks where the eye is looking. The team will compare the results to the predictions of the model.

“If the results match the predictions of the model, then we have evidence that maybe we understand how this brain area is wired and how it’s used,” said Dr. McPeek. “If the lab data show us something different from what’s in the model, we’ll know we have to change some of the model’s functions.”
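In practice, such a comparison could amount to correlating the activity the model predicts at each recording site with the firing rates actually measured there. The sketch below illustrates that kind of goodness-of-fit check; it is a hypothetical example, not the team’s actual analysis, and the numbers are invented.

```python
import numpy as np

def model_fit(predicted_rates, recorded_rates):
    """Hypothetical goodness-of-fit check: Pearson correlation between
    the firing rates a model predicts at each recording site and the
    rates actually recorded there."""
    predicted = np.asarray(predicted_rates, dtype=float)
    recorded = np.asarray(recorded_rates, dtype=float)
    return np.corrcoef(predicted, recorded)[0, 1]

# Invented firing rates (spikes/s) at five recording sites.
predicted = [12.0, 45.0, 8.0, 30.0, 22.0]
recorded = [15.0, 41.0, 10.0, 27.0, 25.0]
r = model_fit(predicted, recorded)
print(f"Prediction-data correlation: r = {r:.2f}")
# A high r would support the model's wiring assumptions; a low r
# would signal that some of the model's functions need revision.
```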

According to the researchers, previous studies of eye movement programming in the brain have used very simple stimuli, such as a dot on a blank screen. “Now we’re going from dots to car keys,” said Dr. McPeek.

Both researchers agree that the study is a major step toward gaining a basic understanding of brain function and will provide essential information needed to help people with vision problems.

The researchers received seed money from the SUNY Networks of Excellence to get started on their project. They aim to continue their work with additional outside funding and currently have a grant application pending at the US National Science Foundation’s program on Collaborative Research in Computational Neuroscience.

Adapted from an article produced by the SUNY Research Foundation