By Rachel Henderson
If you’ve ever been distracted and didn’t notice something right in front of you, you may have realized that what we “see” isn’t simply the result of light entering our eyes — attention plays a crucial role in what we perceive. It is well known that paying attention enhances visual perception, but how that actually happens in the brain has largely been a mystery. Now, a new study from Michael Silver’s lab has revealed that paying close attention to a moving object quiets ongoing slow fluctuations in brain activity, and that this quieting of internally generated activity is associated with better perception of objects in the outside world.
Silver is a professor of Optometry and Vision Science and Neuroscience at UC Berkeley, a member of the Helen Wills Neuroscience Institute, and director of the Berkeley Neuroscience PhD Program. The study, led by David Bressler (a Vision Science PhD Program alum) and co-authored by Ariel Rokem (a Neuroscience PhD Program alum), investigated how attention enhances visual perception by monitoring volunteers’ brain activity with functional MRI (fMRI) as they viewed images on a screen.
Silver says that it was previously known that attention increases the signal-to-noise ratio of activity in some brain areas, where the “signal” is the brain’s response to a visual stimulus and the “noise” is brain activity thought to be unrelated to processing the stimulus. In the new study, which was published in the Journal of Cognitive Neuroscience, the researchers set out to discover whether attention increases this ratio and the ability to perceive visual stimuli by increasing the signal, decreasing the noise, or some combination of the two. They were particularly intrigued by internally generated (endogenous) noise that consists of slow fluctuations in brain activity, which have been observed in many visual areas of the brain. These endogenous fluctuations can occur on a timescale of many seconds — very slow compared to many other brain signals — and their function is currently unknown.
Participants in the study were asked to keep their eyes on a fixation point in the middle of the screen while a wedge-shaped stimulus rotated around that point at a fixed frequency. This allowed the researchers to determine which parts of the participants’ cerebral cortex were responding to the image on the screen. Light from the outside world lands on the retina in a way that preserves spatial information about the visual scene, and this so-called retinotopic organization is maintained as visual information is transmitted from one brain area to another, resulting in multiple retinotopic “maps” of activity in visual areas of the cerebral cortex. The scientists were therefore able to identify the brain areas responding to the rotating wedge: those areas showed the same pattern and frequency of fMRI activity as the wedge moving on the screen.
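The logic of this kind of phase-encoded mapping can be sketched in a few lines of Python. This is only an illustration of the idea, not the study’s analysis code — the scan length, rotation frequency, voxel tuning, and noise level below are all made-up numbers. A voxel driven by the wedge shows a peak in its frequency spectrum at the rotation frequency, and the phase of that peak reflects the voxel’s preferred location in the visual field.

```python
import numpy as np

# Illustrative parameters only (assumed, not from the study):
n_volumes = 240                  # fMRI volumes, sampled once per second
rotation_freq = 8 / n_volumes    # wedge completes 8 rotations per scan

t = np.arange(n_volumes)

# A retinotopic voxel responds each time the wedge sweeps through its
# preferred location: a sinusoid at the rotation frequency whose phase
# encodes that location in the visual field.
preferred_phase = 1.2            # hypothetical voxel tuning, in radians
voxel = np.sin(2 * np.pi * rotation_freq * t + preferred_phase)
voxel = voxel + 0.5 * np.random.default_rng(0).standard_normal(n_volumes)

spectrum = np.fft.rfft(voxel)
freqs = np.fft.rfftfreq(n_volumes, d=1.0)
stim_bin = int(np.argmin(np.abs(freqs - rotation_freq)))

# The spectrum peaks at the stimulus (rotation) frequency...
peak_bin = int(np.argmax(np.abs(spectrum[1:]))) + 1   # skip the DC term
print(peak_bin == stim_bin)
# ...and the phase at that peak indicates the voxel's preferred angle.
print(np.angle(spectrum[stim_bin]))
```

Repeating this for every voxel and coloring each one by its recovered phase is, in essence, how a retinotopic map is drawn.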
As their brains were being scanned, the participants were instructed to do one of two tasks. In one task, they had to simply focus their attention on the fixation point and press a button whenever they saw a very small square (the target) appear in that location. In the other task, they were told to focus their attention on the moving wedge (while keeping their eyes on the fixation point) and to press a button whenever the target appeared at a random location within the wedge itself. This second task required the subjects to pay close attention to the wedge, since it was moving continuously, and the target was designed to be difficult to see.
Silver’s team then analyzed the fMRI data at different frequencies of brain activity. This allowed them to distinguish between the response evoked by the rotating wedge stimulus and the endogenous slow fluctuations that were unrelated to the stimulus. They did this analysis in several visually responsive, retinotopic regions of the cortex, as well as in a non-retinotopic region that served as a control for systemic fluctuations that might arise simply from physiological processes such as heart rate and breathing.
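As a rough illustration of this frequency-domain separation, consider a single synthetic voxel time series containing a stimulus-locked response plus a much slower fluctuation. The numbers below are invented for illustration and this is not the study’s pipeline: the amplitude at the stimulus frequency plays the role of the “signal,” while amplitude in the slow band below it plays the role of the endogenous “noise.”

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative numbers only: 240 volumes at 1 s sampling, wedge completing
# 8 rotations per scan, endogenous fluctuations at much lower frequencies.
n = 240
t = np.arange(n)
stim_freq = 8 / n                                    # stimulus (rotation) frequency

evoked = 1.0 * np.sin(2 * np.pi * stim_freq * t)     # stimulus-evoked response
endogenous = 2.0 * np.sin(2 * np.pi * (2 / n) * t)   # slow endogenous fluctuation
series = evoked + endogenous + 0.3 * rng.standard_normal(n)

amp = np.abs(np.fft.rfft(series)) * 2 / n            # amplitude spectrum
freqs = np.fft.rfftfreq(n, d=1.0)
stim_bin = int(np.argmin(np.abs(freqs - stim_freq)))

# "Signal": amplitude at the stimulus frequency.
# "Noise": largest amplitude in the slow band below the stimulus
# frequency (excluding the DC component), where the endogenous
# fluctuations live.
signal_amp = amp[stim_bin]
noise_amp = amp[1:stim_bin].max()

print(signal_amp, noise_amp)
```

Comparing these two quantities between attention conditions is the spirit of the analysis: attention should raise the first and shrink the second.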
They found that when the participants were paying attention to the moving wedge, the evoked response (signal) increased and the endogenous slow fluctuations (noise) decreased, specifically in the retinotopic areas, compared to when the participants just focused on the fixation point. Silver notes that the “suppression of the endogenous fluctuations was more robust. It happened more consistently across brain areas than the enhancement of the stimulus-evoked response.” This suggests that attention can both increase signal and decrease noise, but that the decrease in noise may be a more general phenomenon in the brain.
Indeed, only the decrease in slow endogenous fluctuations was actually correlated with how well the subjects did on the task, as determined by the number of times they correctly pushed a button to indicate they had seen the target in the wedge. “We found that even though attention does enhance the strength of this stimulus-evoked response in some brain areas, that’s not associated with better performance on the task. However, suppression of endogenous activity fluctuations strongly predicts performance on the task,” Silver says. He explains that people naturally have minute-to-minute variations in how well they do in this challenging perceptual task. “The participants were doing best at the task when their endogenous fluctuations were the smallest. So somehow they’re able to use their attentional systems to quiet these endogenous fluctuations, enabling them to perceive the stimulus better.”
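The brain-behavior relationship described here can be sketched as a toy correlation. The data below are simulated to mimic the qualitative finding — larger endogenous fluctuations, worse detection — and are not taken from the paper; the block count, amplitudes, and slope are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical per-block data: for each scanning block, the amplitude of
# the slow endogenous fluctuations and the hit rate on the target task.
n_blocks = 40
fluctuation_amp = rng.uniform(0.5, 2.0, n_blocks)
hit_rate = np.clip(
    0.9 - 0.25 * fluctuation_amp + 0.05 * rng.standard_normal(n_blocks),
    0.0, 1.0,
)

# Pearson correlation: a negative value means performance is best when
# the endogenous fluctuations are smallest, as the study reports.
r = np.corrcoef(fluctuation_amp, hit_rate)[0, 1]
print(r)
```

In the actual study it is this negative relationship — not the size of the evoked response — that predicted how well participants detected the target.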
Why our brains generate these slow endogenous fluctuations is still a mystery. “Whatever they are, they seem to interfere with sensory processing,” Silver says. “I don’t think the brain is burning through a lot of metabolic resources just to make patterns of activity that disrupt perception, so these patterns probably have some other useful internal cognitive function. … A better understanding of that would give us insight into the different cognitive processes that are going on at any given time and how the brain prioritizes certain ones over others. Sometimes the brain is more externally directed and focused on information from the outside world and generating perceptions. At other times, it’s downgrading the perceptual signals and doing something that is more internally directed.”
Silver says he is excited to do more studies to find out whether this dampening of internally generated activity is applicable to other types of attention, particularly in more naturalistic situations. He says another interesting question is whether people can learn how to quiet their slow endogenous fluctuations to enhance perception.
“Understanding how attention affects perception touches on many different important activities that we do in everyday life,” Silver says, giving the examples of classroom education and choosing items in a grocery store. He points out that in addition to attention deficit hyperactivity disorder, many other neurological conditions are also associated with difficulties with attention. “A better understanding of the brain mechanisms of attention could help guide treatment development” for these conditions in the future, he says.
- Bressler, D.W., Rokem, A., and Silver, M.A., “Slow Endogenous Fluctuations in Cortical fMRI Signals Correlate with Reduced Performance in a Visual Detection Task and Are Suppressed by Spatial Attention,” Journal of Cognitive Neuroscience, January 2020