How attention extracts objects from noise

Michael S Pratte et al. J Neurophysiol. 2013 Sep.

Abstract

The visual system is remarkably proficient at extracting relevant object information from noisy, cluttered environments. Although attention is known to enhance sensory processing, the mechanisms by which attention extracts relevant information from noise are not well understood. According to the perceptual template model, attention may act to amplify responses to all visual input, or it may act as a noise filter, dampening responses to irrelevant visual noise. Amplification allows for improved performance in the absence of visual noise, whereas a noise-filtering mechanism can only improve performance if the target stimulus appears in noise. Here, we used fMRI to investigate how attention modulates cortical responses to objects at multiple levels of the visual pathway. Participants viewed images of faces, houses, chairs, and shoes, presented in various levels of visual noise. We used multivoxel pattern analysis to predict the viewed object category, for attended and unattended stimuli, from cortical activity patterns in individual visual areas. Early visual areas, V1 and V2, exhibited a benefit of attention only at high levels of visual noise, suggesting that attention operates via a noise-filtering mechanism at these early sites. By contrast, attention led to enhanced processing of noise-free images (i.e., amplification) only in higher visual areas, including area V4, fusiform face area, mid-fusiform area, and the lateral occipital cortex. Together, these results suggest that attention improves people's ability to discriminate objects by de-noising visual input in early visual areas and amplifying this noise-reduced signal at higher stages of visual processing.

Keywords: decoding; equivalent noise; functional magnetic resonance imaging; noise reduction; primary visual cortex; selective attention.

Figures

Fig. 1.

Predictions of the perceptual template model. Performance is plotted as a function of external noise for attended (solid curves) and unattended (dashed curves) conditions. Performance could be a behavioral measure or the accuracy of fMRI pattern classification. A: pure amplification is evidenced by an attentional benefit at low noise levels, but no effect at high noise. B: pure noise filtering is evidenced by an attentional benefit at high levels of noise, but no effect at low noise levels. C: if both amplification and noise-filtering effects are present, then attention is expected to increase performance accuracy across all noise levels.
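
These three predictions can be made concrete with a few lines of code. The sketch below uses a simplified perceptual template model in the spirit of Lu and Dosher's formulation, in which amplification scales the whole input (equivalent to reducing internal additive noise) and filtering attenuates external noise before it contributes variance; the omission of multiplicative internal noise and all parameter values are illustrative assumptions, not the model as fitted in the paper.

```python
# A minimal sketch of the perceptual template model (PTM) predictions in
# Fig. 1. The simplified form (no multiplicative internal noise) and the
# parameter values are illustrative assumptions.
import numpy as np

def ptm_dprime(n_ext, signal=1.0, n_add=0.5, gamma=2.0, a_amp=1.0, a_filt=1.0):
    """d' as a function of external noise contrast.

    Amplification scales signal and external noise together, which is
    equivalent to dividing internal additive noise by a_amp; noise
    filtering divides external noise by a_filt before it adds variance.
    """
    return signal ** gamma / np.sqrt(
        (n_ext / a_filt) ** (2 * gamma) + (n_add / a_amp) ** 2)

n_ext = np.linspace(0.0, 2.0, 5)
for title, attn in [("A: pure amplification", dict(a_amp=1.5)),
                    ("B: pure noise filtering", dict(a_filt=1.5)),
                    ("C: both mechanisms", dict(a_amp=1.5, a_filt=1.5))]:
    benefit = ptm_dprime(n_ext, **attn) - ptm_dprime(n_ext)
    print(title, np.round(benefit, 2))
# A: the attended-minus-unattended benefit shrinks toward 0 at high noise.
# B: the benefit is 0 at zero noise and grows with external noise.
# C: the benefit is present at every noise level.
```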

Fig. 2.

Experimental stimuli and cortical regions of interest. A: stimuli consisted of images of shoes, chairs, faces, and houses. The left column shows images with zero noise; the middle and right columns show the same images with 50% and 75% noise, respectively. The objects were still identifiable in the highest noise condition. B: a flattened representation of the right occipital lobe for 1 participant. The activity map shows significant BOLD activity in the visual localizer experiment. VO, ventral occipital cortex; LO, lateral occipital cortex; FFA, fusiform face area; mFus, mid-fusiform area.

Fig. 3.

Mean BOLD activity for each noise condition, plotted by visual area separately for unattended (A) and attended (B) conditions. Error bars denote ±1 SE. Raw BOLD amplitudes were converted to percent signal change units by dividing the mean time course of each voxel by the mean signal for that voxel over the first 16 s of fixation that began each run.
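
The normalization described in this caption amounts to a per-voxel baseline division. The sketch below illustrates one way to implement it; the data layout (timepoints x voxels), the 2-s TR, and the function name are assumptions for illustration.

```python
# A minimal sketch of the percent-signal-change conversion described in the
# caption above: each voxel's time course is normalized by its mean over the
# initial 16 s of fixation. The TR and array layout are assumptions.
import numpy as np

def percent_signal_change(run_data, tr=2.0, baseline_secs=16.0):
    """Convert raw BOLD time courses (timepoints x voxels) to percent
    signal change relative to each voxel's initial fixation mean."""
    n_base = int(round(baseline_secs / tr))      # volumes in the 16-s baseline
    baseline = run_data[:n_base].mean(axis=0)    # per-voxel fixation mean
    return 100.0 * (run_data - baseline) / baseline

# Synthetic example: one run of 120 timepoints over 500 voxels
rng = np.random.default_rng(0)
raw = 1000.0 + rng.normal(0.0, 5.0, size=(120, 500))
psc = percent_signal_change(raw)
print(psc[:8].mean())  # the fixation period itself should average near 0
```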

Fig. 4.

fMRI pattern classification accuracy plotted across visual areas for each noise level in the unattended (A) and attended (B) conditions. Error bars denote standard errors. Data are in d′ units (left axis); corresponding 4-choice accuracy is shown on the rightmost axis.
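
Pairing a d′ axis with a 4-choice accuracy axis conventionally relies on the m-alternative forced-choice relation of Hacker and Ratcliff (1979). Assuming that mapping is the one used here, it can be computed as follows.

```python
# A sketch of the standard relation between d' and proportion correct in an
# m-alternative forced-choice task:
#   P(correct) = integral of phi(x - d') * Phi(x)^(m - 1) dx
# That this exact mapping links the two axes of Fig. 4 is an assumption; it
# is the conventional way to pair a d' axis with an m-choice accuracy axis.
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

def mafc_accuracy(dprime, m=4):
    """Proportion correct in an m-AFC task for sensitivity d'."""
    integrand = lambda x: norm.pdf(x - dprime) * norm.cdf(x) ** (m - 1)
    return quad(integrand, -np.inf, np.inf)[0]

for d in (0.0, 0.5, 1.0, 2.0):
    print(f"d' = {d:.1f} -> 4-choice accuracy = {mafc_accuracy(d):.3f}")
# d' = 0 yields chance performance (0.25 for 4 categories).
```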

Fig. 5.

Measures of noise filtering and amplification for individual visual areas. A: attention effect (attended minus unattended classification performance) for each visual area in the high-noise condition. B: attention effect in the no-noise condition. C: parameter estimates of noise filtering from perceptual template model (PTM) fits to each visual area. D: parameter estimates of amplification from PTM fits. Parameter estimates that significantly exceed 1.0 indicate a reliable multiplicative effect; asterisks and black circles denote significant (P < 0.05) and marginally significant (P < 0.1) effects, respectively. Error bars denote standard errors.

Fig. 6.

Classification accuracy in each visual area plotted as a function of external noise level for attended and unattended conditions. Points denote the data with standard errors; asterisks and black circles denote statistically significant (P < 0.05) and marginally significant (P < 0.1) attentional effects, respectively, for each area and noise level. Curves indicate PTM fits to the fMRI classification data, obtained by generating predictions for each participant and averaging over participants. Whereas the pattern of results in areas V1 and V2 is consistent with pure noise reduction, higher-level areas exhibit both noise reduction and amplification.
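
Such fits can be obtained by jointly fitting attended and unattended d′ values across noise levels. The sketch below does this for the simplified PTM used in the earlier sketch; the synthetic d′ values, the free-parameter set, and the bounds are assumptions, and the authors' fitting procedure may differ in detail.

```python
# A minimal sketch of jointly fitting a simplified PTM to attended and
# unattended classification d' across external-noise levels. The data are
# synthetic, for illustration only.
import numpy as np
from scipy.optimize import least_squares

def ptm(n_ext, signal, n_add, gamma, a_amp=1.0, a_filt=1.0):
    return signal ** gamma / np.sqrt(
        (n_ext / a_filt) ** (2 * gamma) + (n_add / a_amp) ** 2)

# Noise levels (proportion of noise pixels) and example d' per condition
noise = np.array([0.0, 0.5, 0.75])
d_unatt = np.array([1.10, 0.60, 0.30])   # synthetic values
d_att = np.array([1.15, 0.95, 0.70])

def residuals(params):
    signal, n_add, gamma, a_amp, a_filt = params
    pred_u = ptm(noise, signal, n_add, gamma)                 # baseline model
    pred_a = ptm(noise, signal, n_add, gamma, a_amp, a_filt)  # with attention
    return np.concatenate([pred_u - d_unatt, pred_a - d_att])

fit = least_squares(residuals, x0=[1.0, 0.5, 1.0, 1.0, 1.0],
                    bounds=([0.01, 0.01, 0.1, 0.1, 0.1], [10, 10, 5, 10, 10]))
signal, n_add, gamma, a_amp, a_filt = fit.x
print(f"amplification a_amp = {a_amp:.2f}, noise filtering a_filt = {a_filt:.2f}")
# Estimates reliably above 1.0 would indicate amplification or noise
# filtering, matching the criterion described for Fig. 5.
```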

Fig. 7.

A pattern classifier was trained on the zero-noise unattended condition and used to classify object category in all other conditions. The results are similar to those of the above analyses, in which classifiers were trained and tested separately within each condition, suggesting that the nature of object-category information is qualitatively similar across attention and noise conditions, but that the strengths of these patterns are modified by attention and noise.
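
The sketch below illustrates this train-once, test-everywhere scheme; the linear support vector machine, scikit-learn, the synthetic "voxel" patterns, and the signal-strength values are stand-in assumptions rather than the authors' pipeline.

```python
# A minimal sketch of the cross-condition decoding scheme described above:
# train a classifier on the zero-noise unattended condition, then test it on
# other conditions. The data generator and strength values are assumptions
# standing in for how attention and noise modulate pattern strength.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(1)
n_trials, n_voxels, n_categories = 80, 50, 4  # faces, houses, chairs, shoes
prototypes = rng.normal(0.0, 1.0, (n_categories, n_voxels))  # category patterns

def fake_patterns(strength):
    """Synthetic voxel patterns whose category signal scales with `strength`."""
    y = rng.integers(0, n_categories, n_trials)
    X = strength * prototypes[y] + rng.normal(0.0, 1.0, (n_trials, n_voxels))
    return X, y

# Train once, on the zero-noise unattended condition...
X_train, y_train = fake_patterns(strength=0.3)
clf = make_pipeline(StandardScaler(), LinearSVC())
clf.fit(X_train, y_train)

# ...then test on conditions where attention and noise rescale the signal.
for label, strength in [("attended, no noise", 0.5),
                        ("unattended, 50% noise", 0.15),
                        ("attended, 75% noise", 0.1)]:
    X_test, y_test = fake_patterns(strength)
    print(f"{label}: accuracy = {clf.score(X_test, y_test):.2f}")
```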
