Abstract
Perceptual experience results from a complex interplay of bottom-up input and prior knowledge about the world, yet the extent to which knowledge affects perception, the neural mechanisms underlying these effects, and the stages of processing at which these two sources of information converge are still unclear. In a series of experiments we show that language, in the form of verbal cues, both aids recognition of ambiguous “Mooney” images and improves objective visual discrimination performance. We then used electroencephalography (EEG) to better understand the mechanisms of this effect. The improved discrimination of previously labeled images was accompanied by a larger occipital-parietal P1 evoked response to the meaningful versus meaningless target stimuli. Time-frequency analysis of the interval between the two stimuli (just prior to the target stimulus) revealed increases in the power of posterior alpha-band (8-14 Hz) oscillations when the meaning of the stimuli to be compared was trained. The magnitudes of the prestimulus alpha difference and the P1 amplitude difference were positively correlated across individuals. These results suggest that prior knowledge prepares the brain for upcoming perception via the modulation of prestimulus alpha-band oscillations, and that this preparatory state influences early (~120 ms) stages of visual processing.
Footnotes
The authors declare no conflicts of interest.
This project was supported by NSF-PAC 1331293 to G.L. and by MH095984 to Bradley R. Postle.
* These values are quite different from the peak amplitudes in the waveform traces in Fig. 4B because the grand means reflect the average of peaks occurring at different latencies on different trials and so the amplitudes are lower.