Reward has significant impacts on behavior and perception, and numerous studies have suggested a close relationship between reward and attention. However, it remains largely unknown to what extent this relationship depends on consciousness, because all previous work used perceptually distinguishable visual cues to associate with different reward values. Here we developed a novel method to resolve this issue: the monetary rewarding and non-rewarding visual cues were rendered identical to each other except for their eye-of-origin information, so the reward coding system could not rely on conscious perception to select the visual cue associated with monetary reward. In our first experiment, subjects completed this eye-based reward training in an inter-ocular suppression paradigm. Surprisingly, targets presented to the rewarded eye broke into awareness faster than those presented to the non-rewarded eye. This eye-specific reward learning effect emerged quickly during training and disappeared immediately in a reward-absent post-test. Although the effect was independent of consciousness, it was not observed when top-down attention was distracted from the reward training task by a simultaneous rapid serial visual presentation (RSVP) task, suggesting an important role of attention in generating this effect. When reward was associated with both the eye-of-origin and the orientation of the target, we found both an eye-specific and an orientation-specific learning effect. Additional control experiments further revealed that the eye-specific reward learning effect was absent for monocular reward training without inter-ocular suppression when subjects were also unaware of the difference between the rewarding and non-rewarding targets. Together, these findings suggest that the human reward coding system can produce two different types of reward-based learning: one induces unsupervised effects independent of consciousness yet still consumes attentional resources; the other results from volitional selection guided by top-down attention.
- Downloaded 213 times