Visual awareness is a specific form of consciousness. Binocular rivalry, the alternation of visual consciousness that results when the two eyes view differing stimuli, allows one to investigate visual awareness experimentally. Observers usually indicate the gradual changes of conscious perception in binocular rivalry by a binary measure: pressing a button. In our experiments, however, we used gradual measures such as pupil and joystick movements and found reactions starting around 590 ms before observers pressed a button, apparently accessing even pre-conscious processes. These gradual measures permit monitoring the gradual build-up of decision processes, which should therefore not be considered abrupt events. This is best illustrated by the fact that a decision process may start but then stop before an action has been taken, which we call an abandoned decision process here. Changes in analog measures that occur before the button press by which observers communicate that a decision has been reached do not prove that these decisions are taken by a force other than the observer, hence eliminating "free will"; they merely show that decisions are prepared "pre-thresholdly," before the observer considers the decision as taken.
Endogenous attention is the cognitive function that selects the relevant pieces of sensory information needed to achieve goals, and it is known to be controlled by dorsal fronto-parietal brain areas. Here we expand this notion by identifying an attentional control area located in the temporal lobe. By combining a demanding behavioral paradigm with functional neuroimaging and diffusion tractography, we show that, like fronto-parietal attentional areas, the human posterior inferotemporal cortex exhibits significant attentional modulatory activity. This area is functionally distinct from surrounding cortical areas and is directly connected to parietal and frontal attentional regions. These results show that attentional control spans three cortical lobes and bridges large distances through fiber pathways that run orthogonally to the dominant anterior-posterior axes of sensory processing, thus suggesting a different organizing principle for cognitive control.
We tested how well barn owls can discriminate objects of different sizes, an ability that may be important for the owls when catching prey. We performed a quantitative experiment in the laboratory, training owls in a task in which they had to discriminate whether two rhombi presented simultaneously on a computer monitor were of the same or of different sizes. We obtained full data sets with two experienced owls and one data point with a third owl. For objects sufficiently larger than the spatial resolution of the barn owl, the angular discrimination threshold increased in proportion to object size, implying that the discrimination followed Weber's law. The Weber fractions we determined ranged from 0.026 to 0.09. For object sizes close to the spatial resolution, performance degraded. We conducted similar experiments with human subjects. Human thresholds showed the same dependence on object size, albeit down to smaller object sizes, with Weber fractions ranging from 0.025 to 0.036. The differences between owls and humans could be explained by the much higher spatial acuity of humans compared with owls.
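For reference, the Weber fraction quoted above is the just-discriminable size difference divided by the reference size. A minimal worked example follows; the 10° reference object is a hypothetical value chosen purely for illustration, and the fractions used are the lower bounds of the ranges reported above, not additional results from the study.

\[
  k = \frac{\Delta s}{s}
  \quad\Rightarrow\quad
  \Delta s_{\text{owl}} = 0.026 \times 10^{\circ} = 0.26^{\circ},
  \qquad
  \Delta s_{\text{human}} = 0.025 \times 10^{\circ} = 0.25^{\circ},
\]

where \(k\) is the Weber fraction, \(s\) the reference object size, and \(\Delta s\) the smallest size difference the observer can reliably detect; under Weber's law, \(k\) stays constant as \(s\) varies.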