Individuals frequently choose between accomplishing goals using unaided cognitive abilities or offloading cognitive demands onto external tools and resources. For example, in order to remember an upcoming appointment, one might rely on unaided memory or create a reminder by setting a smartphone alert. Setting a reminder incurs both a cost (the time/effort to set it up) and a benefit (increased likelihood of remembering). Here we investigate whether individuals weigh such costs/benefits optimally or show systematic biases. In 3 experiments, participants performed a memory task where they could choose between (a) earning a maximum reward for each remembered item, using unaided memory; or (b) earning a lesser amount per item, using external reminders to increase the number remembered. Participants were significantly biased toward using external reminders, even when they had a financial incentive to choose optimally. Individual differences in this bias were stable over time, and predicted by participants' erroneous metacognitive underconfidence in their memory abilities. Bias was eliminated, however, when participants received metacognitive advice about which strategy was likely to maximize performance. Furthermore, we found that metacognitive interventions (manipulation of feedback valence and practice-trial difficulty) yielded shifts in participants' reminder bias that were mediated by shifts in confidence. However, the bias could not be fully attributed to metacognitive error. We conclude that individuals have stable biases toward using external versus internal cognitive resources, which result at least in part from inaccurate metacognitive evaluations. Finding interventions to mitigate these biases can improve individuals' adaptive use of cognitive tools.
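To make the cost-benefit trade-off in this design concrete, here is a minimal sketch of the expected-value comparison participants implicitly face. All numerical values (item count, per-item rewards, accuracy levels) are illustrative assumptions and are not taken from the experiments themselves.

```python
# Illustrative sketch of the cost-benefit comparison participants face.
# All numbers below are hypothetical assumptions, not values from the study.

def expected_points(n_items, value_per_item, p_remember):
    """Expected reward for one strategy: items remembered times reward per item."""
    return n_items * p_remember * value_per_item

# Hypothetical parameters for a single block of the task.
n_items = 10
internal_value = 10   # maximum reward per item, unaided memory
external_value = 6    # reduced reward per item when reminders are allowed
p_internal = 0.55     # assumed accuracy relying on memory alone
p_external = 0.95     # assumed accuracy with external reminders

ev_internal = expected_points(n_items, internal_value, p_internal)
ev_external = expected_points(n_items, external_value, p_external)

optimal = "reminders" if ev_external > ev_internal else "unaided memory"
print(f"Unaided memory: {ev_internal:.1f} points; reminders: {ev_external:.1f} points")
print(f"Value-maximising choice: {optimal}")
```

On this framing, the reported reminder bias corresponds to preferring the external strategy even when unaided memory has the higher expected value, which is what one would expect if participants underestimate p_internal, that is, if they are underconfident in their own memory.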
Ability in various cognitive domains is often assessed by measuring task performance, such as the accuracy of a perceptual categorization. A similar analysis can be applied to metacognitive reports about a task to quantify the degree to which an individual is aware of his or her success or failure. Here, we review the psychological and neural underpinnings of metacognitive accuracy, drawing on research in memory and decision-making. These data show that metacognitive accuracy is dissociable from task performance and varies across individuals. Convergent evidence indicates that the function of the rostral and dorsal aspect of the lateral prefrontal cortex (PFC) is important for the accuracy of retrospective judgements of performance. In contrast, prospective judgements of performance may depend upon medial PFC. We close with a discussion of how metacognitive processes relate to concepts of cognitive control, and propose a neural synthesis in which dorsolateral and anterior prefrontal cortical subregions interact with interoceptive cortices (cingulate and insula) to promote accurate judgements of performance.
Metacognition refers to the ability to reflect on and monitor one’s cognitive processes, such as perception, memory and decision-making. Metacognition is often assessed in the lab by whether an observer’s confidence ratings are predictive of objective success, but simple correlations between performance and confidence are susceptible to undesirable influences such as response biases. Recently an alternative approach to measuring metacognition has been developed (Maniscalco & Lau, 2012) that characterises metacognitive sensitivity (meta-d′) by assuming a generative model of confidence within the framework of signal detection theory. However, current estimation routines require an abundance of confidence rating data to recover robust parameters, and only provide point estimates of meta-d′. In contrast, hierarchical Bayesian estimation methods provide opportunities to enhance statistical power, incorporate uncertainty in group-level parameter estimates and avoid edge-correction confounds. Here I introduce such a method for estimating metacognitive efficiency (meta-d′/d′) from confidence ratings and demonstrate its application for assessing group differences. A tutorial is provided on both the meta-d′ model and the preparation of behavioural data for model fitting. Through numerical simulations I show that a hierarchical approach outperforms alternative fitting methods in situations where limited data are available, such as when quantifying metacognition in patient populations. In addition, the model may be flexibly expanded to estimate parameters encoding other influences on metacognitive efficiency. MATLAB software and documentation for implementing hierarchical meta-d′ estimation (HMeta-d) can be downloaded at https://github.com/smfleming/HMeta-d.
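To illustrate the kind of data preparation the tutorial covers, the following minimal sketch (written in Python for illustration; it is not the HMeta-d toolbox, and the helper names are hypothetical) bins trial-by-trial stimulus, response, and confidence data into the response-count format commonly passed to meta-d′ fitting routines, and computes an ordinary type-1 d′ for reference.

```python
# Minimal sketch (assumptions, not the HMeta-d API): bin trial-level data from a
# two-alternative task into confidence-rating counts and compute type-1 d'.
from scipy.stats import norm

def trials_to_counts(stimuli, responses, confidence, n_ratings):
    """Count each response x confidence combination, split by stimulus class.

    stimuli/responses are 0 or 1; confidence is 1..n_ratings.
    Returns (nR_S1, nR_S2), each ordered from high-confidence "S1" responses
    to high-confidence "S2" responses, the layout typically expected by
    meta-d' fitting code.
    """
    nR_S1 = [0] * (2 * n_ratings)
    nR_S2 = [0] * (2 * n_ratings)
    for s, r, c in zip(stimuli, responses, confidence):
        # indices 0 .. n_ratings-1: "S1" responses, high -> low confidence
        # indices n_ratings .. 2*n_ratings-1: "S2" responses, low -> high confidence
        idx = (n_ratings - c) if r == 0 else (n_ratings - 1 + c)
        (nR_S1 if s == 0 else nR_S2)[idx] += 1
    return nR_S1, nR_S2

def type1_dprime(stimuli, responses):
    """Standard SDT d' from hit and false-alarm rates, with a simple edge correction."""
    hits = sum(1 for s, r in zip(stimuli, responses) if s == 1 and r == 1)
    fas = sum(1 for s, r in zip(stimuli, responses) if s == 0 and r == 1)
    n_s2 = sum(stimuli)
    n_s1 = len(stimuli) - n_s2
    hr = (hits + 0.5) / (n_s2 + 1)   # log-linear correction avoids rates of 0 or 1
    far = (fas + 0.5) / (n_s1 + 1)
    return norm.ppf(hr) - norm.ppf(far)

# Toy data: 8 trials, binary stimulus/response, 4-point confidence scale.
stimuli    = [0, 0, 1, 1, 0, 1, 1, 0]
responses  = [0, 1, 1, 1, 0, 0, 1, 0]
confidence = [4, 2, 3, 4, 1, 2, 4, 3]

nR_S1, nR_S2 = trials_to_counts(stimuli, responses, confidence, n_ratings=4)
print("nR_S1:", nR_S1)
print("nR_S2:", nR_S2)
print("type-1 d':", round(type1_dprime(stimuli, responses), 3))
```

In the hierarchical approach described above, count vectors of this form for every participant would then be fit jointly, with metacognitive efficiency summarised as the ratio meta-d′/d′ at the subject and group levels.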