Psychological science relies on behavioral measures to assess cognitive processing; however, the field has not yet developed a tradition of routinely examining the reliability of these behavioral measures. Reliable measures are essential to draw robust inferences from statistical analyses, and subpar reliability has severe implications for measures’ validity and interpretation. Without examining and reporting the reliability of measurements used in an analysis, it is nearly impossible to ascertain whether results are robust or have arisen largely from measurement error. In this article, we propose that researchers adopt a standard practice of estimating and reporting the reliability of behavioral assessments of cognitive processing. We illustrate the need for this practice using an example from experimental psychopathology, the dot-probe task, although we argue that reporting reliability is relevant across fields (e.g., social cognition and cognitive psychology). We explore several implications of low measurement reliability and the detrimental impact that failure to assess measurement reliability has on interpretability and comparison of results and therefore research quality. We argue that researchers in the field of cognition need to report measurement reliability as routine practice so that more reliable assessment tools can be developed. To provide some guidance on estimating and reporting reliability, we describe the use of bootstrapped split-half estimation and intraclass correlation coefficients to estimate internal consistency and test-retest reliability, respectively. For future researchers to build upon current results, it is imperative that all researchers provide psychometric information sufficient for estimating the accuracy of inferences and informing further development of cognitive-behavioral assessments.
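The two procedures named above can be sketched concretely. The following is a minimal, dependency-free illustration, not the authors' own implementation: the function names, the Spearman-Brown step-up correction, and the choice of ICC(2,1) (two-way random effects, absolute agreement, single measures) are assumptions made for the example.

```python
import random
import statistics

def pearson(x, y):
    """Pearson correlation between two equal-length lists."""
    mx, my = statistics.mean(x), statistics.mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

def bootstrapped_split_half(data, n_boot=1000, seed=0):
    """Internal consistency: repeatedly split each participant's trials
    into two random halves, correlate the half-scores across participants,
    apply the Spearman-Brown correction, and average over resamples.
    `data` is a list of per-participant lists of trial scores."""
    rng = random.Random(seed)
    estimates = []
    for _ in range(n_boot):
        half_a, half_b = [], []
        for trials in data:
            idx = list(range(len(trials)))
            rng.shuffle(idx)
            mid = len(idx) // 2
            half_a.append(statistics.mean(trials[i] for i in idx[:mid]))
            half_b.append(statistics.mean(trials[i] for i in idx[mid:]))
        r = pearson(half_a, half_b)
        estimates.append(2 * r / (1 + r))  # Spearman-Brown step-up
    return statistics.mean(estimates)

def icc_2_1(session1, session2):
    """Test-retest reliability as ICC(2,1): two-way random effects,
    absolute agreement, single measures. Each argument holds one
    score per participant for one testing session."""
    n, k = len(session1), 2
    rows = list(zip(session1, session2))
    grand = statistics.mean(session1 + session2)
    ss_rows = k * sum((statistics.mean(row) - grand) ** 2 for row in rows)
    ss_cols = n * sum((statistics.mean(col) - grand) ** 2
                      for col in (session1, session2))
    ss_total = sum((x - grand) ** 2 for row in rows for x in row)
    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_error = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))
    return (ms_rows - ms_error) / (
        ms_rows + (k - 1) * ms_error + k * (ms_cols - ms_error) / n)
```

Averaging the Spearman-Brown-corrected correlation over many random splits avoids the arbitrariness of a single odd/even split; the ICC variant shown penalizes both inconsistency and systematic session-to-session shifts.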
During social interactions we automatically infer motives, intentions, and feelings from bodily cues of others, especially from the eye region of their faces. This cognitive empathic ability is one of the most important components of social intelligence and is essential for effective social interaction. Females on average outperform males in cognitive empathy, and the male sex hormone testosterone is thought to be involved. Testosterone may down-regulate social intelligence not only organizationally, by affecting fetal brain development, but also activationally, by its current effects on the brain. Here, we show that administration of testosterone in 16 young women led to a significant impairment in their cognitive empathy, and that this effect is powerfully predicted by a proxy of fetal testosterone: the right-hand second-to-fourth digit (2D:4D) ratio. Our data thus not only demonstrate down-regulatory effects of current testosterone on cognitive empathy, but also suggest these are preprogrammed by the very same hormone prenatally. These findings have importance for our understanding of the psychobiology of human social intelligence.
Background: New indices, calculated on data from the widely used Dot Probe Task, were recently proposed to capture variability in biased attention allocation. We observed that it remains unclear which data pattern is meant to be indicative of dynamic bias and thus to be captured by these indices. Moreover, we hypothesized that the new indices are sensitive to SD differences at the response time (RT) level in the absence of bias. Method: Randomly generated datasets were analyzed to assess properties of the Attention Bias Variability (ABV) and Trial-Level Bias Score (TL-BS) indices. Sensitivity to induced differences in (1) RT standard deviation, (2) mean RT, and (3) bias magnitude was assessed. In addition, two possible definitions of dynamic attention bias were explored by inducing differences in (4) frequency of bias switching and (5) bias magnitude in the presence of constant switching. Results: The ABV and TL-BS indices were highly sensitive to increasing SD at the RT level, insensitive to increasing bias, linearly sensitive to increasing bias magnitude in the presence of bias switches, and non-linearly sensitive to increasing frequency of bias switches. The ABV index was also responsive to increasing mean RT in the absence of bias. Conclusion: The recently proposed DPT-derived variability indices cannot uncouple measurement error from bias variability, and significant group differences may be observed even if no bias is present in any individual dataset. This renders the new indices, in their current form, unfit for empirical purposes. Our discussion focuses on fostering debate and ideas for new research to validate the potentially very important notion that biased attention is dynamic.
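The central point of these simulations, that a TL-BS-type variability index inflates with raw RT noise even when no bias exists anywhere in the data, can be reproduced in a toy sketch. This is a deliberate simplification: trials are paired by position rather than by the published nearest-in-time pairing rule, and all names are illustrative.

```python
import random
import statistics

def tlbs_sd(congruent, incongruent):
    """Simplified stand-in for TL-BS variability: pair trials by
    position and take the SD of the per-pair RT differences."""
    return statistics.stdev(i - c for c, i in zip(congruent, incongruent))

def simulate(rt_sd, n_trials=100, mean_rt=500, seed=0):
    """Generate congruent and incongruent RTs with identical means,
    i.e. zero true attention bias, and return the variability index."""
    rng = random.Random(seed)
    cong = [rng.gauss(mean_rt, rt_sd) for _ in range(n_trials)]
    incong = [rng.gauss(mean_rt, rt_sd) for _ in range(n_trials)]
    return tlbs_sd(cong, incong)

# Average the index over many simulated participants at two noise levels.
low = statistics.mean(simulate(20, seed=s) for s in range(200))
high = statistics.mean(simulate(80, seed=s) for s in range(200))
# Despite zero bias in every dataset, the index scales with RT noise.
```

Because the difference of two independent RTs has a standard deviation of roughly √2 times the trial-level SD, any group that is simply noisier at the RT level will score higher on such an index without possessing any attentional bias.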
Resilience is considered to be the process by which individuals demonstrate more positive outcomes than would be expected, given the nature of the adversity experienced. We propose that a cognitive approach has the potential to guide studies investigating the relationships between adversity, stress, and resilience. We outline a preliminary cognitive model of resilience in order to facilitate the application of cognitive approaches to the investigation of resilience in the face of adversity. We argue that the situationally appropriate application of flexibility or rigidity in affective-cognitive systems is a key element in promoting resilient responses. We propose that this mapping of cognitive processing can be conceptualised as being undertaken by an overarching mapping system, which serves to integrate information from a variety of sources, including the current situation, prior experience, as well as more conscious and goal-driven processes. We propose that a well-functioning mapping system is an integral part of the cognitive basis for resilience to adversity. Our preliminary model is intended to provide an initial theoretical framework to guide research on the development of cognitive functions that are considered to be important in the resilience process.