A new method based on jackknifing is presented for measuring the difference between two conditions in the onset latencies of the lateralized readiness potential (LRP). The method can be used with both stimulus- and response-locked LRPs, and simulations indicate that it provides accurate estimates of onset latency differences in many common experimental conditions.
Miller, Patterson, and Ulrich (1998) introduced a jackknife-based method for measuring the differences between two conditions in the onset latencies of the lateralized readiness potential (LRP). The present paper generalizes such jackknife-based methods to factorial experiments with any combination of within- and between-subjects factors. Specifically, we introduce a subsample scoring method to assess potential main and interaction effects on LRP onsets within conventional yet slightly adjusted analyses of variance (ANOVAs) and post hoc comparison procedures.
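The core of the jackknife approach can be sketched in a few lines: onsets are scored on leave-one-out grand averages rather than on noisy single-subject waveforms, and the variance of the resulting subsample scores is inflated by a factor of (n − 1) to obtain a valid standard error. The sketch below is an illustration under simplifying assumptions (a fractional-peak onset criterion on synthetic waveforms), not the authors' exact implementation; the function names and the criterion value are hypothetical.

```python
import numpy as np

def fractional_onset(grand_avg, times, criterion=0.5):
    """Time at which the grand-average waveform first reaches
    `criterion` * its peak amplitude (fractional-peak onset score).
    The 50% criterion here is an illustrative choice."""
    peak = grand_avg.max()
    idx = np.argmax(grand_avg >= criterion * peak)
    return times[idx]

def jackknife_onset_diff(cond_a, cond_b, times, criterion=0.5):
    """Jackknife estimate of the onset-latency difference between two
    conditions, in the spirit of Miller, Patterson, and Ulrich (1998).

    cond_a, cond_b : (n_subjects, n_timepoints) single-subject waveforms.
    Returns (onset difference, jackknife SE, t statistic with n-1 df).
    """
    n = cond_a.shape[0]
    # Onset difference scored on the full grand averages.
    d_full = (fractional_onset(cond_b.mean(0), times, criterion)
              - fractional_onset(cond_a.mean(0), times, criterion))
    # Leave-one-out subsample differences: drop one subject at a time
    # and re-score the onsets on the remaining grand averages.
    d_i = np.empty(n)
    for i in range(n):
        keep = np.arange(n) != i
        d_i[i] = (fractional_onset(cond_b[keep].mean(0), times, criterion)
                  - fractional_onset(cond_a[keep].mean(0), times, criterion))
    # Jackknife standard error: the subsample scores vary far less than
    # individual scores would, so their variance is inflated by (n - 1).
    se = np.sqrt((n - 1) / n * np.sum((d_i - d_i.mean()) ** 2))
    return d_full, se, d_full / se
```

Because every subsample average is based on n − 1 of the n subjects, the leave-one-out scores are highly correlated; the (n − 1) inflation factor is what makes the jackknife standard error, and hence the t statistic, correctly calibrated.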
The cognitive processes underlying the ability of human performers to trade speed for accuracy are often conceptualized within evidence accumulation models, but it is not yet clear whether and how these models can account for decision-making in the presence of various sources of conflicting information. In the present study, we provide evidence that speed-accuracy tradeoffs (SATs) can have opposing effects on performance across two different conflict tasks. Specifically, in a single preregistered experiment, the mean reaction time (RT) congruency effect in the Simon task increased, whereas the mean RT congruency effect in the Eriksen task decreased, when the focus was put on response speed versus accuracy. Critically, distributional RT analyses revealed distinct delta plot patterns across tasks, thus indicating that the unfolding of distractor-based response activation in time is sufficient to explain the opposing pattern of congruency effects. In addition, a recent evidence accumulation model with the notion of time-varying conflicting information was successfully fitted to the experimental data. These fits revealed task-specific time-courses of distractor-based activation and suggested that time pressure substantially decreases decision boundaries in addition to reducing the duration of non-decision processes and the rate of evidence accumulation. Overall, the present results suggest that time pressure can have multiple effects in decision-making under conflict, but that strategic adjustments of decision boundaries in conjunction with different time-courses of distractor-based activation can produce counteracting effects on task performance with different types of distracting sources of information.
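The distributional analysis mentioned above rests on delta plots: the congruency effect (incongruent minus congruent RT) computed at several quantiles of the RT distribution and plotted against the mean quantile RT. A minimal sketch of that computation, assuming hypothetical arrays of single-trial RTs for each condition (the quantile set is an illustrative choice, not the one used in the study):

```python
import numpy as np

def delta_plot(rt_congruent, rt_incongruent,
               quantiles=(0.1, 0.3, 0.5, 0.7, 0.9)):
    """Delta-plot coordinates for one participant or condition pair.

    x : mean of the two conditions' quantile RTs (position in the
        RT distribution, slow trials to the right).
    y : incongruent minus congruent quantile RT, i.e., the congruency
        effect at that point of the distribution.
    """
    q = np.asarray(quantiles)
    qc = np.quantile(rt_congruent, q)
    qi = np.quantile(rt_incongruent, q)
    return (qc + qi) / 2.0, qi - qc
```

A flat delta plot indicates a constant congruency effect across the RT distribution, whereas rising or falling slopes are commonly read as signatures of how distractor-based response activation builds up or decays over time, which is the diagnostic the abstract refers to.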
Previous studies on voluntary task switching using the self-organized task switching paradigm suggest that task performance and task selection in multitasking are related. When deciding between two tasks, the stimulus associated with a task repetition occurred with a stimulus onset asynchrony (SOA) that continuously increased with the number of repetitions, while the stimulus associated with a task switch was immediately available. Thus, the waiting time for the repetition stimulus increased with the number of consecutive task repetitions. Two main results were shown: first, switch costs and voluntary switch rates correlated negatively: the smaller the switch costs, the larger the switch rates. Second, participants switched tasks when switch costs and waiting time for the repetition stimulus were similar. In the present study, we varied the SOA that increased with the number of task repetitions (SOA increment) and also varied the size of the switch costs by varying the intertrial interval. We examined which combination of SOA increment and switch costs maximizes participants' attempts to balance waiting time and switch costs in self-organized task switching. We found that small SOA increments allow for fine-grained adaptation and that participants can best balance their switch costs and waiting times in settings with medium switch costs and small SOA increments. In addition, correlational analyses indicate relations between individual switch costs and individual switch rates across participants.
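The balance point described above follows from simple arithmetic: if the waiting time for the repetition stimulus grows by a fixed SOA increment with every consecutive repetition, a cost-balancing participant should switch once the accumulated waiting time first reaches the switch cost. A toy sketch under that assumption (the function and parameter names are hypothetical, not from the paper):

```python
import math

def predicted_switch_point(soa_increment_ms, switch_cost_ms):
    """Run length at which waiting for the repetition stimulus first
    costs at least as much as switching, assuming the waiting time
    after n consecutive repetitions is soa_increment_ms * n."""
    return math.ceil(switch_cost_ms / soa_increment_ms)
```

On this account, smaller SOA increments yield more repetition steps before the balance point is reached, which is one way to read the finding that small increments permit more fine-grained adaptation.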
Recent evidence suggests that research practices in psychology and many other disciplines are far less effective than previously assumed, which has led to what has been called a "crisis of confidence" in psychological research (e.g., Pashler & Wagenmakers, 2012). In response to the perceived crisis, standard research practices have come under intense scrutiny, and various changes have been suggested to improve them. The burgeoning field of metascience seeks to use standard quantitative data-gathering and modeling techniques to understand the reasons for inefficiency, to assess the likely effects of suggested changes, and ultimately to tell psychologists how to do better science. We review the pros and cons of suggested changes, highlighting the many complex research trade-offs that must be addressed to identify better methods. Expected final online publication date for the Annual Review of Psychology, Volume 73 is January 2022. Please see http://www.annualreviews.org/page/journal/pubdates for revised estimates.