Discriminating among competing statistical models is a pressing issue for many experimentalists in the field of cognitive science. Resolving this issue begins with designing maximally informative experiments. To this end, the problem to be solved in adaptive design optimization is identifying experimental designs under which one can infer the underlying model in the fewest possible steps. When the models under consideration are nonlinear, as is often the case in cognitive science, this problem can be impossible to solve analytically without simplifying assumptions. However, as we show in this paper, a full solution can be found numerically with the help of a Bayesian computational trick derived from the statistics literature, which recasts the problem as a probability density simulation in which the optimal design is the mode of the density. We use a utility function based on mutual information, and give three intuitive interpretations of the utility function in terms of Bayesian posterior estimates. As a proof of concept, we offer a simple example application to an experiment on memory retention.
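As a concrete illustration of the mutual-information utility described in the abstract above, here is a minimal Monte Carlo sketch in Python. It assumes two toy retention models (a power and an exponential forgetting curve), uniform parameter priors, and binomial data at a single retention interval; these choices, and every name in the code, are illustrative assumptions rather than the paper's actual models or implementation, and the sketch evaluates a small grid of candidate designs instead of using the density-simulation trick.

```python
# Hedged sketch: estimate the mutual-information utility U(t) of each candidate
# retention interval t by Monte Carlo, then pick the most informative interval.
import numpy as np
from scipy.stats import binom

rng = np.random.default_rng(0)

# Two toy retention models: probability of recall after a lag t (the design variable).
def p_recall(model, theta, t):
    a, b = theta
    if model == "power":              # p = a * (t + 1)^(-b)
        p = a * (t + 1.0) ** (-b)
    else:                             # "exponential": p = a * exp(-b * t)
        p = a * np.exp(-b * t)
    return np.clip(p, 1e-6, 1 - 1e-6)

# Uniform parameter priors on (a, b); an assumption for illustration only.
def sample_prior(n):
    return np.column_stack([rng.uniform(0.5, 1.0, n), rng.uniform(0.1, 1.0, n)])

models = ["power", "exponential"]
model_prior = {m: 0.5 for m in models}
n_trials, n_sim = 20, 2000            # Bernoulli trials per design; Monte Carlo draws

def marginal_likelihood(y, model, t):
    # p(y | model, design t), averaged over the parameter prior.
    thetas = sample_prior(n_sim)
    p = p_recall(model, thetas.T, t)
    return binom.pmf(y, n_trials, p).mean()

def utility(t):
    # U(t) ~= sum_m p(m) sum_y p(y|m,t) * log[ p(y|m,t) / sum_k p(k) p(y|k,t) ]
    ys = np.arange(n_trials + 1)
    like = {m: np.maximum([marginal_likelihood(y, m, t) for y in ys], 1e-300)
            for m in models}
    mix = sum(model_prior[m] * like[m] for m in models)
    return sum(model_prior[m] * np.sum(like[m] * np.log(like[m] / mix))
               for m in models)

candidate_lags = [0.0, 1.0, 2.0, 5.0, 10.0, 20.0, 50.0]
best = max(candidate_lags, key=utility)
print("most informative retention interval:", best)
```

The utility is the expected reduction in uncertainty about model identity from one observation at lag t, so the design that maximizes it is the one expected to discriminate the models fastest under this toy setup.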
Models of a psychological process can be difficult to discriminate experimentally because it is not easy to determine the values of the critical design variables (e.g., presentation schedule, stimulus structure) that will be most informative in differentiating them. Recent developments in sampling-based search methods in statistics make it possible to determine these values, and thereby identify an optimal experimental design. After describing the method, it is demonstrated in two content areas in cognitive psychology in which models are highly competitive: retention (i.e., forgetting) and categorization. The quality of the optimal design is compared with that of designs used in the literature. The findings demonstrate that design optimization has the potential to increase the informativeness of the experimental method.
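The "sampling-based search" idea can be sketched as follows: treat the exponentiated design utility as an unnormalized density over designs, sample it with a random-walk Metropolis algorithm, and read off the approximately optimal design as the mode of the visited designs. The quadratic toy utility, the temperature, and the step size below are assumptions for illustration only, not the procedure reported in the paper.

```python
# Hedged sketch of a sampling-based design search over one design variable d.
import numpy as np

rng = np.random.default_rng(1)

def toy_utility(d):
    # Stand-in for an expensive model-discrimination utility; peaks near d = 3.
    return -(d - 3.0) ** 2

def metropolis_design_search(n_iter=20_000, tau=0.1, step=0.5):
    d, samples = 0.0, []
    for _ in range(n_iter):
        proposal = d + step * rng.normal()
        # Accept with probability min(1, exp[(U(proposal) - U(d)) / tau]).
        if np.log(rng.uniform()) < (toy_utility(proposal) - toy_utility(d)) / tau:
            d = proposal
        samples.append(d)
    return np.array(samples)

draws = metropolis_design_search()
hist, edges = np.histogram(draws[5_000:], bins=50)   # discard burn-in, find the mode
print("approximately optimal design:", edges[np.argmax(hist)])
```

Because the sampler concentrates where the utility is high, the histogram of visited designs peaks near the optimum, which is the sense in which the optimal design is recovered as the mode of a simulated density.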
To model behavior, we need to know how models behave. This means learning what other behaviors a model can produce besides the one generated by participants in an experiment. This is a difficult problem because of the complexity of psychological models (e.g., their many parameters) and because the behavioral precision of models (e.g., interval-scale performance) often mismatches their testable precision in experiments, where qualitative, ordinal predictions are the norm. Parameter space partitioning is a solution that evaluates model performance at a qualitative level. Given a definition of a qualitative data pattern, there exists a partition on the model's parameter space that divides it into regions that correspond to each data pattern. Markov chain Monte Carlo methods are used to discover and define these regions. Three application examples, all using connectionist models, demonstrate its potential and versatility for studying the global behavior of psychological models. Among other things, one can easily assess how central and robust the empirical data pattern is to the model, as well as the range and characteristics of its other behaviors.

The experimental method of scientific inquiry has proven to work quite well in psychology. Its unique blend of methodological control and statistical inference is effective for testing qualitative (i.e., ordinal) predictions derived from theories of behavior. Data collection and dissemination have become very efficient, so much so that far more may be known about a behavioral phenomenon than is reflected in its corresponding theory. That our knowledge extends beyond the reach of the theory may be a sign of productive science, and underscores the fact that theories are broad conceptualizations about behavior that cannot be expected to explain the minutiae in data. Cognitive modeling is a research tool that can act as a counterforce to slow the growth of this explanatory gap and fill it. It compensates for a theory's limitations of precision in data synthesis, description, and prediction. Whether the model is an implementation of an existing theory, or a neurally inspired environment in which to study processing (e.g., information integration, representational specificity, probabilistic learning), models are rich sources of ideas and information on how to think about perception, cognition, and action. The pros and cons of various implementations can be evaluated. Inconsistencies and hidden assumptions can come to light during model creation and evaluation. In short, the modeler is forced to confront the complexity of what is being modeled, and in the process, can gain insight into the relationship between variables and the functionality of the model (see Shiffrin & Nobel, 1998, for a personal account of this process). Of course, the virtues of modeling are accompanied by vices. One of the more serious, often leveled against connectionist models (Dawson & Shamanski, 1994; McCloskey, 1991) but by no means restricted to them, is that model behavior can be mysterious...
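To make the parameter-space-partitioning idea in the abstract above concrete, here is a naive Monte Carlo sketch: draw parameters of a toy two-parameter model, classify the qualitative (ordinal) data pattern each draw produces, and estimate the share of parameter space occupied by each pattern. The model, parameter ranges, and pattern definition are illustrative assumptions; the paper uses MCMC search rather than uniform sampling to discover and map the regions.

```python
# Naive Monte Carlo sketch of partitioning a toy model's parameter space by the
# qualitative data pattern each parameter setting produces.
import numpy as np
from collections import Counter

rng = np.random.default_rng(2)

def predict(theta):
    # Toy two-parameter model: predicted accuracy in three experimental conditions.
    w, c = theta
    return np.array([w * np.exp(-c * d) for d in (0.5, 1.0, 2.0)])

def pattern(theta):
    # A qualitative data pattern = the ordinal ranking of the three predictions.
    return tuple(int(i) for i in np.argsort(-predict(theta)))

n = 50_000
draws = np.column_stack([rng.uniform(0.0, 1.0, n), rng.uniform(-2.0, 2.0, n)])
counts = Counter(pattern(theta) for theta in draws)
for pat, k in counts.most_common():
    print(f"pattern {pat}: ~{k / n:.1%} of the sampled parameter volume")
```

Each pattern's share of the draws estimates the relative size of the parameter-space region that produces it, which is the kind of global summary (how central a data pattern is to a model, and what else it can do) the method is meant to provide.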
Experimentation is ubiquitous in the field of psychology and fundamental to the advancement of its science, and one of the biggest challenges for researchers is designing experiments that can conclusively discriminate the theoretical hypotheses or models under investigation. The recognition of this challenge has led to the development of sophisticated statistical methods that aid in the design of experiments and that are within the reach of everyday experimental scientists. This tutorial paper introduces the reader to an implementable experimentation methodology, dubbed Adaptive Design Optimization, that can help scientists to conduct “smart” experiments that are maximally informative and highly efficient, which in turn should accelerate scientific discovery in psychology and beyond.