As observations and student models become complex, educational assessments that exploit advances in technology and cognitive psychology can outstrip familiar testing models and analytic methods. Within the Portal conceptual framework for assessment design, Bayesian inference networks (BINs) record beliefs about students' knowledge and skills in light of what they say and do. Joining evidence model BIN fragments, which contain observable variables and pointers to student model variables, to the student model allows one to update beliefs about knowledge and skills as observations arrive. Markov chain Monte Carlo (MCMC) techniques can estimate the required conditional probabilities from empirical data, supplemented by expert judgment or substantive theory. Details are given for the special cases of item response theory (IRT) and multivariate latent class modeling, with a numerical example of the latter.
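The following is a minimal sketch, not taken from the Portal framework itself, of how an evidence-model fragment might be absorbed into a student model: a single binary proficiency variable is updated by Bayes' rule when an observable is reported. The prior, the conditional probability table, and the function name absorb_evidence are all illustrative assumptions.

```python
# A minimal sketch (not the authors' implementation) of docking an
# evidence-model fragment onto a student model and updating beliefs.
# All priors and conditional probabilities below are illustrative
# assumptions, not values from the text.
import numpy as np

# Student model: one binary proficiency variable theta (0 = non-master, 1 = master).
prior = np.array([0.5, 0.5])          # current belief about theta

# Evidence-model fragment: observable X (0 = wrong, 1 = right) with a
# conditional probability table P(X | theta); rows index values of theta.
cpt = np.array([[0.8, 0.2],           # P(X | theta = non-master)
                [0.1, 0.9]])          # P(X | theta = master)

def absorb_evidence(belief, cpt, x):
    """Bayes-rule update of the student-model belief after observing X = x."""
    likelihood = cpt[:, x]            # P(X = x | theta) for each value of theta
    posterior = belief * likelihood
    return posterior / posterior.sum()

belief = absorb_evidence(prior, cpt, x=1)   # student answered correctly
print(belief)                               # belief shifts toward mastery
```

In a full network the same update propagates through every student-model variable the fragment points to; the two-state example above only illustrates the docking-and-updating step.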
Model checking is a crucial part of any statistical analysis. As educators tie models for testing to cognitive theory of the domains, there is a natural tendency to represent participant proficiencies with latent variables representing the presence or absence of the knowledge, skills, and proficiencies to be tested (Mislevy, Almond, Yan, & Steinberg, 2001). Model checking for these models is not straightforward, mainly because traditional χ²-type tests do not apply except for assessments with a small number of items. Williamson, Mislevy, and Almond (2000) note a lack of published diagnostic tools for these models. This paper suggests a number of graphics and statistics for diagnosing problems with models that use discrete proficiency variables. A small diagnostic assessment first analyzed by Tatsuoka (1990) serves as a test bed for these tools; this work continues the recent analyses of that data set by Yan, Mislevy, and Almond (2003). Two diagnostic tools that prove useful are Bayesian residual plots and an analog of item characteristic curve (ICC) plots. A χ²-type statistic based on the latter plot shows some promise, but more work is required to establish its null distribution. Based on problems identified in the model used by Mislevy (1995), the suggested diagnostics help in hypothesizing an improved model that appears to fit better.
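As a concrete illustration of the kind of diagnostics described above, the sketch below computes Bayesian residuals and an ICC-style comparison of observed versus model-expected proportions correct for a single item, grouped by posterior probability of skill mastery. The simulated data, the slip and guess rates, and the binning scheme are placeholders and do not correspond to the Tatsuoka data or the paper's actual procedure.

```python
# A hedged sketch of two diagnostics in the spirit of those described above:
# per-examinee Bayesian residuals for one item, and an ICC-analog plot comparing
# observed and model-expected proportions correct within bins of posterior
# mastery probability. All numbers here are synthetic placeholders.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
n_examinees = 500

# Placeholder posterior mastery probabilities and simulated 0/1 item scores.
p_master = rng.uniform(0, 1, n_examinees)            # P(mastery | data) per examinee
p_correct = np.where(rng.uniform(size=n_examinees) < p_master, 0.85, 0.25)
responses = rng.binomial(1, p_correct)

# Bayesian residual: observed score minus model-expected score
# (assumed slip/guess rates of 0.15 and 0.25 for illustration).
expected = p_master * 0.85 + (1 - p_master) * 0.25
residuals = responses - expected

# ICC-analog: bin examinees by posterior mastery probability and compare
# observed vs. expected proportion correct within each bin.
bins = np.linspace(0, 1, 11)
mid = 0.5 * (bins[:-1] + bins[1:])
which = np.digitize(p_master, bins[1:-1])
obs = np.array([responses[which == k].mean() for k in range(10)])
exp = np.array([expected[which == k].mean() for k in range(10)])

plt.plot(mid, obs, "o-", label="observed proportion correct")
plt.plot(mid, exp, "s--", label="model-expected proportion correct")
plt.xlabel("posterior probability of mastery")
plt.ylabel("proportion correct")
plt.legend()
plt.show()
```

A χ²-type statistic in this spirit could aggregate the squared gaps between observed and expected bin proportions, but, as the abstract notes, its null distribution would still need to be established.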
There is growing interest in educational assessments that coordinate substantive considerations, learning psychology, task design, and measurement models. This paper concerns an analysis of responses to a mixed-number subtraction assessment created by Kikumi Tatsuoka (1983), carried out in light of cognitive analyses of students' problem solutions. In particular, we fit a binary-skills multivariate latent class model to the data and compare the results with those obtained from an item response theory model and from a modified latent class model suggested by model-criticism indices. Markov chain Monte Carlo (MCMC) techniques are used to estimate the model parameters within a Bayesian framework that integrates information from substantive theory, expert judgment, and empirical data.
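To make the model structure concrete, here is a hedged sketch of the likelihood of a conjunctive binary-skills latent class model (DINA-style), in which an item is answered correctly with high probability only when the examinee has mastered every skill that the item's Q-matrix row requires. The Q-matrix, slip and guess parameters, and uniform class proportions are invented for illustration; the paper estimates such quantities with MCMC rather than fixing them.

```python
# A minimal sketch of a conjunctive binary-skills latent class likelihood.
# The Q-matrix, slip/guess values, and class proportions are invented for
# illustration; in the paper such quantities are estimated with MCMC.
import numpy as np
from itertools import product

n_skills = 3
Q = np.array([[1, 0, 0],        # item 1 requires skill 1
              [1, 1, 0],        # item 2 requires skills 1 and 2
              [0, 1, 1]])       # item 3 requires skills 2 and 3
slip = np.array([0.10, 0.15, 0.10])    # P(wrong | has all required skills)
guess = np.array([0.20, 0.10, 0.25])   # P(right | missing a required skill)

# Enumerate the 2^K latent classes (skill-mastery profiles).
profiles = np.array(list(product([0, 1], repeat=n_skills)))
class_probs = np.full(len(profiles), 1 / len(profiles))   # assumed uniform

def p_correct(profile):
    """P(correct response to each item) for a given skill profile."""
    has_all = np.all(profile >= Q, axis=1)    # conjunctive skill requirement
    return np.where(has_all, 1 - slip, guess)

def marginal_likelihood(x):
    """Marginal probability of a 0/1 response vector x over the latent classes."""
    like = 0.0
    for profile, pi in zip(profiles, class_probs):
        p = p_correct(profile)
        like += pi * np.prod(p**x * (1 - p)**(1 - x))
    return like

print(marginal_likelihood(np.array([1, 1, 0])))
```

An MCMC sampler would place priors on the slip, guess, and class-proportion parameters and draw from their posterior given the full response matrix; the function above shows only the likelihood that such a sampler would evaluate.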