Conceptual models have been evaluated along the dimensions of modeling complexity (how easily can schemas be created from requirements?) and readability (how easily can the requirements be understood by reading the model schema?). In this work, we update COGEVAL, a propositional framework based on cognitive theories for evaluating conceptual models. We synthesize work from the cognitive literature to develop the framework and show how it can explain earlier empirical results as well as existing theoretical frameworks. We then illustrate how COGEVAL can serve as a theoretical basis for designing an empirical test of conceptual-model readability. Unlike much of the earlier empirical work on readability, our approach isolates the effect of a model-independent variable (degree of fragmentation) on readability. From a practical perspective, our findings have implications both for creators of new models and for practitioners who use currently available models to create schemas.