A meta-analysis of the comparative distance education (DE) literature between 1985 and 2002 was conducted. In total, 232 studies containing 688 independent achievement, attitude, and retention outcomes were analyzed. Overall results indicated effect sizes of essentially zero on all three measures, accompanied by wide variability. This suggests that many applications of DE outperform their classroom counterparts, while many others perform more poorly. Dividing achievement outcomes into synchronous and asynchronous forms of DE produced a somewhat different impression. In general, mean achievement effect sizes for synchronous applications favored classroom instruction, while effect sizes for asynchronous applications favored DE. However, significant heterogeneity remained in each subset.
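For readers unfamiliar with how such subgroup summaries are typically produced, the sketch below computes an inverse-variance weighted mean effect size for each DE mode. It is only an illustration of the general technique, not the authors' procedure, and the outcome values, variances, and labels are hypothetical.

```python
# Minimal sketch: per-subgroup weighted mean effect sizes (hypothetical data).
import numpy as np

def weighted_mean_effect(effects, variances):
    """Inverse-variance weighted mean effect size and its standard error."""
    g = np.asarray(effects, dtype=float)
    w = 1.0 / np.asarray(variances, dtype=float)
    mean = float(np.sum(w * g) / np.sum(w))
    se = float(np.sqrt(1.0 / np.sum(w)))
    return mean, se

# Hypothetical outcomes: (Hedges' g, sampling variance, DE mode)
outcomes = [
    (0.15, 0.04, "asynchronous"),
    (0.30, 0.03, "asynchronous"),
    (-0.20, 0.05, "synchronous"),
    (-0.05, 0.06, "synchronous"),
]

for mode in ("synchronous", "asynchronous"):
    sub = [(g, v) for g, v, m in outcomes if m == mode]
    gs, vs = zip(*sub)
    mean, se = weighted_mean_effect(gs, vs)
    print(f"{mode}: mean g = {mean:+.2f} (SE = {se:.2f}, k = {len(sub)})")
```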
Critical thinking (CT), or the ability to engage in purposeful, self-regulatory judgment, is widely recognized as an important, even essential, skill. This article describes an ongoing meta-analysis that summarizes the available empirical evidence on the impact of instruction on the development and enhancement of critical thinking skills and dispositions. We found 117 studies based on 20,698 participants, which yielded 161 effects with an average effect size (g+) of 0.341 and a standard deviation of 0.610. The distribution was highly heterogeneous (QT = 1,767.86, p < .001). There was, however, little variation due to research design, so we neither separated studies according to their methodological quality nor applied any statistical adjustment to the corresponding effect sizes. Type of CT intervention and pedagogical grounding were substantially related to variation in CT effect sizes, together accounting for 32% of the variance. These findings make it clear that improvement in students' CT skills and dispositions cannot be a matter of implicit expectation. As important as the development of CT skills is considered to be, educators must take steps to make CT objectives explicit in courses and also to include them in both preservice and in-service training and faculty development.
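The heterogeneity statistic QT reported above is conventionally computed as Cochran's Q. The sketch below shows one common way to obtain a pooled g+ and Q, assuming fixed-effect inverse-variance weights; the study-level effects and variances are made up for illustration.

```python
# Sketch of the pooled effect (g+) and Cochran's Q homogeneity test.
import numpy as np
from scipy import stats

def pooled_effect_and_q(effects, variances):
    """Weighted mean effect (g+) and Cochran's Q under fixed-effect weights."""
    g = np.asarray(effects, dtype=float)
    w = 1.0 / np.asarray(variances, dtype=float)
    g_plus = np.sum(w * g) / np.sum(w)   # pooled (weighted) mean effect size
    q = np.sum(w * (g - g_plus) ** 2)    # heterogeneity statistic Q
    df = len(g) - 1
    p = stats.chi2.sf(q, df)             # Q ~ chi-square(df) under homogeneity
    return g_plus, q, df, p

# Hypothetical per-study effects and variances
effects = [0.10, 0.55, 0.34, 0.80, -0.05]
variances = [0.02, 0.05, 0.03, 0.04, 0.06]
g_plus, q, df, p = pooled_effect_and_q(effects, variances)
print(f"g+ = {g_plus:.3f}, Q({df}) = {q:.2f}, p = {p:.4f}")
```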
to use the chapter as a basis for our work, for which we are immensely grateful. Many thanks to the Canadian Council on Learning for allowing the use of the list of Sources of Grey Literature.

Campbell Library Methods papers. The Campbell Library Methods Series comprises three types of publications: Methods Discussion Papers, which present new or innovative ideas currently in development in the field of methodology and are intended for discussion rather than as official Campbell policy or guidance; Methods Policy Notes, which state current Campbell Collaboration policy on specific methods for use in Campbell systematic reviews of intervention effects; and Methods Guides, which explain how to implement specific systematic review methods.

Disclaimer. Campbell Collaboration Methods Discussion Papers are published to promote discussion of new and innovative methods in systematic reviews, making these approaches available to a broad audience. Papers are published as submitted by the authors; they are not subject to review or editing by the Campbell Collaboration. The views expressed are those of the authors and may not be attributed to the Campbell Collaboration. Campbell Collaboration Methods Discussion Papers do not represent Campbell policy.
In this paper, we provide a description of a CSLP research project that examined portfolio use within a middle school, the web-based e-portfolio software we have developed within the context of the Quebec educational system, our plans for further development of the tool, and our research plans related to the use of portfolios to support learning. Our aim is to combine research evidence on portfolio use with practical feedback from the field in an attempt to develop easy-to-use, powerful software designed to support active, self-regulated student learning in schools.
This paper reports the findings of a Stage I meta-analysis exploring the achievement effects of computer-based technology use in higher education classrooms (non-distance education). An extensive literature search revealed more than 6,000 potentially relevant primary empirical studies. Analysis of a representative sample of 231 studies (k = 310) yielded a weighted average effect size of 0.28, surrounded by wide variability. A mixed effects model was adopted to explore coded moderators of effect size. Research design was not significant across true, quasi-, and pre-experimental designs, so the designs were combined. The variable "degree of technology use" (i.e., low, medium, and high) was significant, with low and medium use performing significantly higher than high use. For the variable "type of use" (i.e., cognitive support tools, presentational tools, and multiple uses), cognitive support (g+ = 0.40) was greater than presentational and multiple uses.
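A categorical moderator such as "degree of technology use" is commonly tested by partitioning Q into within-group and between-group components. The sketch below illustrates that partition under fixed-effect weights; it is not the authors' analysis, and the study values and group labels are hypothetical.

```python
# Sketch of a categorical moderator test via a Q-between partition (hypothetical data).
import numpy as np
from scipy import stats

def pooled(effects, variances):
    """Weighted mean effect and within-set Q under fixed-effect weights."""
    g = np.asarray(effects, dtype=float)
    w = 1.0 / np.asarray(variances, dtype=float)
    mean = np.sum(w * g) / np.sum(w)
    q = np.sum(w * (g - mean) ** 2)
    return mean, q

# Hypothetical studies: (Hedges' g, variance, moderator level)
studies = [
    (0.45, 0.03, "low"), (0.35, 0.04, "low"),
    (0.30, 0.05, "medium"), (0.40, 0.03, "medium"),
    (0.05, 0.04, "high"), (0.10, 0.05, "high"),
]

levels = ["low", "medium", "high"]
q_within = 0.0
for level in levels:
    gs, vs = zip(*[(g, v) for g, v, m in studies if m == level])
    mean, q = pooled(gs, vs)
    q_within += q
    print(f"{level} use: mean g = {mean:.2f} (k = {len(gs)})")

_, q_total = pooled([g for g, _, _ in studies], [v for _, v, _ in studies])
q_between = q_total - q_within   # between-group (moderator) component
p = stats.chi2.sf(q_between, len(levels) - 1)
print(f"Q_between({len(levels) - 1}) = {q_between:.2f}, p = {p:.4f}")
```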