Purpose – Researchers commonly use coding-based analysis of asynchronous classroom discussion contributions in studies of online learning and instruction. However, this analysis is inconsistent from study to study, with over 50 coding schemes and procedures applied in the last eight years. The aim of this article is to provide a basis for more consistent use of coding schemes and to facilitate comparison of studies that use different schemes.
Design/methodology/approach – The paper identifies coding schemes presented in the research literature, classifies these schemes, and presents a list of synthesis codes reflecting the content of the many different schemes within each classification.
Findings – Based on the initial and follow-up literature reviews, 56 different coding schemes were identified as having been employed within the last eight years. Initial sorting indicated that schemes primarily focused on identifying critical thinking, describing social interactions, or characterizing online discussion.
Originality/value – In addition to offering a comprehensive resource reflecting the coding schemes currently applied to the analysis of online asynchronous discussion, the meta-analysis results also describe the current state of research in this area and reveal research trends and areas for potential new research and development.
This study will explore student skill development and research self-efficacy as related to online doctoral students’ first core research course experience. Findings from this study will be used to inform instructors of effective ways to support doctoral students during their early research experiences. This support will help ensure that online graduate students develop well-crafted dissertations and continue to conduct research after graduation. The original research design was a grounded theory study in which both qualitative and quantitative data were to be collected before students started the course and after course completion. It was proposed to collect data through interviews, observations in the form of documented online discussions and correspondence, and the Research Self-Efficacy Scale (RSES). Course grades and final research outline projects were to be examined to determine student mastery of key concepts. A convenience sample was to be drawn, but due to a low response rate, the authors decided to apply an exploratory case study method and use this as a pilot to inform a larger study. The final result of the case study analysis will be detailed descriptions of each case, with discussion of categories and themes along with an exploration of commonalities and differences.
Although the process, structure, and quality of online discussion have been assessed independently, a standard framework integrating these aspects for the comprehensive assessment of critical thinking in online discussions is not available (Fahy, 2005; Zohar & Nemet, 2002). This paper describes a theoretically based coding framework for an integrated analysis and assessment of critical thinking in online discussion. The implementation of this critical thinking assessment framework (TAF) in future studies will ultimately help identify the online educational activities and tools that best support the development and application of critical thinking skills. Furthermore, it might be used to assess the critical thinking of individual discussion participants or small groups. The reliability and efficacy of the framework for general discussion assessment are presented.
As online education becomes common, it is essential to identify activities that support critical thinking in this environment. This paper describes the use of a model-eliciting activity (MEA) solved by groups in a synchronous online discussion. The participants displayed thinking processes and solutions consistent with successful face-to-face (F2F) MEAs. Each group’s discussion showed evidence of critical thinking and incorporated quality logical arguments. This demonstrates that MEAs may also be used in an online environment to successfully support the application of critical thinking skills.