The purpose of this research was to construct an instrument to measure participants' satisfaction with synchronous online education, examine its psychometric properties, and explore differences among sessions based on their format and content. The quantitative analyses employed factor analysis in conjunction with item response theory for validation purposes, and a (multivariate) analysis of variance with multilevel modelling for comparison purposes (N = 433). The qualitative analysis relied on classical content analysis of 303 open-question feedback responses classified as Promoters or Detractors. Eight of the 10 questions from the initial item pool were retained for the final scale. In contrast with current knowledge about synchronous online education, interactivity showed the weakest relationship with overall perceived session quality among the aspects included. The qualitative analysis provided pragmatic insights into the participants' perspective on session quality and a comprehensive map of potentially relevant factors that could be a meaningful focus of future iterations of research. A relatively small and conceptually homogeneous pool of items prevented the extraction of additional factors due to discriminant validity issues. In future research, a larger and more comprehensive pool of items should be used as a starting point for constructing a scale, and, if possible, longitudinal measures of learning transfer should be included as well. Educators can immediately apply the practical suggestions in their instructional design, use the Perceived Session Quality Scale as a brief screening instrument to evaluate their sessions, and benchmark their quality against the percentile scores provided for various types of sessions. The major contributions of this paper are the construction of a short, generalizable, and psychometrically valid tool for (synchronous online) education screening assessment, the Perceived Session Quality Scale, and an empirical mapping of potentially relevant aspects that contribute to perceived session quality.
Purpose: The purpose of this research was to explore the conceptual network of live online education efficiency from the Actor Network Theory perspective, in order to reveal aspects influencing the quality of online training that have been less accounted for in previous research.
Methodology: Actor Network Theory was used to analyse the qualitative feedback from 100 live online education sessions. Responses from 90 educators and 556 participants were coded into enablers and inhibitors of education quality and further clustered into different actors that might mediate learning success.
Findings: The key finding of this research is a visual representation of the complex network of actors potentially affecting live online education quality, revealing the interplay of non-human aspects (e.g., hardware, software, session design, and descriptions) as well as human elements (participants and their expectations, educators and their emotional reactions attributed to different actors of the network, organisers, and external mentors/experts).
Limitations: The piloting qualitative research was conducted within the framework of one educational event, where participants opted in voluntarily to attend and participate in the study. It is a specific educational context, different from workplace training and other non-formal education.
Practical implications: Learning and development practitioners can find 10 recommendations designed to support the instructional design and delivery of their (online) sessions, based on the collective experiences of the study participants and authors.
Originality/value: It is the first research in the field of live online education to acknowledge and map the role of the multiple actors posited to play an influential role in overall quality. It also calls for a transition from the "content-focused and controlling" to the "contextually-aware and responsive" educator in future research.
The purpose of this study was to explore the factors that affect participants' webinar satisfaction, with special emphasis on dissatisfaction factors. The method of empathy-based stories was chosen because of its potential to reduce the biases associated with other qualitative methods. The research collected 280 distinct factors potentially affecting training satisfaction, some of which had not been raised in the subject literature before. In contrast to previous studies, this research concluded that the length of the webinar and finding it interesting might not be critical from the participants' point of view. Moreover, participants of the study valued depth of content more than breadth. The research also identified a potentially overlooked satisfaction factor: the webinar description, which shapes participants' expectations.