This study examines how the timing of corrective formative feedback during text processing affects question answering. Undergraduate students read an expository text and answered questions in two attempts. Students were randomly assigned to a no-feedback, an immediate-feedback, or a delayed-feedback condition. Students in the feedback conditions received feedback on the correctness of their answer after the first attempt and were informed of the right answer after the second attempt. Students in the feedback conditions were prompted to restudy the text after failing on their first attempt, whereas students in the no-feedback condition were only prompted to search the text. All students were tested on question answering, corrective probability, and a post-test cued-recall test. Results showed that (a) feedback reduced the initial time spent reading the text, (b) feedback increased performance on question answering and cued recall, and (c) delayed feedback produced no advantages over immediate feedback. Theoretical and practical implications of these results are discussed.
Background: Internet-based mindfulness interventions are a promising approach to addressing challenges in the dissemination and implementation of mindfulness interventions, but it is unclear how the instructional design components of such interventions are associated with intervention effectiveness.

Objective: The objective of this study was to identify the instructional design components of internet-based mindfulness interventions and to provide a framework for classifying those components relative to intervention effectiveness.

Methods: The critical interpretive synthesis method was applied. In phase 1, a strategic literature review was conducted to generate hypotheses about the relationship between the effectiveness of internet-based mindfulness interventions and their instructional design components. In phase 2, the literature review was extended to systematically explore and revise the hypotheses from phase 1.

Results: A total of 18 studies were identified in phase 1; 14 additional studies were identified in phase 2. Of the 32 internet-based mindfulness interventions, 18 were classified as more effective, 11 as less effective, and 3 as ineffective. The effectiveness of the interventions increased with the level of support provided by the instructional design components. The main difference between effective and ineffective interventions was the presence of just-in-time information in the form of reminders. More effective interventions scored higher than less effective interventions on supportive information (1.91 in both phases vs 1.00 in phase 1 and 1.80 in phase 2), part-task practice (1.18 in phase 1 and 1.60 in phase 2 vs 0.33 and 1.40), and just-in-time information (1.35 in phase 1 and 1.67 in phase 2 vs 0.83 and 1.60). The average duration of more effective, less effective, and ineffective interventions differed in the phase 1 studies, with more effective interventions lasting longer (7.45 weeks) than less effective (4.58 weeks) or ineffective interventions (3 weeks). However, this difference did not extend to the phase 2 studies, which had comparable average durations for effective (5.86 weeks), less effective (5.6 weeks), and ineffective (7 weeks) interventions.

Conclusions: Our results suggest that to be effective, internet-based mindfulness interventions must contain 4 instructional design components: formal learning tasks, supportive information, part-task practice, and just-in-time information. The effectiveness of the interventions increases with the level of support provided by each of these components.