Background: In previous research, variables such as age, education, treatment credibility, and therapeutic alliance have been shown to affect patients' treatment adherence and outcome in Internet-based psychotherapy. A more detailed understanding of how such variables are associated with different measures of adherence and clinical outcomes may help in designing more effective online therapy.

Objective: The aims of this study were to investigate demographic, psychological, and treatment-specific variables that could predict dropout, treatment adherence, and treatment outcomes in a study of online relaxation for mild to moderate stress symptoms.

Methods: Participant dropout and attrition, as well as data from self-report instruments completed before, during, and after the online relaxation program, were analyzed. Multiple linear and logistic regression analyses were conducted to predict early dropout, overall attrition, online treatment progress, number of registered relaxation exercises, posttreatment symptom levels, and reliable improvement.

Results: Dropout was significantly predicted by treatment credibility, whereas overall attrition was associated with reporting a focus on immediate consequences and experiencing a low level of intrinsic motivation for the treatment. Treatment progress was predicted by education level and treatment credibility, whereas the number of registered relaxation exercises was associated with experiencing intrinsic motivation for the treatment. Posttreatment stress symptoms were positively predicted by feeling external pressure to participate in the treatment and negatively predicted by treatment credibility. Reporting reliable symptom improvement after treatment was predicted by treatment credibility and therapeutic bond.

Conclusions: This study confirmed that treatment credibility and a good working alliance are factors associated with successful Internet-based psychotherapy. Further, the study showed that measuring adherence in different ways provides somewhat different results, which underscores the importance of carefully defining treatment adherence in psychotherapy research. Lastly, the results suggest that finding the treatment interesting and engaging may help patients carry through with the intervention and complete prescribed assignments, a finding that may help guide the design of future interventions.

Trial Registration: ClinicalTrials.gov NCT02535598; http://clinicaltrials.gov/ct2/show/NCT02535598 (Archived by WebCite at http://www.webcitation.org/6fl38ms7y).
Background: Research on Internet-based interventions typically uses digital versions of pen-and-paper self-report symptom scales. However, adaptation into the digital format could affect the psychometric properties of established self-report scales. Several studies have investigated differences between digital and pen-and-paper versions of instruments, but no systematic review of the results has yet been done.

Objective: This review aims to assess the interformat reliability of self-report symptom scales used in digital or online psychotherapy research.

Methods: Three databases (MEDLINE, Embase, and PsycINFO) were systematically reviewed for studies investigating the reliability between digital and pen-and-paper versions of psychiatric symptom scales.

Results: From a total of 1504 publications, 33 were included in the review, and the interformat reliability of 40 different symptom scales was assessed. Significant differences in mean total scores between formats were found in 10 of 62 analyses. These differences were concentrated in just a few studies, which indicates that they were due to study and sample effects rather than unreliable instruments. The interformat reliability ranged from r=.35 to r=.99; however, the majority of instruments showed a strong correlation between format scores. The quality of the included studies varied, and several had insufficient statistical power to detect small differences between formats.

Conclusions: When digital versions of self-report symptom scales are compared with pen-and-paper versions, most scales show high interformat reliability. This supports the reliability of results obtained in psychotherapy research on the Internet and the comparability of those results to traditional psychotherapy research. There are, however, some instruments that consistently show low interformat reliability, suggesting that these conclusions cannot be generalized to all questionnaires. Most studies had at least some methodological issues, with insufficient statistical power being the most common. Future studies should preferably describe the transformation of the instrument into digital format and the data collection procedure in more detail.
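The interformat reliability reported above is the correlation between paired total scores from the two formats. As a minimal illustration of the statistic (not the review's actual analysis code), the following sketch computes Pearson's r for hypothetical paired scores; the score values are invented for demonstration only.

```python
# Pearson's r between paired totals from the pen-and-paper and digital
# formats of the same instrument, completed by the same respondents.

def pearson_r(xs, ys):
    """Pearson product-moment correlation between two equal-length lists."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

paper_scores = [12, 18, 7, 25, 14, 9, 21, 16]    # hypothetical paper totals
digital_scores = [13, 17, 8, 24, 15, 10, 20, 17]  # same respondents, digital

print(f"interformat reliability r = {pearson_r(paper_scores, digital_scores):.2f}")
```

A value near 1 indicates that respondents are rank-ordered almost identically by the two formats, which is the sense in which most instruments in the review were found reliable across formats.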
Introduction: There is a constant need for theoretically sound and valid self-report instruments for measuring psychological distress. Previous studies have shown that the Depression, Anxiety and Stress Scale-21 (DASS-21) is theoretically sound, but there have been some inconsistent results regarding its factor structure.

Aims: The aim of the present study was to investigate and elucidate the factor structure and convergent validity of the DASS-21.

Methods: A total of 624 participants were recruited from student, primary care, and psychotherapy populations. The factor structure of the DASS-21 was assessed by confirmatory factor analyses, and the convergent validity by investigating its unique correlations with other psychiatric instruments.

Results: A bifactor structure with depression, anxiety, stress, and a general factor provided the best fit indices for the DASS-21. The convergent validity was adequate for the Depression and Anxiety subscales but more ambiguous for the Stress subscale.

Discussion: The present study overall supports the validity and factor structure of the DASS-21.

Implications for practice: The DASS-21 can be used to measure symptoms of depression and anxiety as well as overall distress. It can be useful for mental health nurses, and other first-line psychiatric professionals, in need of a short, feasible, and valid instrument in everyday care.