Background: The importance of findings derived from syntheses of qualitative research has been increasingly acknowledged. Findings that arise from qualitative syntheses inform questions of practice and policy in their own right and are commonly used to complement findings from quantitative research syntheses. The GRADE approach has been widely adopted by international organisations to rate the quality and confidence of the findings of quantitative systematic reviews. To date, there has been no widely accepted corresponding approach to assist health care professionals and policy makers in establishing confidence in the synthesised findings of qualitative systematic reviews.

Methods: A methodological group was formed to develop a process for assessing confidence in synthesised qualitative research findings and to develop a Summary of Findings table for meta-aggregative qualitative systematic reviews.

Results: Dependability and credibility are the two elements considered by the methodological group to influence confidence in synthesised qualitative findings. A set of critical appraisal questions is proposed to establish dependability, whilst credibility can be ranked according to the goodness of fit between the author's interpretation and the original data. By following the processes outlined in this article, an overall ranking can be assigned to rate the confidence of synthesised qualitative findings, a system we have labelled ConQual.

Conclusions: The development and use of the ConQual approach will assist users of qualitative systematic reviews in establishing confidence in the evidence produced in these types of reviews and can serve as a practical tool to aid decision making.
This article is the first in a new series on systematic reviews from the Joanna Briggs Institute, an international collaborative supporting evidence-based practice in nursing, medicine, and allied health fields. The purpose of the series is to show nurses how to conduct a systematic review, one step at a time. This first installment provides a synopsis of the systematic review as a scientific exercise, one that influences health care decisions.
The concept of validity has been a central component of critical appraisal exercises evaluating the methodological quality of quantitative studies. Reactions from qualitative researchers have been mixed as to whether validity should be applied to qualitative research and, if so, what criteria should be used to distinguish high-quality articles from others. We compared three online critical appraisal instruments' ability to facilitate an assessment of validity. Many reviewers have used the Critical Appraisal Skills Programme (CASP) tool to complete their critical appraisal exercise; however, CASP appears to be less sensitive to aspects of validity than the Evaluation Tool for Qualitative Studies (ETQS) and the Joanna Briggs Institute (JBI) tool. The ETQS provides detailed instructions on how to interpret criteria; however, it is the JBI tool, with its focus on congruity, that appears to be the most coherent.