This paper reviews the literature on an important but mysterious phenomenon in qualitative research methodology: the conceptual leap that generates abstract theoretical ideas from empirical data. Drawing on epistemological, prescriptive and reflexive writings, conceptual leaps are described as constituted by both 'seeing' and 'articulating', as grounded in abductive reasoning, and as part of an ongoing dialectical process. Methods for approaching conceptual leaps and the conditions for their realization are discussed in the context of four dialectic tensions: between deliberation and serendipity, between engagement and detachment, between knowing and not knowing, and between self-expression and social connection. The literature review suggests that conceptual leaping is best portrayed as a form of bricolage, drawing resources from the different poles of the four dialectics. Moreover, written and verbal communication play important roles in enabling synthesis. The paper concludes by calling for greater openness and legitimacy for reflexive accounts, as well as further research into the process of discovery in qualitative research.

The authors would like to thank three anonymous reviewers for their constructive comments on an earlier version of this paper. The authors are also grateful to the Social Sciences and Humanities Research Council of Canada for its financial support.
Researcher presence in the field (“being there”) has long been a topic of scholarly discussion in qualitative inquiry. However, the representation of field presence in research accounts merits increased methodological attention as it impacts readers’ understanding of study phenomena and theoretical contributions. We maintain that the current ambiguity around representing field involvement is rooted in our scholarly community’s “involvement paradox.” On one hand, we laud field proximity as a tenet of qualitative inquiry. On the other hand, we insist on professional distance to avoid “contamination” of findings. This leaves authors in a difficult position as they attempt to weave field involvement into written accounts. We draw on existing conceptual articles and illustrative exemplars to introduce four interrelated dimensions of representation: visibility, voice, stance, and reflexivity. These are intended to structure thinking about how authors do, and can, cast field involvement in research accounts as they navigate the involvement paradox. We encourage researcher-authors to think carefully about how they attend to their field presence as they craft research accounts, in order to enhance their legitimacy, trustworthiness, and richness.
Introduction: Robust evaluation of service models can improve the quality and efficiency of care while articulating the models for potential replication. Even though it is an essential part of learning health systems, evaluations that benchmark and sustain models serving adults with developmental disabilities are lacking, impeding pilot programs from becoming official care pathways. Here, we describe the development of a program evaluation for a specialized medical‐dental community clinic serving adults with autism and intellectual disabilities in Montreal, Canada.

Method: Using a participatory action‐oriented approach, researchers and staff co‐designed an evaluation for a primary care service for this population. We performed an evaluability assessment to identify the processes and outcomes that were feasible to capture and elicited perspectives at both clinical and health system levels. The RE‐AIM framework was used to categorize and select tools to capture data elements that would inform practice at the clinic.

Results: We detail the process of conceptualizing the evaluation framework and operationalizing the domains using a mixed‐methods approach. Our experience demonstrated (1) the utility of a comprehensive framework that captures contextual factors in addition to clinical outcomes, (2) the need for validated measures that are not cumbersome for everyday practice, (3) the importance of understanding the functional needs of the organization and building a sustainable data infrastructure that addresses those needs, and (4) the need to commit to an evolving, "living" evaluation in a dynamic health system.

Conclusions: Evaluation employing rigorous patient‐centered and systems‐relevant metrics can help organizations effectively implement and continuously improve service models. Using an established framework and a collaborative approach provides an important blueprint for a program evaluation in a learning health system.
This work provides insight into the process of integrating care for vulnerable populations with chronic conditions into health care systems, and into integrated knowledge-generation processes between research and health systems.