Background
The Department of Veterans Affairs (VA) Office of Rural Health (ORH) supports national VA program offices' efforts to expand health care to rural Veterans through its Enterprise-Wide Initiatives (EWIs) program. In 2017, ORH selected Reach, Effectiveness, Adoption, Implementation, and Maintenance (RE-AIM), an implementation science framework, to structure the EWI evaluation and reporting process. As part of its mandate to improve EWI program evaluation, the Center for the Evaluation of Enterprise-Wide Initiatives conducted a qualitative evaluation to better understand EWI teams' perceptions of, and barriers and facilitators to, the EWI evaluation process.

Methods
We conducted 43 semi-structured interviews with 48 team members (e.g., evaluators, program office leads, and field-based leads) representing 21 EWIs from April to December 2020. Questions focused on participants' experiences using strategies targeting each RE-AIM dimension. Interviews were inductively analyzed in MAXQDA. We also systematically reviewed 51 FY19-FY20 EWI annual reports to identify trends in misapplications of RE-AIM.

Results
Participants had differing levels of experience with RE-AIM. While participants understood ORH's rationale for selecting a common framework to structure evaluations, the perceived misalignment between RE-AIM and EWIs' work emerged as an important theme. Concerns centered on three sub-themes: (1) (Mis)Alignment with RE-AIM Dimensions, (2) (Mis)Alignment between RE-AIM and the EWI, and (3) (Mis)Alignment with RE-AIM vs. Other Theories, Models, or Frameworks. Participants described challenges differentiating between and operationalizing dimensions in unique contexts. Participants also held misconceptions about RE-AIM and its relevance to their work, e.g., that it was meant for established programs and did not capture aspects of initiative planning, adaptation, or sustainability. Less commonly, participants suggested alternative models or frameworks to RE-AIM.
Despite these criticisms, many participants found RE-AIM useful, cited training as important to understanding its application, and identified additional training as a future need.

Discussion
Selecting a shared implementation science framework can be beneficial, but also challenging when the framework is applied to diverse initiatives or contexts. Our findings suggest that establishing a common understanding, operationalizing framework dimensions for specific programs, and assessing training needs may better equip partners to integrate a shared framework into their evaluations.