Background: Understanding implementation fidelity, or adherence to the intervention as intended, is essential to interpreting the results of evaluations. In this paper, we propose a longitudinal, explanatory approach to implementation fidelity through a realist evaluation lens. We apply this approach to a mixed-methods assessment of implementation fidelity to an electronic decision support system intervention to improve the quality of antenatal care in Nepal.

Methods: The tablet-based electronic decision support system was implemented in 19 primary care facilities in Nepal. As part of the project's process evaluation, we used four data sources (monitoring visit checklists, fieldnotes, software backend data, and longitudinal case studies in four facilities) to examine three components of fidelity: use at the point of care, use for all antenatal visits, and quality of data entry. Quantitative data were analysed descriptively. Qualitative data were analysed thematically using template analysis, first to examine descriptive findings across the three fidelity components and later to develop and reflect on causal mechanisms. Findings were synthesised, drawing on Normalization Process Theory, to understand the processes driving the different patterns of fidelity observed.

Results: Fidelity to point-of-care use declined over time, with healthcare providers often entering data after antenatal visits had ended, because they understood the intervention as primarily about recordkeeping rather than decision support. Even in facilities with higher fidelity to point-of-care use, the software's decision-support prompts were largely ignored. Low antenatal client caseloads, together with fieldworkers' suggestion that providers practise by back-entering data from previous antenatal visits, undermined understanding of the intervention's purpose for decision support.
Conclusions: Our assessment explains how and why these patterns of implementation fidelity occurred, yielding a more nuanced understanding of the project evaluation's null result that moves beyond a binary judgement of intervention versus implementation failure. Our findings demonstrate the importance of discussing intervention theory in terms that fieldworkers and participants understand, so that fidelity is not inadvertently undermined.