Background: A key challenge in behavioural medicine is developing interventions that can be delivered adequately (i.e., with fidelity) within real-world consultations. Accordingly, clinical trials should (but tend not to) report what is actually delivered (adherence), how well it is delivered (competence), and the distinction between intervention and comparator conditions (differentiation). Purpose: To address this important clinical and research priority, we apply best-practice guidelines to evaluate fidelity within a real-world, stepped-wedge evaluation of “EAT: Eating As Treatment”, a new dietitian-delivered health behaviour change intervention designed to reduce malnutrition in head and neck cancer (HNC) patients undergoing radiotherapy. Methods: Dietitians (n = 18) from five Australian hospitals first delivered a period of routine care; following a randomly determined order, each site then received training and began delivering the EAT intervention. A 20% random stratified sample of audio-recorded consultations (control n = 196; intervention n = 194) was coded by trained, independent raters using a study-specific checklist and the Behaviour Change Counselling Inventory. Intervention adherence and competence were examined relative to a priori benchmarks. Differentiation was examined by comparing control and intervention sessions (adherence, competence, non-specific factors, and dose) via multiple linear regression, logistic regression, or mixed models. Results: Achievement of adherence benchmarks varied. The majority of sessions attained competence. Post-training consultations were clearly distinct from routine care in motivational and behavioural, but not generic, skills. Conclusions: Although what level of fidelity is “good enough” remains an important research question, findings support the real-world feasibility of integrating EAT into dietetic consultations with HNC patients and provide a foundation for interpreting treatment effects.