Background: Non-adherence (NA) to immunosuppressants (IS) among renal transplant recipients (RTRs) is associated with a higher risk of allograft rejection, graft loss, and mortality. Precise measurement of NA is therefore indispensable, yet its reported prevalence varies greatly depending on the measurement method. The objective of this study was to assess the accuracy and concordance of different methods of measuring NA in patients after renal transplantation.

Design and methods: This was a single-center prospective observational study. At baseline (T0), NA was measured via physicians' estimates (PE), self-reports (SR), and tacrolimus trough level variability (CV%; see the note below) in 78 RTRs. A Visual Analogue Scale (VAS, 0-100%) was used for both SR and PE. In addition, we used the BAASIS© for SR and a 5-point Likert scale for PE. NA was then measured prospectively via electronic monitoring (EM, VAICA©) over a three-month period, during which all participants received phone calls at two-week intervals (T1-T6) to obtain SRs.

Results: Seventy-eight RTRs participated in the study. At T0, NA rates of 6.4%, 28.6%, and 15.4% were found for PE, SR, and CV%, respectively. No correlation was found between these methods. Over the course of the study, self-reported and electronically monitored adherence remained high, with a minimum mean of 91.2% for the strictest adherence measure (timing adherence ±30 min). Our results revealed a moderate to high association between SR and EM. In contrast to PE and CV%, SR significantly predicted electronically monitored adherence. Overall, electronically monitored adherence declined over the course of the study for both taking and timing adherence (±2 h, ±30 min).
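
Note on CV%: the abstract does not specify how tacrolimus trough level variability was computed; as a minimal sketch, assuming the standard coefficient of variation over a patient's repeated trough levels $c_1, \dots, c_n$:

$$
\mathrm{CV\%} \;=\; \frac{\sigma}{\bar{c}} \times 100,
\qquad
\bar{c} \;=\; \frac{1}{n}\sum_{i=1}^{n} c_i,
\qquad
\sigma \;=\; \sqrt{\frac{1}{n-1}\sum_{i=1}^{n}\bigl(c_i - \bar{c}\bigr)^{2}}.
$$

A patient would then presumably be classified as non-adherent when CV% exceeds a prespecified cut-off; the cut-off used in this study is not stated in the abstract.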