Background: Regardless of statistical methodology, public performance report cards must use the highest-quality validated data, preferably from a prospectively maintained clinical database. Using logistic regression and hierarchical models, we compared hospital cardiac surgery profiling results based on clinical data with those derived from contemporaneous administrative data.

Methods and Results: Fiscal year 2003 isolated coronary artery bypass grafting (CABG) surgery results based on an audited and validated Massachusetts clinical registry were compared with those derived from a contemporaneous state administrative database, the latter using the inclusion/exclusion criteria and risk model of the Agency for Healthcare Research and Quality. There was a 27.4% disparity in isolated CABG surgery volume (4440 clinical, 5657 administrative), a 0.83% absolute difference in observed in-hospital mortality (2.05% versus 2.88%), corresponding differences in risk-adjusted mortality calculated by various statistical methodologies, and 1 hospital classified as an outlier only with the administrative data-based approach. The discrepancies in volumes and risk-adjusted mortality were most notable for higher-volume programs, which presumably perform a higher proportion of combined procedures that were misclassified as isolated CABG surgery in the administrative cohort. Subsequent analyses of a patient cohort common to both databases revealed the smoothing effect of hierarchical models, a 9% relative difference in mortality (2.21% versus 2.03%) resulting from nonstandardized mortality end points, and 1 hospital classified as an outlier using logistic regression but not using hierarchical regression.

Conclusions: Cardiac surgery report cards based on administrative data are problematic compared with those derived from audited and validated clinical data, primarily because of case misclassification and nonstandardized end points.
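The "smoothing effect of hierarchical models" noted above can be illustrated with a minimal sketch. This is not the study's actual risk model; it is a simple empirical-Bayes (beta-binomial) shrinkage estimator, with invented hospital labels, volumes, and a prior strength chosen only for illustration. It shows why a low-volume hospital's extreme raw mortality rate is pulled toward the overall rate, while a high-volume hospital's rate is left nearly unchanged, which is how a hospital can be flagged as an outlier under ordinary logistic regression but not under a hierarchical model.

```python
# Illustrative sketch only (hypothetical data, not the study's model):
# empirical-Bayes shrinkage of hospital in-hospital mortality rates.

def shrunken_rate(deaths, cases, prior_rate, prior_strength):
    """Beta-binomial posterior mean of a hospital's mortality rate.

    Acts like adding `prior_strength` pseudo-cases at the statewide
    rate: small hospitals are pulled strongly toward `prior_rate`,
    large hospitals keep estimates close to their raw data.
    """
    return (deaths + prior_strength * prior_rate) / (cases + prior_strength)

# Hypothetical hospitals: name -> (deaths, isolated CABG cases)
hospitals = {"A": (5, 50), "B": (6, 300), "C": (24, 1200)}

total_deaths = sum(d for d, _ in hospitals.values())
total_cases = sum(n for _, n in hospitals.values())
statewide = total_deaths / total_cases  # overall mortality rate

for name, (d, n) in hospitals.items():
    raw = d / n
    smoothed = shrunken_rate(d, n, statewide, prior_strength=200)
    print(f"Hospital {name}: raw {raw:.2%} -> smoothed {smoothed:.2%}")
```

In this toy example, hospital A's raw 10% rate on 50 cases shrinks sharply toward the statewide rate, whereas hospital C's estimate, based on 1200 cases, barely moves; the same mechanism explains why the hierarchical analysis flagged one fewer outlier than logistic regression in the common cohort.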