2020
DOI: 10.4300/jgme-d-19-00730.1
Trainee and Program Director Perspectives on Meaningful Patient Attribution and Clinical Outcomes Data

Abstract: Background The Accreditation Council for Graduate Medical Education specifies that trainees must receive clinical outcomes and quality benchmark data at specific levels related to institutional patient populations. Program directors (PDs) are challenged to identify meaningful data and provide them in formats acceptable to trainees. Objective We sought to understand what types of patients, data/metrics, and data delivery systems trainees and PDs prefer for supplying trainees with clinical outcomes data. Methods…

Cited by 7 publications (10 citation statements)
References 14 publications
“…Prior studies have described barriers in developing dashboards for use by trainees, including challenges with patient attribution, which can lead trainees to feel that metrics are not as meaningful. 5,25-28 In our study, resident users seemed overall to understand the limitations of the dashboard, but similarly reported that some metrics were less meaningful on an individual basis due to patient attribution limitations. Resident comments indicated that they felt some metrics were more reflective of decisions made by the care team rather than an individual, which is consistent with findings of other studies regarding the challenges of creating resident-specific performance metrics.…”
Section: Discussion
confidence: 63%
“…Resident comments indicated that they felt some metrics were more reflective of decisions made by the care team rather than an individual, which is consistent with findings of other studies regarding the challenges of creating resident-specific performance metrics. 5,26,29 Another study has prioritized a list of resident-specific quality metrics which could mitigate this issue, but these metrics primarily focused on content captured within resident documentation (e.g., work of breathing or response to therapy documentation). 30 While these metrics would be very specific to the work of an individual resident, these data are very challenging to integrate into an automated tool without sophisticated natural language processing, so we were not able to include these metrics in this iteration of our dashboard.…”
Section: Discussion
confidence: 99%
“…19,28 Focus groups revealed residents felt empowered and appreciated this autonomy, but they found it problematic when goals were not relevant to patient visits. Previous work has suggested using resident-sensitive metrics, or metrics that are actionable and appropriate for resident panel populations and align with educational goals. 19,29-31 Continuous review of whether metrics are appropriate for residents may be needed. It is also plausible that more time may increase the likelihood that residents would have pertinent patient visits, but future work is needed to address how to better align learner needs with patient care.…”
Section: Discussion
confidence: 99%
“…10 Two key elements to consider when providing outcomes data are the extent to which the data are timely and actionable. 11 In many large programs, residents rotate between services, units, and even health systems on a monthly basis. If residents are to change their own behavior based on data, they must receive data while they are still on the rotation.…”
Section: Secondary Driver: Make Available Expert Improvement and Innovation Support Locally
confidence: 99%