1996
DOI: 10.1111/j.1553-2712.1996.tb03371.x

Reliability of Performance‐based Clinical Skill Assessment of Emergency Medicine Residents

Abstract: Objective: To test the overall reliability of a performance-based clinical skill assessment for entering emergency medicine (EM) residents. Also, to investigate the reliability of separate reporting of diagnostic and management scores for a standardized patient case, subjective scoring of patient notes, and interstation exercise scores. Methods: Thirty-four first-year EM residents were tested using a 10-station standardized patient (SP) examination. Following each 10-minute encounter, the residents completed a…

Cited by 14 publications (8 citation statements); references 8 publications. Citing publications range from 2002 to 2017.

Citation statements (ordered by relevance):
“…The diagnostic tasks included the interpretation of chest radiographs, which are one of the most commonly performed radiologic examinations in the ED setting. Of all the clinical and diagnostic tasks that they were required to perform, they scored the lowest on chest x‐ray interpretation …”
Section: Discussion (mentioning)
confidence: 99%
“…In this study, the more experienced staff physicians' inputs represent the benchmark to which residents' values are compared. This approach is widely used in the literature to evaluate performance of less experienced clinicians and can take the form of comparing novices to experts performing the same task (Nodine, Kundel, Mello-Thoms, Weinstein, Orel, Sullivan & Conant, 1999;Sklar, Hauswald & Johnson, 1991), or having expert clinicians evaluating the performance of novice clinicians (Burdick et al, 1996;Steinbach, 2002;Wray & Friedland, 1983). As a measure of proper elicitation, we use a level of agreement beyond chance between values for CDSS input variables provided by staff physicians and residents.…”
Section: Discussion (mentioning)
confidence: 99%
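The "level of agreement beyond chance" mentioned in the statement above is typically quantified with a chance-corrected statistic; the excerpt does not say which measure the cited study used, so the sketch below assumes Cohen's kappa purely for illustration, with hypothetical rater data.

```python
# Minimal sketch (assumption): Cohen's kappa as one common measure of
# agreement beyond chance between two raters; the cited study's exact
# statistic and data are not given in this excerpt.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters labeling the same items."""
    n = len(rater_a)
    # Observed agreement: fraction of items on which the two raters match.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement, from each rater's marginal label frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical values for one CDSS input variable, staff physician vs. resident.
staff    = ["high", "high", "low", "low", "high", "low"]
resident = ["high", "low",  "low", "low", "high", "low"]
print(round(cohens_kappa(staff, resident), 2))  # 0.67: agreement well beyond chance
```

A kappa near 0 indicates agreement no better than chance, while values approaching 1 indicate that the residents' inputs closely track the staff-physician benchmark.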
“…Our CEE was developed after publication of the ACGME general competencies and the "Model of the Clinical Practice of Emergency Medicine," and incorporates language and principles from these important documents. Other studies involving emergency medicine faculty have noted that standardized objective evaluations provide better interrater reliability than global assessment scoring, 8,9 and we have tried to incorporate that lesson as well.…”
Section: Discussion (mentioning)
confidence: 99%