1992
DOI: 10.1007/bf02599452

How well do faculty evaluate the interviewing skills of medical students?

Abstract: To accurately evaluate clinical interviewing skills we must enhance rater consistency, particularly in assessing those skills that both satisfy patients and yield crucial data.

Cited by 64 publications (29 citation statements, published 1999-2023) | References 33 publications
“…Thus, the assessments made by the experts in our study were unreliable. Although our sample was small, it concurs with data from other studies involving assessment of consulting skills (Kalet et al, 1992; Noel et al, 1992). It seems reasonable to conclude that unstructured global assessments are unreliable.…”
Section: Discussion
supporting
confidence: 89%
“…However, inconsistencies may arise in teaching because, without structured guidance, experienced clinicians are unreliable when assessing clinical skills (Noel et al, 1992). Interrater reliability has been highlighted as particularly poor in the assessment of interviewing skills (Kalet et al, 1992); students have been scored on the basis of their likability rather than by the specific behavioural skills they demonstrate. As a consequence, the ability of assessors to provide specific feedback on behaviour may be limited.…”
mentioning
confidence: 99%
“…Kalet et al 24 had faculty raters rerate videotapes of 21 medical student interviews after a period of 3 to 6 months. They found substantial agreement between the earlier and later ratings only when raters judged the information obtained (i.e., data collection) during the interview.…”
Section: Rater Agreement (Intra- and Interrater Agreement)
mentioning
confidence: 99%
“…Kalet et al 24 also had four judges each rate medical interviews. They concluded that interrater agreement was poor when judging the overall interview as well as when judging the information obtained and the interviewing process.…”
Section: Rater Agreement (Intra- and Interrater Agreement)
mentioning
confidence: 99%
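The intra- and interrater agreement these statements refer to is conventionally quantified with a chance-corrected index such as Cohen's kappa. Kalet et al's exact statistic and data are not given here, so the Python sketch below is purely illustrative: the cohens_kappa helper and the pass/fail ratings for two hypothetical faculty raters are assumptions, not the study's method.

from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    # Chance-corrected agreement between two raters scoring the same items.
    # Illustrative only; not the statistic reported by Kalet et al.
    n = len(ratings_a)
    observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    freq_a, freq_b = Counter(ratings_a), Counter(ratings_b)
    expected = sum((freq_a[c] / n) * (freq_b[c] / n) for c in set(freq_a) | set(freq_b))
    return (observed - expected) / (1 - expected)

# Hypothetical pass/fail judgments on ten videotaped interviews.
rater1 = ["pass", "pass", "fail", "pass", "fail", "pass", "pass", "fail", "pass", "pass"]
rater2 = ["pass", "fail", "fail", "pass", "pass", "pass", "fail", "fail", "pass", "pass"]
print(f"kappa = {cohens_kappa(rater1, rater2):.2f}")  # ~0.35 despite 70% raw agreement

A kappa near zero indicates agreement no better than chance, which is the pattern the citing studies describe for unstructured global judgments of interviewing skill.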
“…This effort grew out of increasing concern over the lack of authenticity and validity of clinical skills assessment. [9][10][11] Authentic assessments require students to demonstrate knowledge and skills in ways that represent, as closely as possible, the enactment of behaviors required in professional care practice. The White Paper Report of the Association of American Medical Colleges (AAMC) Medical School Objectives Project on communication in medicine (www.aamc.org/meded/msop/msop3.pdf) identified the core behaviors and interpersonal skills in relating to patients that should be taught and assessed in medical students.…”
Section: Introduction
mentioning
confidence: 99%