2006
DOI: 10.1097/01.mlg.0000205148.14269.09

Assessing and Documenting General Competencies in Otolaryngology Resident Training Programs

Abstract: Compliance for completion of forms was 97%. The system facilitated the educational management of our training program along multiple dimensions. The small perceptual differences among a highly selected group of residents have made unambiguous validation of the system challenging. The instruments and approach warrant further study. Improvements are likely best achieved through broad consultation with other otolaryngology programs.

Cited by 14 publications (46 citation statements)
References 15 publications

“…2 Between 2002 and 2007, complex and customized tools were used with varying success in cardiothoracic surgery, 3 otolaryngology, 4 general surgery, 5 physical medicine and rehabilitation, 6,7 radiology, 8 and internal medicine 9 residency programs. However, 1 study of trauma residents concluded that the 360-degree evaluation provided limited new information with increased work for little return.…”
Section: Introduction (mentioning)
confidence: 99%
“…Six studies provided no psychometric data (Willoughby et al. 1979; Alagna & Reddy 1985; Amato & Novales-Castro 2009; Chen et al. 2009; Nofziger et al. 2010; Perera et al. 2010). Only a few studies (Cottrell et al. 2006; Roark et al. 2006; O'Brien et al. 2008) described the concept of content validity: the extent to which the domain of interest was sampled comprehensively by the items in the questionnaire (Terwee et al. 2007). This content validity was usually explained by referring to existing literature or frameworks.…”
Section: Psychometric Characteristics (mentioning)
confidence: 99%
“…Criterion validity, frequently mentioned in Table 2, refers to the relationship linking the attributes in a tool with performance on a criterion, whereas predictive validity (Linn et al. 1976; Arnold et al. 1981; Dijcks et al. 2003; Lurie et al. 2007) indicates the degree to which test scores predict performance on some future criterion, and convergent validity (Lurie et al. 2006a) the extent to which different measures of the same construct correlate with one another (DeVon et al. 2007). Although no single criterion or gold standard will ever be perfect (Norman et al. 1996), faculty ratings are usually considered the gold standard in educational settings (Lin et al. 1975; van Rosendaal & Jennett 1994; Davis 2002; Dijcks et al. 2003; Bryan et al. 2005; Roark et al. 2006). Construct validity refers to the extent to which scores on a particular questionnaire relate to other measures in a manner that is consistent with theoretically derived hypotheses concerning the concepts that are being measured (Terwee et al. 2007).…”
Section: Psychometric Characteristics (mentioning)
confidence: 99%
“…We found 12 articles that described assessment tools used to measure care-management as part of the general evaluation of trainees’ ACGME or CanMEDS competencies (28-37). In the majority of these studies, new assessment tools were developed or previously known ones modified for this purpose (29-32, 34-36, 38).…”
Section: Results (mentioning)
confidence: 99%