2015
DOI: 10.1016/j.acra.2015.07.003
Conventional Medical Education and the History of Simulation in Radiology

Cited by 64 publications (41 citation statements)
References 98 publications
“…problem-based learning, case-based learning, and team-based learning [8][9][10]. Unlike these previously studied conventional methods, with the concept of learning from clinical experience, we modified a new experiential education model that enables students to practice radiology interpretation and diagnosis by taking on the radiologists' role in a simulated environment.…”
Section: Studies Focusing on a Variety of Radiology Education Models (mentioning; confidence: 99%)
“…Our literature review on competency management frameworks revealed that the bulk of the Canadian literature on competencies deals with medical education and professional development, within the scope of the Canadian Medical Education Directives (CanMEDS) Framework for Canada's medical postgraduate training programs [32][33][34]. The CanMEDS framework was developed in 1996 and since then has been modified for use in other countries [32].…”
Section: Competency Management Framework in Canada (mentioning; confidence: 99%)
“…Observational tools include global rating scales, checklists (checking "respect for tissue, efficiency of time and motion, instrument handling, knowledge of instruments, use of assistants, flow of operation, forward planning and knowledge of specific procedural steps"). Non-observational tools include computer based measurements such as scores "generated based on errors, economy of movement, and time to complete the task" [13].…”
Section: Medical Competency Framework (mentioning; confidence: 99%)
“…Issues with simulations and competencies "include the challenges and costs of obtaining and using appropriate simulation software and hardware, concerns about validation of simulations as an educational tool, and difficulty in creating normative standards for grading performance." To summarize, potential barriers to simulation and competencies include: access, cost, instructor availability, educational validity, assessment and outcome measurements [13].…”
Section: Medical Competency Framework (mentioning; confidence: 99%)