Published: 2012
DOI: 10.3402/meo.v17i0.19623

Direct short-term effects of EBP teaching: change in knowledge, not in attitude; a cross-cultural comparison among students from European and Asian medical schools

Abstract: Introduction: We report on the direct short-term effects of a Clinical Epidemiology and Evidence-based Medicine (CE-EBM) module on the knowledge, attitude, and behavior of students at the University Medical Center Utrecht (UMCU), Universitas Indonesia (UI), and University of Malaya (UM). Methods: We used an adapted version of a 26-item validated questionnaire, including four subscales: knowledge, attitude, behavior, and future use of evidence-based practice (EBP). The four components were compared among the stud…

Cited by 7 publications (5 citation statements)
References 18 publications
“…The content and structure of the CE-EBM module at FMUI has been described elsewhere 17,18. Adapted from the University Medical Center Utrecht, the module is completed over four weeks.…”
Section: Methods (mentioning)
confidence: 99%
“…Attitudes were measured using part of the Knowledge Attitudes and Behaviors questionnaire developed by Johnston and colleagues, 25 which had previously been used to assess the first implementation of the CE-EBM module in FMUI and had been validated accordingly, with a Cronbach’s α value of 0.76 18. Students were asked to score each item on a 6-point Likert scale (from strongly agree to strongly disagree).…”
Section: Methods (mentioning)
confidence: 99%
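For context, the Cronbach’s α cited in the statement above is the standard internal-consistency coefficient for multi-item scales; as background only (the definition is not drawn from the cited paper), its usual form for a k-item questionnaire is:

\[
\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k}\sigma^{2}_{i}}{\sigma^{2}_{X}}\right)
\]

where k is the number of items, \(\sigma^{2}_{i}\) is the variance of item i, and \(\sigma^{2}_{X}\) is the variance of the total score; values around 0.7 or above, such as the 0.76 reported here, are conventionally read as acceptable internal consistency.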
“…Only one study [46] measured outcomes of EBP use and EBP future use, using a validated tool developed by Johnston et al [57]. The tool relies on student self-report but has high reliability and validity measures and has been tested in other studies of undergraduate students’ EBP [59–61]. A small but significant difference between intervention and control groups was reported for EBP use (mean difference = 0.26, p = 0.015); however, no significant difference between groups was reported for EBP future use (mean diff = 0.13, p = 0.255).…”
Section: Results (mentioning)
confidence: 99%
“…Reporting strategies of the educational interventions differed substantially. Some studies report precisely, describe the educational intervention, and present evaluation data including pre- and post-measures of student knowledge and student attitude (32,34,38,41). Others characterize the intervention briefly and present post-course evaluation data of student satisfaction (35,48–50) or report on the educational intervention without any kind of evaluation (25,28,51).…”
Section: Limitations (mentioning)
confidence: 99%
“…knowledge, measures of scientific competence based on tested instruments and randomized trials testing different standards of educational interventions (see for example 31,32). Results were quite heterogeneous: some studies showed improvement in student competencies regarding basic principles of science (see for example 29,30,33,34), some were able to identify higher competencies…”
(mentioning)
confidence: 99%