Theory and Best Practices in Science Communication Training 2019
DOI: 10.4324/9781351069366-9
Evaluating science communication training

Cited by 6 publications (10 citation statements)
References 1 publication
“…Do programs have defined learning objectives that inform the creation of measurable assessment categories? Although new methods to evaluate science communication training and engagement efforts are being created (e.g., Bogue et al, 2013; Peterman et al, 2017; Robertson Evia et al, 2017; Rodgers et al, 2020; Sevian & Gonsalves, 2008), little is known about how science communication training programs conduct evaluation (Baram-Tsabari & Lewenstein, 2017a; Barel-Ben David & Baram-Tsabari, 2020). Thus, our third research question asks:…”
Section: Literature Review
Citation type: mentioning
Confidence: 99%
“…Meaningful evaluation of these programs would also improve reflexivity. As we found in this study, past work identifies a lack of evaluation in science communication training programs overall (Barel-Ben David & Baram-Tsabari, 2019; Besley et al, 2016; Dudo et al, 2021). Co-created approaches to evaluation that move beyond simple metrics to a systemic lens on organizational impact offer promising avenues for authentic assessment of these programs (Dudo et al, 2021; Snyder-Young, 2018).…”
Section: Reflexivity
Citation type: mentioning
Confidence: 59%
“…Previous research on science communication training programs suggests they do not center inclusive approaches; although trainers often expressed strong support for broadening participation, programs often lacked clear evidence that they were being intentional in broadening participation in their staff, their participants, or the audiences they sought out (Besley et al, 2016; Canfield and Menezes, 2020; Dudo et al, 2021). Lack of meaningful evaluation (Barel-Ben David & Baram-Tsabari, 2019; Besley et al, 2016) and lack of infrastructure and interaction between programs (Besley et al, 2016; Canfield and Menezes, 2020; Smith, 2019) may also mean programs are bounded by their own norms and capacities. In contrast, science communication training programs embedded within marginalized cultural contexts appear to be more intentional in not only broadening participation but also welcoming and empowering participants (Brown et al, 2020; Márquez and Porras, 2020).…”
Citation type: mentioning
Confidence: 99%
“…Third, future studies could incorporate insights from experts’ communication practices into the design of science communication training, a “rapidly developing, but as yet under-conceptualized field” (Baram-Tsabari and Lewenstein, 2017: 286). This direction seems particularly promising, given the worldwide growth of social media and the high regard given to promoting trust in science within this field (Barel-Ben David and Baram-Tsabari, 2020).…”
Section: Discussion
Citation type: mentioning
Confidence: 99%