2019
DOI: 10.1016/j.evalprogplan.2019.02.014
Using a community-created multisite evaluation to promote evaluation use across a sector

Cited by 10 publications (7 citation statements)
References 19 publications
“…We are unaware of other work on this topic within the field of citizen science, but this claim has been substantiated within the context of other informal learning collaborations. For example, Peterman and Gathings (2019) found that the EvalFest collaborative evaluation model was effective at promoting the use of evaluation within science festivals. Similarly, a study of the Nanoscale Informal Science Education Network (called NISE Net) found that their collaborative team-based inquiry approach resulted in museum staff who valued and used evaluation more regularly (Bequette et al 2019).…”
Section: Discussion
confidence: 99%
“…Although evaluation use is one of the most researched areas in the evaluation literature, only in the last decade have researchers examined the extent of evaluation use in different contexts (e.g., Daigneault 2014;D'Ostie-Racine et al 2016;Peterman and Gathings 2019;Shaw and Campbell 2014). Our baseline research contributes to this small number, and as far as we know, is the first multi-project exploration of evaluation use in citizen science.…”
Section: Evaluation Use
confidence: 99%
“…To streamline the validation process of our survey and allow comparisons between existing survey data, we used an existing protocol for science festivals (Peterman and Gathings 2019) as the basis for a pilot evaluation of 2022's Bug Bowl. Intercept surveys of adult attendees are the primary way of evaluating science festivals (Peterman and Verbeke 2020).…”
Section: Methods
confidence: 99%
“…We developed our intercept survey instrument with questions adapted from a large multisite informal science evaluation project (Peterman and Gathings 2019), as well as some questions unique to insect festivals and Bug Bowl. The visitor survey was administered as a paper questionnaire for adult visitors.…”
Section: Methods
confidence: 99%
“…While edgy science engagement programs have been shown to attract or appeal to young adult audiences (Bisbee O'Connell et al, 2020), it is challenging to demonstrate their impact without disrupting the experience. Research methods like surveys or interviews require extracting participants from the intellectually and emotionally saturated experiences the program designers are seeking to provide (see Michalchik & Gallagher, 2010; Peterman & Gathings, 2019). Participants at the Sensory Speed Dating event, for example, are definitely hoping to strike up a conversation after the event is over but not with the program evaluator.…”
Section: Introduction
confidence: 99%