2019
DOI: 10.1017/dsi.2019.282
A Systematic Approach to Evaluating Design Prompts in Supporting Experimental Design Research

Abstract: Experiments that study engineering behavior in design often rely on participants responding to a given design prompt or a problem statement. Moreover, researchers often find themselves testing multiple variables with a relatively small participant pool. In such situations multiple design prompts may be used to boost replication by giving each participant an equivalent problem with a different experimental condition. This paper presents a systematic approach to compare given design prompts using a two-step proc…

Cited by 8 publications (5 citation statements)
References 27 publications
“…Moreover, experimental results are difficult to generalize due to the lack of homogeneity of experimental protocols. Design tasks are countless due to the pervasive nature of Design (Gero, 2011) and different authors rarely reuse the same design problems (Patel et al, 2019). Moreover, researchers have not yet defined a standard structure for design problems nor a standard indicator for their difficulty nor a standard evaluation approach (Kumar and Mocko, 2016;Sosa, 2019).…”
Section: The Resulting Framework
confidence: 99%
“…They received different prompts to ensure that participants did not generate identical solutions. Prompts were chosen from prior work by Patel et al (2019) that validated the prompts' similarity for structure, complexity, and solvability. The two design prompts were:…”
Section: Methods
confidence: 99%
“…This was done to ensure that participants presented distinct solutions to one another. The two prompts were selected from prior work that validated their similarity in terms of their structure, complexity and solvability (Patel, Elena, & Summers 2019). The design prompts given to participants were as follows: “Design an automatic clothes-ironing machine for use in hotels.…”
Section: Methods
confidence: 99%