Many decisions in engineering systems design are made by humans. These decisions significantly affect design outcomes and the resources used within design processes. While decision theory is increasingly being used from a normative standpoint to develop computational methods for engineering design, there is still a significant gap in our understanding of how humans make decisions within the design process. In particular, little is known about how an individual's domain knowledge and framing of the design problem affect their information acquisition decisions. To address this gap, the objective of this paper is to quantify the impact of a designer's domain knowledge and problem framing on their information acquisition decisions and the corresponding design outcomes. The objective is achieved by (i) developing a descriptive model of information acquisition decisions, based on an optimal one-step look-ahead sequential strategy that maximizes expected improvement, and (ii) using the model in conjunction with a controlled behavioral experiment. The domain knowledge of an individual is measured in the experiment using a concept inventory, whereas the problem framing is controlled as a treatment variable. A design optimization problem is framed in two different ways: a domain-specific track design problem and a domain-independent function optimization problem (FOP). The results indicate that when the problem is framed as a domain-specific design task, the design solutions are better and individuals acquire a better state of knowledge about the problem than in the domain-independent task. Design solutions are also better when individuals have greater domain knowledge and follow the modeled strategy closely.
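To make the modeled strategy concrete, the following is a minimal sketch of one-step look-ahead information acquisition via expected improvement maximization. The Gaussian-process surrogate, RBF kernel, candidate grid, and test function are assumptions made for illustration only, not the authors' implementation.

```python
# Illustrative sketch (not the paper's code) of a one-step look-ahead
# acquisition rule that maximizes expected improvement (EI).
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def expected_improvement(mu, sigma, y_best):
    """EI for maximization: E[max(f - y_best, 0)] under a normal posterior."""
    sigma = np.maximum(sigma, 1e-9)
    z = (mu - y_best) / sigma
    return (mu - y_best) * norm.cdf(z) + sigma * norm.pdf(z)

def next_query(x_observed, y_observed, candidates):
    """Pick the next evaluation point by maximizing one-step look-ahead EI."""
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.2), normalize_y=True)
    gp.fit(x_observed.reshape(-1, 1), y_observed)
    mu, sigma = gp.predict(candidates.reshape(-1, 1), return_std=True)
    ei = expected_improvement(mu, sigma, y_observed.max())
    return candidates[np.argmax(ei)]

# Example: sequentially acquire points on an unknown noisy 1-D function.
rng = np.random.default_rng(0)
f = lambda x: np.sin(3 * x) + 0.1 * rng.standard_normal(x.shape)
x_obs = rng.uniform(0, 2, size=3)
y_obs = f(x_obs)
grid = np.linspace(0, 2, 200)
for _ in range(5):
    x_new = next_query(x_obs, y_obs, grid)
    x_obs = np.append(x_obs, x_new)
    y_obs = np.append(y_obs, f(np.array([x_new]))[0])
```

In this sketch, each iteration plays the role of one information acquisition decision: the surrogate summarizes the designer's current state of knowledge, and EI balances exploiting the best known region against exploring uncertain ones.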
Research on decision making in engineering design has focused primarily on how to make decisions using normative models given certain information. However, there is a research gap concerning how designers combine diverse information stimuli when making decisions. In this paper, we address the following question: how do designers weigh different information stimuli to make decisions in engineering design contexts? The answer to this question can provide insights into the different cognitive models of decision making used by different individuals. We investigate the information gathering behavior of individuals using eye gaze data from a simulated engineering design task. The task involves optimizing an unknown function using an interface that provides two types of information stimuli: a graph area and a list area, corresponding to graphical and numerical stimuli, respectively. The study was carried out with student subjects. The results suggest that individuals weigh different forms of information stimuli differently. We observe that the graphical information stimulus helps participants optimize the function with higher accuracy. This study contributes to our understanding of how design engineers utilize diverse information stimuli to make decisions. The improved understanding of cognitive decision-making models can also inform the design of better decision-support tools.
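One way such stimulus weights could be estimated from gaze data is sketched below: regress task success on the dwell time spent on each interface area. The synthetic data, variable names, and the choice of logistic regression are assumptions made for this illustration and do not reproduce the paper's analysis.

```python
# Hedged illustration: estimate relative weights of graphical vs. numerical
# stimuli from gaze dwell times using a simple logistic regression.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 60
graph_dwell = rng.uniform(0, 30, n)   # seconds of gaze dwell on the graph area
list_dwell = rng.uniform(0, 30, n)    # seconds of gaze dwell on the list area
# Synthetic outcome: more graph dwell raises the chance of finding the optimum.
p_success = 1 / (1 + np.exp(-(0.15 * graph_dwell + 0.03 * list_dwell - 2.0)))
found_optimum = (rng.uniform(0, 1, n) < p_success).astype(int)

X = np.column_stack([graph_dwell, list_dwell])
weights = LogisticRegression().fit(X, found_optimum).coef_[0]
print("Estimated stimulus weights (graph, list):", weights)
```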
Research on expertise in design has focused primarily on understanding expert-novice differences. Although it is well established that experts perform better than novices, there is a lack of formal methods to quantify the potential impact of expertise on the quality of design outcomes. The research question addressed in this paper is: how can the impact of expertise on the quality of design solutions be quantified? Quantifying such impacts can be of particular importance in product development, recruitment processes, and design competitions. We utilize an approach based on Item Response Theory (IRT) and Concept Inventories (CI) for expertise quantification. We then investigate and validate the impact of expertise on solution quality through a behavioral experiment involving a track design problem. The results highlight the usefulness of the proposed approach and provide a functional relationship between expertise and solution quality. We also observe behavioral differences between participants with varying scores on a test administered as part of the behavioral experiment. The proposed approach could be used in future work to quantify learning.
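As background for IRT-based expertise quantification, the following is a minimal sketch of a two-parameter logistic (2PL) item response model applied to concept-inventory responses. The item parameters and the maximum-likelihood ability estimate shown here are illustrative assumptions; the paper's exact IRT formulation is not reproduced.

```python
# Minimal 2PL IRT sketch: each item has a discrimination (a) and difficulty (b);
# a respondent's latent ability (theta) is estimated from graded 0/1 responses.
import numpy as np
from scipy.optimize import minimize_scalar

def p_correct(theta, a, b):
    """2PL probability of a correct response given ability theta."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def estimate_ability(responses, a, b):
    """Maximum-likelihood ability estimate from a vector of 0/1 responses."""
    def neg_log_lik(theta):
        p = p_correct(theta, a, b)
        return -np.sum(responses * np.log(p) + (1 - responses) * np.log(1 - p))
    return minimize_scalar(neg_log_lik, bounds=(-4, 4), method="bounded").x

# Example: five concept-inventory items with assumed parameters.
a = np.array([1.2, 0.8, 1.5, 1.0, 0.9])   # discrimination
b = np.array([-1.0, 0.0, 0.5, 1.0, 1.5])  # difficulty
responses = np.array([1, 1, 1, 0, 0])     # one participant's graded answers
print("Estimated expertise (ability):", estimate_ability(responses, a, b))
```

The estimated ability can then serve as the expertise measure that is related, empirically, to design solution quality.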
Existing literature on information sharing in contests has established that sharing contest-specific information influences contestants' behaviors and, thereby, the outcomes of a contest. However, in the context of engineering design contests, there is a gap in knowledge about how contest-specific information, such as competitors' historical performance, influences designers' actions and the resulting design outcomes. To address this gap, the objective of this study is to quantify how information about competitors' past performance influences designers' beliefs about the outcomes of a contest, which in turn influence their design decisions and the resulting design outcomes. We focus on a single-stage design competition where an objective figure of merit is available to the contestants for assessing the performance of their design. Our approach includes (i) developing a behavioral model of sequential decision making that accounts for information about competitors' historical performance, and (ii) using the model in conjunction with a human-subject experiment where participants make design decisions given controlled strong or weak performance records of past competitors. Our results indicate that participants expend greater effort when the contest history reflects a strong performance record of past competitors than when it reflects a weak one. Moreover, we quantify the cognitive underpinnings of this informational influence via our model parameters. Based on the parametric inferences about participants' cognition, we suggest that contest designers are better off not providing historical performance records if past contest outcomes do not match the expectations they have set for a given design contest.
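A hedged sketch of how a belief about competitors could enter a sequential effort decision is given below: the contestant keeps iterating while the expected gain in prize value from one more design improvement exceeds its effort cost. The normal belief over the competitor's best score, the prize value, and the effort cost are illustrative assumptions rather than the authors' calibrated model.

```python
# Illustrative stop/continue rule for a contestant whose belief about the
# competition is shaped by the shared historical performance record.
from scipy.stats import norm

def win_probability(my_best, competitor_mean, competitor_std):
    """Probability my current best beats a normally distributed competitor score."""
    return norm.cdf(my_best, loc=competitor_mean, scale=competitor_std)

def should_continue(my_best, expected_gain_per_try, prize, effort_cost,
                    competitor_mean, competitor_std):
    """Continue if one more design iteration raises expected prize value by more
    than its effort cost."""
    p_now = win_probability(my_best, competitor_mean, competitor_std)
    p_next = win_probability(my_best + expected_gain_per_try,
                             competitor_mean, competitor_std)
    return (p_next - p_now) * prize > effort_cost

# A strong vs. weak historical record shifts the belief and hence the effort decision.
print(should_continue(my_best=0.6, expected_gain_per_try=0.05, prize=100.0,
                      effort_cost=2.0, competitor_mean=0.8, competitor_std=0.1))  # strong history
print(should_continue(my_best=0.6, expected_gain_per_try=0.05, prize=100.0,
                      effort_cost=2.0, competitor_mean=0.4, competitor_std=0.1))  # weak history
```

Under these assumed numbers, the rule continues investing effort against a strong history but stops against a weak one, which qualitatively mirrors the reported effort pattern.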