Conducting experiments is an important practice for both engineering design and scientific inquiry. Engineers iteratively conduct experiments to evaluate ideas, make informed decisions, and optimize their designs. However, both engineering design and scientific experimentation are open‐ended tasks that are not easy to assess. Recent studies have demonstrated how technology‐based assessments can help to capture and characterize these open‐ended tasks using unobtrusive data logging. This study builds upon a model to characterize students' experimentation strategies in design (ESD). Ten undergraduate students worked on a design challenge using a computer‐aided design (CAD) tool that captured all their interactions with the software. This “process data” was compared to “think‐aloud data,” which included students' explanations of their rationale for the experiments. The results suggest that the process data and the think‐aloud data each have affordances and limitations for assessing students' ESD. While the process data was effective for identifying relevant sequences of actions, it could not confirm that students carried out those actions with a specific purpose. The think‐aloud data, on the other hand, captured students' rationale for conducting experiments, but it depended on students' ability to verbalize their actions. In addition, implementing think‐aloud procedures and analyzing the resulting data are time‐consuming tasks that can only be done with one student at a time.
Computer‐aided design (CAD) programs are essential to engineering as they allow for better designs through low‐cost iterations. While CAD programs are typically taught to undergraduate students as a job skill, such software can also help students learn engineering concepts. A current limitation of CAD programs (even those specifically designed for educational purposes) is that they are not capable of providing automated, real‐time help to students. To show how such assistance could be built into CAD programs, we used data generated by students working in a free, open‐source CAD program called Aladdin to demonstrate how student data combined with machine learning techniques can predict how well a particular student will perform in a design task. We challenged students to design a house that consumed zero net energy as part of an introductory engineering technology undergraduate course. Using data from 128 students, along with the scikit‐learn Python machine learning library, we tested our models using both total counts of design actions and sequences of design actions as inputs. We found that models using early design sequence actions are particularly valuable for prediction. Our logistic regression model predicted with greater than 60% accuracy whether a student would succeed in designing a zero net energy house. Our results suggest that it would be feasible for Aladdin to provide useful feedback to students when they are approximately halfway through their design. Further improvements to these models could lead to earlier predictions and thus provide students feedback sooner to enhance their learning.
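The modeling approach described above can be illustrated with a minimal sketch. This is not the study's actual pipeline: the feature names, synthetic data, and success criterion below are invented for illustration, assuming only what the abstract states (128 students, counts of design actions as inputs, scikit-learn logistic regression, a binary success outcome).

```python
# Hypothetical sketch of predicting design success from counts of CAD actions,
# in the spirit of the study's scikit-learn approach. All data here is synthetic;
# the study used real interaction logs from the Aladdin CAD program.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_students = 128  # matches the study's sample size; rows here are simulated

# Each row: counts of four invented design-action types
# (e.g., add window, move wall, change insulation, run energy analysis).
X = rng.poisson(lam=[5.0, 8.0, 3.0, 10.0], size=(n_students, 4)).astype(float)

# Synthetic label: students who ran more energy analyses tend to succeed.
y = (X[:, 3] + rng.normal(0.0, 2.0, n_students) > 10.0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

model = LogisticRegression().fit(X_train, y_train)
accuracy = accuracy_score(y_test, model.predict(X_test))
print(f"held-out accuracy: {accuracy:.2f}")
```

Using sequences of actions instead of total counts, as the abstract also describes, would only change the feature construction step; the classifier interface stays the same.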