Design is an important part of engineering education but cannot easily be assessed through current multiple‐choice tests. In this report, we explore the possibility of including design problems in assessments of engineering students. We report a study comparing constructed‐response design problems with constructed‐response versions of typical engineering assessments, labeled “analysis” problems. Through analyses of verbal protocols, we argue that design problems can elicit qualitatively different behaviors than analysis problems and therefore may assess different underlying constructs. While design problems may be a worthwhile addition to assessments of engineering students, they present several challenges, including automated scoring, the amount of information about candidate skill contained in responses, and the potential for construct‐irrelevant variance. These challenges are surmountable but require careful construction of design assessments to avoid potential pitfalls. This work represents one part of a research program aimed at informing the redesign of the GRE Engineering Subject Test.