In a digital era where terabytes of structured and unstructured records are created and stored every minute, the importance of collecting small amounts of high-quality data is often undervalued. Yet this activity plays a critical role in industrial and laboratory settings when addressing problems ranging from process modeling and analysis to optimization and robust design. Implementing a screening design is usually the first step of a systematic statistical Design of Experiments program; its aim is to identify the influential factors and establish a first description of the process under study. New developments have recently emerged in this field, notably a new class of designs, definitive screening designs, that combine screening efficiency with the ability to estimate quadratic effects. Several alternatives are therefore currently available for conducting screening studies, but information on the pros and cons of each methodology remains scarce. This study was designed to gather useful information from the user's perspective by considering each design independently, under rather standard settings. For each design, the ability to recover the correct model structure was evaluated through Monte Carlo simulations on several simulated process structures containing elements likely to be found in practice, all obeying the general principles of effect sparsity, hierarchy, and heredity. Because some screening designs can estimate quadratic terms, these effects were also included in the simulated models, and three-level designs were added to the comparison even though they are not screening designs.
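To make the evaluation procedure concrete, the Python sketch below (not taken from the study) illustrates the kind of Monte Carlo exercise described above: responses are simulated from a sparse, hierarchical "true" model on a small two-level factorial, the design is analyzed by least squares, and the fraction of replicates in which exactly the true active effects are declared significant is recorded. The design (a full 2^4 factorial), the factor names, the effect sizes, the noise level, and the |t| cutoff are all illustrative assumptions, not settings reported in the study.

```python
# Illustrative sketch (assumed settings): Monte Carlo check of how often a
# small two-level factorial recovers a sparse, hierarchical true model.
import itertools
import numpy as np

rng = np.random.default_rng(0)

# Full 2^4 factorial in coded (-1, +1) units: 16 runs, 4 factors.
runs = np.array(list(itertools.product([-1, 1], repeat=4)), dtype=float)

# Model matrix: intercept, 4 main effects, 6 two-factor interactions.
pairs = list(itertools.combinations(range(4), 2))
X = np.column_stack(
    [np.ones(len(runs))]
    + [runs[:, j] for j in range(4)]
    + [runs[:, i] * runs[:, j] for i, j in pairs]
)
labels = (["Intercept"] + [f"x{j+1}" for j in range(4)]
          + [f"x{i+1}:x{j+1}" for i, j in pairs])

# Assumed "true" process: sparse (2 of 4 main effects active) and obeying
# strong heredity (the active interaction involves two active main effects).
true_active = {"x1", "x2", "x1:x2"}

def simulate_response():
    return (2.0 * runs[:, 0] + 1.5 * runs[:, 1]
            + 1.0 * runs[:, 0] * runs[:, 1]
            + rng.normal(scale=1.0, size=len(runs)))

T_CRIT = 2.571  # two-sided t critical value, alpha = 0.05, 5 residual df
n_sim, exact_recoveries = 2000, 0
for _ in range(n_sim):
    y = simulate_response()
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / (len(y) - X.shape[1])      # residual variance
    se = np.sqrt(sigma2 * np.diag(np.linalg.inv(X.T @ X)))
    t = beta / se
    declared = {labels[k] for k in range(1, len(labels)) if abs(t[k]) > T_CRIT}
    exact_recoveries += (declared == true_active)

print(f"Exact model recovery rate: {exact_recoveries / n_sim:.3f}")
```

The same loop can be wrapped around any candidate design matrix (e.g., a definitive screening design with quadratic columns added), which is essentially how such designs can be compared on their ability to recover the correct model structure.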