The computer‐based Graduate Record Examinations® (GRE®) revised General Test includes interactive item types and testing environment tools (e.g., test navigation, an on‐screen calculator, and help). How well do test takers understand these innovations? If test takers do not understand the new item types, these innovations may introduce construct‐irrelevant variance, with test takers performing differently than they would on more familiar item types. Similarly, the navigational and other test environment tools are a potential source of variance if some test takers understand how to use them and others do not.
In this study, we examined the reactions of, engagement of, and difficulties encountered by 20 potential test takers as they completed Verbal and Quantitative Reasoning sections of a GRE practice test. Participants were sophomores and juniors from colleges and universities in the local area. Their reactions were captured through cognitive laboratory sessions that combined think‐aloud interviews with researcher observations as test takers worked quietly. Analysis of these data revealed that some participants needed time to figure out what was being asked of them when they encountered the new item types, although most were eventually able to answer each item. In contrast, most participants stated that they did not even notice the test environment tools, and few were observed actually using them. Several participants offered suggestions for improving the usability of the new item types and test environment tools.