Despite the acknowledgment that learning is a necessary part of all gameplay, the area of Games User Research lacks an established, evidence-based method through which designers and researchers can understand, assess, and improve how commercial games teach players game-specific skills and information. In this paper, we propose a mixed-methods procedure that draws together both quantitative and experiential approaches to examine the extent to which players are supported in learning about the game world and its mechanics. We demonstrate the method through a case study of the game Portal involving 14 participants of varying gaming expertise. By comparing optimal puzzle solutions against observed player performance, we illustrate how the method can reveal specific problems with how learning is structured within a game. We argue that the method can highlight where major breakdowns occur and yield design insights that improve the player experience with puzzle games.