Contemporary digital game developers offer a variety of games to suit the diverse tastes of their customers. Although the gaming experience often depends on personal preference, the same may not apply to the level of immersion. It has been debated whether the player perspective can influence the player's involvement with the game. The aim of this study was to investigate whether interacting with a game in the first-person perspective is more immersive than playing in the third-person point of view (POV). To test this, participants played a role-playing game in either of the two modes, named their preferred perspective, and subjectively evaluated their immersive experience. The results showed that players were more immersed in the gameplay when viewing the game world through the eyes of the character, regardless of their preferred perspective.
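The abstract above implies a straightforward between-groups comparison of immersion ratings. As a hedged illustration only, and not the authors' actual analysis, the sketch below compares self-reported immersion scores for a first-person and a third-person group with Welch's t-test; the data file and column names are hypothetical.

```python
# Hypothetical illustration: compare self-reported immersion between a
# first-person and a third-person group with Welch's t-test.
# The file name and the "perspective"/"immersion" columns are assumptions.
import pandas as pd
from scipy import stats

ratings = pd.read_csv("immersion_ratings.csv")  # hypothetical data file

first_person = ratings.loc[ratings["perspective"] == "first", "immersion"]
third_person = ratings.loc[ratings["perspective"] == "third", "immersion"]

# Welch's variant does not assume equal variances across the two groups.
t_stat, p_value = stats.ttest_ind(first_person, third_person, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```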
Player experience is an important field of digital games research concerned with understanding how games influence players. A common way to directly measure players' reported experiences is through questionnaires. However, the large number of questionnaires currently in use introduces several challenges, both in selecting suitable measures and in comparing results across studies. In this paper, we review some of the most widely known and used questionnaires and focus on the immersive experience questionnaire (IEQ), the game engagement questionnaire (GEQ), and the player experience of need satisfaction (PENS), with the aim of positioning them in relation to one another. This was done through an online survey, in which we gathered 270 responses from players about their most recent experience of a digital game. Our findings show considerable convergence between these three questionnaires and suggest that there is room to refine them into a more widely applicable measure of general game engagement.
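As a hedged sketch of what convergence between questionnaires can mean in practice, and not the paper's own analysis, the snippet below correlates per-player summary scores for the three measures; the survey file and item-column prefixes are assumptions.

```python
# Hypothetical illustration: correlate per-player summary scores on the three
# questionnaires as one simple way of examining convergence.
# The file name and the "ieq_"/"geq_"/"pens_" column prefixes are assumptions.
import pandas as pd

responses = pd.read_csv("survey_responses.csv")  # hypothetical: one row per player

scores = pd.DataFrame({
    "IEQ": responses.filter(like="ieq_").mean(axis=1),
    "GEQ": responses.filter(like="geq_").mean(axis=1),
    "PENS": responses.filter(like="pens_").mean(axis=1),
})

# Strong positive correlations would suggest the measures converge.
print(scores.corr(method="pearson"))
```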
Challenge is a key element of digital games, but a clear conceptualisation and operationalisation of this player experience were long missing. This made it hard for game developers to design for a well-balanced experience across different skill sets, and impeded the synthesis of challenge-related games research. To overcome this, we introduce a systematic, extensive, and reliable instrument to evaluate the level of players' perceived challenge in digital games. We conceptualise challenge based on a survey of related literature in games user research, design and AI, as well as interviews with researchers and players. Exploratory factor analysis (N=394) highlights four components of experienced challenge: performative, emotional, cognitive and decision-making challenge. Refining the items allowed us to devise the Challenge Originating from Recent Gameplay Interaction Scale (CORGIS), which was further validated in a study with nearly 1,000 players.
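For readers unfamiliar with the method named in the abstract, the following is a minimal exploratory factor analysis sketch in Python using the factor_analyzer package; it is not the authors' code, and the item data, file name and rotation choice are assumptions.

```python
# Hypothetical illustration: a four-factor exploratory factor analysis of
# challenge items with an oblique rotation, using the factor_analyzer package.
# The data file and the rotation choice are assumptions, not the authors' setup.
import pandas as pd
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import calculate_kmo

items = pd.read_csv("challenge_items.csv")  # hypothetical: one column per item

# Sampling adequacy check before factoring.
_, kmo_overall = calculate_kmo(items)
print(f"KMO = {kmo_overall:.2f}")

# Extract four factors (cf. performative, emotional, cognitive, decision-making)
# with an oblique rotation, since challenge components are likely correlated.
fa = FactorAnalyzer(n_factors=4, rotation="oblimin")
fa.fit(items)
loadings = pd.DataFrame(fa.loadings_, index=items.columns)
print(loadings.round(2))
```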
Uncertainty has previously been identified as an important ingredient of engaging games. Game design can create different levels of uncertainty in players, which they can recognise and describe as being either attributable to external forces, such as chance or hidden information, or internal to their own understanding of what to do in relation to their goals. While it appears that uncertainty can contribute to both positive and negative play experiences, there has been little work on operationalising and measuring this concept as a component of player experience. This paper reports an analysis of data from over 700 players using modern bi-factor analysis techniques, resulting in a five-factor psychometric scale that captures players' broad feelings about uncertainty in games. Three of these specific factors appear to point towards a single generic factor of uncertainty that is internal to the players, one captures experiences of external uncertainty, and the final factor relates to players' experience of exploring the game to resolve uncertainty. To further validate the scale, we conducted an experiment with a commercial puzzle game, manipulating the duration of play with predicted outcomes on the different specific factors of the scale. Overall, the scale shows promise, with good statistical reliability and construct validity of the separate factors, and so will be a useful tool for further investigating player experiences in digital games.
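As a hedged illustration of a basic reliability check of the kind reported, and not the bi-factor modelling the paper itself used, the sketch below computes Cronbach's alpha per specific factor with pingouin; the item names and the item-to-factor mapping are invented for demonstration.

```python
# Hypothetical illustration: internal-consistency (Cronbach's alpha) check for
# each specific factor, using pingouin. The data file, item names and
# item-to-factor mapping are all invented for demonstration.
import pandas as pd
import pingouin as pg

items = pd.read_csv("uncertainty_items.csv")  # hypothetical: one column per item

# Invented mapping of items onto the five specific factors described above.
factors = {
    "internal_1": ["q1", "q2", "q3"],
    "internal_2": ["q4", "q5", "q6"],
    "internal_3": ["q7", "q8", "q9"],
    "external": ["q10", "q11", "q12"],
    "exploration": ["q13", "q14", "q15"],
}

for name, cols in factors.items():
    alpha, ci = pg.cronbach_alpha(data=items[cols])
    print(f"{name}: alpha = {alpha:.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f})")
```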