We introduce a location-based game called Feeding Yoshi that provides an example of seamful design, in which key characteristics of its underlying technologies, namely the coverage and security characteristics of WiFi, are exposed as a core element of gameplay. Feeding Yoshi is also a long-term, wide-area game, played over a week across three different cities in an initial user study. The study, drawing on participant diaries and interviews supported by observation and analysis of system logs, reveals players' reactions to the game. We see the different ways in which they embedded play into the patterns of their daily lives, augmenting existing practices and creating new ones, and we observe the impact of varying location on both the ease and feel of play. We identify potential design extensions to Feeding Yoshi and conclude that seamful design provides a route to creating engaging experiences that are well adapted to their underlying technologies.
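As a purely illustrative sketch of seamful use of access-point security status (the data structures, field names, and the mapping to game resources are assumptions, not Feeding Yoshi's actual implementation), a client could split scanned access points into open and secured classes and treat each class as a different kind of in-game resource:

```python
# Sketch: turning Wi-Fi scan results into seamful game resources.
# The AccessPoint structure and the open/secured split are illustrative
# assumptions, not Feeding Yoshi's actual code.
from dataclasses import dataclass

@dataclass
class AccessPoint:
    ssid: str
    secured: bool        # True if the access point requires a key
    signal_dbm: int      # received signal strength

def game_resources(scan: list[AccessPoint]) -> dict[str, list[str]]:
    """Split visible access points into two classes of game resource."""
    resources = {"secured": [], "open": []}
    for ap in scan:
        resources["secured" if ap.secured else "open"].append(ap.ssid)
    return resources

# Example: a player walks past two locked home routers and an open hotspot.
scan = [
    AccessPoint("home-net", True, -60),
    AccessPoint("cafe-free", False, -70),
    AccessPoint("office-ap", True, -82),
]
print(game_resources(scan))
# {'secured': ['home-net', 'office-ap'], 'open': ['cafe-free']}
```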
Mobile and wearable computers present input and output problems due to limited screen space and interaction techniques. When mobile, users typically focus their visual attention on navigating their environment, making visually demanding interface designs hard to operate. This paper presents two multimodal interaction techniques designed to overcome these problems and allow truly mobile, 'eyes-free' device use. The first is a 3D audio radial pie menu that uses head gestures for selecting items. An evaluation of a range of different audio designs showed that egocentric sounds reduced task completion time and perceived annoyance, and allowed users to walk closer to their preferred walking speed. The second is a sonically enhanced 2D gesture recognition system for use on a belt-mounted PDA. An evaluation of the system with and without audio feedback showed that users' gestures were more accurate when dynamically guided by audio feedback. These novel interaction techniques demonstrate effective alternatives to visual-centric interface designs on mobile devices.
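As a minimal sketch of the head-gesture selection idea, assuming an eight-item menu and a yaw angle in degrees (neither specified by the abstract), head orientation could be quantised into pie-menu sectors like this:

```python
def select_sector(head_yaw_deg: float, n_items: int = 8) -> int:
    """Map a head yaw angle (degrees; 0 = straight ahead, positive =
    clockwise) onto one of n_items pie-menu sectors, with sector 0
    centred straight ahead."""
    sector_width = 360.0 / n_items
    angle = (head_yaw_deg + sector_width / 2) % 360.0
    return int(angle // sector_width)

# A nod of roughly 40 degrees to the right selects the next item clockwise;
# the mirror-image gesture selects the item one step anticlockwise.
print(select_sector(40.0))   # -> 1
print(select_sector(-40.0))  # -> 7
```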
Abstract. This paper presents Treasure, an outdoor mobile multiplayer game inspired by Weiser's notion of seams: gaps and breaks in different media. Playing Treasure involves movement in and out of a Wi-Fi network, using PDAs to pick up virtual 'coins' that may be scattered outside network coverage. Coins have to be uploaded to a server to gain game points, and players can collaborate with teammates to double the points given for an upload. Players can also steal coins from opponents. As they move around, players' PDAs sample network signal strength and update coverage maps. Reporting on a study of players taking part in multiple games, we discuss how their tactics and strategies developed as their experience grew with successive games. We suggest that meaningful play arises in just this way, and that repeated play is vital when evaluating such games.
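A hedged sketch of the coverage-mapping idea follows; the grid resolution, coordinate handling, and sample source are assumptions rather than Treasure's actual implementation. Signal-strength samples are binned into coarse cells so that weak or absent readings mark the seams at the edge of coverage:

```python
from collections import defaultdict

CELL_METRES = 10.0  # assumed grid resolution

def cell(x_m: float, y_m: float) -> tuple[int, int]:
    """Quantise a local position (in metres) to a coverage-map cell."""
    return (int(x_m // CELL_METRES), int(y_m // CELL_METRES))

coverage: dict[tuple[int, int], list[int]] = defaultdict(list)

def record_sample(x_m: float, y_m: float, rssi_dbm: int) -> None:
    """Store one signal-strength reading taken at the given position."""
    coverage[cell(x_m, y_m)].append(rssi_dbm)

def mean_rssi(c: tuple[int, int]):
    """Average reading for a cell, or None if the cell was never sampled."""
    samples = coverage.get(c)
    return sum(samples) / len(samples) if samples else None

# A short walk away from an access point: readings fade, then stop entirely.
for x, rssi in [(2, -55), (8, -60), (14, -72), (23, -88)]:
    record_sample(x, 0.0, rssi)
print(mean_rssi((0, 0)), mean_rssi((1, 0)), mean_rssi((5, 0)))
# -57.5 -72.0 None
```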
Abstract. Sharing events with others is an important part of many enjoyable experiences. While most existing co-presence systems focus on work tasks, in this paper we describe a lightweight mobile system designed for sharing leisure. This system allows city visitors to share their experiences with others both far and near, through tablet computers that share photographs, voice and location. A collaborative filtering algorithm uses historical data from previous visits to recommend photos, web pages and places to visitors, bringing together online media and the city's streets. In an extensive user trial we explored how these resources were used to collaborate around physical places. The trial demonstrates the value of technological support for sociability, that is, enjoyable shared social interaction. Lastly, the paper discusses support for collaborative photography, and the role history can play in integrating online media with physical places.
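The abstract does not specify the recommender, so the following is only a sketch of one plausible collaborative-filtering approach: ranking unseen places by how often they co-occur in earlier visitors' histories with places the current visitor has already seen. All data and names are invented for illustration:

```python
from collections import Counter
from itertools import combinations

# Invented visit histories: one set of visited places per earlier visitor.
past_visits = [
    {"castle", "museum", "cafe"},
    {"castle", "gallery", "cafe"},
    {"museum", "gallery", "park"},
]

# Count how often each ordered pair of places was visited together.
co_visits = Counter()
for visit in past_visits:
    for a, b in combinations(sorted(visit), 2):
        co_visits[(a, b)] += 1
        co_visits[(b, a)] += 1

def recommend(seen, k=3):
    """Rank unseen places by co-visitation with places already seen."""
    scores = Counter()
    for place in seen:
        for (a, b), n in co_visits.items():
            if a == place and b not in seen:
                scores[b] += n
    return [place for place, _ in scores.most_common(k)]

print(recommend({"castle"}))  # e.g. ['cafe', 'museum', 'gallery']
```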