Flexible displays potentially allow for interaction styles that resemble those used in paper documents. Bending the display, e.g., to page forward, shows particular promise as an interaction technique. In this paper, we present an evaluation of the effectiveness of various bend gestures in executing a set of tasks with a flexible display. We discuss a study in which users designed bend gestures for common computing actions deployed on a smartphone-inspired flexible E Ink prototype called PaperPhone. We collected a total of 87 bend gesture pairs from ten participants, along with ratings of their appropriateness for twenty actions across five applications. We identified the six most frequently used bend gesture pairs out of 24 unique pairs. Results show that users preferred bend gestures and bend gesture pairs that were conceptually simpler, e.g., along one axis, and less physically demanding. There was strong agreement among participants to use the same three pairs across applications: (1) side of display, up/down; (2) top corner, up/down; (3) bottom corner, up/down. For actions with a strong directional cue, we found strong consensus on the polarity of the bend gestures (e.g., navigating left was performed with an upward bend gesture, and navigating right with a downward one). This implies that bend gestures that take directional cues into account are likely more natural to users.
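To make the gesture-to-action mapping concrete, here is a minimal sketch of how the three agreed-upon bend gesture pairs and the observed directional polarity could be encoded. The location and polarity labels, action names, and dispatch logic are illustrative assumptions, not PaperPhone's actual implementation.

```python
# Hypothetical encoding of the study's findings; names are assumptions.

# The three bend gesture pairs participants agreed on, keyed by bend location.
BEND_GESTURE_PAIRS = {
    "side_of_display": ("up", "down"),
    "top_corner": ("up", "down"),
    "bottom_corner": ("up", "down"),
}

# Directional actions follow the observed polarity consensus:
# navigate left <- upward bend, navigate right <- downward bend.
DIRECTIONAL_ACTIONS = {
    ("side_of_display", "up"): "navigate_left",
    ("side_of_display", "down"): "navigate_right",
}

def dispatch(location, polarity):
    """Return the action bound to a detected bend gesture, if any."""
    return DIRECTIONAL_ACTIONS.get((location, polarity))

if __name__ == "__main__":
    print(dispatch("side_of_display", "up"))    # navigate_left
    print(dispatch("side_of_display", "down"))  # navigate_right
```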
With recent advances in flexible displays, computer displays are no longer restricted to flat, rigid form factors. In this paper, we propose that the physical form of a flexible display, depending on the way it is held or worn, can help shape its current functionality. We propose Snaplet, a wearable flexible E Ink display augmented with sensors that detect the shape of the display. Snaplet is a paper computer in the form of a bracelet. When in a convex shape on the wrist, Snaplet functions as a watch and media player. When held flat in the hand, it is a PDA with notepad functionality. When held in a concave shape, Snaplet functions as a phone. Calls are dropped by returning the display to a flat or convex shape.
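A minimal sketch of the shape-to-function mapping described above, assuming a single signed curvature reading from the bend sensors; the thresholds, mode names, and call-handling helper are illustrative assumptions rather than details of the Snaplet prototype.

```python
# Assumed shape classification from one signed curvature value.
def classify_shape(curvature: float) -> str:
    if curvature > 0.2:
        return "convex"    # worn on the wrist
    if curvature < -0.2:
        return "concave"   # held up as a phone
    return "flat"          # held flat in the hand

# Each detected shape selects a different function of the device.
SHAPE_TO_MODE = {
    "convex": "watch_and_media_player",
    "flat": "pda_notepad",
    "concave": "phone",
}

def update_mode(curvature: float, in_call: bool):
    """Return the active mode and whether an ongoing call is kept."""
    shape = classify_shape(curvature)
    # Returning to a flat or convex shape drops an ongoing call.
    keep_call = in_call and shape == "concave"
    return SHAPE_TO_MODE[shape], keep_call

if __name__ == "__main__":
    print(update_mode(0.5, in_call=True))   # ('watch_and_media_player', False)
    print(update_mode(-0.5, in_call=True))  # ('phone', True)
```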
Active Learning Environments with Robotic Tangibles (ALERT) are mixed reality video gaming systems that use sensors, vision systems, and robots to provide an engaging experience that may motivate hitherto underrepresented kinds of learners to become interested in game design, programming, and careers in science, technology, engineering, and mathematics. Through the use of fiducials (i.e., meaningful markers) that robots recognize through computer vision as just-in-time instructions, users engage in spatially based programming without the encumbrances of traditional procedural programs' syntax and structure. Since humans, robots, and video environments share many inherently spatial qualities, this natural style of physical programming is particularly well suited to fostering playful interactions with mobile robots in dynamic video environments. As these systems broaden the capabilities of video game technology and human-robot interaction (HRI), they lower many existing barriers to integrated video-robot game development and programming. Diverse ALERT video game scenarios and applications enable a broad range of gamers, learners, and developers to generate and engage in their own physically interactive games.
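The sketch below illustrates the general idea of fiducials acting as just-in-time instructions: a marker recognized by the vision system is looked up and dispatched as a robot command. The marker IDs, command names, and Robot stub are hypothetical and are not ALERT's actual vocabulary or API.

```python
# Illustrative fiducial-to-command dispatch; all names are assumptions.

class Robot:
    """Minimal stand-in for a mobile robot controller."""
    def execute(self, command: str) -> None:
        print(f"executing {command}")

# Placing markers in the environment sequences these commands spatially.
FIDUCIAL_COMMANDS = {
    17: "turn_left",
    23: "turn_right",
    42: "play_sound",
}

def on_fiducial_detected(robot: Robot, marker_id: int) -> None:
    """Run the command associated with a recognized fiducial, if any."""
    command = FIDUCIAL_COMMANDS.get(marker_id)
    if command is not None:
        robot.execute(command)

if __name__ == "__main__":
    on_fiducial_detected(Robot(), 17)  # executing turn_left
```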