Personal informatics applications have been gaining momentum with the introduction of implicit data collection and alert mechanisms on smartphones. A need for customized design of these applications is emerging, and studies on tailoring UI design to users' personality traits are well established. This poster investigates how various affordances in gamified personal informatics applications affect the motivation to track and achieve goals for users with different personality types. We conducted a study to examine how user personality traits relate to (1) motivational affordances in behavior tracking applications and (2) the specific behaviors users prefer to track.
To support mobile, eyes-free web browsing, users can listen to "playlists" of web content: aural flows. Interacting with aural flows, however, requires users to select interface buttons, tethering visual attention to the mobile device even when it is unsafe (e.g., while walking). This research extends interaction with aural flows through simulated voice commands as a way to reduce visual interaction. This paper presents the findings of a study with 20 participants who browsed aural flows either through a visual interface only or by augmenting it with voice commands. Results suggest that using voice commands reduced the time spent looking at the device by half but yielded similar system usability and cognitive effort ratings as using buttons. Overall, the low cognitive effort engendered by aural flows, regardless of the interaction modality, allowed participants to do more non-instructed activities (e.g., looking at the surrounding environment) than instructed activities (e.g., focusing on the user interface).

Keywords: user studies, sound-based input/output, mobile computing, multimodal interfaces, information architecture

Research Highlights
• We explore a vocabulary of simulated voice commands to control aural flows.
• We empirically compare two modalities for controlling aural flows: buttons only vs. voice + buttons.
• Voice command users spent 50% less time looking at the device than button-only users.
• Walking speed, system usability, and cognitive effort are similar in both conditions.
• In the voice + button condition, participants use significantly more voice commands than buttons.
• Across conditions, aural flows engender more non-instructed than instructed activity.