This paper presents a novel mathematical model for visual search and selection time in linear menus. Assuming two visual search strategies, serial and directed, and a pointing sub-task, it captures the change of performance with five factors: 1) menu length, 2) menu organization, 3) target position, 4) absence/presence of target, and 5) practice. The novel aspect is that the model is expressed as a probability density distribution of gaze, which allows the total selection time to be derived. We present new data that replicate and extend the Nielsen menu selection paradigm, using eye-tracking and mouse tracking to confirm model predictions. The same parametrization yielded a high fit to both menu selection time and gaze distributions. The model has the potential to improve menu designs by helping designers identify more effective solutions without conducting empirical studies.
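To make the structure described above concrete, the following is a minimal illustrative sketch of how a total selection time could be derived from a gaze probability distribution combined with a Fitts'-law pointing sub-task. All parameter values, the function name `expected_selection_time`, and the serial-scan cost model are assumptions for illustration only, not the authors' fitted model.

```python
# Toy calculation of expected menu selection time from a gaze (fixation)
# probability distribution plus a Fitts'-law pointing term. The structure
# (serial search + pointing sub-task) follows the abstract; every constant
# below is a hypothetical placeholder, not a value from the paper.

import math

def expected_selection_time(gaze_probs, t_fixation=0.25, a=0.2, b=0.15,
                            item_height=20.0, target_width=20.0):
    """Expected selection time (s) for a linear menu.

    gaze_probs[i] : probability that the target is found on the i-th fixation
                    (a stand-in for the paper's gaze density over menu items)
    t_fixation    : assumed time per fixation during serial search
    a, b          : assumed Fitts'-law intercept and slope for pointing
    """
    total = 0.0
    for i, p in enumerate(gaze_probs):
        search_time = (i + 1) * t_fixation                   # serial scan cost
        distance = (i + 1) * item_height                     # cursor travel to item i
        pointing_time = a + b * math.log2(distance / target_width + 1)  # Fitts' law
        total += p * (search_time + pointing_time)
    return total

# Example: an 8-item menu with a uniform chance of the target at any position.
print(expected_selection_time([1 / 8] * 8))
```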
In this paper we present our findings from a lab and a field study investigating how passers-by notice the interactivity of public displays. We designed an interactive installation that uses visual feedback to the incidental movements of passers-by to communicate its interactivity. The lab study reveals: (1) Mirrored user silhouettes and images are more effective than avatar-like representations. (2) It takes time to notice the interactivity (approximately 1.2 s). In the field study, three displays were installed for three weeks in shop windows, and data on 502 interaction sessions were collected. Our observations show: (1) Significantly more passers-by interact when the mirrored user image (+90%) or silhouette (+47%) is shown immediately, compared to a traditional attract sequence with call-to-action. (2) Passers-by often notice interactivity late and have to walk back to interact (the landing effect). (3) If somebody is already interacting, others begin interacting behind them, forming multiple rows (the honeypot effect). Our findings can be used to design public display applications and shop windows that more effectively communicate interactivity to passers-by.
Menus are used for exploring and selecting commands in interactive applications. They are widespread in current systems and used by a large variety of users. As a consequence, they have motivated many studies in Human-Computer Interaction (HCI). Given the large variety of menus, it is difficult to have a clear understanding of the design possibilities and to ascertain their similarities and differences. In this article, we address a main challenge of menu design: the need to characterize the design space of menus. To do this, we propose a taxonomy of menu properties that structures existing work on visual menus. In order to highlight the impact of the properties on performance, we begin by refining performance through a list of quality criteria and by reviewing existing analytical and empirical methods for quality evaluation. The taxonomy of menu properties is an essential step toward the elaboration of advanced predictive models of menu performance and the optimization of menus. A key point of this work is to focus both on menus and on the properties of menus, to enable a fine-grained analysis in terms of performance.