Smartphone apps today request permission to access a multitude of sensitive resources, which users must accept in full during installation (e.g., on Android) or configure selectively after installation (e.g., on iOS, and as planned for Android). Everyday users, however, lack the ability to make informed decisions about which permissions are essential for their usage. To enhance privacy, we seek to leverage crowdsourcing to find minimal sets of permissions that preserve the usability of the app for diverse users. We advocate an efficient 'lattice-based' crowd-management strategy to explore the space of permission sets. We conducted a user study (N = 26) in which participants explored different permission sets for the popular Instagram app. The study validates our efficient crowd-management strategy and shows that usability scores for diverse users can be predicted accurately, enabling suitable recommendations.
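The abstract does not spell out the lattice-based exploration, but one plausible reading is a top-down walk of the powerset lattice of permissions that prunes subsets of configurations already found unusable. The sketch below is a minimal illustration under that assumption; the permission names and the monotone usability oracle are hypothetical, not taken from the study.

```python
from itertools import combinations

# Hypothetical permission set; the abstract does not list the
# actual Instagram permissions that were tested.
PERMISSIONS = ["CAMERA", "LOCATION", "CONTACTS", "STORAGE"]

def subsets(perms):
    """Enumerate all subsets, largest first: the levels of the
    powerset lattice, walked top-down."""
    for k in range(len(perms), -1, -1):
        for combo in combinations(perms, k):
            yield frozenset(combo)

def explore(is_usable):
    """Walk the lattice, pruning beneath unusable nodes: if a set of
    granted permissions already breaks the app, no subset of it needs
    crowd testing (assuming usability is monotone in permissions)."""
    pruned = set()   # known-unusable permission sets
    usable = []
    for s in subsets(PERMISSIONS):
        if any(s <= bad for bad in pruned):
            continue  # dominated by a known-unusable superset
        if is_usable(s):
            usable.append(s)
        else:
            pruned.add(s)
    # Minimal usable sets: those with no usable proper subset.
    return [s for s in usable if not any(t < s for t in usable)]

# Toy oracle standing in for crowd feedback: the app is "usable"
# iff CAMERA and STORAGE are both granted.
minimal = explore(lambda s: {"CAMERA", "STORAGE"} <= s)
```

With the toy oracle, the walk tests only a fraction of the 16 subsets and recovers the single minimal usable set {CAMERA, STORAGE}; the pruning is what would keep crowd-testing cost manageable as the number of permissions grows.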
Abstract: Millions of apps available to smartphone owners request various permissions to resources on the device, including sensitive data such as location and contact information. Disabling permissions for sensitive resources could improve privacy but can also impact the usability of apps in ways users may not be able to predict. We study an efficient approach that ascertains the impact of disabling permissions on the usability of apps through large-scale, crowdsourced user testing, with the ultimate goal of recommending to users which permissions can be disabled for improved privacy without sacrificing usability. We replicate and significantly extend previous analysis that showed the promise of a crowdsourcing approach in which crowd workers test and report back on various configurations of an app. Through a large, between-subjects user experiment, our work provides insight into the impact of removing permissions within and across different apps (our participants tested three apps: Facebook Messenger (N = 218), Instagram (N = 227), and Twitter (N = 110)). We find that it is possible to increase user privacy by disabling app permissions while also maintaining app usability.
We are surrounded by digital images of personal lives posted online. Changes in information and communications technology have enabled widespread sharing of personal photos, increasing access to aspects of private life that were previously less observable. Most studies of online privacy explore differences in individual privacy preferences. Here we examine privacy perceptions of online photos, considering both social norms (collectively shared expectations of privacy) and individual preferences. We conducted an online factorial vignette study on Amazon's Mechanical Turk (n = 279). Our findings show that people share common expectations about the privacy of online images, and that these privacy norms are socially contingent and multidimensional. The use of digital technologies to share personal photos is influenced by social context as well as individual preferences, while such sharing can in turn affect the social meaning of privacy.