Lecture Notes in Computer Science
DOI: 10.1007/978-3-540-72037-9_10

Virtual Walls: Protecting Digital Privacy in Pervasive Environments

Abstract: As pervasive environments become more commonplace, the privacy of users is placed at increased risk. The numerous and diverse sensors in these environments can record users' contextual information, leading to users unwittingly leaving "digital footprints." Users must thus be allowed to control how their digital footprints are reported to third parties. While a significant amount of prior work has focused on location privacy, location is only one type of footprint, and we expect most users to be incap…
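The abstract's core idea is that users control how their digital footprints are disclosed through a "virtual wall" whose transparency determines who may observe the context behind it. A minimal sketch of that policy check, with hypothetical names (`Transparency`, `may_disclose`) that are illustrative rather than the paper's actual API:

```python
# Hedged sketch of the virtual-walls metaphor: a wall's transparency level
# decides whether a contextual footprint behind it is disclosed to a requester.
# All names here are illustrative assumptions, not the paper's real interface.
from enum import Enum


class Transparency(Enum):
    TRANSPARENT = "transparent"  # anyone may observe the footprint
    TRANSLUCENT = "translucent"  # only requesters present in the same place
    OPAQUE = "opaque"            # no external requester may observe it


def may_disclose(level: Transparency, requester_co_located: bool) -> bool:
    """Decide whether a footprint behind a wall of `level` is disclosed."""
    if level is Transparency.TRANSPARENT:
        return True
    if level is Transparency.TRANSLUCENT:
        return requester_co_located
    return False  # OPAQUE: never disclosed to outsiders


# A translucent wall reveals footprints only to co-located requesters.
assert may_disclose(Transparency.TRANSLUCENT, requester_co_located=True)
assert not may_disclose(Transparency.TRANSLUCENT, requester_co_located=False)
```

The point of the metaphor is that users already understand how physical walls limit observation, so a three-level policy is easier to configure than per-sensor access rules.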

Cited by 72 publications
(51 citation statements)
References 19 publications
“…Then work in [126] introduces a solution to privacy-preserving trust negotiations. Finally, other privacy concerns are tackled in [73], where users can control external access to private sensor data by enforcing their privacy preferences.…”
Section: Mobility Awareness and Adaptability (mentioning; confidence: 99%)
“…It can be achieved by following the assumption that the sensitivity of the users' personal information is not stable; it may vary depending on different circumstances in which the user is involved (Lederer et al., 2003a; Wright et al., 2009; Sapuppo and Sørensen, 2011). Consequently, only information that is relevant, but not sensitive in specific circumstances, should be disclosed at a time (Bunnig, 2009a,b; Kapadia et al., 2007; Langheinrich, 2001; Yee, 2010). Therefore, no standard rules can be applied for all the cases of disclosure of users' personal data (Altman, 1975, 1977; Palen and Dourish, 2003).…”
Section: Human Data Disclosure (mentioning; confidence: 99%)
“…A. Kapadia et al. [21] proposed a privacy language based on the metaphor of physical walls, assuming users understand and accept the privacy implications of a physical wall. The approaches of collaborative filtering [12] and k-anonymity [13,20,22,23] assume some semantics (e.g., spatial or temporal) on the underlying data to implicitly control the granularity of data exposure.…”
Section: Related Work (mentioning; confidence: 99%)
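The excerpt above contrasts virtual walls with k-anonymity, where a record is protected because its quasi-identifier values are shared by at least k records. A minimal sketch of that property check (the records and attribute names are invented for illustration):

```python
# Hedged sketch of a k-anonymity check: a dataset is k-anonymous with
# respect to a set of quasi-identifiers if every combination of their
# values appears in at least k records. Data below is purely illustrative.
from collections import Counter


def is_k_anonymous(records, quasi_identifiers, k):
    """Return True if each quasi-identifier combination covers >= k records."""
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return all(count >= k for count in groups.values())


records = [
    {"zip": "037**", "age": "20-29", "condition": "flu"},
    {"zip": "037**", "age": "20-29", "condition": "cold"},
    {"zip": "038**", "age": "30-39", "condition": "flu"},
]

# The ("038**", "30-39") group has only one record, so k=2 fails.
assert is_k_anonymous(records, ["zip", "age"], 1)
assert not is_k_anonymous(records, ["zip", "age"], 2)
```

This illustrates the excerpt's point: k-anonymity relies on semantics of the underlying data (here, generalized ZIP codes and age ranges) to control disclosure granularity, whereas virtual walls rely on a user-facing physical metaphor.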