The publication of increasing amounts of anonymised open source data has resulted in a worrying rise in the number of successful re-identification attacks. This has privacy and security implications at both the individual and corporate level. This paper uses a Systematic Literature Review to investigate the depth and extent of this problem as reported in peer-reviewed literature. Using a detailed protocol, seven research portals were explored and 10,873 database entries were searched, from which a subset of 220 papers was selected for further review. Of these, 55 papers were deemed within scope and included in the final review. The main findings are that 72.7% of all successful re-identification attacks have taken place since 2009. Most attacks use multiple datasets; the majority have targeted global datasets such as social networking data and have been conducted by US-based researchers. Furthermore, the number of datasets combined in an attack can itself serve as an attribute. Because privacy breaches have security, policy and legal implications (e.g. data protection, Safe Harbor etc.), the work highlights the need for new and improved anonymisation techniques or, indeed, a fresh approach to open source publishing.
Due to their organisational characteristics, many charities are poorly prepared for the General Data Protection Regulation (GDPR). We present an exemplar process for implementing GDPR, together with the DPIA Data Wheel, a DPIA framework devised as part of this case study that accounts for these characteristics. We validate this process and framework by conducting a GDPR implementation with a charity that works with vulnerable adults and processes both special category (sensitive) and personally identifiable data. Although this GDPR implementation was devised for and conducted in the charity sector, it can equally be applied in any organisation that needs to implement GDPR or conduct DPIAs.
Cyber Physical Systems (CPS) seamlessly integrate physical objects with technology, thereby blurring the boundaries between the physical and virtual environments. While this brings many opportunities for progress, it also adds a new layer of complexity to the risk assessment process when attempting to ascertain what privacy risks this might pose to an organisation. In addition, privacy regulations such as the General Data Protection Regulation (GDPR) mandate assessment of privacy risks, including making Data Protection Impact Assessments (DPIAs) compulsory. We present the DPIA Data Wheel, a holistic privacy risk assessment framework based on Contextual Integrity (CI) that practitioners can use to inform decision making around the privacy risks of CPS. This framework facilitates comprehensive contextual inquiry into privacy risk, covering both the elicitation of privacy risks and the identification of appropriate mitigation strategies. Further, the DPIA framework provides organisations with a means of assessing privacy from the perspective of both the organisation and the individual, thereby facilitating GDPR compliance. We empirically evaluate this framework in three different real-world settings. In doing so, we demonstrate how CI can be incorporated into the privacy risk decision-making process in a usable, practical manner that will aid decision makers in making informed privacy decisions.