Transfer learning offers an effective way to customize accurate Student models quickly and at low cost, by transferring the knowledge of Teacher models pre-trained on large datasets via fine-tuning. Many pre-trained Teacher models used in transfer learning are publicly available and maintained on public platforms, increasing their exposure to backdoor attacks. In this paper, we demonstrate a backdoor threat to transfer learning tasks on both image and time-series data that leverages the knowledge of publicly accessible Teacher models and is designed to defeat three commonly adopted defenses: pruning-based, retraining-based, and input pre-processing-based defenses. Specifically, (A) a ranking-based selection mechanism speeds up backdoor trigger generation and perturbation while defeating pruning-based and/or retraining-based defenses; (B) an autoencoder-powered trigger generator produces a robust trigger that can defeat the input pre-processing-based defense while guaranteeing that the selected neuron(s) are significantly activated; and (C) defense-aware retraining generates the manipulated model using reverse-engineered model inputs. We conduct an in-depth study of backdoor attacks in building and operating transfer learning systems for both image and time-series data, launching effective misclassification attacks on Student models over real-world images, brain Magnetic Resonance Imaging (MRI) data, and Electrocardiography (ECG) learning systems. The experiments reveal that our enhanced attack maintains the genuine model's 98.4% and 97.2% classification accuracy on clean image and time-series inputs respectively, while improving the attack success rate by 27.9%–100% on trojaned image inputs and by 27.1%–56.1% on trojaned time-series inputs in the presence of pruning-based and/or retraining-based defenses.
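The ranking-based selection mechanism in (A) can be illustrated with a minimal sketch: rank the neurons of a target Teacher-model layer by their recorded activations over a probe set, then pick the top-k as the neurons the trigger should strongly activate. The function name `rank_neurons`, the activation matrix, and the top-k criterion here are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def rank_neurons(activations: np.ndarray, k: int = 1) -> np.ndarray:
    """Return indices of the top-k neurons by mean activation.

    `activations` is an (n_samples, n_neurons) matrix of activations
    recorded at a chosen layer over a probe dataset. This is a
    hypothetical simplification of a ranking-based selection step:
    the attack would then craft a trigger that drives the selected
    neuron(s) to high activation.
    """
    mean_act = activations.mean(axis=0)          # per-neuron average
    return np.argsort(mean_act)[::-1][:k]        # descending order, top-k

# Toy probe set with 3 neurons: neuron 2 has the highest mean activation.
acts = np.array([[0.1, 0.5, 0.9],
                 [0.2, 0.4, 0.8]])
print(rank_neurons(acts, k=1))  # → [2]
```

Ranking by average activation is one plausible criterion; variants could rank by connection weight to the output layer or by activation variance, depending on which defense (pruning or retraining) the attacker aims to survive.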
With increasing concerns over privacy in software systems, there is growing enthusiasm for methods that support the development of privacy-aware software systems. Inadequate privacy in software system designs can result in users losing sensitive data, such as health and financial information, which may cause financial and reputational loss. Privacy Engineering Methodologies (PEMs) are introduced into software development processes to guide software developers in embedding privacy into the systems they design. However, for PEMs to be successful, it is imperative that software developers have a positive intention to use them; otherwise, developers may attempt to bypass the methodologies or apply them only partially, and hence develop software systems that do not protect user privacy appropriately. To investigate the factors that affect software developers' behavioural intention to follow PEMs, in this article we conducted a study with 149 software developers. The findings show that the usefulness of a PEM to developers' existing work is the strongest determinant of their intention to follow it. Moreover, the compatibility of the PEM with their way of working and how clearly the PEM demonstrates its results when used were also found to be significant. These findings provide important insights into the behaviour of software developers and how they perceive PEMs, and could assist organisations and researchers in deploying PEMs and designing PEMs that are positively accepted by software developers.