Proceedings of the 1st ACM International Workshop on Security and Safety for Intelligent Cyber-Physical Systems 2020
DOI: 10.1145/3417312.3431827
Hard-Label Black-Box Adversarial Attack on Deep Electrocardiogram Classifier

Abstract: By aiding the diagnosis of cardiovascular diseases (CVD) such as arrhythmia, electrocardiograms (ECGs) have progressively improved the prospects for automated diagnosis systems in modern healthcare. Recent years have seen promising applications of deep neural networks (DNNs) in analyzing ECG data, even outperforming cardiovascular experts in identifying certain rhythm irregularities. However, DNNs have been shown to be susceptible to adversarial attacks, which intentionally compromise the models by …

Cited by 4 publications (2 citation statements)
References 14 publications
“…Detecting the perturbations becomes difficult because of the restriction of the objective function or convolution processing. Lam et al. [2020] proposed a black-box attack called the boundary attack, which improves the smoothness of perturbations by using a low-pass Hanning filter. In Figure 1, we plot part of an original ECG signal sample and its counterparts attacked by PGD and SAP.…”
Section: Introduction (mentioning)
confidence: 99%
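For context on the smoothing step this citation describes, below is a minimal, hypothetical sketch of low-pass filtering a candidate perturbation with a Hanning window; the window length, stand-in signal, and function names are illustrative assumptions rather than details taken from Lam et al. [2020].

# Sketch: smoothing an adversarial ECG perturbation with a low-pass Hanning
# window. Window length and signal shape are assumptions for illustration.
import numpy as np

def smooth_perturbation(delta: np.ndarray, window_len: int = 31) -> np.ndarray:
    """Low-pass filter a 1-D perturbation by convolving it with a normalized
    Hanning window, suppressing high-frequency spikes that would look
    unnatural on an ECG trace."""
    window = np.hanning(window_len)
    window /= window.sum()                      # unit-gain (DC) filter
    return np.convolve(delta, window, mode="same")

# Hypothetical usage: x stands in for a clean ECG segment, delta for a raw
# candidate perturbation produced by an attack.
rng = np.random.default_rng(0)
x = np.sin(np.linspace(0, 6 * np.pi, 1000))     # ECG-like toy signal
delta = 0.05 * rng.standard_normal(x.shape)     # noisy candidate perturbation
x_adv = x + smooth_perturbation(delta)          # smoother adversarial example

The idea is simply that convolution with a normalized Hanning window attenuates high-frequency components of the perturbation, which is what makes the attacked signal harder to distinguish from a clean ECG by visual inspection.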
“…In this study, we explored defenses against white-box and black-box adversarial attacks aimed at ECG-based DNNs. SAP Han et al. [2020] is applied to represent the white-box attack, and the boundary attack Lam et al. [2020] is applied to represent the black-box attack. We defended ECG-based DNNs against SAP and the boundary attack with common defense methods, such as adversarial training Goodfellow et al. [2014], defensive distillation Papernot et al. [2016], JR Jakubovitz and Giryes [2018], and NSR regularization Ma and Liang [2020b], and found that these methods performed well against SAP and the boundary attack.…”
Section: Introduction (mentioning)
confidence: 99%
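As a concrete reference point for one of the defenses this citation lists, here is a minimal, hypothetical sketch of an FGSM-style adversarial-training step in the spirit of Goodfellow et al. [2014] for a 1-D ECG classifier; the toy model, epsilon, batch shapes, and loss weighting are assumptions for illustration and do not reproduce the cited experimental setup.

# Sketch: one FGSM-style adversarial-training step for a toy 1-D ECG
# classifier. Model architecture, epsilon and optimizer are assumed values.
import torch
import torch.nn as nn
import torch.nn.functional as F

model = nn.Sequential(                      # toy 1-D CNN classifier
    nn.Conv1d(1, 8, kernel_size=7, padding=3),
    nn.ReLU(),
    nn.AdaptiveAvgPool1d(1),
    nn.Flatten(),
    nn.Linear(8, 2),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
epsilon = 0.01                              # perturbation budget (assumed)

def adversarial_training_step(x: torch.Tensor, y: torch.Tensor) -> float:
    # Craft an FGSM perturbation against the current model ...
    x_req = x.clone().requires_grad_(True)
    loss_clean = F.cross_entropy(model(x_req), y)
    grad = torch.autograd.grad(loss_clean, x_req)[0]
    x_adv = (x + epsilon * grad.sign()).detach()

    # ... then update the model on a mix of clean and adversarial examples.
    optimizer.zero_grad()
    loss = 0.5 * (F.cross_entropy(model(x), y) + F.cross_entropy(model(x_adv), y))
    loss.backward()
    optimizer.step()
    return loss.item()

# Hypothetical batch: 16 single-lead ECG segments of 1000 samples each.
x_batch = torch.randn(16, 1, 1000)
y_batch = torch.randint(0, 2, (16,))
print(adversarial_training_step(x_batch, y_batch))

Defenses such as defensive distillation, Jacobian regularization (JR), and NSR regularization follow the same pattern of modifying the training objective rather than the data, but their specific loss terms are not spelled out here.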