2020 IEEE Power & Energy Society Innovative Smart Grid Technologies Conference (ISGT)
DOI: 10.1109/isgt45199.2020.9087649

Attack on Grid Event Cause Analysis: An Adversarial Machine Learning Approach

Abstract: With the ever-increasing reliance on data for data-driven applications in power grids, such as event cause analysis, the authenticity of data streams has become crucially important. The data can be prone to stealthy adversarial attacks that aim to manipulate measurements so that residual-based bad data detectors cannot detect them, while altering what system operators or event classifiers perceive the actual event to be. This paper investigates the impact of adversarial attacks on convolutional neural network-bas…
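The abstract describes gradient-based adversarial perturbations crafted against a CNN event classifier while kept small enough to evade residual-based bad data detection, and the citation statements below note that the attack applies the technique of [6] (FGSM). The following is only a hedged, minimal PyTorch sketch of an FGSM-style perturbation, not the authors' implementation: the model, tensor shapes, and the epsilon budget are placeholder assumptions for illustration.

```python
import torch
import torch.nn as nn

def fgsm_perturb(model, x, y_true, epsilon):
    """One-step FGSM-style perturbation (Goodfellow et al. [6]).

    x:       batch of measurement windows, e.g. (batch, channels, samples)
    y_true:  true event labels
    epsilon: L-infinity perturbation budget -- kept small so the distorted
             signal stays close to the original and, in the stealthy-attack
             setting, below what a residual-based bad data detector flags.
    All names and shapes here are illustrative assumptions.
    """
    x_adv = x.clone().detach().requires_grad_(True)
    loss = nn.functional.cross_entropy(model(x_adv), y_true)
    loss.backward()
    # Step in the direction that increases the classifier's loss
    return (x_adv + epsilon * x_adv.grad.sign()).detach()
```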

Cited by 25 publications (13 citation statements)
References 12 publications
“…As such, there are several recent proposals in the literature focusing on adversarial distortion of power signals generated in the smart grid [7], [12]-[14]. Niazazari and Livani [14] directly apply the technique proposed in [6] for attacking power event diagnostics ML models. Zhou et al [7] propose white-box attacks, a modified version of [6], on regression ML models designed for power grid load prediction.…”
Section: Related Work
confidence: 99%
“…Niazazari and Livani [10] performed attacks on a multiclass Convolutional Neural Network (CNN) trained on simulated data. The targeted model classifies power grid events such as line energization, capacitor bank energization, or fault.…”
Section: Defense Approaches
confidence: 99%
“…Different schemes [10][11][12] have been proposed recently to defend ML algorithms against adversarial examples. Studies [10,11] made use of adversarial training.…”
Section: Introduction
confidence: 99%
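The last statement above refers to adversarial training as a defense. The exact schemes of the cited studies are not reproduced here; the sketch below is only a generic adversarial-training step under the same FGSM assumption as the earlier sketch, reusing the hypothetical fgsm_perturb helper, with the equal clean/adversarial loss weighting chosen purely for illustration.

```python
import torch
import torch.nn.functional as F

def adversarial_training_step(model, optimizer, x, y, epsilon):
    """Generic adversarial-training step: craft FGSM examples on the fly and
    fit the classifier on both clean and perturbed batches.
    The fgsm_perturb helper and the 50/50 loss mix are assumptions,
    not the scheme of the cited studies."""
    model.train()
    x_adv = fgsm_perturb(model, x, y, epsilon)  # hypothetical helper from the sketch above
    optimizer.zero_grad()
    loss = 0.5 * (F.cross_entropy(model(x), y) + F.cross_entropy(model(x_adv), y))
    loss.backward()
    optimizer.step()
    return loss.item()
```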