2021
DOI: 10.1007/s10994-021-06106-3

SAED: self-attentive energy disaggregation

Abstract: The field of energy disaggregation deals with the approximation of appliance electric consumption using only the aggregate consumption measurement of a mains meter. Recent research developments have used deep neural networks and outperformed previous methods based on Hidden Markov Models. On the other hand, deep learning models are computationally heavy and require huge amounts of data. The main objective of the current paper is to incorporate the attention mechanism into neural networks in order to reduce the…
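
To make the idea concrete, here is a minimal PyTorch sketch of a self-attentive disaggregator: a 1-D convolution embeds a window of aggregate mains readings, one self-attention layer relates all time steps to each other, and a small recurrent-plus-dense head regresses the target appliance's consumption. All layer sizes, names, and the overall layout are illustrative assumptions, not the published SAED architecture.

```python
# Minimal sketch of a self-attentive NILM disaggregator (PyTorch).
# NOTE: layer sizes, names, and layout are illustrative assumptions,
# not the architecture published in the SAED paper.
import torch
import torch.nn as nn

class SelfAttentiveDisaggregator(nn.Module):
    def __init__(self, window: int = 100, hidden: int = 64, heads: int = 4):
        super().__init__()
        # 1-D convolution embeds the aggregate power window.
        self.embed = nn.Conv1d(1, hidden, kernel_size=5, padding=2)
        # Self-attention lets every time step attend to every other one.
        self.attn = nn.MultiheadAttention(hidden, heads, batch_first=True)
        # A small recurrent layer summarizes the attended sequence.
        self.gru = nn.GRU(hidden, hidden, batch_first=True)
        # Dense head regresses the target appliance's consumption window.
        self.head = nn.Linear(hidden, window)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, window) aggregate mains readings.
        h = self.embed(x.unsqueeze(1)).transpose(1, 2)  # (batch, window, hidden)
        h, _ = self.attn(h, h, h)                       # self-attention (Q = K = V)
        _, last = self.gru(h)                           # last: (1, batch, hidden)
        return self.head(last.squeeze(0))               # (batch, window)

model = SelfAttentiveDisaggregator()
mains = torch.randn(8, 100)   # dummy batch of aggregate windows
appliance = model(mains)      # predicted appliance consumption
print(appliance.shape)        # torch.Size([8, 100])
```

A single attention layer in place of deep recurrent stacks is one plausible reading of the abstract's efficiency claim, since attention lets a shallow network capture long-range context in the window.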

Cited by 16 publications (16 citation statements) · References 32 publications

“…Models based on the attention mechanism demonstrate promising results in terms of generalization to unseen data. The attention mechanism is incorporated using the self-attention method [11,20,21] or the transformer architecture [22]. Recently, generative models have been proposed for the problem of NILM by using GANs [23] or variational approaches [24][25][26].…”
Section: Related Work
confidence: 99%
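
As a companion to the second route this quote mentions (attention incorporated via the transformer architecture [22]), here is a hedged sketch built from PyTorch's stock transformer-encoder modules. The dimensions, the two-layer depth, and the omission of positional encoding are simplifying assumptions for illustration, not the cited model.

```python
# Sketch of the transformer-encoder route to attention in NILM (PyTorch).
# Dimensions and layout are illustrative assumptions, not the model of [22].
# Positional encoding is omitted here for brevity.
import torch
import torch.nn as nn

class TransformerDisaggregator(nn.Module):
    def __init__(self, window: int = 100, d_model: int = 64):
        super().__init__()
        self.embed = nn.Linear(1, d_model)   # lift each reading to d_model
        layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=4, dim_feedforward=128, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(d_model, 1)    # per-step appliance estimate

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, window) aggregate readings.
        h = self.embed(x.unsqueeze(-1))      # (batch, window, d_model)
        h = self.encoder(h)                  # stacked self-attention blocks
        return self.head(h).squeeze(-1)      # (batch, window)

out = TransformerDisaggregator()(torch.randn(4, 100))
print(out.shape)  # torch.Size([4, 100])
```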
“…To overcome the aforementioned reproducibility issues, the baseline models are selected based on how easy it is to replicate past experimental results, their wider acceptance by other NILM researchers and the existence of implementations in open-source projects such as NILMTK [6,37]. The baseline models are: a convolutional neural network named "sequence-to-point" (S2P) [10], a recurrent neural network named "online GRU" or "window GRU" (WGRU) [9] and a neural network based on the self-attention mechanism named "self-attentive energy disaggregator" (SAED) by Virtsionis-Gkalinikis et al. [11]. The first two models have been used either as baselines or as a basis to develop new architectures.…”
Section: Architecture Of Neural Network
confidence: 99%
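
For readers unfamiliar with the first baseline, a condensed sketch of the sequence-to-point idea follows: convolutions slide over a window of mains readings and a dense head regresses the appliance power at the window's midpoint only. The filter counts and kernel sizes below are illustrative, not the exact published configuration of S2P [10].

```python
# Condensed sketch of the "sequence-to-point" (S2P) idea: convolve a mains
# window and regress the appliance power at the window's midpoint.
# Filter counts and kernel sizes are illustrative, not the exact S2P config [10].
import torch
import torch.nn as nn

class Seq2Point(nn.Module):
    def __init__(self, window: int = 99):
        super().__init__()
        # Odd kernels with symmetric padding keep the sequence length fixed.
        self.conv = nn.Sequential(
            nn.Conv1d(1, 30, kernel_size=9, padding=4), nn.ReLU(),
            nn.Conv1d(30, 40, kernel_size=7, padding=3), nn.ReLU(),
            nn.Conv1d(40, 50, kernel_size=5, padding=2), nn.ReLU(),
        )
        self.fc = nn.Sequential(
            nn.Flatten(),
            nn.Linear(50 * window, 128), nn.ReLU(),
            nn.Linear(128, 1),  # single output: midpoint appliance power
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, window) aggregate mains; output: (batch, 1) midpoint power.
        return self.fc(self.conv(x.unsqueeze(1)))

pred = Seq2Point()(torch.randn(16, 99))
print(pred.shape)  # torch.Size([16, 1])
```

Predicting one point per window (rather than the whole window) is the design choice that gives S2P its name; at inference time the window slides over the mains signal one step at a time.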
“…The active and reactive power signatures were then matched to the right appliance using a best-likelihood method, and similar "steady-state" elements of the power signal were grouped together. Certain two-state (on/off) appliances have been identified with good accuracy using such clustering approaches [10,11]. These methods, on the other hand, have major trouble detecting more complicated appliances with numerous states (e.g., washing machines) and tend to fail when multiple appliances are operating and switching at the same time [12].…”
Section: Literature Review
confidence: 99%
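
Below is a toy illustration of the event-based pipeline this quote describes: detect step changes in the (active, reactive) power signal, cluster the resulting events, and read mirror-image positive/negative clusters as the on/off transitions of a two-state appliance. The threshold, the synthetic data, and the use of scikit-learn's KMeans are assumptions made for demonstration, not the method of [10,11].

```python
# Toy sketch of the event-based approach described above: detect step
# changes in the (active, reactive) power signal, cluster the events, and
# treat mirror-image positive/negative clusters as one two-state appliance.
# The threshold and the choice of KMeans are illustrative assumptions.
import numpy as np
from sklearn.cluster import KMeans

def detect_events(p: np.ndarray, q: np.ndarray, min_step: float = 30.0):
    """Return (delta_P, delta_Q) pairs where the active power jumps
    between steady states by more than `min_step` watts."""
    dp, dq = np.diff(p), np.diff(q)
    mask = np.abs(dp) > min_step
    return np.column_stack([dp[mask], dq[mask]])

def cluster_events(events: np.ndarray, n_appliances: int = 2):
    """Group on/off transitions; each two-state appliance should yield one
    positive cluster (on) and one mirror-image negative cluster (off)."""
    km = KMeans(n_clusters=2 * n_appliances, n_init=10).fit(events)
    return km.labels_, km.cluster_centers_

# Synthetic mains: a 100 W and a 500 W on/off appliance plus sensor noise.
rng = np.random.default_rng(0)
t = np.arange(1000)
p = 100.0 * ((t // 100) % 2) + 500.0 * ((t // 330) % 2) + rng.normal(0, 2, t.size)
q = 0.1 * p + rng.normal(0, 1, t.size)

labels, centers = cluster_events(detect_events(p, q))
print(np.round(centers))  # four centers near delta_P of +/-100 and +/-500
```

On this toy signal the four cluster centers land near ΔP ≈ ±100 W and ±500 W. The failure mode the quote ends on is also visible in miniature: if two appliances switch within the same sampling interval, their deltas merge into a single spurious event that matches neither cluster.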