2022 18th International Conference on Distributed Computing in Sensor Systems (DCOSS) 2022
DOI: 10.1109/dcoss54816.2022.00035
Efficient Localness Transformer for Smart Sensor-Based Energy Disaggregation

Cited by 6 publications (2 citation statements)
References 19 publications
“…
Kelly et al. [17]    | CNN, RNN      | UK-DALE              | Energy disaggregation, appliance switch-on events
Zhang et al. [19]    | CNN           | REDD                 | Energy disaggregation, appliance switch-on
Xia et al. [18]      | ResNet        | REDD, UK-DALE        | Energy disaggregation
Song et al. [29]     | LSTM          | REDD, UK-DALE        | Energy disaggregation, appliance switch-on
Rafiq et al. [33]    | bi-LSTM       | UK-DALE, ECO         | Energy disaggregation, appliance switch-on
Yue et al. [35]      | BERT4NILM     | REDD, UK-DALE        | Energy disaggregation, appliance switch-on
Yue et al. [38]      | ELTransformer | REDD, UK-DALE        | Energy disaggregation, appliance switch-on
Sykiotis et al. [39] | ELECTRIcity   | REDD, UK-DALE, REFIT | Energy disaggregation, appliance switch-on
… for a 4-layer DB-LSTM, achieving a 66% improvement in F1 score.…”
Section: Methods, Datasets, Tasks
Mentioning confidence: 99%
“…Nie et al. [37] integrated ResNet with a transformer to capture the causal relationships and context-aware properties of individual appliance power values within the overall power sequence. Yue et al. [38] replaced self-attention with local attention to address its poor performance in capturing local signal patterns. Sykiotis et al. [39] proposed a purely transformer-based framework and introduced a novel training method.…”
Section: Methods, Datasets, Tasks
Mentioning confidence: 99%
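The local-attention idea credited to Yue et al. [38] — restricting each query position to a fixed-size window of neighboring time steps instead of the full sequence — can be sketched as a masked scaled dot-product attention. The NumPy toy below is illustrative only (the function name, window size, and masking scheme are my own assumptions, not the paper's implementation):

```python
import numpy as np

def local_attention(q, k, v, window=3):
    """Scaled dot-product attention restricted to a local window.

    Each query position t attends only to key positions s with
    |t - s| <= window, rather than the whole sequence as in
    standard self-attention. Sketch only; the exact formulation
    in the cited paper may differ.
    """
    T, d = q.shape
    scores = q @ k.T / np.sqrt(d)                      # (T, T) attention logits
    idx = np.arange(T)
    mask = np.abs(idx[:, None] - idx[None, :]) > window  # True = outside window
    scores[mask] = -np.inf                             # forbid distant positions
    # Numerically stable row-wise softmax; masked entries become 0.
    scores -= scores.max(axis=1, keepdims=True)
    w = np.exp(scores)
    w /= w.sum(axis=1, keepdims=True)
    return w @ v

rng = np.random.default_rng(0)
x = rng.normal(size=(8, 4))       # a toy aggregate-power feature sequence
out = local_attention(x, x, x, window=2)
print(out.shape)
```

Because the mask always admits the diagonal, every softmax row has at least one finite logit, so the weights are well defined even at sequence boundaries.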