2019
DOI: 10.1016/j.epsr.2019.01.034
Non-intrusive load disaggregation based on deep dilated residual network

Cited by 82 publications (39 citation statements)
References 24 publications
“…The community has therefore focused on both supervised and unsupervised machine learning techniques. Among the supervised techniques, several neural network architectures have been proposed, such as the Multi-Layer Perceptron (MLP) [29], Convolutional Neural Network (CNN) [30][31][32][33][34][35][36], Recurrent Neural Network (RNN) [30,[37][38][39], and Extreme Learning Machine [40], as well as techniques based on Support Vector Machines (SVM) [16,41], K-Nearest Neighbors (kNN) [41,42], naive Bayes classifiers [15], Random Forest classifiers [43] and Conditional Random Fields [44]. Among the unsupervised techniques, those based on Hidden Markov Models were mainly used in this field [26,28,[45][46][47][48], although clustering techniques were also applied [49,50].…”
Section: Introduction
confidence: 99%
“…Supervised techniques use offline training to build a database of information used to design the classifier(s). Some common supervised learning techniques that have been applied in NILM are (shallow) Artificial Neural Networks, mainly the Multilayer Perceptron (MLP) [66,84], concatenated Convolutional Neural Networks (CNNs) [85], Deep Neural Networks [53,[86][87][88][89][90][91], Support Vector Machines (SVM) [66,92], K-Nearest Neighbours (k-NN) [92][93][94], naïve Bayes classifiers [64,94,95] and, recently, linear-chain Conditional Random Fields (CRFs), which take into account how previous states influence the current state and can deal with multi-state loads [96]. In [97], the performance of three classifiers, MLPs, Radial Basis Function (RBF) networks and SVMs with different kernels, is compared by employing odd harmonics (up to the 15th) of the current waveform, measured in a proprietary experimental setup.…”
Section: Load Identification
confidence: 99%
“…The Accuracy and F1-Measure metrics were used to assess the model's performance. To determine the latter, two scores, Recall and Precision, defined analogously to those used by Kolter and Jaakkola [40], were computed. Recall gives the percentage of cases that were correctly classified.…”
Section: Performance Metrics
confidence: 99%
“…Precision is the percentage of correct classifications among only the active appliances. Adopting the designations for the numbers of correct and incorrect estimations given in Table 3, the Recall and Precision scores can be determined using Formulas 3 and 4 [40].…”
Section: Performance Metrics
confidence: 99%
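The metrics quoted above follow the standard definitions of Precision, Recall and F1 from raw classification counts. A minimal sketch, assuming the usual true-positive (TP), false-positive (FP) and false-negative (FN) designations; the example counts are made up for illustration and are not taken from the cited paper or its Table 3:

```python
def precision_recall_f1(tp: int, fp: int, fn: int):
    """Compute Precision, Recall and F1 from raw classification counts."""
    precision = tp / (tp + fp)  # share of predicted-active cases that were truly active
    recall = tp / (tp + fn)     # share of truly active cases that were detected
    f1 = 2 * precision * recall / (precision + recall)  # harmonic mean of the two
    return precision, recall, f1

# Example with hypothetical counts for one appliance:
p, r, f1 = precision_recall_f1(tp=80, fp=20, fn=20)
print(round(p, 2), round(r, 2), round(f1, 2))  # prints: 0.8 0.8 0.8
```

F1 is preferred over plain Accuracy in NILM because appliances are inactive most of the time, so a classifier that always predicts "off" can score high Accuracy while detecting nothing.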