2023
DOI: 10.1109/access.2023.3311641

Non-Intrusive Load Classification and Recognition Using Soft-Voting Ensemble Learning Algorithm With Decision Tree, K-Nearest Neighbor Algorithm and Multilayer Perceptron

Nien-Che Yang, Ke-Lin Sung

Abstract: Non-intrusive load monitoring (NILM) detects the energy consumption of individual appliances by monitoring the overall electricity usage in a building. By analyzing voltage and current characteristics, NILM can recognize the usage patterns of various appliances, thus facilitating energy conservation and management. To implement non-intrusive load classification and recognition more effectively, this study proposes an ensemble learning algorithm based on soft voting, which comprises a decision tree, K-nearest neighbor…
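The full method is not reproduced on this page, so the following is only a minimal sketch of the kind of soft-voting ensemble the abstract names, assuming scikit-learn's VotingClassifier and randomly generated placeholder features in place of the paper's voltage and current data; the hyperparameters shown are illustrative, not the authors' settings.

```python
# Minimal sketch of a soft-voting ensemble (decision tree + KNN + MLP).
# The random arrays below are placeholders for the paper's NILM features.
import numpy as np
from sklearn.ensemble import VotingClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(600, 8))        # placeholder appliance features
y = rng.integers(0, 4, size=600)     # placeholder appliance labels

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

ensemble = VotingClassifier(
    estimators=[
        ("dt", DecisionTreeClassifier(max_depth=8, random_state=0)),
        ("knn", make_pipeline(StandardScaler(),
                              KNeighborsClassifier(n_neighbors=5))),
        ("mlp", make_pipeline(StandardScaler(),
                              MLPClassifier(hidden_layer_sizes=(64,),
                                            max_iter=500, random_state=0))),
    ],
    voting="soft",  # average the predicted class probabilities of the three models
)
ensemble.fit(X_train, y_train)
print("held-out accuracy:", ensemble.score(X_test, y_test))
```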

Cited by 3 publications (3 citation statements) | References 37 publications
“…The classification of an object is based on the most frequently occurring class among the identified k nearest points. The parameter k denotes the number of nearest neighbors influencing the classification process, and the selection of an appropriate k is a nuanced yet crucial step for optimizing the model's performance [43].…”
Section: Dynamic Time Warping With K-Nearest Neighbor (DTW-KNN)
Citation type: mentioning (confidence: 99%)
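As an illustration of this citing statement (not code from either paper), the sketch below applies majority voting among the k nearest points and selects k by cross-validation, one common way to handle the "nuanced yet crucial" choice of k; the toy data is a placeholder.

```python
# Illustrative only: majority-vote KNN with a cross-validated search over k.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import GridSearchCV

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 4))              # placeholder features
y = (X[:, 0] + X[:, 1] > 0).astype(int)    # placeholder binary labels

# The predicted class is the most frequent label among the k nearest points;
# an appropriate k is chosen here by 5-fold cross-validation.
search = GridSearchCV(
    KNeighborsClassifier(),
    param_grid={"n_neighbors": [1, 3, 5, 7, 9, 11]},
    cv=5,
)
search.fit(X, y)
print("best k:", search.best_params_["n_neighbors"],
      "cv accuracy:", round(search.best_score_, 3))
```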
“…Essentially, the synergy between DTW and K-NN capitalizes on DTW's efficacy in capturing temporal nuances and K-NN's proficiency in pattern classification based on similarity. This combined approach facilitates a more comprehensive analysis of time series data, proving particularly beneficial when dealing with complex and dynamic patterns [43].…”
Section: Dynamic Time Warping With K-Nearest Neighbor (DTW-KNN)
Citation type: mentioning (confidence: 99%)
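Again as an illustration rather than the cited authors' implementation, the sketch below pairs a plain dynamic-time-warping distance with a small k-nearest-neighbor vote over toy time series; the helper names dtw_distance and knn_dtw_predict are hypothetical.

```python
# Illustrative DTW distance plus a k-NN vote over toy time series.
import numpy as np

def dtw_distance(a: np.ndarray, b: np.ndarray) -> float:
    """Classic O(len(a) * len(b)) dynamic-time-warping distance."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],       # insertion
                                 cost[i, j - 1],       # deletion
                                 cost[i - 1, j - 1])   # match
    return float(cost[n, m])

def knn_dtw_predict(train_X, train_y, query, k=1):
    """Label the query series by the majority class of its k DTW-nearest series."""
    dists = np.array([dtw_distance(query, s) for s in train_X])
    nearest = np.argsort(dists)[:k]
    labels, counts = np.unique(np.asarray(train_y)[nearest], return_counts=True)
    return labels[np.argmax(counts)]

# Toy example: two sine-like classes with different frequencies.
t = np.linspace(0, 2 * np.pi, 50)
train_X = [np.sin(t), np.sin(t + 0.3), np.sin(2 * t), np.sin(2 * t + 0.3)]
train_y = [0, 0, 1, 1]
query = np.sin(2 * t + 0.1)
print("predicted class:", knn_dtw_predict(train_X, train_y, query, k=3))
```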