2023
DOI: 10.3390/su15010789

Pump Feature Construction and Electrical Energy Consumption Prediction Based on Feature Engineering and LightGBM Algorithm

Abstract: In recent years, research on improving the energy consumption ratio of pumping equipment through control algorithms has advanced. However, the actual behavior of pump equipment does not always correspond to the pump characteristic information, resulting in deviations between the calculated energy consumption operating point and the actual operating point, which ultimately wastes power. To solve this problem, data from circulating pumping equipment in a large pumping facility are analyzed, and the nece…
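For context, the sketch below shows the kind of workflow the abstract describes: regressing pump electrical energy consumption on engineered operating features with LightGBM. This is an assumption-laden illustration, not the paper's pipeline; the file name and feature columns are hypothetical placeholders.

```python
# Minimal sketch (not the paper's code): LightGBM regression on engineered pump
# features. "pump_operating_data.csv" and the column names are hypothetical.
import lightgbm as lgb
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

df = pd.read_csv("pump_operating_data.csv")  # hypothetical data file
features = ["flow_rate", "head", "speed", "inlet_pressure", "outlet_pressure"]
X_train, X_test, y_train, y_test = train_test_split(
    df[features], df["energy_consumption"], test_size=0.2, random_state=42
)

model = lgb.LGBMRegressor(
    objective="regression",
    n_estimators=500,
    learning_rate=0.05,
    num_leaves=31,  # leaf-wise tree growth is LightGBM's default strategy
)
model.fit(X_train, y_train, eval_set=[(X_test, y_test)])

pred = model.predict(X_test)
print("MAE:", mean_absolute_error(y_test, pred))
```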

Cited by 3 publications (2 citation statements)
References: 24 publications
“…The second technique, exclusive feature bundling, groups specific features together to reduce feature dimensionality, which significantly shortens sample processing time and in turn improves prediction accuracy [39].…”
Section: LightGBM (mentioning)
confidence: 99%
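As a rough illustration of exclusive feature bundling (EFB), the sketch below trains LightGBM on synthetic sparse one-hot features with bundling enabled and disabled via the `enable_bundle` parameter. The data and settings are assumptions for demonstration, not taken from the cited work.

```python
# Minimal sketch (assumption, not the cited paper's experiment): LightGBM bundles
# mutually exclusive sparse columns internally; `enable_bundle` toggles EFB.
import numpy as np
import lightgbm as lgb

rng = np.random.default_rng(0)
# 5,000 samples with 200 sparse one-hot columns: good candidates for bundling.
X = np.zeros((5000, 200))
X[np.arange(5000), rng.integers(0, 200, 5000)] = 1.0
y = X @ rng.normal(size=200) + rng.normal(scale=0.1, size=5000)

params = {"objective": "regression", "verbosity": -1}

# EFB enabled (LightGBM's default): mutually exclusive columns are merged into
# bundles, shrinking the effective feature dimensionality before histogram building.
booster_efb = lgb.train({**params, "enable_bundle": True},
                        lgb.Dataset(X, label=y), num_boost_round=50)

# EFB disabled, kept only to compare training cost on the raw 200 columns.
booster_raw = lgb.train({**params, "enable_bundle": False},
                        lgb.Dataset(X, label=y), num_boost_round=50)
```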
“…The pooling layer, on the other hand, subsamples the convolution results, reducing the dimensions of the convolution vector and mitigating overfitting. Its function is outlined in Equation (39). The aggregate feature values obtained from this subsampling process are consolidated into M = (M_1, M_2, …”
Section: CNN (mentioning)
confidence: 99%
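A minimal sketch of the subsampling step this quoted passage describes, assuming simple non-overlapping 1-D max pooling; the function and values are illustrative and do not reproduce the cited paper's Equation (39).

```python
# Minimal sketch (assumption): max pooling over a 1-D convolution output reduces
# its dimension and yields the aggregated feature values M = (M_1, M_2, ..., M_k).
import numpy as np

def max_pool_1d(conv_output: np.ndarray, pool_size: int = 2) -> np.ndarray:
    """Non-overlapping max pooling; trailing values that do not fill a
    complete window are dropped for simplicity."""
    n = (len(conv_output) // pool_size) * pool_size
    return conv_output[:n].reshape(-1, pool_size).max(axis=1)

c = np.array([0.1, 0.7, 0.3, 0.9, 0.5, 0.2])  # convolution feature map
M = max_pool_1d(c, pool_size=2)               # -> array([0.7, 0.9, 0.5])
print(M)
```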