2021
DOI: 10.1002/ese3.933
Using MLP‐GABP and SVM with wavelet packet transform‐based feature extraction for fault diagnosis of a centrifugal pump

Abstract: This paper explores artificial-intelligence training schemes based on a multilayer perceptron (MLP), considering back propagation (BP) and a genetic algorithm (GA). The hybrid scheme is compared with the traditional support vector machine (SVM) approach in the literature to analyze both fault and normal scenarios of a centrifugal pump. A comparative analysis of the performance of the variables was carried out using both schemes. The study used features extracted at three decomposition levels based on the wavelet packet transform. In o…
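The abstract's wavelet-packet feature-extraction step can be sketched briefly. The following is a minimal illustration only, not the authors' pipeline: it assumes the PyWavelets library, a db4 wavelet, node-energy features, and a synthetic test signal.

```python
# Minimal sketch of wavelet packet feature extraction at three
# decomposition levels (illustrative; wavelet choice, energy
# features, and test signal are assumptions, not the paper's setup).
import numpy as np
import pywt

def wpt_energy_features(signal, wavelet="db4", level=3):
    """Energies of the 2**level terminal nodes of a wavelet
    packet decomposition, usable as a feature vector."""
    wp = pywt.WaveletPacket(data=signal, wavelet=wavelet,
                            mode="symmetric", maxlevel=level)
    nodes = wp.get_level(level, order="natural")
    return np.array([np.sum(node.data ** 2) for node in nodes])

# Synthetic vibration-like signal for demonstration
fs = 1000  # sampling rate in Hz (assumed)
t = np.arange(0, 1, 1 / fs)
x = np.sin(2 * np.pi * 50 * t) + 0.3 * np.random.randn(t.size)
print(wpt_energy_features(x))  # 8 features at decomposition level 3
```

Each such feature vector would then be fed to the MLP or SVM classifier to discriminate fault from normal operation.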

Cited by 13 publications (4 citation statements)
References 38 publications
“…This kind of method requires a great deal of human effort, and manually chosen hyperparameters may not suit the task at hand. Unlike empirical manual tuning, this work employed a program auto-tuning technique in an attempt to improve the efficiency of model tuning through a customized parameter range, random parameter selection, and early termination of poor performers. The selection range of the learning rate was set as [1 × 10⁻⁶, 1 × 10⁻³], and the batch size was chosen from {8, 16, 32, 64}. Forty sets of hyperparameters were randomly selected from these ranges for training, with an upper limit of 50 training epochs.…”
Section: Neural Network and Its Hyperparameter Tuning
confidence: 99%
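The quoted tuning procedure amounts to a random hyperparameter search with early stopping. Below is a minimal sketch of such a loop using the ranges stated in the quote; the training function, early-termination rule, and loss values are placeholders (assumptions), not the cited work's implementation.

```python
# Random hyperparameter search with early termination of poor
# trials; ranges and counts follow the quoted text, everything
# else (model, loss, stopping rule) is a placeholder assumption.
import math
import random

LR_LOW, LR_HIGH = 1e-6, 1e-3     # learning-rate range from the quote
BATCH_SIZES = [8, 16, 32, 64]    # batch sizes from the quote
N_TRIALS, MAX_EPOCHS = 40, 50    # 40 random sets, 50-epoch cap

def train_one_epoch(lr, batch_size, epoch):
    # Placeholder for one epoch of real training; returns a
    # mock validation loss so the sketch runs standalone.
    return 1.0 / (1 + epoch) + lr * 100 + random.random() * 0.01

best = {"loss": float("inf"), "hparams": None}
for _ in range(N_TRIALS):
    # Log-uniform learning rate, discrete batch size.
    lr = 10 ** random.uniform(math.log10(LR_LOW), math.log10(LR_HIGH))
    bs = random.choice(BATCH_SIZES)
    trial_best = float("inf")
    for epoch in range(MAX_EPOCHS):
        trial_best = min(trial_best, train_one_epoch(lr, bs, epoch))
        # Early termination: abandon a trial that lags far
        # behind the best result seen so far (assumed rule).
        if epoch >= 10 and trial_best > 2 * best["loss"]:
            break
    if trial_best < best["loss"]:
        best = {"loss": trial_best, "hparams": {"lr": lr, "batch": bs}}
print(best)
```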
“…This method achieves fault diagnosis by collecting bearing vibration signals and performing feature extraction, analysis, and identification on them. The classical methods include root mean square [5], crest factor [6], fast Fourier transform [7], and wavelet transform and wavelet packet transform [8]. The main advantage of these methods is that they can be used for noise reduction and feature extraction without requiring an explicit mathematical model [9].…”
Section: Introduction
confidence: 99%
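The classical features named in this citation are simple to compute. The sketch below evaluates RMS, crest factor, and an FFT-based dominant frequency on a synthetic signal; the signal and sampling rate are assumptions for demonstration.

```python
# Classical vibration features mentioned above: RMS, crest
# factor, and FFT spectrum (synthetic signal is an assumption).
import numpy as np

def rms(x):
    return np.sqrt(np.mean(x ** 2))

def crest_factor(x):
    # Ratio of peak amplitude to RMS; rises with impulsive faults.
    return np.max(np.abs(x)) / rms(x)

fs = 2000  # sampling rate in Hz (assumed)
t = np.arange(0, 1, 1 / fs)
x = np.sin(2 * np.pi * 120 * t) + 0.2 * np.random.randn(t.size)

print(f"RMS = {rms(x):.3f}, crest factor = {crest_factor(x):.3f}")
spectrum = np.abs(np.fft.rfft(x))
freqs = np.fft.rfftfreq(x.size, 1 / fs)
print(f"dominant frequency = {freqs[np.argmax(spectrum)]:.1f} Hz")
```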
“…At present, most of the methods adopted by researchers for optimizing the design of centrifugal pumps remain semi-empirical and semi-theoretical. Common optimization methods include the similarity design method, the velocity coefficient method, and the area ratio method [26]. The hydraulic performance of centrifugal pump models designed with such methods usually cannot reach the optimal level.…”
Section: A Multi-objective Optimization Mathematical Model of Centrif…
confidence: 99%
“…Training diagnostic models on analog circuits is particularly valuable for fault diagnosis. With the continuous refinement of data-analysis methods, troubleshooting technology for analog circuits is also improving; common troubleshooting techniques, including PCA [1], grid search [2], particle swarm optimization (PSO) [3], the ant colony algorithm (ACA) [4], simulated annealing (SA) [5], the genetic algorithm (GA) [6], the back-propagation neural network (BP) [6], self-organizing maps (SOM) [7], the extreme learning machine (ELM) [8], decision trees [9], random forests [10], and SVM [11], all achieve good classification results to some extent [12].…”
Section: Introduction
confidence: 99%
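Of the classifiers listed, SVM is the baseline the reviewed paper compares against. A minimal scikit-learn sketch of SVM-based fault/normal classification is shown below; the synthetic feature vectors stand in for real extracted features and are purely an assumption.

```python
# Minimal SVM fault-vs-normal classification sketch with
# scikit-learn; features and labels are synthetic assumptions.
import numpy as np
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)
normal = rng.normal(0.0, 1.0, size=(100, 8))  # "normal" features
fault = rng.normal(1.5, 1.0, size=(100, 8))   # "fault" features
X = np.vstack([normal, fault])
y = np.array([0] * 100 + [1] * 100)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = SVC(kernel="rbf", C=1.0).fit(X_tr, y_tr)
print("test accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```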