“…The output layer applies only a simple linear transformation to the hidden-layer output. Generally, the non-negative, non-linear Gaussian function is used as the neuron activation function [10]; thus, the output of the $k$th hidden-layer neuron can be expressed as

$$h_k(\mathbf{x}) = \exp\!\left(-\frac{\lVert \mathbf{x} - C_k \rVert^2}{2\sigma^2}\right),$$

where $C_k$ is the centre of the $k$th hidden-layer neuron activation function, and $\sigma$ is the width of the hidden-layer neuron activation function. Thus, the $q$th network output can be expressed as

$$y_q = \sum_{k} \omega_{kq}\, h_k(\mathbf{x}),$$

where $\omega_{kq}$ is the weight connecting the $k$th hidden neuron to the $q$th output. When $C_k$, $\sigma$, and $\omega_{kq}$ are well trained, the RBF neural network is established.…”
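The forward pass described above (Gaussian hidden activations followed by a linear output layer) can be sketched as follows. This is a minimal illustration assuming the standard Gaussian RBF form; the function name `rbf_forward` and all toy parameter values are hypothetical, not from the source.

```python
import numpy as np

def rbf_forward(x, centers, sigma, weights):
    """Forward pass of an RBF network.

    x       : input vector, shape (d,)
    centers : C_k, shape (K, d) -- one centre per hidden neuron (assumed layout)
    sigma   : shared width of the Gaussian activations
    weights : omega_kq, shape (K, Q) -- hidden-to-output weights
    """
    # Hidden layer: h_k = exp(-||x - C_k||^2 / (2 * sigma^2))
    sq_dist = np.sum((centers - x) ** 2, axis=1)
    h = np.exp(-sq_dist / (2.0 * sigma ** 2))
    # Output layer: a simple linear transformation, y_q = sum_k omega_kq * h_k
    return h @ weights

# Hypothetical toy example: 2 hidden neurons, 2-D input, 1 output
centers = np.array([[0.0, 0.0],
                    [1.0, 1.0]])
weights = np.array([[1.0],
                    [2.0]])
y = rbf_forward(np.array([0.0, 0.0]), centers, sigma=1.0, weights=weights)
# The neuron centred at the input contributes exp(0) = 1; the other exp(-1).
```

In practice $C_k$ is often chosen by clustering the training inputs, $\sigma$ set from inter-centre distances, and $\omega_{kq}$ fitted by linear least squares, since the output layer is linear in the weights.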