“…Therefore, the activation functions commonly used in neural networks introduce nonlinearity to improve the expressive ability of the model; commonly used activation functions include sigmoid, tanh, ReLU, and softmax. 16 By layer, a DNN can be divided into an input layer, hidden layers, and an output layer: the first layer is the input layer, the last layer is the output layer, and the layers in between are hidden layers.…”
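The layer structure and nonlinear activations described above can be sketched as a minimal forward pass; this is a hypothetical illustration using NumPy, and the layer sizes and random weights are assumptions, not from the source:

```python
import numpy as np

# Common nonlinear activation functions mentioned in the text.
def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    return np.maximum(0.0, x)

def softmax(x):
    e = np.exp(x - np.max(x))  # subtract the max for numerical stability
    return e / e.sum()

# Minimal DNN: input layer (3 units) -> hidden layer (4 units) -> output layer (2 units).
rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 3))  # hidden-layer weights
b1 = np.zeros(4)
W2 = rng.normal(size=(2, 4))  # output-layer weights
b2 = np.zeros(2)

def forward(x):
    h = relu(W1 @ x + b1)        # hidden layer with a ReLU nonlinearity
    return softmax(W2 @ h + b2)  # output layer, normalized over classes

x = np.array([0.5, -1.0, 2.0])   # input-layer values
y = forward(x)
```

Without the nonlinear activations, the stacked layers would collapse into a single linear map; the ReLU (and softmax) between layers is what gives the network its added expressive power.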