Improving the efficiency and convergence rate of multilayer backpropagation neural network algorithms is an important area of research. Recent research has paid increasing attention to entropy-based criteria in adaptive systems, and several principles based on the maximization or minimization of a cross-entropy function have been proposed. One way to apply entropy criteria in learning systems is to minimize the entropy of the error between two variables: typically, one is the output of the learning system and the other is the desired target. This paper proposes a method for improving the efficiency and convergence rate of multilayer backpropagation (BP) neural networks. The usual mean square error (MSE) minimization principle is replaced by minimization of an entropy error function (EEM) of the differences between the multilayer perceptron's output and the desired target. The method also improves the convergence rate of the backpropagation algorithm by adapting the learning rate: the learning rate determined for each epoch is different and depends on the weights and gradient values of the previous epoch. Experimental results show that the proposed method considerably improves the convergence rate of the backpropagation algorithm.
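To make the entropy-of-error idea concrete, the following is a minimal sketch, not the paper's implementation: it minimizes Rényi's quadratic entropy of the errors, H2 = -log V, by gradient ascent on the information potential V estimated with a Gaussian Parzen window over pairwise error differences. The toy linear model, data, kernel width `sigma`, and learning rate `eta` are all assumptions made for illustration; the paper's actual EEM formulation and network architecture are not given in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data and a single linear unit (a stand-in for one layer).
X = rng.normal(size=(200, 3))
w_true = np.array([1.5, -2.0, 0.5])
y = X @ w_true + 0.05 * rng.normal(size=200)

w = np.zeros(3)
sigma = 2.0   # Parzen kernel width -- an assumed hyperparameter
eta = 2.0     # learning rate for this sketch

for _ in range(300):
    e = y - X @ w                           # per-sample errors
    d = e[:, None] - e[None, :]             # pairwise error differences
    k = np.exp(-d**2 / (4 * sigma**2))      # Gaussian kernel on differences
    # Information potential V = mean(k); minimizing the quadratic entropy
    # H2 = -log V is equivalent to maximizing V, so we ascend its gradient.
    dV_de = (2 / e.size**2) * (k * (-d / (2 * sigma**2))).sum(axis=1)
    grad_w = -X.T @ dV_de                   # chain rule: de/dw = -x
    w += eta * grad_w                       # gradient ascent on V

print(np.round(w, 2))
```

Because the entropy of the error is invariant to a constant shift, a bias term would normally be re-centered after training; the sketch omits a bias so the weights are fully determined by the criterion.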
General Terms
Artificial neural networks, error back propagation, mean square error, entropy error, learning rate.
Keywords
Artificial neural network; back propagation; mean square error; entropy error; learning rate.
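The abstract also describes a learning rate that changes every epoch as a function of the previous epoch's weights and gradients. The exact update rule is not stated there; one well-known scheme with exactly that dependence is the Barzilai-Borwein step size, sketched below on a toy quadratic problem as an assumed illustration, not the authors' rule.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy linear least-squares problem standing in for a network's loss surface.
X = rng.normal(size=(100, 4))
w_true = rng.normal(size=4)
y = X @ w_true

def grad(w):
    # Gradient of the mean squared error of a linear model.
    return 2 * X.T @ (X @ w - y) / len(y)

w = np.zeros(4)
g = grad(w)
eta = 1e-3                          # initial step before any history exists
for epoch in range(100):
    w_new = w - eta * g
    g_new = grad(w_new)
    dw, dg = w_new - w, g_new - g
    # Barzilai-Borwein (BB1) step for the NEXT epoch, computed purely from
    # the previous epoch's weight change and gradient change:
    #   eta = (dw . dg) / (dg . dg)
    denom = dg @ dg
    if denom > 0:
        eta = abs(dw @ dg) / denom
    w, g = w_new, g_new
```

The point of the sketch is the structure of the update: each epoch's `eta` is recomputed from differences of the previous epoch's weights and gradients rather than held fixed.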