Models for artificial intelligence, machine learning, and neural networks are typically implemented on digital computers with a von Neumann architecture; few studies have considered analog neural networks. In our previous study, we represented the connection weights of a neural network with multipliers that compute the product of each input signal and its corresponding weight. However, the operating range of these multipliers is limited by semiconductor characteristics, which restricts the input and output ranges of the network and makes the circuit operation unstable. Here, we propose a logarithmic four-quadrant multiplier for representing connection weights. Experiments show that this multiplier operates stably over a wide range. We present a model built entirely from analog electronic circuits; its learning time is far shorter than that of models implemented on a digital computer. We also increased the number of units and network layers, suggesting the possibility of a hardware implementation of a deep learning model.
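The abstract gives no circuit equations, but the principle behind a logarithmic multiplier can be sketched numerically: the magnitude of the product is obtained by adding logarithms and exponentiating, while the sign is handled separately to cover all four quadrants. The function below is an illustrative software analogue only, not the proposed circuit.

```python
import math

def log_multiply(x, w):
    """Illustrative four-quadrant multiply in the log domain.

    Mimics a logarithmic analog multiplier: the magnitude product
    is exp(ln|x| + ln|w|), compressing a wide dynamic range into
    the adder's operating range, and the output sign is resolved
    from the input signs.
    """
    if x == 0 or w == 0:
        return 0.0  # log is undefined at zero; the product is zero
    sign = 1.0 if (x > 0) == (w > 0) else -1.0
    return sign * math.exp(math.log(abs(x)) + math.log(abs(w)))
```

For example, `log_multiply(3.0, -2.0)` returns -6.0 (up to floating-point error), matching the direct product while performing only addition in the log domain.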