We present the complete equilibrium solution of a neural-network model in which each neuron is connected to a small fraction of the others by symmetric, Hebb-rule synapses. At first replica symmetry is assumed, but the results are then corrected for full replica-symmetry breaking, which leads to a substantial increase in storage capacity.

A breakthrough in the field of neural networks was made by Hopfield [1], who pointed out a mapping between them and spin glasses. Under this mapping, the average steady-state neural firing patterns correspond to equilibrium states of the corresponding spin system. Spin glasses themselves are solved using an iterative procedure [2] which is "believed to be exact" for the SK spin glass [3] (for a review of the technique in this and other disordered problems, see [4]). When the method was applied to highly connected neural networks, however, it transpired that only the first, simple steps need be taken to obtain a good result, although a complete solution may be very difficult [5, 6].
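To make the model concrete, here is a minimal sketch (ours, not from the paper) of the kind of network described: symmetric Hebb-rule couplings placed on a random graph in which each pair of neurons is connected with a small probability c, followed by zero-temperature asynchronous dynamics. All names and parameter values (N, P, c) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 200   # neurons
P = 10    # stored patterns
c = 0.1   # connection probability: each neuron sees a small fraction of the others

# Random +/-1 patterns to store.
xi = rng.choice([-1, 1], size=(P, N))

# Symmetric dilution mask: bond (i, j) present with probability c.
mask = rng.random((N, N)) < c
mask = np.triu(mask, 1)
mask = mask | mask.T

# Hebb-rule couplings restricted to the diluted graph, scaled by mean connectivity.
J = (xi.T @ xi) / (c * N) * mask
np.fill_diagonal(J, 0.0)

# Zero-temperature asynchronous dynamics, starting from a noisy copy of pattern 0.
S = xi[0].copy()
flip = rng.random(N) < 0.1
S[flip] *= -1

for _ in range(10):                  # a few full sweeps over the network
    for i in rng.permutation(N):
        h = J[i] @ S                 # local field on neuron i
        S[i] = 1 if h >= 0 else -1

overlap = (S @ xi[0]) / N            # overlap m = 1 means perfect retrieval
print(f"overlap with stored pattern: {overlap:.3f}")
```

With the mean connectivity cN well above the number of stored patterns, the final overlap should be close to 1, i.e. the corrupted pattern is retrieved as an equilibrium state.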
We introduce optimal learning with a neural network, which we define as minimising the expected generalisation error. We find that an optimally trained spherical perceptron can learn a linearly separable rule as well as any possible network. We sketch an algorithm for achieving optimal learning, and simulation results support our conclusions. Optimal learning of a well-known, significant unlearnable problem, the …
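As a rough illustration of what "optimal" means here, the sketch below approximates Bayes-optimal learning of a linearly separable (teacher) rule by averaging many students consistent with the training set, a crude version-space average. This is our stand-in under stated assumptions, not the authors' algorithm; the helper names (`perceptron`, `gen_error`) are ours, and ε = θ/π is the standard generalisation error for a perceptron with Gaussian inputs.

```python
import numpy as np

rng = np.random.default_rng(1)

N, P = 20, 40                         # input dimension, number of training examples
teacher = rng.standard_normal(N)
teacher /= np.linalg.norm(teacher)

X = rng.standard_normal((P, N))       # Gaussian inputs
y = np.sign(X @ teacher)              # labels from the linearly separable rule

def perceptron(X, y, w0, max_updates=10_000):
    """Classical perceptron rule: fix one misclassified example per step."""
    w = w0.copy()
    for _ in range(max_updates):
        wrong = np.flatnonzero(np.sign(X @ w) != y)
        if wrong.size == 0:
            break
        i = wrong[0]
        w += y[i] * X[i]
    return w / np.linalg.norm(w)

# Crude version-space average: many consistent students from random starting points.
students = np.array([perceptron(X, y, rng.standard_normal(N)) for _ in range(50)])
w_avg = students.mean(axis=0)
w_avg /= np.linalg.norm(w_avg)

def gen_error(w):
    """Generalisation error eps = theta / pi, theta the student-teacher angle."""
    return np.arccos(np.clip(w @ teacher, -1.0, 1.0)) / np.pi

print("single consistent student:", gen_error(students[0]))
print("version-space average    :", gen_error(w_avg))
```

The averaged student should generalise better than any single consistent student, which is the intuition behind learning that minimises the expected generalisation error rather than merely fitting the training set.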