Using a probabilistic approach, the transient dynamics of sparsely connected Hopfield neural networks is studied for arbitrary degree distributions. A recursive scheme is developed to determine the time evolution of the overlap parameters. As illustrative examples, explicit calculations of the dynamics are performed for networks with binomial, power-law, and uniform degree distributions. The results are in good agreement with extensive numerical simulations. They indicate that, at the same average degree, network performance improves gradually with increasing sharpness of the degree distribution, and that the most efficient degree distribution for global storage of patterns is the delta function.

As a tractable toy model of associative memory, which can also be viewed as an extension of the Ising model, the Hopfield neural network [1] has received much attention over the past two decades. The equilibrium properties of fully connected Hopfield neural networks have been well studied using spin-glass theory, especially the replica method [2,3]. Their dynamics has also been studied using the generating functional method [4] and signal-to-noise analysis [5-7].

Compared with its huge number of neurons, the human brain cortex contains only a small number of interconnections per neuron (∼10^11 neurons and ∼10^14 synapses, i.e., roughly 10^3 synapses per neuron). To simulate a biologically more realistic model than the fully connected network, various randomly diluted models have been studied, including the extremely diluted model [8,9], the finitely diluted model [10,11], and the finite-connectivity model [12,13]. However, neural connectivity is suggested to be far more complex than a fully random graph; for example, the neural networks of C. elegans and of the cat cortex were reported to be small-world and scale-free, respectively [14,15]. To go one step closer to a biologically realistic model, many numerical studies have been carried out, focusing on how the topology, degree distribution, and clustering coefficient of a network affect the computational performance of the Hopfield model [16-19]. At the same average connectivity, a random network was reported to be more efficient for the storage and retrieval of patterns than either a small-world network or a regular network [17]. Torres et al. reported that the storage capacity is higher for a neural network with