We consider statistical-mechanics models for spin systems built on hierarchical structures, which provide a simple example of a non-mean-field framework. We show that the decay of the coupling with the distance between spins can give rise to peculiar features and phase diagrams much richer than their mean-field counterparts. In particular, we consider the Dyson model, mimicking ferromagnetism in lattices, and we prove the existence of a number of metastable states, beyond the ordered one, which become stable in the thermodynamic limit. This feature is retained when the hierarchical structure is coupled with the Hebb rule for learning, hence mimicking the modular architecture of neurons, and gives rise to an associative network able to perform both as a serial processor and as a parallel processor, depending crucially on the external stimuli and on the rate at which the interaction decays with distance; however, these emergent multitasking features reduce the network capacity with respect to the mean-field counterpart. The analysis is accomplished through statistical mechanics, graph theory, signal-to-noise techniques and numerical simulations, all in full agreement. Our results shed light on the biological complexity shown by real networks, and suggest future directions for understanding more realistic models.

PACS numbers: 07.05.Mh

In the last decade, extensive research on complexity in networks has evidenced (among many results [1, 2]) the widespread occurrence of modular structures and the importance of quasi-independent communities in many research areas, such as neuroscience [3, 4], biochemistry [5] and genetics [6], just to cite a few. In particular, the modular, hierarchical architecture of cortical neural networks has by now been analyzed in depth [7], yet the beauty revealed by this investigation is not captured by the statistical mechanics of neural networks, neither standard ones (i.e., performing serial processing) [8, 9] nor multitasking ones (i.e., performing parallel processing) [10, 11].
In fact, these models are intrinsically mean-field, and thus lack a proper definition of a metric distance among neurons. Hierarchical structures have been proposed in the past as (relatively) simple models for ferromagnetic transitions beyond the mean-field scenario, namely the Dyson hierarchical model (DHM) [12], and are currently experiencing renewed interest as a route to understanding glass transitions in finite dimension [13, 14]. Therefore, times are finally ripe for approaching neural networks embedded in a non-mean-field architecture, and this letter summarizes our findings on associative neural networks where the Hebbian kernel is coupled with the Dyson topology.

First, we study the DHM by mixing the Amit-Gutfreund-Sompolinsky ansatz approach [9] (to select candidate retrievable states) with the interpolation technique (to check their thermodynamic stability), and we show that, as soon as ergodicity is broken, beyond the ferromagnetic/pure state (largely discussed in the past, see e.g. [15, 16]), a number of metastable states suddenly appear and become stable in the ...
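To make the hierarchical construction concrete, the following is a minimal sketch of the DHM energy, assuming the standard recursive definition of the Dyson Hamiltonian in which two blocks of size 2^k are glued at level k+1 by a ferromagnetic coupling decaying as 2^(-2*sigma*(k+1)); the parameter names J and sigma and the toy comparison below are illustrative choices, not taken from the paper.

```python
import numpy as np

def dyson_energy(spins, J=1.0, sigma=0.75):
    """Recursive energy of a Dyson-hierarchical-model configuration.

    `spins` is an array of 2**k Ising variables (+/-1). At each level the
    energy of the full block is the sum of the energies of its two halves
    plus an all-to-all term penalized by the level-dependent decay factor:

        H_{k}(S) = H_{k-1}(S_left) + H_{k-1}(S_right)
                   - J / 2**(2*sigma*k) * (sum of the 2**k spins)**2

    with sigma in (1/2, 1] tuning how fast couplings decay with
    hierarchical distance (sigma -> 1/2 recovers the mean-field limit).
    """
    n = len(spins)
    if n == 1:
        return 0.0  # a single spin carries no interaction energy
    k = int(np.log2(n))  # hierarchical level of this block
    left, right = spins[: n // 2], spins[n // 2:]
    return (
        dyson_energy(left, J, sigma)
        + dyson_energy(right, J, sigma)
        - J / 2 ** (2 * sigma * k) * spins.sum() ** 2
    )

# The fully magnetized state has lower energy than the "mixed" state
# whose two halves are magnetized with opposite signs; the latter is one
# of the metastable states discussed in the text.
n = 16
aligned = np.ones(n)
mixed = np.concatenate([np.ones(n // 2), -np.ones(n // 2)])
print(dyson_energy(aligned) < dyson_energy(mixed))  # True
```

In the associative-network variant discussed below, the uniform coupling J at each level is replaced by a Hebbian kernel built from stored patterns, while the same hierarchical decay with distance is retained.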