We introduce a bipartite, diluted, and frustrated network as a sparse restricted Boltzmann machine and show its thermodynamic equivalence to an associative working memory able to retrieve several patterns in parallel without falling into the spurious states typical of classical neural networks. We focus on systems processing in parallel a finite (up to logarithmic growth in the volume) number of patterns, mirroring the low-storage regime of the standard Amit-Gutfreund-Sompolinsky theory. Results obtained through statistical mechanics, the signal-to-noise technique, and Monte Carlo simulations are in excellent overall agreement and carry interesting biological insights. Indeed, these associative networks open new perspectives on the multitasking features expressed by complex systems, e.g., neural and immune networks.
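As a concrete illustration of the parallel-retrieval mechanism summarized above, the following minimal sketch (our own, with assumed parameter values, not the authors' code) builds a low-storage Hopfield-type network from diluted patterns, relaxes it with zero-noise asynchronous dynamics, and prints the Mattis magnetizations, several of which end up simultaneously non-zero.

```python
# Minimal sketch (not the authors' code): parallel retrieval in a low-storage
# Hopfield-type network with diluted patterns, via zero-noise asynchronous updates.
import numpy as np

rng = np.random.default_rng(0)
N, P, d = 2000, 3, 0.4          # neurons, patterns, dilution (fraction of blank entries)

# Diluted patterns: entries are +1/-1 with probability (1-d)/2 each, 0 (blank) with probability d
xi = rng.choice([-1, 0, 1], size=(P, N), p=[(1 - d) / 2, d, (1 - d) / 2])

# Hebbian couplings (no self-interaction)
J = xi.T @ xi / N
np.fill_diagonal(J, 0.0)

# Random initial configuration, asynchronous zero-noise updates
sigma = rng.choice([-1, 1], size=N)
for _ in range(20):
    for i in rng.permutation(N):
        h = J[i] @ sigma
        if h != 0:
            sigma[i] = np.sign(h)

# Mattis magnetizations: several are simultaneously non-zero (parallel retrieval)
m = xi @ sigma / N
print("Mattis magnetizations:", np.round(m, 3))
```

In this toy setting, increasing the dilution d typically lets more patterns be retrieved at once, at the cost of lower individual overlaps.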
In this work, we adopt a statistical-mechanics approach to investigate basic, systemic features exhibited by adaptive immune systems. The lymphocyte network made up of B and T cells is modeled by a bipartite spin glass where, following biological prescriptions, the links connecting B cells and T cells are sparse. Interestingly, this link dilution is shown to enable the system to orchestrate parallel strategies to fight several pathogens at the same time; this multitasking capability is a remarkable, key property of immune systems, as multiple antigens are always present within the host. We also define the stochastic process ruling the temporal evolution of lymphocyte activity and show its relaxation toward an equilibrium measure, which allows statistical-mechanics investigations. Analytical results are compared with Monte Carlo simulations and signal-to-noise outcomes, showing excellent overall agreement. Finally, within our model, a rationale for the experimentally well-evidenced correlation between lymphocytosis and autoimmunity is obtained; this sheds further light on the systemic features exhibited by immune networks.
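To make the bipartite construction concrete, the sketch below (our illustration; the sizes and link probability are assumed, not taken from the paper) draws a sparse, signed B-T interaction matrix and computes the effective T-T coupling obtained by integrating out the B-cell activities (assumed Gaussian), which is the Hebbian-like kernel underlying the equivalence with an associative network.

```python
# Sketch (our illustration, not the paper's code): a sparse bipartite B-T network
# and the effective pairwise coupling among T cells obtained by marginalizing
# over the (Gaussian) B-clone activities, yielding a Hebbian-like kernel.
import numpy as np

rng = np.random.default_rng(1)
N_T, N_B, p_link = 500, 50, 0.05   # T cells, B clones, link probability (assumed values)

# Sparse signed links: each B clone talks to only a few T cells
mask = rng.random((N_B, N_T)) < p_link
xi = mask * rng.choice([-1, 1], size=(N_B, N_T))

# Effective T-T coupling after integrating out the B-cell activities
J_eff = xi.T @ xi / N_T
np.fill_diagonal(J_eff, 0.0)

# Sparsity of the original bipartite graph vs. density of the effective network
print("bipartite link density:", mask.mean())
print("effective coupling density:", np.mean(J_eff != 0))
```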
In this work, we first revise some extensions of the standard Hopfield model in the low-storage limit, namely the correlated-attractor case and the multitasking case recently introduced by the authors. The former is based on a modification of the Hebbian prescription, which induces a coupling between consecutive patterns whose strength is tuned by a parameter a. In the latter, dilution is introduced in the pattern entries, in such a way that a fraction d of them is blank. We then merge these two extensions to obtain a system able to retrieve several patterns in parallel, and the quality of retrieval, encoded by the set of Mattis magnetizations {m µ }, is reminiscent of the correlation among patterns. By tuning the parameters d and a, qualitatively different outputs emerge, ranging from highly hierarchical to symmetric. The investigations are carried out by means of both numerical simulations and a statistical-mechanics analysis, properly adapting a technique originally developed for spin glasses, i.e., the Hamilton-Jacobi interpolation, with excellent agreement between the two. Finally, we show the thermodynamic equivalence of this associative network with a (restricted) Boltzmann machine and study its stochastic dynamics to obtain a dynamical picture as well, perfectly consistent with the static scenario discussed earlier.
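A minimal numerical rendering of the merged model may help fix ideas. The kernel below uses one common form of correlated-attractor coupling between consecutive patterns with strength a, applied to patterns diluted with probability d (the exact prescription adopted by the authors may differ), and reports the resulting Mattis magnetizations after a zero-noise relaxation started from the first pattern.

```python
# Sketch (assumed kernel, not necessarily the authors' exact prescription): a Hebbian
# coupling with a correlation term of strength a between consecutive patterns, built
# from patterns diluted with probability d, plus the resulting Mattis magnetizations.
import numpy as np

rng = np.random.default_rng(2)
N, P, d, a = 1000, 4, 0.3, 0.5

xi = rng.choice([-1, 0, 1], size=(P, N), p=[(1 - d) / 2, d, (1 - d) / 2])

# Plain Hebbian part plus a coupling between pattern mu and pattern mu+1
J = xi.T @ xi
for mu in range(P - 1):
    J += a * (np.outer(xi[mu], xi[mu + 1]) + np.outer(xi[mu + 1], xi[mu]))
J /= N
np.fill_diagonal(J, 0.0)

# Start from pattern 0 (blanks filled at random) and relax with zero-noise dynamics
sigma = np.where(xi[0] != 0, xi[0], rng.choice([-1, 1], size=N))
for _ in range(20):
    for i in rng.permutation(N):
        h = J[i] @ sigma
        if h != 0:
            sigma[i] = np.sign(h)

print("Mattis magnetizations:", np.round(xi @ sigma / N, 3))
```

Varying d and a in this toy run changes how the overlaps distribute across the patterns, from a dominant one with smaller satellites to a more symmetric spread.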
We consider statistical-mechanics models for spin systems built on hierarchical structures, which provide a simple example of a non-mean-field framework. We show that the decay of the coupling with the distance between spins can give rise to peculiar features and to phase diagrams much richer than their mean-field counterparts. In particular, we consider the Dyson model, mimicking ferromagnetism on lattices, and we prove the existence of a number of metastable states, beyond the ordered one, which become stable in the thermodynamic limit. This feature is retained when the hierarchical structure is coupled with the Hebb rule for learning, hence mimicking the modular architecture of neurons, and gives rise to an associative network able to perform both as a serial and as a parallel processor, depending crucially on the external stimuli and on the rate at which interactions decay with distance; however, these emergent multitasking features reduce the network capacity with respect to the mean-field counterpart. The analysis is carried out through statistical mechanics, graph theory, the signal-to-noise technique, and numerical simulations, in full mutual consistency. Our results shed light on the biological complexity shown by real networks and suggest future directions for understanding more realistic models.

PACS numbers: 07.05.Mh

In the last decade, extensive research on complexity in networks has evidenced (among many results [1, 2]) the widespread presence of modular structures and the importance of quasi-independent communities in many research areas, such as neuroscience [3, 4], biochemistry [5], and genetics [6], just to cite a few. In particular, the modular, hierarchical architecture of cortical neural networks has by now been analyzed in depth [7], yet the richness revealed by this line of investigation is not captured by the statistical mechanics of neural networks, neither by standard ones (i.e., performing serial processing) [8, 9] nor by multitasking ones (i.e., performing parallel processing) [10, 11]. In fact, these models are intrinsically mean-field, thus lacking a proper notion of metric distance among neurons.

Hierarchical structures have been proposed in the past as (relatively) simple models for ferromagnetic transitions beyond the mean-field scenario, namely the Dyson hierarchical model (DHM) [12], and are currently experiencing renewed interest for understanding glass transitions in finite dimension [13, 14]. The time is therefore ripe for approaching neural networks embedded in a non-mean-field architecture, and this letter summarizes our findings on associative neural networks where the Hebbian kernel is coupled with the Dyson topology.

First, we study the DHM by mixing the Amit-Gutfreund-Sompolinsky ansatz approach [9] (to select candidate retrievable states) with the interpolation technique (to check their thermodynamic stability), and we show that, as soon as ergodicity is broken, beyond the ferromagnetic/pure state (largely discussed in the past, see e.g. [15, 16]), a number of metastable states suddenly appear and become stable in the ...
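For readers unfamiliar with the DHM, the following sketch (our own; the decay exponent, overall scale, and function names are illustrative) builds the hierarchical coupling matrix recursively, so that the interaction between two spins is a sum of contributions from every hierarchical level containing both of them and therefore decays with their hierarchical distance.

```python
# Sketch (our illustration): couplings of a Dyson-like hierarchical ferromagnet
# with 2**k spins, where the level-l contribution scales as 4**(-sigma * l).
# Parameter names and values are ours, chosen only for illustration.
import numpy as np

def dyson_couplings(k, sigma=0.8, J0=1.0):
    """Recursively build the 2**k x 2**k hierarchical coupling matrix."""
    if k == 0:
        return np.zeros((1, 1))
    half = dyson_couplings(k - 1, sigma, J0)
    n = half.shape[0]
    J = np.zeros((2 * n, 2 * n))
    J[:n, :n] = half
    J[n:, n:] = half
    # Every pair inside the full block of size 2**k gains a level-k contribution
    J += J0 * 4.0 ** (-sigma * k)
    np.fill_diagonal(J, 0.0)
    return J

J = dyson_couplings(k=6)          # 64 spins
print("nearest vs. farthest coupling:", J[0, 1], J[0, -1])
```

Spins in the same smallest block accumulate contributions from all levels, while spins separated at the top level share only the weakest one, which is the non-mean-field ingredient the letter builds on.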
Inspired by a continuously increasing interest in modeling and framing complex systems within a thermodynamic rationale, in this paper we continue our investigation into adapting well-known techniques (originally developed in areas of physics and mathematics far from the present one) to solve for the free energy of mean-field spin models in a statistical-mechanics setting. Focusing on the test cases of bipartite spin systems endowed with all possible interactions (self and reciprocal), we show that both the fully interacting bipartite ferromagnet and its spin-glass counterpart can be solved, at least at the replica-symmetric level, via the fundamental theorem of calculus, through an analogy with Hamilton-Jacobi theory, and finally with a mapping to a Fourier diffusion problem. All these techniques are presented in full detail, symmetrically for ferromagnets and spin glasses, and serve as powerful tools in the investigation of complex systems.
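As a minimal check of the kind of replica-symmetric solutions these techniques target, the sketch below (assumed parametrization and notation, not the paper's) iterates the mean-field self-consistency equations of a fully interacting bipartite ferromagnet to a fixed point; the interpolation machinery discussed in the paper should recover the same magnetizations as extremizers of the free energy.

```python
# Minimal numerical sketch (assumed parametrization, not the paper's notation):
# fixed-point iteration of the self-consistency equations for a fully interacting
# bipartite ferromagnet with intra-party couplings J11, J22, inter-party coupling
# J12, and relative party size alpha.
import numpy as np

def solve_magnetizations(beta, J11=1.0, J22=1.0, J12=0.5, alpha=0.5,
                         h=1e-6, tol=1e-10, max_iter=10000):
    m1, m2 = 0.5, 0.5                      # initial guess in the ordered phase
    for _ in range(max_iter):
        new1 = np.tanh(beta * (J11 * alpha * m1 + J12 * (1 - alpha) * m2 + h))
        new2 = np.tanh(beta * (J22 * (1 - alpha) * m2 + J12 * alpha * m1 + h))
        if abs(new1 - m1) + abs(new2 - m2) < tol:
            break
        m1, m2 = new1, new2
    return m1, m2

for beta in (0.5, 1.0, 2.0):
    print(beta, solve_magnetizations(beta))
```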