Abstract: Modern networked computing systems follow scenarios that differ from those modeled by classical Turing machines. For example, their architecture and functionality may change over time as components enter or disappear. Also, as a rule, their components interact with each other and with the environment at unpredictable times and in unpredictable manners, and they evolve in ways that are not pre-programmed. Finally, although the life span of the individual components may be finite, the life span of the systems as a whole is practically unlimited. The examples range from families of cognitive automata to (models of) the Internet and to communities of intelligent communicating agents. We present several models for describing the computational behaviour of evolving interactive systems, in order to characterize their computational power and efficiency. The analysis leads to new models of computation, including 'interactive' Turing machines (ITMs) with advice and new, natural characterizations of non-uniform complexity classes. We will argue that ITMs with advice can serve as an adequate reference model for capturing the essence of computations by evolving interactive systems, showing that 'in theory' the latter are provably more powerful than classical systems.
A finite automaton, the so-called neuromaton, realized by a finite discrete recurrent neural network working in parallel computation mode, is considered. Both the size of neuromata (i.e., the number of neurons) and their descriptional complexity (i.e., the number of bits in the neuromaton representation) are studied. It is proved that a constant time delay of the neuromaton output does not play a role within a polynomial descriptional complexity. It is shown that any regular language given by a regular expression of length n is recognized by a neuromaton with Θ(n) neurons. Further, it is proved that this network size is, in the worst case, optimal. On the other hand, in general there need not be an equivalent polynomial-length regular expression for a given neuromaton. Then, two specialized constructions of neural acceptors of the optimal descriptional complexity Θ(n) for recognition of a single n-bit string are described. They both require O(√n) neurons and either O(n) connections with constant weights or O(√n) edges with weights of size O(2^√n). Furthermore, the concept of Hopfield languages is introduced by means of so-called Hopfield neuromata (i.e., neural networks with symmetric weights). It is proved that the class of Hopfield languages is strictly contained in the class of regular languages. The necessary and sufficient so-called Hopfield condition, stating when a regular language is a Hopfield language, is formulated. A construction of a Hopfield neuromaton is presented for a regular language satisfying the Hopfield condition. The class of Hopfield languages is shown to be closed under union, intersection, concatenation, and complement, and it is not closed under iteration. Finally, the problem of whether a regular language given by a neuromaton (or by a Hopfield acceptor) is nonempty is proved to be PSPACE-complete. As a consequence, the same result is obtained for the neuromaton equivalence problem.
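To make the central notion concrete, the following is a minimal sketch (not taken from the paper) of a neuromaton in the abstract's sense: a finite discrete recurrent network of threshold neurons, updated in parallel once per input bit. The toy network below uses a single neuron with a self-loop to recognize the regular language of binary strings containing at least one 1; the weight matrix, thresholds, and function names are illustrative assumptions, not the paper's constructions.

```python
# Illustrative sketch (not from the paper): a one-neuron "neuromaton",
# i.e. a discrete recurrent threshold network, recognizing the regular
# language of binary strings that contain at least one 1.

def step(state, x, weights, thresholds):
    """One parallel update: each neuron fires (outputs 1) iff its weighted
    input from the recurrent state plus the current input bit reaches its
    threshold."""
    n = len(state)
    new_state = []
    for j in range(n):
        total = sum(weights[j][i] * state[i] for i in range(n)) + weights[j][n] * x
        new_state.append(1 if total >= thresholds[j] else 0)
    return new_state

def neuromaton_accepts(bits):
    # One neuron with a recurrent self-loop of weight 1, an input weight
    # of 1, and threshold 1 computes s' = s OR x, so the neuron latches
    # to 1 as soon as a 1 is read.
    weights = [[1, 1]]   # [recurrent weight, input weight]
    thresholds = [1]
    state = [0]
    for x in bits:
        state = step(state, x, weights, thresholds)
    return state[0] == 1  # the single neuron doubles as the output neuron

print(neuromaton_accepts([0, 1, 0]))  # True
print(neuromaton_accepts([0, 0, 0]))  # False
```

The abstract's Θ(n)-neuron construction for arbitrary regular expressions is far more involved; this sketch only shows the computation model, where recognition is a sequence of synchronous threshold updates and acceptance is read off a designated output neuron.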