1998
DOI: 10.1145/273865.273914

Theory of neuromata

Abstract: A finite automaton, the so-called neuromaton, realized by a finite discrete recurrent neural network working in parallel computation mode, is considered. Both the size of neuromata (i.e., the number of neurons) and their descriptional complexity (i.e., the number of bits in the neuromaton representation) are studied. It is proved that a constant time delay of the neuromaton output does not play a role within a polynomial descriptional complexity. It is shown that any regular language given by a regular express…

Cited by 35 publications (24 citation statements); references 13 publications. Citation statement types: 3 supporting, 21 mentioning, 0 contrasting. Citing publications span 1999–2021.
“…The computational power of discrete-time recurrent NNs with the saturated-linear activation function depends on the descriptive complexity of their weight parameters [6,7]. NNs with integer weights, corresponding to binary-state (binary for short) networks which employ the Heaviside activation function (with Boolean outputs 0 or 1), coincide with finite automata (FAs) recognizing regular languages [8,9,10,11,12,13]. Rational weights make the analog-state (analog for short) NNs (with real-valued outputs in the interval [0, 1]) computationally equivalent to Turing machines (TMs) [10,14], and thus (by the real-time simulation [14]) polynomial-time computations of such networks are characterized by the fundamental complexity class P.…”
Section: Analog Neuron Hierarchy (mentioning)
confidence: 99%
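To make the integer-weight correspondence concrete, here is a minimal runnable sketch, not the paper's size-optimal construction: a Heaviside-activation network with integer weights and one-hot state neurons that recognizes the regular language of binary strings containing an even number of 1s. The two-phase update per symbol (an AND layer, then an OR layer) and all neuron names are assumptions for illustration; note the constant output delay this introduces, which, per the abstract, does not affect polynomial descriptional complexity.

```python
import numpy as np

def heaviside(z):                      # binary activation: 1 iff z >= 0
    return (z >= 0).astype(int)

# One-hot state neurons (s_even, s_odd); four AND-term neurons a0..a3.
# Each input bit takes two parallel steps: the AND layer reads (state, input),
# then the state neurons OR the relevant AND terms -- a constant delay.
def run_neuromaton(bits):
    s = np.array([1, 0])               # start in the accepting "even" state
    for x in bits:
        a = heaviside(np.array([       # integer weights and thresholds only
            s[0] + (1 - x) - 2,        # a0 = s_even AND NOT x
            s[1] + x - 2,              # a1 = s_odd  AND x
            s[0] + x - 2,              # a2 = s_even AND x
            s[1] + (1 - x) - 2,        # a3 = s_odd  AND NOT x
        ]))
        s = heaviside(np.array([a[0] + a[1] - 1,   # next s_even = a0 OR a1
                                a[2] + a[3] - 1])) # next s_odd  = a2 OR a3
    return bool(s[0])                  # the output neuron is the "even" state

for w in ["", "1", "10", "1011"]:
    print(repr(w), run_neuromaton([int(c) for c in w]))
```

Every update here has the threshold form H(w·v + b) with integer w and b, which is exactly the binary-state regime the quote attributes to the finite-automaton equivalence.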
“…In addition, the units 5 and nxt from this cycle synchronize the incident neurons 6 ∈ Ṽ and out = 7 ∈ Ṽ, respectively, so that the unit 6 can be activated only at the time instants t = 3k for k > 0, by (13), while the output neuron out can fire only at the time instants τₖ₊₁ = 3k + 1 for k ≥ 0. Hence, the result of the recognition is reported by the output neuron out, as indicated in boldface in Table 1, for each of the four prefixes of x (the empty string ε, 1, 10, and 101) at the time instants τ₁ = 1, τ₂ = 4, τ₃ = 7, τ₄ = 10, respectively, according to (9).…”
Section: Neural Language Acceptors With One Analog Unit (mentioning)
confidence: 99%
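The synchronization described above can be pictured with a small clock: a minimal sketch, assuming a ring of three binary units around which activity circulates, so that a gated downstream neuron can fire only at every third step (cf. the instants t = 3k and τₖ₊₁ = 3k + 1 in the quote). The ring wiring and names are illustrative, not the construction from the cited paper.

```python
def heaviside(z):                 # 1 iff z >= 0
    return int(z >= 0)

c = [1, 0, 0]                     # clock ring: exactly one unit active per step
fired = []
for t in range(1, 12):
    c = [c[2], c[0], c[1]]        # activity circulates around the 3-cycle
    gate = c[0]                   # stands in for a "nxt"-style gating unit
    if heaviside(gate - 1):       # downstream unit fires only when gated
        fired.append(t)
print(fired)                      # [3, 6, 9] -> enabled only at t = 3k
```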
“…A language L corresponds to a single-class classification (decision) problem which is solved by a neural network N, in the sense that N accepts the language L containing all positive input instances of the problem, i.e., those that belong to the class. For finite NNs, the following input/output protocol has been used [2,6,14,15,29,30,31,32,38,40]. An input word (string) x = x₁…”
Section: The Neural Network Model (mentioning)
confidence: 99%
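The truncated protocol can be paraphrased: the word x = x₁… is presented one symbol per step, and the output neuron reports, at designated instants, whether the prefix read so far is accepted. Below is a minimal sketch of this driver loop, with an assumed step() interface and a toy parity network standing in for the actual constructions (which, as in the quote above, may need several network steps per symbol).

```python
def recognize(step, state, bits):
    """Feed the word bit by bit; read the output neuron after each prefix."""
    verdicts = []
    for x in bits:
        state = step(state, x)         # one parallel update of all neurons
        verdicts.append(state["out"])  # output neuron reports this prefix
    return verdicts

def parity_step(state, x):             # toy stand-in for a network update
    s = state["s"] ^ x
    return {"s": s, "out": 1 - s}      # accept prefixes with an even 1-count

print(recognize(parity_step, {"s": 0, "out": 1}, [1, 0, 1]))   # [0, 0, 1]
```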
“…depends on the descriptive complexity of their weight parameters [30,38]. NNs with integer weights w_ji ∈ Z, corresponding to binary-state networks with 2^s global states, coincide with finite automata which accept precisely the regular languages [2,22,40]. For example, size-optimal implementations of a given (deterministic) finite automaton with m states by an NN with Θ(√m) neurons have been elaborated [14,15,34].…”
Section: Neural Network and The Chomsky Hierarchy (mentioning)
confidence: 99%
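A quick back-of-the-envelope check of the two size figures in this quote: s binary neurons span 2^s global states, so roughly log₂(m) neurons are an information-theoretic floor for representing m automaton states, while the cited constructions achieve Θ(√m) neurons; the reasons for the gap are developed in the works cited as [14,15,34].

```python
import math

# s binary neurons have 2**s global states, so ceil(log2(m)) is an
# information-theoretic floor, while the cited constructions use ~sqrt(m).
for m in [16, 256, 4096]:
    print(m, math.ceil(math.log2(m)), math.isqrt(m))
```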