In this paper, we elaborate upon the claim that clustering in the recurrent layer of recurrent neural networks (RNNs) reflects meaningful information processing states even prior to training [1], [2]. By concentrating on activation clusters in RNNs, while not throwing away the continuous state space network dynamics, we extract predictive models that we call neural prediction machines (NPMs). When RNNs with sigmoid activation functions are initialized with small weights (a common technique in the RNN community), the clusters of recurrent activations emerging prior to training are indeed meaningful and correspond to Markov prediction contexts. In this case, the extracted NPMs correspond to a class of Markov models, called variable memory length Markov models (VLMMs). In order to appreciate how much information has really been induced during the training, the RNN performance should always be compared with that of VLMMs and NPMs extracted before training as the "null" base models. Our arguments are supported by experiments on a chaotic symbolic sequence and a context-free language with a deep recursive structure.
Index Terms: Complex symbolic sequences, information latching problem, iterative function systems, Markov models, recurrent neural networks (RNNs).
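To make the NPM-extraction idea above concrete, the following is a minimal sketch, assuming a small sigmoid RNN with small random weights, a one-hot symbol encoding, and k-means standing in for the clustering step; all sizes, names, and the stand-in random sequence are illustrative, not the authors' implementation.

```python
# Sketch: extract a neural prediction machine (NPM) from an *untrained* RNN.
# Sizes, the clustering method, and the random sequence are assumptions.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
alphabet = 4          # number of symbols
hidden = 8            # recurrent units
W_in = rng.normal(scale=0.1, size=(hidden, alphabet))   # small random weights,
W_rec = rng.normal(scale=0.1, size=(hidden, hidden))    # no training at all

def run(sequence):
    """Collect recurrent activations while driving the untrained sigmoid RNN."""
    h, states = np.zeros(hidden), []
    for s in sequence:
        x = np.eye(alphabet)[s]                          # one-hot input
        h = 1.0 / (1.0 + np.exp(-(W_in @ x + W_rec @ h)))
        states.append(h.copy())
    return np.array(states)

seq = rng.integers(0, alphabet, size=5000)               # stand-in symbolic sequence
states = run(seq)

# Cluster activations; each cluster acts as a Markov-like prediction context.
k = 16
labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(states[:-1])

# NPM = next-symbol counts conditioned on the cluster of the current state.
counts = np.ones((k, alphabet))                          # Laplace smoothing
for c, nxt in zip(labels, seq[1:]):
    counts[c, nxt] += 1
npm = counts / counts.sum(axis=1, keepdims=True)         # predictive distributions
```

Each cluster then plays the role of a Markov-like prediction context: the smoothed next-symbol counts collected per cluster give the NPM's predictive distributions, and running the same procedure after training yields the trained-network counterpart against which the "null" base model can be compared.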
This paper presents the design, background, and experimental results of IPTables applied in an IPv4 network and IP6Tables applied in an IPv6 network, compared across several tested parameters. The experimental testbed environment is based on a P2P grid used to launch DDoS attacks. The IPTables tool is used for packet filtering and consequently for preventing DoS/DDoS attacks. It allows a system administrator to configure the tables, chains, and rules it stores in order to manage incoming and outgoing packets. Packets are treated according to the results of matching them against these rules, and a rule in one chain can jump to another chain in the same table. We employ the P2P grid environment to carry out and coordinate a DDoS attack on the availability of services, simulating a real DDoS attack launched indirectly through many compromised computing systems. The same routing protocols and the same firewall rules were used for the IPv4 and the IPv6 network. The main aim was to analyse the pros and cons of the new IP6Tables tool compared with IPTables in IPv4 networks with respect to resistance to DDoS attacks, which remain one of the most significant threats in IPv6 networks.
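As an illustration of how such mirrored rules might look, the sketch below applies the same SYN rate-limiting rules through IPTables and IP6Tables; the specific limits, the chain used, and the Python wrapper are assumptions made for illustration, not the rule set used in the experiments.

```python
# Illustrative sketch only: apply the *same* SYN-flood rate-limiting rules with
# iptables (IPv4) and ip6tables (IPv6). Limits and chains are assumptions.
# Requires root privileges to actually modify the firewall.
import subprocess

RULES = [
    # Accept new TCP connections only up to a rate limit, then drop the rest.
    ["-A", "INPUT", "-p", "tcp", "--syn",
     "-m", "limit", "--limit", "25/second", "--limit-burst", "100",
     "-j", "ACCEPT"],
    ["-A", "INPUT", "-p", "tcp", "--syn", "-j", "DROP"],
]

def apply_rules(tool):
    """Append the rules to the INPUT chain using the given tool (iptables/ip6tables)."""
    for rule in RULES:
        subprocess.run([tool] + rule, check=True)

if __name__ == "__main__":
    apply_rules("iptables")    # IPv4 filter table
    apply_rules("ip6tables")   # IPv6 counterpart, same rule syntax
```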
Abstract. A lot of attention is now being focused on connectionist models known under the name "reservoir computing". The most prominent example of these approaches is a recurrent neural network architecture called the echo state network (ESN). ESNs have been successfully applied to several real-valued time series modeling tasks, where they performed exceptionally well. Using ESNs for processing symbolic sequences also seems attractive. In this work we experimentally support the claim that the state space of an ESN is organized according to the Markovian architectural bias principles when processing symbolic sequences. We compare the performance of ESNs with connectionist models that explicitly use the Markovian architectural bias property, with variable length Markov models, and with recurrent neural networks trained by advanced training algorithms. Moreover, we show that the number of reservoir units plays a role similar to the number of contexts in variable length Markov models.
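A minimal next-symbol ESN along the lines discussed above might look as follows; the reservoir size, input scaling, spectral radius, and ridge readout are assumed for illustration and are not the exact configuration from the paper.

```python
# Minimal ESN next-symbol predictor (illustrative sketch; all settings assumed).
import numpy as np

rng = np.random.default_rng(1)
alphabet, units = 4, 100

W_in = rng.uniform(-0.5, 0.5, size=(units, alphabet))
W = rng.uniform(-0.5, 0.5, size=(units, units))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))        # spectral radius < 1 (echo state property)

def run_reservoir(seq):
    """Drive the fixed reservoir with one-hot symbols and collect its states."""
    h, out = np.zeros(units), []
    for s in seq:
        h = np.tanh(W_in @ np.eye(alphabet)[s] + W @ h)
        out.append(h.copy())
    return np.array(out)

seq = rng.integers(0, alphabet, size=3000)       # stand-in symbolic sequence
X, Y = run_reservoir(seq)[:-1], np.eye(alphabet)[seq[1:]]

# Only the linear readout is trained (ridge regression); the reservoir stays fixed.
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(units), X.T @ Y).T
scores = X @ W_out.T                              # unnormalized next-symbol scores
```

Only the linear readout is trained; the fixed contractive reservoir is what produces the Markovian organization of the state space, and varying `units` plays a role analogous to varying the number of contexts in a variable length Markov model.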
A new method for modeling nonlinear systems, called echo state networks (ESNs), has been proposed recently [5]. ESNs make use of the dynamics created by a large, randomly initialized layer of recurrent units. The dynamical behavior of untrained recurrent networks has already been explained in the literature, and models exploiting this behavior have been studied [6], [9]. They are based on the fact that the activities of the recurrent layer of a recurrent network randomly initialized with small weights reflect the history of the inputs presented to the network. Knowing how the recurrent layer stores information and understanding the state dynamics of recurrent neural networks, we propose a modified ESN architecture. The only "true" recurrent connections are backward connections from the output to the recurrent units, and the reservoir is built only from "forwardly" connected recurrent units. We show that this simplified version of the ESN can also be successful in modeling nonlinear systems.
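A sketch of this simplified architecture is given below, under the assumption that "forwardly" connected units can be modeled by a strictly lower-triangular (cycle-free) reservoir matrix and that the output feedback is teacher-forced while collecting states; the sizes and the toy target are illustrative, not the paper's setup.

```python
# Sketch of a "feed-forward reservoir" ESN: the reservoir matrix has no cycles,
# so the only true recurrence is the output-to-reservoir feedback. Illustrative only.
import numpy as np

rng = np.random.default_rng(2)
n_in, n_res, n_out = 1, 50, 1

W_in   = rng.uniform(-0.5, 0.5, size=(n_res, n_in))
W_res  = np.tril(rng.uniform(-0.5, 0.5, size=(n_res, n_res)), k=-1)  # forward-only links
W_back = rng.uniform(-0.5, 0.5, size=(n_res, n_out))                 # only "true" recurrence

def collect_states(u, y_teacher):
    """Drive the network with input u while teacher-forcing the output feedback."""
    h, y_prev, states = np.zeros(n_res), np.zeros(n_out), []
    for t in range(len(u)):
        h = np.tanh(W_in @ u[t] + W_res @ h + W_back @ y_prev)
        states.append(h.copy())
        y_prev = y_teacher[t]          # feedback taken from the desired output
    return np.array(states)

u = np.zeros((200, n_in))                        # autonomous generator: no external input
y = np.sin(0.2 * np.arange(200)).reshape(-1, 1)  # toy target signal
X = collect_states(u, y)
W_out = np.linalg.lstsq(X, y, rcond=None)[0].T   # linear readout, trained in one shot
```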