Information security is an important and growing need. The most common schemes used in detection systems are pattern- or signature-based and anomaly-based. Anomaly-based schemes use a set of metrics that outline normal system behavior; any significant deviation from the established profile is treated as an anomaly. This paper contributes an anomaly-based scheme that monitors the bandwidth consumption of a subnetwork at the Universidad Michoacana, in Mexico. A model of normal behavior is built from the subnetwork's bandwidth consumption, and the presence of an anomaly indicates that something is misusing the network (viruses, worms, denial of service, or any other kind of attack). This work also presents a scheme for automatic architecture design and parameter optimization of Hidden Markov Models (HMMs), based on Evolutionary Programming (EP). The variables used by the HMMs are the network bandwidth consumption (IN and OUT) and the time at which the network activity occurs. The system was tested with univariate and bivariate observation sequences to analyze and detect anomalous behavior. The HMMs designed and trained by EP were compared against semi-random HMMs trained by the Baum-Welch algorithm; in a second experiment, they were compared against HMMs created by an expert user. The EP-designed HMMs outperformed the alternatives in all cases. Finally, we made the HMMs time-aware by including time as another variable. This made the HMMs capable of detecting activity patterns that are normal during one period of time but anomalous at others. For instance, a heavy load on the network may be completely normal during working hours, but anomalous at night or on weekends.

Hidden Markov Models have been applied to speech and gesture recognition, as well as to anomaly detection. In an HMM, the estimation of good model parameters affects the performance of the recognition or detection process [3]. The HMM parameters can be determined during an iterative process called "training". The Baum-Welch algorithm [4] is one method for setting an HMM's parameters, but it has a drawback: as a local optimization method, it may converge to a local optimum.

Global search techniques can be used to optimize an HMM's parameters. Genetic Algorithms (GA) and Evolutionary Programming (EP) are global optimization techniques that can be applied to this task [3]. Studies using GA to train HMMs are relatively rare, particularly in comparison with the large literature on applying GA to neural network training [2].

In the study presented in this paper we want to know, given certain events or behaviors of the objects under study, what output can be expected or anticipated. That is, we are interested in studying the behavior of a set of random variables over time, i.e., in working with stochastic processes. In other words, given the behavior of a computer network, the problem is to automatically develop an HMM capable of discerning between normal and anomalous behavior. This tas...
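As a purely illustrative sketch of this kind of approach (not the authors' implementation), the following Python fragment evolves the parameters of a discrete-observation HMM with evolutionary programming, using the log-likelihood of a training sequence, computed with the scaled forward algorithm, as the fitness function. For simplicity the number of states is fixed rather than evolved, bandwidth samples are assumed to be quantised into a small symbol alphabet, and all function names (forward_loglik, ep_train, and so on) are hypothetical.

    import numpy as np

    rng = np.random.default_rng(0)

    def forward_loglik(pi, A, B, obs):
        # Scaled forward algorithm: log P(obs | pi, A, B) for a discrete HMM.
        alpha = pi * B[:, obs[0]]
        scale = alpha.sum()
        loglik = np.log(scale)
        alpha = alpha / scale
        for o in obs[1:]:
            alpha = (alpha @ A) * B[:, o]
            scale = alpha.sum()
            loglik += np.log(scale)
            alpha = alpha / scale
        return loglik

    def random_hmm(n_states, n_symbols):
        # Random initial distribution, transition matrix and emission matrix.
        return (rng.dirichlet(np.ones(n_states)),
                rng.dirichlet(np.ones(n_states), size=n_states),
                rng.dirichlet(np.ones(n_symbols), size=n_states))

    def mutate(pi, A, B, sigma=0.05):
        # Gaussian mutation followed by renormalisation to keep rows stochastic.
        def perturb(M):
            M = np.clip(M + rng.normal(0.0, sigma, M.shape), 1e-6, None)
            return M / M.sum(axis=-1, keepdims=True)
        return perturb(pi), perturb(A), perturb(B)

    def ep_train(obs, n_states=3, n_symbols=8, pop_size=20, generations=200):
        # (mu + mu) evolutionary programming: each parent produces one mutated
        # offspring; the best pop_size individuals (by log-likelihood) survive.
        population = [random_hmm(n_states, n_symbols) for _ in range(pop_size)]
        for _ in range(generations):
            offspring = [mutate(*ind) for ind in population]
            pool = population + offspring
            pool.sort(key=lambda ind: forward_loglik(*ind, obs), reverse=True)
            population = pool[:pop_size]
        return population[0]

    # Toy usage: bandwidth samples quantised into 8 discrete levels.
    train_obs = rng.integers(0, 8, size=500)
    pi, A, B = ep_train(train_obs)
    # Flag a new window as anomalous when its per-symbol log-likelihood falls
    # well below that of the training data (margin chosen arbitrarily here).
    threshold = forward_loglik(pi, A, B, train_obs) / len(train_obs) - 2.0
    window = rng.integers(0, 8, size=50)
    print("anomalous" if forward_loglik(pi, A, B, window) / len(window) < threshold else "normal")

In a real deployment the fitness would be evaluated on recorded normal traffic, and the detection threshold would be calibrated on held-out data rather than set by an arbitrary margin.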
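The time-aware, bivariate variant mentioned in the abstract can be sketched just as simply. One possible (assumed, not the paper's) way to feed both bandwidth and time to a discrete HMM is to combine the quantised bandwidth level with a time-of-day bin into a single composite symbol, so that the same traffic level produces different symbols during working hours and at night.

    # Hypothetical bivariate encoding: bandwidth level 0-7 plus time-of-day
    # bin 0-3 (night, morning, afternoon, evening) -> one symbol in 0-31.
    def encode(bandwidth_level, time_bin, n_time_bins=4):
        return bandwidth_level * n_time_bins + time_bin

With such an encoding, a heavy load at night maps to a different symbol than the same load at midday, so a model trained on normal traffic assigns the nighttime pattern a low likelihood and can flag it as anomalous.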