The development of computer network technologies has increased the importance of educating network professionals and has led to extensive research in the field of computer science education. Virtual networking laboratories have become very popular in recent years, especially for the training and education of network professionals. These laboratories are emulated computer network environments based on virtualization technology, and they have a number of advantages compared to classical laboratories. Despite the large number of laboratory platforms that have been developed, there have been few attempts to evaluate these systems and to determine how effective they are. This paper presents an approach to the evaluation of a virtual networking laboratory, primarily through the evaluation of its exercises as used in a university computer-networking course. We employ an approach based on the indiscernibility relation and rough set theory in order to produce a model based on if-then rules, starting from low-level data. This model helps to identify the most important parts of, and possible mistakes in, the evaluated exercises, creating the possibility for their improvement as well as for better student achievement.
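The abstract's central tool is the indiscernibility relation of rough set theory, which groups objects that cannot be told apart on a chosen set of attributes. The following minimal Python sketch illustrates that idea on a hypothetical decision table of exercise records; the attribute names ("difficulty", "time", "score") and the data are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of the indiscernibility relation from rough set theory.
# Objects with identical values on the chosen attributes fall into the same class.
from collections import defaultdict

def indiscernibility_classes(table, attributes):
    """Group object ids that are indiscernible on the given attributes."""
    classes = defaultdict(list)
    for obj_id, row in table.items():
        key = tuple(row[a] for a in attributes)
        classes[key].append(obj_id)
    return list(classes.values())

# Hypothetical low-level data: each object is one student's record for an exercise.
table = {
    1: {"difficulty": "high", "time": "long",  "score": "pass"},
    2: {"difficulty": "high", "time": "long",  "score": "fail"},
    3: {"difficulty": "low",  "time": "short", "score": "pass"},
    4: {"difficulty": "low",  "time": "short", "score": "pass"},
}

# Objects 1 and 2 (and 3 and 4) are indiscernible with respect to these attributes,
# which is the starting point for deriving if-then rules from such a table.
print(indiscernibility_classes(table, ["difficulty", "time"]))
```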
The preprocessing of data is an important task in rough set theory as well as in entropy-based methods. Discretization, as part of data preprocessing, is a highly influential step. Is there a connection between the segmentation of a data histogram and data discretization? The authors propose a novel data segmentation technique based on the histogram with regard to the quality of data discretization. The significance of a cut's position has been studied on several groups of histograms. The reduct of a data set was examined with respect to the histogram type, and connections between the data histograms and the cuts, the reduct, and the classification rules have been investigated. The result is that reduct attributes have a more irregular histogram than attributes outside the reduct. The following discretization algorithms were used: the entropy algorithm and the Maximal Discernibility (MD) algorithm developed in rough set theory. This article presents the Cuts Selection Method based on histogram segmentation, the reduct of the data, and the MD discretization algorithm. An application to the selected database shows that the benefit of cut selection relies on histogram segmentation. The classification results were compared with those of the Naïve Bayes algorithm.
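To make the entropy-based side of the comparison concrete, the sketch below selects a single cut for one numeric attribute by minimizing the weighted entropy of the resulting split. This is only an illustration of entropy-driven cut selection in general, not the specific Cuts Selection Method or MD algorithm of the article; the toy values and labels are assumptions.

```python
# Minimal sketch: pick the cut on one numeric attribute that minimizes
# the class-label entropy of the two resulting intervals.
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    n = len(labels)
    if n == 0:
        return 0.0
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def best_cut(values, labels):
    """Return the midpoint cut with the lowest weighted entropy."""
    pairs = sorted(zip(values, labels))
    best, best_score = None, float("inf")
    for i in range(1, len(pairs)):
        cut = (pairs[i - 1][0] + pairs[i][0]) / 2
        left = [l for v, l in pairs if v <= cut]
        right = [l for v, l in pairs if v > cut]
        score = (len(left) * entropy(left) + len(right) * entropy(right)) / len(pairs)
        if score < best_score:
            best, best_score = cut, score
    return best

# Toy attribute whose histogram has two clusters; the best cut falls between them.
values = [1.0, 1.2, 1.4, 5.0, 5.3, 5.6]
labels = ["a", "a", "a", "b", "b", "b"]
print(best_cut(values, labels))  # a cut near 3.2 cleanly separates the two classes
```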