Neural networks are massively parallel processing systems that require expensive and usually unavailable hardware to be realized. Fortunately, the development of effective and accessible software makes their simulation straightforward. Various neural network implementation tools exist on the market, but they are tied to the specific learning algorithm used and can simulate only fixed-size networks. In this work, we present object-oriented techniques that have been used to define several types of neuron and network objects, which can be used to realize, in a localized approach, fast and powerful learning algorithms that combine results from optimal filtering and multi-model partitioning theory. With these objects, one can build and implement intelligent learning algorithms that address both the training and the on-line adjustment of the network size. Furthermore, the design methodology used results in a system modeled as a collection of concurrently executable objects, which makes parallel implementation easy. The overall design yields a general-purpose toolbox characterized by maintainability, reusability, and increased modularity. The provided features are demonstrated through the presentation of some practical applications.
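To illustrate the kind of object structure the abstract describes, the sketch below shows one possible way to model neurons and networks as objects in C++, with each neuron holding its own state (supporting a localized learning step) and the network holding a resizable collection of neurons (supporting on-line adjustment of the network size). The class and member names (`Neuron`, `Network`, `addNeuron`, and so on) are hypothetical and not taken from the paper, and the simple gradient-style update stands in for the paper's actual algorithms, which combine optimal filtering with multi-model partitioning theory.

```cpp
// Illustrative sketch only: names and the learning rule are assumptions,
// not the paper's actual design.
#include <cstddef>
#include <numeric>
#include <vector>

// A neuron object keeps its own weights and performs its update locally,
// so a collection of neurons can be executed concurrently.
class Neuron {
public:
    explicit Neuron(std::size_t inputs) : weights_(inputs, 0.0) {}

    double activate(const std::vector<double>& x) const {
        return std::inner_product(weights_.begin(), weights_.end(),
                                  x.begin(), 0.0);
    }

    // Localized learning step: each neuron adjusts only its own weights.
    void adapt(const std::vector<double>& x, double error, double rate) {
        for (std::size_t i = 0; i < weights_.size(); ++i)
            weights_[i] += rate * error * x[i];
    }

private:
    std::vector<double> weights_;
};

// A network object is a resizable collection of neuron objects, so the
// network size can be adjusted on-line by adding or removing neurons.
class Network {
public:
    void addNeuron(std::size_t inputs) { neurons_.emplace_back(inputs); }
    void removeNeuron() { if (!neurons_.empty()) neurons_.pop_back(); }

    std::vector<double> forward(const std::vector<double>& x) const {
        std::vector<double> out;
        out.reserve(neurons_.size());
        for (const auto& n : neurons_) out.push_back(n.activate(x));
        return out;
    }

private:
    std::vector<Neuron> neurons_;
};
```

Because each neuron encapsulates its own weights and update rule, the objects in such a design have no shared mutable state beyond their inputs, which is what makes a collection of them straightforward to execute concurrently, as the abstract notes.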