Proceedings 11th International Workshop on Database and Expert Systems Applications
DOI: 10.1109/dexa.2000.875094

Optimizing the parSOM neural network implementation for data mining with distributed memory systems and cluster computing

Abstract: The self-organizing map is a prominent unsupervised neural network model which lends itself to the analysis of high-dimensional input data and data mining applications. However, the high execution times required to train the map limit its application in many high-performance data analysis domains. In this paper we discuss the parSOM implementation, a software-based parallel implementation of the self-organizing map, and its optimization for the analysis of high-dimensional input data using di…
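The abstract's cost argument is easy to see in code: every training step scans all map units in the full input dimensionality. The following minimal sequential SOM loop is a sketch in NumPy; all names and parameters are illustrative and not taken from the parSOM sources.

    import numpy as np

    def train_som(data, rows=10, cols=10, n_iter=10_000, lr0=0.5, sigma0=3.0, seed=0):
        # Minimal sequential SOM training loop (illustrative sketch only).
        # The winner search touches every unit in full input dimensionality
        # on every step -- the bottleneck that motivates parSOM.
        rng = np.random.default_rng(seed)
        dim = data.shape[1]
        weights = rng.random((rows * cols, dim))        # codebook vectors
        grid = np.array([(r, c) for r in range(rows)
                         for c in range(cols)], dtype=float)
        for t in range(n_iter):
            x = data[rng.integers(len(data))]           # random input vector
            bmu = np.argmin(((weights - x) ** 2).sum(axis=1))  # O(units * dim)
            frac = t / n_iter
            lr = lr0 * np.exp(-frac)                    # decaying learning rate
            sigma = sigma0 * np.exp(-frac)              # shrinking neighborhood
            d2 = ((grid - grid[bmu]) ** 2).sum(axis=1)  # grid distance to BMU
            h = np.exp(-d2 / (2 * sigma ** 2))          # Gaussian neighborhood
            weights += lr * h[:, None] * (x - weights)  # update all units
        return weights.reshape(rows, cols, dim)

For a 100-unit map and 10,000 iterations this already performs a million distance computations, each linear in the input dimensionality, which is why high-dimensional data mining workloads hit the limit the abstract describes.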

Cited by 8 publications (7 citation statements). References 7 publications.
“…(a) the network partitioning schemes are more suitable for a multiprocessor environment, where the communication overhead may be greatly reduced thanks to the high speed of the common bus shared by the processors, as shown in [16,30,31]; (b) the data partitioning schemes become appropriate when the parallel SOM is executed in loosely coupled systems, as shown in [32] and in [33], although the proposed implementations exploit only 60%-80% of the overall speed-up because the communication overhead grows as the number of CEs increases [34].…”
Section: Parallel SOMs
confidence: 99%
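The trade-off the quoted statement describes can be made concrete with a data-partitioning sketch: each computing element (CE) scans only its own shard of the input data against a replicated codebook, so only small partial sums cross the (possibly slow) interconnect once per batch. This is a generic batch-SOM formulation using Python's multiprocessing, not the implementation of any of the cited systems; all names are illustrative.

    import numpy as np
    from multiprocessing import Pool

    def _partial_sums(args):
        # One CE's share of a batch step: scan only the local data shard
        # against the full, replicated codebook.
        shard, weights, h = args        # h: (units x units) neighborhood matrix
        num = np.zeros_like(weights)
        den = np.zeros(len(weights))
        for x in shard:
            bmu = np.argmin(((weights - x) ** 2).sum(axis=1))
            num += h[bmu][:, None] * x  # weighted contribution of x
            den += h[bmu]
        return num, den

    def batch_som_step(data, weights, h, n_workers=4):
        # One synchronous batch update. The per-shard partial sums are the
        # only inter-CE traffic, which is why data partitioning tolerates
        # the slower links of loosely coupled systems.
        shards = np.array_split(data, n_workers)
        with Pool(n_workers) as pool:
            parts = pool.map(_partial_sums, [(s, weights, h) for s in shards])
        num = sum(p[0] for p in parts)
        den = sum(p[1] for p in parts)
        return num / np.maximum(den, 1e-12)[:, None]    # new codebook

Network partitioning inverts this picture: the codebook, not the data, is split across processors, so every input vector triggers communication, which is cheap on a shared bus but expensive over a cluster interconnect, matching the (a)/(b) distinction in the quotation.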
“…P2 = 0.30. The results for the dataset, which will also be used in the case study, are reported in Table 9, where the items were clustered following the F1 similarity by the WR WM PSOM executed by 9 slaves. It is important to note that the positive feature matching between the sequential and the parallel SOM is, on average, very high (about 90%-94%).…”
Section: Feature Analysis
confidence: 99%
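The quoted study does not spell out its matching measure here, but one plausible pair-counting reading of "positive feature matching" can be sketched: count the item pairs the sequential map clusters together, then check how many of them the parallel map also clusters together. The function below is a hypothetical illustration of that reading, not the measure actually used in the cited paper.

    import numpy as np

    def positive_pair_matching(labels_seq, labels_par):
        # Fraction of item pairs co-clustered by the sequential SOM that
        # the parallel SOM also co-clusters (hypothetical reading of
        # "positive feature matching").
        labels_seq = np.asarray(labels_seq)
        labels_par = np.asarray(labels_par)
        same_seq = labels_seq[:, None] == labels_seq[None, :]
        same_par = labels_par[:, None] == labels_par[None, :]
        iu = np.triu_indices(len(labels_seq), k=1)  # each unordered pair once
        positives = same_seq[iu]
        return (positives & same_par[iu]).sum() / positives.sum()

Under this reading, the reported 90%-94% would mean the parallel map preserves the vast majority of the sequential map's cluster co-memberships.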
“…These wide varieties of examples demonstrate that considerable effort has been placed on applying these multivariate tools. Optimizations to parSOM, a software-based parallel implementation of the SOM, were first introduced by Tomsich et al. (2000), providing better performance than other implementation attempts such as the one reported in Boniface et al. (1999).…”
Section: Introduction
confidence: 99%
“…A parallel neural network can be constructed using a variety of different methods (Standish, 1999; Schikuta, 1997; Serbedzija, 1996; Schikuta et al., 2000; Misra, 1992; Misra, 1997), such as parallel virtual machines (PVM) (Quoy, 2000), the message passing interface (MPI) (Snir et al., 1998; Gropp et al., 1998; Pacheco, 1997), the shared memory model, and implicit parallelization with parallel compiler directives. Concerning the network types that have been parallelized by one of these methods, they cover a very broad range, from the supervised back-propagation network (Torresen et al., 1994; Torresen and Tomita, 1998; Kumar, 1994) to the unsupervised self-organizing maps (Weigang et al., 1999; Tomsich et al., 2000). In this research the counterpropagation network is parallelized by means of the message passing interface library (Pacheco, 1997).…”
Section: Introduction
confidence: 99%
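Since several of the works quoted above parallelize neural network training via MPI, a minimal message-passing sketch may help. The fragment below uses mpi4py (an assumption: the cited implementations use C/Fortran MPI, and every name here is illustrative) to run one synchronous, winner-only batch epoch with a replicated codebook: each rank trains on its own data shard, and a single collective reduction per epoch replaces the shared memory of a multiprocessor.

    import numpy as np
    from mpi4py import MPI            # assumes mpi4py is installed

    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()

    units, dim = 100, 16
    # Rank 0 initializes the codebook and broadcasts it so every
    # replica starts identical.
    weights = np.random.rand(units, dim) if rank == 0 else np.empty((units, dim))
    comm.Bcast(weights, root=0)

    local_data = np.random.rand(250, dim)   # stand-in for this rank's shard

    local_num = np.zeros_like(weights)
    local_den = np.zeros(units)
    for x in local_data:                    # simplified winner-only updates
        bmu = int(np.argmin(((weights - x) ** 2).sum(axis=1)))
        local_num[bmu] += x
        local_den[bmu] += 1.0

    # One reduction per epoch is the entire inter-node communication cost.
    num = np.zeros_like(local_num)
    den = np.zeros_like(local_den)
    comm.Allreduce(local_num, num, op=MPI.SUM)
    comm.Allreduce(local_den, den, op=MPI.SUM)
    hit = den > 0
    weights[hit] = num[hit] / den[hit][:, None]
    if rank == 0:
        print("epoch done; codebook synchronized across", size, "ranks")

Run with, e.g., mpirun -n 4 python som_mpi_sketch.py. The same communication pattern carries over to other network types, including the counterpropagation network mentioned in the quotation.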