The vector components of the winning node w_k, the node with minimum distance D_j, are then updated as follows, where η is the learning rate:

w_k(t+1) = w_k(t) + η(t) [x(t) − w_k(t)]

The topological ordering property is imposed by also updating the weight vectors of nodes in the neighbourhood of the winning node. This is achieved by the following learning rule:

w_j(t+1) = w_j(t) + η(t) N_j(t) [x(t) − w_j(t)]

where N_j is a neighbourhood function (defining the region around w_k) based on the topological displacement of each neighbouring neuron from the winning neuron. The size of N_j decreases as training progresses.

In the vast majority of implementations, the SOM inputs and neuron weights are represented by real numbers, making the SOM difficult to implement on a hardware architecture such as the Field Programmable Gate Array (FPGA). However, in many applications the data is either presented as a binary string or may be conveniently recoded as one (a "binary signature"). For example, in image processing applications a bank of Haar filters produces a long binary signature. In this paper we present a new learning algorithm that takes binary inputs and maintains tri-state weights in the SOM. We also present the FPGA implementation of this binary Self-Organizing Map (bSOM). The bSOM is designed for efficient hardware implementation, having both a greatly reduced circuit size compared to a real-valued SOM and exceptionally fast execution and training times.

In Section II we review previous hardware implementations of the SOM. The novel bSOM algorithm is then presented in Section III, followed by its FPGA implementation in Section IV. Section V presents the experimental results in software and hardware, and we conclude in Section VI.

During training, the neuron whose prototype vector is "nearest" to the input vector is identified using a distance metric D; this is called the "winning" neuron.
The Euclidean distance is the most frequently used metric. For a given network with M neurons and an N-dimensional input vector x, the distance for the neuron with weight vector w_j (j < M) is given by

D_j = ||x − w_j|| = sqrt( Σ_{i=1}^{N} (x_i − w_{ji})^2 )
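To make the training step concrete, the following is a minimal sketch of one iteration of the standard (real-valued) SOM described above: computing the Euclidean distances D_j, selecting the winning neuron, and updating the winner and its neighbours. The function name `som_step` and the choice of a Gaussian neighbourhood function are our illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def som_step(weights, grid, x, eta, sigma):
    """One training step of an M-neuron SOM.

    weights : (M, N) array of prototype vectors w_j
    grid    : (M, 2) array of neuron coordinates on the map lattice
    x       : (N,) input vector
    eta     : learning rate (eta in the text)
    sigma   : neighbourhood radius, shrunk as training progresses
    """
    # Euclidean distance D_j from the input to every prototype vector.
    d = np.linalg.norm(weights - x, axis=1)
    k = int(np.argmin(d))  # index of the winning neuron w_k
    # Neighbourhood function N_j, here Gaussian in the lattice
    # displacement of each neuron from the winner.
    lattice_d2 = np.sum((grid - grid[k]) ** 2, axis=1)
    nj = np.exp(-lattice_d2 / (2.0 * sigma ** 2))
    # Move every prototype towards x, scaled by eta and N_j;
    # N_k = 1, so the winner receives the full update.
    weights += eta * nj[:, None] * (x - weights)
    return k
```

Decreasing `sigma` (and typically `eta`) over successive iterations reproduces the shrinking neighbourhood described in the text.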