1999
DOI: 10.1109/72.774275

Circular backpropagation networks embed vector quantization

Abstract: This letter proves the equivalence between vector quantization (VQ) classifiers and circular backpropagation (CBP) networks. The calibrated prototypes for a VQ schema can be plugged into a CBP feedforward structure that has the same number of hidden neurons and realizes the same mapping. The letter describes how to exploit this equivalence by using VQ prototypes to perform a meaningful initialization for BP optimization. The approach's effectiveness was tested on a real classification problem (NIST handwri…
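The equivalence the abstract claims rests on a simple identity: the squared distance to a prototype c becomes a linear function of the CBP-augmented input [x, ||x||²], since -||x - c||² = 2c·x - ||x||² - ||c||². A minimal NumPy sketch of that mapping follows; the helper name `vq_to_linear` is illustrative, not the paper's notation or exact construction:

```python
import numpy as np

def vq_to_linear(prototypes):
    # For each prototype c, build a linear unit over [x, ||x||^2] whose
    # activation equals -||x - c||^2: weights [2c, -1], bias -||c||^2.
    W = np.hstack([2 * prototypes, -np.ones((len(prototypes), 1))])
    b = -np.sum(prototypes ** 2, axis=1)
    return W, b

prototypes = np.array([[0.0, 0.0], [4.0, 4.0]])
W, b = vq_to_linear(prototypes)
x = np.array([1.0, 1.0])
aug = np.append(x, x @ x)      # CBP augmentation: append ||x||^2
acts = W @ aug + b             # acts[i] == -||x - prototypes[i]||^2
print(np.argmax(acts))         # index of the nearest prototype -> 0
```

The unit with the largest activation is the nearest prototype, so a nearest-prototype (VQ) decision reduces to a winner-take-all over linear units in the augmented space, which is what lets VQ prototypes initialize a CBP network.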

Cited by 21 publications (16 citation statements)
References 8 publications
“…Sandro Ridella et al. [25][26][27] proposed the CBP neural network by adding an extra node to the original BP input layer and taking the sum of all squared components of an input vector presented to the network as the incoming signal of the added node. The authors proved that CBP possesses favorable generalization and adaptability capabilities compared to the multi-layer perceptron (MLP) model [25][26][27]. Under the CBP framework, both vector quantization (VQ) [26,31] and radial basis function (RBF) networks [27] can be constructed, so CBP shows great flexibility.…”
Section: ICBP Neural Network Model
confidence: 99%
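The circular augmentation these citing papers describe — one extra input node carrying the sum of squared input components — can be sketched in a few lines. This is a minimal NumPy illustration; the function name is my own, not from the cited papers:

```python
import numpy as np

def circular_augment(X):
    """Append the squared norm of each input vector as an extra feature,
    as in circular backpropagation (CBP) input augmentation."""
    sq = np.sum(X ** 2, axis=1, keepdims=True)   # ||x||^2 per row
    return np.hstack([X, sq])

X = np.array([[1.0, 2.0], [3.0, 4.0]])
print(circular_augment(X))
# [[ 1.  2.  5.]
#  [ 3.  4. 25.]]
```

Note the added component is isotropic (equally weighted across input dimensions), which is exactly the limitation the ICBP discussion below raises.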
“…The authors proved that CBP possesses favorable generalization and adaptability capabilities compared to the multi-layer perceptron (MLP) model [25][26][27]. Under the CBP framework, both vector quantization (VQ) [26,31] and radial basis function (RBF) networks [27] can be constructed, so CBP shows great flexibility. However, several deficiencies exist: (1) the incoming signal of the added node is only an isotropic (i.e., equally weighted) sum of all squared component values, so it lacks anisotropy among the components of an input vector; (2) due to this isotropy, it cannot simulate the well-known Bayesian classifier in a direct way; (3) it probably requires more hidden nodes to approximate a continuous function to arbitrary precision.…”
Section: ICBP Neural Network Model
confidence: 99%
“…Based on EBP, Sandro Ridella et al. [2][3][4] proposed the circular back-propagation (CBP) neural network by augmenting the original back-propagation (BP) input layer with an extra node and taking the sum of all squared components of an input vector presented to the network as the incoming signal of the added node. The authors proved that CBP possesses favorable generalization and adaptability capabilities compared with the MLP model [2][3][4]. Under the CBP framework, both vector quantization (VQ) [4,5] and radial basis function (RBF) networks [3] can be constructed, hence CBP shows great flexibility.…”
Section: Introduction
confidence: 99%
“…On this basis, Sandro Ridella and Stefano Rovetta presented the circular back-propagation (CBP) network. CBP adds to its input layer an extra node whose input is the sum of the squared input components [1,2]. With the same architecture as an MLP, CBP has the following merits: (1) for pattern recognition, CBP can switch automatically between prototype-based and surface-based classifiers; (2) CBP includes the RBF network as a special case.…”
Section: Introduction
confidence: 99%