Proceedings of International Conference on Neural Networks (ICNN'96)
DOI: 10.1109/icnn.1996.549207
Comparison of multilayer and radial basis function neural networks for text-dependent speaker recognition

Abstract: This paper compares the use of multilayer perceptrons (MLPs) trained with back-propagation and radial basis function (RBF) neural networks for the task of text-dependent speaker recognition. Ten classifier networks were generated for each of 20 male speakers using randomly generated training sets consisting of 6 true-speaker utterances and 19 false-speaker utterances (one from each of the false speakers). The resulting networks were then used to assess verification and identification performance for each of the n…
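The training-set protocol in the abstract (per target speaker: 6 randomly chosen true-speaker utterances plus one utterance from each of the 19 impostor speakers) can be sketched as below. This is a minimal illustration, not code from the paper; the data layout (a dict mapping speaker id to a list of utterance feature vectors) and the function name are assumptions.

```python
import random

def build_training_set(speaker_id, utterances, n_true=6, rng=None):
    """Assemble one training set following the abstract's protocol:
    n_true randomly sampled utterances from the target speaker (label 1)
    plus one randomly chosen utterance from every other speaker (label 0).

    `utterances` is a hypothetical dict: speaker id -> list of utterances.
    """
    rng = rng or random.Random()
    # Positive examples: sample without replacement from the target speaker.
    true_samples = [(u, 1) for u in rng.sample(utterances[speaker_id], n_true)]
    # Negative examples: one utterance from each impostor speaker.
    false_samples = [(rng.choice(utts), 0)
                     for sid, utts in utterances.items() if sid != speaker_id]
    data = true_samples + false_samples
    rng.shuffle(data)  # avoid presenting all positives first during training
    return data
```

With 20 speakers this yields 25 labelled examples per set; repeating the call 10 times with fresh random draws gives the 10 classifier networks per speaker described in the abstract.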

Cited by 19 publications (8 citation statements)
References 8 publications
“…From a comparison of the performance of the MLP network with that of the GRBF network, the GRBF network is found to be superior to the MLP network [15][16][17][18]. However, this cannot be taken as a general result, since different outcomes are reported depending on the application and the target system.…”
Section: Network Methodology
confidence: 98%
“…In particular, it represents the nonlinearity by the mapping from the input vector to the basis functions via a kernel, and the linearity by the weights between the basis-function outputs and the model output. Consequently, the RBF model is known to perform similarly to the MLP and other nonlinear neural networks while being computationally faster than them [9][10]. The Gaussian RBF model is the most popular and is represented in the following functional form: for a given input x, the estimate of the output y is …”
Section: RBF Model for Score-Level Fusion
confidence: 99%
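The Gaussian RBF functional form referred to in the statement above (the formula itself is truncated in the quote) is conventionally y(x) = b + Σᵢ wᵢ · exp(−‖x − cᵢ‖² / (2σᵢ²)): a linear combination, via the output weights, of Gaussian basis functions centred at cᵢ with widths σᵢ. A minimal sketch of that forward pass, assuming the standard formulation rather than the citing paper's exact notation:

```python
import math

def gaussian_rbf_predict(x, centers, widths, weights, bias=0.0):
    """Standard Gaussian RBF model:
    y(x) = bias + sum_i weights[i] * exp(-||x - centers[i]||^2 / (2 * widths[i]^2))

    The nonlinearity lives entirely in the basis functions; the output
    is linear in `weights`, which is what makes RBF training fast.
    """
    y = bias
    for c, sigma, w in zip(centers, widths, weights):
        # Squared Euclidean distance from the input to this centre.
        d2 = sum((xi - ci) ** 2 for xi, ci in zip(x, c))
        y += w * math.exp(-d2 / (2.0 * sigma ** 2))
    return y
```

Because the model is linear in the output weights once the centres and widths are fixed, those weights can be fitted by ordinary least squares, which is the computational advantage over the MLP that the statement alludes to.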
“…In our framework, where the prediction of a continuous function is required, two kinds of NNs with no feedback loops are used, and their performance is tested on the negotiation space: the multilayer perceptron (MLP) and the radial basis function (RBF) network, as the most appropriate for on-line function approximation [13]. The comparison of these two NN architectures has already been studied in various areas of research, such as dynamic systems [14], channel equalization in signal processing [15], voice recognition [16], and wherever efficient, stable, low-resource real-time estimation is required (given that the utilized NNs have a small number of nodes). Note here that other optimization techniques, such as genetic algorithms, automated learning automata, and simulated annealing, although accurate in reaching the global minimum, require extensive computation and considerable resources [17], which is why they are inappropriate for our area of interest, where resources and computational complexity should be kept minimal.…”
Section: MLP and RBF Neural Network
confidence: 99%