Proceedings of 1993 International Conference on Neural Networks (IJCNN-93-Nagoya, Japan)
DOI: 10.1109/ijcnn.1993.713981
Fast training of backpropagation networks employing threshold logic transform

Abstract: A backpropagation network model employing a Threshold Logic Transform (TLT) for faster training is proposed. The TLT, a simple mathematical transformation, is inserted between the input layer and the hidden layer to speed up extraction of complex features of the input. A study of three classification tasks comparing the conventional method against the TLT-augmented network showed that with the TLT, convergence is 5 to 20 times faster. The rate of convergence is also much greater: the conventional method achieves 33.3%, whereas…
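The abstract does not define the Threshold Logic Transform itself, only that it is a simple fixed transformation placed between the input and hidden layers. As a minimal sketch of one plausible reading — each input component compared against a fixed grid of thresholds to yield binary, threshold-logic features — the following is an illustrative assumption, not the paper's actual definition; the function name and threshold grid are hypothetical.

```python
import numpy as np

def threshold_logic_transform(x, thresholds):
    """Expand each input component into binary threshold features.

    x          : (n_inputs,) raw input vector
    thresholds : (n_levels,) fixed threshold grid (an assumption; the
                 paper's exact TLT is not specified in the abstract)
    returns    : (n_inputs * n_levels,) 0/1 feature vector to feed to
                 the hidden layer in place of the raw input
    """
    x = np.asarray(x, dtype=float)
    t = np.asarray(thresholds, dtype=float)
    # Outer comparison: does input component i exceed threshold j?
    return (x[:, None] >= t[None, :]).astype(float).ravel()

# A 2-component input expanded against 3 thresholds gives 6 binary features.
features = threshold_logic_transform([0.2, 0.8], thresholds=[0.25, 0.5, 0.75])
```

Under this reading, the transform pre-extracts coarse level-crossing features so the hidden layer starts from a richer, partly linearized representation, which is consistent with the claimed speed-up in feature extraction.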

Cited by 0 publications · References 1 publication