1993
DOI: 10.1117/12.152645
<title>Training autoassociative recurrent neural network with preprocessed training data</title>

Abstract: The Auto-Associative Recurrent Network (AARN), a modified version of the Simple Recurrent Network (SRN), can be trained to behave as a recognizer of a language generated by a regular grammar. The network is trained successfully on an unbounded number of sequences of the language, generated randomly from the Finite State Automaton (FSA) of the language. However, the training algorithm fails when training is restricted to a fixed finite set of examples. Here, we present a new algorithm for training the AARN from a finite…
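The setup the abstract describes — drawing an unbounded stream of training sequences by random walks over the language's FSA — can be sketched as follows. This is a minimal illustration, not the paper's method: the particular grammar, symbol set, and terminator symbol `#` are assumptions for the example, not taken from the paper.

```python
import random

# Hypothetical regular grammar as an FSA: each state maps to a list of
# (symbol, next_state) transitions. Reaching next_state None ends the string.
FSA = {
    0: [("a", 1), ("b", 0)],
    1: [("b", 2), ("a", 1)],
    2: [("#", None)],  # '#' terminates the string; state 2 is accepting
}

def generate(rng):
    """Randomly walk the FSA to produce one legal training sequence."""
    state, out = 0, []
    while state is not None:
        sym, state = rng.choice(FSA[state])
        out.append(sym)
    return "".join(out)

def accepts(s):
    """Membership test: follow transitions, rejecting any illegal symbol."""
    state = 0
    for ch in s:
        trans = dict(FSA.get(state, []))  # {} once the string has terminated
        if ch not in trans:
            return False
        state = trans[ch]
    return state is None  # accepted only if the terminator was consumed

rng = random.Random(0)
samples = [generate(rng) for _ in range(5)]  # unbounded stream of examples
```

A recognizer network such as the AARN would be trained on `generate`'s output symbol by symbol; the paper's contribution concerns the harder case where training is restricted to a fixed finite subset of such strings rather than this unbounded stream.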

Cited by 1 publication (1 citation statement)
References 6 publications
“…Other research in intention recognition investigated the use of grammar-parsing methodology to recognize behavior as matching previously defined sequences of events [Clark, 1994; Wang & Arbib, 1993], while others investigated neural networks to do the same [Maskara & Noetzel, 1993]. Wang & Arbib's model, however, required that the complete pattern be presented before pattern recognition would occur.…”
Section: Previous Work
Confidence: 99%