1995
DOI: 10.1007/3-540-60607-6_4
Recurrent fuzzy logic using neural network

Cited by 11 publications (11 citation statements) · References 5 publications
“…A variety of implementations of FFA have been proposed, some in digital systems [39], [40]. However, here we give a proof that such implementations in sigmoid activation RNN's are stable, i.e., guaranteed to converge to the correct prespecified membership.…”
Section: B. Background
confidence: 87%
“…There has been much work on the learning, synthesis, and extraction of finite state automata in recurrent neural networks (see, for example, [46]- [53]). A variety of neural network implementations of FFA have been proposed [39], [40], [54], [55]. We have previously shown how fuzzy finite state automata can be mapped into recurrent neural networks with second-order weights using a crisp representation of FFA states [56].…”
Section: B. Background
confidence: 99%
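The crisp second-order encoding described in the excerpt above can be illustrated with a small sketch. Everything below (the toy two-state automaton, the ±1 weight programming, the gain `H`) is an illustrative assumption, not the construction from the cited paper [56]:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def second_order_step(W, state, symbol, H=8.0):
    """One second-order RNN step: S_i(t+1) = sigmoid(H * sum_{j,k} W[i,j,k] S_j(t) I_k(t))."""
    return sigmoid(H * np.einsum('ijk,j,k->i', W, state, symbol))

# Hypothetical 2-state automaton: symbol a1 toggles the state, a0 keeps it.
# Crisp programming of transitions: W[i,j,k] = +1 if delta(q_j, a_k) = q_i, else -1.
W = -np.ones((2, 2, 2))
W[0, 0, 0] = 1.0  # delta(q0, a0) = q0
W[1, 0, 1] = 1.0  # delta(q0, a1) = q1
W[1, 1, 0] = 1.0  # delta(q1, a0) = q1
W[0, 1, 1] = 1.0  # delta(q1, a1) = q0
```

With a one-hot state `[1, 0]` (automaton in q0) and one-hot symbol `[0, 1]` (a1), the step drives neuron 1 close to 1 and neuron 0 close to 0, i.e., the network moves to q1; the stability result mentioned above concerns such activations staying near their programmed values over many steps.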
“…Two examples of such a synergy are the adaptive network-based fuzzy inference system (ANFIS) [22], which is a feedforward network representation of the fuzzy reasoning process (see [34] for an extension to RNNs), and the fuzzy-MLP, which is a feedforward network with fuzzified inputs [40], [46]. Neuro-fuzzy systems have been intensively discussed in the literature (see the survey [39]), but rarely do they contain feedback connections [23].…”
Section: Introduction
confidence: 99%
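The ANFIS-style feedforward fuzzy reasoning mentioned in the excerpt above can be sketched as a first-order Sugeno network. This is a minimal assumed layout (two Gaussian membership functions per input, four rules, product T-norm), not the architecture of [22] itself:

```python
import numpy as np

def gaussian_mf(x, centers, sigmas):
    # Layer 1: Gaussian membership degrees of scalar x in each fuzzy set
    return np.exp(-0.5 * ((x - centers) / sigmas) ** 2)

def anfis_forward(x1, x2, centers, sigmas, consequents):
    """One forward pass of a first-order Sugeno fuzzy inference network (sketch)."""
    mu1 = gaussian_mf(x1, centers[0], sigmas[0])   # memberships A1, A2
    mu2 = gaussian_mf(x2, centers[1], sigmas[1])   # memberships B1, B2
    w = np.outer(mu1, mu2).ravel()                 # Layer 2: rule firing strengths (product)
    wn = w / w.sum()                               # Layer 3: normalized firing strengths
    f = consequents @ np.array([x1, x2, 1.0])      # Layer 4: linear rule consequents
    return wn @ f                                  # Layer 5: weighted sum output
```

Because the output is a convex combination of the rule consequents, it always lies between the smallest and largest per-rule value; the recurrent extension cited ([34]) would additionally feed the output (or internal state) back into the inputs.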
“…Parameter tuning concerns adjustments of the position and the shape of membership functions (fuzzy weights). Other existing feedback fuzzy neural approaches (e.g., [18,26]) concentrated on parameter tuning, selecting the structure on a trial-and-error basis. However, there are no simple ways to determine in advance the minimal size of the partition of the input's universe of discourse or the minimal size of the hidden (rule) layer necessary to achieve the desired performance.…”
Section: Introduction
confidence: 99%
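The parameter tuning described in the excerpt above (adjusting the position and shape of membership functions) can be sketched as gradient descent on a single Gaussian membership function. The learning rate, step count, and target value are illustrative assumptions, not parameters from the cited approaches:

```python
import numpy as np

def tune_gaussian_mf(x, target, c, s, lr=0.1, steps=500):
    """Adjust center c (position) and width s (shape) so that mu(x) approaches target."""
    for _ in range(steps):
        mu = np.exp(-0.5 * ((x - c) / s) ** 2)
        err = mu - target                 # squared-error gradient factor
        dmu_dc = mu * (x - c) / s**2      # d mu / d c
        dmu_ds = mu * (x - c) ** 2 / s**3 # d mu / d s
        c -= lr * err * dmu_dc
        s -= lr * err * dmu_ds
        s = max(s, 1e-3)                  # keep the width positive
    return c, s
```

Structure selection, by contrast, would mean choosing how many such membership functions partition each input's universe of discourse and how many rules the hidden layer holds, which, as the excerpt notes, has no simple a priori answer.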