Proceedings. 2005 IEEE International Joint Conference on Neural Networks, 2005.
DOI: 10.1109/ijcnn.2005.1556106

Dynamical consistent recurrent neural networks

Cited by 4 publications (4 citation statements)
References 6 publications
“…The outliers with a normalized error larger than one are due to sensors with an almost constant sensor value whose variation is mainly due to noise. This result indicates that there is even more room for duty cycle reduction for this example: Listening to the neighbors only every fourth interval yields a duty cycle reduction of (k + 4)/(4k + 4); with 4 neighbors this gives a reduction factor of 0.4, with 9 neighbors it gives 0.325 and the large-k limit is 0.25.…”
Section: Duty Cycle Adaptation In Wireless Sensor Network
confidence: 86%
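
A minimal sketch of the reduction-factor arithmetic quoted above. The closed form (k + 4)/(4k + 4) for k neighbors is an assumption reconstructed from the stated values (0.4 for 4 neighbors, 0.325 for 9, large-k limit 0.25); the function name and script are illustrative only.

    def duty_cycle_reduction(k: int) -> float:
        """Reduction factor when listening to k neighbors only every fourth interval.

        Assumed form (k + 4) / (4k + 4), reconstructed from the quoted values.
        """
        return (k + 4) / (4 * k + 4)

    if __name__ == "__main__":
        print(duty_cycle_reduction(4))      # 0.4
        print(duty_cycle_reduction(9))      # 0.325
        print(duty_cycle_reduction(10**6))  # approaches the large-k limit of 0.25
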
“…Examples can be found in [3] or [4]. However, traditional approaches like Elman or standard recurrent neural networks (SRN) [5], time delay neural networks (TDNN) [6], block-diagonal recurrent neural networks (BDRNN) [7] or echo state neural networks (ESN) [8] have deficiencies with respect to the requirements above.…”
Section: Introduction
confidence: 99%
“…The development of deep neural networks, especially recurrent neural networks (RNNs) with memory ability, provides novel ideas to solve the problems of IMM-based algorithms [6,7,8,9]. The RNN [10] and long short-term memory (LSTM) networks [11] can estimate the state from the observation at each time step [6,12]. Nevertheless, the LSTM and RNN can only process the input sequence sequentially, resulting in long-distance memory fading problems [9].…”
Section: Introduction
confidence: 99%
“…Neural networks, and in particular recurrent neural networks, have proven their suitability at least for offline learning forecast tasks. Examples can be found in [3] or [4]. However, traditional approaches like Elman or standard recurrent neural networks (SRN) [5], time delay neural networks (TDNN) [6], block-diagonal recurrent neural networks (BDRNN) [7] or echo state neural networks (ESN) [8] have deficiencies with respect to the requirements above.…”
Section: Introduction
confidence: 99%