2020
DOI: 10.3390/e22010102

Learning in Feedforward Neural Networks Accelerated by Transfer Entropy

Abstract: Current neural network architectures are often hard to train because of the growing size and complexity of the datasets they use. Our objective is to design more efficient training algorithms by utilizing causal relationships inferred from neural networks. Transfer entropy (TE) was initially introduced as an information-transfer measure used to quantify the statistical coherence between events (time series). It was later related to causality, although the two are not the same. There are only few papers re…
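To make the quantity in the abstract concrete, the following is a minimal sketch of a plug-in transfer entropy estimate for binary time series with history length 1. The function name, the binary restriction, and the first-order history are illustrative assumptions, not the paper's implementation.

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y):
    """Plug-in estimate of T_{Y->X} in bits, history length 1:
    T = sum p(x_{t+1}, x_t, y_t) * log2( p(x_{t+1}|x_t,y_t) / p(x_{t+1}|x_t) )
    """
    n = len(x) - 1
    # Joint counts of (x_{t+1}, x_t, y_t) over the series.
    joint = Counter((x[t + 1], x[t], y[t]) for t in range(n))
    pair_xx = Counter()   # counts of (x_{t+1}, x_t)
    pair_xy = Counter()   # counts of (x_t, y_t)
    single_x = Counter()  # counts of x_t
    for (x1, x0, y0), c in joint.items():
        pair_xx[(x1, x0)] += c
        pair_xy[(x0, y0)] += c
        single_x[x0] += c
    te = 0.0
    for (x1, x0, y0), c in joint.items():
        p_joint = c / n
        p_full = c / pair_xy[(x0, y0)]             # p(x_{t+1} | x_t, y_t)
        p_self = pair_xx[(x1, x0)] / single_x[x0]  # p(x_{t+1} | x_t)
        te += p_joint * np.log2(p_full / p_self)
    return te
```

When the source series strictly drives the target (e.g. `x[t+1] = y[t]` with `y` i.i.d. uniform binary), the estimate approaches 1 bit; for independent series it stays near 0 (the plug-in estimator has a small positive bias on finite samples).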

Cited by 18 publications (15 citation statements); References 26 publications.
“…In our study, we add specific TE feedback connections between neurons to improve performance. Our results confirm what we obtained in our previous study on a simple feedforward neural architecture [ 20 ]. Adding the TE feedback parameter accelerates the training process, as fewer epochs are needed.…”
Section: Conclusion and Open Problems (supporting)
confidence: 92%
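The quoted statement describes a TE feedback parameter that speeds up training. As a purely hypothetical illustration (not the cited paper's actual rule), one way such a parameter could act is as a per-connection scaling of the weight update, where connections with higher transfer entropy between their neurons' activation series take larger steps:

```python
import numpy as np

def te_scaled_update(w, grad, te, lr=0.1):
    """One gradient step where the effective learning rate of each
    weight grows with the TE value assigned to that connection.
    This scaling rule is an assumption for illustration only."""
    return w - lr * (1.0 + te) * grad

w = np.array([0.5, -0.3])
grad = np.array([0.2, 0.1])
te = np.array([0.8, 0.0])   # connection 0 carries more information
w_new = te_scaled_update(w, grad, te)
```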
“…This can be related to the hierarchy of quickly-changing vs. slowly-changing parameters in learning neural causal models [ 45 ]. We observed that the TE feedback generates stability during training, this being compliant with the results presented in [ 20 ].…”
Section: Conclusion and Open Problems (supporting)
confidence: 91%