16 January 2020 19:03:22

 
Entropy, Vol. 22, Pages 102: Learning in Feedforward Neural Networks Accelerated by Transfer Entropy (Entropy)
 


Current neural network architectures are often hard to train because of the increasing size and complexity of the datasets used. Our objective is to design more efficient training algorithms by exploiting causal relationships inferred from neural networks. Transfer entropy (TE) was initially introduced as an information transfer measure used to quantify the statistical coherence between events (time series). It was later related to causality, even though the two concepts are not the same. Only a few papers report applications of causality or TE in neural networks. Our contribution is an information-theoretical method for analyzing information transfer between the nodes of feedforward neural networks. The information transfer is measured by the TE of feedback neural connections. Intuitively, TE measures the relevance of a connection in the network, and the feedback amplifies this connection. We introduce a backpropagation-type training algorithm that uses TE feedback connections to improve its performance.
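The abstract stops short of defining TE. For orientation: the pairwise transfer entropy from a series Y to a series X with history length 1 is TE(Y→X) = Σ p(x_{t+1}, x_t, y_t) · log[ p(x_{t+1} | x_t, y_t) / p(x_{t+1} | x_t) ]. The sketch below is a generic plug-in estimator of that textbook quantity, not the paper's implementation; the function name, quantile binning, and history length of 1 are illustrative assumptions.

```python
import numpy as np
from collections import Counter

def transfer_entropy(source, target, bins=2):
    """Estimate TE(source -> target) in bits, history length 1.

    Generic plug-in estimator (illustrative, not the paper's method):
    both series are discretized into `bins` equal-frequency bins and
    probabilities are estimated from joint counts.
    """
    edges_s = np.quantile(source, np.linspace(0, 1, bins + 1)[1:-1])
    edges_t = np.quantile(target, np.linspace(0, 1, bins + 1)[1:-1])
    s = np.digitize(source, edges_s)
    t = np.digitize(target, edges_t)

    triples = Counter(zip(t[1:], t[:-1], s[:-1]))   # (x_{t+1}, x_t, y_t)
    pairs_xy = Counter(zip(t[:-1], s[:-1]))         # (x_t, y_t)
    pairs_xx = Counter(zip(t[1:], t[:-1]))          # (x_{t+1}, x_t)
    singles = Counter(t[:-1])                       # x_t

    n = len(t) - 1
    te = 0.0
    for (x1, x0, y0), c in triples.items():
        p_joint = c / n
        p_cond_xy = c / pairs_xy[(x0, y0)]               # p(x_{t+1} | x_t, y_t)
        p_cond_x = pairs_xx[(x1, x0)] / singles[x0]      # p(x_{t+1} | x_t)
        te += p_joint * np.log2(p_cond_xy / p_cond_x)
    return te
```

A quick check on synthetic data: when one series drives the other with a one-step lag, TE in the causal direction should dominate.

```python
rng = np.random.default_rng(0)
y = rng.normal(size=5000)
x = np.roll(y, 1) + 0.5 * rng.normal(size=5000)  # x[t] depends on y[t-1]
print(transfer_entropy(y, x))  # noticeably larger than transfer_entropy(x, y)
```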


 
Category: Informatics, Physics
 
Entropy, Vol. 22, Pages 104: Complexity of Cardiotocographic Signals as a Predictor of Labor (Entropy)
Entropy, Vol. 22, Pages 113: Entropy-Based Effect Evaluation of Delineators in Tunnels on Drivers' Gaze Behavior (Entropy)
 
 