
Entropy, Vol. 23, Pages 1218: Learning in Convolutional Neural Networks Accelerated by Transfer Entropy (Entropy)


16 September 2021 08:28:22


Recently, there has been growing interest in applying Transfer Entropy (TE) to quantify the effective connectivity between artificial neurons. In a feedforward network, TE can quantify the relationships between pairs of neuron outputs located in different layers. Our focus is on how to include TE in the learning mechanism of a Convolutional Neural Network (CNN) architecture. We introduce a novel training mechanism for CNN architectures that integrates TE feedback connections. Adding the TE feedback parameter accelerates training, as fewer epochs are needed; on the flip side, it adds computational overhead to each epoch. According to our experiments on CNN classifiers, a reasonable overhead–accuracy trade-off is achieved by considering only the inter-neural information transfer of neuron pairs between the last two fully connected layers. The TE acts as a smoothing factor, generating stability, and becomes active only periodically rather than after each input sample. We can therefore consider the TE in our model a slowly changing meta-parameter.
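The abstract's core quantity, transfer entropy between a pair of neuron outputs, can be illustrated with a minimal estimator. The sketch below is an assumption-laden simplification, not the paper's implementation: it treats the two neuron outputs as binarized time series and uses a history length of 1; the function name and the synthetic example data are hypothetical.

```python
import numpy as np

def transfer_entropy(src, dst, eps=1e-12):
    """Estimate TE_{src->dst} in bits for binary series (history length 1).

    TE = sum over states of p(d_{t+1}, d_t, s_t)
         * log2( p(d_{t+1} | d_t, s_t) / p(d_{t+1} | d_t) )
    """
    src = np.asarray(src, dtype=int)
    dst = np.asarray(dst, dtype=int)
    # Align next-step destination with current destination and source states.
    d_next, d_now, s_now = dst[1:], dst[:-1], src[:-1]
    te = 0.0
    for dn in (0, 1):
        for d in (0, 1):
            for s in (0, 1):
                # Joint probability of the full (next, dest, src) state.
                p_dns = np.mean((d_next == dn) & (d_now == d) & (s_now == s))
                if p_dns == 0:
                    continue  # zero-probability states contribute nothing
                p_ds = np.mean((d_now == d) & (s_now == s))
                p_d = np.mean(d_now == d)
                # Conditional p(d_{t+1} | d_t), ignoring the source.
                p_dn_d = np.mean((d_next == dn) & (d_now == d)) / p_d
                te += p_dns * np.log2((p_dns / p_ds) / (p_dn_d + eps))
    return te

# Synthetic check: dst copies src with a one-step lag, so information
# flows strongly src -> dst but not dst -> src.
rng = np.random.default_rng(0)
src = rng.integers(0, 2, 1000)
dst = np.empty_like(src)
dst[0] = 0
dst[1:] = src[:-1]
print(transfer_entropy(src, dst))  # expected near 1 bit
print(transfer_entropy(dst, src))  # expected near 0 bits
```

The asymmetry between the two directions is what makes TE a measure of directed, effective connectivity rather than mere correlation, which is why the paper can use it to weight feedback connections between specific neuron pairs.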

Category: Informatics, Physics




