
Entropy, Vol. 22, Pages 387: Statistical Approaches for the Analysis of Dependency Among Neurons Under Noise (Entropy)

 
 

28 March 2020 17:00:17

 
 


Neuronal noise is a major factor affecting communication between coupled neurons. In this work, we propose a statistical toolset to infer the coupling between two neurons under noise. We estimate these statistical dependencies from data generated by a coupled Hodgkin–Huxley (HH) model with additive noise. To infer the coupling from observed data, we employ copulas and information-theoretic quantities, such as the mutual information (MI) and the transfer entropy (TE). Copulas and MI between two variables are symmetric quantities, whereas TE is asymmetric. We demonstrate the performance of copulas and MI as functions of different noise levels and show that they are effective in identifying the interactions due to coupling and noise. Moreover, we analyze the inference of TE values between neurons as a function of noise and conclude that TE is an effective tool for determining the direction of coupling between neurons under the effects of noise.
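As a rough illustration of the symmetric/asymmetric distinction the abstract draws, the sketch below estimates MI and TE with simple histogram plug-in estimators on a linear autoregressive surrogate for two coupled noisy neurons. This is an invented toy system, not the coupled HH model or the copula-based estimators used in the paper; all parameters (coupling strength, noise level, bin count) are assumptions chosen for the demo. The true coupling runs X → Y, so the asymmetric TE should be larger in that direction while MI is the same either way.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy surrogate for two coupled noisy neurons (standing in for the coupled
# Hodgkin-Huxley model of the paper): y is driven by x with a one-step lag
# plus additive Gaussian noise, so the true coupling direction is x -> y.
n = 20000
x = np.zeros(n)
y = np.zeros(n)
for t in range(1, n):
    x[t] = 0.8 * x[t - 1] + rng.normal()
    y[t] = 0.8 * y[t - 1] + 0.5 * x[t - 1] + rng.normal()

def discretize(v, bins=8):
    """Map a continuous signal to equal-frequency bin labels."""
    edges = np.quantile(v, np.linspace(0, 1, bins + 1)[1:-1])
    return np.digitize(v, edges)

def joint_entropy(*cols):
    """Histogram plug-in estimate of the joint entropy (in bits)."""
    states = np.stack(cols, axis=1)
    _, counts = np.unique(states, axis=0, return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

xd, yd = discretize(x), discretize(y)

# Mutual information I(X;Y) = H(X) + H(Y) - H(X,Y): symmetric in X and Y.
mi = joint_entropy(xd) + joint_entropy(yd) - joint_entropy(xd, yd)

def transfer_entropy(src, dst):
    """TE(src -> dst) with one-step histories:
    H(dst_t | dst_{t-1}) - H(dst_t | dst_{t-1}, src_{t-1})."""
    d_now, d_past, s_past = dst[1:], dst[:-1], src[:-1]
    return (joint_entropy(d_now, d_past) - joint_entropy(d_past)
            - joint_entropy(d_now, d_past, s_past)
            + joint_entropy(d_past, s_past))

te_xy = transfer_entropy(xd, yd)  # true coupling direction
te_yx = transfer_entropy(yd, xd)  # reverse direction: only estimator bias
print(f"MI = {mi:.3f} bits, TE(X->Y) = {te_xy:.3f}, TE(Y->X) = {te_yx:.3f}")
```

Because TE conditions on the target's own past before asking what the source's past adds, TE(X→Y) substantially exceeds TE(Y→X) here, recovering the direction of coupling that the symmetric MI cannot reveal.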


 
190 views | Category: Informatics, Physics
 
Entropy, Vol. 22, Pages 388: A Low Complexity Near-Optimal Iterative Linear Detector For Massive MIMO In Realistic Radio Channels of 5G Communication Systems (Entropy)
Entropy, Vol. 22, Pages 390: Quaternion Valued Risk Diversification (Entropy)
 
 


MyJournals.org
The latest issues of all your favorite science journals on one page



Copyright © 2008 - 2024 Indigonet Services B.V. Contact: Tim Hulsen. Read our privacy notice here.