
Entropy, Vol. 21, Pages 524: The Evolution of Neuroplasticity and the Effect on Integrated Information (Entropy)

 
 

24 May 2019 12:00:34

Information integration theory was developed to quantify consciousness. Since conscious thought requires the integration of information, the degree of this integration, denoted Φ, can be used as a neural correlate and a proxy for the degree of consciousness. Previous research has shown that the ability to integrate information can be improved by Darwinian evolution: Φ can change over many generations, and complex tasks require systems with at least a minimum Φ. That work used simple animats that could remember previous sensory inputs but were incapable of fundamental change during their lifetime; their actions were predetermined, or instinctual. Here, we are interested in changes to Φ due to lifetime learning (also known as neuroplasticity). During lifetime learning, the system adapts to perform a task, which necessitates a functional change that could in turn change Φ. One can find arguments for each of three possible outcomes: Φ might remain constant, increase, or decrease due to learning. To resolve this, we need to observe systems that learn, but that also improve their ability to learn over the many generations that Darwinian evolution requires. Quantifying Φ over the course of evolution, and over the course of individual lifetimes, allows us to investigate how the ability to integrate information changes. To measure Φ, the internal states of the system must be experimentally observable; however, these states are notoriously difficult to observe in a natural system. We therefore use a computational model that not only evolves virtual agents (animats), but evolves animats that learn during their lifetime. Using this approach, we show that a system that improves its performance through feedback learning increases its ability to integrate information. In addition, we show that a system's ability to increase Φ correlates with its ability to increase its performance.
This suggests that systems whose Φ is highly plastic learn better than those whose Φ is not.
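To make the notion of "integrated information" concrete, the following is a minimal, illustrative sketch of an early-style Φ proxy for a tiny deterministic network: the effective information of the whole system (mutual information between past and future states under a uniform past) minus the effective information carried by the parts under a bipartition. The two-node XOR network here is a hypothetical example, not the animats or the full IIT calculus used in the paper.

```python
from itertools import product
from math import log2

# Deterministic update rule for a tiny 2-node binary network
# (hypothetical example): node A copies node B; node B computes A XOR B.
def step(state):
    a, b = state
    return (b, a ^ b)

def mutual_information(pairs):
    """I(X;Y) in bits for a list of equally likely (x, y) pairs."""
    n = len(pairs)
    px, py, pxy = {}, {}, {}
    for x, y in pairs:
        px[x] = px.get(x, 0) + 1
        py[y] = py.get(y, 0) + 1
        pxy[(x, y)] = pxy.get((x, y), 0) + 1
    return sum(c / n * log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

# Uniform distribution over past states; futures follow the update rule.
past = list(product([0, 1], repeat=2))
pairs = [(s, step(s)) for s in past]

# Effective information of the whole system: I(past; future).
ei_whole = mutual_information(pairs)

# Effective information of the parts under the bipartition {A} / {B}:
# each node's future considered against only that node's own past.
ei_a = mutual_information([(s[0], step(s)[0]) for s in past])
ei_b = mutual_information([(s[1], step(s)[1]) for s in past])

# Toy "integration": information the whole carries beyond its parts.
phi_proxy = ei_whole - (ei_a + ei_b)
print(f"EI(whole) = {ei_whole:.2f} bits, toy phi = {phi_proxy:.2f} bits")
```

In this toy network the whole-system dynamics are a bijection (2 bits of effective information), while each node's future is statistically independent of its own past, so all of the information is integrated; in the evolving, learning animats of the paper, the analogous quantity is tracked across generations and lifetimes.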


 
Category: Informatics, Physics
 


MyJournals.org
The latest issues of all your favorite science journals on one page



Copyright © 2008 - 2024 Indigonet Services B.V. Contact: Tim Hulsen. Read our privacy notice here.