The evolution of neuroplasticity and the effect on integrated information

Authors: Jory Schossau, Leigh Sheneman, Arend Hintze
Language: English
Year of publication: 2019
Subject:
Computer Science
Artificial Intelligence
Neuroevolution
Animat
Autonomous learning
Integrated information theory
Information integration theory
Consciousness
Bioinformatics and Systems Biology
Source: Entropy, Vol. 21, Iss. 5, p. 524 (2019)
Abstract: Information integration theory was developed to quantify consciousness. Since conscious thought requires the integration of information, the degree of this integration can be used as a neural correlate (Φ) intended to measure the degree of consciousness. Previous research has shown that the ability to integrate information can be improved by Darwinian evolution. The value Φ can change over many generations, and complex tasks require systems with at least a minimum Φ. That work was done using simple animats that were able to remember previous sensory inputs but were incapable of fundamental change during their lifetime: actions were predetermined or instinctual. Here, we are interested in changes to Φ due to lifetime learning (also known as neuroplasticity). During lifetime learning, the system adapts to perform a task, which necessitates a functional change that in turn could change Φ. One can find arguments for each of three possible outcomes: Φ might remain constant, increase, or decrease due to learning. To resolve this, we need to observe systems that learn, but also improve their ability to learn over the many generations that Darwinian evolution requires. Quantifying Φ over the course of evolution, and over the course of their lifetimes, allows us to investigate how the ability to integrate information changes. To measure Φ, the internal states of the system must be experimentally observable. However, these states are notoriously difficult to observe in a natural system. Therefore, we use a computational model that not only evolves virtual agents (animats), but evolves animats to learn during their lifetime. We use this approach to show that a system that improves its performance through feedback learning increases its ability to integrate information. In addition, we show that a system's ability to increase Φ correlates with its ability to increase in performance. This suggests that systems that are highly plastic with respect to Φ learn better than those that are not.
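For orientation only: one common formalization of Φ (in the spirit of Balduzzi and Tononi's effective-information framework) defines it as the effective information the system generates beyond its parts, evaluated across the minimum information partition. The specific Φ variant applied to the animats in this article may differ, so the following is an illustrative sketch rather than the authors' exact measure:

\[
\Phi(X) \;=\; \operatorname{EI}\!\left(X \to P^{\mathrm{MIP}}\right),
\qquad
P^{\mathrm{MIP}} \;=\; \operatorname*{arg\,min}_{P} \frac{\operatorname{EI}(X \to P)}{K(P)},
\]

where \(\operatorname{EI}(X \to P)\) is the information the whole system generates over and above the parts defined by partition \(P\), and \(K(P)\) is a normalization factor that corrects for partition size.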
Database: OpenAIRE