Information Theory in Computational Neuroscience

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Information Theory, Probability and Statistics".

Deadline for manuscript submissions: closed (30 November 2020) | Viewed by 18594

Special Issue Editors


Prof. Dr. Claudius Gros
Guest Editor
Institute for Theoretical Physics, Goethe University Frankfurt/Main, Frankfurt, Germany
Interests: computational neuroscience; self-organized robots; information theory; complex and cognitive systems

Dr. Dimitrije Marković
Guest Editor
School of Science, Technische Universität Dresden, 01062 Dresden, Germany
Interests: computational and cognitive neuroscience; machine learning; complex systems

Special Issue Information

Dear Colleagues,

Brains empower living organisms with extraordinary information-processing capabilities. Complex spatio-temporal dependencies present in their environments may be learned, long- and short-term memories of experiences formed, and plans regarding the future made. Importantly, information processing in the brain happens over multiple spatio-temporal scales, from single synapses and synaptic terminals to dendritic trees and neuronal bodies, all the way up to neuronal networks and large brain areas.

Hence, it is not surprising that information theory has led to many exciting developments in computational neuroscience, providing tools essential for our modern understanding of the computational principles that govern the development, structure, physiology, and dynamics of the nervous system.

In this Special Issue, we aim to bring together neuronal models and neuronal plasticity mechanisms that are grounded in information-theoretic principles, modern inference, and learning algorithms. We welcome submissions that use information theory as the basis for defining generative principles of neuronal dynamics over multiple spatio-temporal scales, thereby informing our understanding of information processing in the brain.

Prof. Dr. Claudius Gros
Dr. Dimitrije Marković
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • information theory;
  • computational neuroscience;
  • brain complexity;
  • generating functional;
  • spatio-temporal dependencies

Published Papers (5 papers)


Research

14 pages, 4179 KiB  
Article
Consciousness Detection in a Complete Locked-in Syndrome Patient through Multiscale Approach Analysis
by Shang-Ju Wu, Nicoletta Nicolaou and Martin Bogdan
Entropy 2020, 22(12), 1411; https://doi.org/10.3390/e22121411 - 15 Dec 2020
Cited by 8 | Viewed by 3207
Abstract
Completely locked-in state (CLIS) patients are unable to speak and have lost all muscle movement. From the external view, the internal brain activity of such patients cannot be easily perceived, but CLIS patients are considered to still be conscious and cognitively active. Detecting the current state of consciousness of CLIS patients is non-trivial, and it is difficult to ascertain whether CLIS patients are conscious or not. Thus, it is important to find alternative ways to re-establish communication with these patients during periods of awareness, and one such alternative is through a brain–computer interface (BCI). In this study, multiscale-based methods (multiscale sample entropy, multiscale permutation entropy and multiscale Poincaré plots) were applied to analyze electrocorticogram signals from a CLIS patient to detect the underlying consciousness level. Results from these different methods converge to a specific period of awareness of the CLIS patient in question, coinciding with the period during which the CLIS patient is recorded to have communicated with an experimenter. The aim of the investigation is to propose a methodology that could be used to create reliable communication with CLIS patients.
(This article belongs to the Special Issue Information Theory in Computational Neuroscience)
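As a rough illustration of the multiscale approach described in this abstract, the sketch below coarse-grains a signal and computes sample entropy at each scale. It is a minimal Python/NumPy sketch, not the authors' code; the defaults (m = 2, tolerance 0.2 times the standard deviation, scales 1 to 10) are common conventions chosen here for illustration.

```python
import numpy as np

def coarse_grain(x, scale):
    """Average consecutive, non-overlapping windows of length `scale`."""
    n = len(x) // scale
    return x[:n * scale].reshape(n, scale).mean(axis=1)

def sample_entropy(x, m=2, r_factor=0.2):
    """Sample entropy: negative log of the conditional probability that
    sequences matching for m points (within tolerance r) also match for m + 1."""
    r, n = r_factor * np.std(x), len(x)
    def match_pairs(mm):
        # Templates of length mm; the same start indices serve both m and m + 1.
        t = np.array([x[i:i + mm] for i in range(n - m)])
        d = np.max(np.abs(t[:, None, :] - t[None, :, :]), axis=2)  # Chebyshev distance
        return (np.sum(d <= r) - len(t)) / 2  # matching pairs, excluding self-matches
    a, b = match_pairs(m + 1), match_pairs(m)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

def multiscale_sample_entropy(x, scales=range(1, 11)):
    """Sample entropy of the coarse-grained signal at each scale."""
    return [sample_entropy(coarse_grain(x, s)) for s in scales]

# Toy usage on white noise, which has no structure across scales.
rng = np.random.default_rng(0)
print(multiscale_sample_entropy(rng.standard_normal(1000)))
```

Multiscale permutation entropy follows the same pattern, with the permutation entropy of ordinal patterns replacing sample entropy at each scale.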

22 pages, 3442 KiB  
Article
What Can Local Transfer Entropy Tell Us about Phase-Amplitude Coupling in Electrophysiological Signals?
by Ramón Martínez-Cancino, Arnaud Delorme, Johanna Wagner, Kenneth Kreutz-Delgado, Roberto C. Sotero and Scott Makeig
Entropy 2020, 22(11), 1262; https://doi.org/10.3390/e22111262 - 06 Nov 2020
Cited by 9 | Viewed by 3631
Abstract
Modulation of the amplitude of high-frequency cortical field activity locked to changes in the phase of a slower brain rhythm is known as phase-amplitude coupling (PAC). The study of this phenomenon has been gaining traction in neuroscience because of several reports on its appearance in normal and pathological brain processes in humans as well as across different mammalian species. This has led to the suggestion that PAC may be an intrinsic brain process that facilitates brain inter-area communication across different spatiotemporal scales. Several methods have been proposed to measure the PAC process, but few of these enable detailed study of its time course. It appears that no studies have reported details of PAC dynamics including its possible directional delay characteristic. Here, we study and characterize the use of a novel information theoretic measure that may address this limitation: local transfer entropy. We use both simulated and actual intracranial electroencephalographic data. In both cases, we observe initial indications that local transfer entropy can be used to detect the onset and offset of modulation process periods revealed by mutual information estimated phase-amplitude coupling (MIPAC). We review our results in the context of current theories about PAC in brain electrical activity, and discuss technical issues that must be addressed to see local transfer entropy more widely applied to PAC analysis. The current work sets the foundations for further use of local transfer entropy for estimating PAC process dynamics, and extends and complements our previous work on using local mutual information to compute PAC (MIPAC).
(This article belongs to the Special Issue Information Theory in Computational Neuroscience)
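Local transfer entropy assigns each time point the log-ratio by which the source's past improves prediction of the target's next value, so modulation onsets and offsets show up as transient excursions. Below is a minimal plug-in sketch with history length 1 and equal-count discretization; the binning, history length, and toy PAC signal are illustrative assumptions, not the authors' MIPAC pipeline.

```python
import numpy as np

def local_transfer_entropy(source, target, bins=4):
    """Pointwise transfer entropy source -> target with history length 1:
    te(t) = log2 [ p(y_t | y_{t-1}, x_{t-1}) / p(y_t | y_{t-1}) ]."""
    def disc(x):  # equal-count binning into `bins` symbols
        edges = np.quantile(x, np.linspace(0, 1, bins + 1)[1:-1])
        return np.searchsorted(edges, x)
    xs, ys = disc(source), disc(target)
    y_t, y_p, x_p = ys[1:], ys[:-1], xs[:-1]
    def p(*arrs):
        # Plug-in joint probability evaluated at each observed sample.
        idx = np.ravel_multi_index(arrs, (bins,) * len(arrs))
        counts = np.bincount(idx, minlength=bins ** len(arrs)).astype(float)
        return (counts / counts.sum())[idx]
    return np.log2(p(y_t, y_p, x_p) * p(y_p) / (p(y_p, x_p) * p(y_t, y_p)))

# Toy PAC-like coupling: the slow phase drives the fast amplitude one step later.
rng = np.random.default_rng(1)
t = np.arange(5000)
phase = np.sin(2 * np.pi * t / 100)
amp = np.concatenate(([0.0], 1 + phase[:-1] + 0.2 * rng.standard_normal(4999)))
te = local_transfer_entropy(phase, amp)
print(te.mean())  # averaging the local values recovers the usual (global) TE
```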

14 pages, 1842 KiB  
Article
A Method to Present and Analyze Ensembles of Information Sources
by Nicholas M. Timme, David Linsenbardt and Christopher C. Lapish
Entropy 2020, 22(5), 580; https://doi.org/10.3390/e22050580 - 21 May 2020
Viewed by 3259
Abstract
Information theory is a powerful tool for analyzing complex systems. In many areas of neuroscience, it is now possible to gather data from large ensembles of neural variables (e.g., data from many neurons, genes, or voxels). The individual variables can be analyzed with information theory to provide estimates of information shared between variables (forming a network between variables), or between neural variables and other variables (e.g., behavior or sensory stimuli). However, it can be difficult to (1) evaluate if the ensemble is significantly different from what would be expected in a purely noisy system and (2) determine if two ensembles are different. Herein, we introduce relatively simple methods to address these problems by analyzing ensembles of information sources. We demonstrate how an ensemble built of mutual information connections can be compared to null surrogate data to determine if the ensemble is significantly different from noise. Next, we show how two ensembles can be compared using a randomization process to determine if the sources in one contain more information than the other. All code necessary to carry out these analyses and demonstrations is provided.
(This article belongs to the Special Issue Information Theory in Computational Neuroscience)
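The core recipe (an information estimate per source, with the ensemble statistic compared against shuffled surrogates) can be sketched as below. This is a simplified stand-in for the code released with the paper; the binning, the summed-MI ensemble statistic, and the permutation p-value convention are assumptions made here for illustration.

```python
import numpy as np

def mutual_information(x, y, bins=4):
    """Plug-in mutual information (bits) from a 2D histogram."""
    pxy = np.histogram2d(x, y, bins=bins)[0]
    pxy /= pxy.sum()
    px, py = pxy.sum(1, keepdims=True), pxy.sum(0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

def ensemble_vs_null(responses, stimulus, n_null=200, seed=0):
    """Compare the summed MI of an ensemble of sources with a null
    distribution built by shuffling the stimulus labels."""
    rng = np.random.default_rng(seed)
    observed = sum(mutual_information(r, stimulus) for r in responses)
    null = np.array([sum(mutual_information(r, rng.permutation(stimulus))
                         for r in responses) for _ in range(n_null)])
    p_value = (np.sum(null >= observed) + 1) / (n_null + 1)
    return observed, null, p_value

# Toy ensemble: 5 informative sources plus 15 pure-noise sources.
rng = np.random.default_rng(2)
stim = rng.integers(0, 2, 500).astype(float)
responses = [stim + rng.standard_normal(500) for _ in range(5)]
responses += [rng.standard_normal(500) for _ in range(15)]
print(ensemble_vs_null(responses, stim)[::2])  # (observed MI sum, p-value)
```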

12 pages, 303 KiB  
Article
Limitations to Estimating Mutual Information in Large Neural Populations
by Jan Mölter and Geoffrey J. Goodhill
Entropy 2020, 22(4), 490; https://doi.org/10.3390/e22040490 - 24 Apr 2020
Cited by 3 | Viewed by 4181
Abstract
Information theory provides a powerful framework to analyse the representation of sensory stimuli in neural population activity. However, estimating the quantities involved, such as entropy and mutual information, from finite samples is notoriously hard, and any direct estimate is known to be heavily biased. This is especially true when considering large neural populations. We study a simple model of sensory processing and show through a combinatorial argument that, with high probability, for large neural populations any finite number of samples of neural activity in response to a set of stimuli is mutually distinct. As a consequence, the mutual information when estimated directly from empirical histograms will be equal to the stimulus entropy. Importantly, this is the case irrespective of the precise relation between stimulus and neural activity and corresponds to a maximal bias. This argument is general and applies to any application of information theory, where the state space is large and one relies on empirical histograms. Overall, this work highlights the need for alternative approaches for an information theoretic analysis when dealing with large neural populations.
(This article belongs to the Special Issue Information Theory in Computational Neuroscience)
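The bias mechanism is easy to reproduce numerically: once the population is so large that every sampled response pattern is unique, the empirical conditional entropy H(S|R) is zero, so the plug-in estimate returns the (empirical) stimulus entropy even when the responses carry no information at all. A small demonstration, with binary neurons and a four-valued stimulus as illustrative choices:

```python
import numpy as np

def plugin_mutual_information(responses, stimuli):
    """Plug-in MI (bits), treating each population response pattern (row) as one symbol."""
    _, r_ids = np.unique(responses, axis=0, return_inverse=True)
    _, s_ids = np.unique(stimuli, return_inverse=True)
    r_ids, s_ids = r_ids.ravel(), s_ids.ravel()  # guard against version-dependent shapes
    pxy = np.zeros((r_ids.max() + 1, s_ids.max() + 1))
    np.add.at(pxy, (r_ids, s_ids), 1.0)
    pxy /= pxy.sum()
    px, py = pxy.sum(1, keepdims=True), pxy.sum(0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

rng = np.random.default_rng(0)
n_trials = 200
stimuli = rng.integers(0, 4, n_trials)  # roughly 2 bits of stimulus entropy
for n_neurons in (2, 10, 50):
    # Responses are pure noise, so the true mutual information is exactly zero.
    responses = rng.integers(0, 2, (n_trials, n_neurons))
    print(n_neurons, plugin_mutual_information(responses, stimuli))
```

With 50 neurons virtually all 200 response patterns are distinct, and the printed estimate saturates near the 2-bit stimulus entropy despite the true value being zero, which is the maximal bias the abstract describes.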

21 pages, 2602 KiB  
Article
Early Detection of Alzheimer’s Disease: Detecting Asymmetries with a Return Random Walk Link Predictor
by Manuel Curado, Francisco Escolano, Miguel A. Lozano and Edwin R. Hancock
Entropy 2020, 22(4), 465; https://doi.org/10.3390/e22040465 - 19 Apr 2020
Cited by 9 | Viewed by 3392
Abstract
Alzheimer’s disease has been extensively studied using undirected graphs to represent the correlations of BOLD signals in different anatomical regions through functional magnetic resonance imaging (fMRI). However, there has been relatively little analysis of this kind of data using directed graphs, which offer the potential to capture asymmetries in the interactions between different anatomical brain regions. The detection of these asymmetries is relevant to detecting the disease at an early stage. For this reason, in this paper, we analyze data extracted from fMRI images using the net4Lap algorithm to infer a directed graph from the available BOLD signals, and then seek to determine asymmetries between the left and right hemispheres of the brain using a directed version of the Return Random Walk (RRW). Experimental evaluation of this method reveals that it leads to the identification of anatomical brain regions known to be implicated in the early development of Alzheimer’s disease in clinical studies.
(This article belongs to the Special Issue Information Theory in Computational Neuroscience)
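The return-walk idea can be caricatured on a toy directed graph: row-normalize the adjacency matrix into transition probabilities P, score each ordered pair by the two-step return probability P[i, j] * P[j, i], and read off directional asymmetry from P - P.T. This is a simplified sketch of a return-walk-style score under our own assumptions, not the authors' RRW or net4Lap implementation, and the graph below is a toy weighted adjacency matrix rather than one derived from fMRI.

```python
import numpy as np

def transition_matrix(W):
    """Row-normalize a weighted directed adjacency matrix into
    random-walk transition probabilities."""
    row = W.sum(axis=1, keepdims=True).astype(float)
    row[row == 0] = 1.0  # keep rows of isolated nodes at zero
    return W / row

def return_score(W):
    """Probability of the two-step round trip i -> j -> i, per ordered pair."""
    P = transition_matrix(W)
    return P * P.T

def asymmetry(W):
    """Net directionality per pair: positive where i -> j dominates j -> i."""
    P = transition_matrix(W)
    return P - P.T

# Toy directed graph with one strongly asymmetric interaction (0 -> 1).
W = np.array([[0.0, 3.0, 1.0],
              [0.5, 0.0, 1.0],
              [1.0, 1.0, 0.0]])
print(return_score(W))
print(asymmetry(W))
```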
