
Applications of Information Theory in Neuroscience

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Information Theory, Probability and Statistics".

Deadline for manuscript submissions: closed (30 May 2023) | Viewed by 5822

Special Issue Editor


Prof. Dr. Joanna Tyrcha
Guest Editor
Department of Mathematics, Stockholm University, Kraftriket, 106 91 Stockholm, Sweden
Interests: statistical inference; econometrics; neuroscience; biological information processing and causality problems

Special Issue Information

Dear Colleagues,

The origins of Information Theory date back to Claude E. Shannon's 1948 paper "A Mathematical Theory of Communication", published in the Bell System Technical Journal. In terms of the colloquial meaning of information, Shannon's paper deals with the carriers of information rather than with information itself. However, the importance and flexibility of Shannon's work were quickly recognized, and many attempts were made to apply his theory in fields outside its original scope of communications. Very soon after Shannon's initial publication, several manuscripts appeared that provide the foundations of much of the current use of Information Theory in neuroscience.

This Special Issue aims to be a forum for the presentation of novel approaches in neuroscience using Information Theory, as well as the development of new information theoretic results inspired by problems in neuroscience. Research at the interface of neuroscience, Information Theory, and other disciplines is also welcome.

Prof. Dr. Joanna Tyrcha
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, proceed to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for the submission of manuscripts are available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  •  information flow
  •  transfer entropy
  •  causality measures
  •  multivariate statistics
  •  time series
  •  statistical inference
  •  neuroscience

Published Papers (2 papers)


Research

23 pages, 5468 KiB  
Article
Functional Connectome of the Human Brain with Total Correlation
by Qiang Li, Greg Ver Steeg, Shujian Yu and Jesus Malo
Entropy 2022, 24(12), 1725; https://doi.org/10.3390/e24121725 - 25 Nov 2022
Cited by 5 | Viewed by 3169
Abstract
Recent studies proposed the use of Total Correlation to describe functional connectivity among brain regions as a multivariate alternative to conventional pairwise measures such as correlation or mutual information. In this work, we build on this idea to infer a large-scale (whole-brain) connectivity network based on Total Correlation and show the possibility of using this kind of network as a biomarker of brain alterations. In particular, this work uses Correlation Explanation (CorEx) to estimate Total Correlation. First, we show that CorEx estimates of Total Correlation and clustering results are reliable when compared with ground-truth values. Second, the inferred large-scale connectivity network extracted from larger open fMRI datasets is consistent with existing neuroscience studies but, interestingly, can capture relations beyond pairwise regions. Finally, we show how connectivity graphs based on Total Correlation can also be an effective tool to aid in the discovery of brain diseases.
(This article belongs to the Special Issue Applications of Information Theory in Neuroscience)
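The abstract contrasts Total Correlation with pairwise measures such as mutual information. As an illustrative sketch (the plug-in estimator for discrete variables, not the paper's CorEx method), total correlation can be computed as TC(X1,...,Xk) = Σi H(Xi) − H(X1,...,Xk); the XOR example below is a standard case where every pairwise mutual information is zero yet the three-way dependence is detected:

```python
import numpy as np
from collections import Counter

def entropy(samples):
    """Shannon entropy (bits) of a sequence of hashable observations, via plug-in probabilities."""
    counts = Counter(samples)
    n = len(samples)
    probs = np.array([c / n for c in counts.values()])
    return float(-np.sum(probs * np.log2(probs)))

def total_correlation(columns):
    """TC(X1..Xk) = sum_i H(Xi) - H(X1,...,Xk); zero iff the variables are independent."""
    joint = list(zip(*columns))
    return sum(entropy(col) for col in columns) - entropy(joint)

# Three binary variables with X3 = X1 XOR X2: each pairwise mutual
# information is zero, but total correlation is about 1 bit, exposing
# the higher-order dependence that pairwise measures miss.
rng = np.random.default_rng(0)
x1 = rng.integers(0, 2, 10000)
x2 = rng.integers(0, 2, 10000)
x3 = x1 ^ x2
tc = total_correlation([x1.tolist(), x2.tolist(), x3.tolist()])
```

This is only a toy estimator; for continuous fMRI signals the paper relies on CorEx precisely because naive plug-in estimation does not scale to whole-brain data.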

23 pages, 391 KiB  
Article
Towards Generalizing the Information Theory for Neural Communication
by János Végh and Ádám József Berki
Entropy 2022, 24(8), 1086; https://doi.org/10.3390/e24081086 - 5 Aug 2022
Cited by 2 | Viewed by 1906
Abstract
Neuroscience makes extensive use of information theory to describe neural communication, among other things to calculate the amount of information transferred in neural communication and to attempt to crack its coding. There are fierce debates on how information is represented in the brain and during transmission inside the brain. Neural information theory adopts the assumptions of electronic communication despite experimental evidence that neural spikes carry information about non-discrete states, that their communication speed is low, and that the spikes' timing precision matters. Furthermore, in biology the communication channel is active, which enforces an additional power-bandwidth limitation on neural information transfer. The paper revises the notions needed to describe information transfer in technical and biological communication systems. It argues that biology uses Shannon's idea outside its range of validity and introduces an adequate interpretation of information. In addition, the presented time-aware approach to information theory reveals evidence for the role of processes (as opposed to states) in neural operations. The generalized information theory describes both kinds of communication, and the classic theory is a particular case of the generalized theory.
(This article belongs to the Special Issue Applications of Information Theory in Neuroscience)
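The power-bandwidth limitation the abstract invokes can be illustrated with the textbook Shannon-Hartley formula C = B log2(1 + S/(N0·B)), a classical result rather than the paper's generalized theory; the function name and the numeric parameters below are purely illustrative:

```python
import math

def shannon_capacity(bandwidth_hz, signal_power_w, noise_psd_w_per_hz):
    """Shannon-Hartley channel capacity C = B * log2(1 + S / (N0 * B)), in bits/s."""
    snr = signal_power_w / (noise_psd_w_per_hz * bandwidth_hz)
    return bandwidth_hz * math.log2(1.0 + snr)

# With fixed signal power, widening the band gives diminishing returns:
# capacity is bounded above by S / (N0 * ln 2) no matter how large B gets,
# so a power-limited channel cannot be made arbitrarily fast.
c_narrow = shannon_capacity(1e3, 1e-9, 1e-15)   # 1 kHz band
c_wide = shannon_capacity(1e6, 1e-9, 1e-15)     # 1 MHz band
limit = 1e-9 / (1e-15 * math.log(2))            # asymptotic capacity bound
```

The design point this sketch makes concrete is the one the paper stresses: when transmit power is constrained (as it is for a metabolically active biological channel), capacity saturates, so spike timing and processing time, rather than raw bandwidth, become the binding resources.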
