Dynamics and Bifurcations in Mathematical Neuroscience: Analysis, Modeling and Applications

A special issue of Mathematics (ISSN 2227-7390). This special issue belongs to the section "Mathematical Biology".

Deadline for manuscript submissions: closed (30 April 2023) | Viewed by 10779

Special Issue Editors


Dr. Susanna Gordleeva
Guest Editor
1. Institute of Biology and Biomedicine, Lobachevsky State University of Nizhny Novgorod, 603950 Nizhny Novgorod, Russia
2. Center for Technologies in Robotics and Mechatronics Components, Innopolis University, 420500 Innopolis, Russia
Interests: computational neuroscience; nonlinear dynamics; EEG analysis; biophysics; applied neuroengineering

Dr. Shangbin Chen
Guest Editor
1. Britton Chance Center for Biomedical Photonics, Wuhan National Laboratory for Optoelectronics-Huazhong University of Science and Technology, Wuhan 430074, China
2. MoE Key Laboratory for Biomedical Photonics, School of Engineering Sciences, Huazhong University of Science and Technology, Wuhan 430074, China
Interests: computational neuroscience; neuroimaging; image processing; data analysis

Special Issue Information

Dear Colleagues,

For many years, mathematics and computational methods have played an important role in our understanding of the nervous system. The goal of this Special Issue is to present some examples of how mathematical techniques can be applied at a variety of levels to increase our understanding of neural systems.

Mathematical neuroscience is an interdisciplinary field that combines analytical methods and computer simulations with experimental neuroscience to develop, simulate, and study multiscale models and theories of neural function—from the level of molecules, through cells and networks, and up to cognition and behavior. We welcome the submission of papers that introduce advanced mathematical techniques to illuminate these problems, including comparative studies, statistical data analysis, mathematical proofs, computer simulations, experiments, field observations, or even philosophical arguments, all of which are methods to support or reject theoretical ideas. A clear statement of the biological significance of the problem being studied will be appreciated.

Dr. Susanna Gordleeva
Dr. Shangbin Chen
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to the website. Once registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers are published continuously in the journal (as soon as accepted) and listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Mathematics is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Mathematical, computational, biophysical and statistical modeling
  • Cell biology
  • Developmental biology
  • Microbiology, molecular biology, and biochemistry
  • Networks and complex systems
  • Animal behavior and game theory
  • Intracellular intelligence
  • Analysis of EEG signals
  • Analysis of multidimensional time series
  • Nonlinear dynamics
  • Brain–computer interfaces

Published Papers (5 papers)


Research

17 pages, 3880 KiB  
Article
Dynamic Image Representation in a Spiking Neural Network Supplied by Astrocytes
by Sergey V. Stasenko and Victor B. Kazantsev
Mathematics 2023, 11(3), 561; https://doi.org/10.3390/math11030561 - 20 Jan 2023
Cited by 12 | Viewed by 1810
Abstract
We investigate a mathematical model of a spiking neural network (SNN) supplied by astrocytes. Astrocytes are a type of brain cell that is not electrically excitable but chemically modulates neuronal firing. We analyze how astrocytes influence images encoded in the dynamic spiking pattern of the SNN. Operating on a much slower time scale, the astrocytic network interacting with the spiking neurons can markedly enhance the quality of the image representation. The spiking dynamics are affected by noise that distorts the encoded image. We demonstrate that astrocytic activation can significantly suppress the influence of noise, improving the dynamic image representation by the SNN.
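
As a purely illustrative aside (not part of the published model), the interplay of fast spiking and slow astrocytic modulation can be sketched with a single leaky integrate-and-fire neuron whose synaptic gain is driven by a slow, spike-activated astrocyte-like variable; all parameter values below are assumptions chosen for illustration.

```python
# Minimal illustrative sketch (not the authors' model): a leaky integrate-and-fire
# neuron whose noisy input is scaled by a slow astrocyte-like gain variable.
import numpy as np

rng = np.random.default_rng(0)
dt, steps = 0.1, 5000                      # time step (ms) and number of steps
tau_m, v_th, v_reset = 20.0, 1.0, 0.0      # membrane time constant (ms), threshold, reset
tau_a = 500.0                              # slow astrocytic time constant (ms)

v, a = 0.0, 0.0                            # membrane potential, astrocytic activation
spikes = []
for k in range(steps):
    drive = 1.2 + 0.8 * rng.standard_normal()   # noisy "image" input current
    gain = 1.0 + a                               # astrocyte-dependent synaptic gain
    v += dt / tau_m * (-v + gain * drive)        # leaky integrate-and-fire update
    if v >= v_th:
        spikes.append(k * dt)
        v = v_reset
        a += 0.05                                # spikes activate the astrocytic variable
    a -= dt / tau_a * a                          # slow astrocytic decay

print(f"{len(spikes)} spikes in {steps * dt:.0f} ms; final gain = {1.0 + a:.2f}")
```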

14 pages, 5332 KiB  
Article
Spatial Computing in Modular Spiking Neural Networks with a Robotic Embodiment
by Sergey A. Lobov, Alexey N. Mikhaylov, Ekaterina S. Berdnikova, Valeri A. Makarov and Victor B. Kazantsev
Mathematics 2023, 11(1), 234; https://doi.org/10.3390/math11010234 - 03 Jan 2023
Cited by 3 | Viewed by 1828
Abstract
One of the challenges of modern neuroscience is creating a brain-on-a-chip: a semiartificial device based on neural networks grown in vitro that interacts with the environment when embodied in a robot. A crucial point in this endeavor is developing a neural network architecture capable of associative learning. This work proposes a mathematical model of a midscale modular spiking neural network (SNN) to study learning mechanisms within the brain-on-a-chip context. We show that, besides spike-timing-dependent plasticity (STDP), synaptic and neuronal competition are critical factors for successful learning. Moreover, the shortest-pathway rule can implement the synaptic competition responsible for processing conditional stimuli coming from the environment; this solution is ready for testing in neuronal cultures. The neuronal competition can be implemented by lateral inhibition acting on the SNN module responsible for unconditional responses. Empirical testing of this approach is challenging and requires a technique for growing cultures with a given ratio of excitatory and inhibitory neurons. We test the modular SNN embedded in a mobile robot and show that it can establish an association between touch (unconditional) and ultrasonic (conditional) sensors, after which the robot avoids obstacles relying on the ultrasonic sensors alone.
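
For readers unfamiliar with STDP, a minimal sketch of the standard pair-based rule referred to in the abstract is shown below (textbook form, not the authors' specific implementation; the learning rates and time constants are assumed values).

```python
# Illustrative pair-based STDP weight update (textbook form; parameters are
# assumptions, not those of the paper). Potentiation when the presynaptic spike
# precedes the postsynaptic one, depression otherwise.
import numpy as np

def stdp_dw(delta_t, a_plus=0.01, a_minus=0.012, tau_plus=20.0, tau_minus=20.0):
    """Weight change for a spike pair separated by delta_t = t_post - t_pre (ms)."""
    if delta_t >= 0:                                  # pre before post -> potentiation
        return a_plus * np.exp(-delta_t / tau_plus)
    return -a_minus * np.exp(delta_t / tau_minus)     # post before pre -> depression

# Example: update one synaptic weight from observed spike pairs, clipped to [0, 1].
w = 0.5
for pair_dt in [+5.0, +12.0, -3.0, +1.0, -20.0]:
    w = np.clip(w + stdp_dw(pair_dt), 0.0, 1.0)
print(f"final weight: {w:.3f}")
```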

20 pages, 24260 KiB  
Article
Impact of Astrocytic Coverage of Synapses on the Short-Term Memory of a Computational Neuron-Astrocyte Network
by Zonglun Li, Yuliya Tsybina, Susanna Gordleeva and Alexey Zaikin
Mathematics 2022, 10(18), 3275; https://doi.org/10.3390/math10183275 - 09 Sep 2022
Cited by 1 | Viewed by 1649
Abstract
Working memory refers to the capability of the nervous system to selectively retain short-term memories in an active state. The long-standing viewpoint is that neurons play an indispensable role and that working memory is encoded by synaptic plasticity. Some recent studies have shown that calcium signaling assists memory processes and that working memory might be affected by astrocyte density. Over the last few decades, growing evidence has also revealed that astrocytes cover synapses to diverse degrees and are considered to participate in neuronal activity. However, little effort has been made to shed light on the potential correlations between these observations. Hence, in this article, we leverage a computational neuron–astrocyte model to study short-term memory performance under varying astrocytic coverage of synapses, and we demonstrate that short-term memory is susceptible to this factor. Our model may also provide plausible hypotheses for the various sizes of calcium events, as these are reckoned to be correlated with astrocytic coverage.
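
A minimal sketch of how astrocytic coverage could enter such a model is given below; the coverage fraction, the calcium-dependent gain, and the gating rule are illustrative assumptions, not the equations of the paper.

```python
# Illustrative sketch (not the authors' model): synaptic transmission in which only
# the astrocyte-covered fraction of synapses receives a calcium-dependent modulation.
import numpy as np

rng = np.random.default_rng(1)
n_syn = 100
coverage = 0.6                          # assumed fraction of synapses covered by astrocytes
covered = rng.random(n_syn) < coverage  # boolean mask of covered synapses

w = rng.uniform(0.2, 0.8, n_syn)        # baseline synaptic weights
ca_gain = 1.5                           # assumed calcium-dependent gain on covered synapses

effective_w = np.where(covered, ca_gain * w, w)
print(f"mean effective weight: {effective_w.mean():.3f} "
      f"(realized coverage = {covered.mean():.2f})")
```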

25 pages, 3776 KiB  
Article
Explainable Machine Learning Methods for Classification of Brain States during Visual Perception
by Robiul Islam, Andrey V. Andreev, Natalia N. Shusharina and Alexander E. Hramov
Mathematics 2022, 10(15), 2819; https://doi.org/10.3390/math10152819 - 08 Aug 2022
Cited by 6 | Viewed by 2064
Abstract
The aim of this work is to find a good mathematical model for the classification of brain states during visual perception, with a focus on the interpretability of the results. To this end, we compare deep learning models with different activation functions and optimization methods and find the best model for the considered dataset of 31-channel EEG trials. To estimate the influence of different features on the classification and make the method more interpretable, we use the SHAP library. We find that the best optimization method is Adagrad and the worst is FTRL; in addition, only Adagrad works well for both the linear and the tangent models. The results could be useful for EEG-based brain–computer interfaces (BCIs), particularly for choosing appropriate machine learning methods and features for training the BCI intelligent system.
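
A hedged sketch of the general workflow (a small dense classifier trained with Adagrad and explained with SHAP's model-agnostic KernelExplainer) is shown below; the synthetic data, layer sizes, and feature shapes are assumptions for illustration only and do not reproduce the authors' pipeline.

```python
# Illustrative workflow sketch: train a small classifier on stand-in EEG features
# with Adagrad, then compute model-agnostic SHAP attributions.
import numpy as np
import tensorflow as tf
import shap

rng = np.random.default_rng(2)
n_trials, n_channels, n_feats = 200, 31, 10
# Synthetic stand-in for per-trial EEG features (31 channels x 10 features each).
X = rng.standard_normal((n_trials, n_channels * n_feats)).astype("float32")
y = rng.integers(0, 2, n_trials)          # two brain states to classify

# Small dense classifier trained with the Adagrad optimizer (layer sizes are assumptions).
model = tf.keras.Sequential([
    tf.keras.Input(shape=(n_channels * n_feats,)),
    tf.keras.layers.Dense(64, activation="tanh"),
    tf.keras.layers.Dense(2, activation="softmax"),
])
model.compile(optimizer=tf.keras.optimizers.Adagrad(learning_rate=0.01),
              loss="sparse_categorical_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=5, batch_size=32, verbose=0)

# Model-agnostic SHAP attributions: which features drive the predicted brain state.
background = X[:50]
explainer = shap.KernelExplainer(lambda x: model.predict(x, verbose=0), background)
shap_values = explainer.shap_values(X[:5], nsamples=100)
print(np.array(shap_values).shape)        # per-class attributions for the 5 explained trials
```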

12 pages, 3181 KiB  
Article
A Dynamic Mechanistic Model of Perceptual Binding
by Pavel Kraikivski
Mathematics 2022, 10(7), 1135; https://doi.org/10.3390/math10071135 - 01 Apr 2022
Cited by 5 | Viewed by 1832
Abstract
The brain’s ability to create a unified conscious representation of an object by integrating information from multiple perception pathways is called perceptual binding. Binding is crucial for normal cognitive function. Some perceptual binding errors and disorders have been linked to certain neurological conditions, brain lesions, and conditions that give rise to illusory conjunctions. However, the mechanism of perceptual binding remains elusive. Here, I present a computational model of binding using two sets of coupled oscillatory processes that are assumed to occur in response to two different percepts. I use the model to study the dynamic behavior of the coupled processes and to characterize how they can modulate each other and reach temporal synchrony. I identify different oscillatory dynamic regimes that depend on the coupling mechanisms and parameter values. The model can also discriminate different combinations of initial inputs, set by the initial states of the coupled processes. Decoding brain signals formed through perceptual binding is a challenging task, but my modeling results demonstrate how crosstalk between two systems of processes can modulate their outputs. Therefore, this mechanistic model can improve our understanding of how crosstalk between perception pathways affects the dynamic behavior of systems involved in perceptual binding.
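
A minimal sketch of two mutually coupled oscillatory processes, in the spirit of the model described (but not the paper's actual equations), is shown below; the natural frequencies, coupling strength, and initial phases are assumed values. For this choice the two processes phase-lock, illustrating the temporal synchrony discussed in the abstract.

```python
# Illustrative sketch (not the paper's equations): two mutually coupled phase
# oscillators whose synchrony is read off from the phase difference.
import numpy as np
from scipy.integrate import solve_ivp

omega1, omega2, k = 1.00, 1.15, 0.3        # assumed natural frequencies and coupling

def coupled(t, phi):
    p1, p2 = phi
    return [omega1 + k * np.sin(p2 - p1),  # percept-1 process pulled toward percept 2
            omega2 + k * np.sin(p1 - p2)]  # and vice versa

sol = solve_ivp(coupled, (0.0, 100.0), [0.0, 2.5], max_step=0.05)
phase_lag = np.mod(sol.y[1] - sol.y[0], 2 * np.pi)
print(f"final phase lag: {phase_lag[-1]:.3f} rad (locked if roughly constant)")
```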
