Advances in Neurotechnology and Brain-Robot Interfaces

A special issue of Bioengineering (ISSN 2306-5354). This special issue belongs to the section "Biosignal Processing".

Deadline for manuscript submissions: closed (15 October 2023) | Viewed by 4330

Special Issue Editors


Guest Editor
Centre for Advanced Computational Science, Manchester Metropolitan University, John R. Dalton Building, Manchester M15 6BH, UK
Interests: brain–computer interfaces; neuromorphic computing; neurotechnology; rehabilitation robotics; brain–robot interfaces

Guest Editor
SINAPSE, National University of Singapore (NUS), Singapore, Singapore
Interests: brain-machine interfaces; EEG signal processing; upper-limb prostheses; rehabilitation robotics

Special Issue Information

Dear Colleagues,

In recent years, we have witnessed remarkable advances in brain–robot interfaces and in the field of neurotechnology more generally. In particular, considerable progress has been made in designing sensory-enabled prosthetic devices for amputees, brain interfaces that decode thoughts into speech, novel solutions to treat and rehabilitate stroke patients, and neurotechnology solutions to monitor and treat neurological disorders remotely. These advances could chart a route toward a new generation of portable, closed-loop, and intelligent prosthetic and medical devices. This Special Issue on “Advances in Neurotechnology and Brain-Robot Interfaces” will therefore primarily focus on original research papers and comprehensive reviews covering brain-signal-based control of robotic systems, rehabilitation robotics, neuroprosthetics, and neurotechnology devices to restore, monitor, diagnose, or treat neurofunctions and neurological conditions. Additional topics of interest include (but are not limited to):

  • Neuroprosthetic devices;
  • Rehabilitation robotics;
  • Brain–computer interfaces;
  • Novel medical devices and neurotechnology solutions/methods for the monitoring, diagnosis, and rehabilitation of neurological functions;
  • Brain-signal-based control of robotic systems, i.e., brain–robot interfaces.

Dr. Zied Tayeb
Dr. Andrei Dragomir
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Bioengineering is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2700 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • brain–robot interfaces
  • neurotechnology
  • rehabilitation robotics
  • prosthetic devices

Published Papers (4 papers)


Research

20 pages, 3275 KiB  
Article
Classification of EEG Signals Based on Sparrow Search Algorithm-Deep Belief Network for Brain-Computer Interface
by Shuai Wang, Zhiguo Luo, Shaokai Zhao, Qilong Zhang, Guangrong Liu, Dongyue Wu, Erwei Yin and Chao Chen
Bioengineering 2024, 11(1), 30; https://doi.org/10.3390/bioengineering11010030 - 27 Dec 2023
Cited by 1 | Viewed by 1051
Abstract
In brain–computer interface (BCI) systems, the recognition of motor imagery (MI) brain signals remains challenging. Established recognition approaches achieve favorable performance on patterns such as SSVEP, AEP, and P300, whereas classification methods for MI still need improvement. Hence, a classification method that exhibits high accuracy and robustness is essential for MI-BCI systems. In this study, a Sparrow Search Algorithm (SSA)-optimized Deep Belief Network (DBN), called SSA-DBN, is designed to recognize EEG features extracted by Empirical Mode Decomposition (EMD). The performance of the DBN is enhanced by hyper-parameters optimized through the SSA. Our method’s efficacy was tested on three datasets: two public and one private. Results indicate a relatively high accuracy rate, outperforming three baseline methods. Specifically, on the private dataset, our approach achieved an accuracy of 87.83%, a 10.38% improvement over the standard DBN algorithm. On the BCI IV 2a dataset, we recorded an accuracy of 86.14%, surpassing the DBN algorithm by 9.33%. On the SMR-BCI dataset, our method attained a classification accuracy of 87.21%, which is 5.57% higher than that of the conventional DBN algorithm. This study demonstrates enhanced classification capabilities in MI-BCI, potentially contributing to advancements in the field of BCI. Full article
(This article belongs to the Special Issue Advances in Neurotechnology and Brain-Robot Interfaces)
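The optimizer at the heart of SSA-DBN can be illustrated with a compact sparrow search sketch. This is not the authors' implementation: the producer/scrounger update rules are simplified, the population and iteration counts are arbitrary, and a toy sphere objective stands in for the DBN's validation error over its hyper-parameters.

```python
import numpy as np

def sparrow_search(objective, dim, n=30, iters=100, lb=-5.0, ub=5.0, seed=0):
    """Simplified Sparrow Search Algorithm (SSA) minimizer.

    Producers explore the search space; scroungers follow the current
    best. In the paper's setting the position vector would encode DBN
    hyper-parameters; here it is a generic real vector.
    """
    rng = np.random.default_rng(seed)
    X = rng.uniform(lb, ub, size=(n, dim))          # population positions
    fit = np.apply_along_axis(objective, 1, X)
    n_prod = max(1, n // 5)                         # ~20% producers
    for _ in range(iters):
        order = np.argsort(fit)                     # rank by fitness
        X, fit = X[order], fit[order]
        best, worst = X[0].copy(), X[-1].copy()
        # Producers: exponential shrink, or a random step when "alarmed"
        for i in range(n_prod):
            if rng.random() < 0.8:
                X[i] = X[i] * np.exp(-i / (rng.random() * iters + 1e-9))
            else:
                X[i] = X[i] + rng.normal(size=dim)
        # Scroungers: follow the best producer, or flee the worst region
        for i in range(n_prod, n):
            if i > n // 2:
                X[i] = rng.normal() * np.exp((worst - X[i]) / (i ** 2 + 1))
            else:
                X[i] = best + np.abs(X[i] - best) * rng.choice([-1.0, 1.0])
        X = np.clip(X, lb, ub)
        fit = np.apply_along_axis(objective, 1, X)
    i_best = int(np.argmin(fit))
    return X[i_best], float(fit[i_best])

# Toy stand-in for DBN validation error: sphere function, minimum at 0
x_best, f_best = sparrow_search(lambda x: float(np.sum(x ** 2)), dim=3)
```

In the full method, `objective` would train a DBN with the candidate hyper-parameters on EMD-extracted EEG features and return its validation error.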

15 pages, 2077 KiB  
Article
An Improved Canonical Correlation Analysis for EEG Inter-Band Correlation Extraction
by Zishan Wang, Ruqiang Huang, Ye Yan, Zhiguo Luo, Shaokai Zhao, Bei Wang, Jing Jin, Liang Xie and Erwei Yin
Bioengineering 2023, 10(10), 1200; https://doi.org/10.3390/bioengineering10101200 - 16 Oct 2023
Viewed by 1095
Abstract
(1) Background: Emotion recognition based on EEG signals is a rapidly growing and promising research field in affective computing. However, traditional methods have focused on single-channel features that reflect time-domain or frequency-domain information of the EEG, as well as bi-channel features that reveal channel-wise relationships across brain regions. Despite these efforts, the mechanism of mutual interactions between EEG rhythms under different emotional expressions remains largely unexplored. Currently, the primary form of information interaction between EEG rhythms is phase–amplitude coupling (PAC), which results in computational complexity and high computational cost. (2) Methods: To address this issue, we proposed a method of extracting inter-bands correlation (IBC) features via canonical correlation analysis (CCA) based on differential entropy (DE) features. This approach eliminates the need for surrogate testing and reduces computational complexity. (3) Results: Our experiments verified the effectiveness of IBC features through several tests, demonstrating that the more correlated features between EEG frequency bands contribute more to emotion classification accuracy. We then fused IBC features and traditional DE features at the decision level, which significantly improved the accuracy of emotion recognition on the SEED dataset and the local CUMULATE dataset compared to using a single feature alone. (4) Conclusions: These findings suggest that IBC features are a promising approach to promoting emotion recognition accuracy. By exploring the mutual interactions between EEG rhythms under different emotional expressions, our method can provide valuable insights into the underlying mechanisms of emotion processing and improve the performance of emotion recognition systems. Full article
(This article belongs to the Special Issue Advances in Neurotechnology and Brain-Robot Interfaces)
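The two building blocks of the IBC pipeline, differential entropy features and canonical correlation between bands, can be sketched in a few lines. The Gaussian DE formula is standard; the CCA here is a plain covariance-whitening implementation, and the synthetic "alpha"/"beta" features are fabricated for illustration, not data from the paper.

```python
import numpy as np

def differential_entropy(band_signal):
    """DE of an approximately Gaussian band-passed signal:
    0.5 * log(2*pi*e*var), one value per channel (last axis = time)."""
    var = band_signal.var(axis=-1)
    return 0.5 * np.log(2 * np.pi * np.e * var)

def canonical_correlations(X, Y):
    """Canonical correlations between two feature matrices (trials x dims),
    via SVD of the whitened cross-covariance."""
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    n = X.shape[0]
    Sxx = X.T @ X / n + 1e-9 * np.eye(X.shape[1])   # regularized covariances
    Syy = Y.T @ Y / n + 1e-9 * np.eye(Y.shape[1])
    Sxy = X.T @ Y / n
    def inv_sqrt(S):
        w, V = np.linalg.eigh(S)
        return V @ np.diag(1.0 / np.sqrt(w)) @ V.T
    M = inv_sqrt(Sxx) @ Sxy @ inv_sqrt(Syy)
    return np.linalg.svd(M, compute_uv=False)       # descending correlations

# Toy example: DE features for two "bands" over 200 trials and 4 channels,
# with the second band partially driven by the first
rng = np.random.default_rng(1)
de_alpha = rng.normal(size=(200, 4))
de_beta = 0.7 * de_alpha + 0.3 * rng.normal(size=(200, 4))
rho = canonical_correlations(de_alpha, de_beta)
```

The leading canonical correlation `rho[0]` quantifies the strongest linear coupling between the two bands' DE features; the paper uses such correlations as IBC features, fused with plain DE features at the decision level.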

17 pages, 7957 KiB  
Article
A Novel Asynchronous Brain Signals-Based Driver–Vehicle Interface for Brain-Controlled Vehicles
by Jinling Lian, Yanli Guo, Xin Qiao, Changyong Wang and Luzheng Bi
Bioengineering 2023, 10(9), 1105; https://doi.org/10.3390/bioengineering10091105 - 21 Sep 2023
Viewed by 815
Abstract
Directly applying brain signals to operate a mobile manned platform, such as a vehicle, may help people with neuromuscular disorders regain their driving ability. In this paper, we developed a novel electroencephalogram (EEG) signal-based driver–vehicle interface (DVI) for the continuous and asynchronous control of brain-controlled vehicles. The proposed DVI consists of the user interface, the command decoding algorithm, and the control model. The user interface is designed to present the control commands and induce the corresponding brain patterns. The command decoding algorithm is developed to decode the control command. The control model is built to convert the decoded commands to control signals. Offline experimental results show that the developed DVI can generate a motion control command with an accuracy of 83.59% and a detection time of about 2 s, while it has a recognition accuracy of 90.06% in idle states. A real-time brain-controlled simulated vehicle based on the DVI was developed and tested on a U-turn road. Experimental results show the feasibility of the DVI for continuously and asynchronously controlling a vehicle. This work not only advances the research on brain-controlled vehicles but also provides valuable insights into driver–vehicle interfaces, multimodal interaction, and intelligent vehicles. Full article
(This article belongs to the Special Issue Advances in Neurotechnology and Brain-Robot Interfaces)
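The asynchronous aspect of such a DVI, emitting a command only when the decoder is confident and otherwise reporting an idle state, can be sketched as a simple vote-and-hold rule. The command set, threshold, and hold length below are illustrative assumptions, not the paper's decoding algorithm.

```python
from collections import deque

COMMANDS = ("left", "right", "accelerate", "brake")

def asynchronous_decoder(prob_stream, threshold=0.6, hold=4):
    """Turn per-window class probabilities into asynchronous commands.

    A command fires only after `hold` consecutive confident windows agree,
    trading latency (e.g. a ~2 s detection time for 0.5 s windows) against
    false activations during idle states.
    """
    recent = deque(maxlen=hold)
    out = []
    for probs in prob_stream:
        k = max(range(len(probs)), key=probs.__getitem__)
        recent.append(k if probs[k] >= threshold else None)
        if len(recent) == hold and None not in recent and len(set(recent)) == 1:
            out.append(COMMANDS[k])
            recent.clear()        # refractory period: require fresh evidence
        else:
            out.append("idle")
    return out

# Six consecutive confident "left" windows: the command fires once,
# after the fourth window; low-confidence windows always map to idle.
decoded = asynchronous_decoder([[0.9, 0.05, 0.03, 0.02]] * 6)
```

Downstream, a control model (as in the paper) would convert each fired command into continuous steering or speed signals.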

20 pages, 5320 KiB  
Article
REM Sleep Stage Identification with Raw Single-Channel EEG
by Gabriel Toban, Khem Poudel and Don Hong
Bioengineering 2023, 10(9), 1074; https://doi.org/10.3390/bioengineering10091074 - 11 Sep 2023
Viewed by 951
Abstract
This paper focused on creating an interpretable model for automatic rapid eye movement (REM) and non-REM sleep stage scoring for a single-channel electroencephalogram (EEG). Many methods attempt to extract meaningful information to provide to a learning algorithm. This method attempts to let the model extract the meaningful interpretable information by providing a smaller number of time-invariant signal filters for five frequency ranges using five CNN algorithms. A bi-directional GRU algorithm was applied to the output to incorporate time transition information. Training and tests were run on the well-known sleep-EDF-expanded database. The best results produced 97% accuracy, 93% precision, and 89% recall. Full article
(This article belongs to the Special Issue Advances in Neurotechnology and Brain-Robot Interfaces)
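The five-band front end that feeds the per-band CNNs can be sketched with a naive FFT band-splitter. In the paper the CNNs learn time-invariant filters for these ranges; the brick-wall mask and the exact band edges below are conventional assumptions for illustration.

```python
import numpy as np

# Five conventional EEG frequency ranges (Hz); not necessarily the
# paper's exact cut-offs.
SLEEP_BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13),
               "sigma": (13, 16), "beta": (16, 30)}

def split_bands(eeg, fs):
    """Split a raw single-channel EEG epoch into five band-limited
    signals via an ideal (brick-wall) FFT mask."""
    spectrum = np.fft.rfft(eeg)
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    bands = {}
    for name, (lo, hi) in SLEEP_BANDS.items():
        mask = (freqs >= lo) & (freqs < hi)
        bands[name] = np.fft.irfft(spectrum * mask, n=len(eeg))
    return bands

# A 10 Hz test tone over a 30 s epoch lands almost entirely in alpha
fs = 100
t = np.arange(30 * fs) / fs
bands = split_bands(np.sin(2 * np.pi * 10 * t), fs)
```

Each band-limited signal would then go to its own CNN, with a bi-directional GRU over the per-epoch outputs to capture sleep-stage transition structure, as the abstract describes.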
