Emerging Trends of Biomedical Signal Processing in Intelligent Emotion Recognition

A special issue of Brain Sciences (ISSN 2076-3425). This special issue belongs to the section "Computational Neuroscience and Neuroinformatics".

Deadline for manuscript submissions: closed (1 February 2024) | Viewed by 10030

Special Issue Editor

Dr. Ateke Goshvarpour
Guest Editor
Department of Biomedical Engineering, Imam Reza International University, Mashhad, Razavi Khorasan, Iran
Interests: biomedical signal processing; electroencephalography (EEG); nonlinear dynamics; affect recognition; computational neuroscience

Special Issue Information

Dear Colleagues,

Emotions play a central role in human interaction, decision making, comprehension, attention, and memory. The study of the behavioral, physiological, and psychological manifestations of human emotions, with the aim of enabling computers to respond in human-like emotional ways, has given rise to the field of "affective computing". The many potential applications of an interface capable of assessing human emotional states have made automatic emotion recognition an active research area, particularly within biomedical signal processing.

This Special Issue of Brain Sciences focuses on emerging biomedical signal processing and advanced machine learning methods for emotion recognition. Original research studies will be prioritized; however, cutting-edge reviews covering the state of the art, current limitations, and future directions are also welcome.

Authors may address a broad range of topics related to intelligent emotion recognition, including but not limited to:

  • Application of EEG, ECG, speech, and other biosignals in emotional state detection.
  • Signal pre-processing, feature extraction/engineering, feature reduction/selection, and channel selection approaches in affective computing applications (a minimal pipeline sketch follows this list).
  • Application of Artificial Intelligence (AI) and advanced AI methods in emotion recognition.
  • Machine learning methods/advanced machine learning/deep learning algorithms in emotion recognition applications.
  • Fusion algorithms (data-, feature-, or decision-level) in emotion recognition applications.
  • Wearable devices for affective computing applications utilizing biosignals.
  • Applications of such systems in affect-related disorders such as stress, anxiety, phobia, pain, attention deficit hyperactivity disorder (ADHD), autism spectrum disorder (ASD), and bipolar disorder.
  • Human–machine interface (HMI)- and brain–computer interface (BCI)-based affective computing systems.
  • Assistive devices based on emotion recognition.
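
To make the scope concrete, the following is a minimal, illustrative sketch of the kind of biosignal emotion-recognition pipeline these topics cover: band-pass filtering, band-power feature extraction, and a standard classifier. It is not tied to any particular submission; the sampling rate, data shapes, and parameter choices are assumptions for illustration only.

```python
# Illustrative sketch of a basic EEG emotion-recognition pipeline
# (pre-processing -> band-power features -> classification).
# Sampling rate, shapes, and parameters are assumptions, not a prescribed method.
import numpy as np
from scipy.signal import butter, filtfilt, welch
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

FS = 128  # sampling rate in Hz (assumed)
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}

def bandpass(x, low, high, fs=FS, order=4):
    """Zero-phase Butterworth band-pass filter along the last axis."""
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, x, axis=-1)

def band_power_features(trial):
    """trial: (channels, samples) -> concatenated log band powers."""
    feats = []
    for low, high in BANDS.values():
        filtered = bandpass(trial, low, high)
        freqs, psd = welch(filtered, fs=FS, nperseg=FS * 2, axis=-1)
        mask = (freqs >= low) & (freqs <= high)
        feats.append(np.log(psd[:, mask].mean(axis=-1) + 1e-12))
    return np.concatenate(feats)

# Synthetic placeholder data: 40 trials, 32 channels, 8 s at 128 Hz.
rng = np.random.default_rng(0)
X_raw = rng.standard_normal((40, 32, 8 * FS))
y = rng.integers(0, 2, size=40)  # binary valence labels (placeholder)

X = np.array([band_power_features(t) for t in X_raw])
print(cross_val_score(SVC(kernel="rbf", C=1.0), X, y, cv=5).mean())
```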

Dr. Ateke Goshvarpour
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com after registering and logging in to the website; the submission form is available once registration is complete. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for manuscript submission are available on the Instructions for Authors page. Brain Sciences is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2200 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • biomedical signal processing
  • biomedical data analysis
  • artificial intelligence and deep learning
  • fusion algorithms
  • emotion recognition
  • affect-related disorders
  • brain–computer interface
  • human–machine interface

Published Papers (6 papers)

Research

19 pages, 2507 KiB  
Article
Subject-Independent Emotion Recognition Based on EEG Frequency Band Features and Self-Adaptive Graph Construction
by Jinhao Zhang, Yanrong Hao, Xin Wen, Chenchen Zhang, Haojie Deng, Juanjuan Zhao and Rui Cao
Brain Sci. 2024, 14(3), 271; https://doi.org/10.3390/brainsci14030271 - 12 Mar 2024
Viewed by 981
Abstract
Emotion is one of the most important higher cognitive functions of the human brain and plays an important role in transaction processing and decision making. In traditional emotion recognition studies, frequency band features of EEG signals have been shown to correlate strongly with emotion production. However, traditional emotion recognition methods cannot satisfactorily handle individual differences between subjects and the heterogeneity of EEG data, so subject-independent emotion recognition based on EEG signals has attracted extensive attention from researchers. In this paper, we propose a subject-independent emotion recognition model based on adaptive extraction of a frequency band-based layer structure (BFE-Net), which adaptively extracts EEG map features through a multi-graphic layer construction module to obtain a frequency band-based multi-graphic layer emotion representation. To evaluate the performance of the model for subject-independent emotion recognition, extensive experiments are conducted on two public datasets, SEED and SEED-IV. The experimental results show that, in most experimental settings, our model outperforms existing studies of the same type. In addition, visualization of the brain connectivity patterns reveals that some of the findings are consistent with previous neuroscientific results, further validating the model for subject-independent emotion recognition.
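
As a rough illustration of the general idea of self-adaptive graph construction over frequency-band features (not the authors' BFE-Net code, which is not reproduced here), the sketch below builds one learnable adjacency matrix per frequency band and applies a simple graph-convolution step to per-channel band features. All layer sizes and shapes are assumptions.

```python
# Rough sketch: one learnable ("self-adaptive") adjacency per frequency band,
# applied as a simple graph convolution over per-channel band features.
# This illustrates the general idea only, not the published BFE-Net model.
import torch
import torch.nn as nn

class BandAdaptiveGraph(nn.Module):
    def __init__(self, n_channels=62, n_bands=5, in_dim=1, hidden=16, n_classes=3):
        super().__init__()
        # One freely learnable adjacency matrix per band.
        self.adj = nn.Parameter(torch.randn(n_bands, n_channels, n_channels) * 0.01)
        self.proj = nn.Linear(in_dim, hidden)
        self.head = nn.Linear(n_bands * n_channels * hidden, n_classes)

    def forward(self, x):
        # x: (batch, bands, channels, in_dim), e.g. one feature per band and channel.
        h = self.proj(x)                                  # (B, bands, C, hidden)
        a = torch.softmax(self.adj, dim=-1)               # row-normalised adjacency
        h = torch.einsum("fij,bfjh->bfih", a, h)          # per-band graph aggregation
        return self.head(torch.relu(h).flatten(1))        # class logits

model = BandAdaptiveGraph()
dummy = torch.randn(8, 5, 62, 1)   # 8 trials, 5 bands, 62 channels, 1 feature
print(model(dummy).shape)          # torch.Size([8, 3])
```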

15 pages, 3691 KiB  
Article
Emotion Classification Based on Transformer and CNN for EEG Spatial–Temporal Feature Learning
by Xiuzhen Yao, Tianwen Li, Peng Ding, Fan Wang, Lei Zhao, Anmin Gong, Wenya Nan and Yunfa Fu
Brain Sci. 2024, 14(3), 268; https://doi.org/10.3390/brainsci14030268 - 11 Mar 2024
Viewed by 1121
Abstract
Objectives: The temporal and spatial information in electroencephalogram (EEG) signals is crucial for feature recognition in emotion classification models, but its use has relied heavily on manual feature extraction. The transformer model is capable of automatic feature extraction; however, its potential has not been fully explored in the classification of emotion-related EEG signals. To address these challenges, the present study proposes a novel model based on transformer and convolutional neural networks (TCNN) for EEG spatial–temporal (EEG ST) feature learning for automatic emotion classification. Methods: The proposed EEG ST-TCNN model utilizes positional encoding (PE) and multi-head attention to perceive channel positions and timing information in EEG signals. Two parallel transformer encoders in the model are used to extract spatial and temporal features from emotion-related EEG signals, and a CNN is used to aggregate the EEG's spatial and temporal features, which are subsequently classified using Softmax. Results: The proposed EEG ST-TCNN model achieved an accuracy of 96.67% on the SEED dataset and accuracies of 95.73%, 96.95%, and 96.34% for the arousal–valence, arousal, and valence dimensions, respectively, on the DEAP dataset. Conclusions: The results demonstrate the effectiveness of the proposed ST-TCNN model, with superior performance in emotion classification compared to recent relevant studies. Significance: The proposed EEG ST-TCNN model has the potential to be used for EEG-based automatic emotion recognition.
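
The following is a minimal sketch of the general pattern described in this abstract, with one transformer encoder attending over channels (spatial), one over time steps (temporal), and a small CNN aggregating both before classification. It is not the published ST-TCNN implementation; all dimensions and layer choices are assumptions for illustration.

```python
# Minimal sketch of a parallel spatial/temporal transformer + CNN classifier.
# Not the published ST-TCNN; dimensions and layers are illustrative assumptions.
import torch
import torch.nn as nn

class STTransformerCNN(nn.Module):
    def __init__(self, n_channels=32, n_times=128, d_model=64, n_classes=2):
        super().__init__()
        self.spatial_in = nn.Linear(n_times, d_model)       # each channel -> token
        self.temporal_in = nn.Linear(n_channels, d_model)   # each time step -> token
        enc = lambda: nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True), num_layers=2)
        self.spatial_enc, self.temporal_enc = enc(), enc()
        self.pos_s = nn.Parameter(torch.zeros(1, n_channels, d_model))  # learned position encoding
        self.pos_t = nn.Parameter(torch.zeros(1, n_times, d_model))
        self.cnn = nn.Sequential(nn.Conv1d(2, 8, kernel_size=3, padding=1),
                                 nn.ReLU(), nn.AdaptiveAvgPool1d(16))
        self.head = nn.Linear(8 * 16, n_classes)

    def forward(self, x):                     # x: (batch, channels, times)
        s = self.spatial_enc(self.spatial_in(x) + self.pos_s)                    # (B, C, d)
        t = self.temporal_enc(self.temporal_in(x.transpose(1, 2)) + self.pos_t)  # (B, T, d)
        fused = torch.stack([s.mean(1), t.mean(1)], dim=1)                       # (B, 2, d)
        return self.head(self.cnn(fused).flatten(1))       # logits for softmax/cross-entropy

model = STTransformerCNN()
print(model(torch.randn(4, 32, 128)).shape)   # torch.Size([4, 2])
```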

17 pages, 1590 KiB  
Article
Emotion Recognition Using a Novel Granger Causality Quantifier and Combined Electrodes of EEG
by Atefeh Goshvarpour and Ateke Goshvarpour
Brain Sci. 2023, 13(5), 759; https://doi.org/10.3390/brainsci13050759 - 04 May 2023
Cited by 1 | Viewed by 1601
Abstract
Electroencephalogram (EEG) connectivity patterns can reflect neural correlates of emotion. However, the need to evaluate bulky multi-channel measurement data increases the computational cost of the EEG network. To date, several approaches have been presented to select the optimal cerebral channels, mostly depending on the available data. Consequently, reducing the number of channels increases the risk of low data stability and reliability. Alternatively, this study suggests an electrode combination approach in which the brain is divided into six areas. After extracting the EEG frequency bands, an innovative Granger causality-based measure was introduced to quantify brain connectivity patterns. The feature was subsequently subjected to a classification module to recognize valence–arousal dimensional emotions. The Database for Emotion Analysis Using Physiological Signals (DEAP) was used as a benchmark to evaluate the scheme. The experimental results revealed a maximum accuracy of 89.55%. Additionally, EEG-based connectivity in the beta frequency band was able to classify dimensional emotions effectively. In sum, combined EEG electrodes can efficiently replicate 32-channel EEG information.
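
The paper's own Granger causality quantifier and electrode-combination scheme are not reproduced here; as a rough, assumption-laden illustration of the classical idea it builds on, the sketch below compares the residual variance of an autoregressive model of one region-averaged signal with and without the past of another signal, using plain least squares.

```python
# Rough sketch of a classical bivariate, linear Granger-causality measure
# between two region-averaged EEG signals; illustrative only, not the paper's
# own quantifier or its electrode-combination scheme.
import numpy as np

def lagged_matrix(x, order):
    """Stack the `order` most recent past values of x for each time step."""
    return np.column_stack([x[order - k - 1: len(x) - k - 1] for k in range(order)])

def granger_xy(x, y, order=5):
    """Log ratio of residual variances: how much the past of x helps predict y."""
    target = y[order:]
    own_past = lagged_matrix(y, order)                            # restricted model
    joint_past = np.hstack([own_past, lagged_matrix(x, order)])   # full model
    res_r = target - own_past @ np.linalg.lstsq(own_past, target, rcond=None)[0]
    res_f = target - joint_past @ np.linalg.lstsq(joint_past, target, rcond=None)[0]
    return np.log(res_r.var() / res_f.var())

# Toy example: signal b is partly driven by the past of signal a.
rng = np.random.default_rng(1)
a = rng.standard_normal(2000)
b = 0.6 * np.roll(a, 2) + 0.4 * rng.standard_normal(2000)
print(granger_xy(a, b))   # clearly > 0: past of a predicts b
print(granger_xy(b, a))   # close to 0: past of b does not predict a
```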

17 pages, 2959 KiB  
Article
Emotion Recognition from Spatio-Temporal Representation of EEG Signals via 3D-CNN with Ensemble Learning Techniques
by Rajamanickam Yuvaraj, Arapan Baranwal, A. Amalin Prince, M. Murugappan and Javeed Shaikh Mohammed
Brain Sci. 2023, 13(4), 685; https://doi.org/10.3390/brainsci13040685 - 19 Apr 2023
Cited by 3 | Viewed by 2110
Abstract
The recognition of emotions is one of the most challenging issues in human–computer interaction (HCI). EEG signals are widely adopted for recognizing emotions because of their ease of acquisition, mobility, and convenience. Deep neural networks (DNNs) have provided excellent results in emotion recognition studies. Most studies, however, use other methods to extract handcrafted features, such as the Pearson correlation coefficient (PCC), principal component analysis, and the Higuchi fractal dimension (HFD), even though DNNs are capable of generating meaningful features. Furthermore, most earlier studies largely ignored spatial information between the different channels, focusing mainly on time-domain and frequency-domain representations. This study utilizes a pre-trained 3D-CNN MobileNet model with transfer learning on the spatio-temporal representation of EEG signals to extract features for emotion recognition. In addition to fully connected layers, hybrid models were explored using other decision layers such as the multilayer perceptron (MLP), k-nearest neighbor (KNN), extreme learning machine (ELM), XGBoost (XGB), random forest (RF), and support vector machine (SVM). Additionally, this study investigates the effects of post-processing or filtering of output labels. Extensive experiments were conducted on the SJTU Emotion EEG Dataset (SEED) (three classes) and SEED-IV (four classes) datasets, and the results obtained were comparable to the state of the art. Based on the conventional 3D-CNN with an ELM classifier, the SEED and SEED-IV datasets showed maximum accuracies of 89.18% and 81.60%, respectively. Post-filtering improved the emotion classification performance of the hybrid 3D-CNN with ELM model to 90.85% and 83.71% for the SEED and SEED-IV datasets, respectively. Accordingly, spatio-temporal features extracted from the EEG, along with ensemble classifiers, were found to be the most effective in recognizing emotions compared to state-of-the-art methods.
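
The extreme learning machine (ELM) decision layer mentioned above is simple enough to sketch. Below is a generic, minimal ELM classifier (a fixed random hidden layer plus an output layer fitted in closed form with the Moore-Penrose pseudo-inverse) operating on pre-extracted feature vectors. It is illustrative only, not the authors' pipeline; the feature dimensions and toy data are assumptions.

```python
# Minimal generic extreme learning machine (ELM) classifier: a fixed random
# hidden layer followed by an output layer fitted in closed form with the
# Moore-Penrose pseudo-inverse. Illustrative only; feature shapes are assumed.
import numpy as np

class ELMClassifier:
    def __init__(self, n_hidden=256, seed=0):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(seed)

    def fit(self, X, y):
        n_classes = int(y.max()) + 1
        self.W = self.rng.standard_normal((X.shape[1], self.n_hidden))
        self.b = self.rng.standard_normal(self.n_hidden)
        H = np.tanh(X @ self.W + self.b)                   # random hidden activations
        T = np.eye(n_classes)[y]                           # one-hot targets
        self.beta = np.linalg.pinv(H) @ T                  # closed-form output weights
        return self

    def predict(self, X):
        return np.argmax(np.tanh(X @ self.W + self.b) @ self.beta, axis=1)

# Toy usage on placeholder deep features (e.g. one embedding vector per trial).
rng = np.random.default_rng(2)
X_train, y_train = rng.standard_normal((300, 128)), rng.integers(0, 3, 300)
clf = ELMClassifier().fit(X_train, y_train)
print((clf.predict(X_train) == y_train).mean())   # training accuracy on the toy data
```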

Review

20 pages, 2785 KiB  
Review
Systematic Review and Future Direction of Neuro-Tourism Research
by Abeer Al-Nafjan, Mashael Aldayel and Amira Kharrat
Brain Sci. 2023, 13(4), 682; https://doi.org/10.3390/brainsci13040682 - 19 Apr 2023
Cited by 5 | Viewed by 2478
Abstract
Neuro-tourism is the application of neuroscience in tourism to improve the marketing methods of the tourism industry by analyzing the brain activity of tourists. Neuro-tourism provides accurate real-time data on tourists' conscious and unconscious emotions. Neuro-tourism uses the methods of neuromarketing, such as brain–computer interfaces (BCI), eye tracking, and galvanic skin response, to create tourism goods and services that improve tourist experience and satisfaction. Due to the novelty of neuro-tourism and the dearth of studies on this subject, this study offers a comprehensive analysis of the peer-reviewed journal publications in neuro-tourism research over the previous 12 years to detect trends in this field and provide insights for academics. We reviewed 52 articles indexed in the Web of Science (WoS) core collection database and examined them using our suggested classification schema. The results reveal a large growth in the number of published articles on neuro-tourism, demonstrating a rise in the relevance of this field. Additionally, the findings indicate a lack of integration of artificial intelligence techniques in neuro-tourism studies. We believe that advancements in technology and research collaboration will facilitate exponential growth in this field.

Other

51 pages, 775 KiB  
Systematic Review
Review of EEG Affective Recognition with a Neuroscience Perspective
by Rosary Yuting Lim, Wai-Cheong Lincoln Lew and Kai Keng Ang
Brain Sci. 2024, 14(4), 364; https://doi.org/10.3390/brainsci14040364 - 08 Apr 2024
Viewed by 723
Abstract
Emotions are a series of subconscious, fleeting, and sometimes elusive manifestations of the human innate system. They play crucial roles in everyday life, influencing the way we evaluate ourselves and our surroundings and how we interact with our world. To date, there has been an abundance of research in the domains of neuroscience and affective computing, with experimental evidence and neural network models, respectively, to elucidate the neural circuitry involved in and the neural correlates of emotion recognition. Recent affective computing neural network models often relate closely to evidence and perspectives gathered from neuroscience to explain the models. Specifically, there has been growing interest in the area of EEG-based emotion recognition in adopting models based on the neural underpinnings of the processing, generation, and subsequent collection of EEG data. In this respect, our review focuses on providing neuroscientific evidence and perspectives to discuss how emotions potentially arise as the product of neural activities occurring at the level of subcortical structures within the brain's emotional circuitry, and on the association with current affective computing models for recognizing emotions. Furthermore, we discuss whether such biologically inspired modeling is the solution to advancing the field of EEG-based emotion recognition and beyond.