Identifying Complex Emotions in Alexithymia Affected Adolescents Using Machine Learning Techniques
Abstract
1. Introduction
1.1. Objective and Problem Statement
1.2. Related Works
2. Materials and Methods
2.1. Sampling and Data Collection
2.2. Questionnaire
2.3. Statistical Analysis
2.4. EEG and EEG Data
2.5. Methodology
2.6. Evaluation
1. For pictorial stimuli, we used natural scenery pictures representing a neutral scene, together with pictures designed to induce emotions of two different categories (pleasure and grief), shown with the corresponding color schemes in Figure 4.
2. For audio stimuli, participants heard eight pieces of music for emotional induction, each lasting 48 s (i.e., 4 for pleasure and 4 for grief). For every pleasure interval, very pleasant native music was played through the headphones, which helped the participants recall good memories. In addition, we used manipulated war and military sounds, unpleasant stimuli played in the next interval to elicit disgust and portray the unstable situation in Lebanon. Between the two emotional stimuli in the audio sessions, we presented a neutral condition: a faded colored scene with no sound.
3. For video stimuli, participants were comfortably seated in front of a screen on which stimulation video files with different emotional content were played. Between the emotional stimulations in the video sessions, we also played neutral videos with a light color scheme and low music.
4. For text stimuli, participants were asked to read self-emotion stimulus texts in four time intervals. The texts were categorized into three types of statement: pleasing, grief, and neutral, as tabulated in Table 2. An EEG cap was worn in all four stimulus sessions to record the spiking activity of neurons in the brain.
Pleasing Statement | Grief Statement | Neutral Statement |
---|---|---|
He rescued me because he delighted in me | We can’t regain our strength | Homes are built by bricks |
Like an apple tree, among the trees in the forest, is my love among you | He doesn’t have to commute to work | Parrots are green |
Discipline your son, he will bring delight to you | We can satisfy their need | The Beqaa valley is home to Lebanon's famous vineyards and wineries |
Love always perseveres | Hereafter, we want to be able to cook food in our homes | The Lebanon arches at the rivers of Anjar are good. |
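The EEG recorded during these sessions is later analyzed in short windows (1 s in the results section). As a hedged illustration, splitting a continuous multichannel recording into non-overlapping 1 s epochs might look like the sketch below; the channel count, sampling rate, and duration are assumptions, not the study's exact acquisition settings.

```python
# Split a continuous multichannel EEG array into non-overlapping 1 s
# epochs. Channel count, sampling rate, and duration are illustrative.
import numpy as np

fs = 256                      # assumed sampling rate (Hz)
n_channels, n_seconds = 62, 10
eeg = np.random.default_rng(0).normal(size=(n_channels, n_seconds * fs))

n_epochs = eeg.shape[1] // fs
# reshape keeps each channel's samples contiguous, one row per second
epochs = eeg[:, :n_epochs * fs].reshape(n_channels, n_epochs, fs)
epochs = epochs.transpose(1, 0, 2)   # -> (epoch, channel, sample)
print(epochs.shape)                  # (10, 62, 256)
```

Each epoch then yields one feature vector for the classifier described in the results.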
2.7. Experimental Analysis
3. Results
- To determine whether the two distinct emotional states can be rigorously separated in a categorized way using time-frequency/FAA and correlation features extracted within 1 s windows, a machine learning approach is investigated.
- The two complex emotions, pleasure and grief, are identified through the common highest-ranked features obtained by feature selection distance discriminant (FSDD), which identifies the necessary spectral and spatial properties.
- The subjective emotion experienced is explored by self-evaluation (i.e., a questionnaire) after each multimodal visual aid, which excludes subject-dependent classification.
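The classification step described above can be sketched as follows. This is a minimal illustration with synthetic data standing in for the real time-frequency/FAA feature vectors; the SVM kernel and hyperparameters are assumptions, not the paper's exact configuration.

```python
# Binary emotion classification (pleasure vs. grief) from per-window
# feature vectors, evaluated with 5-fold cross-validation.
import numpy as np
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_windows, n_features = 200, 40          # 1 s windows x extracted features
X = rng.normal(size=(n_windows, n_features))
y = rng.integers(0, 2, size=n_windows)   # 0 = grief, 1 = pleasure
X[y == 1] += 0.5                         # inject a separable class effect

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
scores = cross_val_score(clf, X, y, cv=cv)
print(f"accuracy: {scores.mean():.2%} +/- {scores.std():.2%}")
```

Reporting the fold mean and standard deviation matches the way accuracies such as 80.20 +/- 2.92 are quoted in the results.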
3.1. Classification of Complex Emotion Using SVM
3.2. Feature Selection across Participants
3.3. Feedback Rating and Correlation with SVM Performance
4. Discussion
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Lang, P.J.; Greenwald, M.K.; Bradley, M.M.; Hamm, A.O. Looking at pictures: Affective, facial, visceral, and behavioral reactions. Psychophysiology 1993, 30, 261–273.
- Zhao, G.; Zhang, Y.; Ge, Y. Frontal EEG Asymmetry and Middle Line Power Difference in Discrete Emotions. Front. Behav. Neurosci. 2018, 12, 225.
- Anthony, D.; Balleine, B.; Kringelbach, M.; Berridge, K. Pleasures of the Brain. Brain Cogn. 2010, 52, 74–84.
- Chowdhury, N.H.; Reaz, M.B.I.; Haque, F.; Ahmad, S.; Ali, S.H.M.; A Bakar, A.A.; Bhuiyan, M.A.S. Performance Analysis of Conventional Machine Learning Algorithms for Identification of Chronic Kidney Disease in Type 1 Diabetes Mellitus Patients. Diagnostics 2021, 11, 2267.
- Al-Nafjan, A.; Hosny, M.; Al-Ohali, Y.; Al-Wabil, A. Review and Classification of Emotion Recognition Based on EEG Brain-Computer Interface System Research: A Systematic Review. Appl. Sci. 2017, 7, 1239.
- Ali, S.I.; Ali, B.; Hussain, J.; Hussain, M.; Satti, F.A.; Park, G.H.; Lee, S. Cost-Sensitive Ensemble Feature Ranking and Automatic Threshold Selection for Chronic Kidney Disease Diagnosis. Appl. Sci. 2020, 10, 5663.
- Cavaliere, C.; Vilades, E.; Alonso-Rodríguez, M.C.; Rodrigo, M.J.; Pablo, L.E.; Miguel, J.M.; López-Guillén, E.; Sánchez Morla, E.M.; Boquete, L.; Garcia-Martin, E. Computer-aided diagnosis of multiple sclerosis using a support vector machine and optical coherence tomography features. Sensors 2019, 19, 5323.
- Apgáua, L.T.; Jaeger, A. Memory for emotional information and alexithymia: A systematic review. Dement. Neuropsychol. 2019, 13, 22–30.
- Iannattone, S.; Miscioscia, M.; Raffagnato, A.; Gatta, M. The Role of Alexithymia in Social Withdrawal during Adolescence: A Case–Control Study. Children 2021, 8, 165.
- Sfeir, E.; Geara, C.; Hallit, S.; Obeid, S. Alexithymia, aggressive behavior and depression among Lebanese adolescents: A cross-sectional study. Child Adolesc. Psychiatry Ment. Health 2020, 14, 32.
- Bazgir, O.; Mohammadi, Z.; Habibi, S.A.H. Emotion recognition with machine learning using EEG signals. In Proceedings of the 2018 25th National and 3rd International Iranian Conference on Biomedical Engineering (ICBME), Qom, Iran, 29–30 November 2018; pp. 1–5.
- De Filippi, E.; Wolter, M.; Melo, B.R.P.; Tierra-Criollo, C.J.; Bortolini, T.; Deco, G.; Moll, J. Classification of Complex Emotions Using EEG and Virtual Environment: Proof of Concept and Therapeutic Implication. Front. Hum. Neurosci. 2021, 15, 711279.
- Li, S.; Lyu, X.; Zhao, L.; Chen, Z.; Gong, A.; Fu, Y. Identification of Emotion Using Electroencephalogram by Tunable Q-Factor Wavelet Transform and Binary Gray Wolf Optimization. Front. Comput. Neurosci. 2021, 15, 78.
- Doma, V.; Pirouz, M. A comparative analysis of machine learning methods for emotion recognition using EEG and peripheral physiological signals. J. Big Data 2020, 7, 18.
- Gannouni, S.; Aledaily, A.; Belwafi, K.; Aboalsamh, H. Emotion detection using electroencephalography signals and a zero-time windowing-based epoch estimation and relevant electrode identification. Sci. Rep. 2021, 11, 7071.
- Bagby, R.M.; Parker, J.D.A.; Taylor, G.J. The twenty-item Toronto Alexithymia Scale—I. Item selection and cross-validation of the factor structure. J. Psychosom. Res. 1994, 38, 23–32.
- Buss, A.H.; Perry, M. The aggression questionnaire. J. Personal. Soc. Psychol. 1992, 63, 452.
- Revah-Levy, A.; Birmaher, B.; Gasquet, I.; Falissard, B. The Adolescent Depression Rating Scale (ADRS): A validation study. BMC Psychiatry 2007, 7, 2.
- Love, J.; Selker, R.; Marsman, M.; Jamil, T.; Dropmann, D.; Verhagen, J.; Ly, A.; Gronau, Q.F.; Smíra, M.; Epskamp, S.; et al. JASP: Graphical Statistical Software for Common Statistical Designs. J. Stat. Softw. 2019, 88, 1–17.
- Zhang, L.; He, W.; Miao, X.; Yang, J. Dynamic EEG Analysis via the Variability of Band Relative Intensity Ratio: A Time-Frequency Method. IEEE 2005, 2005, 2664–2667.
- Niedermeyer, E. Electroencephalography: Basic Principles, Clinical Applications, and Related Fields, 4th ed.; Lippincott Williams & Wilkins: Philadelphia, PA, USA, 1999.
- Rangaswamy, M.; Porjesz, B.; Chorlian, D.B.; Wang, K.; Jones, K.A.; Bauer, L.O.; Rohrbaugh, J.; O’Connor, S.J.; Kuperman, S.; Reich, T.; et al. Beta power in the EEG of alcoholics. Biol. Psychiatry 2002, 52, 831–842.
- Nidal, K.; Malik, A.S. (Eds.) EEG/ERP Analysis: Methods and Applications; CRC Press: Boca Raton, FL, USA, 2014.
- Al-Fahoum, A.S.; Al-Fraihat, A.A. Methods of EEG Signal Features Extraction Using Linear Analysis in Frequency and Time-Frequency Domains. ISRN Neurosci. 2014, 2014, 730218.
- Cohen, M.X. A better way to define and describe Morlet wavelets for time-frequency analysis. NeuroImage 2019, 199, 81–86.
- Available online: https://releases.ubuntu.com/16.04/ (accessed on 20 October 2004).
- Available online: https://www.anaconda.com/ (accessed on 17 July 2012).
- Available online: https://www.python.org/downloads/ (accessed on 20 February 1991).
- Available online: https://spark.apache.org/docs/latest/api/python/ (accessed on 26 May 2014).
- Available online: https://octave.org/download#ms-windows (accessed on 2 October 2022).
- Available online: https://sccn.ucsd.edu/eeglab/download.php (accessed on 1 July 2014).
- Available online: https://umap-learn.readthedocs.io/en/latest/ (accessed on 2 October 2022).
- Peng, H.; Long, F.; Ding, C. Feature selection based on mutual information criteria of max-dependency, max-relevance, and min-redundancy. IEEE Trans. Pattern Anal. Mach. Intell. 2005, 27, 1226–1238.
- Buś, S.; Jędrzejewski, K.; Guzik, P. Using Minimum Redundancy Maximum Relevance Algorithm to Select Minimal Sets of Heart Rate Variability Parameters for Atrial Fibrillation Detection. J. Clin. Med. 2022, 11, 4004.
- Liang, J.; Yang, S.; Winstanley, A. Invariant optimal feature selection: A distance discriminant and feature ranking based solution. Pattern Recognit. 2008, 41, 1429–1439.
- Aftanas, L.I.; Reva, N.V.; Savotina, L.N.; Makhnev, V.P. Neurophysiological Correlates of Induced Discrete Emotions in Humans: An Individually Oriented Analysis. Neurosci. Behav. Physiol. 2006, 36, 119–130.
- Kuppens, P.; Stouten, J.; Mesquita, B. Individual differences in emotion components and dynamics: Introduction to the Special Issue. Cogn. Emot. 2009, 23, 1249–1258.
- Shu, L.; Xie, J.; Yang, M.; Li, Z.; Li, Z.; Liao, D.; Xu, X.; Yang, X. A Review of Emotion Recognition Using Physiological Signals. Sensors 2018, 18, 2074.
- Chao, H.; Dong, L.; Liu, Y.; Lu, B. Emotion Recognition from Multiband EEG Signals Using CapsNet. Sensors 2019, 19, 2212.
- Xing, X.; Li, Z.; Xu, T.; Shu, L.; Hu, B.; Xu, X. SAE+LSTM: A New Framework for Emotion Recognition From Multi-Channel EEG. Front. Neurorobotics 2019, 13, 37.
- Chen, J.; Hu, B.; Xu, L.; Moore, P.; Su, Y. Feature-level fusion of multimodal physiological signals for emotion recognition. In Proceedings of the 2015 IEEE International Conference on Bioinformatics and Biomedicine (BIBM), Washington, DC, USA, 9–12 November 2015; pp. 395–399.
- Mert, A.; Akan, A. Emotion recognition from EEG signals by using multivariate empirical mode decomposition. Pattern Anal. Appl. 2016, 21, 81–89.
Band Name | Band Intervals (Hz) | Functions |
---|---|---|
Delta | <3 | Related to the unconscious mind and it transpires in deep sleep |
Theta [20] | 4–7 | Related to the subconscious mind and transpires in sleeping and dreaming |
Alpha [21] | 8–15 | Related to a relaxed mental state associated with brain activation |
Beta [22] | 16–31 | Related to active mind state and transpires during highly engrossed mental activity |
Gamma [23] | >31 | Related to hyperfocused brain activity |
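The band limits in the table can be applied directly to compute relative band power from a single EEG channel. The sketch below uses Welch's PSD on a synthetic signal; the sampling rate and test signal are illustrative assumptions, not the study's recording parameters.

```python
# Relative EEG band power via Welch's PSD, using the band limits
# from the table above. The input signal here is synthetic.
import numpy as np
from scipy.signal import welch

fs = 256  # assumed sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)
# synthetic channel: strong 10 Hz (alpha) + weaker 20 Hz (beta) + noise
x = (np.sin(2 * np.pi * 10 * t)
     + 0.5 * np.sin(2 * np.pi * 20 * t)
     + 0.1 * np.random.default_rng(0).normal(size=t.size))

bands = {"delta": (0.5, 3), "theta": (4, 7), "alpha": (8, 15),
         "beta": (16, 31), "gamma": (31, 45)}

freqs, psd = welch(x, fs=fs, nperseg=fs)   # ~1 Hz frequency resolution
total = psd.sum()
rel = {name: psd[(freqs >= lo) & (freqs <= hi)].sum() / total
       for name, (lo, hi) in bands.items()}
print(rel)  # alpha should dominate for this synthetic signal
```

One such relative-power value per band and channel is a typical spectral feature for the classifier.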
Sl. No | Name | Version | Description |
---|---|---|---|
1 | Operating System: Ubuntu [26] | 16.04 | Linux operating system |
 | Intel i7 Processor | 8th Gen | |
 | 8 GB DDR4 RAM | | |
2 | Anaconda [27] | 2019.03 | Python distribution and framework for scientific computing and large-scale data processing |
3 | Python [28] / PySpark [29] | 3.9 / 2.4.4 | Python programming language; PySpark is the interface to Apache Spark used to analyze the data in a distributed environment |
4 | GNU Octave [30] | 6.1 | High-level language framework mainly used for numerical computation, solving linear and non-linear problems numerically; largely compatible with MATLAB |
5 | EEGLAB [31] | 9.0.7.6 | GUI for interacting with high-density EEG, with interactive toolboxes for processing continuous EEG, MEG, and other brain electrophysiological signals |
Python Libraries and Packages | | | |
conda install, pip install, Matplotlib, NumPy, pandas, PyTorch, scikit-learn v0.21 | | | |
Sl. No | Brain Area | Left Hemisphere | Total Features | Right Hemisphere | Total Features |
---|---|---|---|---|---|
1 | Frontal | Fp1, AF3, AF7, F1, F3, F5, F7 | 35 | FP2, AF4, AF8, F2, F4, F6, F8 | 11 |
2 | Temporal | FT7, FT9, T7, TP7, TP9 | 32 | FT8, FT10, T8, TP8, TP10 | 36 |
3 | Central–Parietal | FC1, FC3, FC5, C1, C3, C5, CP1, CP3, CP5, P1, P3, P5, P7 | 39 | FC2, FC4, FC6, C2, C4, C6, CP2, CP4, CP6, P2, P4, P6, P8 | 20 |
4 | Occipital | PO3, PO7, O1 | 19 | PO4, PO8, O2 | 15 |
5 | Midline | FPz, Fz, Cz, CPz, POz, Oz | 15 | | |
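Frontal alpha asymmetry (FAA), one of the feature families used in the results, is commonly computed from homologous frontal electrode pairs like those in the table (e.g., F3/F4) as the difference of log alpha power. The power values below are hypothetical; this is the conventional index, not necessarily the study's exact variant.

```python
# Frontal alpha asymmetry: ln(right alpha power) - ln(left alpha power)
# for a homologous frontal pair (commonly F4 vs. F3). A positive value
# means relatively greater right-hemisphere alpha (lower right activation).
import math

def faa(alpha_power_left: float, alpha_power_right: float) -> float:
    """Conventional frontal alpha asymmetry index."""
    return math.log(alpha_power_right) - math.log(alpha_power_left)

# hypothetical alpha-power values (arbitrary units) for F3 and F4
print(faa(alpha_power_left=2.0, alpha_power_right=3.0))  # positive
```

One FAA value per frontal pair and window joins the spectral features in the classifier's input vector.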
Study Authors | Algorithm | Features | Evaluation Mode | Arousal Accuracy (%) | Valence Accuracy (%) |
---|---|---|---|---|---|
Chao et al. [39] | CapsNet | Multiband feature matrix | 10-fold cross-validation | 68.28 | 66.73 |
Xing et al. [40] | LSTM | Frequency band power | 10-fold cross-validation | 74.38 | 81.10 |
Chen et al. [41] | HMM | Fusion feature | 5-fold cross-validation | 73.00 | 75.63 |
Mert and Akan [42] | ANN | MEMD-based feature | Leave-one-trial-out validation | 69.10 | 71.99 |
Proposed Method | SVM | Feature selection distance discriminant (FSDD) | 5-fold cross-validation | 80.20 ± 2.92 | 82.02 ± 3.64 |
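The FSDD row in the table ranks features by how well they separate the two classes. As a simplified stand-in (a Fisher-ratio-style distance discriminant, not Liang et al.'s exact FSDD criterion), per-feature ranking can be sketched as:

```python
# Simplified per-feature distance-discriminant ranking: features whose
# class means are far apart relative to within-class spread rank highest.
# This is a Fisher-ratio stand-in for FSDD, not the exact published score.
import numpy as np

def rank_features(X: np.ndarray, y: np.ndarray) -> np.ndarray:
    """Return feature indices sorted from most to least discriminative."""
    classes = np.unique(y)
    assert classes.size == 2, "binary (pleasure vs. grief) ranking only"
    a, b = X[y == classes[0]], X[y == classes[1]]
    between = (a.mean(axis=0) - b.mean(axis=0)) ** 2
    within = a.var(axis=0) + b.var(axis=0) + 1e-12
    return np.argsort(between / within)[::-1]

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = rng.integers(0, 2, size=100)
X[y == 1, 2] += 3.0                 # make feature 2 discriminative
print(rank_features(X, y)[0])       # feature 2 ranks first
```

Selecting the common highest-ranked features across participants, as the results describe, would then intersect these per-subject rankings.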
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
ArulDass, S.D.; Jayagopal, P. Identifying Complex Emotions in Alexithymia Affected Adolescents Using Machine Learning Techniques. Diagnostics 2022, 12, 3188. https://doi.org/10.3390/diagnostics12123188