# Exploiting Asymmetric EEG Signals with EFD in Deep Learning Domain for Robust BCI


## Abstract


## 1. Introduction

#### 1.1. What Is BCI?

#### 1.2. BCI Paradigms

#### 1.3. Literature Review

#### 1.4. Objectives and Contributions

- To alleviate the high complexity, heavy computational load, and large performance fluctuations caused by manual feature extraction [19,31], EFD combined with pre-trained CNN models is proposed to construct a simple, automatic feature extraction model. To the best of our knowledge, this study is the first attempt to combine EFD with any kind of CNN model and evaluate its utility for MI EEG problems.
- To verify that performance remains stable across datasets, the proposed EFD-CNN design is validated on four large- and small-scale binary- and three-class MI EEG datasets: the binary-class datasets IVa and IVb from BCI competition III, containing six subjects altogether; the binary-class GigaDB dataset from the GigaScience repository, containing EEG data from 52 participants; and the three-class dataset V from BCI competition III, with three subjects collectively.
- A subject-independent framework is exploited by training the EFD-CNN model over the data from a particular group of subjects while testing it over an unseen subject. This is particularly interesting for a real-time BCI system since it allows the subject-to-subject transfer of learned model parameters and the reusability of the current model for a large group of new users.
- An extensive quantitative analysis is performed and validated, including an assessment of 10-fold classification performance, the effect of a varying number of EFD modes, deep feature extraction from CNN models with classification by machine learning models, and a comparison with contemporary studies.
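
The subject-independent protocol described above amounts to a leave-one-subject-out split. Below is a minimal sketch using scikit-learn's `LeaveOneGroupOut`; the array shapes, the five-subject grouping, and the feature dimensionality are illustrative assumptions, not the authors' code:

```python
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut

# Illustrative stand-ins: 5 subjects (A1-A5), 20 trials each,
# every trial reduced to a 4-dimensional feature vector.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 4))          # trial features
y = rng.integers(0, 2, size=100)       # class labels (e.g., RH vs. RF)
groups = np.repeat(np.arange(5), 20)   # subject id of each trial

logo = LeaveOneGroupOut()
splits = list(logo.split(X, y, groups))
assert len(splits) == 5                # one fold per held-out subject
for train_idx, test_idx in splits:
    # Train on four subjects, test on the unseen fifth one.
    held_out = np.unique(groups[test_idx])
    assert held_out.size == 1
    assert held_out[0] not in groups[train_idx]
```

Each fold trains on four subjects and scores the unseen fifth, so the reported accuracy reflects subject-to-subject transfer rather than within-subject fitting.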

## 2. Offline Data Repositories

- $\mathit{Dataset}\phantom{\rule{0.166667em}{0ex}}\mathbf{1}$: Dataset IVa includes two MI EEG tasks: right hand (RH, class 1) and right foot (RF, class 2). A computer-aided visual system cued five healthy individuals, named AA (A1), AL (A2), AV (A3), AW (A4), and AY (A5), for 3.5 s per task, and data were captured at 1000 Hz from 118 channels placed according to the International 10-20 system. Each individual completed 280 trials, including 140 for the right hand and 140 for the right foot category.
- $\mathit{Dataset}\phantom{\rule{0.166667em}{0ex}}\mathbf{2}$: Dataset IVb is a single-subject EEG dataset with MI tasks for the left hand (LH, class 1) and right foot (RF, class 2). As in dataset 1, the subject (annotated as subject B) received a 3.5 s visual cue while data were recorded at 1000 Hz from 118 channels. A total of 210 trials were carried out, half with class 1 tasks and the other half with class 2 tasks. Datasets 1 and 2 are filtered with a band-pass filter ranging from 0.5 to 200 Hz and downsampled to 100 Hz.
- $\mathit{Dataset}\phantom{\rule{0.166667em}{0ex}}\mathbf{3}$: GigaDB is a binary class MI EEG signals database collected from 52 participants (including 33 male and 19 female subjects). The information was gathered using 64 Ag/AgCl electrodes in accordance with the International 10-10 standard. Each MI task consisted of 100 or 120 trials lasting for 3 s at a sampling rate of 512 Hz.
- $\mathit{Dataset}\phantom{\rule{0.166667em}{0ex}}\mathbf{4}$: Dataset V comprises three MI EEG tasks: imagining repetitive self-paced left-hand movements (class 1), imagining repetitive self-paced right-hand movements (class 2), and generating words starting with random letters (class 3). Three subjects participated in extensive trials of the different MI tasks, each lasting one second, with data sampled at 512 Hz using 32 electrodes.
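
The samples-per-trial counts implied by the quoted sampling rates and cue durations can be checked directly (a worked example; the dataset labels are shorthand for the list above):

```python
# Samples per trial = sampling rate (Hz) x cue duration (s),
# using the figures quoted for each repository.
datasets = {
    "IVa (downsampled)": (100, 3.5),
    "IVb (downsampled)": (100, 3.5),
    "GigaDB":            (512, 3.0),
    "V":                 (512, 1.0),
}
lengths = {name: int(fs * dur) for name, (fs, dur) in datasets.items()}
assert lengths["IVa (downsampled)"] == 350
assert lengths["GigaDB"] == 1536
assert lengths["V"] == 512
```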

## 3. Method

#### 3.1. Step 1: Denoising with Multiscale Principal Component Analysis

- Let m be the number of time samples in an EEG signal, and n be the number of channels. The single-trial EEG signal matrix could be defined as ${A}_{m\times n}$.
- Fragment each channel of matrix A into Q levels using the wavelet transform to obtain the detail coefficients ${B}_{i}$ and the approximation coefficients ${A}_{j}$.
- Normalize the wavelet coefficients at each scale and conduct the principal component analysis (PCA). As per Kaiser’s criterion, choose the coefficients with eigenvalues greater than the average of all eigenvalues.
- Calculate the inverse wavelet transform of the selected coefficients.
- Calculate the PCA of the resultant matrix to obtain the denoised EEG signals.
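
The steps above can be sketched in miniature. The fragment below is a hypothetical single-level Haar implementation rather than the authors' Q-level wavelet code; it applies Kaiser's criterion at each stage exactly as listed:

```python
import numpy as np

def kaiser_pca(C):
    """PCA retaining components whose eigenvalue exceeds the mean of
    all eigenvalues (Kaiser's criterion), then reconstruct."""
    mu = C.mean(axis=0)
    Cc = C - mu
    vals, vecs = np.linalg.eigh(np.cov(Cc, rowvar=False))
    P = vecs[:, vals > vals.mean()]       # retained principal directions
    return Cc @ P @ P.T + mu

def mspca_denoise(A):
    """A: (m time samples, n channels). One Haar level for illustration."""
    A = A[: (A.shape[0] // 2) * 2]        # even number of samples
    approx = (A[0::2] + A[1::2]) / np.sqrt(2)   # A_j coefficients
    detail = (A[0::2] - A[1::2]) / np.sqrt(2)   # B_i coefficients
    approx, detail = kaiser_pca(approx), kaiser_pca(detail)
    out = np.empty_like(A)                # inverse Haar transform
    out[0::2] = (approx + detail) / np.sqrt(2)
    out[1::2] = (approx - detail) / np.sqrt(2)
    return kaiser_pca(out)                # final PCA on the result

# Synthetic check: a common 10 Hz rhythm on 4 channels plus noise.
rng = np.random.default_rng(1)
t = np.linspace(0, 1, 256)
clean = np.stack([np.sin(2 * np.pi * 10 * t)] * 4, axis=1)
noisy = clean + 0.3 * rng.normal(size=clean.shape)
den = mspca_denoise(noisy)
assert den.shape == noisy.shape
```

Because the shared rhythm concentrates in the leading principal components while channel noise spreads across all of them, the reconstruction suppresses noise without discarding the MI-related activity.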

#### 3.2. Step 2: Signal Resolution with Empirical Fourier Decomposition

- If N ≥ M, where N is the maximum number of pivot points in a Fourier spectrum, the first M − 1 points are chosen.
- If N < M, the number of extractable modes is less than the desired decomposition level, and hence M is automatically reset to N.
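
The fallback rule above reduces to a one-line guard; a hedged sketch (the function name and signature are illustrative, not from the original EFD code):

```python
def effective_modes(n_pivots, m_desired):
    """Number of EFD modes actually extracted: if the spectrum offers
    at least M pivot points, the desired level M is kept; otherwise
    M falls back to the available count N."""
    return m_desired if n_pivots >= m_desired else n_pivots

assert effective_modes(n_pivots=10, m_desired=4) == 4   # enough pivots
assert effective_modes(n_pivots=3, m_desired=6) == 3    # M reset to N
```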

#### 3.3. Step 3: Scalogram Transformation with Hilbert Transform (HT)
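
The Hilbert transform turns each EFD mode into an analytic signal whose magnitude and phase yield instantaneous amplitude and frequency; per Figure 3, these are rendered as Hilbert spectrum images for the CNN stage. A minimal sketch for a single synthetic mode using `scipy.signal.hilbert` (the 12 Hz test tone and 100 Hz rate are assumptions for illustration):

```python
import numpy as np
from scipy.signal import hilbert

fs = 100.0                                 # assumed sampling rate (Hz)
t = np.arange(0, 2.0, 1.0 / fs)
mode = np.cos(2 * np.pi * 12 * t)          # stand-in for one EFD mode

analytic = hilbert(mode)                   # analytic signal x + j*H{x}
amplitude = np.abs(analytic)               # instantaneous amplitude
phase = np.unwrap(np.angle(analytic))
inst_freq = np.diff(phase) * fs / (2 * np.pi)  # instantaneous frequency (Hz)

# Away from the window edges the estimate recovers the 12 Hz tone.
assert abs(inst_freq[20:-20].mean() - 12.0) < 0.5
assert abs(amplitude[20:-20].mean() - 1.0) < 0.05
```

Stacking amplitude against instantaneous frequency over time for every mode produces the time-frequency image that is resized to the CNN input dimensions in Table 2.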

#### 3.4. Step 4: Feature Extraction and Classification with Pre-Trained Convolutional Neural Network Models

## 4. Experimental Arrangements

## 5. Results and Discussions

#### 5.1. 10-Fold Performance Evaluation

#### 5.2. Effect of Varying Number of EFD Modes

#### 5.3. Performance Comparison with Other SD Methods

#### 5.4. Results with Other Pre-Trained CNN Models

#### 5.5. Deep Feature Extraction and Classification with Machine Learning Methods

#### 5.6. Comparison with Other State-of-the-Art Studies

#### 5.7. Empirical Results for Dataset 3

#### 5.8. Empirical Results for Dataset 4

#### 5.9. Subject-Independent Case Results

## 6. Future Work

## 7. Conclusions

## Author Contributions

## Funding

## Institutional Review Board Statement

## Informed Consent Statement

## Data Availability Statement

## Conflicts of Interest

## References

1. Pfurtscheller, G.; Neuper, C.; Muller, G.; Obermaier, B.; Krausz, G.; Schlogl, A.; Scherer, R.; Graimann, B.; Keinrath, C.; Skliris, D.; et al. Graz-BCI: State of the art and clinical applications. IEEE Trans. Neural Syst. Rehabil. Eng. 2003, 11, 1–4.
2. Nicolas-Alonso, L.F.; Gomez-Gil, J. Brain computer interfaces, a review. Sensors 2012, 12, 1211–1279.
3. Lee, M.H.; Kwon, O.Y.; Kim, Y.J.; Kim, H.K.; Lee, Y.E.; Williamson, J.; Fazli, S.; Lee, S.W. EEG dataset and OpenBMI toolbox for three BCI paradigms: An investigation into BCI illiteracy. GigaScience 2019, 8, giz002.
4. Zhu, H.; Forenzo, D.; He, B. On the Deep Learning Models for EEG-Based Brain-Computer Interface Using Motor Imagery. IEEE Trans. Neural Syst. Rehabil. Eng. 2022, 30, 2283–2291.
5. Ma, T.; Li, Y.; Huggins, J.E.; Zhu, J.; Kang, J. Bayesian Inferences on Neural Activity in EEG-Based Brain-Computer Interface. J. Am. Stat. Assoc. 2022, 117, 1–12.
6. Manojprabu, M.; Dhulipala, V.S. Improved energy efficient design in software defined wireless electroencephalography sensor networks (WESN) using distributed architecture to remove artifact. Comput. Commun. 2020, 152, 266–271.
7. Dai, C.; Wang, J.; Xie, J.; Li, W.; Gong, Y.; Li, Y. Removal of ECG artifacts from EEG using an effective recursive least square notch filter. IEEE Access 2019, 7, 158872–158880.
8. Babu, P.A.; Prasad, K. Removal of ocular artifacts from EEG signals using adaptive threshold PCA and wavelet transforms. In Proceedings of the 2011 IEEE International Conference on Communication Systems and Network Technologies, Katra, India, 3–5 June 2011; pp. 572–575.
9. Ding, K.; Xiao, L.; Weng, G. Active contours driven by region-scalable fitting and optimized Laplacian of Gaussian energy for image segmentation. Signal Process. 2017, 134, 224–233.
10. Wang, G.; Wang, Y.; Min, Y.; Lei, W. Blind Source Separation of Transformer Acoustic Signal Based on Sparse Component Analysis. Energies 2022, 15, 6017.
11. Gokgoz, E.; Subasi, A. Effect of multiscale PCA de-noising on EMG signal classification for diagnosis of neuromuscular disorders. J. Med. Syst. 2014, 38, 1–10.
12. Sadiq, M.T.; Yu, X.; Yuan, Z.; Aziz, M.Z.; Siuly, S.; Ding, W. A matrix determinant feature extraction approach for decoding motor and mental imagery EEG in subject specific tasks. IEEE Trans. Cogn. Dev. Syst. 2020, 14, 375–387.
13. Riaz, F.; Hassan, A.; Rehman, S.; Niazi, I.K.; Dremstrup, K. EMD-based temporal and spectral features for the classification of EEG signals using supervised learning. IEEE Trans. Neural Syst. Rehabil. Eng. 2015, 24, 28–35.
14. Xiao-Jun, Z.; Shi-qin, L.; Fan, L.j.; Yu, X.L. The EEG signal process based on EEMD. In Proceedings of the 2011 IEEE 2nd International Symposium on Intelligence Information Processing and Trusted Computing, Wuhan, China, 22–23 October 2011; pp. 222–225.
15. Keerthi Krishnan, K.; Soman, K. CNN based classification of motor imaginary using variational mode decomposed EEG-spectrum image. Biomed. Eng. Lett. 2021, 11, 235–247.
16. Sadiq, M.T.; Yu, X.; Yuan, Z.; Fan, Z.; Rehman, A.U.; Li, G.; Xiao, G. Motor imagery EEG signals classification based on mode amplitude and frequency components using empirical wavelet transform. IEEE Access 2019, 7, 127678–127692.
17. Zhang, Y.; Liu, B.; Ji, X.; Huang, D. Classification of EEG signals based on autoregressive model and wavelet packet decomposition. Neural Process. Lett. 2017, 45, 365–378.
18. Taran, S.; Bajaj, V. Motor imagery tasks-based EEG signals classification using tunable-Q wavelet transform. Neural Comput. Appl. 2019, 31, 6925–6932.
19. Yu, X.; Aziz, M.Z.; Sadiq, M.T.; Fan, Z.; Xiao, G. A new framework for automatic detection of motor and mental imagery EEG signals for robust BCI systems. IEEE Trans. Instrum. Meas. 2021, 70, 1–12.
20. Roy, G.; Bhoi, A.; Bhaumik, S. A comparative approach for MI-based EEG signals classification using energy, power and entropy. IRBM 2022, 43, 434–446.
21. Namazi, H.; Ala, T.S.; Kulish, V. Decoding of upper limb movement by fractal analysis of electroencephalogram (EEG) signal. Fractals 2018, 26, 1850081.
22. Yu, X.; Aziz, M.Z.; Sadiq, M.T.; Jia, K.; Fan, Z.; Xiao, G. Computerized Multidomain EEG Classification System: A New Paradigm. IEEE J. Biomed. Health Inform. 2022, 26, 3626–3637.
23. Sadiq, M.T.; Yu, X.; Yuan, Z.; Aziz, M.Z. Identification of motor and mental imagery EEG in two and multiclass subject-dependent tasks using successive decomposition index. Sensors 2020, 20, 5283.
24. Gaur, P.; Gupta, H.; Chowdhury, A.; McCreadie, K.; Pachori, R.B.; Wang, H. A sliding window common spatial pattern for enhancing motor imagery classification in EEG-BCI. IEEE Trans. Instrum. Meas. 2021, 70, 1–9.
25. Adem, Ş.; Eyupoglu, V.; Ibrahim, I.M.; Sarfraz, I.; Rasul, A.; Ali, M.; Elfiky, A.A. Multidimensional in silico strategy for identification of natural polyphenols-based SARS-CoV-2 main protease (Mpro) inhibitors to unveil a hope against COVID-19. Comput. Biol. Med. 2022, 145, 105452.
26. Mao, W.; Fathurrahman, H.; Lee, Y.; Chang, T. EEG Dataset Classification Using CNN Method. J. Phys. Conf. Ser. 2020, 1456, 012017.
27. Baygin, M.; Dogan, S.; Tuncer, T.; Barua, P.D.; Faust, O.; Arunkumar, N.; Abdulhay, E.W.; Palmer, E.E.; Acharya, U.R. Automated ASD detection using hybrid deep lightweight features extracted from EEG signals. Comput. Biol. Med. 2021, 134, 104548.
28. Behncke, J.; Schirrmeister, R.T.; Burgard, W.; Ball, T. The signature of robot action success in EEG signals of a human observer: Decoding and visualization using deep convolutional neural networks. In Proceedings of the 2018 IEEE 6th International Conference on Brain-Computer Interface (BCI), Gangwon, Republic of Korea, 15–17 January 2018; pp. 1–6.
29. Jordanić, M.; Rojas-Martinez, M.; Mananas, M.A.; Alonso, J.F. Prediction of isometric motor tasks and effort levels based on high-density EMG in patients with incomplete spinal cord injury. J. Neural Eng. 2016, 13, 046002.
30. Giudice, M.L.; Varone, G.; Ieracitano, C.; Mammone, N.; Bruna, A.R.; Tomaselli, V.; Morabito, F.C. 1D Convolutional Neural Network approach to classify voluntary eye blinks in EEG signals for BCI applications. In Proceedings of the 2020 IEEE International Joint Conference on Neural Networks (IJCNN), Glasgow, UK, 19–24 July 2020; pp. 1–7.
31. Yu, X.; Aziz, M.Z.; Hou, Y.; Li, H.; Lv, J.; Jamil, M. An Extended Computer Aided Diagnosis System for Robust BCI Applications. In Proceedings of the 2021 IEEE 9th International Conference on Information, Communication and Networks (ICICN), Xi'an, China, 25–28 November 2021; pp. 475–480.
32. BCI Competition III. Available online: http://www.bbci.de/competition/iii/ (accessed on 13 October 2022).
33. Zhou, W.; Feng, Z.; Xu, Y.; Wang, X.; Lv, H. Empirical Fourier Decomposition: An Accurate Adaptive Signal Decomposition Method. arXiv 2020, arXiv:2009.08047.
34. Sadiq, M.T.; Yu, X.; Yuan, Z.; Zeming, F.; Rehman, A.U.; Ullah, I.; Li, G.; Xiao, G. Motor imagery EEG signals decoding by multivariate empirical wavelet transform-based framework for robust brain–computer interfaces. IEEE Access 2019, 7, 171431–171451.
35. Taheri, S.; Ezoji, M. EEG-based motor imagery classification through transfer learning of the CNN. In Proceedings of the 2020 IEEE International Conference on Machine Vision and Image Processing (MVIP), Qom, Iran, 18–20 February 2020; pp. 1–6.
36. Taran, S.; Bajaj, V.; Sharma, D.; Siuly, S.; Sengur, A. Features based on analytic IMF for classifying motor imagery EEG signals in BCI applications. Measurement 2018, 116, 68–76.
37. Wang, H.; Zhang, Y. Detection of motor imagery EEG signals employing Naïve Bayes based learning process. Measurement 2016, 86, 148–158.
38. Siuly, S.; Li, Y. Improving the separability of motor imagery EEG signals using a cross correlation-based least square support vector machine for brain–computer interface. IEEE Trans. Neural Syst. Rehabil. Eng. 2012, 20, 526–538.
39. Fang, T.; Song, Z.; Zhan, G.; Zhang, X.; Mu, W.; Wang, P.; Zhang, L.; Kang, X. Decoding motor imagery tasks using ESI and hybrid feature CNN. J. Neural Eng. 2022, 19, 016022.
40. Miao, M.; Hu, W.; Yin, H.; Zhang, K. Spatial-frequency feature learning and classification of motor imagery EEG based on deep convolution neural network. Comput. Math. Methods Med. 2020, 2020, 1981728.
41. Kevric, J.; Subasi, A. Comparison of signal decomposition methods in classification of EEG signals for motor-imagery BCI system. Biomed. Signal Process. Control 2017, 31, 398–406.
42. Lotte, F.; Guan, C. Regularizing common spatial patterns to improve BCI designs: Unified theory and new algorithms. IEEE Trans. Biomed. Eng. 2010, 58, 355–362.
43. Sadiq, M.T.; Yu, X.; Yuan, Z.; Aziz, M.Z.; Siuly, S.; Ding, W. Toward the development of versatile brain–computer interfaces. IEEE Trans. Artif. Intell. 2021, 2, 314–328.
44. Kumar, S.; Sharma, A.; Tsunoda, T. Brain wave classification using long short-term memory network based optical predictor. Sci. Rep. 2019, 9, 9153.
45. Li, Y.; Wen, P.P. Clustering technique-based least square support vector machine for EEG signal classification. Comput. Methods Programs Biomed. 2011, 104, 358–372.

**Figure 2.** EFD modes for a sample trial of subject A1 using channel C3. C1 denotes class 1, while C2 indicates the class 2 MI EEG task signal.

**Figure 3.** Hilbert spectrum of the EFD modes for class 1 (C1) and class 2 (C2) activities utilizing MI EEG recordings from participants A1–A5 and B.

**Figure 5.** Box plot of 10-fold performance metrics obtained from the devised classification model for all subjects of datasets 1 and 2.

**Figure 6.** Confusion charts based on the classification results obtained from the proposed EFD-CNN for all subjects of datasets 1 and 2.

**Figure 7.** Effect of changing the EFD decomposition level (M) on subject categorization in datasets 1 and 2.

**Figure 8.** Performance comparison of various pre-trained CNN models in combination with the EFD method using dataset 1 and 2 subjects.

**Figure 9.** EFD-CNN evaluation against deep features extracted from the AlexNet CNN model and classified with different machine learning classifiers.

**Figure 10.** Performance evaluation and comparison of the SI case with previously reported results (Ref. 1: [19]).

| Dataset # | Dataset Name | Sampling Rate (Hz) | Trial Duration (s) | No. of Electrodes | Participants | Class 1 Trials | Class 2 Trials | Class 3 Trials |
|---|---|---|---|---|---|---|---|---|
| Dataset 1 | IVa from BCI Competition III [32] | 100 | 3.5 | 118 | A1 = AA | 80 | 86 | - |
|  |  |  |  |  | A2 = AL | 112 | 112 | - |
|  |  |  |  |  | A3 = AV | 42 | 42 | - |
|  |  |  |  |  | A4 = AW | 30 | 26 | - |
|  |  |  |  |  | A5 = AY | 18 | 10 | - |
| Dataset 2 | IVb from BCI Competition III [32] | 100 | 3.5 | 118 | B | 105 | 105 | - |
| Dataset 3 | GigaDB from GigaScience [3] | 512 | 3 | 64 | S1–S52 | 100 or 120 | 100 or 120 | - |
| Dataset 4 | V from BCI Competition III [32] | 512 | 1 | 32 | Subject 1 | 280 | 200 | 236 |
|  |  |  |  |  | Subject 2 | 276 | 201 | 237 |
|  |  |  |  |  | Subject 3 | 238 | 235 | 238 |

| Pre-Trained CNN Model | No. of Layers | Input Dimensions | Replaceable Final Layers | No. of Parameters in the Replaceable Layers |
|---|---|---|---|---|
| AlexNet | 25 | $227\times 227$ | fc8, prob, output | 8194 |
| SqueezeNet | 68 | $227\times 227$ | Pool10, prob, ClassificationLayer_predictions | 1026 |
| ShuffleNet | 173 | $224\times 224$ | Node_202, softmax, ClassificationLayer_node_203 | 1090 |
| GoogleNet | 144 | $224\times 224$ | Loss3-classifier, prob, output | 2050 |

**Table 3.** EFD-CNN performance comparison with other SD-CNN methods for all subjects of datasets 1 and 2.

| SD Methods | A1 | A2 | A3 | A4 | A5 | B | Average |
|---|---|---|---|---|---|---|---|
| EFD-CNN | 99.10 | 98.70 | 98.80 | 97.50 | 99.05 | 96.03 | 98.20 |
| VMD-CNN | 99.91 | 98.61 | 97.93 | 97.41 | 98.68 | 95.13 | 97.78 |
| MVMD-CNN | 96.38 | 96.25 | 96.76 | 95.46 | 96.21 | 94.00 | 95.84 |
| EMD-CNN | 96.06 | 95.19 | 95.60 | 93.64 | 95.47 | 92.95 | 94.82 |
| EWT-CNN | 94.89 | 94.66 | 94.46 | 92.54 | 94.90 | 91.96 | 93.90 |
| TQWT-CNN | 92.78 | 92.46 | 92.54 | 90.89 | 92.71 | 89.20 | 91.76 |
| WPD-CNN | 89.81 | 88.73 | 89.08 | 87.55 | 89.15 | 86.85 | 88.53 |

| Authored by | Proposition | A1 | A2 | A3 | A4 | A5 | Average |
|---|---|---|---|---|---|---|---|
| Our Previous Study [19] | EFD + IEFD + Welch PSD + FFNN | 99.9 | 99.8 | 99.9 | 99.8 | 100 | 99.8 |
| This study | EFD-CNN | 99.1 | 98.7 | 98.8 | 97.5 | 99 | 98.6 |
| Taheri et al. [35] | FT + CSP + DCT + EMD + AlexNet | 100 | 97.6 | 98.8 | 96.4 | 100 | 98.5 |
| Sadiq et al. [34] | MEWT + JIA + LS-SVM | 95 | 95 | 95 | 100 | 100 | 97 |
| Siuly et al. [36] | EMD + AIMF Features + LS-SVM | 97.7 | 98.8 | 96.6 | 98.8 | 95.5 | 97.5 |
| Siuly et al. [37] | OA + NB | 97.9 | 97.8 | 98.2 | 94.4 | 93.2 | 96.3 |
| Siuly et al. [38] | CC + LS-SVM | 97.8 | 99.1 | 98.7 | 93.4 | 89.3 | 95.7 |
| Sadiq et al. [16] | EWT + IA2 + LS-SVM | 94.5 | 91.7 | 97.2 | 95.6 | 97 | 95.2 |
| Fang et al. [39] | ESI + CWT + CNN | 89.9 | 98.8 | 90.6 | 95.6 | 91.2 | 93.2 |
| Miao et al. [40] | CNN based on frequency characteristics of MI EEG | 100 | 90 | 90 | 90 | 80 | 90 |
| Kevric et al. [41] | WPD + Statistical Features + k-NN | 77.1 | 72.2 | 75.2 | 85.6 | 86 | 79.2 |
| Lotte et al. [42] | SSRCSP | 70.5 | 96.4 | 53.5 | 71.8 | 75.3 | 73.5 |
| Lotte et al. [42] | TRCSP | 71.4 | 96.4 | 63.2 | 71.8 | 86.9 | 77.9 |
| Lotte et al. [42] | WTRCSP | 72.3 | 96.4 | 60.2 | 77.4 | 86.5 | 78.6 |
| Kevric et al. [41] | DWT + Statistical Features + k-NN | 56.6 | 60.7 | 55.9 | 55.1 | 90.1 | 63.6 |

| Subjects | Accuracy (%) | F1-Score (%) | Kappa (%) | Sadiq et al. [25] | Sadiq et al. [43] | Yu et al. [31] | Kumar et al. [44] |
|---|---|---|---|---|---|---|---|
| S01 | 94.17 | 94.57 | 88.27 | 90.79 | 87.69 | 82.5 | 80 |
| S02 | 85.83 | 85.95 | 71.67 | 91.38 | 90.6 | 85.48 | 52.33 |
| S03 | 99.17 | 99.22 | 98.32 | 94.88 | 95.68 | 95.71 | 94 |
| S04 | 93.33 | 93.44 | 86.72 | 91.58 | 87.68 | 83.52 | 78 |
| S05 | 100 | 100 | 100 | 100 | 99.06 | 98.58 | 99 |
| S06 | 96.67 | 96.67 | 93.34 | 92.69 | 89.03 | 86.51 | 82.33 |
| S07 | 82.5 | 80.37 | 64.71 | 84.71 | 81.83 | 79.5 | 52.78 |
| S08 | 87.5 | 86.73 | 74.92 | 82.55 | 79.85 | 78.55 | 55.67 |
| S09 | 89.17 | 86.02 | 77.18 | 82.17 | 79.23 | 82.55 | 56.4 |
| S10 | 92.5 | 93.13 | 84.88 | 89.65 | 86.65 | 80.52 | 73 |
| S11 | 82.5 | 82.05 | 64.98 | 82.75 | 76.03 | 75.56 | 57.33 |
| S12 | 90 | 91.18 | 79.64 | 83.22 | 78.44 | 80.53 | 67.17 |
| S13 | 94.17 | 94.31 | 88.33 | 91.21 | 89.01 | 94.58 | 84.5 |
| S14 | 100 | 100 | 100 | 100 | 98.02 | 97.57 | 96.5 |
| S15 | 89.17 | 87.85 | 78.09 | 90.98 | 87.2 | 75.57 | 64.67 |
| S16 | 83.33 | 85.07 | 66.22 | 82.85 | 77.49 | 81.54 | 47.5 |
| S17 | 83.33 | 81.13 | 66.24 | 78.54 | 73.06 | 79.56 | 49.67 |
| S18 | 80.83 | 82.44 | 61.37 | 81.31 | 77.07 | 77.58 | 53.33 |
| S19 | 87.5 | 86.73 | 74.92 | 84.1 | 78.26 | 80.56 | 57.17 |
| S20 | 91.67 | 91.38 | 83.32 | 88.8 | 85.42 | 77.53 | 72.83 |
| S21 | 90.83 | 90.43 | 81.64 | 91.05 | 88.21 | 74.51 | 66 |
| S22 | 87.5 | 87.8 | 75.03 | 82.91 | 79.03 | 78.58 | 58.67 |
| S23 | 94.17 | 94.31 | 88.33 | 87.82 | 87.84 | 87.5 | 84.17 |
| S24 | 84.17 | 83.76 | 68.33 | 81.94 | 78.22 | 79.55 | 60.67 |
| S25 | 83.33 | 84.38 | 66.53 | 82.59 | 78.65 | 72.51 | 53 |
| S26 | 99.17 | 99.2 | 98.33 | 100 | 99.04 | 98.56 | 96.83 |
| S27 | 81.67 | 82.54 | 63.28 | 82.49 | 76.49 | 76.54 | 44.67 |
| S28 | 95 | 95 | 90 | 92.23 | 91.43 | 78.6 | 80.83 |
| S29 | 85 | 83.64 | 69.82 | 79.82 | 74.8 | 91.54 | 43 |
| S30 | 87.5 | 86.24 | 74.83 | 87.78 | 86.64 | 76.51 | 55.5 |
| S31 | 86.67 | 85.71 | 73.28 | 86.09 | 84.43 | 78.52 | 62.33 |
| S32 | 88.33 | 88.71 | 76.64 | 86.12 | 83.66 | 77.56 | 50.57 |
| S33 | 87.5 | 88.37 | 74.9 | 83.24 | 79.6 | 73.59 | 55.67 |
| S34 | 85.83 | 85.95 | 71.67 | 84.09 | 77.65 | 80.52 | 58 |
| S35 | 94.17 | 94.74 | 88.2 | 90.57 | 91.67 | 85.54 | 81.83 |
| S36 | 88.33 | 89.55 | 76.34 | 84.88 | 83.7 | 82.53 | 69.5 |
| S37 | 92.5 | 91.74 | 84.88 | 91.12 | 90.42 | 83.52 | 77 |
| S38 | 84.17 | 84.55 | 68.42 | 83.1 | 79.1 | 77.6 | 48.5 |
| S39 | 91.67 | 91.53 | 83.33 | 86.57 | 86.21 | 80.51 | 73 |
| S40 | 83.33 | 83.33 | 66.68 | 82.51 | 75.81 | 77.54 | 51.17 |
| S41 | 94.17 | 94.66 | 88.24 | 89.7 | 87.64 | 92.59 | 85.33 |
| S42 | 86.67 | 85.45 | 73.18 | 81.4 | 74.64 | 75.54 | 48.33 |
| S43 | 100 | 100 | 100 | 98.08 | 99.04 | 99.53 | 95.83 |
| S44 | 97.5 | 97.3 | 94.97 | 92.85 | 93.29 | 94.53 | 89.17 |
| S45 | 86.67 | 84.62 | 72.86 | 84.57 | 82.29 | 82.58 | 52.5 |
| S46 | 85 | 86.15 | 69.82 | 79.49 | 75.89 | 85.57 | 25.42 |
| S47 | 91.67 | 91.53 | 83.33 | 88.48 | 89.08 | 88.57 | 74.17 |
| S48 | 91.67 | 92.96 | 82.76 | 89.67 | 89.21 | 90.53 | 78.17 |
| S49 | 97.5 | 97.48 | 95 | 94.73 | 93.41 | 91.54 | 87.5 |
| S50 | 100 | 100 | 100 | 99 | 100 | 98.58 | 100 |
| S51 | 84.17 | 85.27 | 68.26 | 85.06 | 84.04 | 89.56 | 53.17 |
| S52 | 89.17 | 89.6 | 78.3 | 85.71 | 82.69 | 84.54 | 61.83 |
| Average | 89.97 | 89.9 | 79.81 | 87.68 | 85.02 | 83.83 | 67.24 |

| Subjects | Cases | Combinations |
|---|---|---|
| Subject 1 | Case 1 | Class 1 vs. Class 2 |
|  | Case 2 | Class 1 vs. Class 3 |
|  | Case 3 | Class 2 vs. Class 3 |
| Subject 2 | Case 4 | Class 1 vs. Class 2 |
|  | Case 5 | Class 1 vs. Class 3 |
|  | Case 6 | Class 2 vs. Class 3 |
| Subject 3 | Case 7 | Class 1 vs. Class 2 |
|  | Case 8 | Class 1 vs. Class 3 |
|  | Case 9 | Class 2 vs. Class 3 |

| Cases | Accuracy (%) | F1-Score (%) | Kappa (%) | Siuly et al. [45] | Sadiq et al. [12] |
|---|---|---|---|---|---|
| Case 1 | 96.42 | 96.29 | 92.85 | 65.88 | 94.58 |
| Case 2 | 92.85 | 95 | 82.5 | 75.35 | 91.16 |
| Case 3 | 85.71 | 84.61 | 71.42 | 62.68 | 81.17 |
| Case 4 | 98.98 | 97.97 | 95.99 | 58.95 | 100 |
| Case 5 | 100 | 100 | 100 | 73.04 | 100 |
| Case 6 | 85.71 | 81.81 | 70.05 | 62.35 | 82.15 |
| Case 7 | 96.42 | 97.14 | 92.39 | 47.84 | 97.46 |
| Case 8 | 99 | 99.01 | 97.99 | 51.47 | 99.16 |
| Case 9 | 89.28 | 91.89 | 76.13 | 52.71 | 80.54 |
| Average | 93.81 | 93.74 | 86.65 | 61.14 | 91.8 |

**Table 8.** Subject-independent case results for different training and testing data combinations using dataset 1 subjects.

| Combination # | Subject Combinations | A1 | A2 | A3 | A4 | A5 |
|---|---|---|---|---|---|---|
| Comb. 1 | A1 | - | 77.74 | 72.9 | 83.46 | 79.79 |
| Comb. 2 | A2 | 82.24 | - | 83.08 | 83.62 | 87.06 |
| Comb. 3 | A3 | 69.89 | 79.24 | - | 72.03 | 74.82 |
| Comb. 4 | A4 | 86.02 | 83.43 | 84.82 | - | 86.46 |
| Comb. 5 | A5 | 82.27 | 69.82 | 72.45 | 86.78 | - |
| Comb. 6 | A1, A2 | - | - | 75.24 | 82.94 | 79.65 |
| Comb. 7 | A1, A3 | - | 72.56 | - | 85.64 | 69.11 |
| Comb. 8 | A1, A4 | - | 71.31 | 74.96 | - | 83.32 |
| Comb. 9 | A1, A5 | - | 86.7 | 83.63 | 72.47 | - |
| Comb. 10 | A2, A3 | 86.97 | - | - | 71.53 | 79.91 |
| Comb. 11 | A2, A4 | 83.17 | - | 86.74 | - | 71.86 |
| Comb. 12 | A2, A5 | 74.19 | - | 72.83 | 76.48 | - |
| Comb. 13 | A3, A4 | 81.55 | 81.86 | - | - | 77.55 |
| Comb. 14 | A3, A5 | 80.7 | 80.5 | - | 83.1 | - |
| Comb. 15 | A4, A5 | 86.81 | 78.37 | 86.12 | - | - |
| Comb. 16 | A1, A2, A3 | - | - | - | 80.55 | 74.22 |
| Comb. 17 | A1, A2, A4 | - | - | 73.5 | - | 71.77 |
| Comb. 18 | A1, A2, A5 | - | - | 75.43 | 76.09 | - |
| Comb. 19 | A1, A3, A4 | - | 80.08 | - | - | 80.36 |
| Comb. 20 | A1, A3, A5 | - | 78.68 | - | 72.45 | - |
| Comb. 21 | A1, A4, A5 | - | 70.02 | 72.27 | - | - |
| Comb. 22 | A2, A3, A4 | 83.57 | - | - | - | 79.51 |
| Comb. 23 | A2, A3, A5 | 73.54 | - | - | 73.55 | - |
| Comb. 24 | A2, A4, A5 | 74.5 | - | 78.17 | - | - |
| Comb. 25 | A3, A4, A5 | 81.49 | 78.19 | - | - | - |
| Comb. 26 | A1, A2, A3, A4 | - | - | - | - | 74.04 |
| Comb. 27 | A1, A2, A3, A5 | - | - | - | 76.11 | - |
| Comb. 28 | A1, A2, A4, A5 | - | - | 73.39 | - | - |
| Comb. 29 | A1, A3, A4, A5 | - | 74.51 | - | - | - |
| Comb. 30 | A2, A3, A4, A5 | 74.61 | - | - | - | - |

Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Huang, B.; Xu, H.; Yuan, M.; Aziz, M.Z.; Yu, X.
Exploiting Asymmetric EEG Signals with EFD in Deep Learning Domain for Robust BCI. *Symmetry* **2022**, *14*, 2677.
https://doi.org/10.3390/sym14122677
