Article

An Emotion Assessment of Stroke Patients by Using Bispectrum Features of EEG Signals

by Choong Wen Yean 1, Wan Khairunizam Wan Ahmad 2,*, Wan Azani Mustafa 2, Murugappan Murugappan 3, Yuvaraj Rajamanickam 4, Abdul Hamid Adom 2, Mohammad Iqbal Omar 1, Bong Siao Zheng 1, Ahmad Kadri Junoh 5, Zuradzman Mohamad Razlan 6 and Shahriman Abu Bakar 6

1 Faculty of Electronic Engineering Technology, Universiti Malaysia Perlis (UniMAP), Arau 02600, Perlis, Malaysia
2 Faculty of Electrical Engineering Technology, Universiti Malaysia Perlis (UniMAP), Arau 02600, Perlis, Malaysia
3 Department of Electronics and Communication Engineering, Kuwait College of Science and Technology, Doha Area, 7th Ring Road, Kuwait City 13133, Kuwait
4 School of Electrical and Electronic Engineering, Nanyang Technological University (NTU), 50 Nanyang Avenue, Singapore 639798, Singapore
5 Institute of Engineering Mathematics, Universiti Malaysia Perlis, Arau 02600, Perlis, Malaysia
6 Faculty of Mechanical Engineering Technology, Universiti Malaysia Perlis (UniMAP), Arau 02600, Perlis, Malaysia
* Author to whom correspondence should be addressed.
Brain Sci. 2020, 10(10), 672; https://doi.org/10.3390/brainsci10100672
Submission received: 9 August 2020 / Revised: 11 September 2020 / Accepted: 22 September 2020 / Published: 25 September 2020

Abstract:
Emotion assessment in stroke patients gives physiotherapists meaningful information for identifying the appropriate treatment method. This study aimed to classify the emotions of stroke patients by applying bispectrum features of electroencephalogram (EEG) signals. EEG signals from three groups of subjects, namely stroke patients with left brain damage (LBD), right brain damage (RBD), and normal controls (NC), were analyzed for six different emotional states. The estimated bispectra, mapped in contour plots, show different appearances of nonlinearity in the EEG signals for the different emotional states. Bispectrum features were extracted from the alpha (8–13 Hz), beta (13–30 Hz) and gamma (30–49 Hz) bands. The k-nearest neighbor (KNN) and probabilistic neural network (PNN) classifiers were used to classify the six emotions in LBD, RBD and NC. The bispectrum features showed statistical significance for all three groups. The beta band was the best performing individual EEG frequency sub-band for emotion classification, while the combination of the alpha to gamma bands provided the highest classification accuracy with both the KNN and PNN classifiers. Sadness recorded the highest classification accuracy: 65.37% in the LBD, 71.48% in the RBD and 75.56% in the NC group.

Graphical Abstract

1. Introduction

Stroke is one of the leading causes of death in Malaysia, with more than 40,000 survivors managing their health today [1]. Globally, there were 6.2 million deaths caused by stroke in 2017, with the highest stroke mortality rates reported in Eastern Europe, Africa, and Central Asia [2]. Stroke is caused by an insufficient supply of oxygen to the brain, which damages brain cells. This in turn affects brain functions, leaving stroke survivors with difficulties in daily living, such as mobility, communication and expressing their thoughts. Stroke patients also often suffer from emotional and behavioral changes due to dissatisfaction with their current condition.
Past studies have investigated emotional changes in stroke patients as a manifestation of their physiological condition [3,4,5]. These studies revealed that emotions and thoughts are interactive reactions that are intimately related to health and physiological problems. Persistent depression, in particular, increases the risk of a second or recurrent stroke. Therefore, emotion recognition in stroke patients is very helpful in the diagnosis of their psychological and physiological conditions.
The assessment of the emotional conditions and mood of stroke patients is required during rehabilitation to identify the presence of mental health problems such as persistent depression and mood disorders. This also assists in identifying the severity of associated functional impairment of the patients.
Conventionally, emotion assessment can be done through interviews with patients [3,6], observation of patients’ behaviors [6], and standardized measures such as the Hospital Anxiety and Depression Scale (HADS) [6] and the Beck Depression Inventory (BDI) [7]. These standardized measures determine the emotional states of the patient by scoring. However, patients can deceive these conventional approaches, making the acquired information inaccurate. Consequently, researchers have tried other approaches to understand the emotional states of patients. Recent studies have shown that emotion assessment can be performed using physiological signals [8], such as skin conductance (SC) [9], respiration signals [10], the electrocardiogram (ECG) [11], and the electroencephalogram (EEG) [12].
This paper is organized as follows: in Section 1, the problem and the context of emotion assessment in stroke patients are discussed. Section 2 reviews the literature on EEG analysis and nonlinear features, including the use of bispectrum features in EEG analysis. Section 3 describes the materials and methods used in this study, including the EEG data, preprocessing, feature extraction, statistical analysis and classification methods. The next section presents the results and their discussion. Lastly, the paper is briefly summarized.

2. Related Works

EEG is a brain signal measured by placing electrode sensors along the scalp to record the electrical activity of the brain occurring near the surface of the scalp [13]. EEG signals can be used for diagnostic purposes, and abnormalities detected in them indicate a brain disorder.
Previous research has reported that the brain is heavily involved in emotional activities [14]. As the center of emotions, the brain is responsible for generating responses when it perceives a stimulus. Hence, brain signals are able to provide emotional information about a stroke patient, and most recent studies of emotion assessment for stroke patients have utilized brain signals. Adamaszek et al. studied emotional impairment using the event-related potentials (ERP) of stroke patients [15]. Doruk et al. studied emotional impairment in stroke patients by comparing the emotional score of the Stroke Impact Scale (SIS) with EEG features, namely EEG power asymmetry and coherence [16]. Bong et al. assessed the emotions of stroke patients using EEG signals in the time-frequency domain [12]. In the present study, an electroencephalogram (EEG)-based emotion recognition algorithm is proposed to study the emotional states of stroke patients.
According to previous studies, stroke patients suffer from emotional impairment, and hence their emotional experiences are less pronounced compared to those of normal people [15,17,18,19]. In a related study by Bong et al. [12], left brain damage (LBD) stroke patients were most dominant in perceiving the sadness emotion and right brain damage (RBD) stroke patients were most dominant in the anger emotion. The dominant frequency band reported by the authors was the beta band, using the wavelet packet transform (WPT) with the Hurst exponent feature. The highest accuracy obtained was 76.04% for the happiness emotion in the normal control (NC) group, using features from the beta to gamma bands.
A previous emotion classification study by Yuvaraj et al. [20] also showed that, among single frequency bands, the accuracy of the extracted features was higher in the beta and gamma bands, and that the highest accuracy was obtained when all frequency bands from delta to gamma were combined. The authors obtained an average accuracy of 66.80% in NC using the combination of the five frequency bands, compared with 64.73% using the beta band and 65.80% using the gamma band. These results were optimized with the application of a feature selection technique.
The brain is a chaotic dynamical system [21,22,23] whose EEG signals are generated by nonlinear deterministic processes, a view also referred to as deterministic chaos, with nonlinear coupling interactions between neuronal populations [24,25]. Compared with linear analysis, nonlinear analysis methods give more meaningful information about the emotional state changes of stroke patients. Over the last few years, a number of research works have analyzed EEG signals using nonlinear methods [25,26,27]. For example, a recurrence measure was applied to study seizure EEG signals [25]. Zappasodi et al. used the fractal dimension (FD) to study neuronal impairment in stroke patients [26]. Acharya et al. studied sleep stage detection in EEG signals using different nonlinear dynamic methods, namely higher order spectra (HOS) features and recurrence quantification analysis (RQA) features [28]. In their study, HOS was used to extract salient information which helped with the diagnosis of neurological disorders.
HOS has been shown to be an effective method for analyzing EEG signals and has been among the most commonly used nonlinear features. HOS is the frequency domain or spectral representation of the higher order cumulants (third order and above) of a random process. HOS has the advantage of suppressing Gaussian noise, providing a high signal-to-noise ratio (SNR) [29,30]. It quantifies deviations from Gaussianity and preserves the phase information of signals, so it is able to estimate the phase of non-Gaussian parametric signals. In addition, HOS detects and characterizes nonlinearities in signals. In contrast, the second order measure, the power spectrum, can only reveal the linear and Gaussian information of signals.
The third order HOS is the bispectrum, which preserves the phase information of EEG signals and is the easiest HOS to compute [31]. The bispectrum has been utilized in emotional studies of EEG signals. Yuvaraj et al. applied the bispectrum to study the difference between Parkinson’s disease patients and normal people in six discrete emotions (happiness, sadness, fear, anger, surprise, and disgust) [27,32]. Hosseini applied the bispectrum to classify two emotional states (calm and negative) of normal subjects [33].
However, the emotional states of stroke patients have yet to be analyzed using bispectrum features. Hence, in this work, bispectrum features are used to classify stroke patients’ EEG signals in different emotional states.
The bispectrum is proven in its ability to detect quadratic phase coupling (QPC), a phenomenon of nonlinear interaction in EEG signals. QPC occurs when the phases at two frequencies f1 and f2 are coupled at their sum, f1 + f2 [34,35]. The bispectrum can be estimated through two approaches: the direct and indirect methods. For a stationary, discrete-time random process x(k), the direct method estimates the bispectrum from the 1D Fourier transform of the discrete series:
Bi(f1, f2) = E[X(f1) X(f2) X*(f1 + f2)],
where Bi is the bispectrum, E[·] denotes the statistical expectation operation, X(f) is the Fourier transform (1D FFT) of the time series x(k), and * denotes the complex conjugate.
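As a rough illustration of the direct method, the following Python sketch (a hypothetical stand-in for the MATLAB tooling used in the paper; the function name and segmenting scheme are assumptions) averages X(f1)X(f2)X*(f1 + f2) over signal segments:

```python
import numpy as np

def bispectrum_direct(x, nfft=256):
    """Direct bispectrum estimate: average X(f1)X(f2)X*(f1+f2) over segments."""
    x = np.asarray(x, dtype=float)
    nseg = len(x) // nfft
    B = np.zeros((nfft, nfft), dtype=complex)
    # index matrix for the wrapped frequency sum f1 + f2
    idx = (np.arange(nfft)[:, None] + np.arange(nfft)[None, :]) % nfft
    for s in range(nseg):
        seg = x[s * nfft:(s + 1) * nfft]
        seg = seg - seg.mean()          # remove DC before transforming
        X = np.fft.fft(seg, nfft)
        # outer product X(f1)X(f2), times conj(X(f1+f2))
        B += np.outer(X, X) * np.conj(X[idx])
    return B / max(nseg, 1)
```

Because the outer product is symmetric in f1 and f2, the estimate inherits the bispectrum symmetry B(f1, f2) = B(f2, f1) mentioned later in the text.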
For the indirect method, the bispectrum is estimated by first estimating the third order cumulants of the random process x(k). The nth-order moment is the expectation of the process multiplied by (n − 1) lagged versions of itself; the third order moment m3x is therefore:
m3x(τ1, τ2) = E[x(k) x(k + τ1) x(k + τ2)],
where E[·] denotes the statistical expectation operation, and τ1 and τ2 are the lags of the moment sequence.
For a zero-mean process, the third order cumulant sequence C3x(τ1, τ2) is identical to the third order moment sequence. It can be calculated by taking an expectation over the process multiplied by two lagged versions of itself:
C3x(τ1, τ2) = E[x(k) x(k + τ1) x(k + τ2)].
The bispectrum B(f1, f2) is the 2D Fourier transform of the third order cumulant function:
B(f1, f2) = Σ_{τ1 = −∞}^{∞} Σ_{τ2 = −∞}^{∞} C3x(τ1, τ2) exp[−j(f1 τ1 + f2 τ2)],
for |f1| ≤ π, |f2| ≤ π, and |f1 + f2| ≤ π.
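The indirect route above can be sketched in Python: estimate the third-order cumulant over a grid of lags, then take its 2D Fourier transform. This is a minimal illustration under the zero-mean assumption, not the paper's MATLAB implementation:

```python
import numpy as np

def bispectrum_indirect(x, maxlag=32):
    """Indirect bispectrum: 2D-FFT of the third-order cumulant C3x(tau1, tau2)."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()                 # zero mean: cumulant equals third moment
    n = len(x)
    L = 2 * maxlag + 1
    c3 = np.zeros((L, L))
    for i, t1 in enumerate(range(-maxlag, maxlag + 1)):
        for j, t2 in enumerate(range(-maxlag, maxlag + 1)):
            lo = max(0, -t1, -t2)            # valid overlap of the lagged copies
            hi = min(n, n - t1, n - t2)
            c3[i, j] = np.mean(x[lo:hi] * x[lo + t1:hi + t1] * x[lo + t2:hi + t2])
    # 2D Fourier transform of the cumulant sequence gives the bispectrum
    return np.fft.fftshift(np.fft.fft2(c3))
```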
The bispectrum is a symmetric function, as shown in Figure 1. The shaded area is the non-redundant region of the bispectrum computation, where f2 ≥ 0, f2 ≤ f1, and f1 + f2 ≤ π, which is sufficient to describe the whole bispectrum [36].

3. Materials and Methods

3.1. EEG Data

The EEG database used in this study was collected from stroke patients with left brain damage (LBD), right brain damage (RBD) and normal controls (NC) at the Hospital Canselor Tuanku Muhriz (HCTM), Kuala Lumpur (formal approval obtained from the UKM Medical Center and Ethics Committee for human research, reference no.: UKM 1.5.3.5/244/FF-354-2012). The raw EEG signals of 15 subjects from each group (LBD, RBD, and NC) were used for the analysis. The background and neurophysiological characteristics of the subjects in the three groups are described in Table 1. The subjects passed the Mini-Mental State Examination (MMSE), conducted to exclude dementia, with scores of more than 24 out of a total of 30 points. The subjects also passed the Beck Depression Inventory (BDI) with scores of less than 18 points, to exclude subjects with psychological problems. The Edinburgh Handedness Inventory (EHI) was used to determine the handedness of the subjects on a scale from −1 to 1, interpreted as pure left-hander for a score of −1, mixed left-hander for −0.5, neutral for 0, mixed right-hander for 0.5 and pure right-hander for 1. The scores in Table 1 show that all subjects were right-handers. All subjects self-reported normal or corrected-to-normal vision (with spectacles or contact lenses) to ensure better perception of the emotions in the audio–visual stimuli.
The EEG data were collected using a 14-channel wireless EEG device, the Emotiv EPOC headset, with a built-in digital 5th order Sinc filter. The electrode placement was based on the international 10–20 system, as shown in Figure 2, and the data were sampled at 128 Hz. One of the limitations of EEG is its poor spatial resolution compared to high resolution brain imaging modalities such as functional magnetic resonance imaging (fMRI) and positron emission tomography (PET) [37]. However, the Emotiv EPOC device has 14 electrodes with 2 references, providing adequate spatial resolution while remaining practical in terms of time and cost for this study. Moreover, EEG provides high temporal resolution data that record neural activity changes on a millisecond scale, which is impossible for fMRI and PET.
To collect the emotional EEG data, an emotional elicitation protocol was designed to stimulate the emotional states of subjects. The data collection protocol is shown in Figure 3. The stimuli used to evoke the emotions in subjects were audio–visual in the form of video clips edited from International Affective Picture System (IAPS) and International Affective Digital Sound (IADS). Six emotional content video clips were presented to stimulate six discrete emotions, namely anger (A), disgust (D), fear (F), happiness (H), sadness (S) and surprise (SU) [12].
Prior to the experiment, the subjects completed the MMSE, BDI and EHI tests and gave informed consent. Then, instructions about the experimental procedure were given to the subjects. The experiment started with a sample video clip, followed by six trials of video clips which were displayed continuously; emotional EEG signals were recorded during these six trials. After that, the EEG recording was stopped for self-assessment, where the subjects were asked about the emotions they felt or perceived from the video clips. The self-assessment took at least 1 min and was subject-dependent. During this period, subjects were asked to relax and get ready for the next video, to avoid stimulus order effects. The experiment proper began with the sadness emotion, and the same procedure was repeated for the happiness (H), fear (F), disgust (D), surprise (SU) and anger (A) emotions. There were a total of 42 video clips, including the sample video clips. The duration of each video clip was 46 s to 1 min; therefore, the total duration of the data collection was between 90 and 120 min.

3.2. Preprocessing

There were a total of 36 trials of EEG signals collected from each subject in all groups (LBD, RBD and NC). The collected EEG signals were preprocessed to remove the effects of noise and artifacts that interfere with the raw signals. The preprocessing of the EEG signals was performed using MATLAB.
The artifacts due to eye blinks were filtered using a thresholding method, where potentials higher than 80 µV or lower than −80 µV were removed from each raw EEG signal [32]. A 6th order Butterworth bandpass filter with cut-off frequencies of 0.5 and 49 Hz was used to extract the delta to gamma frequency bands [32].

3.3. Feature Extraction

The indirect method was used in this study to estimate the bispectrum, using the bispeci function of the MATLAB Higher Order Spectral Analysis Toolbox. The number of points used for each fast Fourier transform (NFFT) was 1024, and the bispectrum was estimated with 50% overlap and a Hanning window. The preprocessed time domain EEG data of every channel were segmented into epochs of six seconds, each containing 768 data points. Three EEG frequency sub-bands were used for the analysis, namely the alpha (8–13 Hz), beta (13–30 Hz) and gamma (30–49 Hz) bands.
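The segmentation step can be illustrated as follows (a Python sketch; the function name and interface are hypothetical, and at 128 Hz a 6-s epoch indeed yields 768 samples):

```python
import numpy as np

FS = 128          # sampling rate (Hz)
EPOCH_SEC = 6     # epoch length used in the paper

def epochs(signal, fs=FS, sec=EPOCH_SEC):
    """Split a single-channel EEG signal into non-overlapping 6-s epochs
    (768 samples each at 128 Hz), dropping any trailing partial epoch."""
    step = fs * sec
    n = len(signal) // step
    return np.asarray(signal[:n * step]).reshape(n, step)
```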
Bispectrum features were computed from the non-redundant region (Ω) of the bispectrum. The features extracted from each epoch were the variance (v), the sum of logarithmic amplitudes of the bispectrum (H1), the sum of logarithmic amplitudes of the diagonal elements of the bispectrum (H2), the first-order spectral moment of the diagonal elements (H3), the second-order spectral moment of the diagonal elements (H4) and the moment of the bispectrum (H5).
The variance v of the bispectrum was computed as:
v = (1/(N − 1)) Σ_{i=1}^{N} |B_i − μ|²,
where N is the total number of bispectrum values in Ω, μ is the mean of the bispectrum in Ω, and B_i is the bispectrum series for i = 1, 2, 3, …, N.
The sum of logarithmic amplitudes of the bispectrum (H1):
H1 = Σ_Ω log(|B(f1, f2)|),
where Ω is the non-redundant region of the bispectrum, f1 and f2 are the frequency variables, and B(f1, f2) is the bispectrum value at (f1, f2) in Ω.
The sum of logarithmic amplitudes of the diagonal elements of the bispectrum (H2):
H2 = Σ_Ω log(|B(f_m, f_m)|),
where Ω is the non-redundant region of the bispectrum and B(f_m, f_m) is a diagonal element of the bispectrum in Ω.
The first-order spectral moment of the amplitudes of the diagonal elements of the bispectrum (H3):
H3 = Σ_{m=1}^{N} m · log(|B(f_m, f_m)|),
where B(f_m, f_m) is a diagonal element of the bispectrum in Ω, N is the total number of diagonal elements in Ω, and m = 1, 2, 3, …, N.
The second-order spectral moment of the amplitudes of the diagonal elements of the bispectrum (H4):
H4 = Σ_{m=1}^{N} (m − H3)² · log(|B(f_m, f_m)|),
where B(f_m, f_m) is a diagonal element of the bispectrum in Ω, N is the total number of diagonal elements in Ω, and m = 1, 2, 3, …, N.
The moment of the bispectrum (H5):
H5 = Σ_Ω (f1² + f2²) · |B(f1, f2)|,
where f1 and f2 are the frequency variables and B(f1, f2) is the bispectrum value at (f1, f2) in Ω.
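The six features can be sketched in Python as below. The exact shape of the non-redundant region Ω depends on the bispectrum grid, so the upper-triangle mask and the small epsilon guarding log(0) are assumptions of this illustration:

```python
import numpy as np

def bispectrum_features(B, eps=1e-12):
    """Compute v, H1-H5 from a bispectrum matrix restricted to the
    non-redundant region Omega (approximated here by one triangle)."""
    n = B.shape[0]
    f1, f2 = np.meshgrid(np.arange(n), np.arange(n), indexing='ij')
    omega = f2 >= f1                        # crude stand-in for the true Omega
    Bw = B[omega]
    mag = np.abs(Bw)
    diag = np.abs(np.diag(B))
    m = np.arange(1, len(diag) + 1)
    v = np.sum(np.abs(Bw - Bw.mean()) ** 2) / (len(Bw) - 1)   # variance
    H1 = np.sum(np.log(mag + eps))                            # sum log |B|
    H2 = np.sum(np.log(diag + eps))                           # sum log diagonal
    H3 = np.sum(m * np.log(diag + eps))                       # 1st spectral moment
    H4 = np.sum((m - H3) ** 2 * np.log(diag + eps))           # 2nd spectral moment
    H5 = np.sum((f1[omega] ** 2 + f2[omega] ** 2) * mag)      # bispectrum moment
    return dict(v=v, H1=H1, H2=H2, H3=H3, H4=H4, H5=H5)
```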

3.4. Statistical Analysis

One-way analysis of variance (ANOVA) was used to test the significance of the differences in the bispectrum features among the six emotion classes for LBD, RBD and NC, respectively. ANOVA statistically tests whether there are differences among the class means of the six emotions. It requires the assumptions that the observations of the feature are approximately normally distributed and independent, and that the variances of the classes are equal. The null hypothesis was: “all the emotion classes of the extracted feature have equal means”. The null hypothesis was rejected, and the bispectrum feature was validated as statistically significant among the six emotional states, if the p-value was less than or equal to 0.05. When the null hypothesis was not rejected, all the emotion classes of the extracted feature were taken to have equal means, and the feature was deemed unsuitable for emotion classification.
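The per-feature test can be reproduced with SciPy's one-way ANOVA. The feature values below are synthetic stand-ins (only three of the six emotion classes, for brevity), not the paper's data:

```python
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(42)
# hypothetical values of one bispectrum feature for three emotion classes
anger    = rng.normal(0.0, 1.0, 60)
sadness  = rng.normal(1.0, 1.0, 60)
surprise = rng.normal(0.2, 1.0, 60)

F, p = f_oneway(anger, sadness, surprise)
significant = p <= 0.05   # reject "all class means are equal"
```

A feature for which `significant` is False would be discarded, as H4 was in this study.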

3.5. Classification

Each feature used for classification has a total of 90 trials (6 trials × 15 subjects) with 84 feature values (14 channels × 6 windows) per emotion. The k-nearest neighbor (KNN) and probabilistic neural network (PNN) classifiers were used to classify the six emotions in the three groups (LBD, RBD and NC). The KNN is one of the most widely applied classifiers due to its low complexity and fast decision making. KNN searches for the training samples closest to the unknown sample, where closeness is determined by a distance metric. In this study, the Cityblock distance metric was used for the KNN classification [38].
The PNN uses the Parzen window for a nonparametric approximation of the probability density function (PDF) of each class and applies Bayes’ rule to allocate the new input to the class with the highest probability [39]. The classifier parameter is the spread value, which is proportional to the standard deviation of the Parzen window. A small spread value gives a narrow PDF, whereas a large spread value gives a wide PDF and the classifier becomes less selective [40,41].
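A minimal PNN along these lines can be written in a few lines of Python; this is an illustrative sketch with Gaussian Parzen windows and equal class priors, not the implementation used in the paper:

```python
import numpy as np

def pnn_predict(X_train, y_train, X_test, spread=0.4):
    """Minimal Parzen-window PNN: one Gaussian kernel per training point,
    averaged per class; each test point goes to the class with the highest
    estimated density (Bayes rule with equal priors)."""
    classes = np.unique(y_train)
    preds = []
    for x in X_test:
        d2 = np.sum((X_train - x) ** 2, axis=1)      # squared distances
        k = np.exp(-d2 / (2.0 * spread ** 2))        # Parzen window activations
        density = [k[y_train == c].mean() for c in classes]
        preds.append(classes[int(np.argmax(density))])
    return np.array(preds)
```

With a large `spread`, the kernels of all training points overlap heavily and the class densities flatten out, which matches the text's observation that the classifier becomes less selective.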
In this work, k values of 1 to 15 were tested for KNN, and spread values of 0.1 to 1.5, with an increment of 0.1, were used for PNN to classify the features. The performance of the classifiers was validated through 10-fold cross validation, where 90% of the data were used for training and 10% for testing.
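The KNN setup with the Cityblock metric and 10-fold cross validation can be sketched with scikit-learn. The feature matrix below is synthetic (two well-separated classes), standing in for the bispectrum features:

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
# synthetic stand-in for the 90-trial bispectrum feature matrix
X = np.vstack([rng.normal(0, 1, (60, 8)), rng.normal(2, 1, (60, 8))])
y = np.repeat([0, 1], 60)

knn = KNeighborsClassifier(n_neighbors=1, metric='cityblock')  # k=1, Cityblock
acc = cross_val_score(knn, X, y, cv=10).mean()                 # 10-fold CV
```

Sweeping `n_neighbors` from 1 to 15 over such a loop reproduces the parameter search described above.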

4. Results and Discussions

Bispectrum features were extracted from the EEG signals of the three groups of subjects (LBD, RBD, and NC) for the analysis of the six emotions: anger (A), disgust (D), fear (F), happiness (H), sadness (S), and surprise (SU). Contour plots of the bispectrum estimated from the anger emotion of one subject in the LBD group are shown in Figure 4 for the alpha, beta and gamma bands. The plots of the bispectrum magnitude show the relationship between the two bispectrum frequency variables, f1 and f2, for the anger emotion. In Figure 4, f1 (x-axis) and f2 (y-axis) are phase coupled. Frequency variables that are phase coupled indicate the presence of quadratic phase coupling (QPC) [31], where the QPC represents the underlying neuronal interaction of the emotional state at the frequencies (f1, f2); a higher magnitude indicates stronger QPC between the frequencies. Red represents the greatest increase in the bispectrum magnitude, while blue represents the greatest decrease.
The distribution of the bispectrum over the (f1, f2) plane differs in each frequency band. The alpha band in Figure 4a shows more bispectrum distribution at lower phase-coupled frequencies, between (0.04, 0.04) Hz and (0.1, 0.1) Hz, in the non-redundant region and the other symmetry regions, whereas the beta band in Figure 4b and the gamma band in Figure 4c show the bispectrum distribution at higher phase-coupled frequencies: between (0.1, 0.1) Hz and (0.2, 0.2) Hz in the beta band, and between (0.3, 0.3) Hz and (0.4, 0.4) Hz in the gamma band. Moreover, the maximum bispectrum magnitude of the alpha band is the lowest among the three frequency bands; the beta band has a larger maximum magnitude than the alpha band, and the gamma band has the largest maximum magnitude of all.
Figure 5, Figure 6 and Figure 7 show the bispectrum plots in the non-redundant region and one symmetry region for the six emotions of Subject #1 from the NC, LBD and RBD groups, respectively. In these figures, the different emotional states have different bispectrum distributions over the plane, each with different phase-coupled peaks and maximum magnitudes. In past studies, the bispectrum has been claimed to be a useful signal classification method, as it is able to show distinctive distributions in different conditions, such as left-hand versus right-hand motor imagery [42], where the bispectrum provides an EEG feature able to distinguish the two conditions. Another study showed that the bispectrum feature differs before and during meditation [43]; the bispectrum exhibited a more phase-coupled distribution during meditation than before it, and the maximum bispectrum magnitude increased during meditation. In a non-human experiment, “induced” ischemic stroke in rats showed different bispectrum distributions in different states of ischemia [44]; the bispectrum distribution decreased as the rat turned from the normal state to the ischemic state. Consequently, the distinctive bispectrum patterns of the six emotional states presented in this study imply that the emotional states of each group were distinguishable by applying bispectrum analysis. The significance of the differences between the emotional states using the bispectrum features is further validated by the statistical analysis using ANOVA, as shown in Table 2.
From the experiment, six types of bispectrum features were extracted from the preprocessed EEG data of LBD, RBD and NC. The statistical test using ANOVA was performed on the extracted features, with 45,354 degrees of freedom. The results are shown in Table 2 for the three frequency bands in LBD, RBD and NC, respectively. A p-value less than or equal to 0.05 indicates that the differences between some of the means of the emotional states are statistically significant. The significant bispectrum features imply that there is an interaction of neuronal subcomponents at different frequencies in different emotional states. The shaded p-values, which are larger than 0.05, mark the features that are not statistically significant between the means of the emotion classes. All the bispectrum features were statistically significant in LBD, RBD and NC, except the second moment of the diagonal elements of the bispectrum (H4); thus, H4 was discarded in classification. Moreover, from Table 2, the F values are higher for H1 and H3 in LBD, H2 and H5 in RBD, and v, H5 and H2 in NC. The highest overall F values are in the LBD group, while NC has comparatively smaller values than both the LBD and RBD groups.
In emotion classification, the features were trained with varying k values for KNN and spread values for PNN, and the classifiers were tested for all groups and frequency bands. Figure 8, Figure 9 and Figure 10 show the classification performance for varying k values in the three individual EEG frequency sub-bands (alpha, beta, and gamma) and in the combination of the three bands, using the Cityblock KNN classifier. From the figures, the average accuracy of the bispectrum features was similar across all k values tested for the alpha, beta, and gamma bands. However, a k value of 1 achieved the highest average accuracy when using the features from the combination of the alpha to gamma bands. Moreover, the combination of the alpha to gamma bands performs significantly better than the other frequency bands for all k values, as shown in Figure 8, Figure 9 and Figure 10. The beta band, on the other hand, is the best performing single band among the three EEG sub-bands.
In Figure 11, Figure 12 and Figure 13, the average accuracy of the emotion classification for varying spread values using the PNN classifier is plotted for LBD, RBD and NC, respectively. For most of the features, the accuracies are consistent for spread values between 0.1 and 0.6; the accuracy then gradually drops when the spread value exceeds 0.6 and declines further as the spread value increases. The spread value of 0.4 was chosen to classify the six emotional states, as it achieved the optimum accuracy for most of the features. Similarly, the combination of frequency bands has the highest average accuracy for all spread values, and the beta band is the best performing individual frequency band among the three sub-bands in all groups.
Table 3 shows the average accuracy over all emotional states of the bispectrum features from the combination of all bands, from alpha to gamma, using the KNN and PNN classifiers. For both classifiers, the optimum parameters for LBD, RBD and NC are the same: the optimum k value in KNN classification for all three groups is 1, and the optimum spread value in PNN classification is 0.4. In Table 3, KNN shows a higher average accuracy than the PNN classifier for all three groups. Notably, the H3 feature has the highest average accuracy in all three groups using KNN and the highest accuracy in the LBD group using PNN, while the H1 feature obtains the highest average accuracy in RBD and NC using PNN. According to the results obtained, the top three features are H3, H1 and H2 for all groups, whereas the worst performing bispectrum feature is the variance. The highest average classification accuracy is 65.40%, in the NC group using KNN. Hence, the H3 feature is considered the most effective bispectrum feature in this study.
The confusion matrix of the H3 feature in KNN emotion classification is presented in Table 4, Table 5 and Table 6 for LBD, RBD and NC, respectively. From Table 4, the highest predicted class is happiness in LBD. In Table 5, the RBD group has the highest predicted value in sadness and surprise emotions, whereas the NC group has the highest predicted value in sadness emotion in Table 6. For PNN classification, the confusion matrix is presented in Table 7, Table 8 and Table 9 for each subject group. Likewise, the PNN classification predicted the happiness emotion correctly in the LBD group, as shown in Table 7. In addition, the sadness emotion has the highest classification accuracy in RBD and NC groups as shown in Table 8 and Table 9.
The classification rates using the KNN classifier for the individual emotions are shown in Figure 14. From the figure, the emotion with the highest accuracy in all groups was sadness, where the LBD group achieved 65.37%, the RBD group achieved 71.48% and the NC group achieved 75.56%. Meanwhile, the fear emotion recorded the lowest accuracy in all three groups: 53.52%, 57.96% and 60.74% for LBD, RBD and NC, respectively.
The accuracy of the individual emotional states classified using the PNN classifier is shown in Figure 15. As with KNN, sadness was the most accurately classified emotion: the LBD group reached 57.41%, RBD 62.59% and NC 65.19%. In Figure 15, the lowest classification accuracy for the LBD and NC groups occurs for the surprise emotion, at 50.93% and 47.22%, respectively, whereas in the RBD group the disgust emotion recorded the lowest accuracy, at only 50.00%.
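Per-emotion rates of this kind are, in essence, the column-normalized diagonals of the confusion matrices (Tables 4–9): each actual emotion's correct predictions divided by its total trials. A small sketch with an illustrative 3-emotion matrix (the counts are made up, not taken from the paper's tables):

```python
import numpy as np

def per_class_accuracy(cm):
    """Fraction of each actual class (column) predicted correctly
    (diagonal entry), i.e. per-emotion recall."""
    cm = np.asarray(cm, dtype=float)
    return np.diag(cm) / cm.sum(axis=0)

# Illustrative confusion matrix: rows = predicted, columns = actual.
cm = [[40,  6,  8],
      [ 9, 41,  5],
      [ 5,  7, 41]]

acc = per_class_accuracy(cm)  # e.g. 41/54 for the last emotion
```

Averaging `acc` over the classes then gives the kind of per-group average accuracy reported in Table 3.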
In this work, surprise and fear achieved lower recognition rates than the other emotions. This is consistent with facial-expression studies, in which happiness is the most accurately recognized emotion, along with the expressions for anger, sadness and disgust [45,46], and in which there is no convincing evidence that surprise and fear are accurately recognized [47,48,49].
The emotional state with the highest classification accuracy in each group (LBD, RBD and NC) can be regarded as the most distinguishable emotion in that group. In the current results, all three groups show the highest classification accuracy for sadness. Meanwhile, the NC group exhibits the highest average accuracy for both classifiers, followed by the RBD group, with the LBD group trailing behind.
Accordingly, the LBD and RBD stroke patients recorded lower classification accuracies than the NC group, suggesting that the emotional states of the NC subjects are more distinguishable than those of the stroke patients. To validate the differences among the three groups, ANOVA was applied to the average accuracies obtained from the KNN classifier; the resulting p-value was less than 0.05, so the differences in emotion classification accuracy among LBD, RBD and NC are statistically significant. This implies that there are differences in the emotional experiences of the three groups. The NC group achieved the highest emotion classification accuracy, followed by the RBD group, with the LBD group performing worst. Therefore, the NC group yields the most effective EEG emotion classification with machine learning, and the LBD group the least.
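The group comparison can be sketched as a one-way ANOVA over the per-emotion accuracies of each group. The values below are illustrative stand-ins chosen to mimic the reported ordering (NC > RBD > LBD), not the paper's actual numbers:

```python
from scipy.stats import f_oneway

# Hypothetical per-emotion KNN accuracies (%) for the three groups;
# the study's actual values come from Figure 14, not this sketch.
lbd = [58.1, 60.3, 53.5, 62.0, 65.4, 55.2]
rbd = [63.0, 61.5, 58.0, 66.1, 71.5, 60.2]
nc  = [68.3, 66.0, 60.7, 70.2, 75.6, 64.1]

# One-way ANOVA: null hypothesis is that all three group means are equal.
f_stat, p_value = f_oneway(lbd, rbd, nc)
significant = p_value < 0.05
```

A p-value below 0.05 rejects the null hypothesis that the three group means are equal, mirroring the significance test reported above.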
This work differs from past studies in which only second-order statistical measures, such as the power spectrum [50,51], a linear feature, were used. The power spectrum reveals only the amplitude information of the EEG signals; phase information, such as phase coupling within the signal, cannot be observed from it. Furthermore, linear approaches ignore the nonlinear characteristics of EEG signals; thus, the bispectrum was implemented in this study to detect and characterize these nonlinearities. The bispectrum provided distinctive information for the different emotional states, which was useful for emotion classification, achieving the highest accuracy of 75.56% with the H3 bispectrum feature.
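The advantage over the power spectrum can be illustrated with a direct bispectrum estimate: for quadratically phase-coupled components, the triple product X(f1)X(f2)X*(f1+f2) has a phase of zero in every segment and therefore adds coherently, producing a bispectral peak at (f1, f2) that a phase-blind power spectrum cannot reveal. A minimal sketch (segment length, bin frequencies and phases are arbitrary choices, not the study's settings):

```python
import numpy as np

def bispectrum(x, nfft=128):
    """Direct bispectrum estimate: average X(f1)X(f2)X*(f1+f2)
    over non-overlapping, Hanning-windowed segments of the signal."""
    segs = [x[i:i + nfft] for i in range(0, len(x) - nfft + 1, nfft)]
    win = np.hanning(nfft)
    B = np.zeros((nfft // 2, nfft // 2), dtype=complex)
    for s in segs:
        X = np.fft.fft(s * win)
        for f1 in range(nfft // 2):
            for f2 in range(nfft // 2):
                if f1 + f2 < nfft:
                    B[f1, f2] += X[f1] * X[f2] * np.conj(X[f1 + f2])
    return np.abs(B) / len(segs)

# Two sinusoids plus their quadratically coupled component at f1 + f2:
# the sum frequency inherits the sum of the phases (QPC), which shows
# up as a bispectral peak at the bifrequency (f1, f2).
n = np.arange(4096)
f1, f2 = 8, 13                       # FFT bins standing in for coupled rhythms
phase1, phase2 = 0.7, 1.9
x = (np.cos(2 * np.pi * f1 * n / 128 + phase1)
     + np.cos(2 * np.pi * f2 * n / 128 + phase2)
     + np.cos(2 * np.pi * (f1 + f2) * n / 128 + phase1 + phase2))  # QPC term

B = bispectrum(x, nfft=128)
# B[f1, f2] is large; an uncoupled bifrequency such as B[5, 20] stays small.
```

If the component at f1 + f2 had an independent random phase, the triple products would average toward zero, so the bispectrum distinguishes true QPC from coincidental spectral peaks.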

5. Conclusions

The importance of emotion assessment in stroke patients stems from the need for information on the severity of emotional impairment symptoms. Therefore, an accurate emotion assessment approach is required to identify the symptoms of mood disorders in stroke patients. This work proposed the use of bispectrum features to classify the discrete emotions (anger, disgust, fear, happiness, sadness and surprise) of stroke patients and normal subjects, with the aim of developing an accurate emotion identification method that can recognize the current emotional state of stroke patients during diagnosis.
In this work, the bispectrum reveals the presence of QPC in the EEG signals and exhibits different QPC relations in each emotional state. These differences in harmonic components and peaks, shown in the bispectrum contour plots, arise from the nonlinear interactions between neuronal populations in each emotional state. The proposed method of emotion classification using bispectrum features and the KNN classifier was most effective on the combination of the alpha to gamma frequency bands, with the H3 feature providing an accuracy of 75.56% in the NC group. The proposed method gave results comparable with current studies in emotion classification. However, only six types of bispectrum features were implemented in this study and more remain to be explored; future work could also focus on optimizing the classification accuracy.
To conclude, bispectrum-based features are effective for analyzing the nonlinearity of EEG signals and are therefore useful for emotion assessment. The bispectrum features were able to capture the emotional information of stroke patients and can hence be used as a substitute for conventional observation-based or scoring methods.

Author Contributions

Conceptualization, M.I.O., C.W.Y. and W.K.W.A.; methodology, M.I.O., C.W.Y. and W.K.W.A.; software, C.W.Y. and S.A.B.; validation, M.M., Y.R. and W.A.M.; formal analysis, A.H.A.; investigation, C.W.Y. and W.K.W.A.; resources, B.S.Z. and M.M.; data curation, A.K.J.; writing—original draft preparation, C.W.Y. and W.K.W.A.; writing—review and editing, C.W.Y. and A.H.A.; visualization, Z.M.R.; supervision, W.K.W.A.; project administration, W.K.W.A.; funding acquisition, W.K.W.A. and Z.M.R. All authors have read and agreed to the published version of the manuscript.

Funding

The authors would like to acknowledge the support from the Fundamental Research Grant Scheme (FRGS) under grant number FRGS/1/2019/ICT04/UNIMAP/02/1 from the Ministry of Education Malaysia.

Acknowledgments

The authors would like to thank Medyna Rehab and Services for allowing us to conduct the data collection.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Lee, Y.; Shafie, A.; Sidek, N.; Aziz, Z. Economic burden of stroke in Malaysia: Results from national neurology registry. J. Neurol. Sci. 2017, 381, 167–168.
2. American Heart Association. High Blood Cholesterol and Other Lipids. In 2019 Heart Disease & Stroke Statistical Update Fact Sheet Blacks & Cardiovascular Diseases; American Heart Association: Dallas, TX, USA, 2019; pp. 2012–2015.
3. Li, J.; Oakley, L.D.; Brown, R.L.; Li, Y.; Ye, M.; Luo, Y. Early symptom measurement of Post-Stroke Depression (PSD). J. Affect. Disord. 2016, 197, 215–222.
4. Mayor, S. Persistent depression doubles stroke risk despite treatment, study finds. Stroke Vasc. Neurol. 2015, 350, h2611.
5. Chriki, L.S.; Stern, T.A.; Bullian, S.S. The Recognition and Management of Psychological Reactions to Stroke: A Case Discussion. Prim. Care Companion J. Clin. Psychiatry 2006, 8, 234–240.
6. NHS Improvement. Psychological Care after Stroke. 2011. Available online: https://www.england.nhs.uk/improvement-hub/publication/psychological-care-after-stroke-improving-stroke-services-for-people-with-cognitive-and-mood-disorders/ (accessed on 24 September 2020).
7. Berg, A.; Kaste, M.; Lönnqvist, J.; Palomäki, H. Assessment of Depression after Stroke. Stroke 2009, 40, 523–529.
8. Jerritta, S.; Murugappan, M.; Nagarajan, R.; Wan, K. Physiological signals based human emotion recognition: A review. In Proceedings of the 2011 IEEE 7th International Colloquium on Signal Processing and Its Applications, Penang, Malaysia, 4 March 2011; pp. 410–415.
9. Nava, E.; Romano, D.; Grassi, M.; Turati, C. Skin conductance reveals the early development of the unconscious processing of emotions. Cortex 2016, 84, 124–131.
10. Zhang, Q.; Chen, X.; Zhan, Q.; Yang, T.; Xia, S. Respiration-based emotion recognition with deep learning. Comput. Ind. 2017, 92, 84–90.
11. Selvaraj, J.; Murugappan, M.; Wan, K.; Yaacob, S. Electrocardiogram-based emotion recognition system using empirical mode decomposition and discrete Fourier transform. Expert Syst. 2013, 31, 110–120.
12. Bong, S.Z.; Wan, K.; Murugappan, M.; Ibrahim, N.M.; Rajamanickam, Y.; Mohamad, K. Implementation of wavelet packet transform and non linear analysis for emotion classification in stroke patient using brain signals. Biomed. Signal Process. Control. 2017, 36, 102–112.
13. Schalk, G.; Mellinger, J. Brain Sensors and Signals. In A Practical Guide to Brain–Computer Interfacing with BCI2000; Springer: London, UK, 2010; pp. 9–35.
14. Lindquist, K.A.; Wager, T.D.; Kober, H.; Bliss-Moreau, E.; Barrett, L.F. The brain basis of emotion: A meta-analytic review. Behav. Brain Sci. 2012, 35, 121–143.
15. Adamaszek, M.; Olbrich, S.; Kirkby, K.; Woldag, H.; Willert, C.; Heinrich, A. Event-related potentials indicating impaired emotional attention in cerebellar stroke—A case study. Neurosci. Lett. 2013, 548, 206–211.
16. Doruk, D.; Simis, M.; Imamura, M.; Brunoni, A.R.; Morales-Quezada, L.; Anghinah, R.; Fregni, F.; Battistella, L.R. Neurophysiologic Correlates of Post-stroke Mood and Emotional Control. Front. Hum. Neurosci. 2016, 10.
17. Yuvaraj, R.; Murugappan, M.; Sundaraj, K.; Khairiyah, M.; Norlinah, M. Review of Emotion Recognition in Stroke Patients. Dement. Geriatr. Cogn. Disord. 2013, 36, 179–196.
18. Yeh, Z.-T.; Tsai, C.-F. Impairment on theory of mind and empathy in patients with stroke. Psychiatry Clin. Neurosci. 2014, 68, 612–620.
19. Aben, H.P.; Reijmer, Y.D.; Visser-Meily, J.M.A.; Spikman, J.M.; Biessels, G.J.; De Kort, P.L.M.; PROCRAS Study Group. Impaired Emotion Recognition after Left Hemispheric Stroke: A Case Report and Brief Review of the Literature. Case Rep. Neurol. Med. 2017, 2017, 1045039.
20. Yuvaraj, R.; Murugappan, M.; Acharya, U.R.; Adeli, H.; Ibrahim, N.M.; Mesquita, E. Brain functional connectivity patterns for emotional state classification in Parkinson’s disease patients without dementia. Behav. Brain Res. 2016, 298, 248–260.
21. Klonowski, W. Everything you wanted to ask about EEG but were afraid to get the right answer. Nonlinear Biomed. Phys. 2009, 3, 2.
22. Tsuda, I. Toward an interpretation of dynamic neural activity in terms of chaotic dynamical systems. Behav. Brain Sci. 2001, 24, 793–810.
23. Huang, G.; Zhang, D.; Meng, J.; Zhu, X. Interactions between two neural populations: A mechanism of chaos and oscillation in neural mass model. Neurocomputing 2011, 74, 1026–1034.
24. Lee, Y.-J.; Zhu, Y.-S.; Xu, Y.-H.; Shen, M.-F.; Zhang, H.-X.; Thakor, N.V. Detection of non-linearity in the EEG of schizophrenic patients. Clin. Neurophysiol. 2001, 112, 1288–1294.
25. Rangaprakash, D.; Pradhan, N. Study of phase synchronization in multichannel seizure EEG using nonlinear recurrence measure. Biomed. Signal Process. Control. 2014, 11, 114–122.
26. Zappasodi, F.; Olejarczyk, E.; Marzetti, L.; Assenza, G.; Pizzella, V.; Tecchio, F. Fractal Dimension of EEG Activity Senses Neuronal Impairment in Acute Stroke. PLoS ONE 2014, 9, e100199.
27. Yuvaraj, R.; Murugappan, M.; Ibrahim, N.M.; Sundaraj, K.; Omar, M.I.; Mohamad, K.; Palaniappan, R. Detection of emotions in Parkinson’s disease using higher order spectral features from brain’s electrical activity. Biomed. Signal Process. Control. 2014, 14, 108–116.
28. Acharya, U.R.; Bhat, S.; Faust, O.; Adeli, H.; Chua, E.C.-P.; Lim, W.J.E.; Koh, J.E.W. Nonlinear Dynamics Measures for Automated EEG-Based Sleep Stage Detection. Eur. Neurol. 2015, 74, 268–287.
29. Nikias, C.L. Higher-order spectral analysis. In Proceedings of the 15th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, San Diego, CA, USA, 28 October 1993.
30. Nikias, C.L.; Raghuveer, M.R. Bispectrum estimation: A digital signal processing framework. Proc. IEEE 1987, 75, 869–891.
31. Nikias, C.; Mendel, J. Signal processing with higher-order spectra. IEEE Signal Process. Mag. 1993, 10, 10–37.
32. Yuvaraj, R.; Murugappan, M.; Ibrahim, N.M.; Sundaraj, K.; Omar, M.I.; Mohamad, K.; Palaniappan, R. Optimal set of EEG features for emotional state classification and trajectory visualization in Parkinson’s disease. Int. J. Psychophysiol. 2014, 94, 482–495.
33. Hosseini, S.A. Classification of Brain Activity in Emotional States Using HOS Analysis. Int. J. Image Graph. Signal Process. 2012, 4, 21–27.
34. Venkatakrishnan, P.; Sukanesh, R.; Sangeetha, S. Detection of quadratic phase coupling from human EEG signals using higher order statistics and spectra. Signal Image Video Process. 2010, 5, 217–229.
35. Kiciński, W.; Szczepański, A. Quadratic Phase Coupling Phenomenon and Its Properties. Hydroacoustics 2004, 7, 97–106.
36. Chua, K.C.; Chandran, V.; Acharya, U.R.; Lim, C.M. Application of higher order statistics/spectra in biomedical signals—A review. Med. Eng. Phys. 2010, 32, 679–689.
37. Burle, B.; Spieser, L.; Roger, C.; Casini, L.; Hasbroucq, T.; Vidal, F. Spatial and temporal resolutions of EEG: Is it really black and white? A scalp current density view. Int. J. Psychophysiol. 2015, 97, 210–220.
38. Choong, W.Y. Analysis of the Distance Metrics of KNN Classifier for EEG Signal in Stroke Patients. In Proceedings of the 2018 International Conference on Computational Approach in Smart Systems Design and Applications (ICASSDA), Kuching, Malaysia, 15–17 August 2018.
39. Specht, D.F. Probabilistic neural networks. Neural Netw. 1990, 3, 109–118.
40. Farrokhrooz, M.; Karimi, M.; Rafiei, A. A new method for spread value estimation in multi-spread PNN and its application in ship noise classification. In Proceedings of the 2007 9th International Symposium on Signal Processing and Its Applications, Sharjah, UAE, 12–15 February 2007; pp. 1–4.
41. Pastell, M.; Kujala, M. A Probabilistic Neural Network Model for Lameness Detection. J. Dairy Sci. 2007, 90, 2283–2292.
42. Zhou, S.-M.; Gan, Q.; Sepulveda, F. Classifying mental tasks based on features of higher-order statistics from EEG signals in brain–computer interface. Inf. Sci. 2008, 178, 1629–1640.
43. Goshvarpour, A.; Goshvarpour, A.; Rahati, S.; Saadatian, V. Bispectrum Estimation of Electroencephalogram Signals during Meditation. Iran. J. Psychiatry Behav. Sci. 2012, 6, 48–54.
44. Zhang, J.-W.; Zheng, C.-X.; Xie, A. Bispectrum analysis of focal ischemic cerebral EEG signal using third-order recursion method. IEEE Trans. Biomed. Eng. 2000, 47, 352–359.
45. Elfenbein, H.A.; Ambady, N. On the universality and cultural specificity of emotion recognition: A meta-analysis. Psychol. Bull. 2002, 128, 203–235.
46. Mancini, G.; Biolcati, R.; Agnoli, S.; Andrei, F.; Trombini, E. Recognition of Facial Emotional Expressions among Italian Pre-adolescents, and Their Affective Reactions. Front. Psychol. 2018, 9, 1303.
47. Reisenzein, R.; Horstmann, G.; Schützwohl, A. The Cognitive-Evolutionary Model of Surprise: A Review of the Evidence. Top. Cogn. Sci. 2017, 11, 50–74.
48. Delicato, L.S. A robust method for measuring an individual’s sensitivity to facial expressions. Atten. Percept. Psychophys. 2020, 82, 2924–2936.
49. Crivelli, C.; Russell, J.A.; Jarillo, S.; Fernández-Dols, J.-M. The fear gasping face as a threat display in a Melanesian society. Proc. Natl. Acad. Sci. USA 2016, 113, 12403–12407.
50. Yuvaraj, R.; Murugappan, M.; Ibrahim, N.M.; Omar, M.I.; Sundaraj, K.; Mohamad, K.; Palaniappan, R.; Mesquita, E.; Satiyan, M. On the analysis of EEG power, frequency and asymmetry in Parkinson’s disease during emotion processing. Behav. Brain Funct. 2014, 10, 12.
51. Jadhav, N.; Manthalkar, R.; Joshi, Y.V. Effect of meditation on emotional response: An EEG-based study. Biomed. Signal Process. Control. 2017, 34, 101–113.
Figure 1. Symmetry regions and non-redundant region (Ω) of bispectrum.
Figure 2. Electrodes placement of Emotiv EPOC according to 10–20 system.
Figure 3. Data collection protocol.
Figure 4. Bispectrum contour plot of LBD anger emotion in (a) alpha band, (b) beta band, and (c) gamma band.
Figure 5. The bispectrum contour plot of the non-redundant region and one symmetry region in the alpha band of subject #1 NC group.
Figure 6. The bispectrum contour plot of the non-redundant region and one symmetry region in the alpha band of subject #1 LBD group.
Figure 7. The bispectrum contour plot of the non-redundant region and one symmetry region in the alpha band of subject #1 RBD group.
Figure 8. Average classification performance of bispectrum features by varying k values for the LBD group using k-nearest neighbor (KNN).
Figure 9. Average classification performance of bispectrum features by varying k values for the RBD group using KNN.
Figure 10. Average classification performance of bispectrum features by varying k values for the NC group using KNN.
Figure 11. Average classification performance of bispectrum features by varying spread values for the LBD group using probabilistic neural network (PNN).
Figure 12. Average classification performance of bispectrum features by varying spread values for the RBD group using PNN.
Figure 13. Average classification performance of bispectrum features by varying spread values for the NC group using PNN.
Figure 14. The accuracy of each emotional state using the KNN classifier.
Figure 15. The accuracy of each emotional state using the PNN classifier.
Table 1. Background and neurophysiological characteristics (mean ± std) of left brain damage (LBD), right brain damage (RBD), and normal control (NC) subjects.
Variables                  | LBD          | RBD          | NC           | p-Value
Sample size, N             | 15           | 15           | 15           | NA
Age (year)                 | 56.73 ± 6.15 | 55.87 ± 6.21 | 51.87 ± 4.19 | 0.05979
Female/male                | 5/10         | 5/10         | 12/3         | NA
MMSE score (range: 0–30)   | 26.93 ± 1.77 | 27.2 ± 1.87  | 28.87 ± 0.81 | 0.00392
BDI scale (range: 0–21)    | 7.07 ± 2.91  | 6.20 ± 1.90  | 5.47 ± 1.20  | 0.14936
EHI (range: −1 to 1)       | 0.85 ± 0.50  | 0.85 ± 0.47  | 0.89 ± 0.40  | 0.96493
Duration of disease (year) | 1.74 ± 1.47  | 2.24 ± 1.97  | NA           | NA
Table 2. Statistical validation results of LBD, RBD, and NC by using ANOVA (statistically significant at p ≤ 0.05).
Group | Feature | Alpha F value | Alpha p-value | Beta F value | Beta p-value | Gamma F value | Gamma p-value
LBD   | v       | 11.403        | <0.001        | 8.081        | <0.001       | 28.332        | <0.001
LBD   | H1      | 49.542        | <0.001        | 37.080       | <0.001       | 36.839        | <0.001
LBD   | H2      | 35.788        | <0.001        | 48.353       | <0.001       | 78.123        | <0.001
LBD   | H3      | 35.993        | <0.001        | 50.731       | <0.001       | 78.527        | <0.001
LBD   | H4      | 1.716         | 0.127         | 1.347        | 0.241        | 1.595         | 0.158
LBD   | H5      | 27.171        | <0.001        | 31.501       | <0.001       | 41.921        | <0.001
RBD   | v       | 3.091         | 0.009         | 18.464       | <0.001       | 9.743         | <0.001
RBD   | H1      | 11.070        | <0.001        | 4.255        | 0.001        | 10.508        | <0.001
RBD   | H2      | 18.778        | <0.001        | 14.958       | <0.001       | 28.436        | <0.001
RBD   | H3      | 18.529        | <0.001        | 15.113       | <0.001       | 26.585        | <0.001
RBD   | H4      | 1.785         | 0.112         | 8.005        | <0.001       | 2.676         | 0.020
RBD   | H5      | 7.429         | <0.001        | 27.299       | <0.001       | 25.434        | <0.001
NC    | v       | 6.256         | <0.001        | 14.068       | <0.001       | 8.138         | <0.001
NC    | H1      | 2.993         | 0.011         | 6.004        | <0.001       | 10.874        | <0.001
NC    | H2      | 4.887         | <0.001        | 20.415       | <0.001       | 38.428        | <0.001
NC    | H3      | 4.903         | <0.001        | 20.139       | <0.001       | 36.136        | <0.001
NC    | H4      | 2.335         | 0.040         | 0.442        | 0.820        | 1.550         | 0.171
NC    | H5      | 6.704         | <0.001        | 11.123       | <0.001       | 13.184        | <0.001
Table 3. Summary of average accuracy of all emotions of different bispectrum features using the combination of alpha to gamma bands.
Classifier |         KNN           |         PNN
Group      | LBD   | RBD   | NC    | LBD   | RBD   | NC
k/Spread   | 1     | 1     | 1     | 0.4   | 0.4   | 0.4
v          | 31.02 | 30.28 | 32.72 | 28.43 | 28.30 | 28.70
H1         | 58.27 | 61.79 | 63.67 | 53.24 | 55.06 | 57.41
H2         | 55.59 | 60.06 | 62.90 | 49.88 | 49.72 | 55.25
H3         | 58.67 | 62.22 | 65.40 | 54.57 | 52.84 | 56.11
H5         | 46.54 | 47.38 | 47.10 | 40.46 | 40.31 | 38.83
Table 4. Confusion matrix of the LBD group using the H3 feature in KNN classification.
Predicted \ Actual |  A |  D |  F |  H |  S | SU
A                  | 35 |  8 |  4 |  3 |  2 |  8
D                  |  4 | 31 |  6 |  3 |  5 |  5
F                  |  5 |  5 | 31 |  3 |  2 |  2
H                  |  3 |  2 |  7 | 41 |  7 |  3
S                  |  3 |  3 |  5 |  4 | 35 |  5
SU                 |  4 |  5 |  1 |  0 |  3 | 31
Table 5. Confusion matrix of the RBD group using the H3 feature in KNN classification.
Predicted \ Actual |  A |  D |  F |  H |  S | SU
A                  | 36 |  3 |  5 |  3 |  0 |  3
D                  |  2 | 35 |  4 |  3 |  2 |  4
F                  |  4 | 10 | 32 |  3 |  6 |  3
H                  |  3 |  4 |  9 | 36 |  5 |  4
S                  |  1 |  1 |  2 |  6 | 38 |  2
SU                 |  8 |  1 |  2 |  3 |  3 | 38
Table 6. Confusion matrix of the NC group using the H3 feature in KNN classification.
Predicted \ Actual |  A |  D |  F |  H |  S | SU
A                  | 39 |  4 |  1 |  4 |  1 |  6
D                  |  2 | 40 |  3 |  0 |  3 |  9
F                  |  1 |  4 | 34 |  4 |  3 |  4
H                  |  2 |  3 |  6 | 38 |  5 |  0
S                  |  2 |  1 |  6 |  6 | 41 |  1
SU                 |  8 |  2 |  4 |  2 |  1 | 34
Table 7. Confusion matrix of the LBD group using the H3 feature in PNN classification.
Predicted \ Actual |  A |  D |  F |  H |  S | SU
A                  | 32 |  6 |  3 |  4 |  1 |  7
D                  |  8 | 32 |  7 |  3 |  4 |  1
F                  |  5 |  5 | 30 |  3 |  4 |  2
H                  |  5 |  4 |  5 | 37 |  7 |  7
S                  |  3 |  2 |  6 |  5 | 35 |  3
SU                 |  1 |  5 |  3 |  2 |  3 | 34
Table 8. Confusion matrix of the RBD group using H1 feature in PNN classification.
Predicted \ Actual |  A |  D |  F |  H |  S | SU
A                  | 27 |  4 |  0 |  1 |  1 |  6
D                  |  5 | 35 |  5 |  2 |  4 |  5
F                  |  7 |  3 | 32 |  8 |  4 |  1
H                  |  4 |  2 |  6 | 34 |  2 |  5
S                  |  1 |  2 |  6 |  6 | 41 |  2
SU                 | 10 |  8 |  5 |  3 |  2 | 35
Table 9. Confusion matrix of the NC group using the H1 feature in PNN classification.
Predicted \ Actual |  A |  D |  F |  H |  S | SU
A                  | 32 |  3 |  2 |  2 |  1 |  8
D                  |  5 | 35 | 10 |  4 |  6 |  8
F                  |  5 |  3 | 33 |  2 |  4 |  3
H                  |  4 |  5 |  5 | 35 |  3 |  2
S                  |  3 |  2 |  3 |  9 | 40 |  3
SU                 |  5 |  6 |  1 |  2 |  0 | 30
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
