Article

An Electric Wheelchair Manipulating System Using SSVEP-Based BCI System

Wen Chen, Shih-Kang Chen, Yi-Hung Liu, Yu-Jen Chen and Chin-Sheng Chen

1 Graduate Institute of Automation Technology, National Taipei University of Technology, Taipei 10608, Taiwan
2 Department of Mechatronics Control, Industrial Technology Research Institute, Hsinchu 310401, Taiwan
3 Department of Mechanical Engineering, National Taiwan University of Science and Technology, Taipei 106335, Taiwan
4 Department of Radiation Oncology, MacKay Memorial Hospital, Taipei 10449, Taiwan
* Author to whom correspondence should be addressed.
Biosensors 2022, 12(10), 772; https://doi.org/10.3390/bios12100772
Submission received: 7 August 2022 / Revised: 5 September 2022 / Accepted: 16 September 2022 / Published: 20 September 2022
(This article belongs to the Special Issue Biomedical Signal Processing in Healthcare and Disease Diagnosis)

Abstract

Most people with motor disabilities use a joystick to control an electric wheelchair. However, those who suffer from multiple sclerosis or amyotrophic lateral sclerosis may require other methods of control. This study implements an electroencephalography (EEG)-based brain–computer interface (BCI) system that uses steady-state visual evoked potentials (SSVEPs) to manipulate an electric wheelchair. While operating the human–machine interface, three types of SSVEP scenarios involving real-time visual stimuli are displayed on a monitor or mixed reality (MR) goggles to evoke the EEG signals. Canonical correlation analysis (CCA) is used to classify the EEG signals into the corresponding command classes, and the information transfer rate (ITR) is used to evaluate performance. The experimental results show that the proposed SSVEP stimuli reliably evoke EEG responses, as evidenced by the high classification accuracy of CCA, and the classification results are used to control an electric wheelchair along a specific path. Simultaneous localization and mapping (SLAM) on the Robot Operating System (ROS) platform is the mapping method used for the wheelchair system in this study.

1. Introduction

With the advancement of science and technology, electric wheelchairs are widely used to help disabled people move to their desired destinations. However, a person who suffers from a neurodegenerative disease such as amyotrophic lateral sclerosis (ALS) or multiple sclerosis (MS) [1] (pp. 204–219) may not be able to use a conventional electric wheelchair because they find it difficult to operate a joystick. Therefore, an autonomous electric wheelchair with a navigation system [2,3] can be operated by these individuals. An electrode cap acquires electroencephalography (EEG) signals from the human brain, and a brain–computer interface (BCI) system translates the EEG signals into motion commands in real time [4,5].
BCI systems have been developed over several years and can record many types of neural signals non-invasively or invasively, including EEG, functional magnetic resonance imaging (fMRI), microelectrode arrays (MEAs), intracortical recording and electrocorticography (ECoG) [6] (pp. 814–826). EEG and fMRI belong to the non-invasive BCI systems, whereas intracortical recording and ECoG are the two invasive modalities mainly used in BCI research [7]. This research focuses on non-invasive EEG-based BCIs, which have been realized using a variety of paradigms, including steady-state visual evoked potential (SSVEP), P300, rapid serial visual presentation (RSVP), movement-related potentials (MRPs) and motor imagery [8,9,10]. These paradigms can be divided into two groups: endogenous and exogenous. The exogenous BCI paradigms use an external stimulus, such as a flashing light or an auditory beep, to evoke discriminative patterns in the brain cortex; SSVEP, P300 and RSVP all belong to this group. The endogenous BCI paradigms, such as motor imagery and MRPs, produce brainwave signatures spontaneously without external stimulation [11]. SSVEP is widely used in speller systems because it has a high information transfer rate (ITR) and a good signal-to-noise ratio (SNR) and allows users to select among multiple targets. The P300 waveforms are the brain patterns traditionally used for EEG-based BCIs; they are elicited by rare, task-relevant events and are typically recorded at a latency of approximately 300 ms after the stimulus [12]. However, the literature shows that SSVEP is more accurate than P300, and although the ITR for a hybrid BCI combining P300 and SSVEP is higher than the ITR for SSVEP alone, the total experiment time for P300 is significantly longer than that for SSVEP [13]. This study therefore uses SSVEP to elicit brain patterns. SSVEP scenarios involve several white flickers flashing rapidly on a black background; repetitive visual stimuli can elicit SSVEPs at frequencies from 1 to 100 Hz [14] (pp. 346–353). A screen displays the SSVEP scenarios for the user during the experiment [15] (pp. 614–627). This study uses three types of SSVEP scenarios: the first controls the motion of the electric wheelchair with five commands (forward, left, right, backward and an end command for emergencies), the second presents room information for a destination, such as a room number and name, and the third overlays the room information on a map to allow users to specify a destination more intuitively. Fully automatic navigation may be restricted by environmental factors [16] (pp. 128–139), and although systems can account for such factors, converting the environmental map can require much time; this study therefore combines automatic navigation with direct user control of the wheelchair's direction.
EEG signals are acquired using an electrode cap positioned on the user's scalp. They are interpreted using analytical methods or learning models, such as canonical correlation analysis (CCA), the multivariate synchronization index (MSI) [17], support vector machines (SVMs) [18], power spectral density analysis, k-nearest neighbors (k-NN) and linear discriminant analysis (LDA) [19,20]. The CCA method requires no channel selection or parameter optimization and analyzes the relationship between two sets of frequency information from multi-channel EEG signals. Studies show that CCA analyzes SSVEP signals and detects their frequencies better than other conventional recognition techniques [21,22]. Therefore, this study passes the EEG signals through a bandpass filter to reduce interference and uses CCA to recognize SSVEP events.
The ITR is measured in bits per minute and is used to determine the communication speed of a BCI [23]. This study calculates the ITR and the classification accuracy of CCA to verify the online performance of the BCI system. The classification results from the CCA are then translated into motion commands for a robotic arm or an electric wheelchair. As the application in this paper, the classification results produce motion commands and navigational instructions in real time to convey an electric wheelchair to a destination defined by the user [24].
Mapping is the most elemental requirement for an electric wheelchair with automatic navigation. Before the navigation system is activated, an environment map must be established using sensors such as a camera, sonar or laser scanner, which the wheelchair system uses to localize itself. The most common mapping method is SLAM, which builds a map while simultaneously localizing the robot within it [25]. GMapping, a SLAM algorithm, is used for this study. The navigation system is configured in the ROS environment to ensure safety and convenience.
Inspired by these studies, this research implements an EEG-based BCI system that uses SSVEPs to manipulate an electric wheelchair. The purpose of this system is to make life easier for people with reduced mobility. Using CCA to analyze the EEG signals allows them to be classified into their corresponding categories more accurately. The main contributions of this paper are as follows:
  • For EEG signal analysis, two training-free algorithms, CCA and MSI, are applied in this article. The experimental results show that the average accuracy is at least 80.9% in every condition, so the proposed stimulus can be analyzed by various algorithms.
  • This article improves the presentation of the stimuli by using MR goggles as the display tool, which significantly increases accuracy and reduces the space occupied on the electric wheelchair.
The remainder of this paper is organized as follows. Section 2 presents the architecture for this study. Section 3 describes the design of the BCI system, including the SSVEP scenarios and the methods of analysis. Section 4 demonstrates the proposed BCI-based electric wheelchair control system. Section 5 details the experiments and their results, Section 6 discusses the findings and Section 7 concludes the paper.

2. Architecture

The EEG signals were collected from subjects who wore an EEG electrode cap and gazed at the stimulus of a scenario on a monitor or mixed reality goggles. The three scenarios correspond to different functions of the wheelchair system. The first scenario controls the direction of the wheelchair: four arrow-shaped patterns covered by four flickers represent the four directions, and one square image with the word "end" is covered by the remaining flicker. The second scenario uses five rectangles with information about the destination rooms, covered by flickers of five different frequencies. The third uses an environment map with five rooms, with flickers of five different frequencies overlapping the room images.
Before the EEG data are acquired, the environment map must be constructed by the wheelchair system. Scenario 3, which presents the map, is then shown on the screen or the mixed reality goggles to allow users to navigate to designated locations. The user chooses a destination and a CCA-based classifier classifies the EEG signals into frequency classes to generate five confidence scores. Each confidence score represents the probability that the EEG signal corresponds to that frequency; the highest score indicates the frequency to which the EEG signal is most related, and the electric wheelchair is moved according to the corresponding command. In other words, if the confidence score for one of the frequency categories is significantly higher than the other scores, the destination coordinates that correspond to this frequency category are transmitted to the electric wheelchair, which autonomously navigates to the location specified by the user. The program ends when the device reaches the destination. If all of the confidence scores are lower than the threshold, TM, Scenario 2 replaces Scenario 3.
Scenario 2 uses the same five flickers as Scenario 3 and the EEG signal that corresponds to each flicker also produces five confidence scores after processing by the CCA classifier. If all of the confidence scores are less than a threshold, TR, Scenario 2 is replaced by Scenario 1 to allow the user to control the wheelchair by looking in the direction of travel.
Five confidence scores are also generated for Scenario 1. Similarly, if all of these scores are less than a threshold, TD, Scenario 3 is used for navigation. If one of the confidence scores is significantly higher than the others, the wheelchair moves in the direction that corresponds to this frequency. If the frequency category with the highest confidence score is 13 Hz, which corresponds to the word "end", the program ends; this means the user has moved the electric wheelchair to the designated position and no longer needs to move. In summary, Scenarios 2 and 3 provide automatic navigation and Scenario 1 allows users to manipulate the wheelchair directly; a sketch of this mode-switching logic is shown below. The architecture of the online system for this study is shown in Figure 1.
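As an illustration only, the following minimal Python sketch shows one way the threshold-based scenario switching described above could be implemented. The threshold values follow Table 13 and the fall-back order (Scenario 3 to 2 to 1 to 3) follows the text; the function and variable names are illustrative assumptions, not the authors' code.

```python
import numpy as np

# Minimal sketch of the scenario-switching logic described above (not the
# authors' code). Thresholds follow Table 13; names here are illustrative.
FREQS = [7, 8, 9, 11, 13]                    # stimulus frequencies (Hz)
THRESHOLDS = {1: 0.215, 2: 0.22, 3: 0.22}    # T_D, T_R, T_M per scenario

def step(scenario, scores):
    """One decision step: returns (next_scenario, selected_frequency or None)."""
    scores = np.asarray(scores, dtype=float)
    if scores.max() < THRESHOLDS[scenario]:
        # No reliable selection: Scenario 3 falls back to 2, 2 to 1, 1 to 3.
        return {3: 2, 2: 1, 1: 3}[scenario], None
    return scenario, FREQS[int(scores.argmax())]

# Example: all Scenario 3 scores below T_M, so the system switches to Scenario 2.
print(step(3, [0.12, 0.15, 0.10, 0.18, 0.09]))   # -> (2, None)
```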

3. Brain–Computer Interface System

3.1. SSVEP Scenario Configuration and Design

To acquire the EEG datasets from the human brain, three different SSVEP scenarios are viewed by the user. The five flickers have specific frequencies (7, 8, 9, 11 and 13 Hz) that overlap on the figures and are configured at the four corners and the middle of the black background. There are two command modes to move the electric wheelchair: automatic mode or using human interface devices, such as a joystick or a keyboard.
For this study, Scenario 1 uses five color images to replace the joystick: four arrows with different orientations and an end choice. Scenario 2 presents the information for each room, so the user can choose a room and move directly to it. Scenario 3 uses a map that is constructed by the guidance system of the electric wheelchair. The flickers are spatially separated to avoid interference when participants watch one of the targets. Scenarios 2 and 3 allow the electric wheelchair to move automatically. The configurations for these three scenarios are shown in Figure 2.

3.2. Canonical Correlation Analysis for EEG Signals

Among the many recognition methods, CCA differs from SVM in that it does not require dividing the data into training and test sets. The CCA method maximizes the correlation between the multichannel EEG signals and reference signals. However, the datasets collected by the EEG electrode cap contain 60 Hz AC noise from the environment, so a fourth-order bandpass infinite impulse response (IIR) filter eliminates interference and retains frequencies from 3 to 40 Hz; a sketch of this preprocessing step follows. The filtered datasets are analyzed using CCA, which identifies and measures the associations between two sets of variables [26]. The relationship between each EEG signal, collected by stimulating the visual cortex of the brain, and reference signals at the five class frequencies (7, 8, 9, 11 and 13 Hz) and their harmonics is used to calculate five classification scores for each EEG signal, and the highest score determines the class of the dataset.
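A minimal preprocessing sketch, assuming a Butterworth design for the fourth-order IIR bandpass filter (the paper specifies only the order and the 3–40 Hz band) and the 250 Hz sampling rate reported in Section 5:

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

FS = 250  # EEG sampling rate (Hz), as reported in Section 5

# Fourth-order band-pass IIR filter, 3-40 Hz, suppressing drift and 60 Hz
# mains noise. Butterworth is an assumption; the paper does not name the family.
SOS = butter(4, [3, 40], btype="bandpass", fs=FS, output="sos")

def preprocess(eeg):
    """Zero-phase filter an EEG epoch of shape (channels, samples)."""
    return sosfiltfilt(SOS, np.asarray(eeg, dtype=float), axis=-1)
```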
Assume two matrices $X \in \mathbb{R}^{n \times p}$ and $Y \in \mathbb{R}^{m \times p}$ and define the cross-covariance matrix $\Sigma_{XY} = \mathrm{cov}(X, Y)$, which is an $n \times m$ matrix. Canonical correlation analysis finds the vectors $a \in \mathbb{R}^{n}$ and $b \in \mathbb{R}^{m}$ that maximize the correlation $\rho = \mathrm{corr}(U, V)$ between the random variables $U = a^{T}X$ and $V = b^{T}Y$. Therefore, $\rho$ can be derived as

$$\rho = \mathrm{corr}(U, V) = \frac{\mathrm{cov}(U, V)}{\sigma_{U}\sigma_{V}} = \frac{a^{T}\Sigma_{XY}b}{\sqrt{a^{T}\Sigma_{XX}a}\,\sqrt{b^{T}\Sigma_{YY}b}} \tag{1}$$

Solving for the $a$ and $b$ that maximize $\rho$ can be posed as the constrained optimization problem in Equation (2):

$$\text{maximize } a^{T}\Sigma_{XY}b \quad \text{subject to } a^{T}\Sigma_{XX}a = 1,\; b^{T}\Sigma_{YY}b = 1 \tag{2}$$

Applying the Lagrange multiplier method gives Equation (3):

$$L = a^{T}\Sigma_{XY}b - \frac{\lambda}{2}\left(a^{T}\Sigma_{XX}a - 1\right) - \frac{\theta}{2}\left(b^{T}\Sigma_{YY}b - 1\right) \tag{3}$$

Taking the partial derivatives of $L$ with respect to $a$ and $b$ and setting both to zero yields Equations (4) and (5):

$$\frac{\partial L}{\partial a} = \Sigma_{XY}b - \lambda\Sigma_{XX}a = 0 \tag{4}$$

$$\frac{\partial L}{\partial b} = \Sigma_{YX}a - \theta\Sigma_{YY}b = 0 \tag{5}$$

Multiplying Equation (4) on the left by $a^{T}$ and Equation (5) on the left by $b^{T}$ gives Equations (6) and (7); applying the constraints of Equation (2) then yields Equation (8):

$$a^{T}\Sigma_{XY}b - \lambda\left(a^{T}\Sigma_{XX}a\right) = 0 \tag{6}$$

$$b^{T}\Sigma_{YX}a - \theta\left(b^{T}\Sigma_{YY}b\right) = 0 \tag{7}$$

$$\lambda = \theta = a^{T}\Sigma_{XY}b \tag{8}$$

From Equations (2) and (8), the desired $\lambda$ is the maximum value, and Equations (4) and (5) simplify to Equations (9) and (10):

$$\Sigma_{XX}^{-1}\Sigma_{XY}b = \lambda a \tag{9}$$

$$\Sigma_{YY}^{-1}\Sigma_{YX}a = \lambda b \tag{10}$$

Substituting (10) into (9) gives the eigenvalue problem (11), from which the eigenvalue $\lambda^{2}$ and the eigenvector $a$ can be found; substituting (9) into (10) gives the eigenvector $b$ from (12):

$$\Sigma_{XX}^{-1}\Sigma_{XY}\Sigma_{YY}^{-1}\Sigma_{YX}\,a = \lambda^{2}a \tag{11}$$

$$\Sigma_{YY}^{-1}\Sigma_{YX}\Sigma_{XX}^{-1}\Sigma_{XY}\,b = \lambda^{2}b \tag{12}$$

When $\lambda$ takes its maximum value, the corresponding $a$ and $b$ are the canonical variates, and $\lambda$ is the maximum correlation coefficient between $U$ and $V$, as shown in Equation (13):

$$\rho = \mathrm{corr}(U, V) = \lambda \tag{13}$$
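To make the procedure concrete, the following Python sketch computes the largest canonical correlation of Equation (13) between a filtered EEG epoch and sine–cosine reference signals at each stimulus frequency, using the whitened cross-covariance (equivalent to the eigenvalue problems (11) and (12)). The number of harmonics (two) and the regularization term are assumptions added here, not values from the paper.

```python
import numpy as np

FS = 250                     # sampling rate (Hz)
N_HARMONICS = 2              # number of harmonics Nh; an assumed value
FREQS = [7, 8, 9, 11, 13]    # stimulus frequencies (Hz)

def reference(freq, n_samples, n_harmonics=N_HARMONICS, fs=FS):
    """Sine/cosine reference matrix Y (2*Nh x M) for one stimulus frequency."""
    t = np.arange(n_samples) / fs
    rows = []
    for h in range(1, n_harmonics + 1):
        rows += [np.sin(2 * np.pi * h * freq * t),
                 np.cos(2 * np.pi * h * freq * t)]
    return np.vstack(rows)

def _inv_sqrt(C):
    """Inverse matrix square root of a symmetric positive-definite matrix."""
    w, V = np.linalg.eigh(C)
    return V @ np.diag(w ** -0.5) @ V.T

def max_canonical_corr(X, Y, reg=1e-9):
    """Largest canonical correlation (lambda in Equation (13)) between
    X (N x M) and Y (2*Nh x M), via the whitened cross-covariance."""
    X = X - X.mean(axis=1, keepdims=True)
    Y = Y - Y.mean(axis=1, keepdims=True)
    Cxx = X @ X.T + reg * np.eye(X.shape[0])   # Sigma_XX (regularized)
    Cyy = Y @ Y.T + reg * np.eye(Y.shape[0])   # Sigma_YY
    Cxy = X @ Y.T                              # Sigma_XY
    K = _inv_sqrt(Cxx) @ Cxy @ _inv_sqrt(Cyy)
    return np.linalg.svd(K, compute_uv=False).max()

def classify(eeg):
    """Score every stimulus frequency for one filtered epoch (e.g., O1, O2, Pz)."""
    scores = [max_canonical_corr(eeg, reference(f, eeg.shape[1])) for f in FREQS]
    return FREQS[int(np.argmax(scores))], scores
```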

3.3. Multivariate Synchronization Index

MSI is an algorithm that, like CCA, analyzes the signals directly and requires no training. The measure estimates the synchronization between the actual mixed signals and the reference signals as an index for recognizing the stimulus frequency [17,27]. We use the same filtering method as for CCA to process the EEG signals before applying MSI.
Assume an $N \times M$ matrix $X \in \mathbb{R}^{N \times M}$ representing the filtered EEG signal. As with CCA, MSI requires reference signals constructed from the stimulus frequencies used in an SSVEP-based BCI system; we represent them by a $2N_{h} \times M$ matrix $Y$, where $N$ is the number of channels used in the experiment, $M$ is the number of sampling points of the EEG signal and $N_{h}$ is the number of harmonics used in the experiment. The synchronization index between the two sets of signals is derived as follows.
First, the correlation matrix between $X$ and $Y$ is calculated:

$$C = \begin{bmatrix} C_{11} & C_{12} \\ C_{21} & C_{22} \end{bmatrix} \tag{14}$$

where

$$C_{11} = \frac{1}{M}XX^{T} \tag{15}$$

$$C_{22} = \frac{1}{M}YY^{T} \tag{16}$$

$$C_{12} = C_{21}^{T} = \frac{1}{M}XY^{T} \tag{17}$$

Because this correlation matrix contains both cross-correlation and autocorrelation blocks, and the autocorrelation blocks would bias the synchronization calculation, the autocorrelation blocks are whitened to eliminate their influence:

$$U_{11} = C_{11}^{-1/2} \tag{18}$$

$$U_{22} = C_{22}^{-1/2} \tag{19}$$

$$U = \begin{bmatrix} U_{11} & 0 \\ 0 & U_{22} \end{bmatrix} \tag{20}$$

$$R = UCU^{T} = \begin{bmatrix} I_{N \times N} & C_{11}^{-1/2}C_{12}C_{22}^{-1/2} \\ C_{22}^{-1/2}C_{21}C_{11}^{-1/2} & I_{2N_{h} \times 2N_{h}} \end{bmatrix} \tag{21}$$

Let the matrix $R$ obtained from Equation (21) have $K$ eigenvalues $\lambda_{1}, \ldots, \lambda_{K}$, where $K = N + 2N_{h}$ is the dimension of $R$. The eigenvalues are normalized as

$$E_{i} = \frac{\lambda_{i}}{\sum_{k=1}^{K}\lambda_{k}} \tag{22}$$

Finally, the synchronization index $S$ is calculated as shown in Equation (23):

$$S = 1 + \frac{\sum_{i=1}^{K}E_{i}\log(E_{i})}{\log(K)} \tag{23}$$
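As a companion to the CCA sketch above, the following Python sketch implements Equations (14)–(23). It reuses the `reference` generator from that sketch; the small epsilon guarding log(0) is an implementation detail added here, not part of the paper.

```python
import numpy as np

def msi_index(X, Y):
    """Synchronization index S (Equation (23)) between an EEG epoch X (N x M)
    and a reference Y (2*Nh x M). Both are centered before use."""
    X = X - X.mean(axis=1, keepdims=True)
    Y = Y - Y.mean(axis=1, keepdims=True)
    N, M = X.shape
    C11, C22, C12 = X @ X.T / M, Y @ Y.T / M, X @ Y.T / M   # Equations (15)-(17)

    def inv_sqrt(C):
        w, V = np.linalg.eigh(C)
        return V @ np.diag(w ** -0.5) @ V.T

    U1, U2 = inv_sqrt(C11), inv_sqrt(C22)                   # Equations (18)-(19)
    R = np.block([[np.eye(N),        U1 @ C12 @ U2],        # Equation (21)
                  [U2 @ C12.T @ U1,  np.eye(Y.shape[0])]])
    lam = np.linalg.eigvalsh(R)                             # R is symmetric
    E = np.clip(lam / lam.sum(), 1e-12, None)               # Equation (22)
    return 1 + (E * np.log(E)).sum() / np.log(len(lam))     # Equation (23)

# Frequency recognition: pick the reference with the largest index, e.g.
# scores = [msi_index(epoch, reference(f, epoch.shape[1])) for f in FREQS]
```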

3.4. Information Transfer Rate of EEG Signals

A standard measure for communication systems is the bit rate: the amount of information communicated per unit time, which depends on both speed and accuracy [28] (pp. 94–109). To determine whether the stimuli convert the user's selections into information effectively, this study calculates the ITR for each flicker frequency; the information transferred per trial for different numbers of targets is shown in Figure 3. The ITR is calculated as

$$B\;(\text{bits/trial}) = \log_{2}n + p\log_{2}p + (1-p)\log_{2}\frac{1-p}{n-1} \tag{24}$$

$$Q\;(\text{trials/min}) = \frac{60}{t} \tag{25}$$

$$\mathrm{ITR}\;(\text{bits/min}) = B \times Q \tag{26}$$

where n is the number of targets (for this study, there are five classes in each scenario), p is the average classification accuracy over the five classes in each scenario and t is the average time in seconds for one selection, which includes the stimulation time and the rest time before the following stimulus appears. Since S trials take S × t seconds, the selection rate Q reduces to 60/t. For the experiment, the stimulation time plus the rest time is three seconds per trial; the ITR for different average selection times is shown in Figure 4.
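Equations (24)–(26) can be checked numerically. The sketch below reproduces the Scenario 1 screen result reported later in Table 14 (n = 5 targets, average accuracy p = 0.907 and t = 3 s per selection give 33.79 bits/min):

```python
import numpy as np

def itr_bits_per_min(n, p, t):
    """Equations (24)-(26): n targets, mean accuracy p, t seconds per selection."""
    if p >= 1.0:
        b = np.log2(n)            # perfect accuracy transfers log2(n) bits/trial
    else:
        b = (np.log2(n) + p * np.log2(p)
             + (1 - p) * np.log2((1 - p) / (n - 1)))
    return b * (60.0 / t)         # Q = 60 / t trials per minute

print(round(itr_bits_per_min(5, 0.907, 3), 2))   # -> 33.79, matching Table 14
```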

4. Electric Wheelchair Control

This study uses a human–machine interface that relies on brain signals to manipulate an electric wheelchair without a joystick; the stimuli that replace the joystick are described in the previous section. When the CCA maps the brain signals to a corresponding control command, an interpreter translates the command into ROS messages that control the electric wheelchair.

4.1. Hardware and Software

The electric wheelchair for this study is shown in Figure 5. It is controlled using an ASUS laptop (Taipei, Taiwan) with an Intel Core i7-6700HQ CPU, 8 GB of RAM and an NVIDIA GeForce GTX 960M GPU. To reach a specified destination while avoiding obstacles, an Intel RealSense D435i RGB-D camera (Intel Corporation, Santa Clara, CA, USA) and a SICK TIM551 2D-LiDAR (SICK, Waldkirch, Germany) are used to determine the geography of the environment, as shown in Figure 6. Furthermore, the operating system is Ubuntu 18.04 (Canonical, London, UK) with the ROS environment installed and configured on the PC.

4.2. Navigation

The autonomous navigation system is based on a mapping process using SLAM, which includes visual SLAM (V-SLAM) and LiDAR SLAM. For this study, the GMapping algorithm maps the environment using the 2D-LiDAR before the electric wheelchair is driven to a specified destination. The current location of the electric wheelchair is determined on the map, and the current coordinates are sent to the SLAM stack, which generates the navigational information through ROS's publish/subscribe mechanism. Finally, the control command, which is the classification result from the EEG signals, is transmitted to the navigation system to complete the corresponding task.
The picture depicting the direction in which the wheelchair moves in Scenario 1 has five flickers that correspond to end, forward, backward, left and right turn. When the CCA classifier recognizes the class of the EEG signal corresponding to the flicker, the flicker that the subject observes is identified. For example, in Scenario 1, the flicker with a frequency of 7 Hz corresponds to the backward direction. When the CCA classifier recognizes the EEG data, the wheelchair receives a command to move backward.
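As a hypothetical sketch of this command translation (the paper does not publish its code), the recognized class could be mapped to a velocity command on the standard ROS /cmd_vel topic. Only the 7 Hz = backward and 13 Hz = end assignments are stated in the text; the remaining assignments, the topic name and the speed values are assumptions.

```python
import rospy
from geometry_msgs.msg import Twist

# Hypothetical Scenario 1 mapping: (linear m/s, angular rad/s) per frequency.
# 7 Hz -> backward and 13 Hz -> end follow the text; the rest are assumptions.
CLASS_TO_VEL = {7: (-0.2, 0.0),    # backward
                8: (0.2, 0.0),     # forward (assumed)
                9: (0.0, 0.5),     # left turn (assumed)
                11: (0.0, -0.5),   # right turn (assumed)
                13: (0.0, 0.0)}    # end: stop

def send_direction(freq, pub):
    """Publish the velocity that corresponds to the recognized SSVEP class."""
    linear, angular = CLASS_TO_VEL[freq]
    msg = Twist()
    msg.linear.x, msg.angular.z = linear, angular
    pub.publish(msg)

if __name__ == "__main__":
    rospy.init_node("ssvep_direction_control")
    pub = rospy.Publisher("/cmd_vel", Twist, queue_size=1)
    rospy.sleep(1.0)            # let the publisher connect
    send_direction(8, pub)      # e.g., a recognized 8 Hz stimulus -> forward
```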
The automatic navigation system is initiated if the second mode is used. In Scenario 2, the flicker with a frequency of 7 Hz corresponds to room "801A". The wheelchair then receives the navigation instruction, which contains the location coordinates of room "801A", and the map of the experimental field is used to move the electric wheelchair to room "801A".
Mode 3 also triggers the automatic navigation system, which allows the wheelchair to move autonomously. If the user does not know the site map or the relative location of a room, Mode 3 provides an intuitive scenario: the participant looks at the screen for Scenario 3, which shows the environment map. The flicker with a frequency of 7 Hz also corresponds to room "801A" in Scenario 3, and the acquired EEG data are transmitted to the CCA classifier to determine the frequency. A sketch of dispatching such a navigation goal follows.
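A hypothetical sketch of dispatching a destination through the standard ROS move_base action interface; the room coordinates are placeholders, since the paper does not list them.

```python
import rospy
import actionlib
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

# Placeholder map coordinates per stimulus frequency (values invented for
# illustration; the paper does not publish the room coordinates).
ROOM_GOALS = {7: (3.2, 1.5)}       # e.g., 7 Hz -> room "801A"

def navigate_to(freq):
    """Send the wheelchair to the room selected by the recognized frequency."""
    client = actionlib.SimpleActionClient("move_base", MoveBaseAction)
    client.wait_for_server()
    goal = MoveBaseGoal()
    goal.target_pose.header.frame_id = "map"
    goal.target_pose.header.stamp = rospy.Time.now()
    goal.target_pose.pose.position.x, goal.target_pose.pose.position.y = ROOM_GOALS[freq]
    goal.target_pose.pose.orientation.w = 1.0     # face along the map x-axis
    client.send_goal(goal)
    client.wait_for_result()
```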

5. Experiments

5.1. Experimental Setup

To ensure the accuracy of CCA classification in the real-time system for every user, an experiment analyzed the acquired EEG signals of twelve subjects. During the data-acquisition phase, each subject performed 20 trials for each scenario displayed on the monitor and on the MR goggles. The total length of each recorded data sequence was 3 s.
This study acquired EEG signals using an OpenBCI Cyton board with OpenBCI software (Brooklyn, NY, USA) and collected the EEG signals from a 21-channel EEG cap with a sampling rate of 250 Hz. The scenarios were displayed on a monitor and on MR goggles: an ASUS XG279Q, a high-end stimulus monitor with a 144 Hz refresh rate, and a Microsoft HoloLens 2 with a 60 Hz refresh rate were used to create the stimuli for the BCI experiment. A red square overlapping a flicker was used to specify the picture at which the participant should look.
The subject focused on the marked object and followed the instructions displayed on the screen or the MR goggles to collect data for frequencies of 7, 8, 9, 11 and 13 Hz. During the experiment using a screen, the participants sat 30 cm away from the screen and observed the flickers; the configuration is shown in Figure 7. For the experiment using MR goggles, participants wore the electrode cap kit and then put on the MR goggles; the configuration is shown in Figure 8. Twelve subjects, ten males and two females, participated in the experiment. Their ages were 39 ± 17 years. Each subject read and signed an informed consent form approved by the Research Ethics Committee for Human Research Protections (21MMHIS241e). The acquired data were fed into a CCA classifier to determine the classification accuracy.
Once the CCA classification rate was verified, the electric wheelchair was operated online. When controlling the electric wheelchair using the real-time system, the subject focuses on the interface or the MR goggles and has five options that correspond to the respective flickers. The acquired EEG signal is classified, and the outcome, either a movement command or a target location, is transmitted to the wheelchair via TCP/IP.
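The paper states only that the results travel over TCP/IP, so the host address, port and message format in this sketch are assumptions.

```python
import json
import socket

def send_result(freq, host="192.168.0.10", port=5000):
    """Send one classification result to the wheelchair controller over TCP.

    The address, port and JSON message format here are assumptions for
    illustration; the paper does not specify the wire protocol.
    """
    with socket.create_connection((host, port), timeout=2.0) as sock:
        sock.sendall(json.dumps({"class_hz": freq}).encode() + b"\n")
```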

5.2. SSVEP Experimental Results

This study collected the EEG signals from the 21-channel EEG cap at a sampling rate of 250 Hz. A laptop with an Intel Core i7-10750H CPU, 16 GB of RAM and an NVIDIA GeForce GTX 1660 Ti 6 GB GPU was used to acquire the EEG signals from the amplifier. The configuration of the channels on the EEG cap is shown in Figure 9. The three channels O1, O2 and Pz in the occipital region (the yellow region in Figure 9) were used as the CCA classifier's input signals to recognize the SSVEP stimulus.
The three scenarios describe the direction (Scenario 1), the room information (Scenario 2) and the environment map (Scenario 3), and were analyzed using the CCA and MSI classifiers. One experiment used a screen to present the scenarios and the other used MR goggles to display the flickers. The results of the first experiment, using a screen and the CCA classification method, collected from twelve participants, are shown in Table 1, Table 2 and Table 3, respectively. The results of the same experiment using MSI as the analysis tool are shown in Table 4, Table 5 and Table 6, respectively.
Scenario 1 was designed to function like a joystick, controlling the direction of the electric wheelchair. The accuracies (ACCs) of CCA for the four orientations (backward, forward, left and right) and the end option, at frequencies of 7 Hz, 8 Hz, 9 Hz, 11 Hz and 13 Hz, were 95%, 90.8%, 90.4%, 94.2% and 83%, respectively. The average ACC for all frequencies was 90.7%.
For the experiment involving automatic control, five pictures of room tags with the rooms' names and identification numbers were used. For Scenario 2, the same frequencies were used, and the classification ACCs of CCA were 94.6%, 92.1%, 92.1%, 88.3% and 77.1%, respectively. The average ACC for all frequencies was 88.8%.
Scenario 3 used a map of the entire experimental field, so the subject could select the location of a room directly. The ACCs of CCA were 87.1%, 82.5%, 87.1%, 82.1% and 76.7%, respectively. The average ACC for all frequencies was 83.1%.
For the experiment that used a screen to present the scene, Scenario 3 used a map so the user saw the location of the destination directly, but the ACC was noticeably lower. Therefore, Scenario 2 was retained to confirm the user's choice and increase the accuracy of the BCI system.
Using the MSI analysis tool on the same Scenario 1 data, the results were 94.5%, 85%, 89.6%, 88.8% and 72.9%, respectively; the average ACC for all frequencies was 86.2%. For Scenario 2, using the same frequencies and MSI for analysis, the classification ACCs were 94.2%, 91.3%, 90.8%, 79.6% and 70%, respectively; the average ACC was 86.3%. The classification rates for Scenario 3 were 92.1%, 84.6%, 82.1%, 77.5% and 68.3%, respectively; the average ACC was 80.9%.
The results for the experiment that used MR goggles and CCA for analysis, collected from the same twelve participants, are shown in Table 7, Table 8 and Table 9, respectively. The experimental results using MSI as the analysis method are shown in Table 10, Table 11 and Table 12. Scenarios 1, 2 and 3 were the same as those for the experiment using the screen. Using MR goggles, the ACCs for Scenario 1, which controlled the direction of the electric wheelchair, for the four orientations (backward, forward, left and right) and the end option at frequencies of 7 Hz, 8 Hz, 9 Hz, 11 Hz and 13 Hz were 95.8%, 97.9%, 100%, 98.8% and 97.5%, respectively; the average ACC was 98%. Scenario 2 used the same frequencies and the classification ACCs were 94.6%, 96.3%, 98.8%, 98.3% and 95.8%, respectively; the average ACC was 96.8%. Scenario 3 used a map of the entire experimental field and the ACCs were 99.6%, 98.8%, 100%, 98.3% and 97.5%, respectively; the average ACC was 98.8%.
In the experiment using MR goggles as the display, the MSI results for Scenario 1 were 97.5%, 98.8%, 100%, 97.1% and 95.4%; the average ACC was 97.8%. Scenario 2 used the same frequencies and the classification ACCs were 96.3%, 93.8%, 98.8%, 95% and 92.1%, respectively; the average ACC was 95.2%. Scenario 3 used a map of the entire experimental field and the ACCs were 99.2%, 98.8%, 98.8%, 96.7% and 88.3%, respectively; the average ACC was 95%.
During an online CCA experiment, each frequency category generates a confidence score. The CCA classifier uses these confidence scores to generate the classification result: the proposed recognition algorithm selects the highest score and assigns the EEG signal to the corresponding frequency. However, if the user does not pay attention to the flickers on the screen or wants to change the mode, all of the confidence scores are low. Thresholds based on these twelve subjects are therefore proposed: for Scenario 1 the threshold, TD, is 0.215, and for Scenarios 2 and 3 the thresholds, TR and TM, are both 0.22. Table 13 shows the thresholds used by the BCI system to enter the next mode.
These experimental results show that the CCA classifier accurately classifies the EEG signals into corresponding classes. Using MR goggles to present flickering stimuli is more accurate and convenient than using a screen to present the scenario. A higher classification rate allows the electric wheelchair to move more stably and safely and the BCI system is easier to use.

5.3. ITR Experimental Results

After the accuracy of the classification results from the CCA classifier was determined, the ITR was used to determine whether the scenarios in this study were suitable for stimulating the brain.
For each experiment, one selection took three seconds: stimulation took two seconds and there was a rest for one second. Five targets were used for each scenario, with frequencies of 7 Hz, 8 Hz, 9 Hz, 11 Hz and 13 Hz. This study used two types of experiments and three scenarios to determine the accuracy. A screen and MR goggles were used to present the scene.
Table 14 shows the average ACC and ITR values for the proposed three scenarios in the first experiment. The respective ITR values for Scenarios 1, 2 and 3 were 33.79, 31.84 and 26.57 bits per minute, which indicate how effectively each stimulus converts the user's selections into information. Table 15 shows the mean ACC and ITR values for the three scenarios in the second experiment: 42.81 bits per minute for Scenario 1, 41.07 bits per minute for Scenario 2 and 44.08 bits per minute for Scenario 3. The ITR distribution graphs for these two experiments are shown in Figure 10 and Figure 11.

6. Discussion

The experimental results can be divided into two main parts: (1) this paper mainly uses CCA as the tool for analyzing EEG and compares it with another SSVEP method, MSI, as shown in Table 16; (2) the display used to present the stimuli was varied, and the advantages and disadvantages of each display were compared.
The experimental results show that CCA has the higher accuracy in both experiment 1 and experiment 2. This means that the BCI system proposed in this article is feasible and accurate and offers a degree of reliability for controlling electric wheelchairs. The results also show that MR goggles give significantly better accuracy than the screen. The MR goggles are mounted on the subject's head, so the stimulus remains in the line of sight even if the subject moves their head. The MR goggles also wrap around the eyes, so the visual area is not easily disturbed. In addition, the rear-mounted computing unit of the MR goggles coincides with the occipital region of the EEG cap and presses on the electrodes, so the electrodes are not easily displaced when the subject moves their head.
However, the subjects in the current experiment are all healthy people and the experimental field is not in a hospital, which limits the development of the entire system. Nevertheless, we cooperate with MacKay Memorial Hospital in Taipei, and we will continue to keep in touch, plan further experiments and recruit more representative subjects in the future.

7. Conclusions

This study proposes an electric wheelchair that is controlled using EEG signals that are acquired using an SSVEP-based BCI system. Firstly, three channels that are related to the visual cortex in a 21-channel EEG electrode cap are used to collect the EEG signals from the operator. The subjects focus on the monitor, which displays the three proposed scenarios.
For Scenario 1, the operator directly controls the electric wheelchair's direction of motion; hence, four orientations and a termination choice are displayed. Scenarios 2 and 3 are designed for automatic control: Scenario 2 shows the information for five rooms, and Scenario 3 adds the environment map to the room choices of Scenario 2.
When the EEG signals are collected, this study analyzes the data using a CCA classifier, and the ITR is calculated to evaluate the classification results and the stimulus design. Finally, the processed EEG signals are translated into commands that the electric wheelchair uses to complete its tasks.
The average ACCs for the three scene classifications for the first experiment are 90.67%, 88.82% and 83.08%, respectively. For the experiment using MR goggles to present stimuli, the average ACCs for the three scene classifications are 98%, 96.8% and 98.8%, respectively.
These results show that MR goggles are more effective than a screen for presenting the scene stimulus. A screen is large and occupies most of the space on the wheelchair, whereas MR goggles allow the subject to observe the stimuli, require little space and are not easily disturbed by the outside world. The CCA classification rate is also better with MR goggles.
The accurate results show that the proposed system classifies the EEG signal into the correct category. Furthermore, the electric wheelchair is accurately and safely guided and the BCI system for this study allows the user to reach a specified location easily.

Author Contributions

Conceptualization, C.-S.C., S.-K.C. and Y.-H.L.; methodology, C.-S.C.; software, W.C. and S.-K.C.; validation, C.-S.C. and Y.-J.C.; formal analysis, Y.-H.L.; investigation, C.-S.C., Y.-H.L. and Y.-J.C.; resources, C.-S.C.; data curation, W.C. and S.-K.C.; writing—original draft preparation, W.C. and S.-K.C.; writing—review and editing, C.-S.C.; visualization, W.C. and S.-K.C.; supervision, C.-S.C., Y.-H.L. and Y.-J.C.; project administration, C.-S.C.; funding acquisition, C.-S.C. All authors have read and agreed to the published version of the manuscript.

Funding

This research was supported in part by National Taipei University of Technology and Mackay Memorial Hospital Joint Research Program (NTUT-MMH-101-09); and in part by the Ministry of Science and Technology under project No. MOST 109-2221-E-027-044-MY3.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki and approved by the Research Ethics Committee for Human Research Protections (21MMHIS241e).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Stephenson, J.; Nutma, E.; van der Valk, P.; Amor, S. Inflammation in CNS neurodegenerative diseases. Immunology 2018, 154, 204–219. [Google Scholar] [CrossRef]
  2. Levine, S.P.; Bell, D.A.; Jaros, L.A.; Simpson, R.C.; Koren, Y.; Borenstein, J. The NavChair assistive wheelchair navigation system. IEEE Trans. Rehabil. Eng. 1999, 7, 443–451. [Google Scholar] [CrossRef]
  3. Grewal, H.; Matthews, A.; Tea, R.; George, K. LIDAR-based autonomous wheelchair. In Proceedings of the 2017 IEEE Sensors Applications Symposium (SAS), Glassboro, NJ, USA, 13–15 March 2017; pp. 1–6. [Google Scholar]
  4. Li, Y.; Pan, J.; Wang, F.; Yu, Z. A hybrid BCI system combining P300 and SSVEP and its application to wheelchair control. IEEE Trans. Biomed. Eng. 2013, 60, 3156–3166. [Google Scholar]
  5. Chen, X.; Wang, Y.; Gao, S.; Jung, T.-P.; Gao, X. Filter bank canonical correlation analysis for implementing a high-speed SSVEP-based brain–computer interface. J. Neural Eng. 2015, 12, 046008. [Google Scholar] [CrossRef]
  6. Hwang, H.-J.; Kim, S.; Choi, S.; Im, C.-H. EEG-based brain-computer interfaces: A thorough literature survey. Int. J. Hum.-Comput. Interact. 2013, 29, 814–826. [Google Scholar] [CrossRef]
  7. Palumbo, A.; Gramigna, V.; Calabrese, B.; Ielpo, N. Motor-imagery EEG-based BCIs in wheelchair movement and control: A systematic literature review. Sensors 2021, 21, 6285. [Google Scholar] [CrossRef]
  8. Kaur, M.; Ahmed, P.; Rafiq, M.Q. Technology development for unblessed people using bci: A survey. Int. J. Comput. Appl. 2012, 40, 18–24. [Google Scholar] [CrossRef]
  9. Lin, C.-T.; Chiu, C.-Y.; Singh, A.K.; King, J.-T.; Ko, L.-W.; Lu, Y.-C.; Wang, Y.-K. A wireless multifunctional SSVEP-based brain–computer interface assistive system. IEEE Trans. Cogn. Dev. Syst. 2018, 11, 375–383. [Google Scholar] [CrossRef]
  10. Anindya, S.F.; Rachmat, H.H.; Sutjiredjeki, E. A prototype of SSVEP-based BCI for home appliances control. In Proceedings of the 2016 1st International Conference on Biomedical Engineering (IBIOMED), Yogyakarta, Indonesia, 5–6 October 2016. [Google Scholar]
  11. Han, C.-H.; Kim, Y.-W.; Kim, D.Y.; Kim, S.H.; Nenadic, Z.; Im, C.-H. Electroencephalography-based endogenous brain–computer interface for online communication with a completely locked-in patient. J. Neuroeng. Rehabil. 2019, 16, 18. [Google Scholar] [CrossRef]
  12. Gao, W.; Yu, T.; Yu, J.-G.; Gu, Z.; Li, K.; Huang, Y.; Yu, Z.L.; Li, Y. Learning Invariant Patterns Based on a Convolutional Neural Network and Big Electroencephalography Data for Subject-Independent P300 Brain-Computer Interfaces. IEEE Trans. Neural Syst. Rehabil. Eng. 2021, 29, 1047–1057. [Google Scholar] [CrossRef]
  13. Katyal, A.; Singla, R. A novel hybrid paradigm based on steady state visually evoked potential & P300 to enhance information transfer rate. Biomed. Signal Process. Control 2020, 59, 101884. [Google Scholar]
  14. Herrmann, C.S. Human EEG responses to 1–100 Hz flicker: Resonance phenomena in visual cortex and their potential correlation to cognitive phenomena. Exp. Brain Res. 2001, 137, 346–353. [Google Scholar] [CrossRef]
  15. Iturrate, I.; Antelis, J.M.; Kubler, A.; Minguez, J. A noninvasive brain-actuated wheelchair based on a P300 neurophysiological protocol and automated navigation. IEEE Trans. Robot. 2009, 25, 614–627. [Google Scholar] [CrossRef]
  16. Zhang, R.; Li, Y.; Yan, Y.; Zhang, H.; Wu, S.; Yu, T.; Gu, Z. Control of a wheelchair in an indoor environment based on a brain–computer interface and automated navigation. IEEE Trans. Neural Syst. Rehabil. Eng. 2015, 24, 128–139. [Google Scholar] [CrossRef]
  17. Zhang, Y.; Xu, P.; Cheng, K.; Yao, D. Multivariate synchronization index for frequency recognition of SSVEP-based brain–computer interface. J. Neurosci. Methods 2014, 221, 32–40. [Google Scholar] [CrossRef]
  18. Chen, S.-K.; Chen, C.-S.; Wang, Y.-K.; Lin, C.-T. An SSVEP stimuli design using real-time camera view with object recognition. In Proceedings of the 2020 IEEE Symposium Series on Computational Intelligence (SSCI), Canberra, ACT, Australia, 1–4 December 2020. [Google Scholar]
  19. Singla, R.; Haseena, B. Comparison of ssvep signal classification techniques using svm and ann models for bci applications. Int. J. Inf. Electron. Eng. 2014, 4, 6–10. [Google Scholar] [CrossRef]
  20. Kobayashi, N.; Ishizuka, K. LSTM-based Classification of Multiflicker-SSVEP in Single Channel Dry-EEG for Low-power/High-accuracy Quadcopter-BMI System. In Proceedings of the 2019 IEEE International Conference on Systems, Man and Cybernetics (SMC), Bari, Italy, 6–9 October 2019; pp. 2160–2165. [Google Scholar]
  21. Lin, Z.; Zhang, C.; Wu, W.; Gao, X. Frequency recognition based on canonical correlation analysis for SSVEP-based BCIs. IEEE Trans. Biomed. Eng. 2006, 53, 2610–2614. [Google Scholar] [CrossRef]
  22. Zhang, Y.; Zhou, G.; Jin, J.; Wang, X.; Cichocki, A. Frequency recognition in SSVEP-based BCI using multiset canonical correlation analysis. Int. J. Neural Syst. 2014, 24, 1450013. [Google Scholar] [CrossRef]
  23. Bin, G.; Gao, X.; Wang, Y.; Li, Y.; Hong, B.; Gao, S. A high-speed BCI based on code modulation VEP. J. Neural Eng. 2011, 8, 025015. [Google Scholar] [CrossRef]
  24. Stamps, K.; Hamam, Y. Towards inexpensive BCI control for wheelchair navigation in the enabled environment—A hardware survey. In Proceedings of the International Conference on Brain Informatics, Toronto, ON, Canada, 28–30 August 2010. [Google Scholar]
  25. Norzam, W.; Hawari, H.; Kamarudin, K. Analysis of mobile robot indoor mapping using gmapping based slam with different parameter. In Proceedings of the 5th International Conference on Man Machine Systems, Pulau Pinang, Malaysia, 26–27 August 2019. [Google Scholar]
  26. Yang, X.; Weifeng, L.; Liu, W.; Tao, D. A survey on canonical correlation analysis. IEEE Trans. Knowl. Data Eng. 2019, 33, 2349–2368. [Google Scholar] [CrossRef]
  27. Qin, K.; Wang, R.; Zhang, Y. Filter bank-driven multivariate synchronization index for training-free SSVEP BCI. IEEE Trans. Neural Syst. Rehabil. Eng. 2021, 29, 934–943. [Google Scholar] [CrossRef]
  28. Vaughan, T.M.; Heetderks, W.J.; Trejo, L.J.; Rymer, W.Z.; Weinrich, M.; Moore, M.M.; Kübler, A.; Dobkin, B.H.; Birbaumer, N.; Donchin, E. Brain-computer interface technology: A review of the Second International Meeting. IEEE Trans. Neural Syst. Rehabil. Eng. Publ. IEEE Eng. Med. Biol. Soc. 2003, 11, 94–109. [Google Scholar] [CrossRef] [Green Version]
  29. The Picture Which Is the Configuration of Channels on EEG Cap Comes from the Website. Available online: https://www.bci2000.org/mediawiki/index.php/User_Tutorial:EEG_Measurement_Setup (accessed on 6 August 2021).
Figure 1. The architecture for this study. Subjects select one of the three scenarios for automatic interactive control. EEG signals are acquired by participants using one of the scenarios. When the data are collected, a CCA classifier detects the target. Analyzed EEG signals are then translated to commands to control the electric wheelchair.
Figure 2. The three scenario configurations for this study: (a) Scenario 1 with four directions on the corners and an end choice in the middle of the black background, (b) Scenario 2 with five pieces of information of each room and (c) Scenario 3, where the map is constructed by the electric wheelchair.
Figure 3. The information transferred in bits/trial for different numbers of targets.
Figure 4. The information transfer rate for different average times for one selection.
Figure 5. The electric wheelchair for this study.
Figure 6. The sensors that are installed on the electric wheelchair: (a) SICK TIM551 2D-LiDAR and (b) Intel® RealSense D435i RGB-D camera.
Figure 7. The configuration for the experiment using a screen.
Figure 8. The configuration for the experiment using MR goggles.
Figure 9. The configuration of channels for the EEG cap [29].
Figure 10. The ITR value for each stimulus using a screen.
Figure 11. The ITR value for each stimulus using MR goggles.
Table 1. The confusion matrix for the SSVEP recognition results using the CCA classifier for the direction test in the first experiment (Scenario 1). Columns 2–6 give the predicted class.

True Class | 7 Hz | 8 Hz | 9 Hz | 11 Hz | 13 Hz | ACC (%)
7 Hz | 228 | 4 | 3 | 1 | 4 | 95
8 Hz | 12 | 218 | 3 | 4 | 3 | 90.8
9 Hz | 10 | 5 | 217 | 7 | 1 | 90.4
11 Hz | 3 | 4 | 6 | 226 | 1 | 94.2
13 Hz | 13 | 9 | 9 | 10 | 199 | 83
Precision (%) | 85.7 | 90.8 | 91.2 | 91.1 | 95.7
Table 2. The confusion matrix for the SSVEP recognition results using the CCA classifier for the room information test in the first experiment (Scenario 2). Columns 2–6 give the predicted class.

True Class | 7 Hz | 8 Hz | 9 Hz | 11 Hz | 13 Hz | ACC (%)
7 Hz | 227 | 3 | 7 | 2 | 1 | 94.6
8 Hz | 8 | 221 | 6 | 3 | 3 | 92.1
9 Hz | 3 | 7 | 221 | 6 | 3 | 92.1
11 Hz | 11 | 8 | 6 | 212 | 3 | 88.3
13 Hz | 20 | 12 | 11 | 12 | 185 | 77.1
Precision (%) | 89.8 | 98 | 97 | 98 | 100
Table 3. The confusion matrix for the SSVEP recognition results using the CCA classifier for the map test in the first experiment (Scenario 3). Columns 2–6 give the predicted class.

True Class | 7 Hz | 8 Hz | 9 Hz | 11 Hz | 13 Hz | ACC (%)
7 Hz | 209 | 12 | 6 | 11 | 2 | 87.1
8 Hz | 22 | 198 | 4 | 10 | 6 | 82.5
9 Hz | 13 | 9 | 209 | 6 | 3 | 87.1
11 Hz | 15 | 19 | 6 | 197 | 3 | 82.1
13 Hz | 17 | 16 | 11 | 12 | 184 | 76.7
Precision (%) | 75.7 | 78 | 88.6 | 83.5 | 92.9
Table 4. The confusion matrix for the SSVEP recognition results using the MSI classifier for the direction test in the first experiment (Scenario 1). Columns 2–6 give the predicted class.

True Class | 7 Hz | 8 Hz | 9 Hz | 11 Hz | 13 Hz | ACC (%)
7 Hz | 227 | 6 | 3 | 1 | 3 | 94.5
8 Hz | 22 | 204 | 9 | 4 | 4 | 85
9 Hz | 11 | 9 | 215 | 5 | 1 | 89.6
11 Hz | 12 | 7 | 7 | 213 | 2 | 88.8
13 Hz | 27 | 17 | 17 | 8 | 175 | 72.9
Precision (%) | 75.9 | 84 | 85.7 | 92.2 | 94.6
Table 5. The confusion matrix for the SSVEP recognition results using the MSI classifier for the room information test in the first experiment (Scenario 2). Columns 2–6 give the predicted class.

True Class | 7 Hz | 8 Hz | 9 Hz | 11 Hz | 13 Hz | ACC (%)
7 Hz | 226 | 2 | 5 | 5 | 2 | 94.2
8 Hz | 8 | 219 | 7 | 3 | 3 | 91.3
9 Hz | 7 | 9 | 218 | 4 | 2 | 90.8
11 Hz | 22 | 20 | 6 | 191 | 1 | 79.6
13 Hz | 31 | 17 | 14 | 10 | 168 | 70
Precision (%) | 76.9 | 84.9 | 87.2 | 89.7 | 95.5
Table 6. The confusion matrix for the SSVEP recognition results using the MSI classifier for the map test in the first experiment (Scenario 3). Columns 2–6 give the predicted class.

True Class | 7 Hz | 8 Hz | 9 Hz | 11 Hz | 13 Hz | ACC (%)
7 Hz | 221 | 6 | 5 | 5 | 3 | 92.1
8 Hz | 21 | 203 | 6 | 6 | 4 | 84.6
9 Hz | 23 | 11 | 197 | 7 | 2 | 82.1
11 Hz | 28 | 17 | 5 | 186 | 4 | 77.5
13 Hz | 30 | 20 | 13 | 13 | 164 | 68.3
Precision (%) | 68.4 | 79 | 87.2 | 85.7 | 92.7
Table 7. The confusion matrix for the SSVEP recognition results using the CCA classifier for the direction test in the second experiment (Scenario 1). Columns 2–6 give the predicted class.

True Class | 7 Hz | 8 Hz | 9 Hz | 11 Hz | 13 Hz | ACC (%)
7 Hz | 230 | 1 | 2 | 4 | 3 | 95.8
8 Hz | 4 | 235 | 1 | 0 | 0 | 97.9
9 Hz | 0 | 0 | 240 | 0 | 0 | 100
11 Hz | 1 | 1 | 1 | 237 | 0 | 98.8
13 Hz | 1 | 0 | 3 | 2 | 234 | 97.5
Precision (%) | 97.5 | 99.2 | 97.2 | 91.13 | 95.67
Table 8. The confusion matrix for the SSVEP recognition results using the CCA classifier for the room information test in the second experiment (Scenario 2). Columns 2–6 give the predicted class.

True Class | 7 Hz | 8 Hz | 9 Hz | 11 Hz | 13 Hz | ACC (%)
7 Hz | 227 | 2 | 2 | 5 | 4 | 94.6
8 Hz | 3 | 231 | 2 | 3 | 1 | 96.3
9 Hz | 3 | 0 | 237 | 0 | 0 | 98.8
11 Hz | 1 | 1 | 1 | 236 | 1 | 98.3
13 Hz | 2 | 2 | 2 | 4 | 230 | 95.8
Precision (%) | 96.2 | 97.9 | 97.1 | 95.2 | 97.5
Table 9. The confusion matrix for the SSVEP recognition results using the CCA classifier for the map test in the second experiment (Scenario 3). Columns 2–6 give the predicted class.

True Class | 7 Hz | 8 Hz | 9 Hz | 11 Hz | 13 Hz | ACC (%)
7 Hz | 239 | 0 | 0 | 1 | 0 | 99.6
8 Hz | 2 | 237 | 0 | 1 | 0 | 98.8
9 Hz | 0 | 0 | 240 | 0 | 0 | 100
11 Hz | 1 | 2 | 1 | 236 | 0 | 98.3
13 Hz | 1 | 0 | 4 | 1 | 234 | 97.5
Precision (%) | 98.4 | 99.2 | 98 | 98.7 | 100
Table 10. The confusion matrix for the SSVEP recognition results using the MSI classifier for the direction test in the second experiment (Scenario 1). Columns 2–6 give the predicted class.

True Class | 7 Hz | 8 Hz | 9 Hz | 11 Hz | 13 Hz | ACC (%)
7 Hz | 234 | 2 | 1 | 2 | 1 | 97.5
8 Hz | 3 | 237 | 0 | 0 | 0 | 98.8
9 Hz | 0 | 0 | 240 | 0 | 0 | 100
11 Hz | 2 | 2 | 3 | 233 | 0 | 97.1
13 Hz | 4 | 3 | 2 | 2 | 229 | 95.4
Precision (%) | 96.3 | 97.1 | 97.6 | 98.3 | 99.6
Table 11. The confusion matrix for the SSVEP recognition results using the MSI classifier for the room information test in the second experiment (Scenario 2). Columns 2–6 give the predicted class.

True Class | 7 Hz | 8 Hz | 9 Hz | 11 Hz | 13 Hz | ACC (%)
7 Hz | 231 | 3 | 2 | 2 | 2 | 96.3
8 Hz | 7 | 225 | 4 | 2 | 2 | 93.8
9 Hz | 3 | 0 | 237 | 0 | 0 | 98.8
11 Hz | 6 | 2 | 3 | 228 | 1 | 95
13 Hz | 4 | 7 | 5 | 3 | 221 | 92.1
Precision (%) | 92 | 94.9 | 94.4 | 97 | 97.8
Table 12. The confusion matrix for the SSVEP recognition results using the MSI classifier for the map test in the second experiment (Scenario 3). Columns 2–6 give the predicted class.

True Class | 7 Hz | 8 Hz | 9 Hz | 11 Hz | 13 Hz | ACC (%)
7 Hz | 238 | 0 | 0 | 1 | 1 | 99.2
8 Hz | 1 | 237 | 0 | 2 | 0 | 98.8
9 Hz | 2 | 1 | 237 | 0 | 0 | 98.8
11 Hz | 5 | 2 | 1 | 232 | 0 | 96.7
13 Hz | 9 | 8 | 8 | 3 | 212 | 88.3
Precision (%) | 93.3 | 95.6 | 96.3 | 97.5 | 99.5
Table 13. The thresholds designed for the BCI system.

Threshold | TD | TR | TM
Confidence score | 0.215 | 0.22 | 0.22
Table 14. The average accuracy and ITR for the direction test (Scenario 1), the room information test (Scenario 2) and the map test (Scenario 3) using a screen with the CCA classifier.

Scenario | Average ACC (%) | ITR (bits/min)
1 | 90.7 | 33.79
2 | 88.8 | 31.84
3 | 83.1 | 26.57
Table 15. The average accuracy and ITR for the three scenarios using MR goggles with the CCA classifier.

Scenario | Average ACC (%) | ITR (bits/min)
1 | 98 | 42.81
2 | 96.8 | 41.07
3 | 98.8 | 44.08
Table 16. Comparison of the average classification accuracy across all experiments (classifier/display).

Scenario | CCA/Screen | CCA/MR Goggles | MSI/Screen | MSI/MR Goggles
1 | 90.7% | 98% | 86.2% | 97.8%
2 | 88.8% | 96.8% | 86.3% | 95.2%
3 | 83.1% | 98.8% | 80.9% | 95%
