Article

Electrical Impedance Tomography for Hand Gesture Recognition for HMI Interaction Applications

by Noelia Vaquero-Gallardo and Herminio Martínez-García *
Department of Electronics Engineering, Eastern Barcelona School of Engineering (EEBE), Technical University of Catalonia–BarcelonaTech (UPC), 08019 Barcelona, Spain
* Author to whom correspondence should be addressed.
J. Low Power Electron. Appl. 2022, 12(3), 41; https://doi.org/10.3390/jlpea12030041
Submission received: 29 May 2022 / Revised: 28 June 2022 / Accepted: 9 July 2022 / Published: 18 July 2022

Abstract

Electrical impedance tomography (EIT) is based on the physical principle of bioimpedance, defined as the opposition that biological tissues exhibit to the flow of an alternating electrical current. Consequently, here, we propose studying the characterization and classification of bioimpedance patterns based on EIT by measuring, non-invasively with eight electrodes on the forearm, the potential drops resulting from the execution of six hand gestures. The starting point was the acquisition of bioimpedance patterns studied by means of principal component analysis (PCA), validated through the cross-validation technique, and classified using the k-nearest neighbor (kNN) classification algorithm. As a result, it is concluded that reduction and classification are feasible, with a sensitivity of 0.89 in the worst case for each of the reduced bioimpedance patterns, leading to a direct advantage: a reduction in the number of electrodes and in the electronics required. In this work, bioimpedance patterns were investigated for monitoring subjects’ mobility; generally, existing solutions are based on a sensor system with moving parts that suffers from significant problems of wear, lack of adaptability to the patient, and lack of resolution, whereas the proposal implemented in this prototype, based on electrical impedance tomography, does not have these problems.

1. Introduction

Electrical impedance tomography (EIT) is an imaging technique utilized to study changes in the internal conductivity of the human body [1,2,3,4,5,6,7,8,9]. As a consequence, EIT can be used to infer the physiological state of the body from these conductivity changes [3]. The technique involves injecting an excitation current or voltage in sequence from a group of electrodes around an object and measuring the electrical impedance responses between surface electrodes [4,10]. These responses are governed by the physical principle of bioimpedance, defined as the opposition presented by biological tissues to the passage of an alternating electric current through them [11,12]. EIT has the advantages of being a non-invasive, non-hazardous, structurally simple, and low-cost technique [3]. For this reason, we propose analyzing the characterization and classification of bioimpedance patterns using measurements based on this technique.
After reviewing the existing literature [1,2,3,4,5,6], the AD5933 evaluation board was chosen for the experimental platform. Potential drops in bioimpedance were measured from eight electrodes on the forearm for a set of six different hand gestures; 28 independent bioimpedance measurements were obtained for each gesture, each performed with five repetitions. The results were analyzed to establish a reduced bioimpedance pattern that characterized each of the gestures performed. The purpose was to reach a robust, reduced bioimpedance pattern that loses no information with respect to the original data pattern while using a smaller number of electrodes and, hence, simpler hardware. Considering these results and reviewing the literature on resistive sensors, it is feasible to control a motor through hand gesture recognition [7,8,9]. The proposed system offers an alternative for people with reduced mobility, for whom current solutions are mechanical and suffer from a lack of adaptability to the patient and a lack of resolution. Therefore, the contribution of this work is a feasible human–machine interface (HMI) interaction that provides the ability to control machines and computers explicitly by using bioelectrical signals produced by biochemical reactions occurring in the living cells of the human body [13].
Based on the results of this study, we propose applying bioimpedance patterns obtained from a set of gestures and associating them with the control of a motor. The link between humans and machines can significantly enhance the quality of life, with applications ranging from better control of robotics, for example, in surgery operations, restoring a degree of normality to amputees, or safely handling hazardous materials [1].

2. State of the Art

The cell, the basic structural and morphological unit of the living tissues, exhibits resistive and capacitive properties. In fact, on the skin, the electrical response is mainly determined by the stratum corneum at frequencies below 10 kHz [11]. Consequently, living tissue can be described as a complex microscopic network of electrical circuits made up of resistors and capacitors. Since the equivalent model that best parameterizes this behaviour is unknown, it can be approached mathematically from Maxwell’s equations, which fully define electromagnetic phenomena [11,14]. However, that solution is complex and, therefore, the literature adopts different simplified equivalent-circuit descriptions of the electrical configuration of tissue. The most relevant and simple models in the literature are the Fricke model, the Debye model, and the Cole model.
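For orientation (this equation is a standard textbook form, not reproduced from the article), the single-dispersion Cole impedance model is commonly written as:

```latex
% Cole impedance model (single dispersion).
% R_0 and R_inf are the low- and high-frequency resistances, tau the characteristic
% time constant, and alpha (0 < alpha <= 1) the dispersion parameter; alpha = 1
% recovers a single-time-constant (Debye/Fricke-type) response.
Z(\omega) = R_{\infty} + \frac{R_{0} - R_{\infty}}{1 + \left( j \omega \tau \right)^{\alpha}}
```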
Bioimpedance values therefore vary with the frequency of the injected current: as reported in [15], both the permittivity (ε) and the conductivity (σ) of tissue are frequency dependent, and ionic concentration gradients are produced that develop electromotive voltages influencing the current flow [15,16,17,18].

3. Materials and Method: Experimental Platform

The experimental platform consisted of five distinguishable parts: (1) the measurement system, (2) hardware and software requirements, (3) configuration parameters for the acquisition of patterns, (4) pattern acquisition and processing, and (5) pattern classification.

3.1. Measuring System

The type of measurement, the number of electrodes, the number of resulting measurements, and the type of electrodes were determined as described below.

3.1.1. Type of Measurement

The common configurations for obtaining bioimpedance measurements are the two-wire and four-wire configurations [19,20]. The two-wire configuration requires two electrodes, where the emitter injects the excitation and the receiver closes the circuit [21]. A disadvantage compared with the four-wire configuration is that it introduces some errors, because the potential difference sensed between the two electrodes includes the nonlinear voltages generated by the current flowing through the polarization impedance at the electrode–tissue interface [22]. Therefore, applications that rely on more precise measurements implement a four-wire configuration, which uses two additional electrodes for voltage sensing.
In this work, the aim is a new drive pattern that generates less impedance data while still classifying with sufficient accuracy. Since a two-wire configuration requires fewer electrodes to complete the impedance measurements, it reduces the complexity of the hardware. The effect of contact impedance can be removed by data preprocessing [2,23]: the output signal corresponds to the series combination of the two electrodes and the load, so the load must be extracted from the measurements.

3.1.2. Number of Electrodes and Measurements

The electrodes are used as transducers of ionic current, which exists in the intra- and extracellular spaces, to electron current, and they are non-polarizable Ag/AgCl electrodes because of their low potential and low noise during measurements [24].
The aim is to generate less impedance data for classification with sufficient accuracy [2] and the number of electrodes is considered to provide adequate sensitivity and spatial resolution across the imaged space [25]. Using more electrodes enables the user to identify smaller entities having specific conductivities in the examined space, but the overall advantage is counterbalanced by the increased complexity of the measurement chain and higher computational intensity of the reconstruction algorithm [26]. As the number of electrodes is increased, both the measurement time and the calculating time abruptly increase [27]. Therefore, in this case, only eight electrodes are used in order, firstly, to minimize possible hardware complications and, secondly, to obtain a small dataset in the measurement system.
With N = 8 electrodes, N(N − 1)/2 = 28 independent measurements can be made. The final measurement system is shown in Figure 1, where the eight electrodes are labeled A, B, C, D, E, F, G, and H. In the first loop of measurements, electrode A is the emitter and electrodes B to H are the receivers; hence, seven independent measurements are made (in Figure 1b, these measurement pairs are represented by red arrows). In the second loop, electrode B is the emitter and electrodes C to H are the receivers; hence, six independent measurements are made (in Figure 1b, these measurement pairs are represented by blue arrows). Repeating this sequence, 28 independent measurements are obtained.
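As an illustration of this counting, the short Python sketch below (not part of the measurement software used in this work) enumerates the 28 emitter–receiver pairs in the order just described:

```python
from itertools import combinations

# Electrode labels as in Figure 1.
electrodes = ["A", "B", "C", "D", "E", "F", "G", "H"]

# Each unordered emitter-receiver pair yields one independent two-wire measurement,
# i.e., N(N - 1)/2 = 28 pairs for N = 8.
pairs = list(combinations(electrodes, 2))
assert len(pairs) == 28

for index, (emitter, receiver) in enumerate(pairs, start=1):
    print(f"Measurement {index:2d}: emitter {emitter}, receiver {receiver}")
```

With this ordering, Measurement 1 is pair AB, Measurement 4 is pair AE, and Measurement 28 is pair GH, which matches the numbering used later in Table 6.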

3.2. Required Hardware and Software

The AD5933 evaluation board was used to acquire the measurements. The system is made up of an integrated frequency generator with a 12-bit analog-to-digital converter (ADC) sampling at 1 MSPS (mega-samples per second), giving rise to the following measurement process: transmission stage, reception stage, and calculation of the bioimpedance by a discrete Fourier transform (DFT). The DFT algorithm returns a real (R) and an imaginary (I) data word at each output frequency. Once calibrated, the magnitude of the impedance and the relative phase of the impedance at each frequency point throughout the sweep are easily calculated.
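As a minimal sketch of that conversion (the register values below are hypothetical and the vendor software performs this step internally), the usual AD5933 gain-factor calculation can be expressed as:

```python
import math

def gain_factor(r_cal: int, i_cal: int, z_cal_ohms: float) -> float:
    """Gain factor from one calibration point measured with a known resistor."""
    magnitude = math.hypot(r_cal, i_cal)              # |DFT| = sqrt(R^2 + I^2)
    return (1.0 / z_cal_ohms) / magnitude

def impedance(r: int, i: int, gf: float, system_phase_deg: float = 0.0):
    """Impedance magnitude (ohms) and relative phase (degrees) from one R/I word pair."""
    magnitude = math.hypot(r, i)
    z_ohms = 1.0 / (gf * magnitude)
    phase_deg = math.degrees(math.atan2(i, r)) - system_phase_deg
    return z_ohms, phase_deg

# Hypothetical register readings: calibration with the 2 kOhm resistor, then one sweep point.
gf = gain_factor(r_cal=4520, i_cal=1285, z_cal_ohms=2000.0)
print(impedance(r=16500, i=5320, gf=gf))
```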
Software offered by the manufacturer Analog Devices was used for the data acquisition. To that end, it was necessary to adjust parameters to adapt the system to the needs of the application. Next, we describe each of the required steps. The first step was the frequency sweep adjustment, which involved three parameters: the start frequency, the delta frequency, and the number of increments. The second step was calibration, which was as simple as placing a known resistance value between the input and output of the AD5933; this value was chosen to be in the middle of the unknown impedance range. The parameters selected were the system clock, the output drive value, and the PGA gain. The third step involved adjusting the gain factor by pressing the program device registers button, which writes the settings to the device registers through I2C [28].

3.3. Configuration Parameters for Pattern Acquisition

Next, the parameters used in the configuration are described, and summarized in Table 1 and Table 2.
As indicated in Table 1, a frequency sweep from 50 kHz to 100 kHz was established in order to work within the β dispersion region in terms of relaxation; in this range the predominant character of the tissue is resistive, that is, it exhibits electrical properties similar to those at direct current, and the current passes through the extracellular space.
As indicated in Table 2, a 2 kΩ feedback resistor (RFB) was selected because it allowed working in an adequate dynamic range for biopotential measurements. This value was adjusted on a trial-and-error basis.

3.4. Pattern Acquisition and Processing

First, measurements were acquired for a set of different gestures made with the right hand. It was not known a priori what pattern of measurements would be obtained; therefore, gestures implying distinguishable muscle movements and contractions were selected, so that the gestures were sufficiently different from each other and facilitated the next stage of data processing. The gestures performed are shown in Figure 2.
Figure 2 presents each of the gestures that were performed. For each gesture, five repetitions were performed consecutively on the same arm, in order to keep them as similar as possible to each other and to avoid possible errors introduced by the human factor. Consequently, 28 independent measurements were obtained for each repetition of each gesture, and each gesture was repeated five times. There were 140 measurements registered for each gesture and 840 measurements in total.
Once the measurements were obtained for each gesture, the mean value over the established frequency sweep was calculated for each independent measurement. The average values were used as features to obtain a pattern and, later, for the classification.
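A sketch of this averaging step is shown below; the array layout and the values are placeholders for illustration, not the recorded data:

```python
import numpy as np

# Assumed layout: sweeps[g, r, m, f] = |Z| for gesture g (6), repetition r (5),
# independent measurement m (28), and frequency point f (50-100 kHz sweep).
rng = np.random.default_rng(0)
sweeps = 500.0 + 50.0 * rng.standard_normal((6, 5, 28, 101))   # placeholder values

# Average over the frequency sweep: one feature per independent measurement.
features = sweeps.mean(axis=-1)          # shape (6, 5, 28)

# Flatten gestures x repetitions into rows for the later classification stage.
X = features.reshape(-1, 28)             # 30 samples x 28 features
print(X.shape)
```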

3.5. Pattern Classifier

According to the selected measurement method, as previously mentioned, a set of 28 measurements was obtained that defined the characteristics of each gesture. Because this was a large volume of data, it was decided to reduce, as far as possible, the dimensionality of the data by selecting only a subset of the characteristics of the measurements. Therefore, it was necessary to apply different analysis and classification methods to find a subset of features that could represent the original dataset. The steps carried out are indicated in the scheme shown in Figure 3.
First, statistical and classification techniques were used to infer about the samples obtained in order to draw a consequence and a conclusion. Regardless of the method used, the sequence that was applied for the treatment of the data obtained was based on: (1) study of the data, (2) extraction and selection of characteristics, (3) predictive model, and (4) optimization.
Consequently, it was essential to check if it was possible to reduce the dimensionality of the dataset without significant loss of features of the original set, and thus, obtain a new reduced data matrix. To do this, the following analyses were carried out to determine if a reduction in data dimensionality was feasible: (1) descriptive statistical parameters, (2) regression and correlation analysis and (3) unsupervised learning techniques.
Once a reduced data matrix was obtained, it was necessary to verify that the new set was sufficiently comparable to the initial one through validation techniques.
First, the new reduced dataset must be validated for accuracy during the development of the optimal classification model using machine learning-based classification methods. The k-nearest neighbor (kNN) method was selected because it suited the needs of the work. The method classifies an unknown validation sample according to the labels of its k closest neighbors in the dataset.
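For reference, a minimal sketch of this classification step with scikit-learn's KNeighborsClassifier is given below; the feature values are toy numbers, not the measured data:

```python
from sklearn.neighbors import KNeighborsClassifier

# Toy training set: one row per gesture repetition, one column per retained measurement.
X_train = [[420.1, 415.3, 418.7],
           [364.5, 363.2, 365.0],
           [425.8, 422.1, 419.9]]
y_train = ["fist", "relaxed", "left"]

# Euclidean distance and k nearest neighbours, as described in the text.
knn = KNeighborsClassifier(n_neighbors=1, metric="euclidean")
knn.fit(X_train, y_train)
print(knn.predict([[419.0, 416.0, 418.0]]))    # -> ['fist'] for this toy data
```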
The last stage was based on the assumption that all classification methods intrinsically include errors that can lead to false positives; therefore, optimization of the model must be calculated.

4. Results

The first results are derived from the extraction of features that allow the dimensionality reduction of the data.
For each of the steps proposed in Figure 3, the method was adjusted based on the nature of the data, the stated requirements, and the hypotheses to be tested.

4.1. Experimental Results of Data Reading

Figure 4 shows the bioimpedance data measured for each gesture.

Analysis of the Reading of Experimental Results

It can be observed in Figure 4 that each gesture presents a characteristic pattern, although within a range of similar impedance values (400 Ω–650 Ω). In addition, visually it seems that the gestures are grouped into two distinguishable groups based on the similarity of their nature. From higher to lower impedance, the first group consists of the fist, left, forefinger-thumb, and claw gestures, while the second group consists of the relaxed gesture and thumb. Note that the gestures of the first group are more abrupt, that is, they require more force than those of the second group, which are more relaxed because they do not require as much effort. In addition, it can be observed that the fist gesture is totally opposite to the relaxed one in terms of impedance.
Another observation is that there is one characteristic envelope between Measurements 1 and 14 and a different one from Measurement 15 to 28 in all gestures. This observation is used to check whether any simplification is feasible for classifying the gestures that are made.
From now on, we will work with three possible simplification datasets, as follows:
  • The pattern from 28 measurements to 14 measurements will be called “Reduction A”, abbreviated as RedA;
  • The pattern from the first 14 measurements to 7 measurements will be called “Reduction B”, abbreviated as RedB;
  • The pattern from the last 14 measurements to 5 measurements will be called “Reduction C”, abbreviated as RedC.
Therefore, after this analysis, the possibility that the readings are linearly dependent on each other is assessed and a PCA is applied to reduce the dimensionality of the dataset. It should be noted that to test this hypothesis and extract the characteristics, the results obtained from the descriptive parameters and the correlation matrix will be used. In [29], a multiple measurement vector model was established to capture the spatiotemporal correlations and to describe the underlying multidimensional reconstruction problem.

4.2. Experimental Results of Feature Extraction and Selection

4.2.1. Correlation Matrix Results

In this section, we discuss the results obtained from the statistical parameters, the correlation matrix, and the dimensionality reduction by means of principal component analysis (PCA).
Because the gestures present a similar range of bioimpedances, as observed previously, the mean values and the deviations are similar for gestures that resemble each other. Soft gestures such as the relaxed and thumb gestures exhibit the lowest bioimpedance values, while the gestures of the first group show average values similar to each other. This observation can be seen both in the set of 28 measurements and in the other two cases.
These observed differences imply that each gesture is differentiable but that the measurements remain sufficiently dependent on one another, which is why the mean values and the deviations are similar.
Experimental results of the correlation matrix:
Prior to the application of the component analysis for the dimensionality reduction, it was verified which parameters were correlated with each other, and therefore, according to the statistical parameters obtained, the following hypotheses were proposed:
  • H0 states that the set of measurements obtained for each gesture is linearly independent of the others;
  • H1 states that the set of measurements obtained for each gesture is linearly dependent on the others.
Data are considered correlated in those cases with Pearson coefficients greater than 0.65. Figure 5 shows the graphic matrix of correlations obtained, indicating both the Pearson coefficient and the linear fit.
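A sketch of this check with NumPy follows; the file name and the data layout are assumptions made for illustration:

```python
import numpy as np

# Assumed layout of gesture_means.csv: rows = the 28 independent measurements,
# columns = the six gestures (mean |Z| per gesture over the repetitions).
data = np.loadtxt("gesture_means.csv", delimiter=",")

corr = np.corrcoef(data, rowvar=False)        # 6 x 6 Pearson coefficient matrix
strongly_correlated = np.abs(corr) > 0.65     # threshold used in the text
print(np.round(corr, 2))
print("All gesture pairs above 0.65:", bool(strongly_correlated.all()))
```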
After the results are obtained, the application of component analysis for dimensionality reduction is feasible. A priori, six main components are extracted, since six gestures are going to be worked on.

4.2.2. Experimental Results of Dimensionality Reduction by PCA

Figure 6 shows the scree plot, which relates the eigenvalues to the principal components.
It can be seen in Figure 6 that the change in the trend of the graph occurs at Principal Component 2. Next, in Table 3, Table 4 and Table 5, the variance and its accumulated value [30] are calculated for each component.
Figure 7 and Table 3, Table 4 and Table 5 show that working with two principal components is feasible because they describe more than 90% of the data in the three reductions. With these results, it is decided to work the dataset with only two main components.
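As an illustration of this criterion, the cumulative explained variance can be inspected as in the sketch below; the data matrix is a random placeholder with an assumed orientation (observations in rows, retained measurements in columns):

```python
import numpy as np
from sklearn.decomposition import PCA

# Placeholder matrix standing in for one reduction set.
rng = np.random.default_rng(1)
X = rng.standard_normal((30, 14))              # e.g. Reduction A with 14 measurements

pca = PCA()                                    # keep all components to inspect the curve
pca.fit(X)
cumulative = np.cumsum(pca.explained_variance_ratio_)
n_keep = int(np.searchsorted(cumulative, 0.90) + 1)   # smallest count reaching 90%
print(np.round(cumulative[:6], 3), "->", n_keep, "components retained")
```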
The graphical representation corresponds to the biplot graph (Figure 7), where the lines are shown in blue and the points corresponding to the new coordinates that the original variables take in the two dimensions determined by PC1 and PC2 are shown in red.
In Figure 7a, the gestures are located in the first and fourth quadrants, with the fist and left gestures in the first quadrant and the rest of the gestures in the fourth quadrant. The measurements located in those two quadrants are: 4-5-6-10-11-12-13-21-23-24-25-26-27-28.
In Figure 7b, the gestures are located in the first and fourth quadrants, with the relaxed gesture and the rest of the gestures, respectively. The measurements that are in those two quadrants are: 4-5-6-10-11-12-13.
Lastly, in Figure 7c, the gestures are located in the first and fourth quadrants again, with the gesture fist-left and the rest of the gestures, respectively. The measurements that are in those two quadrants are: 24-25-26-27-28.
In all three cases, the measurements in the first and fourth quadrants have been selected because they are the variables that present the smallest distance with respect to any vector defined by any of the gestures. In Table 6, the measurements obtained in each reduction set are presented.
Therefore, the set of 28 measurements can be reduced to 14 measurements, that is, reduced by half. In the case of measurements 1 to 14, it is possible to work with half, that is, seven measurements and, finally, measurements 15 to 28 can be reduced to five measurements. These reductions are shown in Figure 8.
From the results shown in Table 6, in Reduction A and in Reduction B, it is possible to simplify their dimensionality by up to half, while in the case of Reduction C it is possible to reduce up to five measurements. Furthermore, in the case of Reduction A and Reduction C, the vectors that define the gestures are located in the same quadrant, fist and left gesture in the first quadrant, and the rest of the gestures in the fourth quadrant.
In terms of electrodes, Reduction A requires seven electrodes, Reduction B requires six electrodes, and Reduction C requires four electrodes. This results in a measurement-time reduction of 50% for both Reductions A and B and of 64% for Reduction C.
With these data, a possible reduced bioimpedance pattern is obtained, and the next step is to create a validation data matrix in order to see if, with this reduction, it is possible to make a correct classification.

4.3. Experimental Results of the Predictive Model

In this section, the results obtained in the cross-validation and in the kNN classification algorithm [31] are shown, evaluating their potential, in each case, with the error and sensitivity parameters, respectively.

Cross-Validation

For cross-validation, the dimensionality reduction indicated in Table 7 is used in order to observe which of the three reduced patterns is comparable to the original. Since it is not known which value of K to select, the error for various values is calculated.
It can be seen in Table 7 that all the K iterations present similar errors among themselves. However, it is clear that the dimensionality Reduction B shows very low errors and, therefore, it is a pattern that will not be used in the following steps because it will not allow the classification to be robust. As observed in Figure 9b, the gestures were not classified in the same way as compared with the other cases, and the cross-validation results corroborate that the selection of this set is not a good option. In contrast, in the case of dimensionality Reductions A and C, similar cross-validation errors are obtained, as observed in the classification shown in Figure 9a,c.
Therefore, the EIT system can be simplified by using fewer measurements, also reducing the measurement and computation time. An iteration value of K = 5 is selected because Reductions A and C present the error values closest to unity.
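A sketch of this validation step is shown below; the data are placeholders with the same class structure (six gestures, five repetitions each), and the metric is plain cross-validated accuracy, which is not necessarily the exact error measure reported in Table 7:

```python
import numpy as np
from sklearn.model_selection import KFold, cross_val_score
from sklearn.neighbors import KNeighborsClassifier

# Placeholder dataset: rows = gesture repetitions, columns = reduced measurements.
rng = np.random.default_rng(2)
X = rng.standard_normal((30, 5))               # e.g. Reduction C (five measurements)
y = np.repeat(["index-thumb", "left", "fist", "claw", "relaxed", "thumb"], 5)

knn = KNeighborsClassifier(n_neighbors=3)
for k_folds in range(2, 7):                    # K = 2 ... 6, as in Table 7
    cv = KFold(n_splits=k_folds, shuffle=True, random_state=0)
    scores = cross_val_score(knn, X, y, cv=cv)
    print(f"K = {k_folds}: mean cross-validated score = {scores.mean():.3f}")
```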

4.4. Results of the kNN Classification Algorithm and Sensitivity

To apply this algorithm, it is necessary to take into account what type of distance is going to be used (Euclidean or Hamming), the labels, and the number of neighbors, this parameter being determined by the variable k. In the first instance, the viable value of k is unknown and, therefore, the kNN classification algorithm is iterated several times for different values of k.
Given the results obtained in Table 7, we work with Reduction A pattern and with Reduction C pattern. The classification potential of this algorithm is evaluated by means of the sensitivity values obtained by different values of k for both sets, Reductions A and C, and for each gesture.
After calculating the sensitivity values for different values of k, it is observed that, for all gestures except the index-thumb gesture, the highest sensitivity is obtained for k = 6, above 0.89 for every gesture in both Reductions A and C.
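A sketch of how per-gesture sensitivity (recall) can be computed for several values of k is given below; the data are placeholders with the same assumed layout as the previous sketch:

```python
import numpy as np
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import cross_val_predict
from sklearn.neighbors import KNeighborsClassifier

# Placeholder validation data: six gestures, five repetitions each.
rng = np.random.default_rng(3)
X = rng.standard_normal((30, 5))
y = np.repeat(["index-thumb", "left", "fist", "claw", "relaxed", "thumb"], 5)
labels = np.unique(y)

for k in (2, 4, 6):                                # candidate neighbour counts
    knn = KNeighborsClassifier(n_neighbors=k)
    y_pred = cross_val_predict(knn, X, y, cv=5)
    cm = confusion_matrix(y, y_pred, labels=labels)
    sensitivity = cm.diagonal() / cm.sum(axis=1)   # per-gesture recall (sensitivity)
    print(f"k = {k}:", dict(zip(labels, np.round(sensitivity, 2))))
```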
Finally, after analyzing the results, dimensionality Reduction C is selected as the optimal classifier pattern. Although the Reduction A pattern could also have been chosen, Reduction C presents two fewer measurements and would therefore reduce the classification time by 64% instead of the 50% of Reduction A, and it would require only four electrodes instead of the seven needed for Reduction A. Table 8 shows the statistical parameters considered, in terms of mean and standard deviation, obtained for each reduction and for each gesture. It is observed that the standard deviation is lower for some gestures in Reduction C, while the mean values are similar across all the reductions.
Figure 9 shows the envelopes for all the measurements and Reductions A and C, respectively.
As can be seen in Figure 9, some gestures are very similar to each other and show similar impedance values, allowing the set of gestures performed to be divided into three groups, from greater to lesser impedance, as follows:
  • Group 1, left gesture and fist;
  • Group 2, index-thumb and claw gesture;
  • Group 3, thumb gesture and relaxed.
Comparing the results in Figure 9a with those of the reductions in Figure 9b,c, the gestures are more distant from each other, making it possible to distinguish three groups of defined gestures instead of two. Therefore, in order to select gestures that are largely differentiable for the continuation of this work in the development of a motorized chair prototype, a gesture should be selected from each of the three identified groups, thereby reducing possible classification errors.

5. Discussion

To improve this prototype, it is proposed to make modifications to the hardware used, which include: (1) an external reference oscillator, (2) a calibration system, and (3) an additional multiplexing stage [32]. Given the results obtained, a prototype is proposed that would be perfectly feasible for human–machine interface (HMI) applications. Therefore, as a continuation of this work, and given the positive results reflected in the feasibility analysis, we are currently working on the development of a motorized wheelchair that is directed by the subject through the bioimpedance patterns studied in this work, associated with specific movements of the motor. In this way, an alternative could be offered that would improve the mobility of the subject at a reduced cost as compared with the options currently on the market. These solutions, generally based on a sensor system with moving parts, suffer from significant problems of wear, lack of adaptability to the patient, and lack of resolution, problems that the proposal implemented in this work, based on electrical impedance tomography, does not have.

6. Conclusions

The measurements obtained were carried out on the epidermis of the forearm, specifically, on the stratum corneum. These measurements are determined by the electrical properties of the intra- and extracellular spaces and the cell membrane of epithelial cells, which can be considered an electrolyte and a capacitor, respectively. Among the models presented in the literature, the Fricke model was selected as the electrical equivalent because it is simple and sufficient for the objective of this work. Therefore, the electrical equivalent can be considered to be a parallel RC circuit.
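For reference (a standard result, not an equation taken from the article), the impedance of such a parallel RC equivalent is:

```latex
% Impedance of a parallel RC circuit (R in ohms, C in farads, omega = 2*pi*f).
Z(\omega) = \frac{R}{1 + j \omega R C},
\qquad
|Z(\omega)| = \frac{R}{\sqrt{1 + (\omega R C)^{2}}}
```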
According to the requirements for basing the measurements on the EIT technique, the following conditions were taken into consideration. A sinusoidal current is injected so that it is distributed uniformly and the accumulation of charge at the cell membrane can be treated as a capacitor. The measurements were carried out in the frequency range from 50 kHz to 100 kHz (β dispersion in terms of relaxation), so that the predominant character of the tissue is resistive, that is, its electrical properties are similar to those at direct current and the current passes through the extracellular space. The measurements were made with a two-wire configuration in order to simplify the required electronics, and because the electrodes were close to each other there were no problems of decreased current density. In addition, eight electrodes were used to limit the number of measurements, the data acquisition time, and the corresponding electronics. Given these premises, 28 measurements were obtained for each of the six different gestures: index-thumb, left, claw, fist, thumb, and relaxed.
Once the results of the measurements were obtained, it was visually verified whether there were differences between the envelopes of the 28 bioimpedance measurements of each gesture and, although they fell within a similar impedance range (400 Ω–650 Ω), it was possible to identify two groups of gestures: the first group included the index-thumb, left, fist, and claw gestures, and the second group included the thumb and relaxed gestures. In addition, there was one characteristic envelope between measurements 1 and 14 and a different one from measurements 15 to 28 in all gestures. This observation was taken into account to analyze three possible patterns.
In order to extract and select characteristics, an analysis at the level of descriptive parameters was first carried out, and differences between the gestures were observed, meaning that each gesture was differentiable while the measurements remained sufficiently dependent; for this reason, the mean values and the deviations are similar.
Second, the correlation matrix shows that, in each case and in all the combinations between gestures, the Pearson coefficients are above 0.65 and the significance values are below 0.05. These results allow us to reject hypothesis H0 and accept the alternative hypothesis H1. Therefore, the condition is fulfilled so that it is possible to apply principal component analysis to reduce the dimensionality of the dataset. After its application, it was observed that it was possible to work with two principal components because they describe more than 90% of the data in the worst case. Bearing in mind that the measurements with the least distance from the lines of each gesture are those located in the first and fourth quadrants, three possible reduced patterns were established: Reduction A from 28 to 14 measurements, Reduction B from 14 to 7 measurements, and Reduction C from 14 to 5 measurements.
Consequently, to validate this pattern reduction, cross-validation was performed, obtaining the best values, in terms of error, for an iteration of K = 5. In this step, due to the error values obtained for Reduction B, it was omitted as a pattern for the following steps, and the classification algorithm was only applied to Reductions A and C. When applying the kNN classification algorithm, selected for its simplicity, it was observed that the highest sensitivity values were obtained for k = 6, both for Reduction A and for Reduction C.
Given these results, dimensionality Reduction C was selected as valid because, although Reduction A could have been chosen, Reduction C presents two fewer measurements and, consequently, reduces the sorting time by 64% instead of the 50% of Reduction A, and only four electrodes are needed instead of the seven electrodes of Reduction A. In the new dimensions, it can be appreciated that the gestures can be divided into three groups, instead of the initial two: the first group includes the left and fist gestures, the second group includes the index-thumb and claw gestures, and the third group includes the thumb and relaxed gestures.
This work has demonstrated that the study of EIT applications is very versatile; improvements in hardware and software have been analyzed and proposed in this work in order to make a system that is easier to use and adaptable to the real-time applications proposed as a continuation of this work. Given the results obtained, using three gestures, one from each of the differentiable groups obtained, would be feasible and would help avoid possible classification errors.
From the results of the feasibility study for the development of a motor-driven wheelchair guided by the analyzed gestures, it follows that, comparing the cost of our option with those existing on the market, a saving of 23% is achieved; compared with the most expensive option, this saving increases to 85%.
In conclusion, this work proposes a feasible HMI application that applies bioimpedance patterns obtained from a set of gestures and associates them with the control of a motor. This is a viable alternative, since it is a product demanded by society and has a more attractive cost than existing solutions on the market. Current solutions are based on a sensor system with moving parts, which suffers from significant problems of wear, lack of adaptability to the patient, and lack of resolution, problems that the proposal implemented in this work does not present.

Author Contributions

Conceptualization, N.V.-G. and H.M.-G.; methodology, N.V.-G. and H.M.-G.; software, N.V.-G.; validation, N.V.-G. and H.M.-G.; formal analysis, N.V.-G. and H.M.-G.; investigation, N.V.-G.; resources, N.V.-G. and H.M.-G.; data curation, N.V.-G. and H.M.-G.; writing—original draft preparation, N.V.-G. and H.M.-G.; writing—review and editing, N.V.-G. and H.M.-G.; visualization, H.M.-G.; supervision, H.M.-G.; project administration, H.M.-G.; funding acquisition, H.M.-G. All authors have read and agreed to the published version of the manuscript.

Funding

Grant PGC2018-098946-B-I00 funded by MCIN/AEI/10.13039/501100011033 and by “ERDF A way of making Europe”.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Acknowledgments

The authors would like to thank the Spanish Ministerio de Ciencia, Innovación y Universidades (MICINN)-Agencia Estatal de Investigación (AEI) and the European Regional Development Funds (ERDF), by grant PGC2018-098946-B-I00 funded by MCIN/AEI/10.13039/501100011033/ and by ERDF A way of making Europe.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Wu, Y.; Jiang, D.; Liu, X.; Bayford, R.; Demosthenous, A. A Human–Machine Interface Using Electrical Impedance Tomography for Hand Prosthesis Control. IEEE Trans. Biomed. Circuits Syst. 2018, 12, 1322–1333. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  2. Ma, G.; Hao, Z.; Wu, X.; Wang, X. An optimal Electrical Impedance Tomography drive pattern for human-computer interaction applications. IEEE Trans. Biomed. Circuits Syst. 2020, 14, 402–411. [Google Scholar] [CrossRef] [PubMed]
  3. Yao, J.; Chen, H.; Xu, Z.; Huang, J.; Li, J.; Jia, J.; Wu, H. Development of a Wearable Electrical Impedance Tomographic Sensor for Gesture Recognition With Machine Learning. IEEE J. Biomed. Health Inform. 2019, 24, 1550–1556. [Google Scholar] [CrossRef]
  4. Jiang, D.; Wu, Y.; Demosthenous, A. Hand Gesture Recognition Using Three-Dimensional Electrical Impedance Tomography. IEEE Trans. Circuits Syst. II Express Briefs 2020, 67, 1554–1558. [Google Scholar] [CrossRef]
  5. Rezvanigilkilaei, S.; Vefaghnematollahi, S. Using Electrical Impedance Tomography to Control a Robot. Available online: https://publications.waset.org/10003835/using-electrical-impedance-tomography-to-control-a-robot (accessed on 18 June 2022).
  6. Zhang, Y.; Harrison, C. Tomo: Wearable, Low-Cost Electrical Impedance Tomography for Hand Gesture Recognition. In Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology, Charlotte, NC, USA, 8–11 November 2015; pp. 167–173. [Google Scholar]
  7. Dipietro, L.; Sabatini, A.M.; Dario, P. A Survey of Glove-Based Systems and Their Applications. IEEE Trans. Syst. Man Cybern. Part C (Appl. Rev.) 2008, 38, 461–482. [Google Scholar] [CrossRef]
  8. Kumuda, S.; Mane, P.K. Smart Assistant for Deaf and Dumb Using Flexible Resistive Sensor: Implemented on LabVIEW Platform. In Proceedings of the 2020 International Conference on Inventive Computation Technologies (ICICT), Coimbatore, India, 26–28 February 2020; pp. 994–1000. [Google Scholar] [CrossRef]
  9. Chen, Y.; Liang, X.; Assaad, M.; Heidari, H. Wearable Resistive-based Gesture-Sensing Interface Bracelet. In Proceedings of the 2019 UK/China Emerging Technologies (UCET), Glasgow, UK, 21–22 August 2019; pp. 1–4. [Google Scholar] [CrossRef] [Green Version]
  10. Stowe, S.; Adler, A. The Effect of Internal Electrodes on Electrical Impedance Tomography Sensitivity. In Proceedings of the 2020 42nd Annual International Conference of the IEEE Engineering in Medicine & Biology Society (EMBC), Montreal, QC, Canada, 20–24 July 2020; Volume 2020, pp. 1457–1460. [Google Scholar] [CrossRef]
  11. Grimnes, S.; Martinsen, O.G. Bioimpedance and Bioelectricity Basics, 3rd ed.; Elsevier: London, UK, 2014; Chapter 3. [Google Scholar]
  12. Bronzino, J.D.; Peterson, D.R. Medical Devices and Human Engineering, 1st ed.; Bronz, J.D., Ed.; CRC Press: Boca Raton, FL, USA, 2014; Chapter 10. [Google Scholar]
  13. Singh, H.P.; Kumar, P. Developments in the human machine interface technologies and their applications: A review. J. Med. Eng. Technol. 2021, 45, 552–573. [Google Scholar] [CrossRef] [PubMed]
  14. Rosell Ferrer, F.X. Tomografía de Impedancia Eléctrica Para Aplicaciones Médicas. Ph.D. Thesis, Universitat Politècnica de Catalunya, Barcelona, Spain, 1989. [Google Scholar]
  15. Gabriel, C.; Gabriel, S.; Corthout, E. The dielectric properties of biological tissues: I. Literature survey. Phys. Med. Biol. 1996, 41, 2231–2249. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  16. Barnes, F.S.; Greenebaum, B.; Greenebaum, B. Handbook of Biological Effects of Electromagnetic Fields—Two Volume Set; CRC Press: Boca Raton, FL, USA, 2018. [Google Scholar]
  17. Nordenström, B. Biologically Closed Electric Circuits: Clinical, Experimental and Theoretical Evidence for an Additional Circulatory System; Princeton University Press: Princeton, NJ, USA, 1983. [Google Scholar]
  18. Rigaud, B.; Morucci, J.P. Bioelectrical impedance techniques in medicine. Part III: Impedance imaging. First section: General concepts and hardware. Crit. Rev. Biomed. Eng. 1996, 24, 467–597. [Google Scholar] [CrossRef]
  19. Ackmann, J.J.; Seitz, M.A. Methods of complex impedance measurements in biologic tissue. Crit. Rev. Biomed. Eng. 1984, 11, 281–311. [Google Scholar] [PubMed]
  20. Cömert, A. The Assessment and Reduction of Motion Artifact in Dry Contact Biopotential Electrodes. Ph.D. Thesis, Tampere University of Technology, Tampere, Finland, 2015. [Google Scholar]
  21. Holder, D.S. Electrical Impedance Tomography, 1st ed.; Institute of Physics Publishing: Bristol, PA, USA, 2005; Article number 27. [Google Scholar] [CrossRef] [Green Version]
  22. Gupta, A.K. Application Report Respiration Rate Measurement Based on Impedance Pneumography. Texas Instruments. Application Report, 2011. Available online: https://www.semanticscholar.org/paper/Respiration-Rate-Measurement-Based-on-Impedance-Gupta/c761bc3d0abf54c4d042d9670808344248b7edc3 (accessed on 28 May 2022).
  23. Yufera, A.; Rueda, A. A method for bioimpedance measure with four- and two-electrode sensor systems. In Proceedings of the 2008 30th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Vancouver, BC, Canada, 20–25 August 2008; Volume 2008, pp. 2318–2321. [Google Scholar] [CrossRef]
  24. Webster, J.G. Medical Instrumentation: Application and Design, 5th ed.; John Wiley & Sons: New York, NY, USA, 2020. [Google Scholar]
  25. Chong, Y.L.; Chin, R.K.Y. An Investigation of the Effect of Different Number of Electrodes on EIT Reconstructed Images. In Proceedings of the 2020 IEEE 2nd International Conference on Artificial Intelligence in Engineering and Technology (IICAIET), Kota Kinabalu, Malaysia, 26–27 September 2020; pp. 1–6. [Google Scholar] [CrossRef]
  26. Mikulka, J.; Dusek, J.; Dedkova, J.; Pařílková, J.; Munsterova, Z. A Fast and Low-cost Measuring System for Electrical Impedance Tomography. In Proceedings of the 2019 PhotonIcs & Electromagnetics Research Symposium—Spring (PIERS-Spring), Rome, Italy, 17–20 June 2019; pp. 3751–3755. [Google Scholar] [CrossRef]
  27. Ito, T.; Kaneda, N.; Higuchi, Y. Simulation based prior evaluation of 3-D EIT system with a small number of electrodes. In Proceedings of the 2017 56th Annual Conference of the Society of Instrument and Control Engineers of Japan (SICE), Kanazawa, Japan, 19–22 September 2017; pp. 912–915. [Google Scholar] [CrossRef]
  28. Bagdalkar, P.; Ali, L. Interfacing of light sensor with FPGA using I2C bus. In Proceedings of the 2020 6th International Conference on Advanced Computing and Communication Systems (ICACCS), Coimbatore, India, 6–7 March 2020; pp. 843–846. [Google Scholar] [CrossRef]
  29. Liu, S.; Cao, R.; Huang, Y.; Ouypornkochagorn, T.; Jia, J. Time Sequence Learning for Electrical Impedance Tomography Using Bayesian Spatiotemporal Priors. IEEE Trans. Instrum. Meas. 2020, 69, 6045–6057. [Google Scholar] [CrossRef] [Green Version]
  30. Nansheng, P.; Yingling, S.; Changming, J. Research on comprehensive bid evaluation of construction project based on the principal component analysis. In Proceedings of the 2008 4th International Conference on Wireless Communications, Networking and Mobile Computing, Dalian, China, 12–14 October 2008. [Google Scholar]
  31. Sun, S.; Huang, R. An adaptive k-nearest neighbor algorithm. In Proceedings of the 2010 Seventh International Conference on Fuzzy Systems and Knowledge Discovery, Yantai, China, 10–12 August 2010; Volume 1, pp. 91–94. [Google Scholar] [CrossRef]
  32. Barreiro, M.; Sánchez, P.; Vera, J.; Viera, M.; Morales, I.; Dell´osa, A.H.; Bertemes-Filho, P.; Simini, F. Multiplexing Error and Noise Reduction in Electrical Impedance Tomography Imaging. Front. Electron. 2022, 3, 848618. [Google Scholar] [CrossRef]
Figure 1. Two-wire measurement scheme with eight electrodes: (a) Presents the two-wire measurement system to be used, with A electrode as an emitter and B electrode as a receiver; (b) shows a first loop of measurements with A electrode as an emitter (red arrows represent the measurement pairs) and a second loop of measurements with B electrode as an emitter (blue arrows represent the measurement pairs).
Figure 2. Gestures made with the right hand to acquire the impedance patterns. Each gesture is named as: (a) Claw; (b) fist; (c) index-thumb; (d) relaxed; (e) left; and (f) thumb.
Figure 3. Procedure for data processing.
Figure 4. Impedance patterns from gesture measurements for each combination: (a) Index-thumb gesture; (b) left gesture; (c) fist gesture; (d) claw gesture; (e) thumb gesture; (f) relaxed gesture. Each bioimpedance pattern corresponds to: black, first iteration; red, second iteration; blue, third iteration; pink, fourth iteration; green, fifth iteration.
Figure 5. Graph of correlations between each gesture. The linear fit and Pearson’s coefficient are also represented for: (a) Reduction A; (b) Reduction B; (c) Reduction C.
Figure 6. Scree graph that relates the eigenvalues with respect to the main components for: (a) Reduction A; (b) Reduction B; (c) Reduction C.
Figure 7. Biplot graph that relates Principal Component 2 with respect to Principal Component 1 for: (a) Reduction A; (b) Reduction B; (c) Reduction C.
Figure 8. Dimensionality reduction for: (a) Reduction A. Measurements A are indicated in green, measurements B in blue, measurements D in yellow, measurements E in orange, measurements F in black, and measurements G in gray; (b) Reduction B. Measurements A are indicated in green and measurements B in blue; and (c) Reduction C. Measurements E are indicated in orange, measurements F in black, and measurements G in gray.
Figure 9. Impedance pattern of each gesture measured for: (a) The 28 measurements; (b) Reduction A; (c) Reduction C. The legend is as follows: black, mean value for index-thumb gesture; red, mean value for left gesture; blue, mean value for fist gesture; pink, mean value for claw gesture; green, mean value for relaxed gesture; dark blue, mean value for thumb gesture.
Table 1. Frequency sweep parameters configured in the first step.

Parameter            Value
Start frequency      50 kHz
Delta frequency      500 Hz
Increment number     100
Final frequency      100 kHz
Table 2. Calibration parameters configured in the second step.

Parameter                Value
System clock             External clock
Output excitation        1 VPP
PGA control              Gain = 1
Calibration impedance    R1 = 2 kΩ
Table 3. Eigenvalues, variance, and accumulated variance for each principal component defined for Reduction A. In orange shading, the accumulated values greater than 90%.

Component (PCi)   Eigenvalue   Variance (%)   Accumulated (%)
1                 5.178        86.29          86.29
2                 0.413        6.89           93.18
3                 0.301        5.03           98.20
4                 0.0681       1.14           99.34
5                 0.0333       0.55           99.89
6                 0.00654      0.11           100.00
Table 4. Eigenvalues, variance, and accumulated variance for each principal component defined for Reduction B. In orange shading, the accumulated values greater than 90%.

Component (PCi)   Eigenvalue    Variance (%)   Accumulated (%)
1                 5.387         89.79          89.79
2                 0.436         7.27           97.06
3                 0.145         2.43           99.49
4                 0.0215        0.36           99.85
5                 0.00817       0.14           99.98
6                 9.22 × 10⁻⁴   0.02           100.00
Table 5. Eigenvalues, variance, and accumulated variance for each principal component defined for Reduction C. In orange shading, the accumulated values greater than 90%.

Component (PCi)   Eigenvalue    Variance (%)   Accumulated (%)
1                 5.3359        88.93          88.93
2                 0.447         7.46           96.39
3                 0.130         2.17           98.56
4                 0.0773        1.29           99.85
5                 0.00479       0.08           99.93
6                 4.02 × 10⁻³   0.07           100.00
Table 6. Measurements obtained after applying component analysis for Reduction A, Reduction B, and Reduction C.

Reduction   Measurements (electrode pair)
RedA        4 (AE), 5 (AF), 6 (AG), 10 (BE), 11 (BF), 12 (BG), 13 (BH), 21 (DG), 23 (EF), 24 (EG), 25 (EH), 26 (FG), 27 (FH), 28 (GH)
RedB        4 (AE), 5 (AF), 6 (AG), 10 (BE), 11 (BF), 12 (BG), 13 (BH)
RedC        24 (EG), 25 (EH), 26 (FG), 27 (FH), 28 (GH)
Table 7. Errors obtained after cross-validation for Reduction A (RedA), Reduction B (RedB), and Reduction C (RedC) for different values of K.

Iteration K   Gesture             Error, RedA   Error, RedB   Error, RedC
K = 2         Z Index-Thumb (Ω)   0.930         0.432         0.874
              Z Left (Ω)          0.955         0.443         0.897
              Z Fist (Ω)          0.953         0.442         0.895
              Z Claw (Ω)          0.934         0.433         0.877
              Z Relaxed (Ω)       0.779         0.361         0.732
              Z Thumb (Ω)         0.958         0.444         0.900
K = 3         Z Index-Thumb (Ω)   0.861         0.403         0.823
              Z Left (Ω)          0.863         0.403         0.822
              Z Fist (Ω)          0.861         0.402         0.819
              Z Claw (Ω)          0.858         0.399         0.813
              Z Relaxed (Ω)       0.836         0.389         0.790
              Z Thumb (Ω)         0.858         0.399         0.809
K = 4         Z Index-Thumb (Ω)   0.846         0.392         0.795
              Z Left (Ω)          0.868         0.403         0.815
              Z Fist (Ω)          0.866         0.402         0.814
              Z Claw (Ω)          0.849         0.394         0.798
              Z Relaxed (Ω)       0.708         0.328         0.665
              Z Thumb (Ω)         0.870         0.404         0.818
K = 5         Z Index-Thumb (Ω)   0.872         0.405         0.819
              Z Left (Ω)          0.895         0.415         0.841
              Z Fist (Ω)          0.893         0.414         0.839
              Z Claw (Ω)          0.875         0.406         0.822
              Z Relaxed (Ω)       0.730         0.339         0.686
              Z Thumb (Ω)         0.897         0.416         0.843
K = 6         Z Index-Thumb (Ω)   0.846         0.392         0.795
              Z Left (Ω)          0.868         0.403         0.815
              Z Fist (Ω)          0.866         0.402         0.814
              Z Claw (Ω)          0.849         0.394         0.798
              Z Relaxed (Ω)       0.708         0.328         0.665
              Z Thumb (Ω)         0.870         0.404         0.818
Table 8. Descriptive parameters mean (x) and standard deviation (δ) for the mean value of each gesture for Reduction A (Red. A), Reduction B (Red. B), and Reduction C (Red. C).

                Z Index-Thumb (Ω)   Z Left (Ω)   Z Fist (Ω)   Z Claw (Ω)   Z Relaxed (Ω)   Z Thumb (Ω)
Red. A   x      416.82              420.37       421.56       418.29       364.31          364.38
         δ      13.09               20.38        24.48        14.08        12.39           10.79
Red. B   x      414.98              415.59       412.44       416.54       363.20          362.90
         δ      15.24               18.93        18.31        16.79        15.01           11.39
Red. C   x      418.67              425.15       430.68       420.05       365.41          365.86
         δ      10.80               21.32        27.02        11.10        9.53            10.37
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
