Article

Using Computer Vision to Track Facial Color Changes and Predict Heart Rate

Salik Ram Khanal, Jaime Sampaio, Juliana Exel, Joao Barroso and Vitor Filipe
1 Research Center in Sports Sciences, Health Sciences and Human Development, CIDESD, Universidade de Trás-os-Montes e Alto Douro, 5000-801 Vila Real, Portugal
2 Institute for Systems and Computer Engineering, Technology and Science, INESC TEC, 4200-465 Porto, Portugal
3 Department of Sport Science, Biomechanics, Kinesiology and Computer Science, University of Vienna, 1150 Vienna, Austria
* Author to whom correspondence should be addressed.
J. Imaging 2022, 8(9), 245; https://doi.org/10.3390/jimaging8090245
Submission received: 28 June 2022 / Revised: 23 August 2022 / Accepted: 6 September 2022 / Published: 9 September 2022

Abstract

Current technological advances have pushed the quantification of exercise intensity into a new era of physical exercise sciences. Monitoring physical exercise is essential in the process of planning, applying, and controlling loads for performance optimization and health. Although many research studies have applied statistical approaches to estimate physiological indices, to our knowledge none has investigated the relationship between facial color changes and increasing exercise intensity. The aim of this study was to develop a non-contact method based on computer vision to determine the heart rate and, ultimately, the exercise intensity. The method was based on analyzing facial color changes during exercise using the RGB, HSV, YCbCr, Lab, and YUV color models. Nine university students participated in the study (mean age = 26.88 ± 6.01 years, mean weight = 72.56 ± 14.27 kg, mean height = 172.88 ± 12.04 cm; six males and three females, all white Caucasian). The data were analyzed separately for each participant (personalized model) and for all participants at once (universal model). Multiple autoregression models and a multiple polynomial regression model were designed to predict the maximum heart rate percentage (maxHR%) from each color model. The results were analyzed and evaluated using the Root Mean Square Error (RMSE), F-values, and R-square. The multiple polynomial regression using all participants exhibited the best accuracy, with an RMSE of 6.75 (R-square = 0.78). Exercise prescription and monitoring can benefit from these methods, for example, to optimize online monitoring without the need for any additional instrumentation.

1. Introduction

Current technological advances have pushed the process of quantifying exercise intensity to higher levels in sports sciences. Monitoring physical exercise is essential in the process of planning, applying, and controlling loads for performance optimization and health [1,2,3]. Physiological responses vary depending on the individual, the task, and the level of exertion during physical exercise [4]. In steady-state continuous exercise, the level of physiological exertion is very low at the beginning and increases linearly with exercise intensity. When intensity rises above the ventilatory and/or anaerobic threshold, the increase in exertion becomes exponential. The changes in cardiovascular status during exercise and, broadly, the entirety of physiologic function are reflected in heart rate (HR) responses [5,6], although body temperature, blood pressure, and blood lactate concentration can also quantify internal loads [7].
Exercise intensity can be estimated using objective and subjective measurements. Subjective measurements represent the level of exertion imposed by exercise through psychophysiological scales, which require considerable individual familiarization and consume additional training time; the Borg scale of perceived exertion [4] is a well-known example. An important part of the research on measuring physical exercise intensity relies on the extraction of physiological parameters such as heart rate [8,9], the gold-standard measure of internal responses to exercise. Previous studies have demonstrated that the heart rate response to exercise is quite individual, whether the exercise is moderate or vigorous [10,11]. The HR response is directly related to cardiac output during effort and can be obtained using contact or contactless sensing techniques [12,13].
The use of wearables for monitoring biological signals during exercise is a trend that opens a novel, ecological perspective for investigations in sports sciences and medicine [14,15]. However, contact sensors may present drawbacks for users, such as discomfort during exercise. Moreover, during movement, the devices can lose contact with the skin, impairing data recording, and failures in data transmission from the sensor to the computing device can compromise sampling. Contactless sensor technologies [16,17,18] and dedicated methodologies for processing contactless signals [19,20,21,22] are good alternatives to overcome these difficulties in estimating exercise intensity and energy expenditure while preserving the ecological demands of applied science [12,23]. In this sense, facial features are a promising route for the contactless estimation of physiological parameters [7,24,25]. Although facial expression of emotion is a cue that has been related to changes in physical and mental state [26,27,28,29,30], the use of facial color is an emerging technique in pattern recognition [31,32,33] that may provide a simple but innovative source of information that can be linked to exertion during exercise.
In recent years, exercise-related data have been analyzed using several statistical and machine learning approaches proposed to classify and estimate exercise intensity and accumulated fatigue [34,35,36,37,38,39]. The outcomes of [34,35] suggested a strong correlation between facial color and heart rate. This evidence points towards the potential of facial color analysis to estimate physical exertion levels; however, no studies have tracked its dynamics against gold-standard measures, such as heart rate and perceived exertion, across exercise. Despite the many studies proposing statistical approaches to estimate physiological indices, to our knowledge none has investigated the relationship between facial color changes and increasing exercise intensity.
Thus, in this paper, we aimed to describe the efficacy of various color models for facial color tracking in relation to heart rate dynamics during fatiguing exercise. Our investigation intended to find the potential statistical relationship between facial color changes and exercise intensity, as indexed by heart rate. We believe that a deeper understanding of the relationship between facial color changes and exercise intensity is important not only from a research perspective but also for practical application [40]. Autoregressive analysis is used as the statistical tool to investigate this relationship. The possibility of designing a global regression equation is also investigated using a regression model that considers all subjects' data at once, and a polynomial regression equation is designed for the global model. Standard errors, t-statistics, p-values, and multicollinearity between the color components of each color model are also taken into consideration. The outcome of this research contributes to designing better individualized exercise models. Beyond this, automatic control of exercise equipment based on exercise intensity could be implemented, and the results may also be useful for accident prevention in elderly people during physical exercise.
The rest of this paper is organized as follows: Section 2 reviews the work related to this study; Section 3 explains the proposed methods in detail; Section 4 presents the experimental results; Section 5 discusses them; and Section 6 concludes the paper and provides directions for future work.

2. Related Works

One of the most visible application fields of computer vision based on facial expression and color analysis is the detection of drivers' tiredness and fatigue. Many of these approaches use color information from digital images to improve their algorithms [22,41,42], which can be an important cue for linking fatigue and tiredness detection to physical exercise. In face recognition problems, facial color information represented in different color spaces is a valuable tool for improving pattern detection [22,32,43,44,45,46,47]. Thus, it seems coherent that facial expression detection might also be enhanced by considering various color models [48,49]. Computer vision techniques have been shown to be feasible for detecting perceived exertion [50,51]. However, there is a lack of information on the effectiveness of such non-contact techniques and their models, such as those based on image processing, in describing the physiological process of fatigue during exercise when compared to gold-standard contact sensors.
Digital images can be represented by the intensity of the color components present in each pixel through a range of color channels. Recent research efforts have described color models other than RGB (Red, Green, Blue) that may provide effective information for facial image processing tasks such as face detection and recognition. Various studies have proposed measuring physiological parameters, including HR, using alternative color spaces (HSV, YCbCr, Lab, etc.) instead of raw RGB. Lueangwattana et al. [52] proposed an approach to measure HR using each color component of RGB and HSV (Hue, Saturation, Value) and found that the H component and a third principal component of R provide better accuracy. Wang et al. proposed a model to recognize micro-expressions, and the results showed that Tensor Independent Color Space (TICS), CIELab, and CIELuv perform better than RGB or grayscale [53,54]. The efficacy of color models other than RGB and grayscale for estimating levels of exertion during physical exercise is still unknown. The literature has shown that facial color variation is related to increasing exercise intensity, with skin turning redder due to physical exertion [45]. A dynamic appearance model of skin color, built from in vivo, real-time measurements of melanin and hemoglobin concentration, has also been shown to change with high-intensity exercise [43].
Several statistical and machine learning approaches have recently been proposed to classify and estimate exercise intensity and accumulated fatigue. Timme and Brand [40] proposed a statistical approach using multilevel regression to analyze facial action movement and predict exertion during incremental physical exercise, concluding that there are significant changes in facial activity with respect to exercise intensity. Beyond facial expression, facial color change during exercise is also a potential cue for estimating exercise intensity. Perrett et al. [55] investigated the relationship between skin color, aerobic fitness (measured VO2 max), and body fat; the results suggested that there are significant changes in facial color with respect to exercise intensity. Resting heart rate has also been measured from the R, G, and B color channels in facial color analysis [56,57].

3. Materials and Methods

3.1. Data Collection

Nine university students participated in the study (mean age = 26.88 ± 6.01 years, mean weight = 72.56 ± 14.27 kg, mean height = 172.88 ± 12.04 cm; six males and three females, all white Caucasian). Detailed information about the participants is presented in Table 1. All participants signed an informed consent form prior to data collection and went through protocol familiarization before the recordings. The participants were not allowed to talk during the test but could express their feelings freely through facial expression. During video recording, the participants were instructed to look straight at the camera lens. The test consisted of a submaximal ramp exercise protocol on a Wattbike cycloergometer (Wattbike Ltd., Nottingham, UK), after a 5-min warm up. The initial power output was 75 W and was incremented by 15 W·min−1 until the participants reached 85% of their maximal heart rate (calculated as 208 − (0.7 × age) [58]) or were unable to maintain the cadence required to generate the power output of the stage. Heart rate data were collected at 1 Hz using a Polar T31 heart rate monitor (Polar Electro, Kempele, Finland) synchronized with the Wattbike load cell, which sampled power output at 100 Hz. For facial tracking, images were recorded during the test using a video camera placed on a tripod approximately one meter from the cycloergometer, in a frontal plane view, to capture the participants' faces while performing the exercise, as shown in Figure 1. The camera was adjusted to maintain a 90° angle between the face and the camera. The only light sources were the fluorescent bulbs inside the laboratory. The video was recorded at 25 Hz with a spatial resolution of 1080 × 1920 pixels.
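As an illustration of the stopping rule above, the following minimal Python sketch (Python being the language the authors report using for their analyses) computes the age-predicted maximal heart rate and the 85% termination threshold; the function names are ours, not part of the original protocol code.

```python
# Minimal sketch of the ramp-termination criterion described above.
def hr_max(age_years: float) -> float:
    # Age-predicted maximal heart rate, 208 - (0.7 x age) [58].
    return 208.0 - 0.7 * age_years

def reached_target(current_hr_bpm: float, age_years: float) -> bool:
    # The ramp ends once 85% of the predicted maximal heart rate is reached.
    return current_hr_bpm >= 0.85 * hr_max(age_years)

# Example: for a 24-year-old, hr_max = 191.2 bpm and the threshold is ~162.5 bpm.
```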

3.2. Data Processing

The recorded video was processed off-line to track the average color intensity of a small patch on the participant's forehead during the exercise. The block diagram of the proposed system is shown in Figure 2. The 25 Hz video was re-sampled to 1 Hz to match the heart rate, which was recorded at 1 Hz.

3.2.1. Pre-Processing

Due to head movement during cycling, some frames extracted from the video recording presented blurring. To overcome this problem, the Wiener–Hunt deconvolution algorithm was applied to each frame [59], based on a threshold on the image variance. After de-blurring, a Gaussian filter was applied for noise reduction.
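The sketch below illustrates this pre-processing step under stated assumptions: OpenCV and scikit-image as the tooling, a Laplacian-variance check as the blur threshold, and a small uniform kernel as the point spread function; none of these specifics are given in the paper.

```python
import cv2
import numpy as np
from skimage import img_as_float
from skimage.restoration import unsupervised_wiener

BLUR_THRESHOLD = 100.0  # hypothetical variance threshold; would need tuning

def preprocess_frame(frame_bgr):
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    # Low Laplacian variance is a common proxy for a blurred frame.
    if cv2.Laplacian(gray, cv2.CV_64F).var() < BLUR_THRESHOLD:
        psf = np.ones((5, 5)) / 25.0  # assumed point spread function
        restored = []
        for channel in cv2.split(img_as_float(frame_bgr)):
            deconvolved, _ = unsupervised_wiener(channel, psf)
            restored.append(np.clip(deconvolved, 0.0, 1.0))
        frame_bgr = (cv2.merge(restored) * 255).astype(np.uint8)
    # Gaussian filtering for noise reduction after de-blurring.
    return cv2.GaussianBlur(frame_bgr, (5, 5), 0)
```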

3.2.2. Face Detection

Participants' faces were detected with the Viola–Jones face detection algorithm [60]. The detected face was cropped and resized to 400 × 400 pixels (Figure 3a). Based on results reported in the literature, the forehead patch provides the highest accuracy of heart rate measurement among candidate patch locations, even better than the whole face [41,61]. Therefore, a 16 × 16 pixel image was cropped from the lower part of the forehead (Figure 3b).
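A minimal sketch of the detection and cropping step, using OpenCV's bundled Haar cascade as a stand-in for the Viola–Jones detector; the exact coordinates of the lower-forehead patch are not reported in the paper, so the values below are hypothetical.

```python
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def extract_forehead_patch(frame_bgr):
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = faces[0]
    # Crop the detected face and resize it to 400 x 400 pixels.
    face = cv2.resize(frame_bgr[y:y + h, x:x + w], (400, 400))
    # Hypothetical location of the 16 x 16 lower-forehead patch.
    top, left = 60, 192
    return face[top:top + 16, left:left + 16]
```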

3.2.3. Normalization

The images were recorded in a room lit by AC fluorescent bulbs powered at 50 Hz, which affected the image brightness. The solution to this issue was color normalization to eliminate brightness variations. Consider an RGB image of size M × N pixels represented by I[M, N, c], where M is the width, N is the height, and c is the color component. The normalized value of each color component of each pixel is calculated by Expression (1).
$$I_{norm}[i, j, c] = \frac{I[i, j, c]}{\sum_{c=1}^{3} I[i, j, c]}, \quad i = 1, \ldots, M, \; j = 1, \ldots, N \tag{1}$$
where c = {1, 2, 3} corresponds to the red (R), green (G), and blue (B) components of image I.
From the above expression, it follows that:
$$\sum_{c=1}^{3} I_{norm}[i, j, c] = 1 \tag{2}$$
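A minimal NumPy sketch of Expression (1), assuming the frame is an RGB array; the guard against zero-sum (fully black) pixels is our addition.

```python
import numpy as np

def normalize_colors(frame_rgb):
    frame = frame_rgb.astype(np.float64)
    channel_sum = frame.sum(axis=2, keepdims=True)
    channel_sum[channel_sum == 0] = 1.0  # avoid division by zero on black pixels
    normalized = frame / channel_sum
    # Per Expression (2), the three components of each pixel now sum to 1.
    return normalized
```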

3.2.4. Color Space Conversion

The RGB color model is an additive color system in which combinations of the three primaries produce the colors visible to the human eye. Although RGB is the most common model for image processing, we added four other models for tracking the color components, to compare their efficiency: Hue-Saturation-Value (HSV), Luminance-Chroma Blue-Chroma Red (YCbCr), Lightness-Green/Red-Blue/Yellow (Lab), and YUV. After normalization, the original RGB image was converted to each of these color models.
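A minimal sketch of the conversion step using OpenCV, assuming the normalized image is a float32 RGB array in [0, 1]; note that OpenCV labels the YCbCr conversion COLOR_RGB2YCrCb (channel order Y, Cr, Cb).

```python
import cv2
import numpy as np

def convert_color_spaces(norm_rgb):
    rgb = norm_rgb.astype(np.float32)
    return {
        "RGB": rgb,
        "HSV": cv2.cvtColor(rgb, cv2.COLOR_RGB2HSV),
        "YCbCr": cv2.cvtColor(rgb, cv2.COLOR_RGB2YCrCb),
        "Lab": cv2.cvtColor(rgb, cv2.COLOR_RGB2Lab),
        "YUV": cv2.cvtColor(rgb, cv2.COLOR_RGB2YUV),
    }
```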

3.2.5. Patch Averaging

After conversion of the normalized RGB image to HSV, YCbCr, Lab, and YUV, the pixels within the 16 × 16 forehead patch were spatially averaged to yield one value per color component per frame (one frame per second), from the beginning to the end of the fatiguing incremental protocol performed on the cycloergometer; these values form the raw signal of each channel. The patch average $\bar{y}_{c_i}$ was calculated for each color component and color space as:
$$\bar{y}_{c_i} = \frac{1}{M \times N} \sum_{m=1}^{M} \sum_{n=1}^{N} y_c(m, n) \tag{3}$$
where i is the color channel (R, G, or B for RGB; H, S, or V for HSV; and so on), M and N are the patch dimensions (16 × 16), and y_c(m, n) is the pixel value of color channel i.
Applying Expression (3) to each frame results in one vector per component, with length equal to the number of seconds in the video; therefore, each color model yields three vectors, one per color component.
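A minimal sketch of Expression (3): each 16 × 16 patch is spatially averaged and the per-second values are stacked into one time series per color component (variable names are ours).

```python
import numpy as np

def patch_time_series(patches):
    """patches: iterable of (16, 16, 3) arrays, one per 1 Hz frame."""
    # Averaging over the spatial axes leaves one value per color component.
    return np.stack([patch.mean(axis=(0, 1)) for patch in patches])  # shape (T, 3)
```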

3.2.6. Median Filter

The patch-averaged signal is usually noisy; therefore, a median filter followed by a moving average filter was applied to smooth the data. The median filter window size was set to 201 points and, to solve the endpoint problem, signal padding with 40 points and signal reflection were applied.
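A minimal sketch of the median filtering step, assuming SciPy; here the endpoint problem is handled by reflect-padding the signal before filtering, which combines the padding and reflection described above.

```python
import numpy as np
from scipy.signal import medfilt

def median_smooth(signal, window=201, pad=40):
    padded = np.pad(np.asarray(signal, dtype=float), pad, mode="reflect")
    filtered = medfilt(padded, kernel_size=window)  # kernel size must be odd
    return filtered[pad:-pad]
```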

3.2.7. Average Filter

To smooth the small peaks remaining after the median filter, a moving average filter with a small window size (WS) was applied. The average value within the sliding window was calculated as in Expression (4).
$$y(n) = \frac{1}{WS} \left[ x(n) + x(n-1) + \cdots + x(n - WS + 1) \right] \tag{4}$$
where y(n) is the averaged value, x(n) is the input value, and WS is the window size used for the smoothing operation.
The result of the median and the moving average filter is shown in Figure 4.
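A minimal NumPy sketch of the moving average in Expression (4); the window size below is illustrative, since the paper only states that a small window was used.

```python
import numpy as np

def moving_average(signal, window_size=5):
    kernel = np.ones(window_size) / window_size
    # 'same' mode keeps the smoothed output aligned with the input length.
    return np.convolve(np.asarray(signal, dtype=float), kernel, mode="same")
```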

3.3. Statistical Analysis

For all data analyses, both the heart rate and the video recordings were re-sampled to 1 Hz. The individual color components of each color space were analyzed separately, i.e., RGB = [R, G, B], HSV = [H, S, V], YCbCr = [Y, Cb, Cr], Lab = [L, a, b1], and YUV = [Y1, U, V1], as well as in combined form. The normalized heart rate and the normalized color intensity of the RGB images of one randomly selected participant are plotted to illustrate how color intensity varies with heart rate changes. Various statistical approaches were used to analyze the color changes with respect to exercise intensity, where exercise intensity was represented by HR; most of these approaches relate heart rate to the color intensities.
The association between HR and the various color models was calculated using continuous analysis of the color intensity. Multivariate time series analysis, using multiple-variable autoregression with a lag of 1, was used to estimate the maximum HR percentage (maxHR%) from the individual color intensities (R, G, and B for the RGB model) and their previous values. Before fitting the multivariate autoregressive model, all the independent variables (three per participant and color model) were tested and found to be stationary; the Augmented Dickey-Fuller (ADF) test was used for this stationarity check. The autoregressive model is given by Equation (5). The maxHR% was calculated as HR × 100/(220 − age). Before fitting the regression models, the independent variables were normalized between 0 and 1. Regression models were developed both for each individual participant (personalized model) and for the whole sample at once (universal model).
$$HR_t = a_1 + w_1 \times HR_{t-1} + w_n \times CR_n(t-1) + e_{t-1} \tag{5}$$
where $a_1$ is the constant term, $w_1, w_2, w_3$ are the coefficients, $CR_n$ represents the color values, and $e$ is the error term.
A universal multiple polynomial regression model, implemented as a Support Vector Regression (SVR), was also used to predict maxHR% from the three components of each color model for all participants at once. The root mean square error (RMSE) was calculated over the whole sample to find the best predictor of heart rate among the color components of each model. All statistics were computed using dedicated code written in Python 3.5.
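The sketch below outlines this statistical pipeline under stated assumptions: statsmodels for the ADF test and scikit-learn for the lag-1 regression and the degree-three polynomial SVR. The authors report using Python 3.5, but the specific libraries, function names, and in-sample RMSE computation here are ours, not their exact implementation.

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller
from sklearn.linear_model import LinearRegression
from sklearn.svm import SVR
from sklearn.metrics import mean_squared_error

def is_stationary(series, alpha=0.05):
    # Augmented Dickey-Fuller test: a small p-value rejects the unit root.
    return adfuller(series)[1] < alpha

def fit_lag1_autoregression(max_hr_pct, colors):
    """colors: (T, 3) array of normalized components of one color model."""
    X = np.column_stack([max_hr_pct[:-1], colors[:-1]])  # lag-1 predictors
    y = max_hr_pct[1:]
    model = LinearRegression().fit(X, y)
    rmse = np.sqrt(mean_squared_error(y, model.predict(X)))
    return model, rmse

def fit_polynomial_svr(max_hr_pct, colors):
    # Degree-three polynomial SVR used for the universal (all-participant) model.
    model = SVR(kernel="poly", degree=3).fit(colors, max_hr_pct)
    rmse = np.sqrt(mean_squared_error(max_hr_pct, model.predict(colors)))
    return model, rmse
```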

4. Results

The main objective of this study was to analyze the relationship between heart rate and facial color changes during fatiguing exercise. The results of image pre- and post-processing are shown in the respective sections.

4.1. Color Intensity vs. HR Plot

The normalized heart rate and the normalized color intensity of the RGB images of one randomly selected participant are plotted in Figure 5. The result indicates that facial color intensity changes according to the effort performed during fatiguing exercise, and that this variation corresponds to changes in heart rate. An increase in heart rate reflects more frequent pumping of blood, increasing blood flow to the facial vessels, turning the facial skin redder and affecting the combined color intensity. However, we observed a decrease in the red component as the heart rate increased; this result can be attributed to the color normalization performed during image pre-processing.

4.2. Multivariate Autoregression Analysis

The multivariate autoregression model was designed and analyzed using the three color components of each color model together. The model was developed to estimate maxHR% from facial color intensity, with the normalized values of the three components as independent variables. The RMSE of each personalized model is presented in Table 2. The best average RMSE, 0.255, was obtained using the individual color components of the HSV color model.

4.3. Polynomial Support Vector Regression

Using the same independent and dependent variables as in the autoregression models, a polynomial SVR (degree three) was used to predict maxHR%. This analysis was limited to the global model rather than individual participants. The results were interpreted using RMSE, F-values, and R-square values, as for the individual regression models (see Table 3). A substantial difference between the individual models and the global model was detected, especially in terms of RMSE: compared to the autoregression models, the universal polynomial regression model provided inferior accuracy, with its lowest RMSE obtained for the HSV color model. As all the color models are computed from the RGB color model, the color channels might be multicollinear; therefore, the multicollinearity between the color components of each color model was tested and the Variance Inflation Factor (VIF) was calculated. The results of the multicollinearity test are presented in Table 4. All the VIF values are less than 10 and most are less than 5, indicating that the independent variables are not strongly collinear, which supports developing a global regression model for each color model.
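A minimal sketch of the VIF computation, assuming statsmodels; X would hold the three color components of one color model as columns, and the component names are placeholders.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

def compute_vif(X, names=("c1", "c2", "c3")):
    Xc = sm.add_constant(np.asarray(X, dtype=float))
    # Column 0 is the constant term; report one VIF per color component.
    return {name: variance_inflation_factor(Xc, i + 1) for i, name in enumerate(names)}
```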

5. Discussion

During physical exercise, the human body needs more oxygen, so the heart beats faster to pump more blood to the body; heart rate increases in proportion to exercise intensity [62,63,64]. Based on related research, it is well known that exercise intensity and HR have a linear relationship, and HR is generally considered a good parameter for assessing and monitoring relative exercise intensity [5,63]. This study aimed to quantify the relationship between heart rate measured using a contact sensor and heart rate estimated through facial skin color modelling in healthy young people during a sub-maximal ramp exercise protocol. The main findings suggested that the accuracy of the estimated heart rate was higher when facial skin color was modeled with the combined color space channels (RGB, HSV, YCbCr, Lab, and YUV; multiple linear regression using fifteen independent variables) than with models that accounted for a single color channel (R, G, B, H, S, V, etc.; single-variable linear regression).
Alternative color spaces have long been used in image processing, including in skin color and facial expression analysis [27,28,44,53]. Thus, it was expected that the models would show better performance for face skin segmentation using HSV and YCbCr rather than RGB, as reported previously [65,66,67]. The green channel of the additive color space showed the highest accuracy and consistency in the correlation model for heart rate estimation, which is justified by the fact that green light is better absorbed by hemoglobin than red and blue light [67].
As mentioned, there is a significant change in facial color with respect to increased exercise intensity, and a strong association between facial color and heart rate, as the skin becomes redder with the increased heart rate of physical exertion [45]. However, to our knowledge, this is the first study to use regression analysis to investigate the relationship between facial color and increasing heart rate during incremental-intensity cycling. From the results, we found a strong association between facial color changes and increased heart rate. The initial heart rate, maximum heart rate, and exercise duration are quite dependent on the individual participant. As our study considered both a personalized regression model and a global model, the individual regression equations exhibited a particularly strong association. It is also worth noting that the facial color changes were highly heterogeneous between participants.
The possibility of obtaining a global regression model was also investigated and yielded statistically significant results. The data analysis was extended to polynomial regression, taking into account the multicollinearity between the individual independent variables. As Perrett et al. [55] suggested, facial color changes correlate strongly with exercise intensity. The results of the global regression equation suggest that facial color changes with heart rate in a nonlinear fashion, which points towards nonlinear regression approaches for future extensions of this research.
Previous studies have already reported a strong relationship between heart rate and color space components, but mostly for people at rest, not continuously during exercise [61]. Because intensity needs to be assessed during physical performance, determining the behavior of the color components for predicting heart rate over time enhances the importance of this study. Another noteworthy aspect is the individuality observed in the relationship between heart rate and color intensity values. We believe that the current study takes an important step by effectively estimating physical effort from images during exercise; nevertheless, individual variability remains an issue to be addressed in future studies to achieve higher accuracy.
There are some limitations to this study: (1) Only healthy participants aged 18–36 years were recruited; therefore, the findings may not generalize to other populations, such as the elderly. (2) The recruited participants were all Caucasian; therefore, the findings may not generalize to non-Caucasian populations. (3) The study was conducted in a laboratory setting on stationary cycles; therefore, the findings may not hold for other types of exercise protocol. (4) The video was recorded under fluorescent light; therefore, the results may not generalize to other light sources, such as ambient light. (5) The number of participants was limited (only nine: six males and three females) and not gender-balanced; further investigation with a greater number of participants and more diverse ethnicity and gender is therefore suggested. (6) Heart rate was correlated only with color intensity changes; combining other parameters, such as temperature, might increase the accuracy of HR prediction.

6. Conclusions and Future Work

The experimental results showed that facial color changes with increasing exercise intensity. The main findings suggested that the accuracy of the estimated heart rate was highest with the HSV color model in the autoregressive model, with an average RMSE of 0.255. Compared to the global polynomial regression model, the individual models exhibited better results. From the overall results, it can be concluded that the proposed models seem best suited to personalized exercise intensity monitoring rather than to a universal regression model. A high degree of heterogeneity of facial color changes was observed between participants, reflecting individual differences. Exercise prescription and monitoring can benefit from these methods, for example, to optimize online monitoring without the need for any additional instrumentation.
This work can be extended by relating heart rate and color intensity using more than five color models, as well as by investigating different skin colors. Since the results indicate that the method is most appropriate for individual analysis, the work can also be extended by analyzing data from the same participant across multiple trials and in various situations.

Author Contributions

Conceptualization, S.R.K. and V.F.; methodology, S.R.K. and V.F.; software, S.R.K.; validation; formal analysis, S.R.K., J.S. and V.F.; investigation, S.R.K., V.F. and J.B.; resources, S.R.K., J.S. and J.E.; writing—original draft preparation, S.R.K.; writing—review and editing, S.R.K., J.S., J.E., J.B. and V.F.; project administration, J.B. and V.F.; funding acquisition, J.B., V.F. and J.S. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by Project “NanoSTIMA: Macro-to-Nano Human Sensing: Towards Integrated Multimodal Health Monitoring and Analytics/NORTE-01-0145-FEDER-000016” financed by the North Portugal Regional Operational Programme (NORTE 2020), under the PORTUGAL 2020 Partnership Agreement, and through the European Regional Development Fund (ERDF).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Acknowledgments

The authors would like to thank all the participants who were involved in the experiments.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Nelson, M.E.; Rejeski, W.J.; Blair, S.N.; Duncan, P.W.; Judge, J.O.; King, A.C.; Macera, C.A.; Castaneda-Sceppa, C. Physical activity and public health in older Adults: Recommendation from the American College of Sports Medicine and the American Heart Association. Med. Sci. Sports Exerc. 2007, 39, 1435–1445. [Google Scholar] [CrossRef] [PubMed]
  2. Warburton, D.H.; Paterson, D.E. Physical activity and functional limitations in older adults: A systematic review related to Canada’s Physical Activity Guidelines. Int. J. Behav. Nutr. Phys. Act. 2010, 7, 38. [Google Scholar]
  3. Duking, P.; Achtzehn, S.; Holmberg, H.-C.; Sperlich, B. Integrated Framework of Load Monitoring by a Combination of Smartphone Applications, Wearables and Point-of-Care Testing Provides Feedback that Allows Individual Responsive Adjustments to Activities of Daily Living. Sensors 2018, 18, 1632. [Google Scholar] [CrossRef]
  4. Borg, G. Borg’s Perceived Exertion and Pain Scales; Human Kinetics: Rimbi, Sweden, 1998. [Google Scholar]
  5. Dooley, E.E.; Golaszewski, N.M.; Bartholomew, J.B. Estimating Accuracy at Exercise Intensities: A Comparative Study of Self-Monitoring Heart Rate and Physical Activity Wearable Devices. JMIR Mhealth Uhealth 2017, 5, e34. [Google Scholar] [CrossRef]
  6. Hensen, S.J. Measuring physical activity with heart rate monitors. Am. J. Public Health 2017, 107, e24. [Google Scholar] [CrossRef] [PubMed]
  7. Miles, K.H.; Clark, B.; Périard, J.D.; Goecke, R.; Thompson, K.G. Facial feature tracking: A psychophysiological measure to assess exercise intensity? J. Sports Sci. 2017, 36, 934–941. [Google Scholar] [CrossRef]
  8. Kiviniemi, A.M.; Hautala, A.J.; Kinnunen, H.; Nissila, J.; Virtanen, P.; Karjalainen, J.; Tulppo, M.P. Daily Exercise Prescription on the Basis of HR Variability among Men and Women. Med. Sci. Sports Exerc. 2010, 42, 1355–1363. [Google Scholar] [CrossRef]
  9. Nakamura, F.Y.; Flatt, A.A.; Pereira, L.A.; Campillo, R.R.; Loturco, I.; Esco, M.R. Ultra-Short-Term Heart Rate Variability is Sensitive to Training Effects in Team Sports Players. J. Sports Sci. Med. 2015, 14, 602–605. [Google Scholar]
  10. Orini, M.; Tinker, A.; Munroe, P.B.; Lambiase, P.D. Long-term intra-individual reproducibility of heart rate dynamics during exercise and recovery in the UK Biobank cohort. PLoS ONE 2017, 12, 183732. [Google Scholar] [CrossRef]
  11. Hunt, K.J.; Grunder, R.; Zahnd, A. Identification and comparison of heart-rate dynamics during cycle ergometer and treadmill exercise. PLoS ONE 2019, 14, 220826. [Google Scholar] [CrossRef]
  12. Mackingnon, S.N. Relating heart rate and rate of perceived exertion in two simulated occupational tasks. Ergonomics 1999, 42, 761–766. [Google Scholar] [CrossRef] [PubMed]
  13. Chen, Y.L.; Chin, C.C.; Hsia, P.Y.; Lin, S.K. Relationships of Borg’s RPE 6-20 scale and heart rate in dynamic and static exercises among a sample of young Taiwanese men. Percept. Mot. Ski. Phys. Dev. Meas. 2013, 117, 971–982. [Google Scholar] [CrossRef] [PubMed]
  14. Dias, D.; Cunha, J. Wearable Health Devices—Vital Sign Monitoring, Systems and Technologies. Sensors 2018, 18, 2414. [Google Scholar] [CrossRef] [PubMed]
  15. Sun, G.; Matsui, T.; Watai, Y.; Kim, S.; Kirimoto, T.; Suzuki, S.; Hakozaki, Y. Vital-SCOPE: Design and Evaluation of a Smart Vital Sign Monitor for Simultaneous Measurement of Pulse Rate, Respiratory Rate, and Body Temperature for Patient Monitoring. J. Sens. 2018, 2018, 1–7. [Google Scholar] [CrossRef]
  16. Butte, N.F.; Ekelund, U.; Westerterp, K.R. Assessing physical activity using wearable monitors: Measures of physical activity. Med. Sci. Sports Exerc. 2012, 44, 5–12. [Google Scholar] [CrossRef]
  17. Chen, K.; Janz, K.F.; Zhu, W.; Brychta, R.J. Re-Defining the roles of sensors in objective physical activity monitoring. Med. Sci. Sports Exerc. 2012, 44, 13–23. [Google Scholar] [CrossRef]
  18. Arif, M.; Kattan, A. Physical activities monitoring using wearable acceleration sensors attached to the body. PLoS ONE 2015, 10, 1–16. [Google Scholar] [CrossRef] [PubMed]
  19. Garbey, M.; Sun, N.; Merla, A.; Pavlidis, I. Contact-free measurement of cardiac pulse based on the analysis of thermal energy. IEEE Trans. Biomed. Eng. 2007, 54, 1418–1427. [Google Scholar] [CrossRef] [PubMed]
  20. Docampo, G.N. Heart Rate Estimation Using Facial Video Information (Master Thesis). 2012. Available online: http://hdl.handle.net/2099.1/16616 (accessed on 28 June 2022).
  21. Balakrishnan, G.; Durand, F.; Guttag, J. Detecting pulse from head motions in video. In Proceedings of the 2013 IEEE Conference on Computer Vision and Pattern Recognition, Piscataway, NJ, USA, 23–28 June 2013; pp. 3430–3438. [Google Scholar]
  22. Lin, Y.C.; Chou, N.K.; Lin, G.Y.; Li, M.H.; Lin, Y.H. A Real-Time contactless pulse rate and motion status monitoring system based on complexion tracking. Sensors 2017, 17, 1490. [Google Scholar] [CrossRef]
  23. Ndahimana, D.; Kim, E. Measurement methods for physical activity and energy expenditure: A review. Clin. Nutr. Res. 2017, 6, 68–80. [Google Scholar] [CrossRef]
  24. Haque, M.A.; Irani, R.; Nasrollahi, K.; Thomas, M.B. Facial Video Based Detection of Physical Fatigue for Maximal Muscle Activity. IET Comput. Vis. 2016, 10, 323–330. [Google Scholar] [CrossRef]
  25. Chen, J.; Tao, Y.; Zhang, D.; Liu, X.; Fang, Z.; Zhou, Q.; Zhang, B. Fatigue detection based on facial images processed by difference algorithm. In Proceedings of the IASTED International Conference Biomedical Engineering, Innsbruck, Austria, 20–21 February 2017; BioMed: Innsbruck, Austria; pp. 1–4. [Google Scholar]
  26. Ekman, P.; Friesen, W.V.; Ancoli, S. Facial Signs of Emotional Experience. J. Personal. Soc. Psychol. 1980, 39, 1125–1134. [Google Scholar] [CrossRef]
  27. Khanal, S.R.; Sampaio, J.; Barroso, J.; Filipe, V. Classification of Physical Exercise Intensity Based on Facial Expression Using Deep Neural Network; Springer: Cham, Switzerland, 2019; Volume 11573, ISBN 9783030235628. [Google Scholar]
  28. Khanal, S.R.; Fonseca, A.; Marques, A.; Barroso, J.; Filipe, V. Physical exercise intensity monitoring through eye-blink and mouth’s shape analysis. In Proceedings of the TISHW 2018-2nd International Conference on Technology and Innovation in Sports, Health and Wellbeing, Thessaloniki, Greece, 20–22 June 2018. [Google Scholar]
  29. Khanal, S.R.; Barroso, J.; Sampaio, J.; Filipe, V. Classification of physical exercise intensity by using facial expression analysis. In Proceedings of the 2nd International Conference on Computing Methodologies and Communication, ICCMC 2018, Erode, India, 15–16 February 2018. [Google Scholar]
  30. Khanal, S.R.; Sampaio, J.; Barroso, J.; Filipe, V. Individual’s Neutral Emotional Expression Tracking for Physical Exercise Monitoring; Springer: Cham, Switzerland, 2020; Volume 12424, ISBN 9783030601164. [Google Scholar]
  31. Neagoe, V.E. An Optimum 2D Color Space for Pattern Recognition. In Proceedings of the 2006 International Conference on Image Processing, Computer Vision, & Pattern Recognition, Las Vegas, NV, USA, 26–29 June 2006; Volume 2. [Google Scholar]
  32. Tayal, Y.; Lamba, R.; Padhee, S. Automatic face detection using color based segmentation. Int. J. Sci. Publ. 2012, 2, 1–7. [Google Scholar]
  33. Rewar, E.; Lenka, S.K. Comparative analysis of skin color based models for face detection. Int. J. Signal Image Process. 2013, 4, 69–75. [Google Scholar] [CrossRef]
  34. Silva, S.M.; Jayawardana, M.W.; Meyer, D. Statistical methods to model and evaluate physical activity programs, using step counts: A systematic review. PLoS ONE 2018, 13, 206763. [Google Scholar] [CrossRef]
  35. Gang, K.Q.; Wu, Z.X.; Zhou, D.Y. Effects of hot air-drying process on lipid quality of whelks Neptunea arthritica cumingi Crosse and Neverita didyma. J. Food Sci. Technol. 2019, 56, 4166–4176. [Google Scholar] [CrossRef]
  36. Ivanov, Y. Adaptive moving object segmentation algorithms in cluttered environments. In Proceedings of the The Experience of Designing and Application of CAD Systems in Microelectronics, Lviv, Ukraine, 24–27 February 2015; pp. 97–99. [Google Scholar] [CrossRef]
  37. Tkachenko, R.; Tkachenko, P.; Izonin, I.; Tsymbal, Y. Learning-Based Image Scaling Using Neural-Like Structure of Geometric Transformation Paradigm. In Advances in Soft Computing and Machine Learning in Image Processing. Studies in Computational Intelligence; Hassanien, A., Oliva, D., Eds.; Springer: Cham, Switzerland, 2018; Volume 730. [Google Scholar]
  38. Staffini, A.; Svensson, T.; Chung, U.-i.; Svensson, A.K. Heart Rate Modeling and Prediction Using Autoregressive Models and Deep Learning. Sensors 2021, 22, 34. [Google Scholar] [CrossRef] [PubMed]
  39. Ni, A.; Azarang, A.; Kehtarnavaz, N. A Review of Deep Learning-Based Contactless Heart Rate Measurement Methods. Sensors 2021, 21, 3719. [Google Scholar] [CrossRef]
  40. Timme, S.; Brand, R. Affect and exertion during incremental physical exercise: Examining changes using automated facial action analysis and experiential self-report. PLoS ONE 2020, 15, 228739. [Google Scholar] [CrossRef]
  41. Ramirez, G.A.; Fuentes, O.; Crites, S.L.; Jimenez, M.; Ordonez, J. Color Analysis of Facial Skin: Detection of Emotional State. In Proceedings of the 2014 IEEE Conference on Computer Vision and Pattern Recognition Workshops, Columbus, OH, USA, 23–28 June 2014; pp. 468–473. [Google Scholar]
  42. Temko, A. Accurate heart rate monitoring during physical exercise using PPG. IEEE Trans. Biomed. Eng. 2017, 64, 2018–2024. [Google Scholar] [CrossRef]
  43. Jimenez, J.; Scully, T.; Barbosa, N.; Donner, C.; Alvarez, X.; Vierira, T.; Weyrich, T. A practical appearance model for dynamic facial color. ACM Trans. Graph. 2010, 29, 141. [Google Scholar] [CrossRef]
  44. Chandrappa, D.N.; Ravishankar, M.; Rameshbabu, D.R. Face detection in color images using skin color model algorithm based on skin color information. In Proceedings of the 2011 3rd International Conference on Electronics Computer Technology, Kanyakumari, India, 8–10 April 2011; pp. 254–258. [Google Scholar]
  45. Hasan, M.M.; Hossain, M.F.; Thakur, J.M. Podder Driver fatigue recognition using skin color modeling. Intern. J. Comput. Appl. 2014, 97, 34–41. [Google Scholar]
  46. Nanni, L.; Lumini, A.; Dominio, F.; Zanuttigh, P. Effective and precise face detection based on depth and color data. Appl. Comput. Inform. 2014, 10, 1–13. [Google Scholar] [CrossRef]
  47. Qerem, A. Face detection and recognition using fusion of color space. Int. J. Comput. Sci. Electron. Eng. 2016, 4, 12–16. [Google Scholar]
  48. Lajevardi, S.M.; Wu, H.R. Facial expression recognition in perceptual color space. IEEE Trans. Image Process. 2012, 21, 3721–3734. [Google Scholar] [CrossRef]
  49. Nakajima, K.; Minami, T.; Nakauchi, S. Interaction between facial expression and color. Sci. Rep. 2017, 7, 41019. [Google Scholar]
  50. Irani, R.; Nasrollahi, K.; Moeslund, T.B. Contactless measurement of muscle fatigue by tracking facial feature points in video. In Proceedings of the IEEE International Conference on Image Processing ICIP, Paris, France, 27–30 October 2014; pp. 4181–5186. [Google Scholar]
  51. Wu, B.F.; Lin, C.H.; Huang, P.W.; Lin, T.M.; Chung, M.L. A contactless sport training monitor based on facial expression and remote PPG. In Proceedings of the 2017 IEEE International Conference on Systems, Man and Cybernetics, Banff, AB, Canada, 5–8 October 2017; pp. 846–852. [Google Scholar]
  52. Lueangwattana, C.; Kondo, T.; Haneishi, H. A Comparative Study of video Signals for Non-contact heart rate measurement. In Proceedings of the 12th International Conference on Electrical Engineering/Electronics; Computer, Telecommunications and Information Technology (ECTI-CON), Hua Hin, Thailand, 24–27 June 2015. [Google Scholar]
  53. Wang, S.-J.; Yan, W.-J.; Li, X.; Zhao, G.; Fu, X. Micro-expression Recognition Using Dynamic Textures on Tensor Independent Color Space. In Proceedings of the 22nd International Conference on Pattern Recognition, Stockholm, Sweden, 24–28 August 2014; pp. 4678–4684. [Google Scholar]
  54. Wang, S.-J.; Yan, W.-J.; Li, X.; Zhao, G.; Zhou, C.-G.; Fu, X.; Tao, J. Micro-Expression Recognition Using Color Spaces. IEEE Trans. Image Process. 2015, 24, 6035–6049. [Google Scholar] [CrossRef]
  55. Perrett, D.I.; Talamas, S.N.; Cairns, P.; Henderson, A.J. Skin Color Cues to Human Health: Carotenoids, Aerobic Fitness, and Body Fat. Front. Psychol. 2020, 11, 392. [Google Scholar] [CrossRef]
  56. Poh, M.-Z.; McDuff, D.J.; Picard, R.W. Advancements in noncontact, multiparameter physiological measurements using webcam. IEEE Trans. Biomed. Eng. 2011, 58, 7–11. [Google Scholar] [CrossRef]
  57. More, A.V.; Wakankar, A.; Gawande, J.P. Automated heart rate measurement using wavelet analysis of face video sequences. In Innovations in Electronics and Communication Engineering; Saini, H.S., Sing, R.K., Patel, V.M., Santhi, K., Ranganayakulu, S.V., Eds.; Springer: Singapore, 2018; pp. 113–120. [Google Scholar]
  58. Tanaka, H.; Monahan, K.D.; Seals, D.R. Age-predicted maximal heart rate revisited. J. Am. Coll. Cardiol. 2000, 37, 153–156. [Google Scholar] [CrossRef]
  59. Orieux, F.; Giovannelli, J.; Rodet, T. Bayesian estimation of regularization and point spread function parameters for Wiener–Hunt deconvolution. J. Opt. Soc. Am. A 2010, 27, 1593–1607. [Google Scholar] [CrossRef] [PubMed]
  60. Viola, P.; Jones, M. Robust Real-time Object Detection. Int. J. Comput. Vis. 2004, 57, 137–154. [Google Scholar] [CrossRef]
  61. Hassan, M.A.; Malik, G.S.; Saad, N.; Karasfi, B.; Ali, Y.S.; Fofi, D. Optimal source selection for image photoplethysmography. In Proceedings of the IEEE International Instrumentation and Measurement Technology Conference Proceedings, Taipei, Taiwan, 23–26 May 2016; pp. 1–5. [Google Scholar]
  62. Rosado, C.M.; Jansz Rieken, C.; Spear, J. The effects of heart rate feedback on physical activity during treadmill exercise. Behav. Anal. Res. Pract. 2021, 21, 209–218. [Google Scholar] [CrossRef]
  63. Mikus, C.R.; Earnest, C.P.; Blair, S.N.; Church, T.S. Heart rate and exercise intensity during training: Observations from the DREW Study. Br. J. Sports Med. 2009, 43, 750–755. [Google Scholar] [CrossRef]
  64. Tran, D.L.; Kamaladasa, Y.; Munoz, P.A.; Kotchetkova, I.; D’Souza, M.; Celermajer, D.S.; Maiorana, A.; Cordina, R. Estimating exercise intensity using heart rate in adolescents and adults with congenital heart disease: Are established methods valid? Int. J. Cardiol. Congenit. Heart Dis. 2022, 8, 100362. [Google Scholar] [CrossRef]
  65. Paschos, G. Perceptually Uniform Color Spaces for Color Texture Analysis: An Empirical Evaluation. IEEE Trans. Image Process. 2001, 10, 932–937. [Google Scholar] [CrossRef]
  66. Sanchez-Cuevas, M.C.; Aguilar-Ponce, R.M.; Tecpanecatl-Xihuitl, J.L. A Comparison of Color Models for Color Face Segmentation. Procedia Technol. 2013, 7, 134–141. [Google Scholar] [CrossRef]
  67. Seidman, D.S.; Moise, J.; Ergaz, Z.; Laor, A.; Vreman, H.J.; Stevenson, D.K.; Gale, R. A prospective randomized controlled study of phototherapy using blue, and blue-green light-emitting devices, and conventional halogen-quartz phototherapy. J. Perinatol. 2003, 23, 123–127. [Google Scholar] [CrossRef]
Figure 1. Facial video and HR data collection at the time of physical exercise in stationary cycle-ergometer.
Figure 2. System block diagram of proposed model (ROI-Region of Interest).
Figure 3. (a) Face detection; (b) Patch location.
Figure 4. Filtered signals after smoothing operation.
Figure 5. Plot of the color intensity of the combined RGB color components, I = ((0.21 × R) + (0.72 × G) + (0.07 × B))/255 × 100, vs. normalized HR.
Table 1. Physiological information about the participants of the study. The participant ID is represented by P1, P2, etc. The initial heart rate was recorded when a participant started the exercise (after a 5-min warm-up), and the final HR (maximum heart rate of each participant) was recorded at the end of the exercise. HR: heart rate; bpm: beats per minute.
| Participant ID | Gender | Age (years) | Weight (kg) | Height (cm) | Initial HR (bpm) | Final HR (bpm) | Duration (mm:ss) |
|---|---|---|---|---|---|---|---|
| Participant 1 | Male | 22 | 64.2 | 172 | 93 | 191 | 9:30 |
| Participant 2 | Male | 33 | 66.9 | 177 | 70 | 180 | 16:00 |
| Participant 3 | Female | 19 | 64.2 | 177 | 87 | 191 | 9:05 |
| Participant 4 | Male | 36 | 83.2 | 182 | 91 | 191 | 15:00 |
| Participant 5 | Female | 24 | 66.6 | 170 | 97 | 184 | 9:20 |
| Participant 6 | Female | 29 | 47 | 157 | 114 | 193 | 8:00 |
| Participant 7 | Male | 33 | 83 | 186 | 101 | 180 | 16:00 |
| Participant 8 | Male | 22 | 90.7 | 195 | 101 | 201 | 12:00 |
| Participant 9 | Male | 24 | 87.3 | 194 | 115 | 188 | 11:00 |
Table 2. The Root Mean Square Error (RMSE) values using multivariate autoregression with lag 1 between different color models and maxHR%.
| Color | Sub1 | Sub2 | Sub3 | Sub4 | Sub5 | Sub6 | Sub7 | Sub8 | Sub9 | AVG |
|---|---|---|---|---|---|---|---|---|---|---|
| RGB | 0.31 | 0.35 | 0.42 | 0.5 | 0.46 | 0.35 | 0.54 | 0.25 | 0.24 | 0.275 |
| HSV | 0.31 | 0.33 | 0.38 | 0.35 | 0.38 | 0.29 | 0.51 | 0.22 | 0.2 | 0.255 |
| YCBCR | 0.33 | 0.37 | 0.46 | 0.42 | 0.43 | 0.36 | 0.45 | 0.29 | 0.31 | 0.32 |
| LAB | 0.32 | 0.34 | 0.42 | 0.45 | 0.39 | 0.42 | 0.59 | 0.3 | 0.21 | 0.265 |
| YUV | 0.32 | 0.38 | 0.41 | 0.48 | 0.48 | 0.43 | 0.61 | 0.23 | 0.3 | 0.31 |
Table 3. Root mean square error, F-values, R-square values for five color models and combination of all the models using a global polynomial regression model with degree three.
| Color Model | RMSE | F-Value | R-Square Value |
|---|---|---|---|
| RGB | 7.85 | F(3,6060) = 4633, p = 0.006 | 0.70 |
| HSV | 6.75 | F(3,6060) = 7360, p < 0.001 | 0.78 |
| YCBCR | 7.84 | F(3,6060) = 3839, p < 0.001 | 0.92 |
| LAB | 7.78 | F(3,6060) = 6905, p < 0.001 | 0.70 |
| YUV | 7.73 | F(3,6060) = 3651, p < 0.001 | 0.94 |
Table 4. Multi-collinearity test for all the independent variables of each color model, p-values and Variance Inflation Factor (VIF).
| Color Model | Component | p-Value | VIF |
|---|---|---|---|
| RGB | R | 0.0000 | 2.54 |
| | G | 0.0000 | 9.25 |
| | B | 0.0051 | 88.56 |
| HSV | H | 0.2796 | 1.27 |
| | S | 0.0000 | 1.04 |
| | V | 0.0000 | 1.19 |
| YCBCR | Y | 0.0000 | 3.56 |
| | Cb | 0.0000 | 3.48 |
| | Cr | 0.0000 | 5.14 |
| Lab | L | 0.0552 | 5.24 |
| | a | 0.0000 | 6.14 |
| | b | 0.0087 | 2.85 |
| YUV | Y | 0.0000 | 2.45 |
| | U | 0.0041 | 4.15 |
| | V | 0.0000 | 5.32 |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
