Proceeding Paper

Enhancing Driver Safety: Real-Time Eye Detection for Drowsiness Prevention Driver Assistance Systems †

by Zainah Md. Zain *, Mohd Shahril Roseli and Nurul Athirah Abdullah
Faculty of Electrical and Electronics Engineering, Universiti Malaysia Pahang Al-Sultan Abdullah, Pekan 26600, Pahang, Malaysia
* Author to whom correspondence should be addressed.
Presented at the 8th International Electrical Engineering Conference, Karachi, Pakistan, 25–26 August 2023.
Eng. Proc. 2023, 46(1), 39; https://doi.org/10.3390/engproc2023046039
Published: 13 October 2023
(This article belongs to the Proceedings of The 8th International Electrical Engineering Conference)

Abstract

Drowsiness has become a significant contributing factor to traffic accidents in modern times, posing a major concern to society. Driver fatigue or sleepiness leads to decreased reaction time, diminished attention, and compromised decision-making abilities, thereby affecting the overall driving experience. This paper addresses this issue by proposing a drowsiness detection system based on image processing, utilizing a cascade of classifiers built on Haar-like features. The system effectively detects the eyes, allowing for determination of eye closure or opening, which serves as an indicator of driver drowsiness.

1. Introduction

Over the past few years, a significant number of electronic driving assistance systems have been developed and implemented with the primary goal of ensuring road safety, in direct response to the alarming prevalence of traffic accidents [1]. Traffic accidents are caused predominantly by driver error. In response to the rise in fatal accidents, there has been a growing global aspiration for roads free from vehicle-related incidents. According to the Road Safety Report published by the World Health Organization, Malaysia has recorded one of the highest fatality rates, with 25 deaths per 100,000 people, well above the regional average of 17.9 deaths per 100,000 people [2]. With road safety being an absolute necessity, the implementation of Advanced Driver Assistance Systems (ADAS) has become paramount.
Various types of equipment are utilized in ADAS, including physical sensors, locators, ultrasonic devices, phonic mixer devices, cameras, and night vision tools. Algorithms are employed to ensure safety by considering factors such as traffic conditions, hazardous situations, and weather conditions, with the primary objective of avoiding potentially dangerous situations. Additionally, an on-board driver assistance system plays a crucial role in monitoring driver attentiveness, alertness, and fatigue. These systems implement diverse sensing modalities, with computer vision emerging as a vital solution [3].

2. Drowsiness Detection System

Driver drowsiness detection stands out as a significant category among Advanced Driver Assistance Systems, primarily because of the strong correlation between driver drowsiness and car accidents. Driving while drowsy or sleepy is one of the primary causes of road accidents and the resulting financial losses [4]. Driver fatigue leads to a loss of concentration that significantly impairs the driver's ability to make timely and effective decisions [5]. As per the Road Safety Web Publication, around 20% of car accidents can be attributed to driver fatigue [6]. Consequently, monitoring driver fatigue contributes to enhancing both driver and vehicle safety.
In the last decade, researchers have dedicated efforts to developing driver drowsiness supervision systems. Various methods have been introduced, such as the feature-based approach, which analyzes the driver's facial images to detect drowsiness indicators like eye blinks, yawning, or head movements. These techniques can be implemented through diverse approaches. Türkan et al. [7] successfully implemented edge projection with wavelet domain image classification for face detection. Fletcher et al. [8] introduced another technique based on the measurement of the percentage of eyelid closure, which calculates the ratio of frames with closed eyes to the total eye frames, providing an indication of drowsiness. Skin color can also be employed as a feature in face detection: Alshaqaqi et al. [9] presented a system that used the color of human skin to identify regions containing faces. More recently, wearable smart glasses have been incorporated for drowsiness detection. Chen et al. [10,11] proposed a fatigue-drowsiness detection system utilizing wearable smart glasses integrated with an in-vehicle smart system; these computer-based glasses can detect the drowsiness or fatigue state of the driver. Viola and Jones [12] introduced Haar feature-based classifiers, which have become a prevalent technique for detecting facial and eye features in drowsiness detection systems due to their effectiveness and efficiency. Significant research efforts have been dedicated to driver drowsiness detection, with the objective of uncovering the most effective and efficient system solutions.

3. Drowsiness System Development

The block diagram depicting the proposed system is presented in Figure 1. Raspberry Pi 3 serves as the main microcontroller for this system. The system is powered by a 5 V Micro USB connected to the car adapter. The Raspberry Pi NoIR Camera V2 is linked to the main controller via a camera serial interface. The main controller directly connects to the Touch Screen Display and active buzzer, which serve as system outputs. To store data, a 32 GB SD Card is utilized in this system.
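As an illustration, the alarm output can be driven directly from the Raspberry Pi's GPIO. The following minimal Python sketch assumes the active buzzer is wired to BCM pin 17; the actual pin assignment is not specified in the paper and should be adapted to the wiring shown in Figure 4.

import time
import RPi.GPIO as GPIO

BUZZER_PIN = 17  # assumed BCM pin; adapt to the wiring shown in Figure 4

GPIO.setmode(GPIO.BCM)
GPIO.setup(BUZZER_PIN, GPIO.OUT, initial=GPIO.LOW)

def sound_alarm(duration_s=2.0):
    # Drive the active buzzer high for the given duration, then silence it.
    GPIO.output(BUZZER_PIN, GPIO.HIGH)
    time.sleep(duration_s)
    GPIO.output(BUZZER_PIN, GPIO.LOW)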

3.1. Flowchart of the System

The process of the Eye Detection System for Driver Assistance begins by setting up and turning on the camera, as depicted in Figure 2 and Figure 3. Subsequently, the Haar cascade method is loaded to detect the presence of the driver's eyes. If the driver's eyes are detected, Dlib's facial landmark predictor is utilized to identify 68 salient points and draw a rectangular shape around the eyes to capture the eye area.
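A minimal Python sketch of this detection step is given below, assuming OpenCV's bundled frontal-face Haar cascade and dlib's publicly available 68-point landmark model; the model file names and the choice of the first detected face are illustrative assumptions, not details taken from the paper.

import cv2
import dlib

# Assumed model files: OpenCV's bundled Haar cascade and dlib's 68-point predictor.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

LEFT_EYE = range(42, 48)   # indices of the left-eye landmarks in the 68-point model
RIGHT_EYE = range(36, 42)  # indices of the right-eye landmarks

def detect_eye_landmarks(frame):
    # Return (left_eye, right_eye) landmark coordinates, or None if no face is found.
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = faces[0]
    shape = predictor(gray, dlib.rectangle(x, y, x + w, y + h))
    points = [(shape.part(i).x, shape.part(i).y) for i in range(68)]
    return [points[i] for i in LEFT_EYE], [points[i] for i in RIGHT_EYE]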
Next, the Eye Aspect Ratio (EAR) algorithm is applied to extract the eye regions, which are then displayed on the LCD touchscreen. At this point, the user has the option to exit the system by using the power-off button on the touchscreen, thereby ending the process. If the user chooses to continue using the device, the process restarts from point A, searching for the driver's eyes with the Haar cascade method.
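The paper does not state the EAR formula explicitly; the sketch below assumes the widely used definition based on the six eye landmarks p1 to p6, EAR = (||p2 - p6|| + ||p3 - p5||) / (2 ||p1 - p4||).

from math import dist  # Python 3.8+

def eye_aspect_ratio(eye):
    # eye: the six (x, y) landmark points p1..p6 of one eye, in order.
    p1, p2, p3, p4, p5, p6 = eye
    vertical = dist(p2, p6) + dist(p3, p5)   # the two vertical eyelid distances
    horizontal = 2.0 * dist(p1, p4)          # twice the horizontal eye width
    return vertical / horizontal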
During the sub-process labeled as B, the camera continuously monitors the EAR of the driver's eyes and counts the consecutive frames in which the EAR falls below the set threshold. If this frame count is below the required number of consecutive frames, the process returns to point A. However, if the frame count reaches the specified consecutive-frame threshold, the alarm is triggered, and the process then loops back to point A. This entire process continues until the device is turned off or shut down.
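A minimal sketch of this counting logic follows; the EAR threshold of 0.3 matches the value selected in Section 4.1, while the number of consecutive frames is an assumed value, since the paper does not report it.

EAR_THRESHOLD = 0.3   # threshold chosen in Section 4.1
CONSEC_FRAMES = 20    # assumed consecutive-frame count; not reported in the paper

def update_drowsiness(ear, counter):
    # One step of sub-process B: returns (updated counter, alarm flag) per frame.
    if ear < EAR_THRESHOLD:
        counter += 1
        if counter >= CONSEC_FRAMES:
            return 0, True    # trigger the buzzer, then return to point A
        return counter, False
    return 0, False           # eye is open again: reset the counter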

3.2. Electrical Wiring Connection

The connections between each component are illustrated in Figure 4.

4. Experimental and Data Analysis

A system prototype, depicted in Figure 5, has been successfully developed. It was designed as a portable device that can be easily installed in various vehicle models. The core concept of this system revolves around real-time video monitoring of the driver’s eyes using a camera, enabling reliable measurement of driver drowsiness levels.

4.1. Eye Aspect Ratio (EAR) Threshold

Figure 6 shows eye detection in the normal eye state and in the below-threshold condition. Multiple threshold values ranging from 0.1 to 0.4 were tested. A blink is recorded when the eye aspect ratio drops below a specific threshold and subsequently rises above it. No alarm is detected for thresholds of 0.1 and 0.2, whereas an alarm is detected at the preferred threshold of 0.3, as indicated in Table 1.
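For reference, a blink can be counted with simple hysteresis over the per-frame EAR signal, as in the hypothetical sketch below: a blink is registered only once the ratio has dropped below the threshold and then risen back above it.

def count_blinks(ear_series, threshold=0.3):
    # Count blinks in a sequence of per-frame EAR values.
    blinks, below = 0, False
    for ear in ear_series:
        if ear < threshold:
            below = True      # the eye is currently closed
        elif below:
            blinks += 1       # EAR rose back above the threshold: one blink
            below = False
    return blinks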

4.2. Detection Analysis

The detection analysis involved testing different subjects at different camera distances, as presented in Table 2 and Table 3. The subject variables were gender, age, race, and whether the subject wore glasses. From Table 2, the optimum distance for the camera to capture the driver was determined. For drivers wearing glasses, the detection time exhibited some errors due to reflections from the glasses, whereas for subjects without glasses, the detection time was relatively consistent across different ages, genders, and races. A variation in eye aspect ratio was observed, primarily attributable to differences in eye size among the subjects.
In summary, the novelty of this paper lies in its extensive evaluation of the drowsiness detection system, taking into account variables such as gender, age, race, and glasses-wearing status. This contributes to a more comprehensive understanding of the system’s performance under diverse subject characteristics and enhances its applicability in real-world scenarios.

5. Conclusions

Our eye detection system developed for driver assistance has demonstrated its capability to promptly detect drowsiness, aligning with the objectives of this project. By leveraging the Eye Aspect Ratio (EAR), the system effectively distinguishes between normal eye blinking and drowsiness, thereby preventing the driver from entering a state of sleepiness. Through analysis, it was determined that the optimal distance for accurate detection falls within the range of 40 cm to 60 cm from the camera. However, it is worth noting that the system experienced challenges in accurately detecting drowsiness for subjects wearing glasses, likely due to reflections from the glasses. Interestingly, the analysis revealed that the races and ages of the drivers did not significantly impact the EAR settings.

Author Contributions

Conceptualization, M.S.R. and Z.M.Z.; methodology, M.S.R.; software, M.S.R. and N.A.A.; validation, M.S.R. and Z.M.Z.; formal analysis, M.S.R.; investigation, M.S.R. and Z.M.Z.; resources, Z.M.Z.; data curation, M.S.R. and Z.M.Z.; writing—original draft preparation, Z.M.Z. and M.S.R.; writing—review and editing, Z.M.Z.; visualization, N.A.A.; supervision, Z.M.Z.; funding acquisition, Z.M.Z. All authors have read and agreed to the published version of the manuscript.

Funding

This research was supported by the Centre of Automotive, Universiti Malaysia Pahang (UMP), under grant RDU1803189 and the UMP Fundamental Grant Scheme RDU220364.

Institutional Review Board Statement

This study did not require ethical approval.

Informed Consent Statement

This study did not require ethical approval.

Data Availability Statement

Not applicable.

Acknowledgments

The authors thank Rozina Abdul Rani and Shamsiah Suhaili for their help.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Tigadi, A.; Gujanatti, R.; Gonchi, A.; Klemsscet, B. Advanced driver assistance systems. Int. J. Eng. Res. Gen. Sci. 2016, 4, 151–158.
  2. World Health Organization. Global Status Report on Road Safety 2017; WHO Press: Paris, France, 2017.
  3. Nowosielski, A. Vision-based solutions for driver assistance. J. Theor. Appl. Comput. Sci. 2014, 8, 35–44.
  4. Charniya, N.N.; Nair, V.R. Drunk driving and drowsiness detection. In Proceedings of the 2017 International Conference on Intelligent Computing and Control (I2C2), Coimbatore, India, 23–24 June 2017; pp. 1–6.
  5. Krishnasree, V.; Balaji, N.; Sudhakar, R.P. A real time improved driver fatigue monitoring system. WSEAS Trans. Signal Process. 2014, 10, 146–155.
  6. Jackson, P.; Hilditch, C.; Holmes, A.; Reed, N.; Merat, N.; Smith, L. Fatigue and Road Safety: A Critical Analysis of Recent Evidence; Road Safety Web Publication 21; Department for Transport: London, UK, 2011.
  7. Türkan, M.; Onaran, I.; Çetin, A.E. Human face detection in video using edge projections. In Proceedings of the 14th European Signal Processing Conference (EUSIPCO 2006), Florence, Italy, 4–8 September 2006; pp. 1–5.
  8. Fletcher, L.; Petersson, L.; Zelinsky, A. Driver assistance systems based on vision in and out of vehicles. In Proceedings of the IEEE IV 2003 Intelligent Vehicles Symposium, Columbus, OH, USA, 9–11 June 2003; pp. 322–327.
  9. Alshaqaqi, B.; Baquhaizel, A.S.; Ouis, M.E.A.; Boumehed, M.; Ouamri, A.; Keche, M. Vision based system for driver drowsiness detection. In Proceedings of the 2013 11th International Symposium on Programming and Systems (ISPS), Algiers, Algeria, 22–24 April 2013; pp. 103–108.
  10. Chen, L.B.; Chang, W.J.; Su, J.P.; Ciou, J.Y.; Ciou, Y.J.; Kuo, C.C.; Li, K.S.M. A wearable-glasses-based drowsiness-fatigue-detection system for improving road safety. In Proceedings of the 2016 IEEE 5th Global Conference on Consumer Electronics, Kyoto, Japan, 11–14 October 2016; pp. 1–2.
  11. Chen, L.B.; Chang, W.J.; Hu, W.W.; Wang, C.K.; Lee, D.H.; Chiou, Y.Z. A band-pass IR light photodetector for wearable intelligent glasses in a drowsiness-fatigue-detection system. In Proceedings of the 2018 IEEE International Conference on Consumer Electronics (ICCE), Las Vegas, NV, USA, 12–18 January 2018; pp. 1–2.
  12. Viola, P.; Jones, M.J. Robust real-time face detection. Int. J. Comput. Vis. 2004, 57, 137–154.
Figure 1. System block diagram.
Figure 2. Flowchart of the system.
Figure 3. Flowchart of the system when alarm activated.
Figure 4. System wiring diagram.
Figure 5. The developed prototype for drowsiness detection.
Figure 6. Eye detection. (a) Normal eye state; (b) aspect ratio below threshold.
Table 1. Threshold test results.

Threshold | Open EAR | Close EAR | Alarm
0.1 | 0.341 | 0.181 | No
0.2 | 0.312 | 0.187 | No
0.3 | 0.32 | 0.2 | Yes
0.4 | 0.326 | immediate | Yes
Table 2. Variable subject versus variable distance results.

Distance | Subject | Open EAR | Close EAR | Alarm Detection Time (s)
40 cm | A | 0.33 | 0.129 | 2.8
40 cm | B | 0.232 | 0.1 | 0
40 cm | C | 0.3 | 0.159 | 2
40 cm | D | 0.246 | 0.084 | 0
40 cm | E | 0.312 | 0.162 | 2.6
50 cm | A | 0.365 | 0.2 | 2.69
50 cm | B | 0.2 | 0.071 | 0
50 cm | C | 0.32 | 0.129 | 2.21
50 cm | D | 0.221 | 0.11 | 0
50 cm | E | 0.32 | 0.181 | 2.61
60 cm | A | 0.351 | 0.189 | 2.6
60 cm | B | 0.243 | 0.093 | 0
60 cm | C | 0.318 | 0.17 | 2.5
60 cm | D | 0.237 | 0.08 | 0
60 cm | E | 0.327 | 0.16 | 2.8
Table 3. Subject details.

Subject | Gender | Age | Race | Wearing Glasses
A | Male | 39 | Malay | No
B | Male | 28 | Indian | Yes
C | Male | 27 | Chinese | No
D | Female | 35 | Malay | Yes
E | Female | 33 | Malay | No
