Eye-Tracking Technologies: Theory, Methods and Applications

A special issue of Applied Sciences (ISSN 2076-3417). This special issue belongs to the section "Computing and Artificial Intelligence".

Deadline for manuscript submissions: closed (31 December 2023) | Viewed by 20862

Special Issue Editors


Guest Editor
College of Natural Sciences, University of Rzeszow, Pigonia St. 1, 35-959 Rzeszow, Poland
Interests: eye tracking; image processing; neural networks with fractional derivative; pilot attention analysis; control; spacecraft formation; state estimation; scheduling of discrete production processes; control algorithms

Guest Editor
The Faculty of Mechanical Engineering and Aeronautics, Rzeszow University of Technology, 35-959 Rzeszów, Poland
Interests: aircraft systems; vision systems; flight simulators; eye tracking; HMI systems; image processing; neural networks; control

Guest Editor
Faculty of Electrical Engineering, Automatics, Computer Science, and Biomedical Engineering, AGH University of Science and Technology in Krakow, 30-059 Krakow, Poland
Interests: scheduling of discrete production processes; control algorithms; neural networks; control; knowledge base; multistage decision process; 3-D scenery analysis

Guest Editor
College of Natural Sciences, University of Rzeszow, Pigonia St. 1, 35-959 Rzeszow, Poland
Interests: eye tracking; image processing; human–computer interaction; state estimation; ontology-based solvers; control algorithms

Special Issue Information

Dear Colleagues,

Understanding the thought processes that accompany a person observing and recognising the surrounding scene is an extremely interesting area of scientific study. In particular, this rapidly developing area includes methods and tools for registering and analysing attention, which are finding new and innovative applications in modern industry.

To this end, various mathematical models are being developed to resolve the geometry of the scene around the observer, together with intelligent methods for continuously tracking attention and recognising human interaction with the environment; these form the basis for advances in neuroscience and in the theory of perception and cognition.

This Special Issue will be devoted to new solutions and prospects for the further development of methods and tools used in modern eye tracking applications.

Due to the interdisciplinary nature of eye tracking research, we invite articles from all areas related to modern eye tracking applications, including new technologies, the mathematical models underlying them, and their applications.

Dr. Zbigniew Gomolka
Dr. Damian Kordos
Prof. Dr. Ewa Dudek-Dyduch
Dr. Bogusław Twaróg
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles as well as short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Applied Sciences is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2400 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • eye tracking
  • image processing
  • neural networks
  • object tracking
  • pilot attention
  • gaze tracking systems
  • eye tracking applications
  • gaze-based interaction
  • eye movement data analysis

Published Papers (12 papers)


Research

Jump to: Review

23 pages, 3571 KiB  
Article
Studying the Role of Visuospatial Attention in the Multi-Attribute Task Battery II
by Daniel Gugerell, Benedikt Gollan, Moritz Stolte and Ulrich Ansorge
Appl. Sci. 2024, 14(8), 3158; https://doi.org/10.3390/app14083158 - 09 Apr 2024
Viewed by 307
Abstract
Task batteries mimicking user tasks are of high heuristic value. Supposedly, they measure individual human aptitude regarding the task in question. However, less is often known about the underlying mechanisms or functions that account for task performance in such complex batteries. This is also true of the Multi-Attribute Task Battery (MATB-II). The MATB-II is a computer display task. It aims to measure human control operations on a flight console. Using the MATB-II and a visual-search task measure of spatial attention, we tested if capture of spatial attention in a bottom-up or top-down way predicted performance in the MATB-II. This is important to understand for questions such as how to implement warning signals on visual displays in human–computer interaction and for what to practice during training of operating with such displays. To measure visuospatial attention, we used both classical task-performance measures (i.e., reaction times and accuracy) as well as novel unobtrusive real-time pupillometry. The latter was done as pupil size covaries with task demands. A large number of analyses showed that: (1) Top-down attention measured before and after the MATB-II was positively correlated. (2) Test-retest reliability was also given for bottom-up attention, but to a smaller degree. As expected, the two spatial attention measures were also negatively correlated with one another. However, (3) neither of the visuospatial attention measures was significantly correlated with overall MATB-II performance, nor with (4) any of the MATB-II subtask performance measures. The latter was true even if the subtask required visuospatial attention (as in the system monitoring task of the MATB-II). (5) Neither did pupillometry predict MATB-II performance, nor performance in any of the MATB-II’s subtasks. Yet, (6) pupil size discriminated between different stages of subtask performance in system monitoring. 
This finding indicated that temporal segregation of pupil size measures is necessary for their correct interpretation, and that caution is advised regarding average pupil-size measures of task demands across tasks and time points within tasks. Finally, we observed surprising effects of workload (or cognitive load) manipulation on MATB-II performance itself, namely, better performance under high- rather than low-workload conditions. The latter findings imply that the MATB-II itself poses a number of questions about its underlying rationale, besides allowing occasional usage in more applied research. Full article
(This article belongs to the Special Issue Eye-Tracking Technologies: Theory, Methods and Applications)

13 pages, 20825 KiB  
Article
Video-Based Gaze Detection for Oculomotor Abnormality Measurements
by Eran Harpaz, Rotem Z. Bar-Or, Israel Rosset and Edmund Ben-Ami
Appl. Sci. 2024, 14(4), 1519; https://doi.org/10.3390/app14041519 - 13 Feb 2024
Viewed by 1211
Abstract
Measuring oculomotor abnormalities in human subjects is challenging due to the delicate spatio-temporal nature of the oculometric measures (OMs) used to assess eye movement abilities. Some OMs require a gaze estimation accuracy of less than 2 degrees and a sample rate that enables the detection of movements lasting less than 100 ms. While past studies and applications have used dedicated and limiting eye tracking devices to extract OMs, recent advances in imaging sensors and computer vision have enabled video-based gaze detection. Here, we present a self-calibrating neural network model for gaze detection that is suitable for oculomotor abnormality measurement applications. The model considers stimuli target locations while the examined subjects perform visual tasks and calibrate its gaze estimation output in real time. The model was validated in a clinical trial and achieved an axial accuracy of 0.93 degrees and 1.31 degrees for horizontal and vertical gaze estimation locations, respectively, as well as an absolute accuracy of 1.80 degrees. The performance of the proposed model enables the extraction of OMs using affordable and accessible setups—such as desktop computers and laptops—without the need to restrain the patient’s head or to use dedicated equipment. This newly introduced approach may significantly ease patient burden and improve clinical results in any medical field that requires eye movement measurements. Full article
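
A minimal sketch of how the axial (horizontal/vertical) and absolute accuracy figures quoted above relate; the function and the sample gaze angles are illustrative, not taken from the study:

```python
import math

def gaze_errors(estimates, targets):
    """Mean axial (horizontal, vertical) and absolute gaze errors in degrees.

    estimates, targets: sequences of (azimuth, elevation) angles in degrees.
    """
    dh = [abs(e[0] - t[0]) for e, t in zip(estimates, targets)]
    dv = [abs(e[1] - t[1]) for e, t in zip(estimates, targets)]
    dabs = [math.hypot(e[0] - t[0], e[1] - t[1]) for e, t in zip(estimates, targets)]
    n = len(dh)
    return sum(dh) / n, sum(dv) / n, sum(dabs) / n

# Illustrative values only:
est = [(0.5, -0.2), (10.8, 4.9), (-5.1, 0.3)]
tgt = [(0.0, 0.0), (10.0, 5.0), (-4.0, 0.0)]
h_err, v_err, abs_err = gaze_errors(est, tgt)
```

The absolute (Euclidean) error is always at least as large as either axial error, which is why the reported 1.80 degrees exceeds both axial figures.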

18 pages, 7800 KiB  
Article
Differences between Experts and Novices in the Use of Aircraft Maintenance Documentation: Evidence from Eye Tracking
by Florence Paris, Remy Casanova, Marie-Line Bergeonneau and Daniel Mestre
Appl. Sci. 2024, 14(3), 1251; https://doi.org/10.3390/app14031251 - 02 Feb 2024
Viewed by 716
Abstract
Maintenance is a highly procedural activity requiring motor and cognitive engagement. The aim of this experimental study was to examine how expertise affects maintenance tasks, in particular the use of procedural documents. A total of 22 aircraft maintenance technicians were divided into two groups according to their level of expertise. Helicopter maintenance was evaluated in a real work environment using an eye tracker, a fixed camera, and the NASA-TLX to measure workload. Both groups reported a high mental load. Novices showed elevated levels of effort and mental demand. Experts were faster at all levels of the task and spent less time consulting maintenance documentation. The acquisition of procedural information was greater at the start of the task, where the gap between groups was more pronounced; this may be related to the overall planning of the task. In addition, the task was atomized, with frequent back-and-forth between execution and information intake, for all participants. Novices had a longer document consultation duration, spread over a greater number of consultations, but did not have a higher average consultation time. The results indicate a higher mental load for novices, potentially linked to an increased atomization of the task, as shown by the frequency of consultations. Full article

18 pages, 1281 KiB  
Article
The Impact of Reading Modalities and Text Types on Reading in School-Age Children: An Eye-Tracking Study
by Wi-Jiwoon Kim, Seo Rin Yoon, Seohyun Nam, Yunjin Lee and Dongsun Yim
Appl. Sci. 2023, 13(19), 10802; https://doi.org/10.3390/app131910802 - 28 Sep 2023
Viewed by 991
Abstract
This study examined the eye movement patterns of 317 elementary students across reading conditions (audio-assisted reading (AR) and reading-only (R)) and text types (fiction and non-fiction) and identified eye movement parameters that predict their literal comprehension (LC) and inferential comprehension (IC). Participants, randomly assigned to either reading condition and either text type, answered questions assessing their LC and IC. Average fixation duration (AFD), total fixation duration (TFD), and scanpath length were used as eye movement parameters. The main effects of age were observed on all parameters, along with interaction effects between age and reading condition on TFD and scanpath length. These results indicate that children employ different reading strategies, depending on reading modalities and text types. When controlling for age, TFD had a positive impact on the LC of both text types in the AR, while in the R, it had a negative effect on the IC of both text types. Longer scanpaths predicted the IC of fiction in the AR; the LC and IC of non-fiction under the AR; and the LC of non-fiction within the R. AFD had a negative influence on the IC of fiction in the AR, as well as on the LC and IC of non-fiction in the AR, and the LC of non-fiction under the R. These findings highlight the importance of selecting appropriate reading strategies, based on reading modality and text type, to enhance reading comprehension. This study offers guidance for educators when providing reading instruction to school-age children. Full article
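
The eye movement parameters named above (AFD, TFD, scanpath length) are simple aggregates of fixation events; a minimal sketch, with illustrative data rather than values from the study:

```python
import math

def fixation_stats(durations_ms):
    """Total (TFD) and average (AFD) fixation duration from per-fixation durations in ms."""
    tfd = sum(durations_ms)
    afd = tfd / len(durations_ms) if durations_ms else 0.0
    return tfd, afd

def scanpath_length(points):
    """Summed Euclidean distance between consecutive fixation locations (e.g., in px)."""
    return sum(math.dist(a, b) for a, b in zip(points, points[1:]))

# Illustrative fixation record:
tfd, afd = fixation_stats([180, 220, 260, 200])
path = scanpath_length([(0, 0), (3, 4), (10, 4)])
```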

14 pages, 4302 KiB  
Article
Exploring the Potential of Event Camera Imaging for Advancing Remote Pupil-Tracking Techniques
by Dongwoo Kang, Youn Kyu Lee and Jongwook Jeong
Appl. Sci. 2023, 13(18), 10357; https://doi.org/10.3390/app131810357 - 15 Sep 2023
Viewed by 1641
Abstract
Pupil tracking plays a crucial role in various applications, including human–computer interaction, biometric identification, and autostereoscopic three-dimensional (3D) displays, such as augmented reality (AR) 3D head-up displays (HUDs). This study aims to explore and compare advancements in pupil-tracking techniques using event camera imaging. Event cameras, also known as neuromorphic cameras, offer unique benefits, such as high temporal resolution and low latency, making them well suited for capturing fast eye movements. For our research, we selected fast classical machine-learning-based computer vision techniques to develop our remote pupil tracking using event camera images. Our proposed pupil tracker combines local-binary-pattern-features-based eye–nose detection with supervised-descent-method-based eye–nose alignment. We evaluate the performance of event-camera-based techniques in comparison to traditional frame-based approaches to assess their accuracy, robustness, and potential for real-time applications. Consequently, our event-camera-based pupil-tracking method achieved a detection accuracy of 98.1% and a tracking accuracy (pupil precision < 10 mm) of 80.9%. The findings of this study contribute to the field of pupil tracking by providing insights into the strengths and limitations of event camera imaging for accurate and efficient eye tracking. Full article

25 pages, 3327 KiB  
Article
Investigating the Effect of Outdoor Advertising on Consumer Decisions: An Eye-Tracking and A/B Testing Study of Car Drivers’ Perception
by Radovan Madlenak, Roman Chinoracky, Natalia Stalmasekova and Lucia Madlenakova
Appl. Sci. 2023, 13(11), 6808; https://doi.org/10.3390/app13116808 - 03 Jun 2023
Cited by 2 | Viewed by 5303
Abstract
This study aims to investigate the impact of outdoor advertising on consumer behaviour by using eye-tracking analysis while drivers travel specific routes in Žilina, Slovakia. This research combines questionnaire inquiry and A/B testing to assess the conscious and subconscious effects of outdoor advertising on consumer decisions. The findings of this study have important implications for businesses providing outdoor advertising spaces, as well as those using outdoor advertising as a form of advertisement. Additionally, the study provides insights into the role of transportation background and how it influences consumer behaviour in relation to outdoor advertising. Full article

12 pages, 2183 KiB  
Article
Evaluation of an Eye-Tracking-Based Method for Assessing the Visual Performance with Progressive Lens Designs
by Pablo Concepcion-Grande, Eva Chamorro, José Miguel Cleva, José Alonso and Jose A. Gómez-Pedrero
Appl. Sci. 2023, 13(8), 5059; https://doi.org/10.3390/app13085059 - 18 Apr 2023
Cited by 2 | Viewed by 1862
Abstract
Due to the lack of sensitivity of visual acuity (VA) measurement to quantify differences in visual performance between progressive power lenses (PPLs), in this study we propose and evaluate an eye-tracking-based method to assess visual performance when wearing PPLs. A wearable eye-tracker system (Tobii Pro Glasses 3) recorded the pupil position of 27 PPL users at near and distance vision during a VA test while wearing three PPL designs: a PPL for general use (PPL-Balance), a PPL optimized for near vision (PPL-Near), and a PPL optimized for distance vision (PPL-Distance). The participants were asked to recognize eye charts at both near and distance vision using centered and oblique gaze directions with each PPL design. The results showed no statistically significant differences between PPLs for VA. However, significant differences in eye-tracking parameters were observed between PPLs. PPL-Distance had a lower test duration, complete fixation time, and number of fixations in the distance evaluation, and PPL-Near had a lower test duration, complete fixation time, and number of fixations for near vision. In conclusion, the quality of vision with PPLs can be better characterized by incorporating eye movement parameters than by the traditional evaluation method. Full article

19 pages, 2137 KiB  
Article
The Effect of 3D TVs on Eye Movement and Motor Performance
by Chiuhsiang Joe Lin, Retno Widyaningrum and Yogi Tri Prasetyo
Appl. Sci. 2023, 13(4), 2656; https://doi.org/10.3390/app13042656 - 18 Feb 2023
Cited by 1 | Viewed by 1602
Abstract
Three-dimensional TVs have been commercialized in recent years; however, poor visual and motor performance may affect consumer acceptance of 3D TVs. The purpose of this study was to investigate the effects of 3D TVs on eye movement and motor performance; specifically, the effects of the stereoscopic display parallax of 3D TVs and of the movement task index of difficulty (ID) on both eye movement and motor performance were investigated. Twelve participants voluntarily took part in a multi-directional tapping task under two viewing environments (2D TV and 3D TV), three levels of stereoscopic depth (140, 190, and 210 cm), and six index of difficulty levels (2.8, 3.3, 3.7, 4.2, 5.1, and 6.1 bits). The study revealed that the environment had significant effects on eye movement time, index of eye performance, eye fixation accuracy, number of fixations, time to first fixation, saccadic duration, revisited fixation duration, hand movement time, index of hand performance, and error rate. Interestingly, there were no significant effects of stereoscopic depth on eye movement or motor performance; however, the best performance was found when the 3D object was placed at 210 cm. The main novelty and contribution of this study is the in-depth investigation of the effect of 3D TVs on eye movement and motor performance. The findings could lead to a better understanding of visual and motor performance with 3D TVs. Full article
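
The index of difficulty levels listed above (2.8 to 6.1 bits) are consistent with the Shannon formulation of Fitts' law, ID = log2(D/W + 1); a minimal sketch under that assumption, with illustrative distance/width pairs rather than the study's actual target geometry:

```python
import math

def index_of_difficulty(distance, width):
    """Shannon formulation of Fitts' index of difficulty, in bits."""
    return math.log2(distance / width + 1)

# Illustrative target geometries (distance and width in the same units):
easy = index_of_difficulty(60, 10)    # D/W = 6  -> log2(7)  ~ 2.81 bits
hard = index_of_difficulty(170, 10)   # D/W = 17 -> log2(18) ~ 4.17 bits
```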

23 pages, 4970 KiB  
Article
Eye-Tracking Investigation of the Train Driver's Attention: A Case Study
by Radovan Madlenak, Jaroslav Masek, Lucia Madlenakova and Roman Chinoracky
Appl. Sci. 2023, 13(4), 2437; https://doi.org/10.3390/app13042437 - 14 Feb 2023
Cited by 2 | Viewed by 1683
Abstract
This article investigates the utilization of eye-tracking methodology to monitor the driver's activities and attention during the arrival and departure procedures of train operations on Slovak Railways (ŽSR) line no. 120, Bratislava–Žilina. Previous studies conducted in 2020 formed the basis of the current research, which focused on two train stations and two railway stops located on the Žilina–Púchov track section. The results of the experiment allowed for a greater understanding of the driver's cognitive processes, thereby leading to increased safety and sustainability in the railway transport system. It is noteworthy that the employed measurement methodology and technology had no detrimental effect on train operation or on operational, and thus passenger, safety. The results of this experiment therefore provide a sound foundation for further exploration of human–machine (driver–train) interaction in actual traffic conditions. Full article

13 pages, 959 KiB  
Article
Parameters of Optokinetic Nystagmus Are Influenced by the Nature of a Visual Stimulus
by Peter Essig, Jonas Müller and Siegfried Wahl
Appl. Sci. 2022, 12(23), 11991; https://doi.org/10.3390/app122311991 - 23 Nov 2022
Cited by 1 | Viewed by 1401
Abstract
Studies on contrast sensitivity (CS) testing using optokinetic nystagmus (OKN) have proposed adjusting the stimulus presentation duration based on its contrast to increase the time efficiency of such measurements. Furthermore, stimulus-specific limits on the least OKN gain might reduce false negatives in OKN detection procedures. We therefore aimed to test the effects of various stimulus characteristics on OKN and to propose stimulus-specific limits for the OKN gain and the stimulus presentation duration. We tested the effect of contrast (C), spatial frequency (SF), and color on selected parameters of the robust OKN response, namely its onset and offset time, amplitude, and gain. The right eyes of fifteen emmetropes were tracked with an infrared eye tracker during monocular observation of sinusoidal gratings moving over the horizontal plane with a velocity of 21°/s. The available contrast levels were C: 0.5%, 2.0%, 8.2%, 16.5%, 33.0%, and 55.5%, presented in random order ten times in all measurements of SF (0.12, 0.25, 0.5, and 1.00 cycles per degree) and grating type (luminance, red-green, and blue-yellow). This study showed a significant effect of the stimulus characteristics on the OKN onset, offset, and gain. The effect of SF on OKN amplitude was insignificant, whereas the effects of C and grating type were significant. Furthermore, OKN gain and offset limits were proposed as functions of contrast for the luminance and chromatic gratings. This study concludes that the characteristics of a visual stimulus affect the OKN gain and the onset and offset times, yet do not considerably affect the eye-movement amplitude. Moreover, the proposed limits are expected to improve the time efficiency and eye-movement detection in OKN-based contrast sensitivity measurements. Full article
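
OKN gain, as used above, is conventionally the ratio of slow-phase eye velocity to stimulus velocity; a minimal sketch under that convention, with illustrative velocities:

```python
def okn_gain(slow_phase_velocity, stimulus_velocity):
    """Gain of the optokinetic response: slow-phase eye velocity over stimulus velocity (deg/s)."""
    return slow_phase_velocity / stimulus_velocity

# Illustrative: the eye follows a grating moving at 21 deg/s with a slow phase of 18.9 deg/s
gain = okn_gain(18.9, 21.0)
```

A gain near 1 means the slow phase closely matches the stimulus; a stimulus-specific lower gain limit, as proposed in the study, would reject weaker responses in automated OKN detection.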

13 pages, 5469 KiB  
Article
Use of a DNN in Recording and Analysis of Operator Attention in Advanced HMI Systems
by Zbigniew Gomolka, Ewa Zeslawska, Boguslaw Twarog, Damian Kordos and Pawel Rzucidlo
Appl. Sci. 2022, 12(22), 11431; https://doi.org/10.3390/app122211431 - 11 Nov 2022
Cited by 1 | Viewed by 1567
Abstract
The main objective of this research was to propose a smart technology to record and analyse the attention of operators of transportation devices where human–machine interaction occurs. Four simulators were used in this study: General Aviation (GA), Remotely Piloted Aircraft System (RPAS), AS 1600, and Czajka, in which a spatio-temporal trajectory of system operator attention describing the histogram distribution of cockpit instrument observations was sought. Detection of the position of individual instruments in the video stream recorded by the eye tracker was accomplished using a pre-trained Fast R-CNN deep neural network. The training set for the network was constructed using a modified Kanade–Lucas–Tomasi (KLT) algorithm, which was applied to optimise the labelling of the cockpit instruments of each simulator. A deep neural network allows for sustained instrument tracking in situations where classical algorithms fail due to introduced noise. A mechanism for the flexible selection of Area of Interest (AOI) objects that can be tracked in the recorded video stream was used to analyse the attention recorded with a mobile eye tracker. The obtained data allow for further analysis of key skills in the education of operators of such systems. The use of deep neural networks as a detector for selected instrument types has made it possible to generalise this technology for observer attention analysis across different sets of monitoring and control instruments. Full article

Review

Jump to: Research

22 pages, 2708 KiB  
Review
Situational Awareness Assessment of Drivers Boosted by Eye-Tracking Metrics: A Literature Review
by Claudia Yohana Arias-Portela, Jaime Mora-Vargas and Martha Caro
Appl. Sci. 2024, 14(4), 1611; https://doi.org/10.3390/app14041611 - 17 Feb 2024
Viewed by 578
Abstract
The conceptual framework for assessing the situational awareness (SA) of drivers consists of three hierarchical levels: perception of the elements of the environment, comprehension of the elements, and decision-making in the near future. A common challenge in evaluating SA is the determination of the available subjective and objective techniques and their selection and integration into methodologies. Among the objective techniques, eye tracking is commonly used, considering the influence of gaze behavior on driving. This review is presented as an innovative approach to the subject matter, introducing physiological metrics based on eye tracking and investigating their application in assessing the SA of drivers. In addition, experiments and methodologies that revealed patterns at the three levels of SA were identified. For this purpose, databases were searched, and 38 papers were considered. Articles were clustered according to prevalent themes such as eye-tracking metrics, eye-tracking devices, experiment design, and the relationship between SA and eye-tracking. This review summarizes the main metrics and key findings for each article and reveals a wide relationship between the eye-tracking metrics and SA. The influence of appropriately calibrated equipment, refined data collection protocols, and adequate selection of the eye-tracking metrics was examined. Further reviews are needed to systematically collect more evidence. Full article
