Article

An Automatic Calibration Method for Kappa Angle Based on a Binocular Gaze Constraint

Jiahui Liu, Jiannan Chi and Hang Sun
1 School of Automation and Electrical Engineering, University of Science and Technology Beijing, Beijing 100083, China
2 Beijing Engineering Research Center of Industrial Spectrum Imaging, University of Science and Technology Beijing, Beijing 100083, China
3 Shunde Innovation School, University of Science and Technology Beijing, Foshan 528399, China
* Author to whom correspondence should be addressed.
Sensors 2023, 23(8), 3929; https://doi.org/10.3390/s23083929
Submission received: 6 March 2023 / Revised: 6 April 2023 / Accepted: 11 April 2023 / Published: 12 April 2023
(This article belongs to the Special Issue Sensing and Vision Technologies for Human Activity Recognition)

Abstract

Kappa-angle calibration is important in gaze tracking due to the special structure of the eyeball. In a 3D gaze-tracking system, after the optical axis of the eyeball is reconstructed, the kappa angle is needed to convert the optical axis of the eyeball to the real gaze direction. At present, most kappa-angle-calibration methods use explicit user calibration: before eye-gaze tracking, the user needs to look at some pre-defined calibration points on the screen, thereby providing corresponding optical and visual axes of the eyeball with which to calculate the kappa angle. Especially when multi-point user calibration is required, the calibration process is relatively complicated. In this paper, a method that can automatically calibrate the kappa angle during screen browsing is proposed. Based on the 3D corneal centers and optical axes of both eyes, the optimal objective function of the kappa angle is established according to the coplanar constraint of the visual axes of the left and right eyes, and the differential evolution algorithm is used to iterate toward the kappa angle within its theoretical angular range. The experiments show that the proposed method achieves a gaze accuracy of 1.3° in the horizontal direction and 1.34° in the vertical direction, both of which are within the acceptable margins of gaze-estimation error. The elimination of explicit kappa-angle calibration is of great significance to realizing the instant use of gaze-tracking systems.

1. Introduction

Gaze tracking aims to determine the user's current gaze direction or gaze point from facial or eye features, and to realize automatic interaction and attention analysis by analyzing the user's eye-movement information. At present, gaze tracking has been applied to eye-controlled mice [1], virtual reality helmets [2], disease screening and diagnosis [3], and attention analysis [4], covering human–computer interaction, virtual reality, smart healthcare, etc. It has broad application prospects.
The core of gaze tracking is gaze estimation. According to the eyeball structure, the optical axis (OA) of the eyeball passes through the eyeball center, the corneal center, the iris center, and the pupil center, and is perpendicular to the iris or pupil plane. The visual axis (VA) of the eyeball passes through the fovea and the corneal center. There is a fixed deflection angle between the OA and the VA called the kappa angle [5]. Three-dimensional gaze estimation refers to the estimation of the VA of the eyeball. However, the VA cannot be determined from the corneal center alone because the fovea is invisible. The general method is to reconstruct the OA of the eyeball first and then use the kappa angle to generate the VA from the OA. The kappa angle differs between individuals, so it cannot simply be set to a fixed universal value. The calibration of the kappa angle is therefore especially significant, as it directly affects the gaze accuracy.
The existing kappa-angle-calibration methods are mainly explicit, requiring the user to look at some pre-defined calibration points on the screen before eye-gaze tracking. During this explicit calibration process, the user's eye-feature parameters are extracted from the collected facial images, and the OA and the VA of the eyeball when the user looks at each calibration point are calculated to represent the kappa angle. The kappa angle has two possible representations. One is the transformation matrix between the OA and the VA. Researchers have calibrated this transformation matrix using multiple groups of OA directions and their corresponding VA directions [6,7]. This approach usually uses multiple calibration points to ensure the robustness of the calibrated transformation matrix, but multi-point user calibration makes the calibration process complex. To simplify calibration, some researchers proposed representing the kappa angle by angles. In the eyeball coordinate system, both the OA direction and the VA direction can be represented by a trigonometric function of a horizontal angle and a vertical angle, and the horizontal and vertical angles of the VA direction can be regarded as the superposition of the horizontal and vertical angles of the OA direction and the kappa angle. Thus, the horizontal and vertical components of the kappa angle can be calculated using a single calibration point [8,9,10,11,12,13]. In addition, a few researchers used one or two angles to represent the transformation angle between the OAs of the eyeball at two different positions, thereby calculating the VA direction at one position from the VA direction at another position and the transformation angle, since the positional relationship between the OA and the VA remains unchanged across positions [14,15,16]. Xiong et al. [17] used the horizontal and vertical rotation angles from the OA to the VA in a camera coordinate system to represent the transformation matrices for the horizontal and vertical directions; the product of these two matrices was then used to represent the kappa angle. Wan et al. [18] calculated the horizontal- and vertical-rotation matrices between the OA and the VA, and determined the horizontal and vertical components of the kappa angle by averaging the horizontal and vertical rotation angles obtained by looking at each calibration point. However, explicit calibration increases the complexity of the system's use, which runs counter to the "instant use" pursued by natural human–computer interaction. Although the number of calibration points can be reduced to one, multi-point calibration tends to be more robust than single-point calibration.
To improve this situation, Nagamatsu et al. [19] approximated the midpoint of the intersections of the OAs of both eyes with the screen as the point of regard (POR), thereby estimating the horizontal component of the kappa angle without user calibration. However, the vertical component is not considered, and the difference in the kappa angle between the left and right eyes is ignored. Wang and Ji [20] used four gaze constraints to formulate implicit kappa-angle calibration as a constrained unsupervised regression problem and solved it with an iterative hard EM algorithm. This method is user-friendly, but the distance between the eyeball center and the corneal center is set as a constant when calculating the eyeball center, which leads to different deviations in the 2D mapping model for different users and results in a constant offset between the 2D POR and the 3D POR.
In this paper, an automatic calibration method of kappa angle based on the binocular gaze constraint is proposed. An optimal objective function of the kappa angle is established by using the coplanar constraint of the VAs of the left and right eyes. Additionally, the differential evolution algorithm is used to iterate toward the optimal solution of the kappa angle within its theoretical range.
The main contributions of this paper are:
(1) The automatic calibration of the kappa angle is realized by using the binocular gaze constraint that the VAs of the left and right eyes should be coplanar. It avoids the explicit calibration of the kappa angle while accounting for both individual differences and the differences between the left and right eyes.
(2) The differential evolution algorithm is used to solve the established optimal objective function of the kappa angle within its theoretical range, which makes the kappa-angle calibration accurate. It avoids the setting of an initial value required by conventional solution methods, to which the kappa-angle calibration is susceptible.
The rest of this paper is organized as follows. Section 2 discusses the existing calibration methods for the kappa angle. Section 3 describes the proposed automatic calibration method in detail. Computer simulations and practical experiments are analyzed in Section 4. Section 5 summarizes and concludes the work.

2. Representation of Kappa-Angle Calibration

The kappa angle is the angle between the OA and the VA of the eyeball. Since the VA is the real gaze, the kappa angle is an essential parameter for estimating the 3D gaze. It has two representations: a matrix representation and an angular representation. They are discussed separately below.

2.1. Matrix Representation

As Figure 1a shows, when the user looks at the i-th calibration point, the unit-direction vector of the OA is expressed as $\mathbf{V}_i^{oa} = [x_i^{oa}, y_i^{oa}, z_i^{oa}]^T$, and the unit-direction vector of the VA is expressed as $\mathbf{V}_i^{va} = [x_i^{va}, y_i^{va}, z_i^{va}]^T$. They can be obtained via a multi-point user calibration process. Assuming the number of calibration points is N, both the OA matrix $\mathbf{V}^{oa}$ and the VA matrix $\mathbf{V}^{va}$ have size $3 \times N$. The relationship between the OA and the VA can then be expressed by a transformation matrix $\mathbf{M}$ of size $3 \times 3$:

$$\mathbf{V}^{va} = \mathbf{M}\,\mathbf{V}^{oa}, \tag{1}$$

where $\mathbf{M} = \begin{bmatrix} a_1 & a_2 & a_3 \\ a_4 & a_5 & a_6 \\ a_7 & a_8 & a_9 \end{bmatrix}$, $\mathbf{V}^{oa} = \begin{bmatrix} x_1^{oa} & x_2^{oa} & \cdots & x_N^{oa} \\ y_1^{oa} & y_2^{oa} & \cdots & y_N^{oa} \\ z_1^{oa} & z_2^{oa} & \cdots & z_N^{oa} \end{bmatrix}$, and $\mathbf{V}^{va} = \begin{bmatrix} x_1^{va} & x_2^{va} & \cdots & x_N^{va} \\ y_1^{va} & y_2^{va} & \cdots & y_N^{va} \\ z_1^{va} & z_2^{va} & \cdots & z_N^{va} \end{bmatrix}$. The coefficients $a_1$–$a_9$ of $\mathbf{M}$ can be calculated using at least three calibration points; the more calibration points are used, the more robust the transformation matrix $\mathbf{M}$ is.
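To make the fitting concrete, the following minimal NumPy sketch (our illustration, not the authors' code) estimates $\mathbf{M}$ by least squares from the stacked $3 \times N$ direction matrices; the function name and synthetic check data are assumptions for demonstration.

```python
import numpy as np

def calibrate_transform_matrix(V_oa, V_va):
    """Estimate the 3x3 matrix M in V_va = M @ V_oa by least squares.

    V_oa, V_va: 3xN arrays of unit OA/VA direction vectors collected
    while the user looks at N >= 3 calibration points.
    """
    # lstsq solves V_oa.T @ M.T = V_va.T in the least-squares sense
    M_T, *_ = np.linalg.lstsq(V_oa.T, V_va.T, rcond=None)
    return M_T.T

# Quick check with synthetic data (9 calibration points)
rng = np.random.default_rng(0)
V_oa = rng.normal(size=(3, 9))
V_oa /= np.linalg.norm(V_oa, axis=0)          # unit columns
M_true = np.eye(3) + 0.05 * rng.normal(size=(3, 3))
M_est = calibrate_transform_matrix(V_oa, M_true @ V_oa)
print(np.allclose(M_est, M_true))             # True
```

With more than three points the system is overdetermined, which is why additional calibration points make the estimated $\mathbf{M}$ more robust.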

2.2. Angular Representation

As Figure 1b shows, an eyeball coordinate system is defined as follows: (1) the corneal center is the origin; (2) the x- and y-axes are the same as the horizontal and vertical directions of the screen; (3) the z-axis is perpendicular to the screen and points to the eyeball. In this coordinate system, the unit-direction vector of the OA can be expressed by a horizontal angle $\omega$ and a vertical angle $\phi$:

$$\tilde{\mathbf{V}}^{oa} = \begin{bmatrix} \cos\phi\sin\omega \\ \sin\phi \\ \cos\phi\cos\omega \end{bmatrix}. \tag{2}$$

The kappa angle is decomposed into a horizontal angle $\alpha$ and a vertical angle $\beta$. The average value of $\alpha$ is about 5°, and the average value of $\beta$ is about 1.5° [8]. With the OA direction and the kappa angle, the unit-direction vector of the VA is:

$$\tilde{\mathbf{V}}^{va} = \begin{bmatrix} \cos(\phi+\beta)\sin(\omega+\alpha) \\ \sin(\phi+\beta) \\ \cos(\phi+\beta)\cos(\omega+\alpha) \end{bmatrix}. \tag{3}$$

As the gaze point is known from explicit user calibration, the unit-direction vector of the VA in the camera coordinate system, $\mathbf{V}^{va}$, is known. Thus, $\tilde{\mathbf{V}}^{va}$ can be obtained via the transformation between the camera coordinate system and the eyeball coordinate system. Substituting $\tilde{\mathbf{V}}^{va}$ into Equation (3) yields $\alpha$ and $\beta$. The kappa angle can therefore be calibrated using a single calibration point.
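As a sketch of this single-point calibration, assuming the component order of Equations (2) and (3) and unit input vectors already expressed in the eyeball coordinate system (the function name is hypothetical):

```python
import numpy as np

def kappa_from_single_point(V_oa_eye, V_va_eye):
    """Recover (alpha, beta) in degrees from unit OA and VA vectors
    expressed in the eyeball coordinate system (Equations (2)-(3))."""
    # OA angles: y = sin(phi), x/z = tan(omega)
    phi = np.arcsin(V_oa_eye[1])
    omega = np.arctan2(V_oa_eye[0], V_oa_eye[2])
    # VA angles: y = sin(phi + beta), x/z = tan(omega + alpha)
    phi_beta = np.arcsin(V_va_eye[1])
    omega_alpha = np.arctan2(V_va_eye[0], V_va_eye[2])
    return np.degrees(omega_alpha - omega), np.degrees(phi_beta - phi)
```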

3. Proposed Method

When calibrating the kappa angle, most existing methods require the user to look at pre-defined calibration points. This explicit calibration is time-consuming and unfriendly. Although a few implicit calibration or calibration-free methods have been proposed [19,20], they do not fully consider individual differences or the differences between the left and right eyes. Therefore, this paper proposes an automatic calibration method for the kappa angle. The basic procedure is shown in Figure 2. In the gaze-tracking system, multiple groups of eyeball parameters, including the 3D corneal center and OA direction, are computed while the user looks at different positions, and the VA direction is represented in terms of the kappa angle to be solved. An optimal objective function of the kappa angle is then constructed using the binocular gaze constraint. Finally, the kappa angle is iteratively optimized using the differential evolution algorithm. The calibrated kappa angle can be used for subsequent real-time 3D gaze estimation.

3.1. Visual-Axis Representation

As mentioned in Section 2.2, the angular representation is used to calibrate the kappa angle in this paper. Therefore, the representation of the VA mainly uses the transformation between the camera coordinate system and the eyeball coordinate system, as shown in Figure 3. The 3D corneal center $\mathbf{C}$ and the OA direction $\mathbf{V}^{oa}$ in the camera coordinate system can be calculated according to the geometric imaging model of the eyeball. Using the coordinates of the screen points and the 3D corneal center, the transformation matrices $(\mathbf{R}_{oc}, \mathbf{T}_{oc})$ from the camera coordinate system to the eyeball coordinate system and the transformation matrices $(\mathbf{R}_{co}, \mathbf{T}_{co})$ from the eyeball coordinate system to the camera coordinate system can be obtained. Thus, the OA direction $\tilde{\mathbf{V}}^{oa}$ in the eyeball coordinate system can be converted from $\mathbf{V}^{oa}$, and $\omega$ and $\phi$ are calculated by substituting $\tilde{\mathbf{V}}^{oa}$ into Equation (2). The horizontal and vertical components of the kappa angle are $\alpha$ and $\beta$, so the horizontal and vertical components of the VA direction are $(\omega+\alpha)$ and $(\phi+\beta)$. The representation of the VA direction in the eyeball coordinate system is given by Equation (3). The VA direction $\mathbf{V}^{va}$ in the camera coordinate system can then be represented using $\mathbf{R}_{co}$, $\mathbf{T}_{co}$, and $\mathbf{C}$. The VA of the eyeball is thus a function of the OA direction, the 3D corneal center, and the kappa angle:

$$\mathbf{V}^{va} = h(\mathbf{V}^{oa}, \mathbf{C}; \alpha, \beta). \tag{4}$$
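A minimal sketch of what $h(\cdot)$ in Equation (4) could look like, assuming the rotation matrices $\mathbf{R}_{oc}$ and $\mathbf{R}_{co}$ have already been built from the screen points and the corneal center as described above (directions are translation-free, so $\mathbf{T}_{oc}$ and $\mathbf{T}_{co}$ are not needed here; the signature is ours):

```python
import numpy as np

def visual_axis(V_oa, C, alpha, beta, R_oc, R_co):
    """Sketch of Equation (4): map an OA direction in the camera frame
    to the VA direction via the kappa angle (alpha, beta) in radians.

    C is the 3D corneal center; it anchors the eyeball frame whose
    rotations R_oc/R_co are assumed to be precomputed from it.
    """
    v = R_oc @ V_oa                        # OA in the eyeball frame
    phi = np.arcsin(v[1])                  # vertical OA angle
    omega = np.arctan2(v[0], v[2])         # horizontal OA angle
    v_va = np.array([                      # VA in the eyeball frame, Eq. (3)
        np.cos(phi + beta) * np.sin(omega + alpha),
        np.sin(phi + beta),
        np.cos(phi + beta) * np.cos(omega + alpha),
    ])
    return R_co @ v_va                     # back to the camera frame
```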

3.2. Objective Function Construction Using the Binocular Gaze Constraint

When a user gazes at the screen, ideally, the VAs of both eyes intersect at the same point on the screen plane [21]. However, due to the presence of a dominant eye and an auxiliary eye, the two VAs do not always intersect exactly at one point. Therefore, a binocular gaze constraint is proposed in this paper, as shown in Figure 4. It does not require explicit calibration points.
The horizontal and vertical components of the kappa angle of the left eye are denoted $\alpha_L$ and $\beta_L$, and those of the right eye are denoted $\alpha_R$ and $\beta_R$, so the VA directions of the left and right eyes are $\mathbf{V}^{vaL} = h(\mathbf{V}^{oaL}, \mathbf{C}^{L}; \alpha_L, \beta_L)$ and $\mathbf{V}^{vaR} = h(\mathbf{V}^{oaR}, \mathbf{C}^{R}; \alpha_R, \beta_R)$. Instead of the constraint that the VAs of the left and right eyes intersect at a point, the coplanarity of the VAs of the left and right eyes is used. We define a gaze plane formed by the VAs of the left and right eyes, whose normal vector can be expressed by the two VA directions:

$$\mathbf{n}^{va} = \frac{\mathbf{V}^{vaL} \times \mathbf{V}^{vaR}}{\left\| \mathbf{V}^{vaL} \times \mathbf{V}^{vaR} \right\|}. \tag{5}$$
The structure of the eyeball shows that the corneal center lies on the VA, so the 3D corneal centers of the left and right eyes both lie in the gaze plane, which can therefore be determined uniquely. It satisfies:

$$\mathbf{n}^{va} \cdot \left( \mathbf{C}^{L} - \mathbf{C}^{R} \right) = 0. \tag{6}$$
Therefore, based on the invariance of the kappa angle and the coplanar constraint of the VAs of the left and right eyes, an objective function is constructed:

$$[\alpha_L, \beta_L, \alpha_R, \beta_R] = \arg\min \sum_{i=1}^{N} \left( \mathbf{n}_i^{va} \cdot \left( \mathbf{C}_i^{L} - \mathbf{C}_i^{R} \right) \right)^2, \tag{7}$$

where N is the number of data groups for a user. The normal vector of the gaze plane of the i-th group is $\mathbf{n}_i^{va} = \frac{\mathbf{V}_i^{vaL} \times \mathbf{V}_i^{vaR}}{\| \mathbf{V}_i^{vaL} \times \mathbf{V}_i^{vaR} \|}$, in which the VA direction of the left eye is $\mathbf{V}_i^{vaL} = h(\mathbf{V}_i^{oaL}, \mathbf{C}_i^{L}; \alpha_L, \beta_L)$ and that of the right eye is $\mathbf{V}_i^{vaR} = h(\mathbf{V}_i^{oaR}, \mathbf{C}_i^{R}; \alpha_R, \beta_R)$. To solve the four unknowns ($\alpha_L$, $\beta_L$, $\alpha_R$, $\beta_R$), at least four groups of data satisfying the coplanarity of the VAs of the left and right eyes are required; otherwise, Equation (7) is underdetermined, so N must be no less than 4. When more samples are used, the kappa-angle calibration is more robust.
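Under the same assumptions, the objective of Equation (7) (restated as Equation (9) in Section 3.3) can be sketched as follows, reusing the hypothetical visual_axis() helper; the per-sample data layout is our choice for illustration:

```python
import numpy as np

def gaze_constraint_error(kappa, data):
    """Coplanarity objective of Equation (7); kappa in radians.

    kappa: [alpha_L, beta_L, alpha_R, beta_R].
    data: list of per-sample tuples
          (V_oaL, C_L, R_ocL, R_coL, V_oaR, C_R, R_ocR, R_coR).
    """
    aL, bL, aR, bR = kappa
    err = 0.0
    for V_oaL, C_L, R_ocL, R_coL, V_oaR, C_R, R_ocR, R_coR in data:
        vL = visual_axis(V_oaL, C_L, aL, bL, R_ocL, R_coL)
        vR = visual_axis(V_oaR, C_R, aR, bR, R_ocR, R_coR)
        n = np.cross(vL, vR)
        n /= np.linalg.norm(n)              # unit normal of the gaze plane
        err += float(n @ (C_L - C_R)) ** 2  # squared coplanarity residual
    return err
```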

3.3. Kappa-Angle Calculation Using Differential Evolution Algorithm

The kappa angle can be accurately solved from the above objective function under ideal conditions. In a real scene, however, errors in the system or eyeball parameters may cause a large deviation between the optimal solution and the ground truth, or make the solution strongly dependent on the initial value. Therefore, the differential evolution algorithm is used in this paper to solve for the kappa angle.
First, multiple groups of initial solution vectors of the form $\mathbf{x} = [\alpha_L, \beta_L, \alpha_R, \beta_R]$ are randomly generated within the theoretical range of the kappa angle, with each element satisfying:

$$x_{k,j} = x_{k,j}^{E} + \left( x_{k,j}^{F} - x_{k,j}^{E} \right) \mathrm{rand}(), \tag{8}$$

where $k = 1, 2, \dots, W$ and $j = 1, 2, 3, 4$. W is the total number of solution vectors, $x_{k,j}$ refers to $\alpha_L$, $\beta_L$, $\alpha_R$, or $\beta_R$ in the k-th solution vector, whose upper and lower limits are $x_{k,j}^{F}$ and $x_{k,j}^{E}$, and $\mathrm{rand}()$ is a random number drawn from [0, 1].
For these solution vectors, we first calculate the gaze-constraint error shown in Equation (9). The solution vector that minimizes the gaze-constraint error is taken as the initial optimal kappa angle, and its gaze-constraint error is taken as the initial error for subsequent comparison:

$$f(\mathbf{x}) = \sum_{i=1}^{N} \left( \frac{h(\mathbf{V}_i^{oaL}, \mathbf{C}_i^{L}; x[0], x[1]) \times h(\mathbf{V}_i^{oaR}, \mathbf{C}_i^{R}; x[2], x[3])}{\left\| h(\mathbf{V}_i^{oaL}, \mathbf{C}_i^{L}; x[0], x[1]) \times h(\mathbf{V}_i^{oaR}, \mathbf{C}_i^{R}; x[2], x[3]) \right\|} \cdot \left( \mathbf{C}_i^{L} - \mathbf{C}_i^{R} \right) \right)^2. \tag{9}$$
Three distinct solution vectors are then selected from the population to generate the mutated solution vector, whose elements satisfy:

$$v_{k,j} = x_{r_1,j} + K \left( x_{r_2,j} - x_{r_3,j} \right), \tag{10}$$

where $1 \le r_1, r_2, r_3 \le W$ and $k \ne r_1 \ne r_2 \ne r_3$. K is the mutation operator. If $v_{k,j}$ calculated by Equation (10) exceeds the upper or lower limit we set, $v_{k,j}$ takes the boundary value.
Using the initial solution vector $\mathbf{x}_k$ and the mutated solution vector $\mathbf{v}_k$, the crossover operation generates a new solution vector $\mathbf{u}_k$, as Equation (11) shows. Whether $\mathbf{u}_k$ equals $\mathbf{x}_k$ or $\mathbf{v}_k$ depends on the comparison between a crossover operator c and a random number:

$$\mathbf{u}_k = \begin{cases} \mathbf{v}_k, & \mathrm{rand}() \le c \\ \mathbf{x}_k, & \mathrm{rand}() > c \end{cases}. \tag{11}$$
The gaze-constraint errors for $\mathbf{x}_k$ and $\mathbf{u}_k$ are then calculated using Equation (9), and the solution vector $\mathbf{X}_k$ kept for comparison is the one that makes the gaze-constraint error smaller:

$$\mathbf{X}_k = \begin{cases} \mathbf{u}_k, & f(\mathbf{u}_k) < f(\mathbf{x}_k) \\ \mathbf{x}_k, & f(\mathbf{u}_k) \ge f(\mathbf{x}_k) \end{cases}. \tag{12}$$
After obtaining $\mathbf{X}_k$, the gaze-constraint errors corresponding to $\mathbf{X}_1, \mathbf{X}_2, \dots, \mathbf{X}_W$ are calculated. The solution vector that minimizes the gaze-constraint error is taken as the optimal solution of this iteration, as shown in Equation (13):

$$\mathbf{X} = \arg\min f(\mathbf{X}_k), \quad k = 1, 2, \dots, W. \tag{13}$$
The iteration stops when the gaze-constraint error is less than the iteration termination error, and the solution vector at this time is used as the calibration result of the kappa angle. The specific calculation process is shown in Algorithm 1.
Algorithm 1: Process of kappa-angle calculation.

Input: 3D corneal centers of the left and right eyes (C^L and C^R), OA directions of the left and right eyes (V^oaL and V^oaR), number of solution vectors W, upper and lower limits of the kappa angle (x^F_{k,j} and x^E_{k,j}), upper limit of iterations T, mutation operator K, crossover operator c, iteration termination error δ.
Output: Optimal kappa angle [α_L, β_L, α_R, β_R].

1:  for k = 1 to W do
2:      x_{k,j} = x^E_{k,j} + (x^F_{k,j} − x^E_{k,j})·rand(), where (x_{k,1}, x_{k,2}, x_{k,3}, x_{k,4}) = (α_L, β_L, α_R, β_R);
3:  end for
4:  x^op = arg min f(x_k), k = 1, 2, ..., W;
5:  ε^op = f(x^op);
6:  for t = 1 to T do
7:      for k = 1 to W do
8:          v_{k,j} = x_{r1,j} + K(x_{r2,j} − x_{r3,j}), 1 ≤ r1, r2, r3 ≤ W, k ≠ r1 ≠ r2 ≠ r3;
9:          if v_{k,j} < x^E_{k,j} then
10:             v_{k,j} = x^E_{k,j};
11:         else if v_{k,j} > x^F_{k,j} then
12:             v_{k,j} = x^F_{k,j};
13:         end if
14:         if rand() ≤ c then
15:             u_k = v_k;
16:         else
17:             u_k = x_k;
18:         end if
19:         if f(u_k) < f(x_k) then
20:             X_k = u_k;
21:         else
22:             X_k = x_k;
23:         end if
24:         if f(X_k) < ε^op then
25:             ε^op = f(X_k);
26:             x_t^op = X_k;
27:         end if
28:     end for
29:     if f(x_t^op) < δ then
30:         [α_L, β_L, α_R, β_R] = x_t^op;
31:         break;
32:     end if
33: end for
34: return [α_L, β_L, α_R, β_R]
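For reference, a compact Python sketch of Algorithm 1 (our illustration), reusing the gaze_constraint_error() helper above; W, K, and c default to the simulation settings in Section 4.1.1, while T, the termination error, and the bounds are placeholders to be set per application:

```python
import numpy as np

def calibrate_kappa(data, W=35, K=0.5, c=1.0, T=500, delta=1e-9,
                    lo=(-8.0, 0.0, -8.0, 0.0), hi=(8.0, 3.0, 8.0, 3.0),
                    seed=None):
    """Differential-evolution sketch of Algorithm 1.
    lo/hi bound [alpha_L, beta_L, alpha_R, beta_R] in degrees."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(lo, float), np.asarray(hi, float)
    f = lambda x: gaze_constraint_error(np.radians(x), data)

    # Lines 1-5: random initial population and its best member
    X = lo + (hi - lo) * rng.random((W, 4))
    fX = np.array([f(x) for x in X])
    best, eps = X[np.argmin(fX)].copy(), fX.min()

    for _ in range(T):                       # lines 6-33
        for k in range(W):
            # Mutation (Eq. 10) with three distinct indices != k, clipped
            r1, r2, r3 = rng.choice([i for i in range(W) if i != k],
                                    size=3, replace=False)
            v = np.clip(X[r1] + K * (X[r2] - X[r3]), lo, hi)
            # Crossover (Eq. 11) and selection (Eq. 12)
            u = v if rng.random() <= c else X[k]
            fu = f(u)
            if fu < fX[k]:
                X[k], fX[k] = u, fu
            if fX[k] < eps:                  # track the running optimum
                eps, best = fX[k], X[k].copy()
        if eps < delta:                      # lines 29-32: termination
            break
    return best, eps
```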

4. Simulations and Experiments

To verify the performance of the proposed method, we first simulated it using accurate data generated in Python, and then added different levels of noise to the considered error sources to mimic measurement errors in a real scene and analyze their impact on the kappa-angle calibration. Practical experiments were also carried out in an actual gaze-tracking system. Since the ground truth of each subject's kappa angle is unknown, we estimated the 3D gaze of each subject using the calibrated kappa angle and validated the accuracy of the kappa-angle calibration through the gaze accuracy.

4.1. Computer Simulations

Following the scenario of a user sitting in front of a screen and looking at it, we generated the required parameters using the Gullstrand–Le Grand eyeball model and average human parameters [22]. The coordinates of the left corneal centers were randomly generated within a certain range (x: from −45 mm to −15 mm; y: from −90 mm to −50 mm; z: from 350 mm to 650 mm), and the right corneal centers were then determined under the constraint that the distance between the 3D corneal centers of the left and right eyes is 60 mm. On the defined screen, whose corner points were (−200, 262.045317, 250.064495), (200, 262.045317, 250.064495), (200, 69.209034, 20.251162), and (−200, 69.209034, 20.251162), gaze points equal in number to the corneal centers were randomly generated. The VA direction was thus determined using the 3D corneal center and the gaze point. Based on the theoretical range of the kappa angle, we set its horizontal component to 5° and its vertical component to 1.5°. The required OA direction was determined in reverse using the 3D corneal center, the VA direction of the eyeball, and the kappa angle. The inputs of the model in this paper were the 3D corneal centers and OA directions of the left and right eyes, and the output was the calibrated kappa angle. The accuracy and robustness of the model were verified by comparing the predicted and preset kappa angles.
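The following sketch shows how such synthetic data could be generated; the random seed, the direction of the 60 mm interocular offset, and the bilinear sampling of the screen quadrilateral are our assumptions, since the paper fixes only the coordinate ranges, the corner points, and the 60 mm distance:

```python
import numpy as np

rng = np.random.default_rng(42)
N = 9  # number of gaze samples

# Left corneal centers within the stated ranges (mm)
C_L = np.column_stack([rng.uniform(-45, -15, N),
                       rng.uniform(-90, -50, N),
                       rng.uniform(350, 650, N)])
# Right corneal centers 60 mm away (offset direction assumed along +x)
C_R = C_L + np.array([60.0, 0.0, 0.0])

# Random gaze points on the tilted screen via bilinear interpolation
# of the four corner points given in the text
corners = np.array([(-200, 262.045317, 250.064495),
                    (200, 262.045317, 250.064495),
                    (200, 69.209034, 20.251162),
                    (-200, 69.209034, 20.251162)])
s, t = rng.random((N, 1)), rng.random((N, 1))
left = (1 - t) * corners[0] + t * corners[3]    # left screen edge
right = (1 - t) * corners[1] + t * corners[2]   # right screen edge
P = (1 - s) * left + s * right

# VA directions follow from corneal centers and gaze points;
# OA directions are then derived in reverse with the preset kappa angle
V_vaL = (P - C_L) / np.linalg.norm(P - C_L, axis=1, keepdims=True)
```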

4.1.1. Algorithm Verification

We calibrated the kappa angle using nine groups of data, including the 3D corneal centers and OA directions of the left and right eyes. Some constant parameters in our model were set as follows: W = 35, K = 0.5, and c = 1. The iteration range of kappa angles in our simulations was set as follows: $4° \le \alpha_L \le 8°$, $-8° \le \alpha_R \le -4°$, and $0° \le \beta_L, \beta_R \le 3°$. With an iteration termination error of $10^{-9}$, the kappa angles from 10 calibrations are listed in Table 1. "iter_num" represents the number of iterations after which the gaze-constraint error was less than the iteration termination error, and "fun_err" represents the corresponding gaze-constraint error.
Figure 5 shows the iterative process of the kappa angles calculated in the third round. The abscissa represents the number of iterations, and the ordinate represents the calibrated kappa angle. As the number of iterations increased, the kappa angles converged simultaneously toward their true values. The gaze-constraint error of the 14th iteration was less than $10^{-6}$, and that of the 32nd iteration was less than $10^{-9}$. At this point, the values of $\alpha_L$, $\beta_L$, $\alpha_R$, and $\beta_R$ were 4.993°, 1.499°, −5.007°, and 1.499°, respectively, and their errors were less than 0.01°.
(1) Influence of the preset ground truth of the kappa angle: To exclude the influence of the preset ground truth of the kappa angle in the simulation, we analyzed six different sets of ground truth, whose horizontal and vertical components were [5°, 1.5°], [4°, 1.5°], [6°, 1.5°], [5°, 1°], [5°, 2°], and [4°, 0.5°]. As mentioned in Section 4.1, we first randomly generated nine groups of 3D corneal center coordinates for the left and right eyes within a distance of 350–650 mm, and then randomly generated nine gaze points on the screen. The VA direction was thus determined using the 3D corneal center and the gaze point, and the OA direction was calculated in reverse using each set of ground-truth kappa angles. The 3D corneal centers and OA directions were then used to analyze the influence of the ground truth of the kappa angle. Using the same constant parameters as above, the kappa angle was calibrated. The calibration results for the different sets of ground truth are listed in Table 2. Regardless of the ground truth of the kappa angle, the error of the calibrated kappa angle remained below 0.01°. Therefore, in the subsequent study, we only analyzed the case where the horizontal and vertical components of the kappa angle were 5° and 1.5°, respectively.
(2) Influence of gaze angle: We divided the screen area into four regions and randomly generated 30 gaze points in each region, as shown in Figure 6. To ensure that only the gaze angle varied, the same left and right corneal centers were used for every gaze point. The 3D corneal centers of the left and right eyes were (−33.62420733, −62.55917489, 508.7146057) and (24.63352113, −76.53688391, 505.449529), respectively, and the corresponding OA directions of the left and right eyes were generated accordingly. Using the data located in the same region, we calibrated the kappa angle, as listed in Table 3. The results show the effectiveness of the proposed method for gaze points in different regions.
(3) Influence of the amount of data: We also tested the kappa angle calibrated with different amounts of data as input. When 1, 2, 4, 6, 9, 16, 25, 50, and 100 groups of data were taken as inputs, the kappa angles from 10 calibrations were recorded for each case. By comparing them with the ground truth, we calculated the root mean square error (RMSE) of the 10 calibrations for each amount of data, as shown in Table 4. When fewer than four groups of data were used, the kappa angle could not be recovered because the equations were underdetermined. When at least four groups were used, the kappa angle was accurately calibrated with an error of less than 0.01°. However, using more than nine groups of data improved the accuracy little compared with nine groups.
In conclusion, the proposed method can obtain an accurate kappa angle from accurate corneal centers and OA directions. To obtain the accurate kappa angle quickly, the amount of data should be considered. Beyond a certain amount of data, the accuracy of the kappa-angle calibration improves little, while additional data reduce the operation speed of the algorithm. Moreover, as the amount of data increases, the errors of the data themselves accumulate. Therefore, it is necessary to filter the data according to the actual situation and select an appropriate amount of data to calibrate the kappa angle.

4.1.2. Error Analysis

Due to system errors and eye-feature-detection errors, gaze-related parameters such as the 3D corneal center, the 3D pupil center, and the iris center often have estimation errors in a real scene. These errors subsequently propagate to the OA and VA of the eyeball. For example, Lu et al. [23] stated that a 0.5 mm error in 3D pupil estimation along the x- or y-axis induces a 3° error in gaze angle. Therefore, we analyzed the influences of the 3D-corneal-center error and the OA error on the kappa-angle calibration, as they are the inputs of the kappa-angle calibration.
Here, in addition to simulating the remote gaze-tracking system described above, we also simulated a head-mounted system for comparison. The four screen corner points were set at (−75, 70, −400), (225, 70, −400), (225, −140, −400), and (−75, −140, −400), and the coordinates of the left corneal centers were generated within certain ranges (x: from 5 mm to 10 mm; y: from 20 mm to 30 mm; z: from 25 mm to 35 mm). The right corneal centers and OA directions were generated in the same way as for the remote system. Considering the earlier finding on the amount of data, and to keep a single variable, the same nine groups of data in each system were used for the following error analysis.
(1) Analysis of the 3D-corneal-center error: Independent, zero-mean white Gaussian noise was added to the coordinates of the 3D corneal center to simulate its error. As the standard deviation (SD) of the Gaussian noise increased from 0 to 4 mm, the same noise was simultaneously added to the coordinates of the 3D corneal centers of the left and right eyes. For each SD, 100 calibration results of the kappa angle were recorded and the RMSE was calculated. Figure 7 shows the influence of the 3D-corneal-center error on the kappa-angle calibration. As the SD of the noise increased from 0 to 4 mm, the error of the kappa angle increased. For the remote gaze-tracking system, when the SD of the noise was 0.5 mm, the error of the horizontal angle was about 0.2° and the error of the vertical angle was about 0.13°; when the SD was 4 mm, the errors of the horizontal and vertical angles were larger than 0.4°. For the head-mounted system, when the SD of the noise was 0.5 mm, the errors were similar to those of the remote system; when the SD was 4 mm, the errors were smaller than those of the remote system, about 0.35° and 0.23° in the horizontal and vertical directions, respectively.
(2) Analysis of the OA error: The OA direction in the camera coordinate system has a unique relation to its horizontal and vertical angles in the eyeball coordinate system. To simulate the OA error, we simultaneously added independent, zero-mean white Gaussian noise to the horizontal and vertical angles of the OA directions of the left and right eyes. The SD of the Gaussian noise varied from 0 to 1° in steps of 0.1°; for each SD, 100 calibration results of the kappa angle were recorded and the RMSE was calculated. Figure 8 shows the influence of the OA error on the kappa-angle calibration. The errors in the head-mounted system were generally larger than those in the remote system. For the remote gaze-tracking system, when the SD of the OA noise was 0.1°, the errors of the horizontal and vertical angles were about 0.2°; when the SD was 1°, they reached 0.7°. For the head-mounted system, when the SD of the OA noise was 0.1°, the error of the horizontal angle was about 0.41° and the error of the vertical angle was about 0.3°; when the SD was 1°, the errors were about 0.85° and 0.7°, respectively. The noise-injection scheme is sketched below.
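A sketch of the noise injection used in both analyses, assuming, per the text, that the corneal-center noise is shared between the two eyes; whether the angular noise is shared is not stated, so independent draws are used here:

```python
import numpy as np

def add_corneal_noise(C_L, C_R, sd_mm, rng):
    """Add the same zero-mean Gaussian perturbation (mm) to both eyes."""
    noise = rng.normal(0.0, sd_mm, size=np.shape(C_L))
    return C_L + noise, C_R + noise

def add_oa_noise(omega, phi, sd_deg, rng):
    """Perturb the OA horizontal/vertical angles (degrees) independently."""
    return (omega + rng.normal(0.0, sd_deg, size=np.shape(omega)),
            phi + rng.normal(0.0, sd_deg, size=np.shape(phi)))
```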
(3) Discussion: The above analysis shows that, in both the remote system and the head-mounted system, the OA error influences the kappa-angle calibration more than the 3D-corneal-center error. The influence of the 3D-corneal-center error on the kappa-angle calibration was comparable between the two systems, whereas the influence of the OA error was larger in the head-mounted system than in the remote system. That both the 3D-corneal-center error and the OA error significantly affect the kappa-angle calibration is a universal problem for all kappa-angle-calibration methods [24]. The reasons are as follows: On the one hand, as the origin of the eyeball coordinate system, the 3D corneal center is the key to the transformation between the camera coordinate system and the eyeball coordinate system; moreover, it is the only intersection point of the OA and the VA, so it plays a vital role in the VA transformation from the OA. On the other hand, the OA direction is the reference direction of the VA transformation; if the OA direction has an error, the transformed VA direction also has an error, which appears as a large POR deviation on the screen when the distance between the user and the screen is large. Therefore, the key to accurately solving for the kappa angle is to ensure the accuracy of the input data.

4.2. Practical Experiments

A single-camera, two-light-source system was used for our practical experiments. A CMOS camera was located in the middle, and two visible light sources were located on either side of the camera, as shown in Figure 9. The camera model was OV7251, with an imaging resolution of 640 × 480 and a pixel size of 2.2 μm. The camera parameters were obtained by the test-range calibration technique, and the positions of the light sources were calibrated using the global calibration method [25]. It should be noted that the method proposed in this paper can automatically calibrate the kappa angle without pre-defined calibration points. However, the ground truth of each subject's kappa angle is unknown. Therefore, explicit gaze points were used in our experiments to evaluate the performance of the kappa-angle calibration by comparing the POR estimated using the calibrated kappa angle with the explicit gaze point.
The experimental process was as follows: Sixteen test points were preset on the screen, and each subject was asked to sit in front of the screen at a distance between 350 and 600 mm, without glasses, and look at each test point in turn. During this process, the subject was able to slightly move their head in translation, pitch, and yaw. Facial images were collected synchronously by the system's camera for subsequent eyeball-feature extraction. The required feature parameters included the iris ellipse parameters and the glints of the left and right eyes. They were extracted using the eyeball-feature extraction method described in our previous work [26] and then used to estimate the 3D corneal center and the OA direction, where the OA direction was expressed by the iris normal vector.

4.2.1. Performance Analysis

In our experiments, the kappa angles of four subjects were automatically calibrated using the proposed method. The iterative range of the horizontal and vertical components of the kappa angle was −5° to 5°. To refine the search, the mutation operator K was set to 0.01, and the crossover operator c was set to 0.1 in consideration of noise interference. The calibrated kappa angles are listed in Table 5. To verify their accuracy, the VA direction corresponding to each group of data was calculated using Equation (4). Since our gaze-tracking system had been calibrated [25], the intersection of the VA and the screen, i.e., the POR, was calculated using a point $\mathbf{Q}$ on the screen and the screen normal vector $\mathbf{n}^{scr}$:
$$\mathbf{S} = \mathbf{C} + \frac{(\mathbf{Q} - \mathbf{C}) \cdot \mathbf{n}^{scr}}{\mathbf{V}^{va} \cdot \mathbf{n}^{scr}}\, \mathbf{V}^{va}. \tag{14}$$
The average of the PORs of the left and right eyes was taken as the final POR, and the Euclidean distance between the POR and the pre-defined gaze point was regarded as the POR error. The distance from the midpoint of the 3D corneal centers of the left and right eyes to the screen was taken as the approximate distance D between the subject and the screen. The gaze accuracy, expressed as the RMSE, is:
$$\theta = \arctan \frac{\sqrt{\frac{1}{N} \sum_{k=1}^{N} \left( S_{esti,k} - S_{real,k} \right)^2}}{D}. \tag{15}$$
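A minimal sketch of Equations (14) and (15); the per-axis reading of the accuracy (separate X and Y values, as reported in Table 6) and the function names are our assumptions:

```python
import numpy as np

def point_of_regard(C, V_va, Q, n_scr):
    """Equation (14): intersection of the VA with the screen plane.
    C: corneal center; V_va: VA direction; Q: a point on the screen;
    n_scr: screen normal (all in the camera coordinate system)."""
    t = ((Q - C) @ n_scr) / (V_va @ n_scr)
    return C + t * V_va

def gaze_accuracy_deg(S_esti, S_real, D):
    """Equation (15): POR RMSE along one screen axis, as a visual angle.
    S_esti, S_real: length-N arrays of estimated/true POR coordinates
    along X or Y; D: user-to-screen distance in the same units."""
    rmse = np.sqrt(np.mean((np.asarray(S_esti) - np.asarray(S_real)) ** 2))
    return np.degrees(np.arctan(rmse / D))
```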
Table 6 lists the gaze accuracy of each subject calculated using the calibrated kappa angle. The proposed method achieves an average gaze accuracy of 1.3° in the horizontal direction and 1.34° in the vertical direction. Since some researchers approximate the OA as the 3D gaze [27,28], we also calculated the intersection of the OA and the screen (POA) and took the average of the POAs of the left and right eyes as the POR. The gaze accuracy was greatly improved by the proposed calibration method. In particular, when the signs of the horizontal angles (or vertical angles) of the left and right eyes are the same, averaging the POAs of the left and right eyes compensates the POR poorly: the POR is biased toward the POA with the greater error, resulting in lower accuracy. Therefore, it is necessary to consider the kappa angle to improve the gaze accuracy. The automatic kappa-angle-calibration method proposed in this paper achieves a gaze-estimation accuracy of better than 2°, which meets the requirements of general gaze-tracking applications.

4.2.2. Comparison with Existing Methods

As described in Section 2, there are two main methods for calibrating the kappa angle. One calibrates the transformation matrix between the OA and the VA using multiple groups of corresponding OA and VA directions obtained from multi-point user calibration, termed the matrix method; the other calibrates the horizontal and vertical components of the kappa angle using a single calibration point, termed the angular method. Both require explicit calibration points. In this section, we tested these two methods with the same experimental data; the results are shown in Table 7. For comparison, we also tested the use of constants for the kappa angles of the left and right eyes ($[\alpha_L, \beta_L, \alpha_R, \beta_R] = [5°, 1.5°, -5°, 1.5°]$), abbreviated as the constant method.
The gaze accuracy of the matrix method reaches 0.8° in the horizontal direction and 1.01° in the vertical direction. The matrix method is robust because it uses multiple known calibration points, and it transforms well when a large amount of data is available. However, the explicit calibration process it requires is relatively complicated, and it is time-consuming for users to look at several on-screen calibration points in turn. The gaze accuracy of the angular method reaches 1.21° in the horizontal direction and 1.04° in the vertical direction. The angular method uses a single calibration point for the VA transformation, which simplifies the explicit calibration process; however, to ensure gaze accuracy, it places certain requirements on the selection of the calibration point and on the robustness of the kappa-angle calculation. Although the constant method does not require explicit calibration, it ignores individual differences; in our experiments, its gaze accuracy in both the horizontal and vertical directions was worse than 2°. The proposed method essentially calibrates the kappa angle from multiple samples obtained through implicit calibration, which gives it strong robustness. Although its gaze accuracy is not as good as that of the matrix and angular methods, it is within an acceptable margin of error, as the accuracy of gaze tracking in the X and Y directions is usually 0.5°–2°. Eliminating explicit user calibration and realizing the instant use of gaze-tracking systems is a development trend in gaze tracking and will promote its wider application.

5. Conclusions

In this paper, an automatic calibration method for the kappa angle based on a binocular gaze constraint was proposed. The main idea is to use the coplanar constraint of the VAs of the left and right eyes to establish an optimal objective function of the kappa angles of both eyes, and to use the differential evolution algorithm to iterate toward the optimal kappa angle within its theoretical range. Computer simulations verified the feasibility of our method and showed that the accuracy of the kappa-angle calibration is closely related to the accuracy of the 3D-corneal-center estimation and OA reconstruction. The practical experiments showed that the calibrated kappa angle yields a gaze accuracy of 1.3° in the horizontal direction and 1.34° in the vertical direction, making our method significantly better than the method that does not consider the kappa angle. Although the gaze accuracy using the kappa angle auto-calibrated by the proposed method is lower than that obtained by explicit calibration methods, it is still within the acceptable range. Compared with explicit calibration, the automatic calibration of the kappa angle is more in line with the demand for instant use and provides a better user experience.
It should be noted that the proposed method has the following limitations: (1) We do not consider ethnic differences; the validation experiments were only conducted on a few young Asian subjects, and the applicability to other ethnic groups is currently beyond our scope. (2) For special populations, such as people with strabismus or amblyopia, the VAs of the left and right eyes may not be coplanar, which also limits the applicability of the proposed method. (3) The current method restricts head roll. The superposition of head and eyeball movements constitutes the spatial movement of the eyeball. When the head moves in translation, pitch, or yaw, the eyeball parameters change accordingly, but head roll generates a rotation component of the OA around itself. This rotation component cannot be characterized by analyzing the visual features of the eyeball or by calculating spatial points on the OA, yet it changes the relative position relationship between the OA and the VA, affecting the kappa-angle calibration.
Therefore, expanding upon this research, we are now focusing on the distribution law of the kappa angle under natural head movement, especially the influence of head roll on the kappa-angle calibration. We are attempting to construct head and eyeball postures from facial features to accurately characterize the OA of the eyeball, thereby ensuring the accuracy of the subsequent kappa-angle calibration, which is also the basis for accurate gaze estimation under natural head movement. In addition, analyzing the feasibility of automatic kappa-angle calibration under multiple factors, such as ethnicity, gender, age, and health status, and developing it into a universal model is also our future work.

Author Contributions

Conceptualization, J.L. and J.C.; methodology, J.L. and H.S.; software simulation, J.L.; experimental validation, J.L.; writing—original draft preparation, J.L. and H.S.; writing—review and editing, J.L. and J.C.; funding acquisition, J.L. and J.C. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded in part by the National Science Foundation for Young Scholars of China under grant 62206016, in part by the Beijing Municipal Natural Science Foundation under grant 4212023, in part by the Guangdong Basic and Applied Basic Research Foundation under grant 2022A1515140016, and in part by the Fundamental Research Funds for the Central Universities under grant FRF-GF-20-04A.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data presented in this study are available on request from the corresponding author. The data are not publicly available because they have not yet been organized and stored in a clear and manageable form.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Park, J.H.; Park, J.B. A novel approach to the low cost real time eye mouse. Comput. Stand. Interfaces 2016, 44, 169–176.
2. Qian, K.; Arichi, T.; Price, A.; Dall’Orso, S.; Eden, J.; Noh, Y.; Rhode, K.; Burdet, E.; Neil, M.; Edwards, A.D.; et al. An eye tracking based virtual reality system for use inside magnetic resonance imaging systems. Sci. Rep. 2021, 11, 427.
3. Wyder, S.; Hennings, F.; Pezold, S.; Hrbacek, J.; Cattin, P.C. With Gaze Tracking Toward Noninvasive Eye Cancer Treatment. IEEE Trans. Biomed. Eng. 2016, 63, 1914–1924.
4. Hu, Z.; Lv, C.; Hang, P.; Huang, C.; Xing, Y. Data-driven Estimation of Driver Attention using Calibration-free Eye Gaze and Scene Features. IEEE Trans. Ind. Electron. 2022, 69, 1800–1808.
5. Dierkes, K.; Kassner, M.; Bulling, A. A fast approach to refraction-aware eye-model fitting and gaze prediction. In Proceedings of the 2019 Symposium on Eye Tracking Research and Applications (ETRA ’19), Denver, CO, USA, 25–28 June 2019; pp. 1–9.
6. Zhu, Z.; Ji, Q. Novel Eye Gaze Tracking Techniques Under Natural Head Movement. IEEE Trans. Biomed. Eng. 2008, 54, 2246–2260.
7. Wan, Z.; Xiong, C.; Chen, W.; Zhang, H.; Wu, S. Pupil-Contour-Based Gaze Estimation With Real Pupil Axes for Head-Mounted Eye Tracking. IEEE Trans. Ind. Inform. 2022, 18, 3640–3650.
8. Guestrin, E.D.; Eizenman, M. General theory of remote gaze estimation using the pupil center and corneal reflections. IEEE Trans. Biomed. Eng. 2006, 53, 1124–1133.
9. Hennessey, C.; Lawrence, P. Noncontact binocular eye-gaze tracking for point-of-gaze estimation in three dimensions. IEEE Trans. Biomed. Eng. 2009, 56, 790–799.
10. Li, J.; Li, S. Gaze Estimation From Color Image Based on the Eye Model With Known Head Pose. IEEE Trans. Hum.-Mach. Syst. 2016, 46, 414–423.
11. Wang, K.; Ji, Q. Real time eye gaze tracking with Kinect. In Proceedings of the 2016 23rd International Conference on Pattern Recognition (ICPR), Cancun, Mexico, 4–8 December 2016; pp. 2752–2757.
12. Sun, L.; Liu, Z.; Sun, M.T. Real time gaze estimation with a consumer depth camera. Inf. Sci. 2015, 320, 346–360.
13. Zhou, X.; Cai, H.; Shao, Z.; Yu, H.; Liu, H. 3D eye model-based gaze estimation from a depth sensor. In Proceedings of the 2016 IEEE International Conference on Robotics and Biomimetics (ROBIO), Qingdao, China, 3–7 December 2016; pp. 369–374.
14. Nagamatsu, T.; Kamahara, J.; Tanaka, N. 3D Gaze Tracking with Easy Calibration Using Stereo Cameras for Robot and Human Communication. In Proceedings of the 17th IEEE International Symposium on Robot and Human Interactive Communication, Munich, Germany, 1–3 August 2008.
15. Nagamatsu, T.; Hiroe, M.; Arai, H. Extending the Measurement Angle of a Gaze Estimation Method Using an Eye Model Expressed by a Revolution about the Optical Axis of the Eye. IEICE Trans. Inf. Syst. 2021, 104, 729–740.
16. Liu, J.; Chi, J.; Lu, N.; Yang, Z.; Wang, Z. Iris Feature-Based 3-D Gaze Estimation Method Using a One-Camera-One-Light-Source System. IEEE Trans. Instrum. Meas. 2020, 69, 4940–4954.
17. Xiong, X.; Cai, Q.; Liu, Z.; Zhang, Z. Eye gaze tracking using an RGBD camera: A comparison with an RGB solution. In Proceedings of the 2014 ACM International Joint Conference on Pervasive and Ubiquitous Computing: Adjunct Publication, Seattle, WA, USA, 13–17 September 2014; pp. 1113–1121.
18. Wan, Z.; Wang, X.; Yin, L.; Zhou, K. A Method of Free-Space Point-of-Regard Estimation Based on 3D Eye Model and Stereo Vision. Appl. Sci. 2018, 8, 1769.
19. Nagamatsu, T.; Sugano, R.; Iwamoto, Y.; Kamahara, J.; Tanaka, N. User-calibration-free gaze tracking with estimation of the horizontal angles between the visual and the optical axes of both eyes. In Proceedings of the 2010 Symposium on Eye-Tracking Research and Applications, Austin, TX, USA, 22–24 March 2010; pp. 251–254.
20. Wang, K.; Ji, Q. 3D gaze estimation without explicit personal calibration. Pattern Recognit. 2018, 79, 216–227.
21. Zhou, X.; Cai, H.; Li, Y.; Liu, H. Two-eye model-based gaze estimation from a Kinect sensor. In Proceedings of the 2017 IEEE International Conference on Robotics and Automation (ICRA), Singapore, 29 May–3 June 2017; pp. 1646–1653.
22. Villanueva, A.; Cabeza, R. Evaluation of Corneal Refraction in a Model of a Gaze Tracking System. IEEE Trans. Biomed. Eng. 2008, 55, 2812–2822.
23. Lu, C.; Chakravarthula, P.; Liu, K.; Liu, X.; Li, S.; Fuchs, H. Neural 3D Gaze: 3D Pupil Localization and Gaze Tracking based on Anatomical Eye Model and Neural Refraction Correction. In Proceedings of the 2022 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), Singapore, 17–21 October 2022; pp. 375–383.
24. Model, D.; Eizenman, M. An automatic personal calibration procedure for advanced gaze estimation systems. IEEE Trans. Biomed. Eng. 2010, 57, 1031–1039.
25. Chi, J.; Yang, Z.; Zhang, G.; Liu, T.; Wang, Z. A Novel Multi-Camera Global Calibration Method for Gaze Tracking System. IEEE Trans. Instrum. Meas. 2020, 69, 2093–2104.
26. Liu, J.; Chi, J.; Hu, W.; Wang, Z. 3D Model-Based Gaze Tracking Via Iris Features With a Single Camera and a Single Light Source. IEEE Trans. Hum.-Mach. Syst. 2021, 51, 75–86.
27. Ohno, T.; Mukawa, N.; Yoshikawa, A. FreeGaze: A Gaze Tracking System for Everyday Gaze Interaction. In Proceedings of the 2002 Symposium on Eye Tracking Research & Applications, New Orleans, LA, USA, 25–27 March 2002; pp. 125–132.
28. Morimoto, C.H.; Amir, A.; Flickner, M. Detecting eye position and gaze from a single camera and 2 light sources. In Proceedings of the 2002 International Conference on Pattern Recognition, Quebec City, QC, Canada, 11–15 August 2002; pp. 314–317.
Figure 1. Kappa-angle representations. (a) Using a transformation matrix. (b) Using a horizontal angle and a vertical angle.
Figure 2. Procedure of the proposed method.
Figure 3. Visual-axis representation using kappa angle.
Figure 4. Binocular gaze constraint.
Figure 5. Iteration of kappa-angle calibration in the third round.
Figure 6. Distribution of gaze points.
Figure 7. Influence of 3D-corneal-center error on kappa-angle calibration.
Figure 8. Influence of OA error on kappa-angle calibration.
Figure 9. Experimental system.
Table 1. Kappa angles obtained from 10 calibrations.

Round   iter_num   α_L/°   β_L/°   α_R/°    β_R/°   fun_err
1       33         4.996   1.498   −5.003   1.499   4.9 × 10⁻¹⁰
2       30         5.003   1.506   −4.999   1.506   7.3 × 10⁻¹⁰
3       32         4.993   1.499   −5.007   1.499   8.9 × 10⁻¹⁰
4       33         5.006   1.496   −4.993   1.495   3.2 × 10⁻¹⁰
5       40         4.999   1.498   −5.000   1.498   4.1 × 10⁻¹⁰
6       35         5.006   1.499   −4.995   1.499   6.0 × 10⁻¹⁰
7       35         5.008   1.496   −4.991   1.495   5.0 × 10⁻¹⁰
8       35         5.001   1.502   −5.000   1.502   9.4 × 10⁻¹⁰
9       39         5.000   1.503   −5.001   1.503   8.2 × 10⁻¹⁰
10      27         5.014   1.496   −4.985   1.495   8.8 × 10⁻¹⁰
Table 2. Kappa angle calibrated from different sets of ground truth.

Ground truth/°   iter_num   α_L/°   β_L/°   α_R/°    β_R/°   fun_err
[5, 1.5]         46         5.001   1.499   −4.999   1.499   1.8 × 10⁻¹¹
[4, 1.5]         40         4.002   1.501   −3.998   1.501   8.0 × 10⁻¹¹
[6, 1.5]         41         6.002   1.501   −5.999   1.501   4.3 × 10⁻¹¹
[5, 1]           41         5.000   1.001   −5.001   1.001   7.7 × 10⁻¹¹
[5, 2]           45         5.000   2.000   −5.000   2.000   3.0 × 10⁻¹¹
[4, 0.5]         41         3.999   0.500   −4.001   0.500   1.5 × 10⁻¹¹
Table 3. Kappa-angle calibration results with different gaze angles.

Region   iter_num   α_L/°   β_L/°   α_R/°    β_R/°   fun_err
1        38         4.999   1.501   −5.000   1.501   2.4 × 10⁻¹¹
2        34         4.999   1.500   −5.000   1.500   7.3 × 10⁻¹²
3        40         4.999   1.503   −5.001   1.503   1.5 × 10⁻¹¹
4        39         4.998   1.498   −5.002   1.498   4.2 × 10⁻¹¹
Table 4. Influence of the amount of data on kappa-angle calibration.

Data amount   RMSE(α_L)/°   RMSE(β_L)/°   RMSE(α_R)/°   RMSE(β_R)/°
1             1.186         0.700         1.059         0.619
2             1.014         0.705         0.726         0.722
4             0.004         0.003         0.005         0.003
6             0.003         0.003         0.003         0.003
9             0.002         0.001         0.002         0.002
16            0.002         0.001         0.002         0.001
25            0.001         0.001         0.001         0.001
50            0.001         0.001         0.001         0.001
100           0.001         0.001         0.001         0.001
Table 5. Auto-calibrated kappa angles of different subjects.

Subject   α_L/°   β_L/°    α_R/°    β_R/°
1         2.03    2.79     4.41     0.76
2         2.05    0.66     −4.97    0.87
3         0.05    −4.88    −1.22    −2.84
4         0.63    −2.71    −2.91    0.60
Table 6. Gaze accuracy of different subjects.

Subject   Auto-calibrated kappa angle/°   Average of POAs/°
1         X: 0.82, Y: 1.29                X: 4.27, Y: 2.91
2         X: 2.05, Y: 1.53                X: 2.97, Y: 2.19
3         X: 1.61, Y: 1.82                X: 2.23, Y: 6.86
4         X: 0.71, Y: 0.71                X: 1.26, Y: 1.28
Average   X: 1.30, Y: 1.34                X: 2.68, Y: 3.31
Table 7. Comparison of different methods.

Method            Requirement                         Gaze accuracy/°
Matrix method     Multi-point explicit calibration    X: 0.80, Y: 1.01
Angular method    Single-point explicit calibration   X: 1.21, Y: 1.04
Constant method   No explicit calibration             X: 2.08, Y: 2.11
Proposed method   No explicit calibration             X: 1.30, Y: 1.34
