Article

Driving with a Haptic Guidance System in Degraded Visibility Conditions: Behavioral Analysis and Identification of a Two-Point Steering Control Model

1 Département Automatique Productique et Informatique, IMT Atlantique, F-44307 Nantes, France
2 LS2N, UMR 6004, CNRS, École Centrale Nantes, IMT Atlantique, Nantes Université, F-44000 Nantes, France
* Author to whom correspondence should be addressed.
Vehicles 2022, 4(4), 1413-1429; https://doi.org/10.3390/vehicles4040074
Submission received: 6 November 2022 / Revised: 21 November 2022 / Accepted: 12 December 2022 / Published: 15 December 2022
(This article belongs to the Special Issue Driver-Vehicle Automation Collaboration)

Abstract:
The objective of this study is to determine the ability of a two-point steering control model to account for the influence of a haptic guidance system in different visibility conditions. For this purpose, the lateral control of the vehicle was characterized in terms of driving performance as well as through the identification of anticipation and compensation parameters of the driver model. The hypothesis is that if the structure of the model is valid in the considered conditions, the value of the parameters will change in coherence with the observed behavior. The results of an experiment conducted on a driving simulator demonstrate that the identified model can account for the cumulative influence of the haptic guidance system and degraded visibility. The anticipatory gain is sensitive to changes in driving conditions that have a direct influence on the produced trajectory, and the compensatory gain is sensitive to a decrease in the variability of the lateral position. However, a model with only the steering wheel angle as output is not able to determine whether the change in lateral position variability is due to the driver’s lack of anticipation or to the assistance provided by the haptic guidance system.

1. Introduction

Advanced driver assistance systems for road vehicles have been intensively studied in recent years due to their potential to improve driving comfort and safety. Part of this research focuses on avoiding or reducing persistent issues in human–automation interaction. These issues are usually caused by a lack of effective communication between the driver and the assistance system. A specific mode of assistance, called haptic shared control, has been proposed and meets commonly formulated design guidelines for human–automation interaction, especially in automotive applications [1,2,3,4]. Applied to the steering control (i.e., haptic guidance system), an automation system continuously provides human drivers with additional assisting torque through the steering wheel. Because the steering wheel is controlled simultaneously by the human driver and automation system, it acts as a communication interface between them. In this case, the driver can feel the additional force generated by the automation system and can decide whether to follow the “optimal” control operated by the system or to override it if necessary. As such, shared control is distinct from systems that binarily switch authority between humans and machines. The benefits of haptic guidance systems have been observed in lane-keeping performance [5,6,7,8,9,10].
The effectiveness of haptic guidance systems depends on various aspects, including the design method and its parameters [11], the level of authority of the system [12], the driver’s age and driving experience [7,8], fatigue [10], and the driver’s reliance on the system [13]. In addition to system- or driver-related aspects, the influence of the driving environment, especially visibility conditions, is a key issue because of the predominant role of visual information in driving [14]. Indeed, several studies have already investigated the benefits of haptic guidance systems in degraded visibility conditions [15,16]. Driving performance was evaluated through behavioral analysis, which consists of statistical comparison of different metrics such as the steering wheel reversal rate, steering effort, and mean lateral position of the vehicle. It has been shown that a haptic guidance system is more effective in compensating for loss of nearby visual information than loss of far-off vision [15]. Moreover, a study varying the level of haptic authority of a guidance system has shown that the optimum distribution of control between the driver and the system depends on the visibility conditions, with drivers relying more on the system in the presence of fog [16].
To better understand how haptic guidance systems modify the lateral control of the vehicle as a function of visibility conditions, this study proposes to adopt a model-based analysis method in addition to behavioral analysis. Our objective is to link the results of the behavioral analysis to the parameter values of a driver steering model in order to understand the extent to which the model can account for the combined effect of haptic guidance and visibility on driving performance. Driver models are useful tools for understanding information processing in humans as well as for estimating the driver’s state. Recent work has shown, for example, that it is possible to discriminate different driver distraction modalities through parametric analysis of a steering control model [17,18]. In addition, these models can be used for the realization of controllers in haptic guidance systems [19]. Controllers based on driver models have many advantages, such as reduction of undesired steering conflicts with the guidance system, smooth authority transition, and consideration of the driver’s state [20,21,22,23,24,25,26,27,28,29,30].
The model chosen to meet the objective of this paper is similar to the two-point model proposed by [31], and is in line with [32]. It has subsequently been established that steering on a sinuous road can be summarized by the complementary action of two processes: the visual anticipation of the road’s curvature and the compensation of lateral positioning errors [16,31,32,33,34,35]. The anticipation process refers to the visual exploration of the road ahead, while the compensation process represents the task of maintaining an appropriate distance from the edges of the lane. The two-point model combines anticipation and compensation through a simple additive process, and predicts the steering wheel angle. This model has been validated by [31] in three different contexts. First, the two-point model can account for the effect of partial occlusion of the visual scene (driving with near and/or far vision) [33]. Furthermore, the model can predict human-like steering profiles for perturbation correction and lane change maneuvers.
Our assumption for the model-based analysis is that while the model structure remains valid in the presence of haptic guidance and reduced visibility, the parameter values will change accordingly. Specifically, it is assumed that as visibility is degraded, the steering wheel angle depends more on error compensation than on anticipation. Therefore, the hypothesis of a decrease in the anticipation gain and an increase in the compensation gain can be put forward. When driving with the haptic guidance system, the driver can rely on the system, which may facilitate lane keeping and smooth steering control. In this case, an increase in the anticipation gain and a decrease in the compensation gain may be observed. If there is an interaction between the two factors, which could have an antagonistic action on certain behavioral indicators, it will be interesting to examine how the model parameters capture this phenomenon.
In summary, the objective of this study is to verify the extent to which a simple and robust model such as the two-point steering model is likely, through its parameters, to “explain” the behavior of the driver-assistance system as a function of haptic guidance and visibility. Achieving this goal could help in the design of haptic guidance systems for better human–ADAS interaction. The remainder of this paper is organized as follows: Section 2 presents the way the experiment was conducted and a brief description of the haptic guidance system. The methods of data analysis, including identification of the model parameters, are described in Section 3. Section 4 presents the results obtained from the behavioral analysis and model parameter identification. Section 5 summarizes the effects of the experimental conditions on driving performance and the model parameters. Finally, Section 6 concludes with the predictive capability of the selected model and the future prospects for this work.

2. Experiment

2.1. Participants

A group of fifteen participants, eleven male and four female, took part in the experiment (mean age = 30.3 years, SD = 8.4 years). The participants were primarily recruited from students and staff of the Laboratory of Digital Sciences of Nantes (LS2N) and IMT Atlantique. All participants possessed a valid driver’s license with at least three years of driving experience. The participants had no known medical issues that could affect their driving skills and had normal or corrected-to-normal vision. None of them had ever experienced a haptic guidance system. The research was conducted in accordance with the standards of the CNRS ethical committee and with the 1964 Declaration of Helsinki. Informed consent was obtained from all individual participants involved in the study.

2.2. Independent Variables

Two independent variables were manipulated in this study according to our objectives: the visibility conditions of the driving environment (F) and the haptic guidance system (A). Visibility was manipulated by applying a thick fog to the driving scene (Figure 1, upper left and lower left), making it difficult for the driver to anticipate changes in road curvature. The visibility variable therefore has two levels: with fog (+F) or without fog (−F). The same applies to the haptic guidance system, which has two levels: with guidance (+A) and without guidance (−A). The combination of these two variables provides the following four experimental scenarios:
  • without fog, without haptic guidance (−F, −A);
  • without fog, with haptic guidance (−F, +A);
  • with fog, without haptic guidance (+F, −A);
  • with fog, with haptic guidance (+F, +A).
Figure 1. (a) Upper Left: Screenshot of scenario without fog; (b) Lower Left: Screenshot of scenario with fog; (c) Right: Fixed base driving simulator SCANeR.

2.3. Haptic Guidance System

The haptic guidance system employed in this study was developed by [26]. The total guidance torque Γ_a applied on the steering wheel is the sum of the anticipatory and compensatory assistance torques (Figure 2):
$$\Gamma_a = \Gamma_{a}^{ref} + \Gamma_{a}^{fb} \quad (1)$$
Figure 2. Design strategy of Haptic Guidance System.
The anticipatory assistance, that is, the feed-forward (FF) controller, is a trajectory generator that produces the reference vehicle–road model states and the torque control on the basis of the previewed road curvature ρ_previewed:
$$\begin{pmatrix} \Gamma_{ref} \\ x_{ref} \end{pmatrix} = K_{FF}(p)\,\rho_{previewed} \quad (2)$$
where K_FF(p) represents the transfer function of the trajectory generator. The applied FF torque is then determined by the sharing level of the anticipatory part: Γ_a^ref = α_ant Γ_ref.
The compensatory assistance, namely, the feedback controller, is a static output feedback obtained by an H₂/H∞ multi-objective control synthesis and determined by the sharing level of the compensatory part:
$$\Gamma_{a}^{fb} = \alpha_{comp}\, K_{fb}\,(x_{vr} - x_{ref}) \quad (3)$$
The sharing levels regulate the total haptic guidance torque. When the sharing level is highly in favor of automation (close to 100%, see [26]), the haptic guidance system can drive the vehicle by itself. In this study, both sharing levels were fixed at 50%, which provides clear haptic guidance, although without driver intervention the vehicle would eventually leave the lane when negotiating a curve.
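As a concrete illustration, the shared-control torque computation can be sketched as follows. This is a minimal scalar sketch, not the controller synthesized in [26]: the function name, the scalar gains, and the example signals are illustrative placeholders, and in the paper x_vr and x_ref are vehicle–road model state vectors.

```python
def guidance_torque(gamma_ref, x_vr, x_ref, K_fb, alpha_ant=0.5, alpha_comp=0.5):
    """Total haptic guidance torque: shared anticipatory feed-forward torque
    plus shared compensatory state-feedback torque (illustrative sketch).

    gamma_ref : feed-forward reference torque from the trajectory generator
    x_vr, x_ref : vehicle-road states and their references (same length)
    K_fb : static feedback gains, one per state
    """
    gamma_ff = alpha_ant * gamma_ref                     # anticipatory share
    gamma_fb = alpha_comp * sum(k * (xv - xr)            # compensatory share
                                for k, xv, xr in zip(K_fb, x_vr, x_ref))
    return gamma_ff + gamma_fb
```

With both sharing levels at 50%, as in the experiment, the system contributes half of each assistance component, leaving the remaining control authority to the driver.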

2.4. Apparatus

The experiment in this study was conducted on a fixed-base driving simulator powered by SCANeR Studio (Figure 1, right). It is equipped with a complete dashboard; a five-speed gear stick; gas, brake and clutch pedals; and a steering wheel connected to a TRW steering system. A torque sensor is mounted on the steering column to measure driver steering torque. Specifically, the sensor measures the torsion of the column on a specific section between the steering wheel and a motor, which is responsible for providing the torque feedback and the assistance torque. The estimation of the torque applied to the steering wheel by the driver was obtained after subtracting the torque feedback (auto-alignment torque and steering column emulation) and assistance torque (if any) from the measured net torque, with a small inaccuracy due to residual friction. The visual scene is displayed on three LCD screens: a central one in front of the driver and two others oriented at 45° relative to the center. The screens cover a field of view of 25° in height and 115° in width. A small family car, the Citroën C5, was chosen as the vehicle model in this experiment.

2.5. Scenarios

The track used for the four experimental scenarios is a two-lane road with 3.5 m wide lanes, shown in Figure 3. It consists of eleven bends of various lengths (from 150 m to 650 m) and radii of curvature (from 50 m to 200 m). All bends are Euler spirals with continuous changes in curvature. The longitudinal speed of the vehicle was set at 64 km/h (18 m/s). This speed was chosen according to the curvature of the bends in the track and the fog density to ensure that all bends could be negotiated. A brief introduction to each scenario, including the operation of the haptic guidance system, was provided to the drivers at the beginning of the experiment. The participants were instructed to drive in the right-hand lane without making any lane changes. The order of the four scenarios was randomized between participants. Each scenario lasted 10 min, which is equivalent to nearly two laps of the track at the chosen speed. Finally, participants were allowed to take a short break between scenarios.

3. Data Analysis Methods

3.1. Driving Metrics

3.1.1. Steering Performance

Steering performance was assessed using the steering wheel reversal rate (SWRR). This measure represents the frequency of changes in the direction of steering wheel rotation by a magnitude greater than a given threshold angle, called the gap; it is one of the most commonly used measures of steering performance [36,37]. The algorithm proposed in [37] with a gap of 2° was adopted here. The SWRR is expected to decrease with haptic guidance and to increase with fog.
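A minimal sketch of a gap-based SWRR computation in the spirit of the algorithm described above (the exact algorithm of [37] differs in detail; the function name and sample signal are illustrative):

```python
def swrr_per_minute(angles_deg, sample_rate_hz, gap_deg=2.0):
    """Count direction reversals of the steering wheel angle whose magnitude
    exceeds gap_deg, normalized per minute (illustrative sketch)."""
    reversals = 0
    extremum = angles_deg[0]   # last local extremum of the signal
    direction = 0              # +1 rising, -1 falling, 0 unknown yet
    for a in angles_deg[1:]:
        if direction >= 0 and a >= extremum:
            extremum, direction = a, +1       # still rising
        elif direction <= 0 and a <= extremum:
            extremum, direction = a, -1       # still falling
        elif abs(a - extremum) >= gap_deg:
            reversals += 1                    # reversal larger than the gap
            direction = -direction
            extremum = a
    minutes = len(angles_deg) / sample_rate_hz / 60.0
    return reversals / minutes
```

Small oscillations below the gap are ignored, so only steering corrections of meaningful amplitude are counted.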

3.1.2. Lane-Keeping Performance

Lane-keeping performance was evaluated using the deviation y_a of the lateral position of the vehicle’s center of gravity from the center of the lane (in meters). The mean and the standard deviation of y_a were chosen as metrics. The deviation was considered positive if the vehicle deviated from the lane center towards the inner edge of the bend, so that the mean provides information on the driver’s cornering behavior independently of the direction of the curve; conversely, it was negative if the deviation was towards the outside of the bend. Because the simulator references the lateral position to the center of the lane, the adjustment shown in (4) and (5) was applied. The calculation of the standard deviation of the lateral position (SDLP) is based on the non-adjusted deviation y_a[k]. The mean ȳ_a is expected to decrease with haptic guidance; it might also decrease if the driver adopts a more conservative path-planning behavior with less corner-cutting. The SDLP is expected to decrease with both haptic guidance and fog.
For a deviation of the lateral position signal vector y_a[k], k = 1, 2, …, N, the adjusted mean is
$$\bar{y}_a = \frac{1}{N}\sum_{k=1}^{N} y'_a[k] \quad (4)$$
where y′_a[k] is the adjusted deviation of the lateral position
$$y'_a[k] = \begin{cases} y_a[k], & \text{if } \rho[k] \ge 0 \\ -y_a[k], & \text{if } \rho[k] < 0 \end{cases} \quad (5)$$
with ρ[k] denoting the sampled road curvature vector.
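The adjustment in (4) and (5) can be sketched directly (an illustrative helper, not the authors’ code):

```python
def adjusted_mean_lateral_deviation(y_a, rho):
    """Flip the sign of the lateral deviation in left-hand bends
    (negative curvature) so that a positive value always means
    'towards the inner edge of the bend', then average."""
    y_adj = [y if r >= 0 else -y for y, r in zip(y_a, rho)]
    return sum(y_adj) / len(y_adj)
```

With this sign convention, corner-cutting in left and right bends no longer cancels out in the mean.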

3.1.3. Driver Control Effort

The driver control effort is evaluated through the torque Γ_d applied by the driver on the steering wheel. The L₂-norm, which is equivalent to the signal energy, is chosen as the metric (‖Γ_d‖₂). The calculation is as follows:
$$\|\Gamma_d\|_2 = \sum_{k=1}^{N} \Gamma_d^2[k] \quad (6)$$
where N is the number of samples. ‖Γ_d‖₂ is expected to decrease with haptic guidance.

3.2. Model Identification

The model chosen in this study is shown in Figure 4a. It is a two-point visual model inspired by [18,31,32,38,39]. In this model, the steering wheel angle δ_SW is a combination of anticipatory and compensatory behavior in a lateral control task. The anticipatory part represents the behavior of looking far ahead to determine the road direction, and the compensatory part corresponds to the use of nearby visual information to maintain the vehicle in the lane. Two points on the road, a far and a near point, are chosen to represent the areas where the visual information is acquired. The angles between the direction of these points and the heading of the vehicle, defined as the far-point angle θ_far and the near-point angle θ_near, respectively, are used as inputs to the model. The calculation of these inputs is shown in Figure 4b and in (7). For the far-point angle, it is assumed that the vehicle heading is aligned with the road. For the near-point angle, it is assumed that the line segment y_L is perpendicular to the vehicle heading. Here, y_L is the lateral position error at distance l_s in front of the vehicle, which is directly available from the vehicle–road (VR) model (see [26]). The model parameters are described in Table 1.
$$\theta_{far} \approx D_{far}\,\rho, \qquad \theta_{near} \approx y_L / l_s \quad (7)$$
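A minimal illustration of (7); the look-ahead distance D_far and the near-point distance l_s below are placeholder values for the sketch, not the tuned values of the study:

```python
def visual_angles(rho, y_L, D_far=15.0, l_s=5.0):
    """Far-point and near-point angles (rad) from road curvature rho (1/m)
    and near lateral error y_L (m). D_far and l_s are illustrative
    look-ahead distances in meters."""
    theta_far = D_far * rho   # angle subtended by the far point
    theta_near = y_L / l_s    # small-angle approximation at the near point
    return theta_far, theta_near
```

Both relations rely on small-angle approximations, which hold for the moderate curvatures and lateral errors encountered in lane keeping.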
By approximating the delay block with a first-order Padé model, a minimal realization of the two-point visual model shown in Figure 4a can be written as follows:
$$\dot{x}(t) = A\,x(t) + B\,u(t), \qquad y(t) = C\,x(t) + D\,u(t) \quad (8)$$
with
$$y(t) = \delta_{SW}(t), \qquad u(t) = \begin{pmatrix} \theta_{far}(t) \\ \theta_{near}(t) \end{pmatrix}, \qquad x(t) = \begin{pmatrix} x_1(t) \\ x_2(t) \end{pmatrix}$$
$$A = \begin{pmatrix} -\dfrac{1}{T_I} & 0 \\[4pt] -\dfrac{2}{\tau_p} K_c v \left(\dfrac{T_L}{T_I} - 1\right) & -\dfrac{2}{\tau_p} \end{pmatrix}, \qquad B = \begin{pmatrix} 0 & \dfrac{1}{T_I} \\[4pt] \dfrac{2}{\tau_p} K_p & \dfrac{2}{\tau_p} K_c v \dfrac{T_L}{T_I} \end{pmatrix}$$
$$C = \begin{pmatrix} K_c v \left(\dfrac{T_L}{T_I} - 1\right) & 2 \end{pmatrix}, \qquad D = \begin{pmatrix} -K_p & -K_c v \dfrac{T_L}{T_I} \end{pmatrix}$$
The model identification aims to estimate the parameter values by the prediction error minimization (PEM) method [40] using measured input and output data from the experiment. The system identification toolbox in MATLAB is used to compute the identification results. The criterion is as follows:
$$J = \frac{1}{N}\sum_{k=1}^{N} e^2[k] \quad (9)$$
where e[k] represents the difference between the measured output and the predicted output of the model (Figure 5):
$$e[k] = \delta_{SW}[k] - \hat{\delta}_{SW}[k] \quad (10)$$
In this study, the visual anticipatory and compensatory gains (K_p and K_c) are chosen as the parameters to be identified, and the other parameters are fixed at their nominal values (the longitudinal speed is fixed at 18 m/s; see Section 2.5). Although the concepts of anticipation and compensation have certain commonalities with the strategy of the haptic guidance system, it is beyond the scope of this paper to evaluate the relevance of this control architecture, as the sharing levels of the anticipatory and compensatory controllers were not manipulated. In addition, the two-point visual model represents a combination of the dynamics of the lateral control and steering wheel (LC-SW) system in Figure 5. This LC-SW system covers both human driver behavior and haptic guidance behavior, including the extreme cases of the human or the haptic guidance driving alone. By identifying the parameters, we aim to project the input–output relationship of the LC-SW system onto the visual model in order to assess the contribution of each input signal to the output signal. In other words, if we obtain a decrease in the anticipatory gain or an increase in the compensatory gain, the steering wheel angle (δ_SW) is determined more by the compensatory than by the anticipatory part. Such results can provide an idea of how the steering control strategy changes according to the conditions.
By defining the parameter vector ξ = (K_p  K_c)^T, we have
$$\hat{\xi} = \arg\min_{\xi} J(\xi) \quad (11)$$
$$\phantom{\hat{\xi}} = \arg\min_{\xi} \frac{1}{N}\sum_{k=1}^{N}\left(\delta_{SW}[k] - \hat{\delta}_{SW}[k;\xi]\right)^2 \quad (12)$$
Here, the notation δ̂_SW[k; ξ] expresses that the predicted output is a function of the parameter vector ξ. According to Figure 4a, it can be written as
$$\hat{\delta}_{SW}[k;\xi] = \xi^T U[k] \quad (13)$$
where U[k] is
$$U[k] = \begin{pmatrix} \theta_{far}(t_k) \\[4pt] \dfrac{T_L\,v}{T_I}\,\theta_{near}(t_k) + \dfrac{(T_I - T_L)\,v}{T_I^2}\displaystyle\int_{0}^{t_k} \theta_{near}(\tau)\, e^{-\frac{1}{T_I}(t_k - \tau)}\, d\tau \end{pmatrix}$$
and t_k = kT_s − τ_p, with T_s denoting the sampling period. Substituting (13) into (12), the criterion becomes the loss function of a linear regression problem; therefore, the optimization is convex and there is only one global solution for ξ.
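Because (12) with (13) is an ordinary least-squares problem in ξ, the two gains can be recovered in closed form from the 2 × 2 normal equations. The study used the MATLAB system identification toolbox; the pure-Python sketch below (illustrative names, synthetic data in the usage example) only serves to make the convexity argument concrete:

```python
def identify_gains(U, delta_sw):
    """Solve min_xi sum_k (delta_sw[k] - xi^T U[k])^2 for xi = (K_p, K_c)
    via the 2x2 normal equations (closed-form least squares).

    U : sequence of regressor pairs (u1, u2) per sample
    delta_sw : measured steering wheel angles per sample
    """
    s11 = s12 = s22 = b1 = b2 = 0.0
    for (u1, u2), d in zip(U, delta_sw):
        s11 += u1 * u1; s12 += u1 * u2; s22 += u2 * u2   # U^T U entries
        b1 += u1 * d;  b2 += u2 * d                      # U^T y entries
    det = s11 * s22 - s12 * s12
    kp = (s22 * b1 - s12 * b2) / det    # Cramer's rule, anticipatory gain
    kc = (s11 * b2 - s12 * b1) / det    # compensatory gain
    return kp, kc
```

Since the loss is a positive-definite quadratic in ξ (as long as the regressors are not collinear), this closed-form solution is the unique global minimizer, exactly as stated in the text.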

3.3. Validation of Identified Model

Before comparing the values of the identified parameters, each model obtained for every participant must be validated by verifying the model fit and parameter uncertainties. The fit percentage is calculated as follows:
$$FIT = \left(1 - \frac{\left\|\delta_{SW} - \hat{\delta}_{SW}\right\|_2}{\left\|\delta_{SW} - m_{\delta_{SW}}\right\|_2}\right) \times 100\% \quad (14)$$
where m_{δ_SW} = (1/N) Σ_{k=1}^{N} δ_SW[k] is the arithmetic mean of the measured steering wheel angle and δ̂_SW is the predicted steering wheel angle. The value of FIT varies between −∞ (worst) and 100% (best).
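The fit percentage above (the normalized root-mean-square error fit also reported by MATLAB’s identification tools) can be sketched as follows (illustrative helper):

```python
import math

def fit_percentage(measured, predicted):
    """NRMSE fit: 100% is a perfect prediction; a model no better than
    the mean of the measurement scores 0%, and worse models go negative."""
    m = sum(measured) / len(measured)
    num = math.sqrt(sum((y - yh) ** 2 for y, yh in zip(measured, predicted)))
    den = math.sqrt(sum((y - m) ** 2 for y in measured))
    return (1.0 - num / den) * 100.0
```

A constant prediction equal to the signal mean gives exactly 0%, which clarifies why values above 70%, as reported later, indicate a substantial share of the steering signal is explained.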
The magnitude of the parameter uncertainties provides a measure of the model’s reliability. When model parameters are estimated from data, the estimates are precise within a confidence region that can be assessed. The size of this region is determined by the parameter uncertainties computed during the estimation, and it is important to verify this information before exploiting the values of the identified parameters. Supposing an estimated parameter vector θ̂(M) = (K̂_p  K̂_c)^T and its limit θ* (the “true” parameter vector), under certain conditions the following holds [40]:
$$\hat{\theta}(M) - \theta^* \xrightarrow{As} \mathcal{N}(0, Q_\theta) \quad (15)$$
where M is the number of data samples and Q_θ is the asymptotic covariance matrix of the estimate. This means that the distribution of the estimation error θ̂(M) − θ* converges asymptotically to the normal distribution N(0, Q_θ) as M tends to infinity. In addition, we obtain the following:
$$\left(\hat{\theta}(M) - \theta^*\right)^T Q_\theta^{-1}\left(\hat{\theta}(M) - \theta^*\right) \sim \chi^2(d) \quad (16)$$
where d = dim θ̂(M) = 2 (two parameters are to be identified) and χ²(d) is the chi-square distribution with d degrees of freedom. A confidence level α (e.g., 90%) can then be chosen to obtain a confidence region:
$$P\left[\left(\hat{\theta}(M) - \theta^*\right)^T Q_\theta^{-1}\left(\hat{\theta}(M) - \theta^*\right) \le \chi_\alpha^2(d)\right] = \alpha \quad (17)$$
with χ_α²(d) the α-level of the χ²(d) distribution. This equation indicates that, with probability α, the “true” parameter vector lies inside an ellipsoid (an ellipse in our case) defined in the R^d space, whose center is θ̂(M) and whose axes are determined by the eigenvalues and eigenvectors of χ_α²(d) Q_θ. For each participant, we verified whether any intersection exists between the model ellipses from scenarios i and j by defining the value d_ij, the difference between the Euclidean distance of the centers of the two ellipses and the sum of the two semi-major axes:
$$d_{ij} = \left\|\hat{\theta}_i - \hat{\theta}_j\right\|_2 - (\lambda_i + \lambda_j) \quad (18)$$
where λ_i and λ_j are the semi-major axes obtained from the largest eigenvalues of the matrices χ_α²(d) Q_θi and χ_α²(d) Q_θj, respectively. No intersection exists if d_ij > 0, which indicates that the two models are statistically different; in that case, any difference between the identified parameters is likely caused by the independent experimental variables rather than by the parameter uncertainties.
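A sketch of the overlap test defining d_ij; here the semi-major axis is taken as the square root of the largest eigenvalue of χ²_α(d)·Q_θ, the usual convention for a confidence ellipse (the function names and the flat 2 × 2 covariance layout are illustrative):

```python
import math

def eigmax_2x2(a, b, c, d):
    """Largest eigenvalue of the 2x2 matrix [[a, b], [c, d]] in closed form."""
    tr, det = a + d, a * d - b * c
    return tr / 2 + math.sqrt(max(tr * tr / 4 - det, 0.0))

def ellipse_separation(theta_i, theta_j, Qi, Qj, chi2_alpha):
    """d_ij: distance between the two ellipse centers minus the sum of the
    semi-major axes. d_ij > 0 is sufficient (not necessary) for no overlap.
    Qi, Qj are flat (q11, q12, q21, q22) covariance entries."""
    dist = math.hypot(theta_i[0] - theta_j[0], theta_i[1] - theta_j[1])
    li = math.sqrt(eigmax_2x2(*(chi2_alpha * q for q in Qi)))
    lj = math.sqrt(eigmax_2x2(*(chi2_alpha * q for q in Qj)))
    return dist - (li + lj)
```

The test is conservative: it encloses each ellipse in a disk of radius equal to its semi-major axis, which is why a negative d_ij does not necessarily imply overlapping confidence regions.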

3.4. Summary Diagram

Figure 5 illustrates how the data analysis was conducted in this study. The lateral control process is represented by white boxes. The human driver performs the task with or without the haptic guidance system through the steering column and generates the steering wheel angle as the vehicle input. The vehicle interacts with the road and provides ad hoc information to the driver and the haptic guidance system. The “Signal Generation” block represents the computational interface, which uses the state of the vehicle–road model to format, or even estimate, the signals input to the driver model and the haptic guidance system (see Figure 2 and [26]). The performance of the lateral control task is evaluated using the metrics in the green boxes and the identified parameters of the LC-SW system in the red box.

4. Results

All data from each scenario were used to calculate the metrics and to identify the two model parameters. To represent the distribution across participants, the results are presented as box plots. The red bar is the median and the blue triangle is the mean. The blue box spans the first to the third quartile. The whiskers extend to the minimum and maximum data points not beyond the interquartile range, and the red crosses are outliers.
A repeated-measures analysis of variance (ANOVA) with two within-subject factors was used to evaluate the effects of the visibility (F) and haptic guidance (A). The differences between the degrees of the independent variables were considered statistically significant for p-values less than 0.05.

4.1. Steering Performance

The SWRR per minute with a gap size of 2° indicates a significant main effect for both F (F(1,14) = 69.92, p < 0.001, η_p² = 0.833) and A (F(1,14) = 14.42, p < 0.005, η_p² = 0.507), as shown in Figure 6. The SWRR is significantly higher with fog (+F) than without fog (−F). With haptic guidance (+A), the SWRR is significantly lower than without haptic guidance (−A). No significant interaction effect exists between F and A (F(1,14) = 2.7087, p = 0.12, η_p² = 0.162).

4.2. Lane-Keeping Performance

The mean adjusted deviation of the lateral position (ȳ_a) shows a significant main effect for A (F(1,14) = 10.73, p < 0.01, η_p² = 0.434) and not for F (F(1,14) = 3.30, p = 0.091, η_p² = 0.191) (Figure 7a). With haptic guidance (+A), ȳ_a is significantly lower than without haptic guidance (−A), indicating that curve-cutting behavior was reduced. No significant interaction effect exists between F and A (F(1,14) = 0.49, p = 0.49, η_p² = 0.034), reflecting the lower ȳ_a with haptic guidance in both visibility conditions.
The SDLP shows a significant main effect for both F (F(1,14) = 6.06, p < 0.05, η_p² = 0.301) and A (F(1,14) = 10.23, p < 0.01, η_p² = 0.422) (see Figure 7b). In the scenarios with fog (+F) or with haptic guidance (+A), the SDLP is significantly lower than in the scenarios without fog (−F) or without haptic guidance (−A). No significant interaction effect exists between F and A (F(1,14) = 0.0005, p = 0.98, η_p² ≈ 0). Thus, the SDLP is significantly lower in the presence of either fog or haptic guidance.

4.3. Driver Control Effort

The L₂-norm of the driver steering wheel torque ‖Γ_d‖₂ indicates a significant main effect for A (F(1,14) = 562.89, p < 0.001, η_p² = 0.976) and not for F (F(1,14) = 1.34, p = 0.27, η_p² = 0.087), as shown in Figure 8. With haptic guidance (+A), ‖Γ_d‖₂ is significantly lower than without haptic guidance (−A); its mean value is about half of that without haptic guidance, which corresponds to the sharing levels defined in Section 2.3. No significant interaction effect exists between F and A (F(1,14) = 3.37, p = 0.088, η_p² = 0.194). Hence, the driver’s steering wheel torque energy is significantly lower with haptic guidance in all visibility conditions.

4.4. Identified Model Validation

The FIT value of each identified model indicates a significant main effect for F (F(1,14) = 30.44, p < 0.001, η_p² = 0.685) and not for A (F(1,14) = 0.15, p = 0.70, η_p² = 0.011), as shown in Figure 9a. No significant interaction effect exists between F and A (F(1,14) = 3.84, p = 0.07, η_p² = 0.215). Although the FIT is significantly lower with fog (+F) than without fog (−F), almost all FIT values are above 70%, which is acceptable in terms of model validation. Figure 9b shows an example comparison between the measured steering wheel angle and the predicted output of an identified model.
The d_ij values indicating intersection between the model parameter confidence ellipses are shown in Table 2; see Section 2.2 for the enumeration of the scenarios. Because there are four scenarios, there are six pairwise comparisons for each participant. The table shows that most identified models differed from each other, with several exceptions (negative values) of relatively small magnitude. In addition, a positive d_ij is a sufficient, though not necessary, condition for the absence of intersection between ellipses. As an example, Figure 9c shows the worst case, that of Participant No. 3: although d_13 (red vs. blue ellipse) and d_24 (green vs. black ellipse) are negative, the confidence regions do not overlap. Consequently, the observed parameter variations can be attributed to the experimental manipulations rather than to the parameter uncertainties.

4.5. Anticipatory and Compensatory Gain

The identified anticipatory gain (K_p) shows a significant main effect for A (F(1,14) = 29.03, p < 0.001, η_p² = 0.675) and not for F (F(1,14) = 2.23, p = 0.16, η_p² = 0.138), as shown in Figure 10a. With haptic guidance (+A), the anticipatory gain is significantly higher than without haptic guidance (−A). No significant interaction effect exists between F and A (F(1,14) = 3.19, p = 0.096, η_p² = 0.186). Thus, the anticipatory gain is significantly higher with haptic guidance in all visibility conditions.
The identified compensatory gain (K_c) shows a significant main effect for both F (F(1,14) = 12.88, p < 0.01, η_p² = 0.479) and A (F(1,14) = 5.54, p < 0.05, η_p² = 0.283), as shown in Figure 10b. With either fog (+F) or haptic guidance (+A), the compensatory gain is significantly higher than without fog (−F) or without haptic guidance (−A). No significant interaction effect exists between F and A (F(1,14) = 1.15, p = 0.30, η_p² = 0.076). Thus, the compensatory gain is significantly higher in the presence of either fog or haptic guidance.

5. Discussion

The results demonstrate that the effects of assistance and fog on the considered indicators were cumulative, without any interaction. This result differs from that reported by [16], who showed that drivers could benefit more from haptic shared control in conditions of reduced visibility. The difference between the studies may be explained by the different control strategies used in the two cases to achieve shared control, or by a difference in the difficulty of the lane-keeping task: the track in our study was more demanding, with more variation in road curvature and certain sections of higher curvature. Regardless, it appears that the drivers benefited as much from haptic shared control in good visibility conditions as in the fog. The following discussion focuses first on the main effect of fog, then considers haptic shared control separately; finally, we propose a synthesis of the results.

5.1. Effect of Fog

Fog reduces a driver's ability to anticipate. This can result in more short-term corrections at the steering wheel and increased safety margins. In addition, fog can sometimes make it more difficult to keep the vehicle on the desired trajectory [31,33,35]. Frissen and Mars [35] manipulated near and far vision artificially by applying a mask to the visual scene and controlling its degree of opacity. They showed that steering control was robust up to 60% degradation of far vision and became less stable starting at 80%. It is difficult to assess the degree of opacity of the fog used in our study; in any case, its effect would additionally depend on vehicle speed and road profile. In practice, the density of the fog was determined empirically such that it was difficult, though possible, to anticipate changes in road curvature at the selected speed. The results show that no major difficulties were observed. The value of ȳ_a was almost the same in both cases (i.e., with and without fog), indicating that the driving trajectory remained similar. In contrast, an increase in SWRR and a decrease in SDLP were observed. The effect of the fog was therefore limited to an increase in the frequency of small corrective movements at the steering wheel, resulting in a slight decrease in the variability of the lateral position. Consistently, the parametric identification of the model captured this increase in compensatory behavior through the increase in the compensatory gain (K_c). The anticipatory gain (K_p) remained stable, as there was no significant change in the path taken by the participants.
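The SWRR indicator counts changes of steering direction whose amplitude exceeds a gap threshold (2° in this study) and normalizes the count per minute. The sketch below is a simplified reversal counter, assuming a uniformly sampled steering-angle signal; the published metric [36] includes additional signal conditioning not reproduced here:

```python
def swrr_per_minute(angle_deg, fs, gap_deg=2.0):
    """Count steering direction changes larger than `gap_deg` (deg),
    normalized per minute of a signal sampled at `fs` Hz."""
    reversals = 0
    ext = angle_deg[0]   # running extremum in the current direction
    rising = None        # None until an initial direction emerges
    for a in angle_deg[1:]:
        if rising is None:
            if a > ext + gap_deg:
                rising, ext = True, a
            elif a < ext - gap_deg:
                rising, ext = False, a
        elif rising:
            if a > ext:
                ext = a
            elif a < ext - gap_deg:   # turned down by more than the gap
                reversals += 1
                rising, ext = False, a
        else:
            if a < ext:
                ext = a
            elif a > ext + gap_deg:   # turned up by more than the gap
                reversals += 1
                rising, ext = True, a
    minutes = len(angle_deg) / fs / 60.0
    return reversals / minutes

# Example: a triangle-like signal sampled at 1 Hz contains two clear
# reversals in 16 s, i.e., 7.5 reversals per minute
sig = [0, 1, 2, 3, 4, 5, 4, 3, 2, 1, 0, 1, 2, 3, 4, 5]
print(swrr_per_minute(sig, fs=1.0))  # -> 7.5
```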

5.2. Effect of Haptic Guidance

The purpose of haptic shared control is to facilitate steering control by delegating effort to the system while providing gesture guidance to the driver. As in previous work [2], haptic guidance in this study reduced the driver's control effort, as observed through Γd². With haptic guidance, the control effort produced by the drivers was half the effort required without it, which corresponds to the chosen sharing level. Although validating the control strategy used in this study was not our main objective, the results demonstrate its relevance. The reduction in effort was accompanied by smoother steering wheel control, as evidenced by a slight decrease in SWRR. Haptic guidance helped to keep the lateral position of the vehicle closer to the center of the lane even though the participants continued to cut corners, especially when they had sufficient visual information for anticipation. In other words, the participants essentially followed the haptic guidance provided by the system, so the produced trajectories were more consistent with the curvature of the road and less variable. The results of the parametric identification were consistent with these observations: the new trajectory profile, which conforms more closely to the road curvature, resulted in a net increase in the anticipatory gain (K_p), while the smoother control increased the compensatory gain (K_c).
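The effort indicator Γd² can be read as a measure of the energy of the driver's steering torque. Its exact formula is not restated in this section; the sketch below assumes the common definition as the integral of the squared torque, and the `manual`/`shared` samples are purely hypothetical:

```python
def torque_energy(torque, fs):
    """Integral of the squared driver torque over a run sampled at
    `fs` Hz (assumed definition; the paper may normalize differently)."""
    return sum(t * t for t in torque) / fs

# Hypothetical driver torque samples (N.m). Note that if the assistance
# halves the torque the driver produces at every instant, this energy
# measure is quartered; halving the *energy* instead corresponds to a
# torque ratio of 1/sqrt(2).
manual = [1.0, -0.8, 0.6, -1.2]
shared = [t / 2.0 for t in manual]
print(torque_energy(manual, 100.0), torque_energy(shared, 100.0))
```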

5.3. Synthesis

Having considered the effects of both fog and haptic guidance, we can now synthesize the relationship between driving performance and the identified parameters of the LC-SW system. As mentioned above, the anticipatory gain parameter represents the visual anticipation of changes in road curvature. One might logically expect this parameter to be particularly sensitive to any experimental manipulation that prevents the road from being correctly anticipated (i.e., the introduction of fog into the visual scene). However, this was not the case in this study: in the end, the fog did not change the trajectory followed by the vehicle. Haptic guidance, by contrast, had a significant influence. By reducing the drivers' propensity to cut corners, even very slightly, it brought trajectories closer to the road profile, and the anticipatory gain proved to be very sensitive to this. It is therefore reasonable to conclude that the anticipatory gain represents visual anticipation only to the extent that this anticipation defines the followed trajectory.
The compensatory gain parameter represents the online compensation of deviations in lateral position during driving. We expected it to increase as the SDLP decreased. This was the case in our study, whether due to fog, haptic guidance, or both, in which case the effect was cumulative. However, the reduction in the variability of the lateral position is not of the same nature under the two conditions: in the case of fog, the reduced variability is the consequence of an increase in SWRR, whereas in the case of haptic guidance it follows from a decrease in SWRR. In other words, the compensatory gain proved to be very sensitive to a decrease in the variability of the lateral position, whether that decrease resulted from more corrections compensating for a reduced capability for visual anticipation or from the smoother steering wheel control induced by haptic guidance.

6. Conclusions

The objective of this work was to verify the extent to which a simple and robust model such as the two-point model can "explain", through its parameters, the behavior of the driver-assistance system as a function of the characteristics of the haptic guidance or the visibility conditions. To achieve this goal, this study evaluated the performance of the human-machine system by combining two types of indicators: the usual metrics used to evaluate driving performance, and the values resulting from the identification of a steering-control model. As a first approach, we chose a simple two-parameter model that accounts for visual steering control. We conclude that a two-point visual model can capture the effect of certain driving conditions very well, particularly those that influence the produced trajectory. However, such a model considers only the steering wheel angle as an output for the performance evaluation. Therefore, it cannot independently distinguish between the actions of the human driver and those of the haptic guidance system on the steering wheel. In further work, we aim to use more comprehensive models, including models incorporating a neuromuscular system and using torque as the output [18,38,39]. It would be interesting to repeat this experiment with different fixed vehicle speeds, or even to let drivers control the vehicle speed, in order to assess the response of the model parameters under these conditions. Moreover, other control settings, such as different sharing levels, could be tested to help generalize the validity of our conclusions.
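The identification principle can be illustrated in its simplest form: if the steering output were a static combination of the far-point and near-point visual angles, the two gains could be recovered by least squares. The sketch below deliberately ignores the lead-lag compensation, processing delay, and output-error structure used in the actual identification [40]; all signals and gain values are synthetic and purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Synthetic visual inputs: far-point angle (anticipation) and
# near-point angle (compensation), in radians
theta_far = rng.normal(scale=0.05, size=n)
theta_near = rng.normal(scale=0.05, size=n)

# Generate a steering-wheel angle from known (hypothetical) gains,
# plus measurement noise
Kp_true, Kc_true = 12.0, 7.0
delta_sw = (Kp_true * theta_far + Kc_true * theta_near
            + rng.normal(scale=0.01, size=n))

# Recover the two gains by least squares
X = np.column_stack([theta_far, theta_near])
(Kp_hat, Kc_hat), *_ = np.linalg.lstsq(X, delta_sw, rcond=None)
print(Kp_hat, Kc_hat)  # close to the true gains (12.0, 7.0)
```

In the full dynamic case, the same idea applies but the regression is replaced by an output-error identification of the transfer functions parameterized by K_p, K_c, T_I, T_L, and τ_p.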

Author Contributions

Conceptualization, Y.Z., P.C. and F.M.; methodology, Y.Z.; software, Y.Z.; validation, P.C., F.C. and F.M.; formal analysis, Y.Z.; investigation, Y.Z.; resources, F.M.; data curation, Y.Z.; writing—original draft preparation, Y.Z.; writing—review and editing, Y.Z., P.C., F.C. and F.M.; visualization, Y.Z. and F.M.; supervision, P.C. and F.M.; project administration, P.C. and F.M.; funding acquisition, F.M. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by RFI Atlanstic 2020, funded by Région Pays de la Loire.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data presented in this study are available on request from the corresponding author.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Steele, M.; Gillespie, R.B. Shared Control between Human and Machine: Using a Haptic Steering Wheel to Aid in Land Vehicle Guidance. Proc. Hum. Factors Ergon. Soc. Annu. Meet. 2001, 45, 1671–1675.
2. Abbink, D.A.; Mulder, M. Exploring the dimensions of haptic feedback support in manual control. J. Comput. Inf. Sci. Eng. 2009, 9, 1–9.
3. Abbink, D.A.; Mulder, M.; Boer, E.R. Haptic shared control: Smoothly shifting control authority? Cogn. Technol. Work 2012, 14, 19–28.
4. Abbink, D.A.; Carlson, T.; Mulder, M.; De Winter, J.C.; Aminravan, F.; Gibo, T.L.; Boer, E.R. A topology of shared control systems: Finding common ground in diversity. IEEE Trans. Hum.-Mach. Syst. 2018, 48, 509–525.
5. Griffiths, P.; Gillespie, R.B. Shared control between human and machine: Haptic display of automation during manual control of vehicle heading. In Proceedings of the 12th International Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, Chicago, IL, USA, 27–28 March 2004; pp. 358–366.
6. Griffiths, P.G.; Gillespie, R.B. Sharing Control Between Humans and Automation Using Haptic Interface: Primary and Secondary Task Performance Benefits. Hum. Factors 2005, 47, 574–590.
7. Mulder, M.; Abbink, D.A.; Boer, E.R. The effect of haptic guidance on curve negotiation behavior of young, experienced drivers. In Proceedings of the IEEE International Conference on Systems, Man and Cybernetics, Singapore, 12–15 October 2008; pp. 804–809.
8. Marchal-Crespo, L.; McHughen, S.; Cramer, S.C.; Reinkensmeyer, D.J. The effect of haptic guidance, aging, and initial skill level on motor learning of a steering task. Exp. Brain Res. 2010, 201, 209–220.
9. Mulder, M.; Abbink, D.A.; Boer, E.R. Sharing control with haptics: Seamless driver support from manual to automatic control. Hum. Factors 2012, 54, 786–798.
10. Wang, Z.; Zheng, R.; Kaizuka, T.; Shimono, K.; Nakano, K. The effect of a haptic guidance steering system on fatigue-related driver behavior. IEEE Trans. Hum.-Mach. Syst. 2017, 47, 741–748.
11. Petermeijer, S.M.; Abbink, D.A.; Mulder, M.; De Winter, J.C. The Effect of Haptic Support Systems on Driver Performance: A Literature Survey. IEEE Trans. Haptics 2015, 8, 467–479.
12. Petermeijer, S.M.; Abbink, D.A.; De Winter, J.C. Should drivers be operating within an automation-free bandwidth? Evaluating haptic steering support systems with different levels of authority. Hum. Factors 2015, 57, 5–20.
13. Wang, Z.; Kaizuka, T.; Nakano, K. Effect of Haptic Guidance Steering on Lane Following Performance by Taking Account of Driver Reliance on the Assistance System. In Proceedings of the 2018 IEEE International Conference on Systems, Man, and Cybernetics, Miyazaki, Japan, 7–10 October 2018.
14. Sivak, M. The Information That Drivers Use: Is it Indeed 90% Visual? Perception 1996, 25, 1081–1089.
15. De Nijs, S.Y.; Mulder, M.; Abbink, D.A. The value of haptic feedback in lane keeping. In Proceedings of the IEEE International Conference on Systems, Man and Cybernetics, San Diego, CA, USA, 5–8 October 2014; pp. 3599–3604.
16. Mars, F.; Deroo, M.; Hoc, J.M. Analysis of human-machine cooperation when driving with different degrees of haptic shared control. IEEE Trans. Haptics 2014, 7, 324–333.
17. Ameyoe, A.; Mars, F.; Chevrel, P.; Le Carpentier, E.; Illy, H. Estimation of driver distraction using the prediction error of a cybernetic driver model. In Proceedings of the Driving Simulation Conference Europe, Tübingen, Germany, 16–18 September 2015; pp. 13–18.
18. Mars, F.; Chevrel, P. Modelling human control of steering for the design of advanced driver assistance systems. Annu. Rev. Control 2017, 44, 292–302.
19. Marcano, M.; Diaz, S.; Perez, J.; Irigoyen, E. A Review of Shared Control for Automated Vehicles: Theory and Applications. IEEE Trans. Hum.-Mach. Syst. 2020, 50, 475–491.
20. Saleh, L.; Chevrel, P.; Claveau, F.; Lafay, J.F.; Mars, F. Shared steering control between a driver and an automation: Stability in the presence of driver behavior uncertainty. IEEE Trans. Intell. Transp. Syst. 2013, 14, 974–983.
21. Flad, M.; Frohlich, L.; Hohmann, S. Cooperative shared control driver assistance systems based on motion primitives and differential games. IEEE Trans. Hum.-Mach. Syst. 2017, 47, 711–722.
22. Nguyen, A.T.; Sentouh, C.; Popieul, J.C. Sensor Reduction for Driver-Automation Shared Steering Control via an Adaptive Authority Allocation Strategy. IEEE/ASME Trans. Mechatron. 2018, 23, 5–16.
23. Benloucif, M.A.; Sentouh, C.; Floris, J.; Simon, P.; Popieul, J.C. Online adaptation of the Level of Haptic Authority in a lane keeping system considering the driver's state. Transp. Res. Part F Traffic Psychol. Behav. 2019, 61, 107–119.
24. Sentouh, C.; Nguyen, A.; Benloucif, M.A.; Popieul, J. Driver-Automation Cooperation Oriented Approach for Shared Control of Lane Keeping Assist Systems. IEEE Trans. Control Syst. Technol. 2019, 27, 1962–1978.
25. Ji, X.; Yang, K.; Na, X.; Lv, C.; Liu, Y. Shared Steering Torque Control for Lane Change Assistance: A Stochastic Game-Theoretic Approach. IEEE Trans. Ind. Electron. 2019, 66, 3093–3105.
26. Pano, B.; Chevrel, P.; Claveau, F. Anticipatory and compensatory e-assistance for haptic shared control of the steering wheel. In Proceedings of the 2019 18th European Control Conference (ECC), Naples, Italy, 25–28 June 2019; pp. 724–731.
27. Pano, B.; Claveau, F.; Chevrel, P.; Sentouh, C.; Mars, F. Systematic H2/H∞ haptic shared control synthesis for cars, parameterized by sharing level. In Proceedings of the 2020 IEEE International Conference on Systems, Man, and Cybernetics (SMC), Toronto, ON, Canada, 11–14 October 2020; pp. 4416–4423.
28. Zhao, Y.; Chevrel, P.; Claveau, F.; Mars, F. Towards a Driver Model to Clarify Cooperation between Drivers and Haptic Guidance Systems. In Proceedings of the 2020 IEEE International Conference on Systems, Man, and Cybernetics (SMC), Toronto, ON, Canada, 11–14 October 2020; pp. 1731–1737.
29. Zhao, Y.; Pano, B.; Chevrel, P.; Claveau, F.; Mars, F. Driver Model Validation through Interaction with Varying Levels of Haptic Guidance. In Proceedings of the 2020 IEEE International Conference on Systems, Man, and Cybernetics (SMC), Toronto, ON, Canada, 11–14 October 2020; pp. 2284–2290.
30. Pano, B.; Chevrel, P.; Claveau, F.; Sentouh, C.; Mars, F. Obstacle avoidance in highly automated cars: Can progressive haptic shared control make it smoother? IEEE Trans. Hum.-Mach. Syst. 2022, 52, 547–556.
31. Salvucci, D.D.; Gray, R. A two-point visual control model of steering. Perception 2004, 33, 1233–1248.
32. Donges, E. A Two-Level Model of Driver Steering Behavior. Hum. Factors 1978, 20, 691–707.
33. Land, M.; Horwood, J. Which parts of the road guide steering? Nature 1995, 377, 339–340.
34. Steen, J.; Damveld, H.J.; Happee, R.; Van Paassen, M.M.; Mulder, M. A review of visual driver models for system identification purposes. In Proceedings of the IEEE International Conference on Systems, Man and Cybernetics, Anchorage, AK, USA, 9–12 October 2011; pp. 2093–2100.
35. Frissen, I.; Mars, F. The Effect of Visual Degradation on Anticipatory and Compensatory Steering Control. Q. J. Exp. Psychol. 2014, 67, 499–507.
36. Markkula, G.; Engström, J. A Steering Wheel Reversal Rate Metric for Assessing Effects of Visual and Cognitive Secondary Task Load. In Proceedings of the 13th ITS World Congress, London, UK, 8–12 October 2006.
37. Nilsson, L.; Merat, N.; Jamson, H.; Mouta, S.; Carvalhais, J.; Santos, J.; Anttila, V.; Sandberg, H.; Luoma, J.; de Waard, D.; et al. HASTE Deliverable 2: HMI and Safety-Related Driver Performance; European Commission: Brussels, Belgium, 2004.
38. Mars, F.; Saleh, L.; Chevrel, P.; Claveau, F.; Lafay, J.F. Modeling the visual and motor control of steering with an eye to shared-control automation. Proc. Hum. Factors Ergon. Soc. Annu. Meet. 2011, 55, 1422–1426.
39. Saleh, L.; Chevrel, P.; Mars, F.; Lafay, J.F.; Claveau, F. Human-like cybernetic driver model for lane keeping. In Proceedings of the 18th IFAC World Congress, Milan, Italy, 29 August–3 September 2011; pp. 4368–4373.
40. Ljung, L. System Identification: Theory for the User, 2nd ed.; Prentice Hall Information and System Sciences Series; Prentice Hall PTR: Hoboken, NJ, USA, 1999.
Figure 3. Route used for experiment.
Figure 4. (a) Two-point visual model and (b) calculation of far-point and near-point angles.
Figure 5. Summary of data analysis methods.
Figure 6. SWRR per minute with a gap size of 2°.
Figure 7. Lane-keeping performance: (a) mean of adjusted deviation of the lateral position and (b) standard deviation of the lateral position.
Figure 8. Driver steering torque energy.
Figure 9. Identified model validation: (a) FIT for all participants; (b) measured vs. predicted steering-wheel angle for Participant No. 1 in the first scenario; (c) confidence regions of identified parameters for Participant No. 3.
Figure 10. Parameters obtained from system identification: (a) identified anticipatory gain (K_p) and (b) identified compensatory gain (K_c).
Table 1. Description of parameters in the model.

Parameter | Description | Nominal Values
K_p | Visual anticipation gain | -
K_c | Visual compensation gain | -
T_I, T_L | Compensation time constants | 1, 3
τ_p | Processing delay | 0.04
v | Vehicle longitudinal speed | -
Table 2. Comparison of model parameter uncertainties.

Participant | d12 | d13 | d14 | d23 | d24 | d34
1 | 0.67 | 2.13 | 4.14 | 1.05 | 3.06 | 1.27
2 | 0.24 | 2.72 | 0.56 | 3.31 | 1.16 | 1.52
3 | 1.49 | −0.11 | 0.91 | 1.17 | −0.3 | 0.61
4 | −0.06 | 0.74 | 2.66 | 0.55 | 2.46 | 1.62
5 | 2.86 | 4.62 | 4.01 | 1.46 | 0.85 | 0.14
6 | 0.72 | 1.02 | 1.6 | −0.12 | 0.41 | 0.17
7 | 0.52 | −0.14 | 0.75 | 0.76 | −0.24 | 0.99
8 | 0.75 | 2.21 | 0.93 | 1.15 | −0.11 | 0.46
9 | 0.37 | 1.51 | 3.44 | 0.72 | 2.64 | 1.6
10 | 0.28 | 2.18 | 1.63 | 2.9 | 2.35 | −0.1
11 | 0.52 | 0.08 | 0.92 | 0.88 | 1.71 | 0.36
12 | 1.56 | 2.2 | 0.49 | 0.28 | 0.67 | 1.31
13 | −0.21 | −0.27 | 0.73 | −0.01 | 0.99 | 0.59
14 | 1.77 | −0.02 | 0.7 | 1.41 | 2.73 | 0.94
15 | 0.43 | 0.2 | 2.95 | −0.02 | 2.29 | 2.37
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

