Article

Research on Data Fusion of Positioning System with a Fault Detection Mechanism for Autonomous Vehicles

1 Ph.D. Program of Electrical and Communication Engineering, Feng-Chia University, Taichung 40724, Taiwan
2 Automotive Research and Testing Center, Changhua 50544, Taiwan
3 Department of Electronic Engineering, Feng-Chia University, Taichung 40724, Taiwan
* Author to whom correspondence should be addressed.
Appl. Sci. 2022, 12(22), 11339; https://doi.org/10.3390/app122211339
Submission received: 3 October 2022 / Revised: 2 November 2022 / Accepted: 4 November 2022 / Published: 8 November 2022

Abstract

A positioning system provides useful positioning information for applications, and the ability to maintain high-quality positioning information in real time is crucial. Autonomous vehicles require positioning information to plan control strategies; inadequate positioning accuracy affects both control and safety. This article explores the effect of an error detection and correction method on coordinate information fusion. The method can efficiently correct positioning errors in the presence of faults. The fault detection mechanism is not limited to positioning systems with specific types of sensors, which gives it a degree of flexibility across applications. The detection method uses two or more positioning systems; primarily, one positioning system is used to monitor the other. When the error of a positioning system deviates excessively from the system setting value, the positioning fusion algorithm makes appropriate adjustments to output a corrected coordinate value. The method therefore integrates different positioning technologies, including the global positioning system (GPS), simultaneous localization and mapping (SLAM), inertial measurement units (IMUs), and odometers. Autonomous vehicles already have the required sensors deployed, so our method can be implemented at no additional cost. We integrated two positioning systems to support the positioning applications and designed a fault detection method to detect errors in the positioning system, giving the system reference information with which to correct the coordinate data. According to the experimental findings, positioning information errors were corrected at a rate of 89.8%. The positioning system with the added detection function was able to effectively filter out erroneous data, and the results show that this method can effectively correct positioning errors. Adding an error detection mechanism thus improves the accuracy of coordinate data fusion, and positioning systems with an error detection function can improve the performance of the applications that rely on them.

1. Introduction

Vehicles are important tools in everyday life, and an increasing number of vehicle applications are being developed, such as autonomous vehicles. Many manufacturers have joined the research and development of autonomous vehicles, which build on and extend the existing vehicle architecture. An autonomous vehicle requires a control center to plan the driving strategy, and the computer running this strategy requires position information to understand where the vehicle is. The positioning system provides the vehicle with this location information, and there are various methods by which to provide reliable location information for autonomous vehicles. However, every application system can suffer a system fault, and the positioning system is no exception. Vehicle applications should consider this situation and include functions to detect faults. On-board diagnostics (OBD) is one such approach [1], but fault detection methods for positioning systems are rarely discussed in studies of positioning system applications.
In this paper, the influence of a fault detection mechanism on the data fusion of the positioning system is discussed. The method is developed using the sensors configured on the WinBus [2], an autonomous vehicle that integrates many application systems, and the mechanism is discussed in terms of positioning methods built on the on-board sensors. Different sensing systems report coordinates in different frames, so the coordinate system must be unified for this mechanism to work. The WinBus application uses a unified 3D coordinate system (X, Y, and Z), which gives this method a common benchmark that can be used for cross-validation. In addition to the sensors provided by the vehicle, many studies use the Internet of vehicles to implement positioning. However, the experimental field did not have roadside equipment linked to the Internet of vehicles; therefore, this paper does not address issues related to the Internet of vehicles and instead focuses on the positioning system of the vehicle itself.
The fault detection mechanism proposed in this paper extends the abovementioned method to improve the reliability of the positioning system. The remainder of this paper is organized as follows: Section 2 reviews the literature; Section 3 describes the positioning system's experimental platform, the WinBus; Section 4 presents the positioning fault detection methods, including the procedure and the algorithm; Section 5 presents the verification results; and the final section reports the conclusions and future development plans.

2. Literature Review

Each positioning system uses different sensors to implement the navigation functions. Vehicles deploy many sensors that provide the autonomous vehicle strategy center with the information required to plan control strategies, for example, real-time kinematic (RTK) positioning, inertial measurement units (IMUs), 2D lidar, 3D lidar, radars, and cameras. These sensors can also provide useful information for different positioning methods. RTK provides coordinate information to end applications, and various studies have developed technology based on this system [3,4,5,6,7,8]. Some methods integrate many sensors, while others use additional RTK systems to reduce data loss; the goal is to improve the usefulness and accuracy of RTK information. These methods require many supporting application systems to correct the coordinate information, and each application system has a probability of failure. In this paper, we propose a system fault-finding mechanism and verify the coordinate accuracy based on this mechanism. The mechanism references information from related application systems that are already part of the WinBus, so the method incurs no additional sensor cost.
Simultaneous localization and mapping (SLAM) is another technology currently applied in autonomous vehicle positioning. Many studies have discussed this technology, which is commonly implemented based on 3D lidar [9,10,11,12,13,14,15]. The 3D lidar provides scan information about the surrounding environment to support the SLAM calculation. Some SLAM implementations use deep learning methods, such as convolutional neural networks (CNNs). SLAM provides position information for applications, and autonomous vehicle research has been implemented on this basis. The method supports autonomous vehicles in scenarios where no GPS signal is available, which is one of its advantages for the autonomous vehicle application described in this study. However, systems based on 3D lidar require considerable hardware computing resources, and according to previous application experience and the literature, SLAM alone cannot realize autonomous driving. Similarly, how the decision-making control system of the self-driving car is informed when the system responsible for SLAM fails is rarely discussed. The fault detection mechanism proposed in this paper is a possible solution for autonomous vehicle applications.
Communication is an indispensable element in autonomous vehicle applications. The communication interfaces include wired and wireless communication. Controller area networks (CAN) and Ethernet are the wired communication interfaces [16,17,18]. The CAN bus is used for message transmission between the powertrain, chassis, and body. The International Organization for Standardization also defines the corresponding communication specifications, but the information for positioning system applications is only partially defined. The CAN bus does not have the bandwidth to carry the large amount of information produced by the sensors, so autonomous vehicles also require Ethernet as a communication interface. In recent years, there has been much research on positioning systems in the Internet of vehicles (IoV). Communication interfaces used in the Internet of vehicles, such as dedicated short-range communication (DSRC), cellular vehicle-to-everything (C-V2X), and fifth-generation mobile networks (5G), carry positioning-related information [19,20,21]. Most of this positioning-related information follows the SAE J2735 specification [22]. Part of the communication interface applied in IoV belongs to wireless communication. IoV positioning methods are implemented based on roadside units (RSUs) and vehicle coordinates, and there are also studies implementing positioning systems using the SLAM and IoV techniques discussed earlier. The deployment of hardware facilities is the key to positioning systems implemented with these technologies. This research used the wired network inside the vehicle to collect and transmit information. Limited by the hardware deployment in the experimental field, discussion of Internet of vehicles positioning is excluded from this research.
Accurate and reliable location data are necessary to meet the requirements of autonomous vehicles. As can be seen from the previous discussion, a single source of positioning information cannot meet these requirements, so researchers have proposed many information fusion methods to solve the accuracy problem [23,24,25]. Information fusion can compensate for some of the disadvantages of the individual systems: it integrates multiple sensor systems that can each provide positioning information in different scenarios, combining their benefits to make the positioning system more precise and reliable. One such method integrates GPS and IMU data with an extended Kalman filter (EKF). In addition, the coordinate data fusion method can repair inaccurate coordinate information. Adding an error detection mechanism to the positioning system to correct the coordinate information is another way to deal with positioning system errors. There are still some differences between these methods.
We propose a method that improves the stability and reliability of positioning systems by supporting them with an error detection mechanism. Considering the sensing systems and methods configured on the vehicle, this mechanism is implemented based on the RTK, SLAM, IMU, and chassis systems.

3. Positioning System Experimental Platform

3.1. WinBus

The WinBus is the experimental platform integrated by the Automotive Research and Testing Center (ARTC) and related manufacturers. The sensors deployed in the WinBus are shown in Figure 1.
Two 3D lidars are deployed on the front and rear of the roof. Lidar is an optical device that uses a laser beam to detect obstacles. It measures the time gap between laser beam transmission and reception to determine the obstacle's distance, predicts the shape of the target using the reflected signals of multiple beams, and can use the reflectivity of the received beam to judge the material of the target surface. The front and back effective ranges are each 60 m, and the side range is 20 m. The position, relative speed, and ID index constitute the obstacle data. The technique used here captures 3D lidar point cloud data. Two radars are mounted, one at the front and one at the back of the WinBus. Similar to the 3D lidar, the radar can identify obstacles and delivers information about them, including their location and relative speed, which helps the vehicle locate obstructions. Both the 3D lidar and the radar can simultaneously find obstacles in the path of the vehicle, and these two sensors aid in the development of control rules that boost safety. Cameras are mounted around the body; the front camera provides information about lanes and traffic signals. The GPS and IMU deliver data on the dynamics and location of the vehicle. For radio signal transmission, the GPS uses 24 to 32 Earth-orbiting satellites, and the GPS receiver can obtain the position, time, and speed with the assistance of the signal. Adequate received signal quality is necessary for GPS accuracy; many factors affect the positioning signal, including the environment, the ionosphere, and electromagnetic interference. The GNSS network is the foundation of real-time kinematic technology [26], which assesses the positioning inaccuracy of the region serviced by the reference station. The IMU determines the attitude of the vehicle by monitoring the gyroscope and G-sensor, providing the positioning system with angular velocity and acceleration references.

3.1.1. LiDAR

Lidar is an optical sensing technology that emits a beam of light. It measures the time interval between the emitted and received beam signals to calculate the distance to the target, predicts the shape of the target using the reflected signals of multiple beams, and can use the reflectivity of the received beam to determine the target surface material. The WinBus deploys two 3D lidars on the vehicle roof. 3D lidar sensors, also known as 3D laser scanners, capture their surroundings almost continuously, which makes them ideal for collision protection and object detection in autonomous vehicles. In the previously discussed literature, 3D lidars were used as sensors for SLAM. The lidar mounted on the WinBus is used to record the scene point cloud, and the system creates the map by overlaying the recorded point clouds. The completed map is the point cloud map used as the reference for matching. During matching, the lidar scans the scene environment in real time, and the SLAM algorithm compares the current point cloud with the point cloud map, finds feature locations that match the current point cloud, and then calculates the vehicle coordinates.

3.1.2. Global Positioning System (GPS)

GPS was the first fully operational global navigation satellite system. The system has about 24 to 32 Earth-orbiting satellites. GPS satellites are usually located at an altitude of about 20,000 km and travel at a speed of roughly 14,000 km per hour. GPS signals cannot penetrate walls, so signals inside buildings are blocked, and the system is less effective in areas with tall buildings or dense trees. Therefore, raw GPS signals are rarely used directly in practical applications.
Real-time kinematic technology uses a GNSS network composed of multiple satellite positioning reference stations to evaluate the positioning error of the area covered by the reference stations. It then combines the observation data of the nearest physical base station to produce a virtual base station that acts as the RTK master station. The autonomous vehicle does not receive actual observations from a physical base station; the error-corrected virtual observation data are the information used by the applications. The WinBus integrates RTK to provide location information for applications, and RTK is one of the positioning systems of the WinBus.

3.1.3. Simultaneous Localization and Mapping (SLAM)

SLAM is another positioning system used by the WinBus. Mapping and matching are the two parts of the SLAM method. The quality of the mapping determines whether this method is accurate, because the positioning data come from the features of the map. The intricacy of the environment and the number of dynamic objects are two factors that influence the SLAM outcome: if the characteristics of the environment are ambiguous, or if the base map environment changes too frequently, the clarity of the mapping will be affected. This kind of application therefore carries some risk for autonomous vehicles. In the mapping stage, a map is first created for the scene in which the vehicle is to be positioned: the lidar mounted on the car records the scene point cloud while the vehicle moves at low speed, the recorded point clouds are overlaid to build the map, and the completed map serves as the base map for matching. During matching, the scene environment is scanned in real time, and SLAM compares the current point cloud with the map point cloud and finds feature locations that match the current point cloud. The output coordinates are unified three-dimensional coordinates for the WinBus application. The WinBus uses SLAM to support the positioning system, with the goal of increasing the positioning system's reliability.
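To make the matching step concrete, the following minimal 2D sketch scores candidate vehicle poses by how well the current scan aligns with the stored map point cloud, using the mean nearest-neighbor distance as the cost. The function name, search ranges, and use of SciPy's cKDTree are illustrative assumptions; this is not the WinBus SLAM implementation.

```python
import numpy as np
from scipy.spatial import cKDTree

def match_scan_to_map(scan_xy, map_xy, prior_pose, search=0.5, step=0.1):
    """Brute-force search for the pose that best aligns the scan with the map.

    scan_xy    : (N, 2) lidar points in the vehicle frame
    map_xy     : (M, 2) pre-built point-cloud map in the world frame
    prior_pose : (x, y, yaw) initial guess, e.g. from odometry
    """
    tree = cKDTree(map_xy)                      # nearest-neighbor index over the map
    x0, y0, yaw0 = prior_pose
    best_pose, best_cost = prior_pose, np.inf
    for dx in np.arange(-search, search + 1e-9, step):
        for dy in np.arange(-search, search + 1e-9, step):
            for dyaw in np.radians(np.arange(-5.0, 5.5, 1.0)):
                c, s = np.cos(yaw0 + dyaw), np.sin(yaw0 + dyaw)
                R = np.array([[c, -s], [s, c]])
                world = scan_xy @ R.T + np.array([x0 + dx, y0 + dy])
                cost = tree.query(world)[0].mean()   # mean distance to nearest map point
                if cost < best_cost:
                    best_cost = cost
                    best_pose = (x0 + dx, y0 + dy, yaw0 + dyaw)
    return best_pose, best_cost
```

In practice, production SLAM stacks use far more efficient matchers, but the cost function, namely how closely the transformed scan lands on the map, is the same idea.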

3.1.4. Inertial Measurement Unit (IMU)

The IMU uses three-axis gyroscopes and three-axis accelerometers to measure angular velocity and acceleration changes, and it uses these data to calculate the pose of the object. The IMU is usually mounted at the object's center of gravity to ensure correct calculations. Some IMUs integrate other sensors, such as magnetometers or barometers, to extend their functions and calculate bearing and altitude. The WinBus integrates an IMU to increase the reliability of the positioning system.
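As an illustration of how gyroscope and accelerometer data can be combined into an attitude estimate, the sketch below uses a complementary filter for roll and pitch. This is one common approach, shown for illustration only; the paper does not describe the filter used by the WinBus IMU.

```python
import math

def complementary_filter(roll, pitch, gyro_x, gyro_y, accel, dt, alpha=0.98):
    """Blend integrated gyro rates with the gravity direction seen by the accelerometer.

    roll, pitch    : previous attitude estimate (rad)
    gyro_x, gyro_y : angular rates about the x and y axes (rad/s)
    accel          : (ax, ay, az) specific force (m/s^2)
    """
    ax, ay, az = accel
    # Attitude implied by the accelerometer (direction of gravity).
    roll_acc = math.atan2(ay, az)
    pitch_acc = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    # High-pass the gyro integration, low-pass the accelerometer estimate.
    roll = alpha * (roll + gyro_x * dt) + (1 - alpha) * roll_acc
    pitch = alpha * (pitch + gyro_y * dt) + (1 - alpha) * pitch_acc
    return roll, pitch
```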

3.2. Positioning System of the WinBus

The positioning system of the WinBus fuses different information from different sensor systems. Figure 2 shows the positioning system function block. GPS, SLAM, and reference chassis data are all combined into one positioning system. The information for the applications is the coordinates, both absolute and relative. The two elements that make up the absolute positioning system are GPS and SLAM. Autonomous cars may locate their position using these two technologies’ absolute latitude and longitude information. The relative coordinates are from the wheel speedometer, steering angle, and IMU. The positioning system predicts coordinates and guarantees stability through dynamic models.

4. Positioning Fault Detection Methods

How to increase positioning accuracy is a problem that researchers and developers wish to solve. This positioning system combines RTK and SLAM to aid the autonomous vehicle's strategy-building process: the positioning fusion algorithm references the RTK and SLAM information to provide the location data, and the two systems work together for precision. However, every application system has the possibility of failure. In this research, we propose a strategy for the fault detection mechanism and discuss the relationship between coordinate data fusion and faults, taking advantage of the interdependence of the data. Finding faults requires some mechanisms and a relatively stable device to support them. Taking speed information as an example, several devices in the vehicle provide speed information, such as the ABS (antilock braking system), the IMU, and the GPS. The ABS is a more stable provider of speed signals than the IMU and GPS, so the speed used by the fault detection method references the ABS speed information. The detection method integrates the IMU and ABS to help the system find faults. A prediction algorithm with heading and velocity information provides reference information to the positioning fusion algorithm; this reference helps the fusion algorithm determine the correctness of the positioning data from the RTK. The SLAM fault detection method references the RTK information to detect faults.
Figure 3 shows the functional blocks of the positioning system and the fault detection method. The time t_n is the current time, and t_{n+1} is the following time. For the fault detection of RTK and SLAM, the predicted coordinates reference the previous credible coordinates and are used as the reference source. The prediction references the vehicle speed and heading. The positioning system fault detection algorithm refers to coordinate data that must be reliable information derived from the prediction method. The speed and heading information come from the ABS and IMU, two devices whose vehicle applications make them relatively stable. This makes them suitable reference sources for the estimated positions used in positioning system fault detection.
The fault detection function block is shown in Figure 4. The source of the reference data must be chosen according to the positioning system's features: the fault detection algorithm applied to the RTK uses the input coordinate data from the RTK, whereas for SLAM the input coordinate data are replaced with the SLAM coordinate data. The reference coordinates for both SLAM and RTK come from the prediction coordinates. The positioning error condition is the error level condition, which defines the allowable error range of the positioning system. The outputs are the fault information and the positioning data that the positioning fusion algorithm will reference.
The coordinate information gives the system a reference position. The positioning system of the WinBus fuses the RTK and SLAM information to generate location information for the autonomous application, and the vehicle uses these data to determine its location. The original fusion algorithm is given in Equation (1), where P_f is the fused position data, which can be latitude or longitude information. P_rtk is the position information from RTK, and P_slam is the position information from SLAM; both can be latitude or longitude information. W_rtk is the weight for the RTK, and W_slam is the weight for SLAM. The weight parameters can be adjusted by the application; RTK has a large weight in this system. Each application has a primary trusted source, and the weight parameters allow the strategy process to switch the primary positioning system when one system is damaged.
$$P_f = \frac{P_{rtk} W_{rtk} + P_{slam} W_{slam}}{W_{rtk} + W_{slam}} \tag{1}$$
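For illustration, Equation (1) amounts to a weighted average of the two position sources. The short sketch below is a minimal transcription in Python; the weight values are assumptions, since the paper only states that RTK receives the larger weight.

```python
def fuse_position(p_rtk, p_slam, w_rtk=0.8, w_slam=0.2):
    """Equation (1): weighted average of one coordinate component (latitude or longitude)."""
    return (p_rtk * w_rtk + p_slam * w_slam) / (w_rtk + w_slam)

# Example: fuse a single latitude reading from each system (values are made up).
lat_fused = fuse_position(p_rtk=24.0751234, p_slam=24.0751298)
```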
We designed a prediction method to provide reference information for autonomous vehicle applications; this reference information must be stable and reliable. The method integrates information from the chassis and other sensors, namely the positioning system, the ABS sensor, and the IMU. We took into account that the RTK is subject to weather conditions, and from the perspective of system stability, the ABS and IMU on the vehicle are more stable than the GPS-based positioning system, so referring to these two pieces of information in the fault detection mechanism is more reliable. The positioning system provides the previous valid position data, the ABS sensor provides the latest speed information, and the IMU provides the heading information to the algorithm. The latitude and longitude prediction is given by Equation (2), where P_{Dx+1} is the estimated latitude, P_x is the current latitude, P_{Dy+1} is the estimated longitude, P_y is the current longitude, V is the vehicle speed, and θ is the heading of the vehicle. Equation (3) defines P_{Dx+1} and P_{Dy+1} as the components of P_{Df+1}. The distance calculation is shown in Equation (4), where D_f is the distance between the predicted position and the current position.
$$\begin{cases} P_{Dx+1} = P_x + \int_0^{t} V \cos\theta \, dt \\ P_{Dy+1} = P_y + \int_0^{t} V \sin\theta \, dt \end{cases} \tag{2}$$

$$P_{Df+1} \in \left\{ P_{Dx+1},\ P_{Dy+1} \right\} \tag{3}$$

$$D_f\left(P_f, P_{Df+1}\right) = R \cos^{-1}\left[ \cos P_y \cos P_{Dy+1} \cos\left(P_x - P_{Dx+1}\right) + \sin P_y \sin P_{Dy+1} \right] \tag{4}$$
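A minimal sketch of the prediction and distance steps is given below, assuming a simple local conversion between metric displacement and degrees (the paper leaves the units of Equation (2) implicit) and using the standard spherical law of cosines for the distance in Equation (4). The function names, the heading convention (0 rad pointing north), and the Earth radius constant are illustrative assumptions.

```python
import math

EARTH_RADIUS = 6371000.0  # mean Earth radius in metres (assumption)

def predict_position(lat_deg, lon_deg, speed_mps, heading_rad, dt):
    """Equation (2): dead-reckon the last valid fix forward by speed*dt along the heading."""
    north = speed_mps * math.cos(heading_rad) * dt   # displacement toward north (m)
    east = speed_mps * math.sin(heading_rad) * dt    # displacement toward east (m)
    dlat = math.degrees(north / EARTH_RADIUS)
    dlon = math.degrees(east / (EARTH_RADIUS * math.cos(math.radians(lat_deg))))
    return lat_deg + dlat, lon_deg + dlon            # predicted (latitude, longitude)

def great_circle_distance(lat1, lon1, lat2, lon2):
    """Equation (4): spherical law of cosines distance in metres."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    cos_angle = (math.sin(p1) * math.sin(p2)
                 + math.cos(p1) * math.cos(p2) * math.cos(dlon))
    cos_angle = max(-1.0, min(1.0, cos_angle))       # guard against rounding error
    return EARTH_RADIUS * math.acos(cos_angle)
```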
The fault result is given in Equation (5), which is an indicator function. The parameter D_r is the fault detection result, and D_e is the error level that triggers the fault condition. If the distance D_f is smaller than D_e, the result is 1 and the positioning data are treated as valid; otherwise, the result is 0 and the fault detection algorithm generates an error message to the self-driving application stating that an error has occurred in the positioning system. Equation (5) can be rewritten to fit different applications: if the fault detection mechanism is used for SLAM, the parameter D_r is written as D_r(SLAM), and if it is used for RTK, D_r is written as D_r(RTK).
$$D_r\left(D_e, D_f\right) = \begin{cases} 1, & \text{if } D_e > D_f \\ 0, & \text{otherwise} \end{cases} \tag{5}$$
Equation (1) can be rewritten as Equation (6) based on Equation (5). This equation integrates the positioning fusion algorithm and fault detection algorithm for the autonomous application. It allows the strategy process to have a positional reference for planning the control rule to control the vehicle.
$$P_f = \frac{P_{Df+1} W_{pd} + P_{rtk} W_{rtk}\, D_{r(RTK)}\left(D_e, D_f\right) + P_{slam} W_{slam}\, D_{r(SLAM)}\left(D_e, D_f\right)}{W_{pd} + W_{rtk}\, D_{r(RTK)}\left(D_e, D_f\right) + W_{slam}\, D_{r(SLAM)}\left(D_e, D_f\right)} \tag{6}$$
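The sketch below combines Equations (5) and (6): each absolute positioning source is kept in the weighted average only while its distance to the predicted coordinate stays within the error level D_e; otherwise its term is dropped and the prediction carries the estimate. The weight values and the threshold are illustrative assumptions, not the values used in the WinBus.

```python
def fault_free(d_f, d_e):
    """Equation (5): 1 if the reported position is within the error level, else 0."""
    return 1 if d_e > d_f else 0

def fuse_with_fault_detection(p_pred, p_rtk, p_slam, d_rtk, d_slam,
                              w_pd=0.2, w_rtk=0.6, w_slam=0.2, d_e=2.0):
    """Equation (6): prediction-anchored fusion gated by the fault indicators.

    p_pred         predicted coordinate from Equation (2)
    p_rtk, p_slam  coordinates reported by RTK and SLAM
    d_rtk, d_slam  distances of each report from the prediction, Equation (4)
    """
    g_rtk = fault_free(d_rtk, d_e)    # D_r(RTK)
    g_slam = fault_free(d_slam, d_e)  # D_r(SLAM)
    numerator = p_pred * w_pd + p_rtk * w_rtk * g_rtk + p_slam * w_slam * g_slam
    denominator = w_pd + w_rtk * g_rtk + w_slam * g_slam
    return numerator / denominator
```

Because the prediction term is always present with weight W_pd, the denominator never becomes zero even when both absolute sources are flagged as faulty.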

5. Results

The experimental field was the Chang Hua Coastal Industrial Park. The route, shown in Figure 5, is 7.5 km long. The positioning system and the fault detection algorithm were deployed and integrated in the WinBus. The experiment produced results from three positioning methods: RTK, SLAM, and prediction. These results were used to compare the effect of the different positioning methods on the coordinate output.
Figure 6 compares the SLAM and RTK results; the figure was generated from the latitude and longitude data. The red line presents the SLAM result, and the yellow line shows the RTK result. The two positioning systems produce the two trajectories shown in the figure. Comparing the actual route with the results, we can see that the SLAM output contained some errors. Below, we discuss the fault detection mechanism and how it can be used to increase the accuracy of the positioning system.
Figure 7 compares the RTK, SLAM, fusion, and prediction results. The fusion result, obtained from Equation (1), is the blue line in Figure 7, and the prediction result, obtained from Equations (2) and (3), is the green line. Compared with the RTK result, the fusion without a fault detection method contained some errors. The prediction result showed a track similar to that of the RTK. A further comparison between the RTK and prediction data showed that both were similar to the actual trajectory, which proves that the prediction results and RTK are reliable information for autonomous applications in this field.
Figure 8 includes all of the above results for comparison. The purple line is the fusion with the fault detection mechanism. Here, we discuss the impact of the fault detection mechanism of the positioning system. From the previous discussion, it can be seen that this prediction method is feasible. Finally, we compared the actual path, the predicted path, and the results of the fusion algorithm based on the fault detection mechanism. The results showed that the fault detection mechanism can effectively correct the coordinate results of the positioning system.
Figure 9 (longitude) and Figure 10 (latitude) compare each result. In the figures, lines of different colors represent RTK, SLAM, fusion, prediction, and fusion with the diagnostic mechanism results. The result showed that the SLAM and fusion without any fault detection mechanisms had some problems.
We also performed a more detailed comparison of the locations with anomalous coordinates. Figure 11 and Figure 12 show the anomalous portions of Figure 9 and Figure 10, i.e., the longitude and latitude coordinates of each method, respectively. These diagrams show the coordinates of the anomalous parts for further discussion. The results show that the coordinate output of SLAM cannot satisfy the requirements of autonomous vehicle applications; comparing all the methods, SLAM produced the most errors. Due to the deviation of the coordinates, the system's operation may be faulty.
The coordinate fusion method was based on Equation (1). The grey line shows the result obtained by the original fusion algorithm, which is compared with the SLAM and RTK results in Figure 11 and Figure 12. The original fusion result did not reference any fault information. This result shows that the fusion algorithm was able to correct the biases in the coordinate information, but not sufficiently. The original fusion algorithm only considered the weights assigned to the various positioning systems, so when one of the positioning systems failed, the erroneous value was still included in the fusion algorithm's calculations. Due to the influence of SLAM, the trajectory of the coordinates was biased.
The correct path in this situation was most similar to that of the RTK, so the RTK result is used here as the benchmark. The coordinate prediction results (yellow line) were similar to the RTK results, proving that the prediction mechanism was efficient. The fusion with the diagnosis mechanism (light blue line) combined the results of RTK, SLAM, and data fusion by reference; Equation (6) describes this positioning result, which is close to the coordinates of the correct route.
The positioning error was calculated from Equation (4). The RTK position coordinates were compared with those obtained using SLAM, the original fusion algorithm, the prediction, and the fusion with the diagnostic mechanism. The results are shown in Figure 13, and Table 1 compares the values of each method. In order to compare trends and reduce errors caused by noise, 100 records from the peak were averaged in the comparison of the results. The average distance by which SLAM deviated in this test was 91.8 m.
The original algorithm was able to correct the error values, but the magnitude of the correction was limited: the maximum error after correction was 18.3 m, corresponding to a correction rate of 80%. The prediction coordinates came from Equation (2); the prediction was based on the previously valid coordinates, with reference to the speed and heading of the vehicle. The error between the RTK coordinates and the predicted values was about 0.64 m. As can be seen from Figure 13, this result is very similar to the RTK coordinate values, so it can be used as reference data for the fault detection mechanism. Equation (6) presents the fusion algorithm with the diagnostic mechanism; the coordinate result of this algorithm differed from the RTK coordinate values by 0.25 m. The results showed that the error was further corrected.
Finally, the positioning errors of the algorithms were considered. A 2 m lateral error was set in the experiment by considering the lane width. Comparing the values, the difference between the RTK and the diagnostic mechanism was 2.25 m. This method corrected 89.8% of the errors. This result proves that our proposed method is useful.

6. Conclusions

Autonomous vehicles use relatively accurate positioning information to formulate control strategies and improve vehicle safety. Positioning systems, however, suffer from positioning errors. Many implementations use only a single positioning method, and relying on a single positioning system entails certain risks for autonomous vehicles. Therefore, scholars have discussed a variety of options, and coordinate information fusion is one possible solution. Nevertheless, some problems remain to be solved in positioning systems, such as faults.
In this paper, we have proposed a fault detection method to support coordinate fusion and compared the errors of various positioning methods with those of the proposed method. In the application, the concept of data fusion was used for error correction so that the autonomous vehicle had relatively accurate coordinate values as a reference. The coordinate information obtained by the original fusion algorithm still had an error problem because the original algorithm only focused on the weight allocation of the coordinate information and did not consider the impact of system failures on information fusion. The fault detection mechanism referenced a predicted coordinate, and comparison against this coordinate showed that SLAM had the most errors of all the positioning systems. After observing these phenomena, we adjusted part of the algorithm. The adjusted coordinate fusion algorithm corrected 89.8% of the errors. The experiments demonstrated that the method was able to detect errors when the positioning system failed, and the fault detection and correction mechanism was used to correct the coordinates. The results showed that this mechanism was able to support the positioning system under certain conditions and correct the coordinate data.
We used the WinBus as the experimental platform, and the scope of the experiment was mainly limited to the Chang Hua Coastal Industrial Park. The sensors configured in the vehicle and the application equipment available in the experimental field were limited, and not every vehicle has the same sensors. With this method, there will be gaps in the vehicle dynamics parameters and differences in the threshold values of the positioning system's fault detection. Artificial intelligence (AI) may be used to support the proposed fault detection and correction methods in the future; AI tools could be used to determine the most suitable setting values for each car.

Author Contributions

Conceptualization, W.-H.C.; methodology, W.-H.C.; software, W.-H.C. and Y.-S.L.; validation, W.-H.C. and Y.-S.L.; formal analysis, W.-H.C. and R.-T.J.; investigation, Y.-S.L.; resources, W.-H.C. and Y.-S.L.; data curation, Y.-S.L.; writing—original draft preparation, W.-H.C.; writing—review and editing, W.-H.C. and R.-T.J.; visualization, W.-H.C.; supervision, R.-T.J.; project administration, W.-H.C.; funding acquisition, W.-H.C. and Y.-S.L. All authors have read and agreed to the published version of the manuscript.

Funding

This work was carried out in cooperation with the Automotive Research and Testing Center under grant number 111-EC-17-A-25-1588 within the Intelligent Vehicle Technology and Autonomous Driving System Development Project (1/4) of the Ministry of Economic Affairs, Taiwan, ROC.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Cao, D.; Zhao, B.; Han, H.; Gan, W. Design on Remote Diagnosis System of Refrigerated Containers. In Proceedings of the 2008 International Symposium on Knowledge Acquisition and Modeling, Wuhan, China, 20–21 December 2008; pp. 375–379. [Google Scholar]
  2. Available online: https://www.artc.org.tw/tw/research/detail/39 (accessed on 6 June 2022).
  3. Bakuła, M.; Pelc-Mieczkowska, R.; Walawski, M. Reliable and redundant RTK positioning for applications in hard observational conditions. Artif. Satell. 2021, 47, 23–33. [Google Scholar] [CrossRef]
  4. Speth, T.; Kamann, A.; Brandmeier, T.; Jumar, U. Precise relative ego-positioning by stand-alone RTK-GPS. In Proceedings of the 13th Workshop Positioning Navigation Communications (WPNC), Bremen, Germany, 19–20 October 2016; pp. 1–6. [Google Scholar]
  5. Langley, R.B. RTK GPS. GPS World 1998, 9, 70–76. [Google Scholar]
  6. Milanés, V.; Naranjo, J.E.; González, C.; Alonso, J.; de Pedro, T. Autonomous vehicle based in cooperative GPS and inertial systems. Robotica 2008, 26, 627–633. [Google Scholar] [CrossRef]
  7. Jo, K.; Chu, K.; Sunwoo, M. Interacting multiple model filter-based sensor fusion of GPS with in-vehicle sensors for real-time vehicle positioning. IEEE Trans. Intell. Transp. Syst. 2012, 13, 329–343. [Google Scholar]
  8. Song, Y.; Li, Q.; Li, B.; Xu, J.; Zheng, N. Real-Time Dynamic Display on Spatial Data for Car Navigation Service. In Proceedings of the International Conference on Intelligent Transportation System Telecommunications Proceedings, Toronto, ON, Canada, 17–20 September 2006; pp. 94–97. [Google Scholar]
  9. Bavle, H.; De La Puente, P.; How, J.P.; Campoy, P. VPS-SLAM: Visual planar semantic SLAM for aerial robotic systems. IEEE Access 2020, 8, 60704–60718. [Google Scholar] [CrossRef]
  10. Tateno, K.; Tombari, F.; Laina, I.; Navab, N. CNN-SLAM: Real-time dense monocular SLAM with learned depth prediction. IEEE CVPR 2017, 6243–6252. [Google Scholar]
  11. Available online: https://www.artc.org.tw/tw/research/detail/32 (accessed on 11 September 2022).
  12. Wang, L.; Zhang, Y.; Wang, J. Map-based localization method for autonomous vehicles using 3D-LIDAR. In Proceedings of the 20th IFAC World Congress, Toulouse, France, 14 July 2017; Volume 50, pp. 276–281. [Google Scholar]
  13. Kohlbrecher, S.; Stryk, O.; Meyer, J.; Klingauf, U. A Flexible and Scalable SLAM System with Full 3D Motion Estimation. In Proceedings of the International Conference on Safety Security and Rescue Robotics (SSRR), Kyoto, Japan, 1–5 November 2011; pp. 155–160. [Google Scholar]
  14. Pengpeng, S.S.; Zhao, X.; Xu, Z.; Wang, R.; Min, H. A 3D LiDAR data-based dedicated road boundary detection algorithm for autonomous vehicles. IEEE Access 2019, 7, 29623–29638. [Google Scholar]
  15. Brock, O.; Trinkle, F.; Ramos, J. Model-Based Vehicle Tracking for Autonomous Driving in Urban Environments; MIT Press: Cambridge, MA, USA, 2009; Volume 1, pp. 175–182. [Google Scholar]
  16. ISO 11898-1:2015; Road Vehicles—Controller Area Network (CAN)—Part 1: Data Link Layer and Physical Signaling. ISO: Geneva, Switzerland, 2015.
  17. ISO 11898-2:2016; Road Vehicles—Controller Area Network (CAN)—Part 2: High-Speed Medium Access Unit. ISO: Geneva, Switzerland, 2016.
  18. ISO 11898-3:2006; Road Vehicles—Controller Area Network (CAN)—Part 3: Low-Speed, Fault-Tolerant, Medium-Dependent Interface. ISO: Geneva, Switzerland, 2006.
  19. Miletiev, R.; Iontchev, E.; Yordanov, R. Multisensor precision positioning based on inertial and GNSS systems via 5G network. In Proceedings of the 45th International Spring Seminar on Electronics Technology (ISSE), Vienna, Austria, 11–15 May 2022; pp. 11–15. [Google Scholar]
  20. Mostafavi, S.S.; Sorrentino, S.; Guldogan, M.B.; Fodor, G. Vehicular Positioning Using 5G Millimeter Wave and Sensor Fusion in Highway Scenarios. In Proceedings of the IEEE International Conference on Communications (ICC), Dublin, Ireland, 7–11 June 2020; pp. 7–11. [Google Scholar]
  21. Eckermann, F.; Gorczak, P.; Wietfeld, C. Cooperative Validation of CAM Position Information Using C-V2X. In Proceedings of the IEEE 92nd Vehicular Technology Conference (VTC2020-Fall), Antwerp, Belgium, 25–28 May 2020; p. 15. [Google Scholar]
  22. Hedges, C.; Perry, F. Overview and Use of SAE j2735 Message Sets for Commercial Vehicles; SAE Technical Paper: Troy, MI, USA, 2008; No. 2008-01-2650. [Google Scholar]
  23. El-Shafie, A.; Hussain, A.; Eldin, A.E.N. ANFIS-Based Model for Real-time INS/GPS Data Fusion for Vehicular Navigation System. In Proceedings of the International Conference on Computer Technology and Development, Kota Kinabalu, Malaysia, 13–15 November 2009. [Google Scholar]
  24. Raveena, C.; Sravya, R.; Kumar, R.; Chavan, A. Sensor Fusion Module Using IMU and GPS Sensors For Autonomous Car. In Proceedings of the IEEE INOCON, Virtual Conference, 6–9 July 2020. [Google Scholar]
  25. Min, H.; Wu, X.; Cheng, C.; Zhao, X. Kinematic and dynamic vehicle model-assisted global positioning method for autonomous vehicles with low-cost GPS/camera/in-vehicle sensors. Sensors 2019, 19, 5430. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  26. Hashim, H.A.; Eltoukhy, A.E.E. Landmark and IMU data fusion: Systematic convergence geometric nonlinear observer for SLAM and velocity bias. IEEE Trans. Intell. Transp. Syst. 2020, 23, 3292–3301. [Google Scholar] [CrossRef]
Figure 1. Sensors deployed in the WinBus.
Figure 2. Positioning system.
Figure 3. Functional blocks of the positioning system.
Figure 4. Fault detection method function block.
Figure 5. The experimental route.
Figure 6. Comparison of SLAM and RTK results.
Figure 7. Comparison of SLAM, RTK, fusion, and prediction results.
Figure 8. Comparison of the RTK, SLAM, fusion, prediction, and fusion with the diagnostic mechanism results.
Figure 9. The results from each positioning method (longitude).
Figure 10. The results from each positioning method (latitude).
Figure 11. The results from each positioning method in the anomalous region (longitude).
Figure 12. The results from each positioning method in the anomalous region (latitude).
Figure 13. Comparison of the results based on the RTK results.
Table 1. The comparison between the methods based on the maximum positioning error of 91.8 m.

Error Condition | Positioning Error (m) | Correction Rate (%)
Limited two-meter error | 89.80 | 97.82
RTK vs. original fusion algorithm | 18.30 | 80.06
RTK vs. prediction | 0.64 | 99.30
RTK vs. fusion with diagnostic mechanism | 0.25 | 99.73
RTK vs. fusion with diagnostic mechanism (considering the limited two-meter error) | 2.25 | 89.89