Communication

HAIS: Highways Automated-Inspection System

Faculty of Engineering and Applied Science, Ontario Tech University (UOIT), Oshawa, ON L1G 0C5, Canada
* Author to whom correspondence should be addressed.
Technologies 2023, 11(2), 51; https://doi.org/10.3390/technologies11020051
Submission received: 2 March 2023 / Revised: 22 March 2023 / Accepted: 28 March 2023 / Published: 1 April 2023

Abstract

A smart city is a trending concept describing a new generation of cities operated intelligently with minimal human intervention. It promotes energy sustainability, minimal environmental impact, and better governance. In transportation, remote highway infrastructure monitoring will enhance driver safety, continuously report road conditions, and identify potentially hazardous incidents such as accidents, floods, or snow storms. In addition, it facilitates the integration of future cutting-edge technologies such as self-driving vehicles. This paper presents a general introduction to a smart monitoring system for automated real-time road condition inspection. The proposed solution includes hardware devices/nodes and software applications for data processing and road condition inspection using hybrid algorithms based on digital signal processing and artificial intelligence technologies. The proposed system has an interactive web interface for real-time data sharing, infrastructure monitoring, visualization, and management of inspection reports, which can improve the maintenance process.

1. Introduction

Nowadays, the Fourth Industrial Revolution (sometimes called 4IR or Industry 4.0) is pushing industries toward integrating state-of-the-art technologies, such as artificial intelligence (AI), robotics, and the Internet of Things (IoT) [1]. In public transportation, the highway status will be digitized, allowing real-time data collection and sharing, surveillance of the infrastructure condition of bridges and tunnels, and traffic monitoring and management. These factors will help prepare the digital capabilities needed to integrate and assist the transition to fully connected and automated vehicles (CAVs) [2]. This transition is feasible given the recent innovation, research, and application of autonomous driving technology, robotics, and artificial intelligence [3,4,5], while complying with legal regulations [6,7]. The application of these technologies to infrastructure inspection will supply autonomous vehicles with the required information on highway conditions for a better understanding of the road environment dynamics while driving, particularly in winter conditions, where safety risks increase. To facilitate the integration of CAVs and ensure the safety of drivers and pedestrians, monitoring and analyzing road conditions play a critical role in addressing the following needs and concerns:
  • Safety concern: Road conditions vary in winter, representing a hazard for safe driving. Thus, all roads need to be inspected frequently to ensure better maintenance.
  • Camera limitation: The individual use of a camera for inspection limits the inspection performance due to ambient light variation and visibility reduction (e.g., dust or snow). Therefore, other sensors, such as ultrasound-based or laser-based sensors, should be added to work collaboratively and increase the inspection accuracy.
  • Vehicle assistance: Drivers need external aid when driving in harsh weather or when the onboard sensors are no longer reliable, such as a camera in snow or in a tunnel. Drivers and autonomous vehicles (AVs) need real-time information about the critical zones that have been inspected recently, since real-time knowledge of road conditions and potential road hazards helps enhance safety measures and save lives.
  • Connectivity: Integrating secure and reliable connections, such as vehicle-to-vehicle, vehicle-to-infrastructure, and infrastructure-to-vehicle, will help enhance the safety measures within a connected collaborative-driving network.
Consequently, optimizing infrastructure monitoring/inspection will play a key role in providing a road/bridge condition feed at minimal cost. This valuable information will be used to localize the damage, notify the drivers, and request repairs from the maintenance team. The transportation infrastructure, such as highways, bridges, and tunnels, can be inspected using four main inspection categories:
  • Manual inspection: regular missions/visits are performed to check the road visually or using measurement tools. If needed, the reported damage is forwarded to the maintenance team for further investigation and repair. The inspection relies mainly on the technician's visual assessment.
  • Satellite-based inspection: satellites can be used to scan the road landscape and report the damage [8]. This approach is expensive, but it can provide good results depending on the satellite imagery resolution and the weather conditions.
  • Inspection node (semi-automated): an inspection vehicle, equipped with a node composed of a sensor-based platform (cameras, LiDAR), is used to scan the road and send the data to an online platform that reports the inspection results.
  • Autonomous inspection: a self-driven vehicle (robot or drone) is used for the inspection [9]. In this case, the inspection vehicle automatically defines the inspection mission trajectory based on the assigned target area. The collected data are processed online to provide the inspection results.
This paper presents the proposed semi-automated hardware/software solution for road inspection, called the Highways Automated-Inspection System (HAIS). The proposed HAIS system takes advantage of recent advances in deep learning and computer vision technology to assess and monitor highway conditions and enforce safety measures by giving real-time driving assistance. The system has a hardware module that prepares and sends the collected sensor data to the cloud (see the system illustration in Figure 1). These data are processed using the designed inspection algorithms, and the resulting inspection report is visualized on an interactive user interface.
The remainder of the paper is organized as follows. Section 2 presents a general overview of the proposed framework flowchart and its different modules and explores the data collection and preparation module. Section 3 explores the data transfer module. Section 4 reports the inspection module with examples of the obtained inspection results. Section 5 presents the interactive user interface module. Finally, concluding remarks and future works are summarized in Section 6.

2. System Design and Prototyping

This work investigates how to make highways safer and suitable for autonomous driving in the future. The proposed system focuses on automatic road condition inspection and safety assessment and sends this information to all vehicles sharing the road. The proposed system will enhance real-time monitoring and improve the efficiency of the maintenance schedule. Moreover, it will facilitate the integration of autonomous driving by generating smart, digital highways that can assist the vehicle with safety alarms.
The proposed solution uses inspection nodes to diagnose road conditions and share these data with the data center (cloud). The designed and prototyped hardware is an inspection node composed of different sensors with a 5G/4G connection to collect data from roads and send them to the cloud database (see Figure 2). The system can be mounted on vehicles, buses, and snow removal trucks, allowing more scanning flexibility. The HAIS system processes the data of the different sensors in a hierarchical scheme, performing an automated road inspection in real-time. The inspection system uses the sensor measurements and the inspection history to provide a final inspection report: road conditions and preventive maintenance scheduling. The first prototype of the inspection node is presented in Figure 3.
One or more sensors of the inspection node are used to monitor a specific element of the road conditions. For instance, the road camera and LiDAR sensors are mainly used for road damage detection, while the side camera is used for lane marker reflectivity estimation (see Figure 4). The data collection is a road screening process based on a predefined mission. The collected data are then structured and transformed into the nuScenes schema [10], as sketched below. Examples of the collected sensor data are presented in Figure 4.
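As an illustration of this structuring step, the following Python sketch builds simplified nuScenes-like "sample" and "sample_data" records for one capture instant. The field subset, channel names, and file paths are assumptions for illustration; they do not reproduce the exact schema fields used by HAIS.

```python
import json
import uuid
from pathlib import Path

def make_token() -> str:
    """nuScenes-style unique record token."""
    return uuid.uuid4().hex

def build_sample_records(timestamp_us: int, sensor_files: dict):
    """Structure one capture instant into simplified nuScenes-like
    'sample' and 'sample_data' records (a reduced field subset)."""
    sample = {"token": make_token(), "timestamp": timestamp_us,
              "prev": "", "next": "", "scene_token": ""}
    sample_data = [{
        "token": make_token(),
        "sample_token": sample["token"],
        "filename": filename,                      # path relative to the dataset root
        "fileformat": Path(filename).suffix.lstrip("."),
        "timestamp": timestamp_us,
        "channel": channel,                        # e.g., CAM_FRONT, LIDAR_TOP
    } for channel, filename in sensor_files.items()]
    return sample, sample_data

# Example: one frame with a road camera, a side camera, and a LiDAR sweep
sample, data = build_sample_records(
    1679300000000000,
    {"CAM_FRONT": "samples/CAM_FRONT/frame_0001.jpg",
     "CAM_SIDE": "samples/CAM_SIDE/frame_0001.jpg",
     "LIDAR_TOP": "samples/LIDAR_TOP/sweep_0001.pcd"},
)
out_dir = Path("mission_001")
out_dir.mkdir(exist_ok=True)
(out_dir / "sample.json").write_text(json.dumps([sample], indent=2))
(out_dir / "sample_data.json").write_text(json.dumps(data, indent=2))
```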

3. Data Collection and Transfer Module

The main aim of this component is to save the collected data in the cloud database and retrieve them on the server side, taking into account the response time, the size of the data, and offline operation. In addition, it provides the receiver side with API-enabled requests, which allow the user to retrieve specific data by mission ID. Figure 5 shows the general flowchart of the data transfer module, which is composed of the following components (a minimal end-to-end sketch is given after this list).
  • Data compression: compresses the data received from the inspection node and stores the compressed file in a local folder based on a predefined compression size. The tarfile library is used to create a .tar archive containing the sensor data at the previously defined compression size [11].
  • Local storage: the compressed data are kept locally until they are successfully sent to the Firebase cloud [12], where all previously collected data are stored in the Firebase online database. In case of a connection interruption or sending error, the transmission thread keeps trying to send the compressed data while preserving a local backup until the program confirms that the data have been completely and successfully written to the database.
  • Data exchange using an API: this is the main component on the receiver side. It enables the web application to download, retrieve, and request specific data from the database using Flask APIs. This component takes the vehicle number and the data token key and returns a specific mission trajectory with the corresponding collected data.
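The following Python sketch illustrates how these three components could fit together, assuming the standard tarfile, firebase_admin, and Flask libraries. The bucket name, credentials file, token check, paths, and response fields are placeholders, not the actual HAIS implementation.

```python
import tarfile
import time
from pathlib import Path

import firebase_admin
from firebase_admin import credentials, storage
from flask import Flask, jsonify, request

EXPECTED_TOKEN = "demo-token"   # hypothetical shared data token key

# --- Sender side: compress a mission folder and upload it with retries ---

def compress_mission(mission_dir: str, out_path: str) -> str:
    """Pack the sensor data of one mission into a .tar archive."""
    with tarfile.open(out_path, "w") as tar:
        tar.add(mission_dir, arcname=Path(mission_dir).name)
    return out_path

def upload_with_retry(archive_path: str, bucket, retry_delay_s: int = 30) -> None:
    """Keep trying to push the archive to Firebase Storage; the local
    backup is only removed once the upload succeeds."""
    blob = bucket.blob(f"missions/{Path(archive_path).name}")
    while True:
        try:
            blob.upload_from_filename(archive_path)
            Path(archive_path).unlink()      # transfer confirmed: drop the local backup
            return
        except Exception:
            time.sleep(retry_delay_s)        # offline or transient error: retry later

# --- Receiver side: minimal Flask API returning a mission by vehicle/token ---

app = Flask(__name__)

@app.route("/api/mission/<vehicle_id>")
def get_mission(vehicle_id: str):
    if request.args.get("token") != EXPECTED_TOKEN:
        return jsonify({"error": "unauthorized"}), 401
    # The real system queries the Firebase database here; this sketch only
    # shows the response shape (trajectory plus sensor data references).
    return jsonify({"vehicle": vehicle_id, "trajectory": [], "sensors": []})

if __name__ == "__main__":
    cred = credentials.Certificate("service-account.json")   # assumed credentials file
    firebase_admin.initialize_app(cred, {"storageBucket": "hais-demo.appspot.com"})
    archive = compress_mission("missions/mission_001", "mission_001.tar")
    upload_with_retry(archive, storage.bucket())
    app.run()
```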

4. Inspection Module

The HAIS system provides an integrated platform for real-time road inspection. The project aims to investigate how to make highways safer and suitable for autonomous driving in the future. The proposed approach takes advantage of recent advances in deep learning and computer vision technology to assess and monitor highway conditions and enforce safety measures by giving real-time driving assistance. It inspects the highway/road status, such as snow-covered or damaged roads, and its conformity with safety measures. The inspection accuracy is enhanced by integrating and processing the data of different types of sensors using deep learning algorithms to diagnose the road conditions, as shown in Figure 6. The road inspection system integrates a complementary detection approach using both digital signal processing (DSP)-based and machine learning (ML)-based algorithms.
The main DSP-based algorithm is a damage detection algorithm based on image variation. It detects damage as a variation in the road texture and then selects the most significant defects based on their area. An example of the obtained damage detection is shown in Figure 7.
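A minimal sketch of such a texture-variation detector is given below, using OpenCV. The variance kernel, threshold, and minimum area are illustrative values, not the parameters used in HAIS.

```python
import cv2
import numpy as np

def detect_damage(image_bgr, var_kernel=15, var_thresh=250.0, min_area=400):
    """Sketch of a texture-variation detector: regions whose local intensity
    variance deviates strongly from the smooth asphalt texture are kept,
    then the largest candidates are selected by area."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY).astype(np.float32)
    mean = cv2.blur(gray, (var_kernel, var_kernel))
    sq_mean = cv2.blur(gray * gray, (var_kernel, var_kernel))
    local_var = sq_mean - mean * mean                      # local texture variance
    mask = (local_var > var_thresh).astype(np.uint8) * 255
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    boxes = [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) >= min_area]
    return sorted(boxes, key=lambda b: b[2] * b[3], reverse=True)

# Usage: draw the candidate damage regions on a road frame (placeholder path)
img = cv2.imread("road_frame.jpg")
for x, y, w, h in detect_damage(img):
    cv2.rectangle(img, (x, y), (x + w, y + h), (0, 0, 255), 2)
cv2.imwrite("road_frame_damage.jpg", img)
```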
The inspection of the retro-reflectivity of road lane markings on all highways is an important diagnostic for assessing road safety for drivers during low-light/night-time conditions. The proposed algorithm processes the RGB camera images and estimates the reflectivity based on the pixel intensity in the segmented lane marker. An example of the obtained retro-reflectivity estimation is shown in Figure 8.
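The sketch below illustrates this intensity-based idea: segment the bright lane-marker pixels inside a region of interest and use their mean intensity as a reflectivity proxy. The Otsu-based segmentation, the region of interest, and the normalization are assumptions for illustration and not the exact HAIS algorithm.

```python
import cv2
import numpy as np

def estimate_marker_reflectivity(image_bgr, roi=None):
    """Return a reflectivity proxy in [0, 1]: the mean intensity of the
    segmented lane-marker pixels inside an optional (x, y, w, h) ROI."""
    if roi is not None:
        x, y, w, h = roi
        image_bgr = image_bgr[y:y + h, x:x + w]
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    # Otsu thresholding separates the bright painted marker from the darker asphalt.
    _, marker_mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    marker_pixels = gray[marker_mask > 0]
    if marker_pixels.size == 0:
        return 0.0
    return float(np.mean(marker_pixels)) / 255.0

# Usage with a hypothetical side-camera frame and marker ROI
score = estimate_marker_reflectivity(cv2.imread("side_camera_frame.jpg"),
                                     roi=(600, 400, 300, 200))
print(f"estimated retro-reflectivity score: {score:.2f}")
```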
The ML-based algorithms are integrated into the HAIS system to better understand the driving environment. The proposed classification, object detection, and semantic segmentation models are trained using the collected dataset, which was manually annotated, and public datasets, such as the RTK dataset [13]. The trained models are used to enhance the inspection performance as follows:
  • Classification: to recognize the road safety index based on the weather conditions and road quality. Examples of the CARLA-based simulated dataset [14] and the public dataset used in this work are presented in Figure 9 and Figure 10. The trained classification model achieved an accuracy of 97.1% and 97.8% in road snow coverage detection and safety index prediction, respectively.
  • Segmentation: to detect the road surface in order to avoid false positive damage coming from the surrounding objects in the image. The trained segmentation model achieved a Dice score of up to 0.99, as shown in Figure 11.
  • Object detection: to detect road damage, mainly cracks, and compute the damage area of each crack. Examples of the results obtained using an object detection model, trained on a manually annotated dataset collected with a dashcam, the inspection node, and a drone, are presented in Figure 12. The trained YOLOv5 model achieved an IoU of up to 0.7 (a minimal inference sketch is given after this list).
It is worth mentioning that the size of the dataset (number of images) plays a key role in improving the performance of the previously reported models.
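As a minimal example of the object detection step, the following sketch loads a YOLOv5 model through the public torch.hub entry point and derives a rough pixel-area estimate of the detected damage. The pretrained COCO weights are a placeholder; the actual HAIS model is fine-tuned on the annotated crack dataset, and converting pixel areas to physical units would additionally require the camera calibration.

```python
import torch

# Load a pretrained YOLOv5 model from the Ultralytics hub (placeholder weights).
model = torch.hub.load("ultralytics/yolov5", "yolov5s", pretrained=True)

# Run inference on a road image and compute a rough damage-area estimate
# (in pixels) from the predicted bounding boxes.
results = model("road_frame.jpg")
detections = results.pandas().xyxy[0]          # one row per detected object
damage_area_px = ((detections["xmax"] - detections["xmin"]) *
                  (detections["ymax"] - detections["ymin"])).sum()
print(detections[["name", "confidence"]])
print(f"total detected damage area: {damage_area_px:.0f} px^2")
```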

5. User Interface Module

The designed web-app interface is an HTML5/JavaScript-based user interface that allows the operator to visualize the collected data on a map from the inspection database on the Firebase cloud and create inspection reports in real-time. The web-app interface provides registered users with real-time and robust interaction and visualization, as shown in Figure 13, and is summarized as follows (a minimal map-plotting sketch is given at the end of this section):
  • Explore the collected data at every location by showing the different sensor data, such as road image, car speed, and car position.
  • Visualize the road conditions report generated by the inspection algorithms.
  • Manage the inspection and reparation missions by showing each mission separately.
A demonstration of the interactive user interface is available online: https://youtu.be/5k4igwUg2ao (accessed on 20 March 2023).
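For readers who prefer a scripted equivalent of this visualization, the sketch below fetches a mission from the hypothetical Flask endpoint shown in Section 3 and plots the reported road conditions on an interactive map with folium. The endpoint, token, coordinates, and field names are placeholders for illustration; the actual web interface is implemented in HTML5/JavaScript.

```python
import folium
import requests

# Fetch one mission from the (hypothetical) HAIS API and plot its trajectory.
resp = requests.get("http://localhost:5000/api/mission/vehicle-01",
                    params={"token": "demo-token"})
mission = resp.json()

m = folium.Map(location=[43.94, -78.90], zoom_start=13)   # Oshawa area as an example
for point in mission.get("trajectory", []):
    colour = "red" if point.get("damage_detected") else "green"
    folium.CircleMarker(
        location=[point["lat"], point["lon"]],
        radius=4, color=colour, fill=True,
        popup=f"speed: {point.get('speed_kmh', '?')} km/h",
    ).add_to(m)
m.save("mission_report_map.html")   # open in a browser to explore the mission
```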

6. Conclusions and Future Work

In this work, we presented an automated highway inspection system enabling real-time hybrid road condition inspection and monitoring. The proposed solution includes a hardware device/node architecture and software applications for data processing, such as road condition inspection using computer vision and artificial intelligence technologies. It enables sharing of the collected data using the Firebase cloud platform. In addition, it provides users with an interactive web-app user interface that allows real-time monitoring, visualization, and management of inspection reports, which will help in planning maintenance operations. The obtained inspection results are promising in different weather conditions:
  • Road snow coverage detection: an average prediction accuracy of 97.1%.
  • Road safety index prediction: an average prediction accuracy of 97.8%.
  • Road damage detection: an IoU of up to 0.7.
In addition, two evaluation metrics are proposed to assess the road quality and lane marker quality; they can be personalized to define the criteria under which repair or maintenance is required.
The proposed solution can be extended and adapted to other infrastructure inspections, such as bridges and tunnels, using autonomous drones. Therefore, we will implement an optimization-based algorithm for autonomous trajectory planning taking into consideration the number of drones, available power, and the minimization of the overall cost and resources.

Author Contributions

H.A.G. is the Principal Investigator. K.E. is the Co-Principal Investigator. M.U.I. designed the mechanical parts of the inspection node. S.G. participated in the CARLA simulation. K.P.S. participated in the node design. A.M. designed and implemented the data transfer algorithm and the API communication. H.O. designed the graphical user interface. A.C. designed the system and the DSP and ML inspection algorithms. A.C., M.U.I. and H.A.G. participated in the data collection using the inspection node and the drone. A.C. wrote the main draft of the paper, H.A.G. reviewed the paper, and all authors participated in writing parts of the paper. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Ministry of Transportation Ontario (MTO).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Acknowledgments

Research reported in this publication has been supported by the Ministry of Transportation Ontario (MTO). The authors would like to thank the students who participated in different phases of the project development and testing: Muhammad Idrees helped with the sensor assembly and the setup of the Jetson Nano, and he initiated the ROS programming. Elena Villalobos Herra helped with the data collection process.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
API     application programming interface
DSP     digital signal processing
IoU     intersection over union
ML      machine learning
SQL     structured query language
RTK     road traversing knowledge

References

  1. Salesforce Canada. Why Canadian Manufacturers Need To Understand–And Embrace–The Fourth Industrial Revolution; Salesforce Canada: Toronto, ON, Canada, 2022. [Google Scholar]
  2. Saeed, T.U.; Alabi, B.N.T.; Labi, S. Preparing Road Infrastructure to Accommodate Connected and Automated Vehicles: System-Level Perspective. J. Infrastruct. Syst. 2021, 27, 6020003. [Google Scholar] [CrossRef]
  3. Panagiotopoulos, I.; Dimitrakopoulos, G. An empirical investigation on consumers’ intentions towards autonomous driving. Transp. Res. Part C Emerg. Technol. 2018, 95, 773–784. [Google Scholar] [CrossRef]
  4. Sun, P.; Kretzschmar, H.; Dotiwalla, X.; Chouard, A.; Patnaik, V.; Tsui, P.; Guo, J.; Zhou, Y.; Chai, Y.; Caine, B.; et al. Scalability in perception for autonomous driving: Waymo open dataset. In Proceedings of the 2020 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, Seattle, WA, USA, 13–19 June 2020; pp. 2443–2451. [Google Scholar] [CrossRef]
  5. Grigorescu, S.; Trasnea, B.; Cocias, T.; Macesanu, G. A survey of deep learning techniques for autonomous driving. J. Field Robot. 2020, 37, 362–386. [Google Scholar] [CrossRef] [Green Version]
  6. O’Sullivan, S.; Nevejans, N.; Allen, C.; Blyth, A.; Leonard, S.; Pagallo, U.; Holzinger, K.; Holzinger, A.; Sajid, M.I.; Ashrafian, H. Legal, regulatory, and ethical frameworks for development of standards in artificial intelligence (AI) and autonomous robotic surgery. Int. J. Med. Robot. Comput. Assist. Surg. 2019, 15, e1968. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  7. Imai, T. Legal regulation of autonomous driving technology: Current conditions and issues in Japan. IATSS Res. 2019, 43, 263–267. [Google Scholar] [CrossRef]
  8. Brewer, E.; Lin, J.; Kemper, P.; Hennin, J.; Runfola, D. Predicting road quality using high resolution satellite imagery: A transfer learning approach. PLoS ONE 2021, 16, e0253370. [Google Scholar] [CrossRef]
  9. Menendez, E.; Victores, J.G.; Montero, R.; Martínez, S.; Balaguer, C. Tunnel structural inspection and assessment using an autonomous robotic system. Autom. Constr. 2018, 87, 117–126. [Google Scholar] [CrossRef]
  10. Caesar, H.; Bankiti, V.; Lang, A.H.; Vora, S.; Liong, V.E.; Xu, Q.; Krishnan, A.; Pan, Y.; Baldan, G.; Beijbom, O. Nuscenes: A multimodal dataset for autonomous driving. In Proceedings of the 2020 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, Seattle, WA, USA, 13–19 June 2020; pp. 11618–11628. [Google Scholar] [CrossRef]
  11. Tarfile—Read and Write Tar Archive Files—Python 3.11.2 Documentation. Available online: https://docs.python.org/3/library/tarfile.html (accessed on 20 March 2023).
  12. Moroney, L. The Firebase Realtime Database. In The Definitive Guide to Firebase; Moroney, L., Ed.; Apress: Berkeley, CA, USA, 2017; pp. 51–71. [Google Scholar] [CrossRef]
  13. Rateke, T.; Justen, K.A.; von Wangenheim, A. Road surface classification with images captured from low-cost camera-road traversing knowledge (RTK) dataset. Rev. De Inform. Teor. E Apl. 2019, 26, 50–64. [Google Scholar] [CrossRef] [Green Version]
  14. Dosovitskiy, A.; Ros, G.; Codevilla, F.; Lopez, A.; Koltun, V. CARLA: An Open Urban Driving Simulator. In Proceedings of the 1st Annual Conference on Robot Learning, Mountain View, CA, USA, 13–15 November 2017; pp. 1–16. [Google Scholar]
Figure 1. Illustration of the proposed HAIS inspection system.
Figure 2. The proposed framework of the HAIS system.
Figure 3. The first prototype of the inspection node.
Figure 4. Examples of the collected data using the inspection node.
Figure 5. Illustration of the data transfer module flowchart.
Figure 6. The flowchart of the automated inspection approach.
Figure 7. Example of road damage detection.
Figure 8. Example of road lane markings retro-reflectivity inspection.
Figure 9. Examples of CARLA simulation dataset of the RGB camera sensor.
Figure 10. Examples of winter RTK classification dataset.
Figure 11. Examples of road segmentation using the RTK segmentation dataset.
Figure 12. Examples of damage detection using: (a) Dashcam, (b) node camera, and (c) drone camera.
Figure 13. Road condition visualization data exploration using the inspection node data.
