Article

Sensors Fusion and Multidimensional Point Cloud Analysis for Electrical Power System Inspection

by Vinicius F. Vidal 1, Leonardo M. Honório 1,*, Felipe M. Dias 1, Milena F. Pinto 2, Alexandre L. Carvalho 3 and Andre L. M. Marcato 1

1 Electrical Engineering Department, Federal University of Juiz de Fora, Juiz de Fora 36036, Brazil
2 Electronics Department, Federal Center for Technological Education of Rio de Janeiro, Rio de Janeiro 20271, Brazil
3 MRS Logistica, Juiz de Fora 36060, Brazil
* Author to whom correspondence should be addressed.
Sensors 2020, 20(14), 4042; https://doi.org/10.3390/s20144042
Submission received: 9 June 2020 / Revised: 3 July 2020 / Accepted: 3 July 2020 / Published: 21 July 2020
(This article belongs to the Section Intelligent Sensors)

Abstract:
Thermal inspection is a powerful tool that enables the diagnosis of several components at their early degradation stages. One critical aspect that influences thermal inspection outputs is infrared reflection from external sources. This situation may change the readings, demanding that an expert correctly define the camera position, which is a time-consuming and expensive operation. To mitigate this problem, this work proposes an autonomous system capable of identifying infrared reflections by filtering and fusing data obtained from both stereo and thermal cameras. The process starts by acquiring readings from multiple Observation Points (OPs) where, at each OP, the system fuses the 3D point cloud and the thermal image. The result is a dense point cloud where each point has its spatial position and temperature. Considering that each point’s information is acquired from multiple poses, it is possible to generate a temperature profile of each spatial point and filter undesirable readings caused by interference and other phenomena. To deploy and test this approach, a Directional Robotic System (DRS) is mounted over a traditional human-operated service vehicle. In that way, the DRS autonomously tracks and inspects any desired equipment as the service vehicle passes by. To demonstrate the results, this work presents the algorithm workflow, a proof of concept, and a real application, showing improved performance in real-life conditions.

1. Introduction

Electrical companies rely on a vast infrastructure to safely operate their business, requiring periodic inspection and maintenance. This scenario is especially critical for the electrical power transmission and distribution network, which can reach thousands of kilometers. There are several different types of inspections. However, some electrical equipment faults create thermal stress points due to a resistance increase, which can be detected through thermal data analysis before a major contingency occurs. Usually, these inspections are performed manually by trained personnel, with several disadvantages, such as high cost, long execution time, risks to human life, and human error. The application of automated inspection methods can mitigate the aforementioned problems.
Many different methods can be applied to perform automated condition inspection [1]. For instance, in Reference [2], a few different techniques are analyzed, such as the application of vibration sensors and torque monitoring. Despite good results, those techniques require the implementation of dedicated sensors and the deployment of a sensor network to allow data storage and analysis, increasing cost and complexity.
Moreover, computer vision techniques present good alternatives for many different scenarios. In recent years, various studies have proposed defect detection in equipment through computer vision. In Reference [3], an Unmanned Aerial Vehicle (UAV) equipped with two cameras performs a three-dimensional reconstruction of an industrial facility for later inspection. Another application of image processing is found in Reference [4], where the authors proposed an image processing system for defect detection in paved streets based on color and texture information. Reference [5] places a camera on top of a running train, and an image processing system checks the distance between the rails, enabling fault detection.
A common characteristic of the previous works is that they rely purely on the visible spectrum, which is a disadvantage since most failures in electrical parts generate heat from impedance increase or excessive load. Moreover, thermal imaging has been used in recent years to prevent possible faults in electrical units. According to Reference [6], bad connections, unbalanced loading, excessive use, or wear are some of the factors that cause thermal stress in electrical components, which can be detected as hotspots in thermal images. Thermal cameras and 3D thermal models have already been used, separately, in a wide range of applications, such as building inspection [7,8], defect detection [9], and energy efficiency analysis [10,11]. However, only a few works have applied the combination of 3D modeling with thermal data, for example, Reference [12], and these are still limited and not used in an online fashion. Other works have applied thermal inspection to electrical equipment, such as Reference [13]; however, there is no combination with 3D reconstruction, limiting its potential.
The main contribution of this research is the design and implementation of a Multidimensional Point Cloud Analysis (MPCA) methodology. The system is composed of an autonomous Directional Robotic System (DRS) robot embedding two RGB cameras and one thermal camera, with the capacity to move in the tilt and pan directions. The application of a Simultaneous Localization and Mapping (SLAM) methodology allows it to evaluate the robot’s exact position to point the cameras, generating n-dimensional models of previously selected objects. By calibrating the cameras, it is also possible to project the temperature readings from the thermal image onto the 3D model. This proposal generates a Multidimensional Point Cloud (MPC) through the association of each spatial point with a thermal value. Moreover, by repeating this process from different poses, each 3D point acquires a temperature profile built from its readings at different positions. The analysis of these profiles can indicate misreadings due to infrared reflection, improving the quality of the measurements. Finally, since in thermal inspection the problem location is a reliable indication of the diagnosis itself, this approach facilitates the correct analysis and helps the diagnosis.
To demonstrate the effectiveness of this approach, the MPC will be built over a vehicle for inspection of an electrical power distribution system. These research contributions can be summarized as follows:
  • A mechanism to provide reliable 3D/thermal (MPC) information for equipment inspection.
  • A real application of automated electrical power distribution line inspection using a DRS along with stereo and thermal cameras to result in real-time MPCs.
  • An optimized approach to process multiple MPCs to filter thermal misreadings.
The remainder of this research is organized as follows. Section 2 presents a brief review of the related work, highlighting the state of the art in SLAM and thermal inspection. Section 3 details the architecture and its foundations for 3D reconstruction, Section 4 presents the system assembly and deployment, and Section 5 presents the proposed experiments with a discussion of the results. The concluding remarks and future work are presented in Section 6.

2. Background and Related Works

Inspections of electrical power transmission/distribution equipment are performed by four main methods, that is, helicopters, UAVs, road vehicles, and manual inspection by operators. Helicopters and UAVs have the disadvantage of operating relatively far from the inspected area. Helicopters can also be quite expensive, while some UAVs have short battery life that may limit their use. Manual inspection by operators is costly and time-consuming, making it impractical in some situations. Thus, the use of road vehicles with mounted equipment becomes a suitable option for this type of inspection, especially at the distribution level.
Inspection based on vehicle-mounted systems is used in many areas. The most common use of this technique is concentrated in the rail inspection on works such as Reference [14]. However, some studies have applied these techniques to inspect tunnels [15], platforms, and ancillary equipment [16].
Many challenges arise in the application of computer vision to perform autonomous inspections. Those challenges are related to some practical aspects of the robot. For example, when determining the robot position accurately, performing proper 3D reconstruction using multiple data from the visual and thermal camera hardware, or controlling the primary function of the robotic system. The following subsections discuss related works using the techniques proposed in this research.

2.1. Precise Localization Using Visual Odometry (VO)

The current proposal depends on the robot’s precise real-time position. This information is used to aim the system at a desirable and previously known object, and to fuse each point cloud into the final n-dimensional model.
According to Reference [17], environment mapping, such as the SLAM technique, is a fundamental approach to guarantee safe path planning. In the last decades, a few well-known SLAM methods were developed. The work of Reference [18] introduced a technique called LSD-SLAM. This method performs SLAM from direct image alignment using a monocular camera. The result is a pose graph without the scale-drift problem inherent in monocular vision. The authors of Reference [19] proposed an algorithm that outperformed LSD-SLAM in localization and mapping with 3D semi-dense reconstruction and VO. This process fuses and filters information from stereo vision and other sensors, such as an IMU. They presented qualitative and quantitative results on real datasets, running in real time on a CPU. Reference [20] introduces the open-source ORB-SLAM2 algorithm, presenting a complete SLAM solution using monocular, stereo, or RGB-D cameras. The results surpassed LSD-SLAM in metrics such as time and rotation error on many KITTI benchmark datasets, with the advantage of accuracy and efficiency while also running on a CPU. Note that some modern techniques for VO calculated with high-frame-rate cameras guarantee good quality mapping and localization from relatively slow movements, requiring less processing capacity from the hardware [17]. Applications assessing their effectiveness can be found in a wide variety of vehicle types and environments, as seen in References [21,22].
Stereo Vision is a consolidated technique for robotics applications, computing both 3D information maps and VO for the robot in the environment, even in real-time. Some works using and assessing stereo vision SLAM results can be found in References [23,24], where it is compared to other sensors and used for obstacle avoidance as well. The work of Reference [25] uses stereo vision to perform SLAM in multi-robot team control. The literature has also highlighted the effectiveness of this technique in outdoor scenarios.
Most state-of-the-art methodologies use SLAM for autonomous navigation or complex environment reconstruction, so this work needs a robust yet lightweight SLAM methodology for fast and accurate spatial results. For this reason, a modified ORB-SLAM2 VO [20] running in a closed loop with a traditional Kalman filter is applied for robot localization.

2.2. Thermal Inspection in Engineering

There are several works in the literature regarding the use of infrared cameras for industrial and maintenance applications. One noticeable field is the inspection of buildings and constructions, searching for heat leakage or electrical equipment issues. The research presented in Reference [26] brings a solution for generating thermal building information models by fusing information from an infrared camera with a 3D laser scanner. The equipment returns a model containing the temperature distribution inside each room for further analysis.
Regarding infrared thermography for electrical equipment, some studies have presented solutions and results from the image data itself. The work developed in Reference [27] shows quantitative and qualitative methods for analyzing defects from thermal images and gathered temperature values, as well as their automatic recognition. In Reference [28], a fuzzy system is applied to automatically recognize and classify equipment failures from thermal image inputs. Induction motors are the focus of Reference [29], where the authors developed an algorithm to classify the faults observed in the thermal images.
Thermal and visible inspection is widely applied in the railroad industry, where the rail and vehicle conditions, as well as the surrounding distribution lines, are subject to fault risks that can be prevented by analyzing infrared images. Reference [30] mentions that the inspection labor is often done by land in an ineffective manner, and brings a solution using a UAV for automated image data gathering. Despite being an applicable approach, it still relies on many conditions, for example, weather, vehicle line of sight, and channel link quality, not to mention the need for trained pilots.
Using RGB and thermal cameras, Reference [31] proposed a solution for correct thermal image registration with a novel image descriptor combining visual and thermal information to inspect the components. The results are used for detecting thermal issues, though still restricted to 2D. Fusing the data acquired by thermal and RGB-D cameras, Reference [32] presented a device that scans real objects in 3D and returns a registered point cloud with thermal information as the final result. The whole process is described, from camera parameter calibration and motion estimation to data fusion into the point cloud. Still, the process is performed manually and is not suitable for many external applications. In Reference [33], the authors presented a system to generate 3D thermal models with a combination of a stereo, an RGB, and a thermal camera. Despite having three cameras, the stereo camera is not used for the 3D models but to generate the odometry data. Thus, this system has the same limitations as Reference [32].
Therefore, the motivation of this work is to use the benefits of thermal analysis in distribution line components in an automated fashion. This motivation, combined with a lightweight algorithm that calculates point clouds and VO to perform SLAM, provides an n-dimensional thermal and visual model of a given component, with data acquired from different poses.

3. The MPCA Approach

As stated before, the proposed approach fuses temperature data from a thermal camera with a 3D point cloud generated by a stereo camera for further analysis. Figure 1 presents a global overview of the proposed methodology divided into its seven processes. The system performs concurrent processing with delivered responses varying from hard real-time to offline. The most critical part is the data acquisition and, therefore, has priority over all the others. A real-time trigger controls the synchronization of both visual and thermal images, along with GPS and IMU data. The cameras are connected to the main computer through an Ethernet cable and have global shutter capability. All the processes are listed in Table 1, showing their respective priority, time requirements, and description.

3.1. Synchronization Process and Cameras Calibration

The literature presents a vast number of calibration techniques. Considering RGB cameras, the calibration process usually employs a checkerboard pattern with a known square size due to its simplicity. However, the same approach cannot be replicated for thermal cameras because the images of a standard checkerboard pattern do not have contrast, that is, temperature variation, for calibration. Therefore, the literature shows several methods for thermal camera calibration. In Reference [34], a halogen lamp heats a standard checkerboard to obtain thermal contrast. In Reference [35], a 9 × 9 small bulb matrix is used as the calibration pattern. This matrix generates a set of 100 reference points easily mapped from one image to another.
Differently from what is presented in the literature, this work adopts its own calibration setup. A checkerboard pattern was printed on plastic paper and attached to a square piece of glass. A halogen bulb lamp heats the back of this personalized pattern, and then the calibration process is performed.
Moreover, the calibration process determines, for each camera, its intrinsic matrix K, its radial distortion coefficients, and its relative position.
For the rest of the work, it is considered that the images are corrected by the radial distortion. Finally, the cameras are synchronized through a master-slave system. A real-time clock sends a 10 Hz signal, triggering the cameras. A watchdog layer ensures that the three cameras are always synchronized by checking their time-stamps and choosing to publish or discard the images.
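To make the watchdog layer concrete, the sketch below (illustrative helper names, not the deployed code) keeps the latest time-stamped frame from each camera and only releases a triplet when the three time stamps fall within a small tolerance of the 10 Hz trigger period.

```python
# Minimal synchronization-watchdog sketch (hypothetical, not the production code).
# Frames are released only when the RGB-L, RGB-R and thermal time stamps agree.
from collections import namedtuple

Frame = namedtuple("Frame", ["stamp", "image"])  # stamp in seconds

class SyncWatchdog:
    def __init__(self, tolerance=0.02):          # 20 ms tolerance on a 100 ms trigger period
        self.tolerance = tolerance
        self.latest = {"rgb_l": None, "rgb_r": None, "thermal": None}

    def push(self, source, frame):
        """Store the newest frame of a camera and return a synchronized triplet, or None."""
        self.latest[source] = frame
        if any(f is None for f in self.latest.values()):
            return None
        stamps = [f.stamp for f in self.latest.values()]
        if max(stamps) - min(stamps) <= self.tolerance:
            triplet = dict(self.latest)           # publish this synchronized set
            self.latest = {k: None for k in self.latest}
            return triplet
        return None                               # keep waiting; stale frames are overwritten
```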

3.2. Visual Odometry Algorithm

The open-source ORB-SLAM2 algorithm was chosen in this research to calculate the VO. As seen in the work of Reference [20], this algorithm is composed of three main threads. The first one is responsible for calculating feature-based camera odometry in every frame and minimizes the reprojection error using motion-only Bundle Adjustment (BA). The second thread computes and optimizes the local map with the use of local BA. Finally, the last one handles loop closures employing a pose-graph optimization.
The robot is submitted to discrete trajectories of a few meters in length for data acquisition along the current path, since this research intends to monitor electrical equipment along the distribution line. Therefore, the path is not expected to repeat itself, and only the Localization Mode of the ORB-SLAM2 algorithm is applied. In this mode, both the second and third threads are deactivated for performance. Moreover, odometry relies on matches between the current frame’s ORB features and the 3D points calculated from stereo depth in past frames to evaluate the motion. To achieve better odometry results, this algorithm separates the matched points into two categories, that is, close and far depth points. The points are separated by a threshold of X times the baseline distance. This method guarantees that close points are triangulated for more accurate translation estimation, while far points are still used for rotation when seen in multiple views. For our application, a value of 100 was empirically defined on the inspection track.
As this process alone can accumulate error along the path, the VO algorithm is used in a closed loop with an Extended Kalman Filter with colored electromagnetic interference, as shown in Reference [36]. For every pair of synchronized images, the VO is calculated in parallel with the stereo point cloud; it is vital for the later registration process and the thermal 3D data acquisition.
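A minimal sketch of this closed loop is given below. For brevity it uses a plain constant-velocity Kalman filter fed with VO positions rather than the colored-noise Extended Kalman Filter of Reference [36]; all names and tuning values are illustrative.

```python
import numpy as np

# Simplified constant-velocity Kalman filter fused with VO positions.
# Illustrative stand-in for the colored-noise EKF of Reference [36].
class VOKalman:
    def __init__(self, dt=0.1, q=1e-3, r=1e-2):
        self.x = np.zeros(6)                      # state: [px, py, pz, vx, vy, vz]
        self.P = np.eye(6)
        self.F = np.eye(6)
        self.F[:3, 3:] = dt * np.eye(3)           # position integrates velocity
        self.H = np.hstack([np.eye(3), np.zeros((3, 3))])  # VO measures position only
        self.Q = q * np.eye(6)
        self.R = r * np.eye(3)

    def step(self, vo_position):
        # Predict
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # Update with the visual odometry position measurement
        y = vo_position - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(6) - K @ self.H) @ self.P
        return self.x[:3]                         # filtered position fed back to the tracker
```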

3.3. Thermal Projection

After the calibration process, Equation (1) maps the thermal image $I^{th}$ to the visual RGB-R one, $I^{v}$, by mapping every pixel $p_k^{th} = (x_{th}, y_{th})_k$ to its corresponding location $p_k^{v} = (x_v, y_v)_k$ at every instant $k$. The values of $x_v, y_v, x_{th}, y_{th}$ must lie inside their respective camera resolutions.
The mapping uses the homogeneous transformation matrix $H_v^{th}$, which comprises the rotation $R$, translation $t$, and distortion elements $d$ between both image sources [37], together with the scalar $s$ that accounts for the final thermal image resolution, as a function $f: p_k^{th} \mapsto p_k^{v}$ described in Equation (1). The formal definition of all variables is given in Appendix A.
$$\begin{bmatrix} x_v \\ y_v \\ w_v \end{bmatrix} = s\, H_v^{th} \begin{bmatrix} x_{th} \\ y_{th} \\ 1 \end{bmatrix}, \qquad (1)$$
where
$$H_v^{th} = \begin{bmatrix} R & \dfrac{t}{s} \\[4pt] \dfrac{d}{s} & \dfrac{1}{s} \end{bmatrix}. \qquad (2)$$
The result is defined up to a scale related to $w_v$, so the value of $p_k^v$ is calculated as in Equation (3):
$$p_k^v = \begin{bmatrix} x_v / w_v \\ y_v / w_v \end{bmatrix}. \qquad (3)$$
The evaluation of $H_v^{th}$ is given by the optimization problem shown in Equation (4), where $N$ is the number of reference points extracted from each of the $n_P$ pictures taken of the board during the individual camera calibration process, $I^{vr}$ is the image from the right camera, and $I_a(p_b)$ represents the point $p_b$ from image $I_a$:
$$\min_{s,R,t,d} f(s, R, t, d) = \sum_{i=1}^{n_P} \sum_{j=1}^{N} \left\| I_i^{vr}(p_j^{vr}) - s\, H_v^{th}\, I_i^{th}(p_j^{th}) \right\|, \qquad (4)$$
where the optimization of Equation (4) is carried out using the Levenberg-Marquardt algorithm.
It is important to note that the Field of View (FoV) and resolution of the thermal camera are both lower than the visual camera’s. The resolution is a cost/benefit choice, while the FoV was designed this way to keep a good thermal resolution for distant objects. The result is a window of thermal projection inside the visual information, which will be denoted $W^{th}$ and has the same properties as $I^{th}$. A final observation is that the use of the entire RGB image facilitates VO and point cloud registration and fusing. Figure 2 shows a result of the thermal projection process.
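Since Equations (1)–(3) describe a planar homography applied up to scale (with radial distortion already corrected, as stated in Section 3.1), the projection step can be sketched with OpenCV as below; the function and variable names are illustrative and assume $H_v^{th}$ was already estimated by the calibration process.

```python
import cv2
import numpy as np

def project_thermal(thermal_img, H_v_th, rgb_shape):
    """Warp the thermal image into RGB-R pixel coordinates (Equations (1)-(3)).
    H_v_th is the 3x3 homography estimated during calibration (illustrative name)."""
    h, w = rgb_shape[:2]
    # warpPerspective applies the homography and performs the division by w_v internally
    warped = cv2.warpPerspective(thermal_img, np.asarray(H_v_th, dtype=np.float64), (w, h))
    return warped  # thermal window W_th, registered to the RGB-R image

# Usage sketch: overlay the warped thermal window on the visible image for inspection
# rgb = cv2.imread("rgb_right.png")
# th = cv2.imread("thermal.png", cv2.IMREAD_ANYDEPTH)
# overlay = project_thermal(th, H_v_th, rgb.shape)
```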

3.4. Point Cloud Generation

The stereo algorithm used to compute the point cloud is based on Reference [38]. From a pair of images, the following steps are performed. First, the images are rectified by using the parameters obtained from the calibration process. Then, to reduce saturation problems between two different points of view, the images are converted to grayscale and normalized to enhance texture and diminish possible differences in illumination. A sliding window $W \in \mathbb{Z}^{7 \times 7}$ calculates the new color value of its central pixel $w_c \in \mathbb{Z}$ as in Equation (5).
$$w_c = \min\!\big(\max(w_c - \bar{w},\ -w_{cap}),\ w_{cap}\big), \qquad (5)$$
where $w_c \in \mathbb{Z}$ is the original color value of the central pixel, $\bar{w} \in \mathbb{R}$ is the average color value of the pixels in the sliding window $W$, and $w_{cap} \in \mathbb{Z}$ is a predetermined limit used to bound the normalized values.
The next step consists in comparing similar points from the left and right images using the Sum of Absolute Differences (SAD) operation, from a fixed window $W_L$ in the RGB-L image to a sliding window $W_R$ in the RGB-R one (both converted to grayscale), over a range of pixels previously defined as the disparity range $d_r$ in the $x$ direction (Equation (6)). After performing the operation, the lowest value $M_{min}$ is considered a match candidate between the pixels $p_L^v$ and $p_R^v$ from the RGB-L and RGB-R images, respectively. If $M_{min}$ satisfies the uniqueness ratio threshold $u_n$ in Equation (7) for all other $M_i,\ i \in d_r$, the match is considered valid, and the pixel disparity $d$ between $p_L^v$ and $p_R^v$ (Equation (8)) is annotated in the disparity map $\tau$ as the difference between the pixels’ $x$ coordinates.
$$M_i = \sum_{x=1}^{m} \sum_{y=1}^{n} \left| w_{x,y}^{R} - w_{x,y}^{L} \right|, \quad i \in d_r, \qquad (6)$$
$$u_n > \frac{M_i - M_{min}}{M_{min}}, \qquad (7)$$
$$d = x_R - x_L. \qquad (8)$$
Finally, the depth is calculated via a triangulation operation for every pixel value $p_i \in \tau$ in the disparity map, as described in Equation (9).
$$Z_i = \frac{f\, b}{d_i}, \quad \forall\, p_i \in \tau, \qquad (9)$$
where $Z_i$ is the depth of the pixel’s corresponding 3D point $P_i^v \in \mathbb{R}^{3 \times 1}$, $f$ is the RGB cameras’ focal length, and $b$ is the stereo rig baseline. The instantaneous point cloud $C_k^v$ for instant $k$ is composed of the group of $P_i^v$ originated from $\tau$, each calculated for every $p_i \in \tau$ with its corresponding pixel coordinates and $d_i$, as in Equation (10):
$$P_i^v = \frac{b}{d_i} \begin{bmatrix} x_{R,i}^v - c_x \\ y_{R,i}^v - c_y \\ f \end{bmatrix}, \qquad (10)$$
where $d_i$ is the disparity of each match, and $c_x$ and $c_y$ are the principal point coordinates in the RGB-R image.
In possession of the intrinsic matrix $K_v \in \mathbb{R}^{3 \times 3}$ of the RGB-R camera, containing the focal length $f$ and the principal point values $c_x$ and $c_y$, the points $P_i^v$ from $C_k^v$ can be projected onto the image plane at their respective pixel locations $p^v$ in homogeneous coordinates, as in Equation (11). Again, to obtain the final coordinates, one must divide the result by $w_v$, following Equation (3). Figure 3 shows the final instantaneous thermal 3D reconstruction.
$$\begin{bmatrix} x_v \\ y_v \\ w_v \end{bmatrix} = K_v \begin{bmatrix} X_v \\ Y_v \\ Z_v \end{bmatrix}, \qquad (11)$$
where
$$K_v = \begin{bmatrix} f & 0 & c_x \\ 0 & f & c_y \\ 0 & 0 & 1 \end{bmatrix}. \qquad (12)$$
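The matching and triangulation steps of Equations (6)–(10) follow the classic block-matching pipeline of Reference [38]; a condensed sketch using OpenCV is shown below, with parameter values chosen only for illustration and not taken from the deployed system.

```python
import cv2
import numpy as np

def stereo_point_cloud(gray_l, gray_r, f, b, cx, cy, num_disp=128, block=15):
    """Compute a disparity map by SAD block matching and back-project it (Eqs. (6)-(10)).
    gray_l/gray_r: rectified 8-bit grayscale images; f, b, cx, cy: calibration values."""
    matcher = cv2.StereoBM_create(numDisparities=num_disp, blockSize=block)
    matcher.setUniquenessRatio(10)                       # uniqueness threshold u_n
    disp = matcher.compute(gray_l, gray_r).astype(np.float32) / 16.0  # OpenCV stores d*16
    ys, xs = np.nonzero(disp > 0)                        # keep only valid matches
    d = disp[ys, xs]
    Z = f * b / d                                        # depth, Equation (9)
    X = (xs - cx) * Z / f                                # back-projection, Equation (10)
    Y = (ys - cy) * Z / f
    return np.column_stack([X, Y, Z])                    # instantaneous cloud C_k^v
```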

3.5. Accumulated N-Dimensional Point Cloud

The point clouds $C^v$ and $C^{th}$ must be correctly registered with respect to the world inertial frame. This is performed through a homogeneous transformation matrix given by the VO algorithm. Consider ${}^{v}T_k^{in}$ the odometry transformation from the origin of the inertial frame to the RGB-R camera frame. At first, the registration of $C_k^v$ (with $N$ points $P_i^v$) could be done by simply stacking the clouds after the homogeneous transformation at every instant $k$, building the visual accumulated point cloud $A_c^v$ as in Equation (13) for a total of $K$ instants.
$$A_c^v = \bigcup_{k=1}^{K} \bigcup_{i=1}^{N} {}^{v}T_k^{in} \begin{bmatrix} X_{ik} \\ Y_{ik} \\ Z_{ik} \\ 1 \end{bmatrix}, \qquad (13)$$
where
$${}^{v}T_k^{in} = \begin{bmatrix} r_1 & r_2 & r_3 & t_x \\ r_4 & r_5 & r_6 & t_y \\ r_7 & r_8 & r_9 & t_z \\ 0 & 0 & 0 & 1 \end{bmatrix}. \qquad (14)$$
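Stacking the registered clouds of Equations (13) and (14) amounts to applying each 4 × 4 odometry transform to the homogeneous points and concatenating the results, as in the short illustrative sketch below.

```python
import numpy as np

def accumulate_clouds(clouds, transforms):
    """Register each instantaneous cloud C_k^v with its odometry transform vT_k^in
    (Equation (14)) and stack them into the accumulated cloud A_c^v (Equation (13))."""
    registered = []
    for C_k, T_k in zip(clouds, transforms):        # C_k: (N,3) array, T_k: (4,4) matrix
        homog = np.hstack([C_k, np.ones((C_k.shape[0], 1))])   # to homogeneous coordinates
        world = (T_k @ homog.T).T[:, :3]                        # points in the inertial frame
        registered.append(world)
    return np.vstack(registered)
```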
Analogously, there should exist an accumulated thermal point cloud $A_c^{th}$ formed by the addition of every $C_k^{th}$ cloud. Therefore, each 3D visual point with a thermal projection is associated with an n-dimensional temperature array, where n is the number of poses in which that 3D point was found. It is interesting to mention that, due to occlusions and other factors, the size of n changes from point to point, meaning that points captured more times have a larger temperature vector. There are two possible approaches to deal with and analyze these accumulated thermal readings: one is to store the n-dimensional vectors and use more sophisticated analysis to reach a diagnosis; the other is to apply a filter at each new entry and store just one filtered value. As it is not the purpose of this work to analyze which filter is best, the second approach is adopted. The final registration process uses a min filter to remove false temperature measurements, as illustrated in Figure 4. At an instant k, the new point cloud $C_k^{th}$ is submitted to a KD-Tree search for corresponding points in $A_c^{th}$. If neighbors are found within a radius, the point temperatures from different instants are compared, and the lowest one is chosen. In case no neighbor is found, the new point is added to $A_c^{th}$. The process is described in Algorithm 1.
Algorithm 1 Temperature correction algorithm.
C_th = new_point_cloud()
C_transf = transform(C_th, vT_in)
for P_th ∈ C_transf do
    neighbors = Kd_tree_search(P_th, Ac_th, thresh_radius)
    if (neighbors > 0) then
        temperatures = get_temperatures(neighbors, P_th)
        temp = lowest_temperature(temperatures)
        P_th.temperature = temp
    else
        Ac_th += P_th
    end if
end for
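A compact Python rendering of Algorithm 1 is sketched below using SciPy's KD-tree; field and function names are illustrative, and the sketch updates the accumulated temperatures in place so that the lowest reading is kept, mirroring the min filter described above rather than reproducing the deployed implementation.

```python
import numpy as np
from scipy.spatial import cKDTree

def fuse_thermal_cloud(acc_xyz, acc_temp, new_xyz, new_temp, radius=0.05):
    """Min-filter fusion of a new thermal cloud into the accumulated one (Algorithm 1).
    acc_xyz/new_xyz: (N,3) point arrays; acc_temp/new_temp: matching temperature arrays."""
    if acc_xyz.size == 0:
        return new_xyz.copy(), new_temp.copy()
    tree = cKDTree(acc_xyz)
    add_xyz, add_temp = [], []
    for p, t in zip(new_xyz, new_temp):
        idx = tree.query_ball_point(p, r=radius)      # neighbors within thresh_radius
        if idx:
            # keep the lowest temperature seen for this spatial region
            acc_temp[idx] = np.minimum(acc_temp[idx], t)
        else:
            add_xyz.append(p)                         # unseen point: append to Ac_th
            add_temp.append(t)
    if add_xyz:
        acc_xyz = np.vstack([acc_xyz, np.asarray(add_xyz)])
        acc_temp = np.concatenate([acc_temp, np.asarray(add_temp)])
    return acc_xyz, acc_temp
```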
Note that accumulating duplicated 3D points wastes computer memory and processing capacity without any improvement. Thus, this work uses an overlap calculation to avoid this problem. This process uses the RGB-R camera pose to evaluate the homogeneous transformation ${}^{v}T_{ref}^{in}$. The relative movement and its respective odometry ${}^{v}T_k^{in}$ are computed for the new point cloud $C_k^v$, which is projected into the reference pose by using Equation (15). If this point cloud meets the thresholds of a minimum number of new points and a minimum distance from the accumulated pose, the algorithm considers $C_k^v$ a good point cloud. In that case, it starts the registration process considering the thermal pair $C^{th}$, the odometry measurement ${}^{v}T_k^{in}$ is taken as the new odometry reference ${}^{v}T_{ref}^{in}$, and the process restarts. The final accumulated result is obtained by applying Equation (14) to this newly accepted point cloud. Figure 5 presents a flow chart of this process: the first acquired set of images is taken as the reference frame until the thresholds are met and a new reference frame is selected.
$$p^v = K_v \left( {}^{v}T_{ref}^{in} \right)^{-1} {}^{v}T_k^{in} \begin{bmatrix} X_i^v \\ Y_i^v \\ Z_i^v \\ 1 \end{bmatrix}. \qquad (15)$$
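The overlap decision of Figure 5 can be sketched as follows: the new cloud is reprojected into the reference pose with Equation (15), and it is accepted only when the camera has moved a minimum distance and enough points fall outside the reference image; the thresholds below are illustrative, not the values used in the field.

```python
import numpy as np

def is_new_reference(cloud_k, T_ref, T_k, K, img_w, img_h,
                     min_new_points=2000, min_motion=0.3):
    """Decide whether cloud C_k^v should start a new registration (Equation (15))."""
    rel = np.linalg.inv(T_ref) @ T_k                      # motion since the reference pose
    if np.linalg.norm(rel[:3, 3]) < min_motion:
        return False                                      # camera has barely moved
    homog = np.hstack([cloud_k, np.ones((cloud_k.shape[0], 1))])
    cam = (rel @ homog.T)[:3, :]                          # points in the reference camera frame
    proj = K @ cam
    z = np.where(np.abs(proj[2]) < 1e-9, 1e-9, proj[2])  # guard against division by zero
    u, v = proj[0] / z, proj[1] / z                       # divide by w_v as in Equation (3)
    outside = (u < 0) | (u >= img_w) | (v < 0) | (v >= img_h) | (cam[2] <= 0)
    return int(outside.sum()) >= min_new_points           # enough genuinely new points
```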
An example of an accumulated N-dimensional point cloud, focusing on a reflected temperature misreading before correction, can be seen in Figure 6.

4. Visual System and Robot Description

Figure 7 presents the robot developed for this application, namely Wally3. It is composed of two main structural parts, that is, body and head, which provide tilt and pan capabilities, as seen in Figure 8. The robot is mounted on top of a vehicle capable of driving along the railway to automatically monitor the distribution lines on the sides of the track. Overall, it comprises vision, automatic orientation control, power distribution, and processing core systems.
The vision system is mounted on Wally3’s head, with a stereo pair of Allied Vision MAKO cameras on both sides and a FLIR A65 thermal camera in the center of the arrangement. The calibration process calculates the cameras’ intrinsic parameters and the extrinsic parameters relating each camera to the other two. The visual cameras capture images at rates of up to 20 Hz, with 1600 × 1200 resolution. The thermal camera captures images at rates of up to 13 Hz, with 640 × 512 resolution. Figure 9 shows the relative position and spacing between the cameras on the robot. All cameras have global shutter or similar capture systems and are synchronized by a real-time clock, meaning that images are only acquired when all cameras are ready. This approach mitigates problems such as shutter deformations and the parallax motion effect, in which objects closer to a moving camera tend to blur. Moreover, if the parallax field-of-view effect is considered, it is essential to remember that only the visual cameras are used to generate point clouds, and the thermal one is used to project the temperature readings over the right-placed camera. Thus, the distance value of ≅10 cm provides a good trade-off between accuracy and compactness, and having the thermal camera closer to the visual camera mitigates occlusion and other undesirable effects.
The automatic orientation control relies on a Pixhawk controller board placed inside the body and a GPS module. Figure 10 and Figure 11a,b illustrate the robot’s behavior for pan and tilt movements. First, the controller gathers data from inertial sensors and GPS to provide the robot’s position in the world, and so calculates its orientation relative to the point of interest. The relative angles are transmitted to the servos for pan and tilt adjustments, so the tracked point is always inside the robot’s Field of View (FOV). As the dynamics of the servos and their encoders are well known and reliable, both are used in the Kalman filter process to mitigate angular misreadings.
Due to the many sources of electromagnetic interference emanating mainly from the vehicle’s communication system, once a sufficient number of satellites is observed by the GPS, the orientation in the world uses information provided by this sensor, fused with the compass readings [39]. The fusion process uses an Extended Kalman Filter with colored electromagnetic disturbances [36]. This is a viable approach since the vehicle only moves forward during the inspection. Equation (16) describes the calculation of the new orientation measurement $\theta_r$ in the world frame.
$$\theta_r = \operatorname{atan2}(v_{lon}, v_{lat}), \qquad (16)$$
where $v_{lon}$ and $v_{lat}$ stand for the velocity in the longitude and latitude directions, respectively.
Equation (17) estimates the angle $\theta_{dif}$ from the vehicle to the point of interest in the world frame. Subsequently, Equation (18) gives the final smallest relative angle $\gamma$ from the vehicle’s forward-looking direction to the point of interest location.
$$\theta_{dif} = \operatorname{atan2}(d_{lon}, d_{lat}), \qquad (17)$$
$$\gamma = \operatorname{warp180}(\theta_r - \theta_{dif}), \qquad (18)$$
where $d_{lon}$ and $d_{lat}$ are the differences in longitude and latitude coordinates from the point of interest to the robot. This new reading is incorporated into the CKF to evaluate the final orientation and position.
Finally, Equation (19) calculates the tilt angle $\beta$. It considers the distance from the vehicle to the point of interest, $D_{r\_poi}$, and the difference between the height of the robot cameras, $H_r$, and the height estimated in the mission for the equipment to be inspected, $H_{poi}$:
$$\beta = \operatorname{atan2}(H_{poi} - H_r,\ D_{r\_poi}). \qquad (19)$$
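Equations (16)–(19) reduce to a few atan2 evaluations; the small sketch below (hypothetical names, planar treatment of the longitude/latitude offsets) illustrates how the pan and tilt commands are obtained.

```python
import math

def warp180(angle_deg):
    """Wrap an angle to the [-180, 180) interval (Equation (18))."""
    return (angle_deg + 180.0) % 360.0 - 180.0

def pointing_angles(v_lon, v_lat, d_lon, d_lat, H_poi, H_r, D_r_poi):
    """Pan (gamma) and tilt (beta) commands from Equations (16)-(19).
    v_*: vehicle velocity components; d_*: offsets to the point of interest."""
    theta_r = math.degrees(math.atan2(v_lon, v_lat))       # vehicle heading, Equation (16)
    theta_dif = math.degrees(math.atan2(d_lon, d_lat))     # bearing to the POI, Equation (17)
    gamma = warp180(theta_r - theta_dif)                   # relative pan angle, Equation (18)
    beta = math.degrees(math.atan2(H_poi - H_r, D_r_poi))  # tilt angle, Equation (19)
    return gamma, beta
```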
After these calculations, the angle values are converted to Pulse Width Modulation (PWM) signals and sent to the actuators. Wally3 has two dedicated Dynamixel servo motors, where the model MX-106 is used for pan and the MX-64 for tilt movements. Both are controlled internally by a PID controller, tuned for smooth movement during inspection so as not to disturb the image acquisition. The commands are sent to them at a rate of 6 Hz.

5. Results and Discussion

The experimental methodology consists of moving the robot from a determined starting position to different inspection points. During the missions, the robot is subjected to different conditions to verify its autonomous capability of inspecting various equipment in the surroundings. The entire processing is performed on a computer with an Intel i7 processor running Ubuntu 16.04. The whole process is managed by the Robot Operating System (ROS) framework, responsible for organizing the vision and orientation algorithms.
This research uses the developed Wally3 robot for methodology validation. Two practical experiments were carried out to evaluate the effectiveness of the proposed methodology: (1) a reflective surface thermal inspection, to test the concept of sunlight effect mitigation; and (2) the inspection of a piece of equipment in the rail distribution line.
An important observation is that, after the extensive bibliographical research presented in the introduction, no similar approach was found, which makes it impossible to compare our results with other recent approaches. Instead, the current methodology is compared with the results obtained by a field expert.

5.1. Reflective Surface Inspection

This setup allowed heat to diffuse through the plate while its temperature was monitored. Note that the sunlight incidence on the board makes the temperature at certain spots increase depending on the point of view. To test the proposed approach, a random set of points was selected on the board surface for temperature analysis during a time interval. Figure 12 presents the points seen at four different sample instants, while a graph of their temperature variation is shown in Figure 13. Note a temperature difference of up to 30% above the lowest reading gathered at some points, which could indicate a false hotspot and is avoided by the algorithm.

5.2. Real Application

Finally, the algorithm was tested in a real case scenario: a 180 kVA autonomous diesel generator set was monitored for possible defects during an active emergency operation. This is a particularly crucial case since it is not typical for this type of equipment to enter into operation; therefore, when this situation happens, all related devices must be inspected as fast as possible. During the inspection, an infrared reflection was observed on a metallic piece attached to the generator. In a normal situation, this would demand a new service order for further analysis and correction.
Figure 14 shows the infrared interference as a red spot at the moment it is detected. Once it disappears, when the generator is seen from another angle in Figure 15, the plate returns to a uniform temperature color, indicating that the earlier reading was a false positive defect indication.
Figure 16 presents a graph of temperature evolution along the inspection. Two points are used to test the methodology in this real scenario. First, the blue line indicates the readings in a random 3D spot within a region of interest that presents infrared interference but is not the worst-case scenario. The second point represents the highest temperature variation found in the readings. This point is represented by the red line and it clearly shows the effect of external infrared interference over the readings. The filtered temperature is marked in green for the mentioned 3D point. Finally, Figure 17a,b presents 2D thermal pictures of the generator with and without sunlight reflection, respectively.
To compare the results, the same equipment was also analyzed by an expert. The qualitative result was exactly the same. However, Wally3 completed the inspection and diagnosis in less than 20 s, while the expert took more than 10 min; considering parking the car, deploying the equipment, processing the readings, packing, and leaving, the expert’s whole procedure took about 15 min.
Finally, an important observation is that this approach only inspects the parts of the object facing the road, or equipment in which any heat disturbance propagates over the entire object, such as insulators. In some transmission line cases it is possible to circle around the object, but in distribution systems this is not always possible. Even with this limitation, it is possible to reduce the overall inspection time on real routes by more than 30%.

5.3. Batch Inspection Mission

As a multi-inspection performance evaluation, the system executed a 16 km long real mission consisting of 62 points spread along the railroad’s dedicated electrical distribution network. Each inspection point may contain more than one piece of equipment, and the goal is to keep the predictive maintenance updated by searching for potentially faulty components. Normally, each piece of equipment presents a different operational temperature; however, as any fault related to thermal irradiation results in a measurement much higher than any normal operating point, a single threshold of 8 °C was set. Figure 18 shows the entire mission, where the blue markers are transformers, purple are switches, green are insulating shrouds for underground systems and, finally, the red one is a small power substation. Figure 19 shows the measurement profile and temperature corrections. It is possible to see that infrared interference was detected in 13 cases, 3 of them indicating overheating. This situation would demand a ground team to check those locations for further analysis.

6. Conclusions and Future Work

This research has proposed an autonomous system composed of two RGB stereo cameras and one infrared camera to capture 3D visual and thermal models at each instant. These instantaneous models are integrated over time to create accumulated visual and thermal 3D models, which are used for inspection analysis. The solution adopted in this research is generic and has presented an effective alternative approach for autonomous inspections of any type of equipment and machinery in transmission, distribution, or any other area, without changing the methodology.
It enhances security and efficiency when compared to the same service executed by aircraft and trained personnel. The final thermal models proved to be useful for both quantitative and qualitative analysis and further fault detection.
Since thermal inspection is sensitive to infrared reflection from external sources, an algorithm to find and eliminate those situations was developed and tested in real scenarios. The results were corroborated by expert measurements, with the advantage of the autonomous approach being much faster than traditional ones.
A few extensions are foreseen in this research work. First, the solution will be tested in a wide range of complex scenarios to explore detection of hidden hot spots through the thermal signature of the entire n-dimension temperature vector. Second, it is intended to miniaturize and apply the proposed methodology for aerial inspections in order to inspect areas of difficult access and imminent risk to humans.

Author Contributions

Conceptualization, L.M.H.; methodology, L.M.H. and V.F.V.; software, V.F.V. and F.M.D.; validation, A.L.C., M.F.P. and A.L.M.M.; investigation, A.L.C. and V.F.V.; data curation, A.L.C.; writing—original draft preparation, V.F.V., M.F.P., F.M.D. and A.L.M.M.; writing—review and editing, L.M.H.; supervision and project administration: L.M.H. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by CAPES, CNPq, MRS Logística, TBE, EDP and ANEEL (the Brazilian Regulatory Agency of Electricity) under grant number PD-02651-0013/2017.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

$I^{th}: \Omega^{th} \subset \mathbb{Z}^{2} \rightarrow [0, 2^{12}];\ (x, y) \mapsto I^{th}(x, y)$
$I^{v}: \Omega^{v} \subset \mathbb{Z}^{2} \rightarrow [0, 2^{8}]^{3};\ (x, y) \mapsto I^{v}(x, y)$
$R \in O(2)$
$t \in \mathbb{R}^{2 \times 1}$
$W_{n \times n}: \Omega^{w} \subset \mathbb{Z}^{2} \rightarrow [0, 2^{8}]^{3};\ (x, y) \mapsto W(x, y)$
$w_c \in [0, 2^{8}]^{3}$
$\tau: \mathbb{Z}^{2} \rightarrow [0, d_r];\ (x_L^v, y_L^v, x_R^v, y_R^v) \mapsto d$
${}^{v}T_k^{in} \in \mathbb{R}^{4 \times 4}$
$d_r \in \mathbb{Z}$

References

  1. Bogue, R. Robots in the offshore oil and gas industries: A review of recent developments. Ind. Robot. Int. J. 2019, 47, 1–6. [Google Scholar] [CrossRef]
  2. Qiao, W.; Lu, D. A survey on wind turbine condition monitoring and fault diagnosis—Part I: Components and subsystems. IEEE Trans. Ind. Electron. 2015, 62, 6536–6545. [Google Scholar] [CrossRef]
  3. Nikolic, J.; Burri, M.; Rehder, J.; Leutenegger, S.; Huerzeler, C.; Siegwart, R. A UAV system for inspection of industrial facilities. In Proceedings of the 2013 IEEE Aerospace Conference, Big Sky, MT, USA, 2–9 March 2013; pp. 1–8. [Google Scholar]
  4. Varadharajan, S.; Jose, S.; Sharma, K.; Wander, L.; Mertz, C. Vision for road inspection. In Proceedings of the IEEE Winter Conference on Applications of Computer Vision, Steamboat Springs, CO, USA, 24–26 March 2014; pp. 115–122. [Google Scholar]
  5. Karakose, M.; Yaman, O.; Baygin, M.; Murat, K.; Akin, E. A new computer vision based method for rail track detection and fault diagnosis in railways. Int. J. Mech. Eng. Robot. Res. 2017, 6, 22–27. [Google Scholar] [CrossRef]
  6. Dragomir, A.; Adam, M.; Andruçcâ, M.; Munteanu, A.; Boghiu, E. Considerations regarding infrared thermal stresses monitoring of electrical equipment. In Proceedings of the 2017 International Conference on Electromechanical and Power Systems (SIELMEN), Iasi, Romania, 11–13 October 2017; pp. 100–103. [Google Scholar]
  7. Wang, C.; Cho, Y.K. Automatic 3D thermal zones creation for building energy simulation of existing residential buildings. In Proceedings of the Construction Research Congress 2014: Construction in a Global Network, Atlanta, GA, USA, 19–21 May 2014; pp. 1014–1022. [Google Scholar]
  8. Cho, Y.K.; Ham, Y.; Golpavar-Fard, M. 3D as-is building energy modeling and diagnostics: A review of the state-of-the-art. Adv. Eng. Inform. 2015, 29, 184–195. [Google Scholar] [CrossRef]
  9. Borrmann, D.; Leutert, F.; Schilling, K.; Nüchter, A. Spatial projection of thermal data for visual inspection. In Proceedings of the 14th International Conference on Control, Automation, Robotics and Vision (ICARCV), Phuket, Thailand, 13–15 November 2016; pp. 1–6. [Google Scholar]
  10. Lagüela, S.; Martínez, J.; Armesto, J.; Arias, P. Energy efficiency studies through 3D laser scanning and thermographic technologies. Energy Build. 2011, 43, 1216–1221. [Google Scholar] [CrossRef]
  11. Bortoni, E.C.; Santos, L.; Bastos, G. A model to extract wind influence from outdoor IR thermal inspections. IEEE Trans. Power Deliv. 2013, 28, 1969–1970. [Google Scholar] [CrossRef]
  12. Akhloufi, M.A.; Verney, B. Multimodal registration and fusion for 3d thermal imaging. Math. Probl. Eng. 2015, 2015, 450101. [Google Scholar] [CrossRef] [Green Version]
  13. Silva, B.P.; Ferreira, R.A.; Gomes, S.C., Jr.; Calado, F.A.; Andrade, R.M.; Porto, M.P. On-rail solution for autonomous inspections in electrical substations. Infrared Phys. Technol. 2018, 90, 53–58. [Google Scholar] [CrossRef]
  14. Gibert, X.; Patel, V.M.; Chellappa, R. Deep multitask learning for railway track inspection. IEEE Trans. Intell. Transp. Syst. 2016, 18, 153–164. [Google Scholar] [CrossRef] [Green Version]
  15. Attard, L.; Debono, C.J.; Valentino, G.; Di Castro, M. Tunnel inspection using photogrammetric techniques and image processing: A review. ISPRS J. Photogramm. Remote. Sens. 2018, 144, 180–188. [Google Scholar] [CrossRef]
  16. Aydin, I. A new approach based on firefly algorithm for vision-based railway overhead inspection system. Measurement 2015, 74, 43–55. [Google Scholar] [CrossRef]
  17. Cadena, C.; Carlone, L.; Carrillo, H.; Latif, Y.; Scaramuzza, D.; Neira, J.; Reid, I.; Leonard, J.J. Past, present, and future of simultaneous localization and mapping: Toward the robust-perception age. IEEE Trans. Robot. 2016, 32, 1309–1332. [Google Scholar] [CrossRef] [Green Version]
  18. Engel, J.; Schöps, T.; Cremers, D. LSD-SLAM: Large-scale direct monocular SLAM. In Proceedings of the 13th European Conference on Computer Vision, Zurich, Switzerland, 6–12 September 2014; pp. 834–849. [Google Scholar]
  19. Usenko, V.; Engel, J.; Stückler, J.; Cremers, D. Direct visual-inertial odometry with stereo cameras. In Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), Stockholm, Sweden, 16–20 May 2016; pp. 1885–1892. [Google Scholar]
  20. Mur-Artal, R.; Tardós, J.D. Orb-slam2: An open-source slam system for monocular, stereo, and rgb-d cameras. IEEE Trans. Robot. 2017, 33, 1255–1262. [Google Scholar] [CrossRef] [Green Version]
  21. Maimone, M.; Cheng, Y.; Matthies, L. Two years of visual odometry on the mars exploration rovers. J. Field Robot. 2007, 24, 169–186. [Google Scholar] [CrossRef] [Green Version]
  22. Howard, A. Real-time stereo visual odometry for autonomous ground vehicles. In Proceeding of the 2008 IEEE/RSJ International Conference on Intelligent Robots and Systems, Nice, France, 22–26 September 2008; pp. 3946–3952. [Google Scholar]
  23. Fischer, T.; Pire, T.; Čížek, P.; De Cristóforis, P.; Faigl, J. Stereo vision-based localization for hexapod walking robots operating in rough terrains. In Proceeding of the 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Daejeon, Korea, 9–14 October 2016; pp. 2492–2497. [Google Scholar]
  24. Kostavelis, I.; Charalampous, K.; Gasteratos, A.; Tsotsos, J.K. Robot navigation via spatial and temporal coherent semantic maps. Eng. Appl. Artif. Intell. 2016, 48, 173–187. [Google Scholar] [CrossRef]
  25. Schuster, M.J.; Schmid, K.; Brand, C.; Beetz, M. Distributed stereo vision-based 6D localization and mapping for multi-robot teams. J. Field Robot. 2019, 36, 305–332. [Google Scholar] [CrossRef] [Green Version]
  26. Adán, A.; Prado, T.; Prieto, S.; Quintana, B. Fusion of thermal imagery and LiDAR data for generating TBIM models. In Proceeding of the 2017 IEEE SENSORS, Glasgow, UK, 29 October–1 November 2017; pp. 1–3. [Google Scholar]
  27. Jadin, M.S.; Taib, S. Recent progress in diagnosing the reliability of electrical equipment by using infrared thermography. Infrared Phys. Technol. 2012, 55, 236–245. [Google Scholar] [CrossRef] [Green Version]
  28. Ahmed, M.M.; Huda, A.; Isa, N.A.M. Recursive construction of output-context fuzzy systems for the condition monitoring of electrical hotspots based on infrared thermography. Eng. Appl. Artif. Intell. 2015, 39, 120–131. [Google Scholar] [CrossRef]
  29. Glowacz, A.; Glowacz, Z. Diagnosis of the three-phase induction motor using thermal imaging. Infrared Phys. Technol. 2017, 81, 7–16. [Google Scholar] [CrossRef]
  30. Homma, R.Z.; Cosentino, A.; Szymanski, C. Autonomous inspection in transmission and distribution power lines–methodology for image acquisition by means of unmanned aircraft system and its treatment and storage. Cired-Open Access Proc. J. 2017, 2017, 965–967. [Google Scholar] [CrossRef] [Green Version]
  31. Tang, C.; Tian, G.Y.; Chen, X.; Wu, J.; Li, K.; Meng, H. Infrared and visible images registration with adaptable local-global feature integration for rail inspection. Infrared Phys. Technol. 2017, 87, 31–39. [Google Scholar] [CrossRef] [Green Version]
  32. Vidas, S.G. Handheld 3D Thermography Using Range Sensing and Computer Vision. Ph.D. Thesis, Queensland University of Technology, Brisbane City, Australia, 2014. [Google Scholar]
  33. Schramm, S.; Rangel, J.; Kroll, A. Data fusion for 3D thermal imaging using depth and stereo camera for robust self-localization. In Proceedings of the 2018 IEEE Sensors Applications Symposium (SAS), Seoul, Korea, 12–14 March 2018; pp. 1–6. [Google Scholar]
  34. Prakash, S.; Lee, P.Y.; Caelli, T.; Raupach, T. Robust thermal camera calibration and 3D mapping of object surface temperatures. SPIE Proc. Thermosense Xxviii 2006, 6205, 62050J. [Google Scholar]
  35. Ellmauthaler, A.; da Silva, E.A.; Pagliari, C.L.; Gois, J.N.; Neves, S.R. A novel iterative calibration approach for thermal infrared cameras. In Proceedings of the 2013 IEEE International Conference on Image Processing, Melbourne, Australia, 15–18 September 2013; pp. 2182–2186. [Google Scholar]
  36. da Silva, M.F.; Honório, L.M.; Marcato, A.L.M.; Vidal, V.F.; Santos, M.F. Unmanned aerial vehicle for transmission line inspection using an extended Kalman filter with colored electromagnetic interference. ISA Trans. 2019, 100, 322–333. [Google Scholar] [CrossRef] [PubMed]
  37. Solem, J.E. Programming Computer Vision with Python: Tools and Algorithms for Analyzing Images; O’Reilly Media: Newton, MA, USA, 2012. [Google Scholar]
  38. Bradski, G.; Kaehler, A. Learning OpenCV: Computer Vision with the OpenCV Library; O’Reilly Media: Newton, MA, USA, 2008. [Google Scholar]
  39. Christensen, H.I.; Hager, G.D. Sensing and estimation. In Springer Handbook of Robotics; Springer: Berlin, Germany, 2016; pp. 91–112. [Google Scholar]
Figure 1. Methodology Diagram.
Figure 2. Thermal and Visual homography transformation result.
Figure 3. Instant thermal reconstruction.
Figure 4. Algorithm to recognize false measurements removal.
Figure 5. Cloud overlap analysis process.
Figure 6. Instant N-Dimensional reconstruction.
Figure 7. Robot’s Schematic view, with Pan and Tilt capabilities illustration.
Figure 8. Robot assemble in two different configurations and inspection vehicles.
Figure 9. Scheme for Stereo and Thermal Cameras Positioning.
Figure 10. Camera pose control operation.
Figure 11. Automatic Orientation Scheme for the robot’s behavior when inside a Point of Interest predefined region. (a) Pan. (b) Tilt.
Figure 12. Image samples in different points of view, with 4 highlighted points.
Figure 13. Temperature variation for each point along the test, with minimum temperature taken as real ones.
Figure 14. 3D model with false positive reading due to sunlight reflection.
Figure 15. Temperature correction in the 3D model due to a reading from another angle.
Figure 16. Temperature measurements for a random affected 3D point in blue and maximum reading for each image sample in red.
Figure 17. 2D thermal pictures of the generator. (a) Hotspot in 2D thermal reading with temperature scale. (b) 2D thermal reading with no hotspot.
Figure 18. Batch Mission containing 50 POI with multiples elements with respective measured and filtered temperature.
Figure 19. Measurement temperature (°C) profile and corrections.
Table 1. Processes description, priorities and time requirements.

N | Process           | Priority | Time Req. (s) | Description
1 | Readings          | 0        | 0.1           | Started by a real-time trigger and provides synchronized data acquisition among the sensors
2 | Kalman            | 0        | 0.01          | Fuses and updates the position
3 | Visual Odometry   | 1        | 0.5           | Uses the corrected position provided by the Kalman filter along with visual odometry to improve heading and position
4 | Thermal 2D fusion | 2        | 1             | Responsible for fusing the right visual and thermal images
5 | 3D Point Cloud    | 2        | 1             | Generates a dense 3D point cloud through a SLAM algorithm
6 | 3DT Point Cloud   | 3        | 3             | Reprojects the thermal data into the 3D point cloud, generating a 4D map
7 | MDPC              | 4        | 20            | Accumulates the current 3DT point cloud into the historical data
