
Image-Based Gimbal Control in a Drone for Centering Photovoltaic Modules in a Thermal Image †

1 Department of IT Convergence Engineering, Gachon University, Seongnam 13120, Korea
2 Department of Software, Gachon University, Seongnam 13120, Korea
3 Department of Fire Service Administration, Chodang University, Mu-An 58530, Korea
* Author to whom correspondence should be addressed.
A preliminary version of this article was presented at the 3rd International Conference on Advanced Engineering and ICT-Convergence, Jeju Island, Korea, 11–13 July 2019.
Appl. Sci. 2020, 10(13), 4646; https://doi.org/10.3390/app10134646
Submission received: 23 April 2020 / Revised: 28 June 2020 / Accepted: 2 July 2020 / Published: 5 July 2020
(This article belongs to the Special Issue Computing and Artificial Intelligence for Visual Data Analysis)

Abstract
Recently, there has been much research applying drones equipped with thermal cameras to detect deterioration in photovoltaic (PV) modules. A thermal camera can measure the temperature across the surface of PV modules and locate deteriorated areas. However, thermal cameras generally have a lower resolution than visible cameras because of cost constraints, and this resolution mismatch often produces invalid frames from the thermal camera. In this paper, we describe a gimbal controller with a real-time image processing algorithm that adjusts the camera angle to position the region of interest (ROI) at the center of the target PV modules. We derive the horizontal angle and vertical position of the ROI in visible images using image processing algorithms such as the Hough transform. These values are converted into a PID control signal for the gimbal, which steers the thermal camera to capture the effective area of the target PV modules. Experimental results show that the controlled PV module area was properly located at the center of the thermal image.

1. Introduction

As photovoltaic (PV) systems gain popularity as a future-oriented green energy source, more and more PV systems are being installed over wide areas. However, as operation time increases, the efficiency of a PV system decreases for several reasons, such as contamination of the module surface, elevated back-surface temperature, and shadows [1,2]. It is therefore necessary to inspect the condition of the module surface regularly. Regular diagnosis of the module surface keeps efficiency at its best by raising a real-time maintenance alarm before a fault develops.
Various previous studies have tested maintenance methods to preserve the efficiency of PV systems. Conventional studies aimed to find faulty modules by measuring the voltage and current of the PV module, monitoring the power generation status of the inverter [3,4], or detecting contamination and cracks on the surface with a camera [5,6]. Other studies manually inspected PV modules using thermal or electroluminescence images [7,8,9,10]. More recently, studies have been conducted to find hotspots using a thermal camera [11,12], and several have used drones to inspect a wider area and reduce inspection time. A drone-mounted thermal camera has an additional benefit: it eliminates inspection hurdles caused by problematic site locations, e.g., the roof of a high building or the sea surface.
Since it is difficult to distinguish objects with a thermal camera alone, most studies use a dual camera composed of a thermal camera and a visible camera. The region of interest (ROI) is derived through image processing of the visible image, and the temperature information of the object is extracted from the corresponding temperature data of the thermal image. However, a thermal camera generally has a lower resolution than a visible camera because of cost constraints, and this resolution difference often produces invalid frames from the thermal camera. With an affordable low-resolution thermal camera, it is impossible to capture a usable image from high altitude, so the drone has to fly at a closer distance to the PV module. In that case, since the whole module cannot be seen at once, it is a prerequisite to keep the target object centered in the picture.
A gimbal system is commonly used to obtain clean images from drone-mounted cameras, as drones in flight produce vibrations at a variety of frequencies. The system consists of a structure that supports a camera module and a stabilizer that blocks outside vibrations and keeps the correct angle [13]. A gimbal controls the target angle of a camera, which helps acquire better and more suitable images; this subject and the importance of using a gimbal on a drone were discussed by various authors [14,15]. Furthermore, a gimbal dampens vibrations, which is significantly beneficial for real-time image stabilization applications [16]. Apart from smoothing angular movements and dampening vibrations, a gimbal keeps a camera in a predefined position [17]. This is usually the position in which the camera's axis is horizontal or vertical, but any other position within the gimbal's technical capabilities is possible.
In this paper, we propose an image processing and control system that automatically adjusts the gimbal for PV module inspection using autonomous drones. The proposed system consists of a dual camera, a gimbal frame, a gimbal control board (GCB), and a gimbal control system (GCS). The dual camera consists of a thermal camera and a visible camera, and the gimbal frame uses brushless DC (BLDC) motors. The GCB drives the motors through a pulse width modulation (PWM) input signal, and the GCS, the main controller, performs image processing on the visible-camera image to control the angle of the thermal camera. The GCS calculates the center of the panel area and the value needed for angle adjustment from the captured visible images, then sends the resulting data to the GCB. Consequently, we can reduce pre-processing time and increase inspection accuracy.

2. PV Module Imaging System Using Drone Equipped with Dual Camera

2.1. Considerations for Acquiring an Effective Thermal Image Using a Drone

Figure 1 illustrates images captured from different altitudes. At a 100-m distance, we could capture the full area of the PV modules within one frame (Figure 1a,b), but due to the thermal camera's low resolution we could not obtain images suitable for thermal evaluation (Figure 1b). In contrast, at a 10-m distance (Figure 1c,d), we obtained the target image quality for thermal evaluation (Figure 1d). Therefore, it is necessary to keep the drone at a short distance, e.g., a 10-m height.
Referring to Figure 2a,c, both visible-camera images look acceptable for further processing. However, comparing the thermal images, the different resolutions of the two cameras mean that Figure 2b does not contain a proper ROI for further processing, while Figure 2d shows an aligned image suitable for thermal analysis. Thus, it is important to place the thermal camera's target ROI based on image processing of the visible image.
Figure 3 compares images acquired with and without gimbal control during flight. Without gimbal control, the drone's motion changes irregularly because of GPS error, vibration, wind, and the surrounding environment, and this unstable motion directly affects the camera image. With gimbal control, by contrast, a uniform image can be obtained along a straight flight line, which helps substantially in the subsequent analysis.
Therefore, in this paper, we designed and implemented an image-based gimbal control system that is equipped with an independent dual camera mounted on the gimbal and performs image processing based on the acquired visible image to control the gimbal’s roll and tilt so that the ROI of the acquired image is horizontal and placed in the center.

2.2. Gimbal System

In this work, an aluminum-alloy three-axis brushless gimbal stabilizer was applied to control the camera angle. The gimbal consists of three BLDC motors for pan, roll, and tilt; a 32-bit microprocessor integrated into the GCB; a gimbal IMU (inertial measurement unit) with a three-axis accelerometer and a three-axis gyroscope; a wireless receiver for external PWM signal input; a slip ring for 360-degree continuous rotation; and a lightweight aluminum alloy frame (Figure 4). A fully rotation-symmetric axis design is adopted for the bearings, with no empty spaces, in order to minimize vibration. The gimbal is controlled by the GCB, which calculates corrections for the camera position based on the horizon deflection reported by the IMU mounted on the camera holder.

2.3. GCS Equipped with Dual Camera

In this work, the GCS was developed as shown in Figure 5. The GCS is designed to control and perform image processing for a thermal camera with a resolution of 640 × 512 and a visible camera with a resolution of 1280 × 720. The outputs of the GCS are the control values for the tilt and roll motors. The controller first processes the visible-camera image to find the target ROI for the thermal camera, then controls the gimbal in real time. This finally makes it possible to obtain a high-quality thermal image, which requires positioning the target area of the object at the center of the ROI.

2.4. Design of GCS

The block diagram of our drone system is shown in Figure 6. The proposed thermal imaging system captures visible and thermal images of the PV modules and uses the visible image to derive the angle and center position of the PV module area. The derived module angle is converted into the adjustment value for horizontal angle control, and the position of the central area is converted into the adjustment value for vertical angle control. The obtained adjustment values are combined with the signal from the wireless communication unit for the external wireless device and fed to the gimbal mechanism.
The gimbal control method is divided into vertical control and horizontal control. Firstly, the center position of the PV module must be found through the gimbal's vertical adjustment. If the PV module is not at the center of the thermal image, its upper and lower parts will fall outside the ROI, making it difficult to detect the hotspot area of the PV module accurately. Secondly, once the vertical adjustment is correct, the gimbal roll needs to be adjusted so that the PV modules in the image are captured in the proper horizontal orientation. Although we describe the process in two steps, both run simultaneously. Drone images are used to find the camera angle and the center of each module, and with the horizontal and vertical angle control functions, the gimbal is controlled in real time so that the visible and thermal images are horizontal and the ROI is captured at the center of the image.

2.5. Gimbal Control Process to Reach Desired Angle and Position

Figure 7 illustrates the process from image acquisition through the entire control chain: (1) capturing an image from the visible camera, (2) finding the PV module area using image processing operations such as the threshold and inRange functions, (3) calculating the horizontal angle and vertical position from those results, based on the Hough transform and related techniques, (4) calculating the PID control value to obtain the proper PWM values, (5) transferring the calculated PWM values to the GCB, and (6) the GCB driving the BLDC motors to the given values. The process then repeats from phase (1) as closed-loop control.
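The six phases above can be sketched as one iteration of the closed loop. In the following Python sketch, the four callables (capture_frame, find_module, to_pwm, send_to_gcb) are hypothetical stand-ins for the stages described in the text, not the authors' API:

```python
def control_cycle(capture_frame, find_module, to_pwm, send_to_gcb):
    """One iteration of the closed loop of Figure 7. All four callables are
    hypothetical stand-ins for the stages described in the text."""
    frame = capture_frame()                        # (1) grab a visible-camera frame
    angle, center_y = find_module(frame)           # (2)-(3) image processing results
    pwm_roll, pwm_tilt = to_pwm(angle, center_y)   # (4) PID -> PWM values
    send_to_gcb(pwm_roll, pwm_tilt)                # (5)-(6) GCB drives the BLDC motors
```

In practice this function would be called repeatedly, so that each new frame corrects the gimbal pose left by the previous cycle.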

3. Implementation of Gimbal Control through PV Image Analysis

3.1. Gimbal Control with PWM Signal

Manual and automatic signals for gimbal control are transmitted via the GCS. For automatic operation, control values are determined by image processing; for manual operation, they are determined by an external radio control transmitted through an X8R receiver. The gimbal control value is determined in the GCS from the combination of both inputs, with priority given to the manual signal, and the result is sent from the GCS to the GCB as a PWM signal. According to the PWM signal, the GCB sends control signals to each of its motors. PWM is a cycle control method that modulates the width of a pulse: the output is held at a high or low voltage to produce a square wave between 0 V and 3.3 V.
Figure 8 shows an example of the PWM. T refers to the cycle period, and ton is the duration of high voltage. The duty-cycle is calculated using Equation (1).
DutyCycle = (ton / T) × 100%        (1)
The cycle period in our system is 18 ms, and ton ranges from 0.7 ms to 2.3 ms. When ton is 1.5 ms, the motor stops; from 0.7 to 1.3 ms, the motor runs clockwise; and from 1.7 to 2.3 ms, it runs counterclockwise. The duty cycle therefore spans roughly 4% to 13%. For the safety of the gimbal, the duty cycle is limited to between 6.3% and 10.5%. Accordingly, with the cycle period T (18 ms) quantized into 1024 steps, the PWM value is converted to a value between 64 and 108.
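The arithmetic above can be spelled out in a few lines. This is an illustrative re-derivation of the numbers in the text, not firmware code; the function names are our own:

```python
def duty_cycle(t_on_ms, period_ms=18.0):
    """Duty cycle in percent for a pulse of width t_on within one PWM period."""
    return t_on_ms / period_ms * 100.0

def pwm_count(t_on_ms, period_ms=18.0, resolution=1024):
    """Map the pulse width onto a timer counter with `resolution` ticks per period."""
    return round(t_on_ms / period_ms * resolution)

# Neutral 1.5 ms pulse: duty_cycle(1.5) is about 8.33 %, pwm_count(1.5) -> 85.
# The safety limits quoted in the text correspond to counts of 64 and 108
# out of 1024 (64/1024 = 6.25 %, 108/1024 = 10.55 %).
```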

3.2. Implementation of GCS

Figure 9 shows the implemented GCS in our study. The GCS comprises the gimbal-mounted assembly, the embedded GCS with dual cameras, and the gimbal control/driving module. The GCS is implemented on Raspberry Pi 3 B+ hardware.
The thermal camera and visual camera are attached to the drone’s GCS. The image taken from the visual camera is processed and used to find physical areas of the modules. Through image processing, the GCS calculates control values through module position and angle, and converts them into PWM signals which will be transmitted to the GCB. The motor control to adjust the gimbal tilt and roll is performed by sending out independent PWM signals (PWM1 to control roll, PWM0 to control tilt).

3.3. Implementation of Object Recognition Image Processing Program

The program is implemented in C++ with OpenCV and runs in a Linux Code::Blocks environment. The image processing algorithm proceeds as follows. First, we convert RGB images to Hue-Saturation-Value (HSV) images. The HSV color space is more suitable for object detection than RGB: expressing a specific color in RGB requires an unintuitive combination of channels, whereas in HSV, native colors can be analyzed intuitively [18]. The PV module can then be detected as the blue-hue region of the HSV color space. We extract the PV module area using the OpenCV inRange function, which selects the pixels falling within the blue range.
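The masking step can be illustrated with a small NumPy re-implementation of cv2.inRange (a sketch, not the paper's C++ code); the HSV bounds used here are the ones reported in the experiments in Section 4:

```python
import numpy as np

def in_range(hsv, lower, upper):
    """NumPy re-implementation of cv2.inRange: 255 where every channel of a
    pixel lies within [lower, upper], 0 elsewhere."""
    lower, upper = np.asarray(lower), np.asarray(upper)
    inside = np.all((hsv >= lower) & (hsv <= upper), axis=-1)
    return (inside * 255).astype(np.uint8)

# Toy 1x3 "image": a blue-ish PV pixel, a red-ish background pixel, a dark pixel.
hsv = np.array([[[110, 200, 180], [5, 200, 180], [110, 30, 20]]], dtype=np.uint8)
mask = in_range(hsv, (80, 65, 65), (140, 255, 255))   # blue-hue band from Section 4
# mask -> [[255, 0, 0]]: only the PV pixel survives
```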
When extracting the PV module area, blue pixels in parts of the background can be picked up as noise. To eliminate this noise, the dilate/erode and medianBlur functions were applied. We then binarize the PV module area and calculate the horizontal angle using the Hough transform, a popular feature extraction technique widely used in automated image analysis to detect lines. All PV modules contain white lines in a specific grid-like pattern, and these lines are the distinctive feature of a PV module. The Hough transform detects a line represented by Equation (2).
r = x cos θ + y sin θ        (2)
where r represents the distance from the origin to the closest point on the straight line, and θ represents the angle between the x axis and the line connecting the origin with that closest point.
Each line in the image is associated with a pair (r, θ). For each pixel at (x, y) in an edge image, the Hough transform finds the set of all straight lines that pass through that point. The final output is an accumulator matrix over the quantized values of r and θ; the element with the highest vote count indicates the most strongly represented straight line in the image. The values of r and θ are therefore read from the accumulator cell with the highest value.
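The voting scheme can be sketched in a few lines of NumPy. This is a toy illustration of the accumulator described above (a real system would use cv2.HoughLines); it assumes 1-degree angle steps and |r| ≤ r_max:

```python
import numpy as np

def hough_lines(points, n_theta=180, r_max=100):
    """Minimal Hough voting scheme for r = x*cos(theta) + y*sin(theta).
    Returns the (r, theta-in-degrees) pair with the most votes."""
    thetas = np.deg2rad(np.arange(n_theta))              # quantized angles, 1-degree steps
    acc = np.zeros((2 * r_max + 1, n_theta), dtype=int)  # accumulator over (r, theta)
    for x, y in points:
        r = np.round(x * np.cos(thetas) + y * np.sin(thetas)).astype(int)
        acc[r + r_max, np.arange(n_theta)] += 1          # one vote per (r, theta) pair
    r_idx, t_idx = np.unravel_index(acc.argmax(), acc.shape)
    return int(r_idx) - r_max, int(t_idx)

# A horizontal row of edge pixels at y = 20 votes most heavily at theta = 90 degrees:
# hough_lines([(x, 20) for x in range(0, 50, 5)]) -> (20, 90)
```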
We also calculate the vertical position as the center of gravity of the module blob. Figure 10 shows the results of the implemented image processing: original visible image (a), inRange function (b), threshold function (c), Hough line function (d), bold function (e), and derived result (f).
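The center-of-gravity step amounts to averaging the row indices of the module pixels. A NumPy sketch (our illustration, not the authors' code):

```python
import numpy as np

def vertical_center(mask):
    """Vertical position of the module: the center of gravity (mean row index)
    of the nonzero pixels in a binary mask."""
    rows, _cols = np.nonzero(mask)
    return float(rows.mean())

mask = np.zeros((10, 10), dtype=np.uint8)
mask[4:7, :] = 255          # a horizontal band standing in for the PV module
# vertical_center(mask) -> 5.0
```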

3.4. Implementation of BLDC Control Algorithm

The control method for the gimbal system's stabilized driving device is a proportional integral derivative (PID) controller. The PID has a small number of parameters and a simple structure; in particular, the effect of changing the proportional gain, integration time, and derivative time on the response characteristics is intuitive [19]. Some studies have examined intelligent PID controllers that determine these parameters automatically [20,21]. In this work, however, the classical PID algorithm with its simple structure is applied.
From the calculated horizontal angle and vertical position, we obtain the PWM values as shown in Equations (3)–(5).
e(t) = y_target(t) − y_cur(t − 1)        (3)
y_cur(t) = Kp·e(t) + Kd·de(t)/dt + Ki·∫₀ᵗ e(τ) dτ        (4)
out2PWM = w·y_cur(t) + b        (5)
where t is the unit time; y_target(t) is the target value obtained by image processing at time t, which can be a horizontal angle or a vertical position; y_cur(t − 1) and y_cur(t) are the tilt or roll motor control values at times t − 1 and t, respectively; e(t) is the position error; Kp, Kd, and Ki are the proportional, derivative, and integral coefficients, respectively; out2PWM is the PWM value for motor control by the main controller; w is the slope relating the position value to the PWM value; and b is the minimum PWM value.
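Equations (3)–(5) can be transcribed directly with unit time steps. The sketch below is illustrative, not the authors' implementation; the gains default to the experimental values (0.8, 0.4, 0.01), while w and b are left as parameters (the text gives b = 64 as the minimum PWM value):

```python
class GimbalPID:
    """Discrete transcription of Equations (3)-(5), with dt assumed to be one
    frame. Hypothetical helper class, not the authors' firmware."""
    def __init__(self, kp=0.8, kd=0.4, ki=0.01, w=1.0, b=64.0):
        self.kp, self.kd, self.ki = kp, kd, ki
        self.w, self.b = w, b        # slope / offset mapping output to PWM counts
        self.prev_err = 0.0
        self.integral = 0.0
        self.y_cur = 0.0             # last motor control value, y_cur(t-1)

    def step(self, y_target):
        err = y_target - self.y_cur                    # Equation (3)
        self.integral += err                           # running sum approximates the integral
        deriv = err - self.prev_err                    # finite difference approximates de/dt
        self.prev_err = err
        self.y_cur = (self.kp * err + self.kd * deriv
                      + self.ki * self.integral)       # Equation (4)
        return self.w * self.y_cur + self.b            # Equation (5): PWM output
```

Each call to step() feeds the previous output back into the error term, matching the y_cur(t − 1) dependence of Equation (3).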
The calculated horizontal and vertical positions are encoded along with the signal received from the remote control device. The relationship between the camera focus and the PV module is always fixed regardless of the tilt. Only the part cut by the image plane changes as the tilt changes. In this development, the camera tilt is adjusted around the horizontal axis of the module seen from above the PV module.

4. Gimbal Control Experiment

In order to verify the validity of the developed system, we conducted an experimental test as shown in Figure 11. Our setup consists of a drone system with a gimbal, the GCS with a dual camera, an experiment panel carrying a photo of a PV module, and a remote display.
The experiment process is as follows:
  • Produce the PWM signal correctly in the GCS and observe the transmission to the GCB.
  • Calculate the center value and horizontal angle of the PV module by image processing.
  • Analyze the gimbal control and tracking convergence according to changes in target vertical-position and target horizontal-angle.
Firstly, we confirmed that the PWM signal was correctly generated by the main controller and transmitted to the GCB. As shown in Figure 12, the PWM waveform period was stably generated at 18 ms, and the pulse width changed according to the position. We also confirmed that the gimbal's BLDC motors rotated properly in response to the generated PWM signal.
Secondly, we extracted the center and horizontal angle values of the PV module from the image. Figure 13 shows one sample case of the image processing pipeline: the acquired image, the PV module region extraction, the binarization, the Hough transform image, and the overall result are displayed in sequence. We acquired a visible image and extracted the PV area using the inRange function, with an HSV range of 80 to 140 for hue, 65 to 255 for saturation, and 65 to 255 for brightness. To find the module's center position, we binarized the image with a threshold of 50 gray-levels and found the boundary using the contours function. We used the Hough function on an image binarized with a threshold of 250 gray-levels to find the PV module's horizontal line.
Figure 14 shows various angle and center point extraction experiments with the PV module. Table 1 and Table 2 show the extracted values and error values for the extracted horizontal angle and center point. Compared with the ground truth, the horizontal angle showed an error of about ±8°, and the center position value showed ±28 pixels.
Thirdly, we checked that tilt and roll control positioned the ROI at the center and captured it horizontally. In this experiment, the PID control parameters (Kp, Kd, Ki) for the PWM output were set to 0.8, 0.4, and 0.01, respectively. The experimental results are shown in Figure 15 and Figure 16. The PWM is derived from the center and horizontal values of the extracted PV module, and the gimbal's tilt and roll are adjusted according to the PWM signal so that the camera looks directly at the PV module.
In the experiment, we tested whether the tilt and roll were corrected to the target position and angle when the center position and horizontal position of the target PV module were randomly changed. As shown in Figure 15, the tilt correction result shows that the vertical angle of the gimbal is adjusted to the target position when the center position of the target PV module changes to 200, 400, 100, and 350. At this time, the tilt correction error is about 6 pixels, and the PV module seen from the drone camera is positioned at the center. As shown in Figure 16, when the horizontal angle of the target PV module was changed to 43, −75, 10, and 70 degrees, the roll rotated to approximate each horizontal angle. At this time, the roll correction error was around 3 degrees, and the PV module seen by the drone camera was aligned horizontally. Through this, the ROI of the PV module is located at the center, and it is confirmed that the proper image is taken in real-time.

5. Conclusions

Herein, we proposed a gimbal control system that enables drones to capture effective frames from a thermal camera for detecting deteriorated areas of PV modules by locating the target object at the center of the image. The system involves real-time image processing and PID signal generation. To calculate the proper angles (pitch, roll, yaw), blob analysis and the Hough transform are mainly used for center detection and angle calculation, respectively, and the resulting position and angle are converted into a PID signal. These processes feed back into each other and keep the target PV module stable in the frame. Experimental results showed that target modules can be kept in the center area under limited conditions. In the near future, we will develop a fully automatic drone system to detect deteriorated areas using this gimbal control system, after verifying its performance in real environments with conditions such as strong wind, tall trees, and cloudy or rainy weather.

Author Contributions

Conceptualization, H.-C.P. and H.J.; methodology, H.-C.P.; software, H.-C.P.; validation, H.-C.P., H.J., and S.-W.L.; formal analysis, H.J.; investigation, H.-C.P.; resources, S.-W.L.; data curation, H.J.; writing—original draft preparation, H.J.; writing—review and editing, H.-C.P.; visualization, H.J.; supervision, S.-W.L.; project administration, H.J.; funding acquisition, H.J. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by Institute for Information & Communications Technology Promotion (IITP) grant funded by the Korea government (MSIP) (NO. 2017-0-01712) and the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIT) (2019R1F1A1062075).

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this paper.

References

  1. Gao, X.; Munson, E.; Abousleman, G.P.; Si, J. Automatic solar panel recognition and defect detection using infrared imaging. In Automatic Target Recognition XXV; International Society for Optics and Photonics: Baltimore, MD, USA, 2015; Volume 9476. [Google Scholar]
  2. Jeong, H.; Kwon, G.R.; Lee, S.W. Deterioration diagnosis of solar module using thermal and visible image processing. Energies 2020, 13, 2856. [Google Scholar] [CrossRef]
  3. Uriarte, S.; Saenz, M.J.; Hernando, F.; Jimeno, J.C.; Martínez, V.E.; Egido, M.A.; Elorduizapatarietxe, S. Energy integrated management system for PV applications. In Proceedings of the 20th European Photovoltaic Solar Energy Conference, Barcelona, Spain, 6–10 June 2005; pp. 2292–2295. [Google Scholar]
  4. Kase, R.; Nishikawa, S. Fault detection of bypass circuit of PV module—Detection technology of open circuit fault location. In Proceedings of the 19th International Conference on Electrical Machines and Systems (ICEMS), Chiba, Japan, 13–16 November 2016; pp. 1–4. [Google Scholar]
  5. Anwar, S.A.; Abdullah, M.Z. Micro-crack detection of multicrystalline solar cells featuring an improved anisotropic diffusion filter and image segmentation technique. Eurasip J. Image Video Process. 2014, 2014, 15. [Google Scholar] [CrossRef] [Green Version]
  6. Tsai, D.M.; Wu, S.C.; Chiu, W.Y. Defect detection in solar modules using ICA basis images. IEEE Trans. Ind. Inf. 2012, 9, 122–131. [Google Scholar] [CrossRef]
  7. Koch, S.; Weber, T.; Sobottka, C.; Fladung, A.; Clemens, P.; Berghold, J. Outdoor electroluminescence imaging of crystalline photovoltaic modules: Comparative study between manual ground-level inspections and drone-based aerial surveys. In Proceedings of the 32nd European Photovoltaic Solar Energy Conference and Exhibition, Munich, Germany, 21–24 June 2016; pp. 1736–1740. [Google Scholar]
  8. Fuyuki, T.; Kitiyanan, A. Photographic diagnosis of crystalline silicon solar cells utilizing electroluminescence. Appl. Phys. A 2009, 96, 189–196. [Google Scholar] [CrossRef]
  9. Vergura, S.; Falcone, O. Filtering and processing IR images of PV modules. RE&PQJ (ISSN 2172-038X). Renew. Energy Power Qual. J. 2011, 9, 1209–1214. [Google Scholar]
  10. Kaplani, E. Detection of degradation effects in field-aged c-Si solar cells through IR thermography and digital image processing. Int. J. Photoenergy 2012, 2012, 396792. [Google Scholar] [CrossRef] [Green Version]
  11. Salazar, A.M.; Macabebe, E.Q.B. Hotspots detection in photovoltaic modules using infrared thermography. In MATEC Web of Conferences; EDP Sciences: Les Ulis, France, 2016; Volume 70, p. 10015. [Google Scholar]
  12. Henry, C.; Poudel, S.; Lee, S.W.; Jeong, H. Automatic Detection System of Deteriorated PV Modules Using Drone with Thermal Camera. Appl. Sci. 2020, 10, 3802. [Google Scholar] [CrossRef]
  13. Gašparović, M.; Jurjević, L. Gimbal influence on the stability of exterior orientation parameters of UAV acquired images. Sensors 2017, 17, 401. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  14. Mingkhwan, E.; Khawsuk, W. Digital image stabilization technique for fixed camera on small size drone. In Proceedings of the Third Asian Conference on Defence Technology (ACDT), Phuket, Thailand, 18 January 2017; pp. 12–19. [Google Scholar]
  15. Vali, A.R.; Abdo, M.; Arvan, M.R. Modeling, control and simulation of cascade control servo system for one axis gimbal mechanism. Int. J. Eng. 2014, 27, 157–170. [Google Scholar]
  16. Kim, M.; Byun, G.S.; Kim, G.H.; Choi, M.H. The Stabilizer Design for a Drone-Mounted Camera Gimbal System Using Intelligent-PID Controller and Tuned Mass Damper. Int. J. Control Autom. 2016, 9, 387–394. [Google Scholar] [CrossRef]
  17. Fliess, M.; Join, C. Model-free control and intelligent PID controllers: Towards a possible trivialization of nonlinear control. In Proceedings of the 15th IFAC Symposium on System Identification Saint-Malo 2009, Saint-Malo, France, 6–8 July 2009; Volume 42, pp. 1531–1550. [Google Scholar]
  18. Erkut, U.; Bostancıŏglu, F.; Erten, M.; Özbayŏglu, A.M.; Solak, E. HSV color histogram based image retrieval with background elimination. In Proceedings of the 1st International Informatics and Software Engineering Conference (UBMYK), Ankara, Turkey, 6 November 2019; pp. 1–5. [Google Scholar]
  19. Tang, K.; Huang, S.; Tan, K.; Lee, T. Combined PID and adaptive nonlinear control for servo mechanical systems. Mechatronics 2004, 14, 701–714. [Google Scholar] [CrossRef]
  20. Fliess, M.; Join, C. Intelligent PID controllers. In Proceedings of the 16th Mediterranean Conference on Control and Automation, Ajaccio, France, 25–27 June 2008; pp. 326–331. [Google Scholar]
  21. d’Andréa Novel, B.; Fliess, M.; Join, C.; Mounier, H.; Steux, B. A mathematical explanation via “intelligent” PID controllers of the strange ubiquity of PIDs. In Proceedings of the 18th Mediterranean Conference on Control and Automation, MED’10, Marrakech, Morocco, 23–25 June 2010; pp. 395–400. [Google Scholar]
Figure 1. Images from different altitudes: (a) image from visible camera at 100-m height; (b) image from thermal camera at 100-m height (low resolution); (c) image from visible camera at 10 m; (d) image from thermal camera at 10 m (low resolution).
Figure 2. Comparison between proper and improper image acquisition: (a) visible image case 1; (b) thermal image case 1, improper image with ROI not centered; (c) visible image case 2; (d) thermal image case 2, proper image with ROI centered.
Figure 3. Comparison between using and not using a gimbal during a drone flight.
Figure 4. Applied gimbal system.
Figure 5. Gimbal control system (GCS) equipped with dual camera.
Figure 6. Block diagram of photovoltaic module imaging system using drones equipped with thermal and visible cameras.
Figure 7. Gimbal control system (GCS).
Figure 8. Example of pulse width modulation.
Figure 9. Implementation of GCS; PWM1 to control roll, PWM0 to control tilt.
Figure 10. Implementation of image processing.
Figure 11. Our system for gimbal control experiment.
Figure 12. Pulse width modulation (PWM) waveform generated by GCS.
Figure 13. Experiment result image processing: (a) original image from visible camera; (b) extracting the area of PV (inRange function); (c) binarized image from image (b) with threshold of 50 gray-value; (d) binarized image from image (b) with threshold of 250 gray-value; (e) converted value from the Hough function; (f) center position of module from image (c).
Figure 14. Experiment result image processing for various cases with different angles and positions: (a) case a; (b) case b; (c) case c; (d) case d; (e) case e; (f) case f; (g) case g.
Figure 15. Gimbal control response according to the change of up and down position.
Figure 16. Gimbal control response according to left and right angle change.
Table 1. Experiment result for horizontal angle cases.

Case No.                    (a)     (b)     (c)     (d)     (e)     (f)     (g)
Horizontal angle [degree]   20.0    6.0     −14.0   −26.6   −43.2   19.0    −1.0
Ground truth [degree]       20.0    7.0     −15.0   −34.0   −44.0   20.0    −1.0
Error [degree]              0.0     −1.0    −1.0    −7.4    −0.8    −1.0    0.0

Table 2. Experiment result for vertical position cases.

Case No.                    (a)     (b)     (c)     (d)     (e)     (f)     (g)
Vertical position [pixel]   124     158     234     190     358     430     272
Ground truth [pixel]        96      168     240     192     350     404     273
Error [pixel]               28      −10     −6      −2      8       26      −1

Park, H.-C.; Lee, S.-W.; Jeong, H. Image-Based Gimbal Control in a Drone for Centering Photovoltaic Modules in a Thermal Image. Appl. Sci. 2020, 10, 4646. https://doi.org/10.3390/app10134646

