Article

A Hierarchical Stabilization Control Method for a Three-Axis Gimbal Based on Sea–Sky-Line Detection

Zhanhua Xin, Shihan Kong, Yuquan Wu, Gan Zhan and Junzhi Yu
1 Department of Advanced Manufacturing and Robotics, College of Engineering, Peking University, Beijing 100871, China
2 Institute of Software, Chinese Academy of Sciences, Beijing 100190, China
3 School of Mechatronical Engineering, Beijing Institute of Technology, Beijing 100081, China
* Author to whom correspondence should be addressed.
Sensors 2022, 22(7), 2587; https://doi.org/10.3390/s22072587
Submission received: 30 January 2022 / Revised: 20 March 2022 / Accepted: 26 March 2022 / Published: 28 March 2022
(This article belongs to the Special Issue Perception, Planning and Control of Marine Robots)

Abstract

Obtaining a stable video sequence for cameras on surface vehicles is always a challenging problem due to the severe disturbances in heavy sea environments. Aiming at this problem, this paper proposes a novel hierarchical stabilization method based on real-time sea–sky-line detection. More specifically, a hierarchical image stabilization control method that combines mechanical image stabilization with electronic image stabilization is adopted. With respect to the mechanical image stabilization method, a gimbal with three degrees of freedom (DOFs) and with a robust controller is utilized for the primary motion compensation. In addition, the electronic image stabilization method based on sea–sky-line detection in video sequences accomplishes motion estimation and compensation. The Canny algorithm and Hough transform are utilized to detect the sea–sky line. Noticeably, an image-clipping strategy based on prior information is implemented to ensure real-time performance, which can effectively improve the processing speed and reduce the equipment performance requirements. The experimental results indicate that the proposed method for mechanical and electronic stabilization can reduce the vibration by 74.2% and 42.1%, respectively.

1. Introduction

As global marine activity continues to intensify, military and civilian operations at sea have increased dramatically, driving rapid growth in the use of marine video. In the civilian field, conventional offshore operations, exploration, and search and rescue often rely on maritime video. In the military field, visual information, as the main means of passive perception, also has significant value. However, during video capture, severe disturbances caused by wave fluctuations and mechanical vibrations of the camera platform degrade video quality and considerably affect subsequent processing and applications [1]. In addition, for storage and transmission, stable marine videos can substantially improve the video compression ratio, which helps to save storage space, speed up transmission, and obtain higher image quality at the same bit rate [2].
Video stabilization is an effective way to mitigate these disturbances, but many difficulties remain. On the one hand, in a heavy sea environment, noise signals of different frequencies are superimposed by the violent vibrations. The visual environment is complex and ever-changing and often lacks stable feature points, which greatly complicates the stabilization process. On the other hand, the motion observed in surface video includes not only the intentional motion of the surface vehicle and the camera platform, but also motion caused by other detrimental external factors [3]. Fast visual-field movements may also cause motion blur [4]. Moreover, applications such as live broadcasting and further image processing place higher demands on the timeliness of the stabilization process. In general, onboard video stabilization in heavy sea environments is a highly complex, strongly coupled, nonlinear problem with strict real-time requirements, and it has important theoretical significance and practical value [1].
Traditional video stabilization methods can be roughly divided into three kinds [5]. Mechanical image stabilization (MIS) keeps the image stable by compensating for the motion of the camera platform. Optical image stabilization (OIS) keeps the image stable by compensating for the optical path [6]. Electronic image stabilization (EIS) keeps the visual field stable through image processing. To date, motion compensation with a gimbal is a mature technology that is widely used by researchers [7,8,9,10], and many new gimbal control methods have been proposed for different applications [11,12]. Cheng et al. designed a PID control method combined with a BP neural network, which offers a short settling time, good tracking characteristics, a smooth transition process, and small overshoot, and can quickly meet control requirements [13]. Ahi and Nobakhti developed a comprehensive dynamic model of a one-axis gimbal mechanism and demonstrated the effectiveness of active disturbance rejection control (ADRC) in the presence of external disturbances and parameter uncertainties [14]. Hao et al. proposed an optical stabilization system based on deformable mirrors (DMs) for retina-like sensors, which achieves image stabilization by tilting the DM's reflective surface to compensate for camera motion [15]. In addition, various corner detection algorithms have been employed for corner detection and matching to obtain inter-frame motion estimates [1,3,16,17].
However, when tackling a more complex and ever-changing environment, a single stabilization method may not be adequate. A purely mechanical stabilization method is readily affected by friction, wind force, sensor errors, etc., and further improvement relies mainly on hardware optimization, which comes at a higher cost. Optical stabilization requires delicate and complicated optical anti-vibration devices. The accuracy of electronic stabilization often depends on the stability of the gray values or the accuracy of feature points. In heavy sea environments, feature points may exist only on occasional landmarks or on clouds in the sky, which can cause more mismatches and losses [5]. The gray values of an image may also be unstable under changing lighting conditions. Furthermore, an electronic stabilization method loses a great deal of information when the visual field shakes severely, rendering it nearly ineffective. Hence, a hierarchical stabilization method is an effective way to solve these problems. Bayrak indicated that combining mechanical and electronic stabilization can achieve better results than a single stabilization method in the case of high-amplitude jitter [18]. Windau and Itti presented a multilayer real-time video image stabilization system and suggested that different kinds of stabilization methods can interact with and complement each other [19]. These results support the rationale of a hierarchical stabilization control method.
In order to accomplish image stabilization with satisfactory timeliness and sensitivity in a heavy sea environment, a hierarchical stabilization method combining mechanical and electronic stabilization is proposed in this paper. A camera is mounted on a three-DOF gimbal with a robust controller, and stabilization is realized through real-time motion compensation for large-angle vibrations followed by electronic image stabilization. The main contributions can be summarized as follows:
  • A hierarchical stabilization control method is proposed to cope with the complex environment of a heavy sea. A mechanical stabilization process is combined with an electronic stabilization process, which greatly improves the effectiveness and robustness.
  • An electronic stabilization method based on sea–sky-line detection is proposed. In addition, an improved stabilization method of image clipping based on prior information is utilized. The experimental results reveal that the proposed method can effectively stabilize an image in a complex environment with noises of various frequencies superimposed. The algorithm can achieve more than 50 fps by clipping images according to prior information, which fully meets the real-time demand.
The remainder of this paper is structured as follows. The kinematic model of the three-DOF gimbal is described in Section 2, followed by the derivation of the relationship between the kinematic parameters and the sea–sky line. The detailed experimental results are presented in Section 3. Conclusions and future work are given in Section 4.

2. Hierarchical Stabilization Control Method

A visual flowchart of the proposed method is illustrated in Figure 1. The mechanical and electronic stabilization methods are combined in series, both of which are discussed below.

2.1. Kinematic Model

It is assumed that the surface vehicle is at sea and the camera is carried on the surface vehicle, with no reference object on the sea surface other than the sea–sky line. Taking the influence of atmospheric refraction into account, the relationship between θ and h is given by [20]:
\theta = \sqrt{\frac{22h}{13r}} \approx 0.0295\sqrt{h},   (1)
where r is the radius of the Earth, θ is the declivity (dip) angle, and h is the height above sea level; with h in meters, the approximate form gives θ in degrees.
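As an illustrative order-of-magnitude check (our own calculation, not part of the original experiments), a camera mounted h = 4 m above the water gives

\theta \approx 0.0295\sqrt{4}\,^{\circ} \approx 0.06^{\circ} \approx 1\ \mathrm{mrad},

so the dip of the horizon is negligible for a camera carried close to the sea surface, which motivates the approximations made below.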
Since the variation in h caused by ocean waves is negligible relative to the radius of the Earth, θ can be regarded as approximately constant. We further assume that h is close to zero; in other words, the camera is located near sea level and fluctuates only slightly with the waves. In this case, θ is approximately zero and the sea–sky line passes through the center of the field of view, so the center of the field of view is taken as the reference origin of the sea–sky line. In addition, the two translational motions of the camera along the sea surface obviously have no effect on the position of the sea–sky line in the field of view.
The relationship between the three rotational degrees of freedom of the camera and the sea–sky line is described below. Let the reference frame be any fixed coordinate frame whose X O Y plane coincides with the sea surface. It is assumed that the orientation of the camera's fixed coordinate frame, as shown in Figure 2, is obtained through a sequence of relative rotations of the base frame about its own Z-X-Y axes:
p_0 = R_1^0 p_1 = R_1^0 R_2^1 p_2 = R_1^0 R_2^1 R_3^2 p_3 = R_3^0 p_3,   (2)
where p_0 is the position of an arbitrary sea-level point in the base coordinate frame, and R_m^n is the rotation matrix of the m-th frame with respect to the n-th frame. Specifically, R_1^0, R_2^1, and R_3^2 are given by
R_1^0 = \begin{bmatrix} \cos\alpha & -\sin\alpha & 0 \\ \sin\alpha & \cos\alpha & 0 \\ 0 & 0 & 1 \end{bmatrix},   (3)

R_2^1 = \begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos\beta & -\sin\beta \\ 0 & \sin\beta & \cos\beta \end{bmatrix},   (4)

R_3^2 = \begin{bmatrix} \cos\gamma & 0 & \sin\gamma \\ 0 & 1 & 0 \\ -\sin\gamma & 0 & \cos\gamma \end{bmatrix}.   (5)
Thus, the rotation matrix of the camera can be described as
R_3^0 = R_1^0 R_2^1 R_3^2 = \begin{bmatrix} c_\alpha c_\gamma - s_\alpha s_\beta s_\gamma & -c_\beta s_\alpha & c_\alpha s_\gamma + s_\alpha s_\beta c_\gamma \\ s_\alpha c_\gamma + c_\alpha s_\beta s_\gamma & c_\beta c_\alpha & s_\alpha s_\gamma - c_\alpha s_\beta c_\gamma \\ -c_\beta s_\gamma & s_\beta & c_\beta c_\gamma \end{bmatrix},   (6)
where α, β, and γ denote the successive relative rotation angles of the camera about the Z, X, and Y axes with respect to the base frame, and s and c are shorthand for sin and cos, respectively.
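For readers who prefer code to matrix algebra, the composition in Equation (6) can be checked numerically. The following sketch (illustrative only; NumPy is assumed, and the function names are ours) builds R_3^0 from the three elementary rotations in the Z-X-Y order defined above.

import numpy as np

def rot_z(alpha):
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([[ca, -sa, 0.0], [sa, ca, 0.0], [0.0, 0.0, 1.0]])

def rot_x(beta):
    cb, sb = np.cos(beta), np.sin(beta)
    return np.array([[1.0, 0.0, 0.0], [0.0, cb, -sb], [0.0, sb, cb]])

def rot_y(gamma):
    cg, sg = np.cos(gamma), np.sin(gamma)
    return np.array([[cg, 0.0, sg], [0.0, 1.0, 0.0], [-sg, 0.0, cg]])

def camera_rotation(alpha, beta, gamma):
    # R_3^0 = R_1^0 R_2^1 R_3^2, i.e., Rz(alpha) Rx(beta) Ry(gamma), as in Equation (6)
    return rot_z(alpha) @ rot_x(beta) @ rot_y(gamma)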
Note that p_0 is the position of an arbitrary point at sea level in the base coordinate frame, which can be written as
p_0 = \begin{bmatrix} u \\ v \\ 0 \end{bmatrix},   (7)
where u and v can be any real number. Hence, it follows that
p_3 = \left(R_3^0\right)^{\mathrm{T}} p_0 = \begin{bmatrix} u(c_\alpha c_\gamma - s_\alpha s_\beta s_\gamma) + v(s_\alpha c_\gamma + c_\alpha s_\beta s_\gamma) \\ v c_\alpha c_\beta - u s_\alpha c_\beta \\ u(c_\alpha s_\gamma + s_\alpha s_\beta c_\gamma) + v(s_\alpha s_\gamma - c_\alpha s_\beta c_\gamma) \end{bmatrix},   (8)
which gives the representation of an arbitrary sea-level point, and hence of the sea–sky line, in the camera's fixed coordinate frame.
As can be seen from Figure 2, the angle θ between the X axis and the intersection line of the sea surface with the X O Z plane can be expressed as
\tan\theta = \frac{u(c_\alpha s_\gamma + s_\alpha s_\beta c_\gamma) + v(s_\alpha s_\gamma - c_\alpha s_\beta c_\gamma)}{u(c_\alpha c_\gamma - s_\alpha s_\beta s_\gamma) + v(s_\alpha c_\gamma + c_\alpha s_\beta s_\gamma)} = \tan\gamma,   (9)
where θ is directly related to the intercept of the sea–sky line, as illustrated in Figure 3.
If the sea–sky line is represented in slope-intercept form, the view coordinate is established in the camera’s field of view, and the center of the field of view is taken as the origin, then the relationship between the intercept and the attitude angle can be described as
b = \frac{N\sin\gamma}{2\sin\frac{\eta}{2}},   (10)
where b is the intercept of the sea–sky line, η is the field angle of the camera, and N is the image height in pixels. Similarly, the slope of the sea–sky line in the camera's field of view can be derived from the intersection line of the sea surface with the Y O Z plane:
k = \frac{u(c_\alpha s_\gamma + s_\alpha s_\beta c_\gamma) + v(s_\alpha s_\gamma - c_\alpha s_\beta c_\gamma)}{v c_\alpha c_\beta - u s_\alpha c_\beta} = (\sin\gamma + \cos\gamma)\tan\beta,   (11)
where k is the slope of the sea–sky line, as shown in Figure 4.
According to Equations (10) and (11), the attitude angles β and γ can be solved from k and b, i.e., the slope and intercept of the sea–sky line in the field of view. Although the yaw rotation cannot be recovered from sea–sky-line information, sea waves usually have little effect on yaw motion in real environments.
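As a concrete illustration of this inversion (a sketch under the stated assumptions, not the authors' implementation; the function and argument names are ours), the roll and pitch angles can be recovered from a detected line as follows.

import math

def attitude_from_line(k, b, eta, N):
    # Invert Equation (10): b = N*sin(gamma) / (2*sin(eta/2))
    gamma = math.asin(2.0 * b * math.sin(eta / 2.0) / N)
    # Invert Equation (11): k = (sin(gamma) + cos(gamma)) * tan(beta)
    beta = math.atan(k / (math.sin(gamma) + math.cos(gamma)))
    return beta, gamma

# Example: a line with slope 0.05 and intercept 12 pixels, a 40-degree vertical
# field angle, and a 1080-pixel-high frame (values chosen only for illustration).
beta, gamma = attitude_from_line(0.05, 12.0, math.radians(40.0), 1080)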

2.2. Mechanical Stabilization Method

Mechanical stabilization plays a vital role in image stabilization in heavy sea conditions. Firstly, mechanical stabilization provides primary motion compensation across the entire amplitude spectrum, reducing the difficulty of the electronic stabilization algorithm [19], and it also reduces the proportion of the visual field that must be compensated by the electronic stabilization algorithm [21]. Secondly, electronic stabilization is limited by the exposure time and frame rate of the camera, which makes it difficult to deal with high-speed motion of the camera itself or of the objects in view. Mechanical stabilization can effectively reduce the possibility of motion blur in the visual field and improve the robustness of subsequent algorithms. Furthermore, the high-frequency noise introduced by waves or by the motors on surface vehicles can hardly be handled by simple electronic stabilization methods.
The SimpleBGC 32-bit three-axis gimbal control system developed by Basecam Electronics was used as the controller for the experimental data in this paper, and the hardware platform was a three-axis stabilizer developed by Huizhou FOSICAM Technology. In this system, a traditional PID control method realizes the self-stabilization of the three-DOF gimbal, with a six-axis IMU mounted in parallel next to the video camera. On this basis, a low-pass filter was utilized to suppress high-frequency noise and reduce the possibility of high-frequency resonance, and a superimposed notch (trap) filter removed noise at the specific frequencies of the three axis motors. By applying an adaptive-gain PID method, the PID parameters could be dynamically adjusted to make the motion smoother. To improve the response speed in a heavy sea environment, the PID parameters of the three joint axes were tuned first, and then the parameters related to the adaptive gain were adjusted. The parameters of the adaptive-gain PID method include the RMS error threshold, the attenuation rate, and the recovery factor. The RMS error threshold specifies the error level above which the adaptive PID algorithm is activated; the larger the attenuation rate, the more the proportional PID gain is decreased; and the recovery factor defines how fast the proportional PID gain recovers once the system becomes stable. The detailed steps of mechanical stabilization are shown in Figure 5.
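The controller itself is off-the-shelf firmware, so its internals are not reproduced here. The sketch below only illustrates the adaptive-gain idea described above, i.e., attenuating the proportional gain when the RMS error exceeds a threshold and letting it recover once the loop settles; the exact semantics of the three parameters are our assumption, not the vendor's algorithm.

class AdaptiveGainPID:
    """Illustrative PID whose proportional gain shrinks under large RMS error
    and recovers when the system is stable (assumed semantics)."""

    def __init__(self, kp, ki, kd, rms_threshold, attenuation_rate, recovery_factor):
        self.kp_nominal = kp
        self.kp, self.ki, self.kd = kp, ki, kd
        self.rms_threshold = rms_threshold
        self.attenuation_rate = attenuation_rate
        self.recovery_factor = recovery_factor
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error, rms_error, dt):
        if rms_error > self.rms_threshold:
            # large vibration: reduce the proportional gain to avoid overshoot
            self.kp = max(self.kp * (1.0 - self.attenuation_rate), 0.1 * self.kp_nominal)
        else:
            # stable again: let the proportional gain recover toward its nominal value
            self.kp = min(self.kp + self.recovery_factor * (self.kp_nominal - self.kp) * dt,
                          self.kp_nominal)
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative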

2.3. Electronic Stabilization Method

Generally, the proposed electronic stabilization method includes preprocessing, sea–sky-line detection, motion compensation, and image compensation, as shown in Figure 6. The parts marked in yellow are the key contributions in this paper. Other parts have been extensively studied by researchers in past decades, and will not be covered here.

2.3.1. Preprocessing

The proposed algorithm adopts a conventional electronic image stabilization preprocessing pipeline, including Gaussian filtering, grayscale conversion, contrast adjustment, Canny edge detection, and other steps. Gaussian filtering makes the edge detection result less sensitive to wave fluctuation. In the contrast adjustment step, linear contrast adjustment is used to achieve better and more robust edge detection. The Canny edge detection algorithm computes the gradient image, obtains single-pixel-wide edges after non-maximum suppression, and then produces a binary image containing the candidate sea–sky line through double-threshold processing and edge linking [22].
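A representative preprocessing chain can be written in a few lines of OpenCV. The sketch below follows the steps named above (Gaussian filtering, grayscale conversion, linear contrast adjustment, Canny edge detection); the kernel size and thresholds are illustrative values, not those used in the experiments.

import cv2

def preprocess(frame, canny_low=50, canny_high=150):
    """Return a binary edge map containing candidate sea-sky-line pixels."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (5, 5), 1.5)                       # suppress wave texture
    stretched = cv2.normalize(blurred, None, 0, 255, cv2.NORM_MINMAX)   # linear contrast adjustment
    return cv2.Canny(stretched, canny_low, canny_high)                  # double-threshold edge map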

2.3.2. Sea–Sky-Line Detection

For a binary image containing several candidate sea–sky lines with continuous pixels, the Hough transform is used to extract the true sea–sky line. The classical Hough transform maps points lying on the same line in the original image to a single point in Hough space, forming a peak. A threshold can then be set to extract the line that is most likely to be the sea–sky line in the image [23]. If multiple candidate lines are extracted by the Hough transform, outliers are first removed according to the prior information, and the average slope and intercept of the remaining lines are taken as the final result.
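The detection step can be illustrated with the standard Hough transform in OpenCV, as sketched below: each detected line is converted to slope-intercept form, lines far from the previous frame's estimate are rejected as outliers, and the remaining candidates are averaged. The vote threshold, tolerances, and coordinate convention are assumptions for illustration rather than the exact settings used in the experiments.

import numpy as np
import cv2

def detect_sea_sky_line(edges, prev_k=None, prev_b=None,
                        vote_threshold=150, k_tol=0.2, b_tol=40.0):
    """Return (k, b) of the sea-sky line, with b measured as the vertical offset
    (in pixels, positive downward) from the image center, or None if not found."""
    h, w = edges.shape
    lines = cv2.HoughLines(edges, 1, np.pi / 180.0, vote_threshold)
    if lines is None:
        return None
    candidates = []
    for rho, theta in lines[:, 0]:
        if abs(np.sin(theta)) < 1e-3:
            continue                                   # skip near-vertical lines
        k = -np.cos(theta) / np.sin(theta)             # slope in pixel coordinates
        b_px = rho / np.sin(theta)                     # intercept at x = 0
        b_c = k * (w / 2.0) + b_px - h / 2.0           # offset from the image center
        if prev_k is not None and prev_b is not None:
            if abs(k - prev_k) > k_tol or abs(b_c - prev_b) > b_tol:
                continue                               # outlier w.r.t. prior information
        candidates.append((k, b_c))
    if not candidates:
        return None
    ks, bs = zip(*candidates)
    return float(np.mean(ks)), float(np.mean(bs))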

2.3.3. Interframe Motion Estimation

The motion estimation step obtains the kinematic parameters of the camera from the slopes and intercepts of the sea–sky lines in two consecutive frames. According to Equations (10) and (11), it follows that
\dot{\gamma}_T = f\left(\arcsin\frac{2 b_T \sin\frac{\eta}{2}}{N} - \arcsin\frac{2 b_{T-1} \sin\frac{\eta}{2}}{N}\right),   (12)

\dot{\beta}_T = f\left(\arctan\frac{k_T}{\sin\gamma + \cos\gamma} - \arctan\frac{k_{T-1}}{\sin\gamma + \cos\gamma}\right),   (13)
where f is the reciprocal of the interframe time, i.e., the frame rate, and T is the frame index.
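Equations (12) and (13) translate directly into code. The sketch below is self-contained and illustrative (names are ours); note that the single γ appearing in Equation (13) is taken here as the current-frame value, which is our reading of the equation.

import math

def interframe_rates(k_t, b_t, k_prev, b_prev, eta, N, fps):
    """Pitch/roll angular rates between consecutive frames from sea-sky-line parameters."""
    gamma_t = math.asin(2.0 * b_t * math.sin(eta / 2.0) / N)        # Equation (10) inverted
    gamma_prev = math.asin(2.0 * b_prev * math.sin(eta / 2.0) / N)
    gamma_rate = fps * (gamma_t - gamma_prev)                        # Equation (12)
    s = math.sin(gamma_t) + math.cos(gamma_t)
    beta_rate = fps * (math.atan(k_t / s) - math.atan(k_prev / s))   # Equation (13)
    return beta_rate, gamma_rate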

2.3.4. Image Clipping Based on Prior Information

The preprocessing step also includes image clipping based on prior information, with the aim of reducing the amount of data to be processed and improving timeliness. The clipped image should contain the region of the sea–sky line in the previous frame and the possible interframe motion, plus an additional error redundancy, as shown in Figure 7. Therefore, the height of the retained region, in pixels, is
h_T = \left|k_T\right| W + 2\left|\tan\beta_T - \tan\beta_{T-1}\right| W + \delta,   (14)
where W is the width of the frame and δ is the additional error redundancy, which is set to 20 in this paper.
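The clipping rule of Equation (14) can be sketched as follows. Here the retained band is centered on the row where the previous frame's sea–sky line crossed the image center, and β_T stands for the predicted attitude of the current frame; both choices are our illustrative assumptions.

import numpy as np

def clip_search_band(frame, k_prev, center_row_prev, beta_t, beta_prev, delta=20):
    """Keep only the horizontal band expected to contain the sea-sky line."""
    H, W = frame.shape[:2]
    h_t = abs(k_prev) * W + 2.0 * abs(np.tan(beta_t) - np.tan(beta_prev)) * W + delta  # Eq. (14)
    top = int(max(0, center_row_prev - h_t / 2.0))
    bottom = int(min(H, center_row_prev + h_t / 2.0))
    # 'top' is returned so that rows detected in the clipped band can be mapped
    # back to full-frame coordinates.
    return frame[top:bottom, :], top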

3. Results and Analysis

3.1. Results of the Mechanical Video Stabilization Experiment

For mechanical stabilization, the response of the three-DOF gimbal to large-angle sinusoidal excitation is demonstrated by comparing the angle data measured by the IMUs attached to the pedestal and the camera. In this experiment, a Stewart platform was used as the source of external vibrations to simulate the influence of wave fluctuations on camera motion [24]. The Stewart platform was set to move sinusoidally in its three rotational degrees of freedom only, with an amplitude of 15° and a frequency of 0.5 Hz, simulating the violent vibrations of a heavy sea environment. Two IMUs were attached to measure the angular displacements of the Stewart platform and the stabilized camera. Figure 8 shows the experimental platform for mechanical stabilization at different times. The camera was able to maintain a basically stable attitude while the Stewart platform moved.
Figure 9 plots the pitch angle measured on the Stewart platform and the stabilized platform at 0.5 Hz. The mechanically stabilized motion of the three degrees of freedom was approximately a sine motion. Figure 10 demonstrates the amplitude of the angle displacement of the three degrees of freedom. At 0.5 Hz, the large-angle mechanical vibration is reduced by 74.2% on average. Therefore, it is noticeable that mechanical stabilization can effectively reduce the amplitude and velocity of large-angle motion and reduce the motion compensation and image compensation of electronic stabilization. In addition, since mechanical stabilization has been proven to be effective in improving image quality [25], it can also improve electronic stabilization. Better image quality is beneficial in improving the success rate of the following sea–sky-line detection.

3.2. Results of the Electronic Video Stabilization Experiment

Figure 11 demonstrates the result of each step of the sea–sky-line detection algorithm. The proposed electronic video stabilization algorithm performed well in sea–sky-line scenes. Remarkably, according to the time cost analysis of each step of the algorithm, the electronic video stabilization method based on image clipping could effectively reduce the processing time. Note that the tests were conducted on a 2.5 GHz i7 processor with 16 GB of RAM. For a 1080p input video, the average processing frame rate reached 50 fps, which meets the real-time processing requirement of a camera with a frame rate of 30 fps.
In the meantime, the center of the visual field was taken as the origin for the motion compensation of the sea–sky line. Some representative frames (the 30th, 60th, 90th, and 120th) of the video were selected for display, as shown in Figure 12. In particular, the mean squared error (MSE) was introduced to describe the change in intensity between two images, as shown in Equation (15). The results of the MSE analysis are plotted in Figure 13. As can be readily identified, the mean value of the MSE declined from 108.91 to 60.94, corresponding to a reduction of approximately 42.1%.
\mathrm{MSE} = \frac{1}{M \times N}\sum_{i=0}^{M-1}\sum_{j=0}^{N-1}\left[I(i,j) - K(i,j)\right]^2,   (15)

where M × N is the frame size in pixels and I and K are the intensities of the two frames being compared.
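For completeness, the metric can be computed per frame pair as in the short NumPy sketch below (grayscale frames of identical size assumed).

import numpy as np

def mse(frame_i, frame_k):
    """Mean squared intensity difference between two same-sized grayscale frames, Equation (15)."""
    diff = frame_i.astype(np.float64) - frame_k.astype(np.float64)
    return float(np.mean(diff ** 2))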
Furthermore, a Windows system API was called in a C++ environment to measure the execution time of each step of the electronic stabilization algorithm with microsecond accuracy. The execution time of each step with image clipping based on prior information, compared with the original algorithm, is shown in Figure 14. As can be observed, the image-clipping method based on prior information reduced the time taken for electronic stabilization by 14.9%; except for the Gaussian filtering step, the time consumption of every step was effectively reduced.
To further verify the robustness of the algorithm, we collected additional surface videos captured in different environments and conditions. Video 1 contains strong sunlight reflections on the water surface; as an external interference factor, strong sunlight interrupts the large gradients along the sea–sky line and reduces the success rate of sea–sky-line detection. Video 2 has a mountainous background, which may also affect sea–sky-line detection. Video 3 was shot in harsh conditions: heavy waves prevented the sea–sky line from remaining straight, fog made the sky hazy and blurred the sea–sky line, and, due to the lack of mechanical stabilization, the field of vision shook violently. The original videos are posted on Google Drive (https://drive.google.com/file/d/11iHkR5ggZA1IuXHvCTURpz0XIGFR2BYt/view?usp=sharing (accessed on 28 January 2022)) for readers who need them.
Figure 15 shows the effects of electronic stabilization in the different environments. Clouds and sunlight had little effect on the proposed electronic stabilization algorithm, as shown in Figure 15a,b. The algorithm was also able to distinguish between the sea–sky line and the mountain line in the background, as shown in Figure 15c. Storms at sea severely degrade video quality; even so, the algorithm still achieved a sea–sky-line detection success rate of 92.4%. Table 1 lists the success rates of sea–sky-line detection for the marine videos with different interference factors.

3.3. Discussion

In brief, the proposed method can meet the requirements of image stabilization in a heavy sea environment. Combining mechanical stabilization, which compensates for large-angle motion, with a time-efficient and robust maritime electronic stabilization method achieves a better effect than either method alone. At present, military and civilian activities at sea are becoming more and more frequent. The proposed method can be widely applied to surface vehicles to enhance the quality of video at sea and thus further improve the accuracy of the acquired information, which gives it broad application prospects; it is therefore of considerable commercial value to companies engaged in offshore operations. Stable offshore video also benefits post-processing and applications such as target identification and tracking. Although the proposed hierarchical method is more expensive, since it requires a three-DOF gimbal and an onboard processor, it remains superior to pure electronic stabilization in heavy sea environments.
However, due to the limitations of the experimental site and time, the proposed method was not further verified in more application scenarios. The main disadvantage of the electronic stabilization method proposed in this paper is the lack of robustness in scene switching. According to the current research, this method is suitable for the maritime video environment, but probably not for other environments where the sea–sky line does not exist. In addition, detecting a sea–sky line incorrectly over a period of time or not detecting it may result in poor stabilization performance. These are all problems that might be worth investigating further.

4. Conclusions and Future Work

In this work, a novel hierarchical image stabilization method was proposed and implemented for a three-axis gimbal mounted on surface vehicles; it combines mechanical stabilization with electronic stabilization. Specifically, the mechanical stabilization method can effectively reduce the vibration amplitude and filter high-frequency noise, thus offering better conditions for electronic stabilization algorithms. The electronic stabilization method based on sea–sky-line detection can detect the sea–sky line in real time, calculate the attitude angle, obtain the interframe motion model, and then compensate for the motion. The obtained results demonstrate that the proposed stabilization control method can satisfy both the requirements of robustness and real-time operation in heavy maritime environments.
Ongoing and future work will involve two aspects. On the one hand, according to the characteristics of heavy sea environments, we will design another small three-DOF gimbal to obtain better controllability and anti-interference ability than common hand-held gimbals. On the other hand, the current electronic stabilization algorithm relies solely on sea–sky-line detection and does not exploit other stable features. An urgent task is therefore to improve the electronic stabilization by effectively utilizing stable features other than the sea–sky line in sea–sky scenarios.

Author Contributions

Conceptualization, Z.X., S.K., Y.W. and J.Y.; methodology, Z.X., S.K., Y.W., G.Z. and J.Y.; experiment and analysis, Z.X., S.K. and G.Z.; writing—original draft preparation, Z.X., S.K. and J.Y.; writing—review and editing, S.K. and J.Y.; supervision, J.Y.; project administration, J.Y. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported in part by the National Natural Science Foundation of China under Grant U1909206, Grant 61725305, and Grant T2121002, in part by the Postdoctoral Innovative Talent Support Program under Grant BX2021010, and in part by the S&T Program of Hebei under Grant F2020203037.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Data can be made available upon request from the corresponding author.

Acknowledgments

We are grateful to the research group of Min Tan (State Key Laboratory of Management and Control for Complex Systems, Institute of Automation, Chinese Academy of Sciences).

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Chen, C.H.; Chen, T.Y.; Hu, W.C.; Peng, M.Y. Video stabilization for fast moving camera based on feature point classification. In Proceedings of the 2015 Third International Conference on Robot, Vision and Signal Processing (RVSP), Kaohsiung, Taiwan, 18–20 November 2015; pp. 10–13. [Google Scholar]
  2. Cao, H.; Zhang, J. Video stabilizing and tracking by horizontal line for maritime cruise ship. In Proceedings of the 2007 IEEE International Conference on Control and Automation, Christchurch, New Zealand, 9–11 December 2007; pp. 1202–1206. [Google Scholar]
  3. Liu, W.; Zhang, Y.; Yang, X. A feature-based method for shipboard video stabilization. In Proceedings of the 2019 IEEE 2nd International Conference on Electronic Information and Communication Technology (ICEICT), Harbin, China, 20–22 January 2019; pp. 315–322. [Google Scholar]
  4. Linden, P.; Smith, N.; Rohrbach, T. Automatic optical image stabilization system calibration, validation, and performance for the SkySat constellation. In Proceedings of the AIAA/USU Conference on Small Satellites, Logan, UT, USA, 9 July 2021; pp. 1–6. [Google Scholar]
  5. Zhao, H.; Jin, H.; Xiong, J. Overview of the electronic image stabilization technology. Opt. Precis. Eng. 2001, 8, 149–160. [Google Scholar]
  6. Cardani, B. Optical image stabilization for digital cameras. IEEE Control Syst. Mag. 2006, 26, 21–22. [Google Scholar]
  7. Touil, D.E.; Terki, N.; Hamiane, M.; Aouina, A.; Sidi Brahim, K. Image-based visual servoing control of a quadcopter air vehicle. Int. J. Model. Simul. 2021, 42, 203–216. [Google Scholar] [CrossRef]
  8. Ansari, Z.A.; Sengupta, S.; Singh, A.K.; Kumar, A. Image stabilization for moving platform surveillance. In Proceedings of the Conference on Airborne Intelligence, Surveillance, Reconnaissance (ISR) Systems and Applications IX, Baltimore, MD, USA, 24–26 April 2012; Volume 8360, p. 836009. [Google Scholar]
  9. Dahary, O.; Jacoby, M.; Bronstein, A.M. Digital gimbal: End-to-end deep image stabilization with learnable exposure times. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, Nashville, TN, USA, 19–25 June 2021; pp. 11936–11945. [Google Scholar]
  10. Heya, A.; Hirata, K. Experimental verification of three-degree-of-freedom electromagnetic actuator for image stabilization. Sensors 2020, 20, 2485. [Google Scholar] [CrossRef] [PubMed]
  11. Rajesh, R.J.; Kavitha, P. Camera gimbal stabilization using conventional PID controller and evolutionary algorithms. In Proceedings of the IEEE International Conference on Computer, Communication and Control (IC4), Indore, India, 10–12 September 2015; pp. 1–6. [Google Scholar]
  12. Abdo, M.M.; Vali, A.R.; Toloei, A.R.; Arvan, M.R. Stabilization loop of a two axes gimbal system using self-tuning PID type fuzzy controller. ISA Trans. 2014, 53, 591–602. [Google Scholar] [CrossRef] [PubMed]
  13. Cheng, X.; Zhang, Y. Three-axis balanced camera stabilizer design based on neural network BP algorithms combine PID control. In Proceedings of the International Conference on Electrical, Control, Automation and Robotics (ECAR), Xiamen, China, 16–17 September 2018; pp. 364–368. [Google Scholar]
  14. Ahi, B.; Nobakhti, A. Hardware implementation of an ADRC controller on a gimbal mechanism. IEEE Trans. Control Syst. Technol. 2017, 26, 2268–2275. [Google Scholar] [CrossRef]
  15. Hao, Q.; Fan, F.; Cheng, X.; Wang, D.; Jiang, Y. Optical stabilization system based on deformable mirrors for retina-like sensors. Appl. Opt. 2016, 55, 5623–5629. [Google Scholar] [CrossRef] [PubMed]
  16. Yu, J.; Luo, C.; Jiang, C.; Li, R.; Li, L.Y.; Wang, Z.F. A digital video stabilization system based on reliable SIFT feature matching and adaptive low-pass filtering. In Proceedings of the 1st Chinese Conference on Computer Vision (CCCV), Xi’an, China, 18–20 September 2016; pp. 180–189. [Google Scholar]
  17. Shan, X.; Pang, M.; Zhao, D.; Wang, D.; Hwang, F.J.; Chen, C.H. Maritime target detection based on electronic image stabilization technology of shipborne camera. IEICE Trans. Inf. Syst. 2021, E104D, 948–960. [Google Scholar] [CrossRef]
  18. Bayrak, S. Video Stabilization: Digital and Mechanical Approaches. Master’s Thesis, Middle East Technical University, Ankara, Turkey, 2008. [Google Scholar]
  19. Windau, J.; Itti, L. Multilayer real-time video image stabilization. In Proceedings of the 2011 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), San Francisco, CA, USA, 25–30 September 2011; pp. 2397–2402. [Google Scholar]
  20. Lehn, W.H.; van der Werf, S. Atmospheric refraction: A history. Appl. Opt. 2005, 44, 5624–5636. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  21. Gašparović, M.; Jurjević, L. Gimbal influence on the stability of exterior orientation parameters of UAV acquired images. Sensors 2017, 17, 401. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  22. Liu, X.Y.; Zhao, C.B.; Zhang, S.P.; Li, M.; Qi, Z. A 2-layered structure sea-sky-line detection algorithm based on regional optimal variance. In Proceedings of the 2017 13th IEEE International Conference on Electronic Measurement and Instruments (ICEMI), Yangzhou, China, 20–23 October 2017; pp. 486–490. [Google Scholar]
  23. Ma, T.; Ma, J. A sea-sky line detection method based on line segment detector and Hough transform. In Proceedings of the 2016 2nd IEEE International Conference on Computer and Communications (ICCC), Chengdu, China, 14–17 October 2016; pp. 700–703. [Google Scholar]
  24. Zhan, G.; Niu, S.; Zhang, W.; Zhou, X.; Pang, J.; Li, Y.; Zhan, J. A docking mechanism based on a Stewart platform and its tracking control based on information fusion algorithm. Sensors 2022, 22, 770. [Google Scholar] [CrossRef] [PubMed]
  25. Pan, J.; Zhang, P.; Liu, J.; Yu, J. A Novel Visual Sensor Stabilization Platform for Robotic Sharks Based on Improved LADRC and Digital Image Algorithm. Sensors 2020, 20, 4060. [Google Scholar] [CrossRef]
Figure 1. Flowchart of hierarchical video stabilization control method.
Figure 2. Kinematic model of the camera. The fixed coordinate is obtained through a continuous relative rotation transformation of the base coordinate.
Figure 3. Relationship between θ and the intercept of the sea–sky line.
Figure 4. Sea–sky line in slope-intercept form.
Figure 5. Detailed steps of mechanical stabilization.
Figure 6. Flowchart of the electronic stabilization.
Figure 7. Clipped image based on prior information.
Figure 8. Experimental platform for mechanical stabilization at different timestamps.
Figure 9. Motion compensation effect of mechanical stabilization.
Figure 10. Motion amplitudes before and after motion compensation through mechanical stabilization in three rotating directions.
Figure 11. Process of electronic video stabilization. (a) The original picture; (b) Gaussian filtering; (c) gray-level image; (d) contrast adjustment; (e) Canny edge detection; (f) motion compensation.
Figure 12. Comparison of the original and stabilized images from the electronic stabilization algorithm. (a) The original image; (b) the stabilized image.
Figure 13. Mean squared error of the original video and the stabilized video.
Figure 14. Processing time of each step of electronic stabilization.
Figure 15. The effects of electronic stabilization in different environments: (a) in cloudy weather conditions; (b) the sunlight reflecting heavily on the sea surface; (c) a mountainous background; (d) in a severe storm.
Table 1. The success rate of the electronic stabilization algorithm in different environments.

Video Number      Success Rate    Experiment Frame Number
Original video    98.6%           2000
Video 1           99.1%           400
Video 2           97.7%           1000
Video 3           92.4%           2000
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

