Article

A General Calibration Method for Dual-PTZ Cameras Based on Feedback Parameters

1 Fifth Team of Cadets, Army Military Transportation University, Tianjin 300161, China
2 Unit 32521, Zhenjiang 212000, China
3 Institute of Military Transportation Research, Army Military Transportation University, Tianjin 300161, China
* Author to whom correspondence should be addressed.
Appl. Sci. 2022, 12(18), 9148; https://doi.org/10.3390/app12189148
Submission received: 28 July 2022 / Revised: 7 September 2022 / Accepted: 7 September 2022 / Published: 12 September 2022
(This article belongs to the Special Issue Optical Camera Communications and Applications)

Featured Application

Dual-PTZ cameras can be used in high-precision 3D reconstruction of large scenes, long-distance reconnaissance, active tracking, and target recognition and localization. This work can quickly and accurately obtain the intrinsic and extrinsic parameters of dual-PTZ cameras regardless of how the cameras rotate or zoom, or how the background changes, which can promote the application of dual-PTZ cameras in intelligent unmanned systems.

Abstract

With the increasing application of dual-PTZ (Pan-Tilt-Zoom) cameras in intelligent unmanned systems, research regarding their calibration methods is becoming more and more important. The intrinsic and extrinsic parameters of dual-PTZ cameras continuously change during rotation and zoom, resulting in difficulties in obtaining precise calibration. Here, we propose a general calibration method for dual-PTZ cameras with variable focal length and posture under the following conditions: the optical center of the camera does not coincide with the horizontal and pitch rotation axes, and the horizontal and pitch rotation axes are not perpendicular to each other. We establish a relationship between the intrinsic and extrinsic parameters and the feedback parameters (pan, tilt, zoom value) of dual-PTZ cameras by fitting and calculating previous calibration results acquired at specific angles and zoom values using Zhang's calibration method. Subsequently, we derive the intrinsic and extrinsic parameter calculation formulas at arbitrary focal length and posture based on the camera's feedback parameters. The experimental results show that the intrinsic and extrinsic parameters computed using the proposed method meet precision requirements when compared with the ground truth calibrated using Zhang's method: the average focal length error is less than 4%, the cosine similarity of the rotation matrix between the left and right cameras is more than 99.8%, the translation vector error is less than 1%, and the recalculated Euler angle errors are less than 1 degree. Our work can quickly and accurately obtain intrinsic and extrinsic parameters during the use of a dual-PTZ camera.

1. Introduction

PTZ (Pan-Tilt-Zoom) cameras, also known as gimbal cameras, can achieve horizontal rotation, pitch rotation, and lens zoom control. They can also provide the current horizontal angle, pitch angle, and zoom value in real time via SDK. PTZ cameras can not only obtain image information from different regions via rotation but they can also obtain spatial images with rich target texture information from different depths, using zoom without sacrificing high resolution in a large range.
Dual-PTZ cameras simulate the chameleon's vision system, in which the left and right eyes can rotate independently. The left and right cameras can rotate independently to actively track dynamic targets or reconnoiter the surrounding environment via rotation and zoom. They can also point in the same direction and activate stereovision to achieve the 3D positioning of their targets. Based on these characteristics, more and more intelligent unmanned systems use PTZ cameras as sensors in their perception modules. They can be used in high-precision 3D reconstruction of large scenes, long-distance reconnaissance, active tracking, and target recognition and localization [1,2,3,4,5].
Accurate intrinsic and extrinsic parameter calibration results at arbitrary angles and focal lengths are the basis of the aforementioned applications of dual-PTZ cameras. However, in the process of rotating and zooming, the intrinsic and extrinsic parameters change. Furthermore, due to the motion of intelligent unmanned systems, the background of the camera is variable as well. As a result, achieving accurate calibration results for the intrinsic and extrinsic parameters of these cameras in real time has always been a challenge.
Because the camera's rotation angle and zoom value are fed back in real time, we can build a general model for a dual-PTZ camera and calculate its intrinsic and extrinsic parameters directly from its feedback parameters. Previous works have adopted an idealized PTZ camera model based on two assumptions: that the horizontal and pitch rotation axes of the camera coincide with the optical center of the camera, and that the horizontal and pitch axes are orthogonal to each other [6]. However, this idealized model does not hold for all PTZ cameras, because perfectly ideal manufacturing is impossible in real production processes, let alone the fact that some PTZ cameras are deliberately designed with the optical center far away from the axes.
Our main contributions to the literature are as follows:
(1)
Establishing a general model for dual-PTZ cameras and distinguishing the two different modes of PTZ cameras according to the master-slave relationship between the horizontal rotation and the pitch rotation;
(2)
Proposing an actual rotation vector and rotation radius vector calibration method based on the proposed model;
(3)
Deriving the calculation formula of intrinsic and extrinsic parameters at an arbitrary pan degree, tilt degree, and zoom value based on feedback parameters.
This paper is organized as follows: Section 2 introduces the research progress and related calibration work of dual-PTZ cameras. Section 3 establishes a general model for dual-PTZ cameras. Section 4 presents the fitting method of zoom value and camera’s intrinsic parameters. Section 5 deduces the computational formula of the camera’s extrinsic parameters based on feedback parameters in variable focal length and posture. Section 6 provides the experimental results. The results calibrated by Zhang’s method [7] are taken as the ground truth to validate the accuracy of our proposed method.

2. Related Works

More and more research is devoted to the calibration of PTZ cameras. To calibrate intrinsic parameters, Kumar et al. collected a large number of pre-calibrated intrinsic parameters at dispersed zoom values as a dataset [3] and, while the camera was in use, obtained the parameters from a lookup table based on the camera's current zoom value. However, this method is time-consuming, and only a few discrete focal length values can be obtained. Sinha et al. discussed the automatic calibration of intrinsic parameters and radial distortion from captured images, and they validated their approach by calibrating two different cameras [8]. Wan [6] and Wu established different fitting models between focal length and zoom scale for different cameras, and Wu also modeled the lens distortion coefficient as a function of the zoom scale [9]. For lens distortion, Alvarez conducted a thorough study of the mathematical model relating lens distortion and focal length [10]. Chen used a distortion matrix to describe the distortions of different parts of the lens under non-axisymmetric conditions [11].
For the calibration of extrinsic parameters, Wan, based on the aforementioned two assumptions in Section 1, deduced the rotation matrix relative to the original position according to the rotation angle [6,12]. In this model, the rotation will not produce a translation vector. Some researchers have considered the offset between the rotation axis and the optical center for single cameras and dual cameras, and they have introduced more complete models for the motion of the camera to avoid the problem above [13,14,15,16]. Chang et al. [17] proposed a calibration method based on Zhang’s approach, using an augmented checkerboard composed of eight small checkerboards. This calibration is formulated as an optimization problem to be solved with an improved particle swarm optimization (PSO) method.
Some of the above calibration methods for intrinsic and extrinsic parameters rely on extracting a series of image feature points or on a special calibration board to obtain the parameters, for example, Refs. [8,9,10,17]; these methods suit situations where the camera position is fixed relative to the background, so they are not applicable when the camera is mounted on a mobile intelligent unmanned system. Refs. [6,9,10,12] discussed the relationship between the intrinsic and extrinsic parameters and the feedback parameters based on different models. Moreover, Ref. [9] addressed the problem that the feedback parameters reported by PTZ cameras become inaccurate after many hours of operation, proposing a method that achieves accurate calibration with 10 images, which solves the problem of feedback precision.
However, the works above rarely propose a general calibration method under a nonidealized model to systematically determine both the intrinsic and extrinsic parameters of dual-PTZ cameras; the majority of the works are only for single PTZ cameras. In this article, we study how to use real-time feedback parameters (pan angle, tilt angle, zoom value) to calculate the intrinsic and extrinsic parameters of dual-PTZ cameras based on our proposed general model.

3. The Dual-PTZ Camera Model

The schematic diagram of the general, but not idealized, model for dual-PTZ cameras in this paper is shown in Figure 1a. The positions of the left and right cameras are relatively fixed, and each can rotate in the horizontal and pitch directions. v_l, h_l and v_r, h_r are the direction vectors of the left and right cameras' horizontal and pitch rotation axes, respectively; they are collectively referred to as rotation vectors. O_l and O_r are the optical centers of the left and right cameras, and O_l-xyz and O_r-xyz are the coordinate systems of the left and right cameras, respectively. Each camera coordinate system is right-handed, with its origin at the optical center. In our model, the optical center does not coincide with the horizontal and pitch rotation axes, and the horizontal and pitch rotation axes are not absolutely perpendicular. In the process of zooming, the optical center moves slightly back and forth along the optical axis (z_l or z_r) in order to obtain images at different depths.
We built different rotation models for PTZ cameras with different slave relationships between the horizontal and the pitch axes (as seen in Figure 1b,c) and derived different calculation formulas for different models.
We consider the intrinsic and extrinsic parameter variation models of dual-PTZ cameras separately below.

3.1. Intrinsic Parameter Variation Model

We used the pinhole imaging model to describe the imaging principle of the camera under the assumption that the skew of the imaging plane is zero and the principal point is approximately replaced with the zoom center (Figure 2). The calibration of the intrinsic parameter is meant to obtain the camera’s focal length, principal point, distortion parameters, etc. The intrinsic parameters of the camera are inherent to the nature of the camera lens, and so they have nothing to do with the rotation angle of the camera.

3.2. Extrinsic Parameter Variation Model and Meaning

In Figure 3, p_w is the coordinate of an object in the world coordinate system, and p_l and p_r are the coordinates of p_w in the left and right camera coordinate systems after imaging through the optical centers. Taking the variations in extrinsic parameters caused by horizontal rotation and zoom as an example, the optical center moves from the initial position o_1 to position o_2.
In the initial position, o_1, the extrinsic parameters between the left and right cameras can be described as follows: via the rotation matrices R_l, R_r and translation vectors T_l, T_r, we can obtain p_l and p_r. The transformation process can be expressed as:

p_l = R_l p_w + T_l,    p_r = R_r p_w + T_r    (1)

From Expression (1), we can obtain the following:

p_r = R_r R_l^{-1} p_l + T_r - R_r R_l^{-1} T_l    (2)

According to the definition, the rotation matrix between the cameras is R_0^{lr} = R_r R_l^{-1}, and the translation vector is T_0^{lr} = T_r - R_r R_l^{-1} T_l, both of which can be easily obtained with Zhang's calibration method.
From Expression (2), we know that the rotation matrix R^{lr} also represents the rotation transformation from point p_l in the left camera coordinate system to point p_r in the right camera coordinate system. This can also be interpreted as a transformation from the right camera coordinate system to the left camera coordinate system. The translation vector, T^{lr}, describes the translation from the right camera's optical center to the left camera's optical center in the right camera coordinate system, O_r-xyz. When the optical center reaches position o_2, the camera's extrinsic parameters can be derived from the above relationships as well.
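As a minimal numerical sketch (NumPy, with fabricated poses; none of these values come from the paper), Expression (2) can be verified by composing two arbitrary camera poses and checking that the recovered relative pose maps a point from the left camera frame to the right camera frame:

```python
import numpy as np

def relative_pose(R_l, T_l, R_r, T_r):
    """Relative pose between two cameras (Expression (2)):
    p_r = R0_lr @ p_l + T0_lr, with R0_lr = R_r R_l^-1 and
    T0_lr = T_r - R_r R_l^-1 T_l."""
    R0_lr = R_r @ np.linalg.inv(R_l)
    T0_lr = T_r - R0_lr @ T_l
    return R0_lr, T0_lr

def rot_z(a):
    """Rotation about the z axis, used only to fabricate example poses."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# Arbitrary world-to-camera poses for the left and right cameras.
R_l, T_l = rot_z(0.10), np.array([0.2, 0.0, 1.0])
R_r, T_r = rot_z(-0.20), np.array([-0.3, 0.1, 1.1])

# A world point imaged by both cameras (Expression (1)).
p_w = np.array([0.5, -0.4, 3.0])
p_l = R_l @ p_w + T_l
p_r = R_r @ p_w + T_r

R0_lr, T0_lr = relative_pose(R_l, T_l, R_r, T_r)
assert np.allclose(R0_lr @ p_l + T0_lr, p_r)
```

In practice, R_0^{lr} and T_0^{lr} come from Zhang's calibration rather than from known poses; the round trip above only checks the algebra.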

4. The Estimation of Intrinsic Parameters

Variations in a PTZ camera's intrinsic parameters are mainly caused by changes in the zoom value. The intrinsic parameters of the camera include the focal length, distortion parameters, the principal point, etc. We mainly researched the calibration of the focal length and radial distortion parameters, which affect the actual use of dual-PTZ cameras. The principal point is idealized as the zoom center [6], regardless of changes in focal length. Using previous calibration under different zoom values, we can obtain several groups of discrete intrinsic parameters; hence, the functional relationships between the zoom value and the focal length and radial distortion can be established. With these functions, we can estimate the intrinsic parameters according to the zoom value fed back by the camera in real time.

4.1. Fitting Method of Focal Length

The focal length of the camera increases as the zoom value increases, but the variation trend and fitting model for the focal length differ between PTZ cameras (Figure 4). For the PTZ camera in Ref. [12], the variation rate of the focal length also increases as the zoom value increases, so we fitted this trend with an exponential function model. In the case of Ref. [18], where the focal length is approximately linear with the zoom value, we used the linear interpolation method or a quadratic function to establish the function expression.
We denote the current zoom value as z. By observing the changing trend of the curve [12], the exponential fitting model of the focal length can be written as:

f(z) = p_1 e^{p_2 z} + p_3 e^{p_4 z}    (3)
For the linear model, we can easily devise a linear interpolation and quadratic function approach.
The linear interpolation approach fits the focal length, f, as a piecewise function of z according to the focal lengths previously calibrated under different zoom values:

f(z) = (z - z_i) / (z_i - z_{i-1}) · (f_i - f_{i-1}) + f_i    (4)

In Expression (4), f_i is the known focal length under the zoom value z_i. Obviously, the estimation accuracy depends on the quantity and distribution of the previous calibration data.
With respect to the focal length, the quadratic function of z can also be modeled as:
f(z) = a z^2 + b z + c    (5)
In practice, we can compare the two models to find out which one is better. Based on the results above, we can estimate the intrinsic parameter matrix, K:
K = [ f_x   0   u_0
       0   f_y  v_0
       0    0    1  ]    (6)
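The focal-length models (4)–(6) are straightforward to implement. The sketch below uses NumPy with illustrative (zoom, focal length) pairs that are our own placeholders, not calibration data from the paper:

```python
import numpy as np

# Illustrative pre-calibrated (zoom, focal length in pixels) pairs.
zooms = np.array([1.0, 5.0, 10.0, 15.0, 20.0])
focals = np.array([1450.0, 7800.0, 15500.0, 23400.0, 28400.0])

def f_linear(z):
    """Piecewise-linear interpolation of the focal length (Expression (4))."""
    return np.interp(z, zooms, focals)

def f_quadratic(z):
    """Quadratic model f(z) = a z^2 + b z + c (Expression (5)),
    fitted to the pre-calibrated pairs by least squares."""
    a, b, c = np.polyfit(zooms, focals, 2)
    return a * z**2 + b * z + c

def intrinsic_matrix(fx, fy, u0, v0):
    """Intrinsic matrix K (Expression (6)) with zero skew; the principal
    point (u0, v0) is approximated by the zoom center."""
    return np.array([[fx, 0.0, u0],
                     [0.0, fy, v0],
                     [0.0, 0.0, 1.0]])
```

In practice, one would fit both models to the calibration data and keep whichever gives the lower error on held-out zoom values, as done in Section 6.3.1.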

4.2. Radial Distortion Fitting Method

The camera’s radial distortion is caused by the manufacturing process used to produce the camera lens, which reflects the influence of the camera lens on imaging. The tangential distortion of the camera is caused by an installation error in the lens assembly. In order to simplify the model, this study only considers the variations in the first radial distortion parameter caused by zooming. Inspired by Ref. [10], we directly establish the function between the zoom value and the first parameter, k, of the radial distortion; the function can be modeled with an n-order polynomial:
k(z) = c_n z^n + c_{n-1} z^{n-1} + ... + c_1 z + c_0    (7)
where c_n is a coefficient of the polynomial. An n-order polynomial (Expression (7)) has n + 1 unknown coefficients; therefore, if we take m sets (m ≥ n + 1) of calibration results, our model fit yields the simultaneous expressions:

k_1 = c_n z_1^n + c_{n-1} z_1^{n-1} + ... + c_1 z_1 + c_0
k_2 = c_n z_2^n + c_{n-1} z_2^{n-1} + ... + c_1 z_2 + c_0
...
k_{n+1} = c_n z_{n+1}^n + c_{n-1} z_{n+1}^{n-1} + ... + c_1 z_{n+1} + c_0
...
k_m = c_n z_m^n + c_{n-1} z_m^{n-1} + ... + c_1 z_m + c_0    (8)
Expression set (8) can be written as a matrix (Expression (9)):
[ z_1^n  z_1^{n-1}  ...  z_1  1 ]              [ c_n     ]              [ k_1 ]
[ z_2^n  z_2^{n-1}  ...  z_2  1 ]              [ c_{n-1} ]              [ k_2 ]
[  ...                      ... ]          ·   [  ...    ]          =   [ ... ]
[ z_m^n  z_m^{n-1}  ...  z_m  1 ]_{m×(n+1)}    [ c_0     ]_{(n+1)×1}    [ k_m ]_{m×1}    (9)
Expression (9) can also be written as follows:
Z C = K    (10)
The polynomial coefficient matrix obtained by the least square method is:
C = (Z^T Z)^{-1} Z^T K    (11)
Based on the results, we can obtain the specific function of Expression (7). Due to the different fitting effects of the polynomial with different orders, we need to select a polynomial with a better fit as the distortion parameter estimation expression in the process of application, which can be seen in Section 6.3.2.
After the specific Expressions (3)–(6) in Section 4.1 and Expression (7) in Section 4.2 have been obtained, they are all functional relations with respect to z; to estimate the intrinsic parameters, we merely substitute the feedback zoom value into these functions.
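The least-squares fit of Expressions (9)–(11) can be sketched as follows (NumPy; the sample values are ours, not calibration data from the paper):

```python
import numpy as np

def fit_distortion_poly(z_samples, k_samples, order):
    """Fit k(z) as an n-order polynomial (Expression (7)) by solving the
    normal equations C = (Z^T Z)^{-1} Z^T K of Expression (11), where Z is
    the m x (n+1) matrix of Expression (9)."""
    z = np.asarray(z_samples, dtype=float)
    Z = np.vander(z, order + 1)              # columns z^n, z^(n-1), ..., 1
    K = np.asarray(k_samples, dtype=float)
    return np.linalg.inv(Z.T @ Z) @ Z.T @ K  # coefficients c_n ... c_0

def eval_poly(C, z):
    """Evaluate Expression (7) with fitted coefficients."""
    return np.polyval(C, z)
```

With exact polynomial samples, the fit recovers the generating coefficients; with real calibration data, it returns the least-squares polynomial. (`np.linalg.lstsq` is numerically preferable to forming the normal equations explicitly, but the explicit form mirrors Expression (11).)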

5. The Calibration of Extrinsic Parameters

The extrinsic parameters between the left and right cameras are composed of the rotation matrix and the translation vector. Here, we discuss these two parameters separately.

5.1. Calibration of the Rotation Matrix

Considering the general model we built, we need to calibrate the actual direction vectors of the camera’s horizontal and pitch rotation axes and then calculate the rotation matrix between the left and right camera coordinates.

5.1.1. Rotation Vector Solution

The rotation vector is consistent with the direction vector of the rotation axis. Based on the homography between the camera's optical center coordinate system and the calibration plate's corner coordinate system, we can calibrate the posture relationship between the camera and the calibration plate to calculate the rotation vectors. Taking horizontal rotation as an example, as shown in Figure 5, in the minimal zoom state with the optical center at position O_1, we can use Zhang's method to find the rotation matrix, R_1, and the translation vector, T_1, between the camera coordinate system, O_1-xyz, and the planar pattern corner coordinate system, B-xyz. Then, keeping the position of the checkerboard planar pattern fixed, we rotate the camera α degrees counterclockwise around the horizontal rotation axis, v, to position O_2. The rotation matrix and translation vector in this position are R_2 and T_2, respectively.
We can calculate the rotation matrix of the optical center transformation from O_1 to O_2:

R_v^α = R_2 R_1^{-1}    (12)
According to Rodrigues’ formula, the rotation matrix can be converted into a rotation vector and a rotation angle:
sin(α) v^∧ = (R_v^α - (R_v^α)^T) / 2    (13)

In Expression (13), v^∧ is the antisymmetric matrix of the rotation vector, v. In experiments, we can take the average of multiple measurements to reduce the resulting error.
Similarly, we can determine the pitch rotation axis ( h ) shown in Figure 5b.

5.1.2. Rotation Matrix Solution

Based on the rotation vector calculated in Section 5.1.1, we can determine the rotation matrix of the camera when it rotates α degrees around the horizontal rotation vector, v :
R_v^α = cos α · I + (1 - cos α) v v^T + sin α · v^∧    (14)

where I is the third-order identity matrix. Therefore, we can determine the horizontal and pitch rotation matrices (R_v^{αl} and R_h^{βl}) of the left camera and the horizontal and pitch rotation matrices (R_v^{αr} and R_h^{βr}) of the right camera.
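Expressions (13) and (14) are inverses of each other, which gives a convenient self-check. A minimal NumPy sketch (the function names are ours):

```python
import numpy as np

def skew(v):
    """Antisymmetric matrix v^ with skew(v) @ x == np.cross(v, x)."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def rodrigues(v, alpha):
    """Rotation matrix for angle alpha about unit axis v (Expression (14))."""
    v = np.asarray(v, dtype=float)
    return (np.cos(alpha) * np.eye(3)
            + (1.0 - np.cos(alpha)) * np.outer(v, v)
            + np.sin(alpha) * skew(v))

def recover_axis(R, alpha):
    """Recover the rotation axis from Expression (13):
    sin(alpha) v^ = (R - R^T) / 2, valid for sin(alpha) != 0."""
    A = (R - R.T) / 2.0
    return np.array([A[2, 1], A[0, 2], A[1, 0]]) / np.sin(alpha)
```

Building a rotation about a known axis and recovering that axis round-trips exactly, which is how the averaged axis estimate of Section 5.1.1 can be sanity-checked.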

5.2. Translation Vector Solution

In our model, the translation vector is generated by both rotation and zoom; therefore, we take both variations into consideration.

5.2.1. Rotation Radius Vector Solution

As shown in Figure 6, for a camera whose optical center does not coincide with the rotation axis, we need to introduce the concept of a rotation radius vector. For instance, taking the horizontal rotation and referring to the rotation plane as γ, which is perpendicular to the rotation axis v, the optical center rotates around a circle of radius |m| on plane γ. The intersection of γ and the axis v is the point c; the vector from the circle center, c, to the optical center, expressed in the camera's initial coordinate system, O_1-xyz, is the radius vector, m.
Using early calibration, we can obtain T_1, the translation vector between the camera's coordinate system at O_1 and the corner coordinate system; the translation vector at O_2 is T_2, as explained in Section 5.1.1. It is worth noting that the translation vector is expressed in the initial camera coordinate system, O_1-xyz. According to the relationship between T_1 and T_2, the translation vector of the optical center moving from O_1 to O_2 is:
T_v^α = T_1 - R_v^α T_2    (15)
When the rotation radius vector m rotates α degrees around the axis, v, the translation vector can also be obtained as:

T_v^α = R_v^α m - m    (16)
The calculation results of (15) and (16) must be consistent with each other:

T_1 - R_v^α T_2 = R_v^α m - m    (17)
Therefore, we can obtain the rotation radius vector, m, of the camera's horizontal rotation:

m_v = (R_v^α - I)^{-1} (T_1 - R_v^α T_2)    (18)
where I is the third-order unit matrix.
Using the method above, we can obtain the horizontal and pitch rotation radius vectors, m_l^v, m_l^h, m_r^v, and m_r^h, of the left and right cameras, respectively.
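A numerical subtlety of Expression (18): R_v^α - I is always singular along the rotation axis, so a sketch can use the pseudo-inverse instead, whose minimum-norm solution lies in the rotation plane and therefore equals the radius vector m (which is perpendicular to the axis by construction). The geometry below is fabricated for the round-trip check:

```python
import numpy as np

def radius_vector(R_va, T1, T2):
    """Rotation radius vector m from Expression (18),
    m = (R_v^a - I)^+ (T1 - R_v^a T2), using the pseudo-inverse because
    (R_v^a - I) is singular in the direction of the rotation axis."""
    return np.linalg.pinv(R_va - np.eye(3)) @ (T1 - R_va @ T2)

# Fabricated geometry: rotation about z, radius vector in the x-y plane.
a = 0.4
R = np.array([[np.cos(a), -np.sin(a), 0.0],
              [np.sin(a), np.cos(a), 0.0],
              [0.0, 0.0, 1.0]])
m_true = np.array([0.12, -0.05, 0.0])    # perpendicular to the z axis
T2 = np.array([0.3, 0.7, 2.0])
T1 = (R @ m_true - m_true) + R @ T2      # enforce Expression (17)
assert np.allclose(radius_vector(R, T1, T2), m_true)
```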

5.2.2. Rotation Translation Vector Solution

Again taking the horizontal rotation of the left camera as an example, for the variations generated by rotation, the translation vector of the camera's optical center relative to the coordinate system of the initial position can be expressed as:

T_v^{αl} = R_v^{αl} m_l^v - m_l^v    (19)
Using the same method, the translation vectors generated by the horizontal and pitch rotations of the left and right cameras are T_v^{αl}, T_h^{βl}, T_v^{αr}, and T_h^{βr}.

5.2.3. Zoom–Translation Vector Solution

Because the optical center moves back and forth along the optical axis in the process of zooming, the optical center changes position relative to the original position during zooming. However, the actual displacement of the camera's optical center is only a few millimeters, which can be ignored when high calibration accuracy is not required. A diagram of the change in the optical center caused by zooming is shown in Figure 6.
The zoom–translation vector, T_z, is consistent with the change in focal length. The maximum translation vector of the optical center, generated when the camera zooms from the minimum to the maximum, is T_z_max = (0, 0, d_max - d_min)^T.
Therefore, the zoom–translation vector as a function of focal length can be expressed as:
T_z = (f - f_min) / (f_max - f_min) · T_z_max    (20)

where f_max and f_min are the limits of the camera's focal length range in pixels, f is the current focal length estimated in Section 4.1, and d_max and d_min are the maximum and minimum values of the actual focal length in mm, which are inherent parameters of the PTZ camera.
According to the rotation formula, when the PTZ camera rotates α degrees horizontally and β degrees in pitch and the zoom value is z, the zoom–translation vector is:
T_rz = R_v^α R_h^β T_z    (21)
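Expressions (20) and (21) amount to a linear interpolation along the optical axis followed by a rotation into the current pose. A short sketch (the 4.8–96 mm range is the DS-2DB3220 lens range quoted in Section 6; the pixel focal lengths are placeholders):

```python
import numpy as np

def zoom_translation(f, f_min, f_max, d_min_mm, d_max_mm):
    """Zoom-translation vector T_z (Expression (20)): the optical center is
    assumed to slide along the optical axis linearly in the focal length."""
    Tz_max = np.array([0.0, 0.0, d_max_mm - d_min_mm])
    return (f - f_min) / (f_max - f_min) * Tz_max

def rotated_zoom_translation(R_va, R_hb, T_z):
    """Zoom-translation vector in the rotated pose (Expression (21))."""
    return R_va @ R_hb @ T_z
```

At f = f_min the vector vanishes, and at f = f_max it reaches (0, 0, d_max - d_min), matching the endpoints of Expression (20).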

5.3. Solution of Extrinsic Parameters

According to Expressions (19) and (21), for PTZ cameras conforming to Figure 1b, when the horizontal rotation of the left and right cameras is α degrees, the pitch rotation is β degrees, and the zoom value is z, the rotation and translation matrices of each camera coordinate system relative to its original position are:

R_rot^l = R_v^{αl} R_h^{βl},    T_rotz^l = R_v^{αl} T_h^{βl} + T_v^{αl} + R_v^{αl} R_h^{βl} T_z
R_rot^r = R_v^{αr} R_h^{βr},    T_rotz^r = R_v^{αr} T_h^{βr} + T_v^{αr} + R_v^{αr} R_h^{βr} T_z    (22)
For PTZ cameras conforming to Figure 1c, the rotation and translation matrices are:
R_rot^l = R_h^{βl} R_v^{αl},    T_rotz^l = R_h^{βl} T_v^{αl} + T_h^{βl} + R_h^{βl} R_v^{αl} T_z
R_rot^r = R_h^{βr} R_v^{αr},    T_rotz^r = R_h^{βr} T_v^{αr} + T_h^{βr} + R_h^{βr} R_v^{αr} T_z    (23)
According to the definition of extrinsic parameters (Expression (2)) and the analysis in Section 3.2, the extrinsic parameters of dual-PTZ cameras are:
R^{lr} = (R_rot^r)^{-1} R_0^{lr} R_rot^l,    T^{lr} = (R_rot^r)^{-1} (R_0^{lr} T_rotz^l + T_0^{lr} - T_rotz^r)    (24)
where R 0 l r and T 0 l r are the rotation matrix and the translation vector in the initial position, respectively, which can be easily calibrated with Zhang’s method.
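The composition can be sketched end-to-end. The helpers below implement Expressions (22) and (24) under the convention that (R_rot, T_rotz) map rotated-frame coordinates back to the initial camera frame; the function names and test poses are ours:

```python
import numpy as np

def camera_motion(R_va, R_hb, T_va, T_hb, T_z):
    """Per-camera rotation/translation relative to the initial pose for the
    mode of Figure 1b (Expression (22))."""
    R_rot = R_va @ R_hb
    T_rotz = R_va @ T_hb + T_va + R_va @ R_hb @ T_z
    return R_rot, T_rotz

def dual_extrinsics(R0_lr, T0_lr, R_rot_l, T_rotz_l, R_rot_r, T_rotz_r):
    """Extrinsic parameters between the rotated cameras (Expression (24)).

    Convention assumed here: p_initial = R_rot @ p_rotated + T_rotz for
    each camera, so that p_r' = R_lr @ p_l' + T_lr follows from
    Expression (2)."""
    R_r_inv = np.linalg.inv(R_rot_r)
    R_lr = R_r_inv @ R0_lr @ R_rot_l
    T_lr = R_r_inv @ (R0_lr @ T_rotz_l + T0_lr - T_rotz_r)
    return R_lr, T_lr

def rot_z(a):
    """Rotation about z, used only to fabricate test poses."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
```

Tracking a single point through both cameras' motions and checking it against the composed pose verifies that Expression (24) is consistent with Expression (2).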

6. Results and Discussion

To verify the calibration method proposed in this paper, we used two different camera types in our experiments.
For intrinsic parameter calibration, we used a camera with a large zoom range: a Hikvision DS-2DB3220 (Hikvision, Hangzhou, China) with a focal length range of 4.8–96 mm, which achieves a 20× optical zoom. For extrinsic parameter calibration, we chose cameras with a large rotation radius vector to enhance the experimental effects: two fixed-position Hikvision iDS-2PT7T40BX-D4 (Hikvision, Hangzhou, China) cameras with a focal length range of 11–55 mm, which achieve a 5× optical zoom, 0°–360° horizontal rotation, and −40°–30° pitch rotation. All the cameras above conform to the model in Figure 1b.

6.1. The Extrinsic Parameters in the Initial Position

We set the initial position of the camera as pan = 0°, tilt = 0°, and zoom = 1 and calibrated the extrinsic parameters with Zhang’s method. The results are shown in Table 1.

6.2. Calibration Results of the Rotation Vector and Rotation Radius Vector

Using the methods in Section 5.1.1 and Section 5.2.1, we rotated the left and right cameras horizontally by −20°, −16°, −8°, 8°, 16°, and 20° and in pitch by −10°, −8°, −6°, 6°, 8°, and 10°, respectively. The horizontal and pitch rotation vectors we obtained, in their respective camera coordinate systems, are shown in Table 2, and the rotation radius vectors are shown in Table 3.

6.3. Intrinsic Parameter Estimation Results

6.3.1. Estimated Results of Focal Length

We took the camera focal lengths at zoom values 1, 2, …, 19, 20 as the previous calibration results; the variation trend is shown in Figure 7.
It can be seen that the focal length of this camera is approximately linear with respect to the zoom value; therefore, we can use the linear interpolation method (4) and the quadratic function (5) to establish models and determine which fits better.
The quadratic function is:
f(z) = −9.2989 z^2 + 1614.6 z − 160.8423
Here, we do not write out the piecewise function obtained by linear interpolation. Instead, we took eight groups of unfitted zoom values and compared the calculated results with the ground truth. The results and errors estimated with the linear interpolation method and the quadratic function model are shown in Table 4. Zhang's method has an error range of less than 0.6%; thus, we can use its results as the ground truth.
It can be seen from Table 4 that the focal length estimated using the linear interpolation method has an error of less than 4%, which can basically meet the needs of practical applications. Meanwhile, the error of the quadratic function estimate is slightly higher than that of the linear interpolation method when the zoom value is small and lower when the zoom value is large.

6.3.2. Distortion Parameter Estimation Results

According to the previous calibration results, the overall variation trend of the camera's radial distortion parameter, k, increases with increasing focal length, but it is not strictly monotonic. We used second-order and third-order polynomials to establish the relationship between the distortion parameter, k, and the zoom value using the method in Section 4.2, and we compared their fits with the ground truth calibrated by Zhang's method.
The functional relationship of the quadratic polynomial is:
k(z) = 0.003 z^2 + 0.2016 z + 0.4819
The functional relationship of the cubic polynomial is:
k(z) = 0.0013 z^3 − 0.0433 z^2 + 0.6151 z − 0.4748
The fitting of the two functions and the ground truth is shown in Figure 8.
Obviously, the quadratic polynomial fits the ground truth better. In general, a well-fitted function of the radial distortion parameter with respect to the zoom value can be obtained by comparing polynomials of different orders against the ground truth.

6.4. Extrinsic Parameter Calibration Results of the Dual-PTZ Camera

To verify the extrinsic parameter calibration results, we set the left and right cameras to different angles and zooms and compared the calculated results from (24) with Zhang’s method. The results are shown in Table 5.
Taking the extrinsic parameters obtained with Zhang's method as the ground truth and comparing them with the extrinsic parameters calculated using our method, it can be seen from Table 5 that the cosine similarity of the rotation matrix reached more than 99.8%, and the translation vector error was less than 1%. Because the PTZ camera remains fixed in its roll direction, we compared only the recalculated Euler angles in the horizontal and pitch directions; their errors are less than 1 degree, which can basically meet the calibration requirements for the extrinsic parameters of dual-PTZ cameras.

7. Conclusions

In this paper, we proposed a general calibration method for dual-PTZ cameras under variable focal lengths and postures. For intrinsic parameter calibration, we used Zhang's method to calibrate several sets of intrinsic parameters under discrete zoom values and established different models to fit the variations in focal length and radial distortion parameters. Based on these models, we can estimate the intrinsic parameters from the zoom value. For the calibration of extrinsic parameters, we held the cameras at specific angles to obtain several rotation and translation matrices between the camera coordinate system and the checkerboard planar pattern coordinate system while the planar pattern remained in a fixed position. Based on these matrices, we calculated the actual rotation vectors and rotation radius vectors of the camera. We also took into consideration the zoom–translation vector generated by the motion of the optical center during zooming. Based on the parameters computed above, the extrinsic parameters of a dual-PTZ camera can be calculated at arbitrary posture and zoom values from its feedback parameters. Our experiments showed that the average focal length errors estimated by our model are less than 4%; the cosine similarities of the rotation matrix R between the left and right cameras are more than 99.8%; the translation vector errors are less than 1%; and the recalculated Euler angle errors are less than 1 degree in both the horizontal and pitch directions. These results indicate that our method meets the requirements of practical applications in intelligent unmanned systems.

Author Contributions

Conceptualization, K.M. and Y.X.; methodology, R.W.; software, S.P.; writing—original draft preparation, K.M.; validation, K.M., R.W. and S.P.; writing—review and editing, K.M. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Unlu, H.U.; Niehaus, P.S.; Chirita, D.; Evangeliou, N.; Tzes, A. Deep Learning-based Visual Tracking of UAVs using a PTZ Camera System. In Proceedings of the IECON 2019—45th Annual Conference of the IEEE Industrial Electronics Society, Lisbon, Portugal, 14–17 October 2019; pp. 638–644. [Google Scholar] [CrossRef]
  2. Yun, K.; Kim, H.; Bae, K.; Park, J. Unsupervised Moving Object Detection through Background Models for PTZ Camera. In Proceedings of the 2020 25th International Conference on Pattern Recognition (ICPR), Milan, Italy, 10–15 January 2021; pp. 3201–3208. [Google Scholar] [CrossRef]
  3. Kumar, S.; Micheloni, C.; Piciarelli, C. Stereo localization using dual PTZ cameras. In International Conference on Computer Analysis of Images and Patterns; Springer: Berlin, Germany, 2009; pp. 1061–1069. [Google Scholar] [CrossRef]
  4. Park, U.; Choi, H.-C.; Jain, A.K.; Lee, S.-W. Face Tracking and Recognition at a Distance: A Coaxial and Concentric PTZ Camera System. IEEE Trans. Inf. Forensics Secur. 2013, 8, 1665–1677. [Google Scholar] [CrossRef]
  5. Lisanti, G.; Masi, I.; Pernici, F. Continuous localization and mapping of a pan–tilt–zoom camera for wide area tracking. Mach. Vis. Appl. 2016, 27, 1071–1085. [Google Scholar] [CrossRef]
  6. Wan, D.; Zhou, J. Stereo vision using two PTZ cameras. Comput. Vis. Image Underst. 2008, 112, 184–194. [Google Scholar] [CrossRef]
  7. Zhang, Z. A flexible new technique for camera calibration. IEEE Trans. Pattern Anal. Mach. Intell. 2000, 22, 1330–1334. [Google Scholar] [CrossRef]
  8. Sinha, S.; Pollefeys, M. Towards calibrating a pan-tilt-zoom camera network. In Proceedings of the 5th Workshop Omnidirectional Vision Camera Networks Non-Classical Cameras, Prague, Czech Republic, 16 May 2004; pp. 42–54. [Google Scholar]
  9. Wu, Z.; Radke, R.J. Keeping a pan-tilt-zoom camera calibrated. IEEE Trans. Pattern Anal. Mach. Intell. 2012, 35, 1994–2007. [Google Scholar] [CrossRef] [PubMed]
  10. Alvarez, L.; Gómez, L.; Henríquez, P. Zoom Dependent Lens Distortion Mathematical Models. J. Math. Imaging Vis. 2012, 44, 480–490. [Google Scholar] [CrossRef]
  11. Chen, J.; Zhang, H.; Ge, B.; Zhao, C. A Method of Calibration and Measuring Focal Length for Pan-Tilt-Zoom Camera. In Proceedings of the 2018 Chinese Automation Congress (CAC), Xi’an, China, 30 November–2 December 2018; pp. 749–754. [Google Scholar]
  12. Wan, D.; Zhou, J. Self-calibration of spherical rectification for a PTZ-stereo system. Image Vis. Comput. 2010, 28, 367–375. [Google Scholar] [CrossRef]
  13. Davis, J.; Chen, X. Calibrating pan-tilt cameras in wide-area surveillance networks. In Proceedings of the Ninth IEEE International Conference on Computer Vision, Nice, France, 3 April 2003; p. 144. [Google Scholar] [CrossRef]
  14. Hayman, E.; Murray, D.W. The effects of translational misalignment when self-calibrating rotating and zooming cameras. IEEE Trans. Pattern Anal. Mach. Intell. 2003, 25, 1015–1020. [Google Scholar] [CrossRef]
  15. Zhao, X.; Wang, J.; Wang, C. Calibration of multiple degrees of freedom binocular stereo vision system based on axis parameters. Opt. Technol. 2018, 44, 140–146. [Google Scholar] [CrossRef]
  16. Shih, S.W.; Hung, Y.P.; Lin, W.S. Calibration of an active binocular head. IEEE Trans. Syst. Man Cybern. Part A Syst. Hum. 1998, 28, 426–442. [Google Scholar] [CrossRef]
  17. Chang, Y.; Wang, H.; Lee, S.; Wu, C.; Hsu, M. Calibration of a dual-PTZ-camera system for stereo vision based on parallel particle swarm optimization method. Proc. SPIE 2014, 9019, 87–95. [Google Scholar] [CrossRef]
  18. Agapito, L.; Hayman, E.; Reid, I. Self-calibration of rotating and zooming cameras. Int. J. Comput. Vis. 2001, 45, 107–127. [Google Scholar] [CrossRef]
Figure 1. (a) General model for dual-PTZ cameras. (b) The pitching rotation axis is the driven axis. (c) The horizontal rotation axis is the driven axis.
Figure 2. Schematic diagram of focal length variation by zoom. The zoom process of the camera is approximately equivalent to the process of the optical center moving back and forth on the optical axis.
Figure 3. Schematic diagram of extrinsic parameter variations caused by camera rotation and zoom. Through the rotation and zoom, the optical centers move from the initial position, o 1 , to position o 2 .
Figure 4. Variation of focal length with zoom of different cameras.
Figure 5. Schematic diagram of horizontal rotation and pitch rotation. The optical center rotates from O 1 to O 2 around v .
Figure 6. Schematic diagram of the change in the optical center caused by zooming.
Figure 7. Variation of focal length with zoom.
Figure 8. (a) Quadratic and (b) cubic polynomial fitting results.
Table 1. The extrinsic parameters at the initial position.
Rotation matrix: R_0^lr = [0.999 0.005 0.015; 0.004 0.998 0.048; 0.015 0.048 0.999]
Translation vector: T_0^lr = (1058.8, 5.439, 126.56)^T mm
Table 2. Rotation vectors of left and right cameras.
Left camera: v_l = (0.022, 0.993, 0.114)^T; h_l = (0.997, 0.053, 0.060)^T
Right camera: v_r = (0.009, 0.988, 0.265)^T; h_r = (0.998, 0.007, 0.022)^T
Table 3. Rotation radius vector of left and right cameras.
Left camera: m_l^v = (2.409, 3.105, 82.045)^T; m_l^h = (1.403, 42.201, 79.069)^T
Right camera: m_r^v = (2.523, 3.012, 81.127)^T; m_r^h = (1.323, 41.238, 80.529)^T
Table 4. Comparison between focal lengths estimated by different methods and the ground truth.
Zoom | Zhang’s Method | Zhang Method’s Own Error Range | Linear Interpolation | Average Error | Quadratic Function | Average Error
2.6  | 3946.161  | 0.082% | 4054.10   | 2.73%  | 4135.32   | 4.79%
4.2  | 6273.145  | 0.159% | 6454.95   | 2.89%  | 6617.45   | 5.49%
5.7  | 8634.695  | 0.194% | 8659.83   | 0.29%  | 8901.21   | 3.09%
8.6  | 13,026.34 | 0.167% | 13,025.48 | 0.00%  | 13,197.83 | 1.32%
11.3 | 17,067.33 | 0.340% | 16,818.90 | −1.46% | 17,057.53 | −0.06%
13.5 | 19,597.12 | 0.452% | 20,179.47 | 2.97%  | 20,102.23 | 2.58%
16.6 | 24,024.33 | 0.549% | 24,236.96 | 0.88%  | 24,239.70 | 0.90%
19.3 | 28,424.45 | 0.642% | 27,377.25 | −3.6%  | 27,697.69 | −2.56%
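The fitting comparison in Table 4 can be reproduced along the following lines. This is a hedged sketch: the zoom/focal-length pairs are the Zhang’s-method values as we read them from the table, NumPy’s polynomial fitting stands in for whichever solver was actually used, and the leave-one-out check is our own way of probing the linear interpolation between calibrated zoom values.

```python
import numpy as np

# Zoom values and Zhang's-method focal lengths as read from Table 4.
zoom = np.array([2.6, 4.2, 5.7, 8.6, 11.3, 13.5, 16.6, 19.3])
f_gt = np.array([3946.161, 6273.145, 8634.695, 13026.34,
                 17067.33, 19597.12, 24024.33, 28424.45])

# Quadratic least-squares fit of focal length against zoom value.
coeffs = np.polyfit(zoom, f_gt, deg=2)
f_fit = np.polyval(coeffs, zoom)
quad_rel_err = (f_fit - f_gt) / f_gt

# Leave-one-out piecewise-linear interpolation at the interior samples:
# predict each focal length from the remaining calibrated zoom values.
lin_rel_err = []
for i in range(1, len(zoom) - 1):
    mask = np.arange(len(zoom)) != i
    f_pred = np.interp(zoom[i], zoom[mask], f_gt[mask])
    lin_rel_err.append((f_pred - f_gt[i]) / f_gt[i])

print(np.abs(quad_rel_err).max(), np.abs(np.array(lin_rel_err)).max())
```

Both error measures stay in the low single-digit percent range, consistent with the order of magnitude reported in the table.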
Table 5. Comparison between extrinsic parameters of our data and the ground truth.
Case 1. PTZ parameters — Pan: 347.8° (left), 11° (right); Tilt: −0.5°, −1.7°; Zoom: 1.5, 1.4
Proposed method: R = [0.913 0.059 0.403; 0.051 0.998 0.031; 0.405 0.008 0.914], T = (989.104, 48.268, 317.349)^T mm
Zhang’s method: R = [0.915 0.013 0.403; 0.016 1.000 0.004; 0.403 0.010 0.915], T = (990.001, 42.792, 292.362)^T mm
Cosine similarity: 99.95%; translation vector error: 0.65%; horizontal Euler angle error: 0.126°; pitch Euler angle error: −0.401°

Case 2. PTZ parameters — Pan: 340.8° (left), 353° (right); Tilt: 2.8°, 0.7°; Zoom: 2.3, 2.6
Proposed method: R = [0.974 0.009 0.228; 0.008 1.000 0.008; 0.228 0.006 0.974], T = (1052.447, 12.194, 4.840)^T mm
Zhang’s method: R = [0.971 0.002 0.238; 0.002 1.000 0.001; 0.238 0.001 0.971], T = (−1060.201, 6.315, −1.610)^T mm
Cosine similarity: 99.99%; translation vector error: −0.73%; horizontal Euler angle error: 0.602°; pitch Euler angle error: 0.287°

Case 3. PTZ parameters — Pan: 2.4° (left), 24.2° (right); Tilt: −0.5°, −1.7°; Zoom: 1, 1
Proposed method: R = [0.923 0.066 0.380; 0.058 0.998 0.032; 0.381 0.008 0.924], T = (889.628, 90.875, 537.404)^T mm
Zhang’s method: R = [0.921 0.010 0.390; 0.015 1.000 0.008; 0.390 0.014 0.921], T = (893.667, 75.612, 520.393)^T mm
Cosine similarity: 99.89%; translation vector error: 0.62%; horizontal Euler angle error: −0.544°; pitch Euler angle error: 0.344°
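The Euler-angle errors in Table 5 presuppose a way to recover pan and tilt from a rotation matrix. Below is a minimal round-trip sketch under an assumed Y-up, Z-forward convention with negligible roll; the paper’s exact axis convention may differ, and the helper names are ours.

```python
import numpy as np

def rotation_from_pan_tilt(pan_deg, tilt_deg):
    """Build R = R_pan(Y) @ R_tilt(X); Y-up, Z-forward, roll assumed zero."""
    p, t = np.radians(pan_deg), np.radians(tilt_deg)
    Ry = np.array([[np.cos(p), 0.0, np.sin(p)],
                   [0.0, 1.0, 0.0],
                   [-np.sin(p), 0.0, np.cos(p)]])
    Rx = np.array([[1.0, 0.0, 0.0],
                   [0.0, np.cos(t), -np.sin(t)],
                   [0.0, np.sin(t), np.cos(t)]])
    return Ry @ Rx

def pan_tilt_from_rotation(R):
    """Invert the composition above to recover pan and tilt in degrees."""
    pan = np.degrees(np.arctan2(R[0, 2], R[2, 2]))
    tilt = np.degrees(np.arcsin(-R[1, 2]))
    return pan, tilt
```

Comparing the angles recovered from the proposed and ground-truth rotation matrices gives the per-axis Euler angle errors reported above.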
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Mao, K.; Xu, Y.; Wang, R.; Pan, S. A General Calibration Method for Dual-PTZ Cameras Based on Feedback Parameters. Appl. Sci. 2022, 12, 9148. https://doi.org/10.3390/app12189148
