Article

Angular Calibration of Visible and Infrared Binocular All-Sky-View Cameras Using Sun Positions

1 Anhui Institute of Optics and Fine Mechanics, Hefei Institutes of Physical Science, Chinese Academy of Sciences, Hefei 230031, China
2 University of Science and Technology of China, Hefei 230026, China
* Author to whom correspondence should be addressed.
Yiren Wang and Dong Liu are co-corresponding authors of this study.
Remote Sens. 2021, 13(13), 2455; https://doi.org/10.3390/rs13132455
Submission received: 17 May 2021 / Revised: 21 June 2021 / Accepted: 21 June 2021 / Published: 23 June 2021

Abstract

Visible and infrared binocular all-sky-view cameras can provide continuous and complementary ground-based cloud observations. Accurate angular calibration for every pixel is an essential premise for further cloud analysis and georeferencing. However, most current calibration methods rely on calibration plates, which remain difficult to apply to the simultaneous calibration of visible and infrared binocular cameras, especially when the two cameras have different imaging resolutions. Thus, in this study, we present a simple and convenient angular calibration method for wide field-of-view visible and infrared binocular cameras. Without any extra instruments, the proposed method only utilizes the relation between the angular information of direct sunlight and the projected sun pixel coordinates to compute the geometric imaging parameters of the two cameras. According to the obtained parameters, the pixel-view-angle for the visible and infrared all-sky images is efficiently computed via back projection. Meanwhile, the projected pixel coordinates for incident light at any angle can also be computed via reprojection. Experimental results show the effectiveness and accuracy of the proposed angular calibration through the error estimation of reprojection and back projection. As a novel application, we achieve visible and infrared binocular image registration at the pixel level after the angular calibration, which not only verifies the accuracy of the calibration results but also contributes to further cloud parameter analysis under these two different imaging features. To our knowledge, the registration results also provide a reference for visible and infrared binocular cloud image registration, for which little prior work exists.


1. Introduction

The complex interactions among clouds, aerosols, and radiation have significant impacts on climate change and the hydrological cycle [1,2]. Visible and infrared all-sky-view cameras are powerful ground-based cloud observation instruments [3,4,5], and they have been deployed in many meteorological observation networks dedicated to clouds–aerosols–radiation research, such as the Atmospheric Radiation Measurement (ARM) and Sky Radiometer Network (SKYNET) projects. Related studies have received sustained attention. The most common studies use all-sky-view cameras to detect clouds and derive the local cloud cover [6,7,8,9], as a comparison with and validation of human and satellite observations [10,11]. Sky spectral radiance and luminance distributions can also be derived [12,13] and applied to solar power forecasting [14,15] and cloud and aerosol optical property analysis [16,17]. Compared with either camera alone, the combination of visible and infrared all-sky-view cameras provides complementary and continuous whole-sky information and offers clear advantages for ground-based cloud observation through multi-channel imaging. However, knowledge of the viewing direction (zenith and azimuth) of every pixel in the captured visible and infrared all-sky images is essential to correctly analyze cloud parameters and georeference cloud locations using a binocular vision system. Thus, the angular calibration of visible and infrared binocular all-sky-view cameras must be performed first.
Angular calibration is the process of computing the geometric imaging parameters that represent the projection relation between the zenith and azimuth angles of incident radiation and its corresponding pixel position on the image. The calibration of visible cameras is quite mature and typically relies on calibration plates with special checkerboard or circle patterns [18,19,20]. The calibration plate is placed in the field-of-view (FOV) of the camera, and the angular calibration is derived in two steps. First, multiple images of the calibration plate are captured from different viewpoints. Then, precise feature points are extracted from the captured images to fit the imaging parameters of the defined projection model by using optimization algorithms. Crispel et al. [21] and Hensel et al. [22] used checkerboard plates to successfully calibrate visible all-sky-view cameras as a preprocessing step for cloud base height measurement, cloud movement estimation, and solar power prediction. However, such calibration plates are not suitable for the simultaneous calibration of a binocular imaging system involving both a visible and an infrared camera, because infrared cameras are sensitive to emitted infrared radiation rather than to the printed patterns. Special materials or light-heating methods have been developed to provide thermal feature points on calibration plates [23,24,25,26], but binocular camera calibration remains challenging due to the complexity of fabricating such plates and detecting distinct thermal feature points. In addition, the abovementioned calibration methods require the visible and infrared all-sky-view cameras to capture images of the calibration plates, which is inconvenient, especially when the cameras are not physically accessible. Thus, a more flexible angular calibration for visible and infrared all-sky-view cameras should be developed.
To solve these problems, known objects in the FOV of all-sky-view cameras, such as the sun and stars, can be considered as feature points for camera angular calibration, because the stellar positions in the sky and their corresponding projected pixel positions on an image can both be known. Some related calibration work has been performed for visible all-sky-view cameras. Klaus et al. [27] exploited the accurate angular positions of fixed stars in a single input image to calibrate a visible all-sky-view camera, which performs well in terms of accuracy and robustness. Urquhart et al. [28] utilized sun positions in the sky and on the image plane to optimize a generalized camera model and recover the extrinsic and intrinsic parameters of a visible all-sky-view camera. Schmidt et al. [15] performed an angular calibration by using a checkerboard plate to determine pixel zenith angles and by rotating the all-sky image to geographic north through visual comparison between the projected sun position and its appearance on the image. Richardson et al. [29] also used the sun's trajectory to determine the north direction after correcting the lens distortion by using a standard checkerboard plate. However, both of the latter works still rely on calibration plates, and their calibration performance has not been quantitatively tested. Although some success has been achieved in the angular calibration of visible all-sky-view cameras, to our knowledge, few studies to date have used stellar positions to simultaneously estimate the geometric imaging parameters of visible and infrared binocular all-sky-view cameras.
This study aims to estimate the geometric imaging parameters and achieve angular calibration for visible and infrared binocular all-sky-view cameras by using sun positions. The relationship between the direct sunlight directions and the projected sun pixel positions is regarded as the feature input to compute all parameters, such as the zenith point, focal length, and rotation angle (hereafter, with respect to the north). No extra calibration instruments are required. The only prior data we need are the precise location and projection model of the cameras; thus, the proposed method is quite flexible, and it addresses, to a certain extent, some difficulties in the angular calibration of visible and infrared binocular all-sky-view cameras. According to the obtained imaging parameters, the received visible and infrared radiation can be assigned a geographic direction through reprojection and back projection procedures (Section 3.2), which is difficult to achieve with a traditional camera calibration based on calibration plates. Furthermore, the estimated geometric imaging parameters can be applied to the pixel-by-pixel registration of visible and infrared cloud images, enabling further cloud information fusion.
The remainder of this paper is organized as follows. The visible and infrared binocular imaging system for ground-based clouds observation is presented and the data processing is also discussed in Section 2. The method for the geometric imaging parameter estimation and the procedure for reprojection and back projection are detailed in Section 3. The results of the geometric imaging parameter estimation and visible and infrared all-sky image registration are presented in Section 4. Lastly, the conclusion drawn from the study is provided in Section 5.

2. Description of Binocular Imaging Device and Data

2.1. Visible and Infrared Binocular All-Sky-View Cameras

The visible and infrared binocular all-sky-view cameras used in this study, namely the all-sky camera 200 (ASC-200, Figure 1a), were fabricated by the Anhui Institute of Optics and Fine Mechanics, Chinese Academy of Sciences [30]. ASC-200 is the second version of the ASC instrument [31,32], designed for 24 h high-quality cloud observation in the visible and infrared spectra and for the determination of cloud cover and cloud base height. ASC-200 mainly contains a thermal infrared all-sky-view camera, a visible all-sky-view camera, an environmental regulation unit, and a data analysis system. The thermal infrared all-sky-view camera consists of a 640 × 512 thermal microbolometer array sensitive to 8–14 μm radiation and a fisheye lens with an FOV of 160°. The visible camera comprises a 2000 × 1944 CMOS sensor sensitive to 450–650 nm radiation and a fisheye lens with an FOV of 180°. The data analysis system is in charge of the automatic acquisition, processing, and storage of visible and infrared all-sky images. The environmental regulation unit adjusts the inner temperature and humidity to keep the ASC-200 instrument stable during long-term fieldwork. The two cameras are triggered synchronously to capture all-sky images every 10 min. The infrared camera is operational at all times, and the visible camera works from sunrise to sunset. To capture more visible sky details, High Dynamic Range (HDR) technology [33], which fuses ten successive images taken with exposure times ranging from low to high, is applied to the visible all-sky images. The infrared images are all resized to a 540 × 512 resolution to remove the ineffective areas.
ASC-200 instruments have been deployed at many meteorological stations. The one used in this research is installed at the meteorological observation site of the Anhui Air Traffic Management Bureau, Civil Aviation Administration of China (31.98°N, 116.98°E, 62.95 m above sea level). Figure 1b shows that the captured visible and infrared all-sky images are unobstructed by surrounding features, such as buildings and trees.

2.2. Data Processing

In order to use the sun’s positions to achieve the angular calibration of the visible and infrared all-sky-view cameras of the ASC-200 instrument, the sun’s positions in the sky and the pixel coordinates on the images must correspond one-to-one. The related data processing can be divided into the following three steps.
First, a number of visible and infrared all-sky images in which the sun is fully exposed are collected. Whether the sun is exposed can be determined by our previous cloud detection algorithm, which classifies the sky, clouds, and the sun using an end-to-end convolutional neural network [31]. However, the sun area in the visible all-sky images is much larger than that in the infrared all-sky images due to the overexposure caused by particle scattering (Figure 1b), which increases the uncertainty in the subsequent detection of the center pixel of the sun area. Thus, the visible images captured with the lowest exposure time (Figure 2a) are collected instead of the final HDR-fused images to obtain a smaller solar area and more accurate sun positions. Other visible cameras can likewise obtain lowest-exposure sky images by decreasing the exposure time to the minimum.
Second, the sun's pixel coordinates on the collected visible and infrared all-sky images are computed. The sun's pixel coordinates are taken as the center pixel coordinates of the sun area. The Hough circle transformation, which is widely used to recognize circles, is applied to identify the sun's pixel coordinates, because the sun area often appears near-circular and symmetric in visible and infrared all-sky images. To ensure accurate recognition of the sun's center, the threshold range of the circle radius is adjusted iteratively in the Hough circle transformation until the best match is found. Prior to the Hough circle transformation, a morphological opening operation is applied to all images to remove noise and smooth edges. As shown in Figure 2, the center pixel of the sun area in the lowest-exposure visible images and the infrared sky images can be accurately detected.
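For illustration, the following is a minimal sketch of this step in Python with OpenCV, assuming an 8-bit single-channel input image; the function name `detect_sun_center` and the radius bounds are placeholders rather than the exact settings used for the ASC-200.

```python
import cv2
import numpy as np

def detect_sun_center(gray_img, r_min=5, r_max=60):
    """Locate the sun's center pixel via morphological opening + Hough circle transform.

    gray_img: 8-bit single-channel all-sky image (lowest-exposure visible or infrared).
    r_min, r_max: radius search bounds in pixels (placeholders; tuned per camera).
    Returns (x, y) of the best-matching circle center, or None if no circle is found.
    """
    # Opening removes small bright noise and smooths the sun's edge before detection.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    opened = cv2.morphologyEx(gray_img, cv2.MORPH_OPEN, kernel)

    circles = cv2.HoughCircles(
        opened, cv2.HOUGH_GRADIENT, dp=1, minDist=opened.shape[0],  # expect a single sun
        param1=100, param2=20, minRadius=r_min, maxRadius=r_max)
    if circles is None:
        return None
    x, y, _r = circles[0][0]  # strongest circle: (center x, center y, radius)
    return float(x), float(y)
```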
Third, once the sun's pixel positions are known, the zenith and azimuth angles corresponding to the sun's sky positions are computed. In this study, the zenith and azimuth angles are computed using the Reda and Andreas algorithm [34], which takes the imaging time and the latitude and longitude of the all-sky-view cameras as inputs. For the all-sky images (Figure 2) captured at local time (LT) 07:20 on 7 November 2020, the zenith and azimuth angles of the direct sunlight are 33.19° and 216.29°, respectively.
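As a hedged example of this step, the sketch below obtains the sun's zenith and azimuth angles with the pvlib library, whose default solar position routine implements the NREL SPA of Reda and Andreas [34]; the site coordinates are those quoted in Section 2.1, and the helper name `sun_angles` is illustrative.

```python
import pandas as pd
import pvlib

# Site of the ASC-200 used in this study (Section 2.1).
LAT, LON, ALT = 31.98, 116.98, 62.95

def sun_angles(timestamp_local, tz="Asia/Shanghai"):
    """Return the sun's (zenith, azimuth) in degrees for one local timestamp.

    pvlib's default solar position method ('nrel_numpy') implements the
    Reda & Andreas (NREL SPA) algorithm cited in the text [34].
    """
    times = pd.DatetimeIndex([pd.Timestamp(timestamp_local, tz=tz)])
    pos = pvlib.solarposition.get_solarposition(times, LAT, LON, altitude=ALT)
    return float(pos["zenith"].iloc[0]), float(pos["azimuth"].iloc[0])

# Example usage for the image pair shown in Figure 2:
# zen, azi = sun_angles("2020-11-07 07:20")
```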
According to the above procedure, three days of visible and infrared images are processed to form the visible and infrared datasets. The visible dataset contains data from 1 June (60 examples), 7 November (60 examples), and 30 September (47 examples) 2020, and the infrared dataset contains data from 1 June (66 examples), 7 November (60 examples), and 30 December (36 examples) 2020. In both datasets, the first two days of data are used as the training dataset, and the last day of data is used as the validation dataset.

3. Materials and Methods

Angular calibration is the process of computing geometric imaging parameters, whereby the pixels on the image and their incident lights can correspond one-to-one. In this section, the method for estimating the geometric imaging parameters, such as the zenith point, the focal length, and the rotation angle with respect to the north, is presented in Section 3.1. Then, the back projection process that determines the pixel-view-angle from any pixel positions and the reprojection process that determines the projected pixel coordinates from any incident lights are described in Section 3.2.

3.1. Geometric Imaging Parameter Estimation

3.1.1. Projection Models of All-Sky-View Cameras

The linear projection model is the basic projection type of pinhole cameras; however, their small FOV limits their application to all-sky observation. Accordingly, visible and infrared all-sky-view cameras are often equipped with fisheye lenses to guarantee the capture of complete hemispherical sky images. In order to project as large a sky scene as possible onto a limited image plane, a fisheye lens is usually designed to follow a certain projection model that maps the incident light from different zenith and azimuth angles onto the image plane. The common projection models of the fisheye lenses of all-sky-view cameras are the equidistant projection (1) and the equisolid angle projection (2) [35], as follows:
$$r(\theta) = f \cdot \theta, \tag{1}$$
$$r(\theta) = 2 f \cdot \sin(\theta / 2), \tag{2}$$
where $\theta$ denotes the zenith angle of the incident light, $r(\theta)$ denotes the imaging radius from the zenith point, and $f$ denotes the ratio of the imaging radius to the incident zenith angle, also called the focal length (in pixels per degree).
The fisheye lenses of the visible and infrared binocular all-sky-view cameras of the ASC-200 instrument are designed to follow the equidistant projection; thus, we take the equidistant projection as an example in this study. Nevertheless, the proposed method is also suitable for other projection models, such as the abovementioned equisolid angle projection. In Figure 3, we consider a whole-sky image $I(x, y)$, in which $(x_i, y_j)$ denotes the pixel coordinates of the pixel in the $i$-th row and $j$-th column, $(u, v)$ denotes the pixel coordinates of the zenith point $O$ on the image, and $\theta_{i,j}$ denotes the zenith angle of the incident light for pixel $(x_i, y_j)$. Thus, the projection function of the ASC-200 instrument can be rewritten from Equation (1) as follows:
$$\sqrt{(x_i - u)^2 + (y_j - v)^2} = f \cdot \theta_{i,j}, \tag{3}$$
The unknown geometric imaging parameters, namely $u$, $v$, and $f$, can be computed by solving Equation (3). Decentering errors and high-order radial distortion are not considered in this equation because they are small in most lenses [25].

3.1.2. Computing Zenith Point and Focal Length

To compute the zenith point $(u, v)$ and the focal length $f$, the sun's pixel position $(x_s, y_s)$ and the zenith angle $\theta_s$ of the corresponding direct sunlight are used as inputs to solve Equation (3). Although three groups of ideal input data $(x_s, y_s, \theta_s)$ are mathematically sufficient to solve for the unknowns $u$, $v$, and $f$, the sun's pixel position $(x_s, y_s)$ inevitably contains random errors introduced by the center detection of the Hough circle transformation. Accordingly, the Levenberg–Marquardt (LM) non-linear optimization algorithm is introduced; it takes a number of $(x_s, y_s, \theta_s)$ data as inputs and minimizes the error function in Equation (4), thereby reducing the effect of random errors and yielding proper $u$, $v$, and $f$ values, as follows:
$$F(u, v, f; x_s, y_s, \theta_s) = \sum_{i=1}^{n} \left| \sqrt{(x_s^i - u)^2 + (y_s^i - v)^2} - f \cdot \theta_s^i \right|^2, \tag{4}$$
where n is the number of input data.
The LM algorithm is a typical optimization algorithm that combines the gradient descent and Gauss–Newton methods and solves non-linear least-squares minimization problems with fast convergence. Reasonable parameter initialization is an important premise for the success of the LM algorithm. The zenith point $(u, v)$ can be initialized as the center of the all-sky image (i.e., half of the image width and height), because the zenith point is not far from the image center. The focal length can be initialized using Equation (5), which relies on the edge pixel points of the effective area of the all-sky images:
$$f = r_{max} / \theta_{max}, \tag{5}$$
This initialization is reasonable because these edge points lie at the maximum imaging radius $r_{max}$ from the center, and they are the projections of incident light with the maximum zenith angle $\theta_{max}$.
After initialization, the LM algorithm takes groups of data $(x_s, y_s, \theta_s)$ as inputs and iteratively optimizes the imaging parameters $u$, $v$, and $f$ by minimizing the errors. The proper imaging parameters are obtained once the error falls below a fixed threshold or the number of iterations reaches the maximum value.
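For illustration, a minimal sketch of this optimization is given below, assuming SciPy's Levenberg–Marquardt solver (`scipy.optimize.least_squares` with `method="lm"`) as the LM implementation; the function name and stopping settings are illustrative, not the authors' exact configuration.

```python
import numpy as np
from scipy.optimize import least_squares

def fit_zenith_and_focal(xs, ys, thetas, img_w, img_h, r_max, theta_max):
    """Estimate zenith point (u, v) and focal length f (pixels/degree) from
    sun pixel positions (xs, ys) and sun zenith angles thetas (degrees),
    by minimizing Equation (4) with the Levenberg-Marquardt algorithm.
    """
    xs, ys, thetas = map(np.asarray, (xs, ys, thetas))

    def residuals(p):
        u, v, f = p
        radii = np.sqrt((xs - u) ** 2 + (ys - v) ** 2)   # observed imaging radii
        return radii - f * thetas                        # equidistant model error

    # Initialization described in Section 3.1.2: image center and r_max / theta_max.
    p0 = np.array([img_w / 2.0, img_h / 2.0, r_max / theta_max])
    sol = least_squares(residuals, p0, method="lm", xtol=1e-10, max_nfev=2000)
    return sol.x  # (u, v, f)
```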

3.1.3. Computing Rotation Angle

In addition to the zenith point and focal length, the rotation angles of the visible and infrared all-sky-view cameras can also be derived using the known azimuth angles and pixel coordinates of the sun. This is one of the advantages of stellar-based calibration, because traditional calibration methods based on calibration plates cannot recover the geographic north direction. North on the image is defined by the vector from the zenith point toward the north direction. We take an infrared image as an example here. In Figure 4, we set a reference vector $R = (0, v) - (u, v)$ that is parallel to the x axis; the north direction deviates from the reference vector $R$ by a rotation angle denoted as $\delta$ [36]. Here, we aim to compute the rotation angle $\delta$ by using the sun's positions.
Suppose the sun's pixel point $p$ has coordinates $(p_x, p_y)$ on the image. The vector $P = (p_x, p_y) - (u, v)$ deviates from the reference vector $R$ by an angle denoted as $\beta$. The corresponding azimuth angle of the sun's pixel point is known and denoted as $\alpha$, which is the angle between vector $P$ and the north direction. Thus, the rotation angle $\delta$ can be computed as follows:
$$\delta = \frac{1}{n} \sum_{i=1}^{n} (\beta_i - \alpha_i), \tag{6}$$
where $n$ is the number of input data. The angle $\beta$ is computed using Equations (7) and (8), in which the zenith point $(u, v)$ is obtained with the method in Section 3.1.2, and the temporary variable $t$ is defined as follows:
$$t = \arctan\left(\frac{|p_y - v|}{|p_x - u|}\right) \cdot 180 / \pi, \tag{7}$$
$$\beta = \begin{cases} t, & p_x \le u \ \mathrm{and} \ p_y \le v \\ 360 - t, & p_x \le u \ \mathrm{and} \ p_y > v \\ 180 - t, & p_x > u \ \mathrm{and} \ p_y \le v \\ 180 + t, & p_x > u \ \mathrm{and} \ p_y > v \end{cases} \tag{8}$$
Thus, the visible and infrared binocular all-sky-view cameras can each obtain their rotation angle with respect to the north direction from the azimuth angles and pixel coordinates of the sun, from which the rotation relation between the two binocular cameras can also be derived.
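A minimal sketch of this rotation-angle estimation is given below, assuming NumPy; the helper names are illustrative. Note that the quadrant rules of Equations (7) and (8) are equivalent to a single `arctan2` expression, which the sketch uses for brevity.

```python
import numpy as np

def pixel_angle_beta(px, py, u, v):
    """Angle beta (degrees, 0-360) of the vector from the zenith point (u, v)
    to pixel (px, py), measured from the reference axis as in Equations (7)-(8).
    The four quadrant cases of Equation (8) collapse to this atan2 form."""
    return np.degrees(np.arctan2(v - py, u - px)) % 360.0

def rotation_angle(px_list, py_list, azimuths, u, v):
    """Rotation angle delta (Equation (6)): mean of (beta_i - alpha_i) over all
    sun observations, where alpha_i is the sun azimuth of observation i."""
    betas = np.array([pixel_angle_beta(px, py, u, v)
                      for px, py in zip(px_list, py_list)])
    return float(np.mean(betas - np.asarray(azimuths)))
```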

3.2. Back Projection and Reprojection Process

The zenith and azimuth angles of every pixel in the captured all-sky images can be computed through a back projection process by using the geometric imaging parameters, namely $u$, $v$, $f$, and $\delta$, of the visible and infrared all-sky-view cameras. According to Equation (3), the zenith angle of the received radiation at any pixel coordinate $(x_i, y_j)$ can be computed as follows:
$$\theta_{i,j} = \sqrt{(x_i - u)^2 + (y_j - v)^2} \, / \, f, \tag{9}$$
Meanwhile, the azimuth angle can be computed as $\beta - \delta$, where the rotation angle $\delta$ is known and the angle $\beta$ is obtained through Equations (7) and (8). Thus, the zenith and azimuth angles are easily obtained for every pixel.
The projected pixel coordinates of incident light at any known zenith and azimuth angles can also be obtained using the geometric imaging parameters $u$, $v$, $f$, and $\delta$ (i.e., the reprojection). Suppose the zenith and azimuth angles of the incident light to be projected are $\theta$ and $\alpha$, respectively; its projected pixel coordinates $(x, y)$ can then be computed using Algorithm 1, where $r$ is the distance between the projected pixel $(x, y)$ and the zenith point $(u, v)$, computed as the product of the known focal length $f$ and the zenith angle $\theta$. Meanwhile, $\beta$ is the angle deviated from the reference axis, computed as the sum of the known azimuth angle $\alpha$ and the rotation angle $\delta$.
Algorithm 1: Realization of the Reprojection Process
Input: zenith and azimuth angles θ, α
Output: pixel coordinates x, y
/* u, v, f, δ are known constants here */
1: Set r = f · θ
2: Set β = α + δ
3: if β < 0 then
4:     β ← β + 360
5: end if
6: if 0 < β ≤ 90 or β ≥ 360 then
7:     temp ← π · β / 180
8:     x ← u − r · cos(temp)
9:     y ← v − r · sin(temp)
10: else if 90 < β ≤ 180 then
11:     temp ← π · (β − 90) / 180
12:     x ← u + r · sin(temp)
13:     y ← v − r · cos(temp)
14: else if 180 < β ≤ 270 then
15:     temp ← π · (β − 180) / 180
16:     x ← u + r · cos(temp)
17:     y ← v + r · sin(temp)
18: else
19:     temp ← π · (β − 270) / 180
20:     x ← u − r · sin(temp)
21:     y ← v + r · cos(temp)
22: end if
23: return x, y
At this point, the zenith and azimuth angles of every pixel in the images and the projected pixel coordinates of any incident light can be easily calculated once the geometric imaging parameters are known, whether for the visible or the infrared all-sky-view camera.
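The sketch below is one possible Python transcription of the back projection (Equation (9) plus the azimuth relation) and the reprojection (Algorithm 1), assuming NumPy; angle-shift identities make the four quadrant branches of Algorithm 1 collapse to a single cosine/sine expression, which is used here. The function names are illustrative.

```python
import numpy as np

def back_project(x, y, u, v, f, delta):
    """Pixel (x, y) -> (zenith, azimuth) in degrees, per Equation (9) and
    azimuth = beta - delta."""
    zenith = np.hypot(x - u, y - v) / f
    beta = np.degrees(np.arctan2(v - y, u - x)) % 360.0   # Equations (7)-(8)
    azimuth = (beta - delta) % 360.0
    return zenith, azimuth

def reproject(zenith, azimuth, u, v, f, delta):
    """(zenith, azimuth) in degrees -> pixel (x, y), per Algorithm 1.
    The four quadrant branches of Algorithm 1 reduce to these two expressions."""
    r = f * zenith
    beta = np.radians(azimuth + delta)
    x = u - r * np.cos(beta)
    y = v - r * np.sin(beta)
    return x, y
```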

4. Experiment and Results

The angular calibration of visible and infrared binocular all-sky-view cameras using sun positions and their accuracy experiments are performed in this section. In Section 4.1, the geometric imaging parameters of these two cameras are estimated using the proposed method. Various evaluation experiments are conducted to verify the accuracy of the obtained imaging parameters. First, the reprojection errors are computed and the comparison between the synthetic sun trajectory and the actual observed sun positions is performed in Section 4.2. In addition, the zenith and azimuth angles back projected from the sun’s pixel positions are compared with the ground truth in Section 4.3. Finally, the visible and infrared all-sky images captured using the ASC-200 instrument achieve registration at the pixel level by applying the estimated geometric imaging parameters in Section 4.4.

4.1. The Geometric Imaging Parameter Estimation Results

To evaluate the effectiveness of the proposed method for simultaneously calibrating visible and infrared binocular all-sky-view cameras, the geometric imaging parameter estimation of the ASC-200 instrument is conducted. The zenith point ( u ,   v ) and focal length f for the infrared camera of ASC-200, as described in Section 3.1.2, are determined by optimizing the initial values of Equation (4) using the LM algorithm, which takes the sun’s zenith angles and the corresponding pixel coordinates from the infrared training dataset as inputs. The iterative optimization achieves fast convergence within seconds because Equation (4) does not include the complex high-order terms. Thereafter, the rotation angle δ is determined using the sun’s azimuth angles and the corresponding pixel coordinates from the infrared training dataset, as described in Section 3.1.3. The same operations are also performed on the visible all-sky-view camera by using the visible training dataset.
The parameter estimation results are listed in Table 1. The north direction is rotated ~25.45° anti-clockwise from the reference axis for the visible all-sky-view camera and ~27.29° for the infrared all-sky-view camera. The difference in rotation angle between the two binocular cameras is small because they are manually adjusted to remain nearly aligned in the laboratory before the ASC-200 instrument is installed. Moreover, the estimates of the other three imaging parameters are not far from their initial values. Thus, the obtained imaging parameters are in line with expectations.

4.2. Reprojection Analysis

4.2.1. Reprojection Errors

The reprojection errors of the visible and infrared all-sky-view cameras are computed to verify the accuracy of the estimated geometric imaging parameters. First, the reprojected pixel positions of the sun at the zenith and azimuth angles in the training dataset are calculated using the estimated imaging parameters (Section 3.2). Meanwhile, the corresponding pixel coordinates of the sun in the training dataset are treated as the ground truth. The reprojection errors are then the differences between the reprojected pixel positions and the ground truth (Figure 5).
The reprojection errors are mostly within ±3.5 pixels for the visible all-sky-view camera, and they are more concentrated, mostly within ±1.5 pixels, for the infrared all-sky-view camera. The main reason for this difference could be the uncertainty in detecting the sun's center pixel, which is lower for infrared images because of their smaller sun areas. Although mature calibration methods based on calibration plates can achieve higher accuracy for visible all-sky-view cameras, the proposed method can still be considered satisfactory for meteorological observation and is more flexible. The results prove the feasibility of the proposed method for simultaneously calibrating visible and infrared all-sky-view cameras.

4.2.2. Synthetic Sun Trajectory Calculation

The sun’s pixel coordinates on visible and infrared images can be computed at any time via the reprojection process by using the geometric imaging parameters of the visible and infrared all-sky-view cameras. The synthetic sun motion curve over a day is generated and visually compared with an actual observation to further verify the accuracy of the obtained geometric imaging parameters.
Instead of the training dataset, two days (2 and 13 August 2020) were selected to generate two synthetic sun trajectories. First, the sun's zenith and azimuth angles at one-second intervals from 06:00 to 18:00 on each day were computed using the Reda and Andreas algorithm [34]. Then, the sun's pixel coordinates corresponding to these zenith and azimuth angles were computed via reprojection using the obtained geometric imaging parameters. The reprojected sun pixel points over a day form a synthetic sun motion curve on the images because the sun moves continuously from east to west [25]. Thus, two synthetic sun trajectories were generated for 2 and 13 August 2020.
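As a usage example, the following sketch combines the solar-position and reprojection steps from the earlier sketches to generate such a trajectory; `pvlib` is assumed for the solar position, and the function name, default sampling interval, and the example parameter values (taken from Table 1 for the infrared camera) are illustrative.

```python
import numpy as np
import pandas as pd
import pvlib

def synthetic_sun_trajectory(date, u, v, f, delta,
                             lat=31.98, lon=116.98, alt=62.95,
                             tz="Asia/Shanghai", step="1min"):
    """Re-projected sun pixel track from 06:00 to 18:00 local time on one day,
    using the estimated imaging parameters (u, v, f, delta).
    The paper samples every second; a coarser step is used here by default."""
    times = pd.date_range(f"{date} 06:00", f"{date} 18:00", freq=step, tz=tz)
    pos = pvlib.solarposition.get_solarposition(times, lat, lon, altitude=alt)
    zen, azi = pos["zenith"].to_numpy(), pos["azimuth"].to_numpy()
    # Reprojection (compact form of Algorithm 1, see the earlier sketch).
    r = f * zen
    beta = np.radians(azi + delta)
    return u - r * np.cos(beta), v - r * np.sin(beta)

# Example with the optimized infrared parameters from Table 1:
# xs, ys = synthetic_sun_trajectory("2020-08-02", 243.86, 277.15, 3.06, 27.29)
```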
To clearly present the difference between the synthetic sun trajectories and the observed sun positions, we plotted the synthetic sun curve on every image captured on those two days; some results are presented in Figure 6. The observed sun positions lie essentially on the plotted curves, which proves the accuracy of the two synthetic sun trajectories. This also indicates that the obtained imaging parameters are accurate and that the proposed angular calibration method is effective for visible and infrared all-sky-view cameras.

4.3. Back Projection Errors

To further verify the accuracy of the proposed method, the back projection errors for visible and infrared all-sky-view cameras were computed using the estimated geometric imaging parameters. First, the zenith and azimuth angles back projected from the sun’s pixel coordinates in the validation dataset were calculated (Section 3.2). Then, the corresponding zenith and azimuth angles of the sun in the validation dataset were treated as ground truth. The back projection errors are the difference between the calculated back projection results and the ground truth. The results of the back projection errors using all data from the validation dataset are shown in Figure 7.
For the visible and infrared all-sky-view cameras, the back projection errors of the azimuth angles are mostly within ±0.25°, and those of the zenith angles are mostly within ±0.5°. The fluctuations of the zenith and azimuth errors of the infrared camera are smaller than those of the visible camera. This finding is consistent with the reprojection errors, where the infrared camera results appear more concentrated, because the smaller sun area in the infrared sky images decreases the uncertainty in determining the sun's pixel points. The results are further analyzed in terms of the azimuth and zenith angles as follows.
In terms of the azimuth angles, the data of the visible camera are mostly scattered randomly near the zero line but show a higher deviation at noon. The main reason for this behavior is that the sun approaches the center of the image at that time, where even a small pixel error may cause a large azimuth deviation. In contrast, the data of the infrared camera are stable and almost concentrated on the zero line.
With regard to the zenith angles, the data of the visible camera are also randomly scattered near the zero line but show a higher deviation after 16:00. This could be because the sun approaches the edge of the image, where the radial distortion is more obvious and may not strictly obey the designed equidistant projection model. The data of the infrared camera from 9:00 to 10:00 and from 13:00 to 14:00 are missing because the sun was obscured by clouds. The back projection results underestimate the zenith angles at the edge of the images but overestimate them at the center. This might indicate that the infrared camera has higher-order distortion, and the errors might be lessened by adding third-order terms to the projection model in the future.
To analyze the back projection errors statistically, the Root Mean Square Error (RMSE), Normalized Root Mean Square Error (NRMSE), Mean Absolute Error (MAE), Normalized Mean Absolute Error (NMAE), and Standard Deviation (SD) are used, as follows:
$$\mathrm{RMSE} = \sqrt{\frac{1}{n} \sum_{i=1}^{n} x_d^2}, \tag{10}$$
$$\mathrm{NRMSE} = \frac{\mathrm{RMSE}}{x_{max} - x_{min}} \times 100, \tag{11}$$
$$\mathrm{MAE} = \frac{1}{n} \sum_{i=1}^{n} |x_d|, \tag{12}$$
$$\mathrm{NMAE} = \frac{\mathrm{MAE}}{x_{max} - x_{min}} \times 100, \tag{13}$$
$$\mathrm{SD} = \sqrt{\frac{1}{n} \sum_{i=1}^{n} (x_d - \bar{x}_d)^2}, \tag{14}$$
where
$$x_d = x_c - x_t; \qquad \bar{x}_d = \frac{1}{n} \sum_{i=1}^{n} x_d;$$
$x_c$ denotes the estimated values; $x_t$ denotes the ground truth; $x_{max}$ and $x_{min}$ are set to 360 and 0, respectively, for the azimuth angles, and to 90 and 0, respectively, for the zenith angles.
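For reference, a small NumPy sketch of these statistics is given below; the function name and dictionary output are illustrative.

```python
import numpy as np

def error_stats(estimated, truth, value_range):
    """RMSE, NRMSE (%), MAE, NMAE (%), and SD of the differences x_d = x_c - x_t.
    value_range is x_max - x_min (360 for azimuth angles, 90 for zenith angles)."""
    xd = np.asarray(estimated) - np.asarray(truth)
    rmse = np.sqrt(np.mean(xd ** 2))
    mae = np.mean(np.abs(xd))
    sd = np.sqrt(np.mean((xd - xd.mean()) ** 2))
    return {"RMSE": rmse, "NRMSE": 100 * rmse / value_range,
            "MAE": mae, "NMAE": 100 * mae / value_range, "SD": sd}
```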
The comparison results are shown in Table 2. For the visible all-sky-view camera, the RMSE, MAE, and SD of the azimuth and zenith angles are approximately 0.2°. For the infrared all-sky-view camera, the RMSE, MAE, and SD of the azimuth angles are ~0.1°, and those of the zenith angles are ~0.25°. The NMAE and NRMSE are computed to compare the performance of the back projected zenith and azimuth angles, whose value ranges differ. The results in Table 2 show that the back projected zenith angles achieve an NRMSE of 0.297% and an NMAE of 0.239% for the visible all-sky-view camera, and an NRMSE of 0.315% and an NMAE of 0.284% for the infrared all-sky-view camera, whereas the back projected azimuth angles perform even better.

4.4. Visible and Infrared Binocular Image Registration

The geometric imaging parameters of the visible and infrared cameras of the ASC-200 instrument have been estimated and proven accurate through the abovementioned experiments. Such a combined visible and infrared binocular imaging system provides complementary sky conditions through two different wavelength bands and has potential for a wide variety of atmospheric studies. For instance, the system is helpful for ground-based cloud observation because the infrared all-sky-view camera is generally less affected by haze and aerosol but may miss thin clouds; in contrast, the visible all-sky-view camera is capable of detecting thin clouds but struggles under haze and aerosol conditions. Thus, the registration of captured visible and infrared all-sky images provides more reliable cloud information. However, effectively matching visible and infrared cloud images remains difficult because of discrepancies in imaging spectrum, FOV, and sensor resolution, which also impedes further data comparison and fusion. Hence, in this subsection, we attempt to fill this gap and register the visible and infrared all-sky images of the ASC-200 instrument at the pixel level by applying the estimated geometric imaging parameters.
Here, we warp the visible images to match the infrared images because the infrared images have the lower resolution. The matching process can be divided into two steps. First, we calculate the zenith and azimuth angles of every pixel of the infrared image by back projection using the obtained geometric imaging parameters of the infrared camera. Then, the pixel points of the visible image corresponding to the calculated zenith and azimuth angles are selected by reprojection using the imaging parameters of the visible all-sky-view camera. These selected pixel points form a new visible all-sky image (i.e., the warped visible image, whose resolution is the same as that of the infrared image).
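A possible sketch of this two-step warping is given below, assuming NumPy and OpenCV and reusing the compact back-projection/reprojection expressions from the earlier sketch; the function name, argument names, and the choice of bilinear sampling via `cv2.remap` are illustrative rather than the authors' exact implementation.

```python
import cv2
import numpy as np

def warp_visible_to_infrared(vis_img, ir_shape,
                             u_ir, v_ir, f_ir, delta_ir,
                             u_vis, v_vis, f_vis, delta_vis):
    """Warp a visible all-sky image onto the infrared image grid.

    Step 1: back project every infrared pixel to (zenith, azimuth).
    Step 2: reproject those angles into visible-image coordinates and sample.
    """
    h, w = ir_shape
    ys, xs = np.mgrid[0:h, 0:w].astype(np.float64)

    # Back projection with the infrared geometry (Equation (9) and beta - delta).
    zen = np.hypot(xs - u_ir, ys - v_ir) / f_ir
    beta_ir = np.degrees(np.arctan2(v_ir - ys, u_ir - xs)) % 360.0
    azi = (beta_ir - delta_ir) % 360.0

    # Reprojection into the visible image (Algorithm 1 in compact form).
    r_vis = f_vis * zen
    beta_vis = np.radians(azi + delta_vis)
    map_x = (u_vis - r_vis * np.cos(beta_vis)).astype(np.float32)
    map_y = (v_vis - r_vis * np.sin(beta_vis)).astype(np.float32)

    # Bilinear sampling; pixels falling outside the visible frame become black.
    return cv2.remap(vis_img, map_x, map_y, interpolation=cv2.INTER_LINEAR,
                     borderMode=cv2.BORDER_CONSTANT, borderValue=0)
```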
The warped visible images correspond to the infrared images pixel by pixel, and some registration results are presented in Figure 8. Compared with the original visible all-sky images (Figure 8a), the scene in the warped visible images is visually magnified because the FOV is narrowed during the registration. The horizontal (Figure 8b) and vertical (Figure 8c) comparisons indicate that the visible and infrared images match well, except at the edges of the images. The results prove not only the feasibility of the proposed registration procedure but also the accuracy of the proposed angular calibration using sun positions.
In addition, some interesting cases deserve discussion. In the second image of Figure 8a, the visible all-sky image captures long and thin contrail clouds, whereas the infrared image almost misses this information. The existence of the contrail clouds can be seen in the original visible image, and its warped counterpart provides accurate pixel positions of the thin contrail clouds for the infrared image. In the third image of Figure 8a, the visible image is slightly contaminated by aerosols and appears whiter; the clouds around the sun area in the visible image can nevertheless be clearly recognized and located through the corresponding infrared image. These results contribute to further ground-based cloud observation and analysis.
However, we also find that the warped visible images of low clouds, especially those appearing at the edges of the images (the fourth image of Figure 8a), show some mismatch with the infrared images. This result could be caused by the offset between the viewpoints of the visible and infrared all-sky-view cameras: although the two cameras are close to each other, the scene captured at the same zenith and azimuth angles still varies slightly. The difference for middle and high (altitude) clouds can be neglected; for low clouds, however, the lower the clouds are, the larger the imaging difference becomes. The specific quantitative effects will be analyzed mathematically in the future, and the mismatch could be addressed by introducing a non-rigid deformation.

5. Conclusions

Angular calibration is an important precondition for all-sky-view cameras used in ground-based cloud observation and analysis. In this study, we have presented a convenient angular calibration method that uses the sun's positions instead of traditional calibration plates to obtain the zenith point, the focal length, and the north direction, and that achieves simultaneous calibration of visible and infrared binocular all-sky-view cameras. The proposed method is flexible and simple because no extra calibration equipment is required; the only prior data needed are the camera location and the projection model. Accordingly, the zenith and azimuth angles of every pixel in the images, as well as the projected pixel points of incident radiation at any angle, can be recovered. The accuracy of the proposed method has been verified through the error estimation of reprojection and back projection. Moreover, the application of the method to registering visible and infrared all-sky images at the pixel level further verifies its performance.
This study is of great importance for further atmospheric research using visible and infrared all-sky-view cameras. After angular calibration, the captured cloud images can be mapped onto real geographical coordinates when the cloud heights are known, and the wind vector derived from the cloud motion in a sequence of images is provided with a direction. Moreover, the visible and infrared registration could promote a better understanding of the variation of and interaction among clouds, aerosols, and radiation, and contribute to related research on climate change through multi-channel imaging features. In our next work, the registration results for low clouds need to be further optimized, and the registered visible and infrared cloud images will be fused to improve ground-based cloud observation.

Author Contributions

Conceptualization, W.X. and Y.W.; formal analysis, W.X., Y.W., Y.X., Z.G., and D.L.; funding acquisition, Y.X., Z.G., and D.L.; methodology, W.X. and Y.W.; project administration, D.L.; software, W.X.; validation, W.X., Y.W., Y.X., Z.G., and D.L.; visualization, W.X.; writing—original draft, W.X.; writing—review and editing, Y.W., Y.X., Z.G., and D.L. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Key Collaborative Research Program of the Alliance of International Science Organizations, grant number ANSO-CR-KP-2020-09; the International Partnership Program of Chinese Academy of Sciences, grant number 116134KYSB20180114; the Key research and development program of Anhui province of China, grant number S202104b11020058; and the CASHIPS Director’s Fund, grant number YZJJ2020QN34.

Data Availability Statement

The data presented in this study are available on request from the two corresponding authors. The data are not publicly available due to privacy restrictions.

Acknowledgments

The authors thank the staff of Anhui Air Traffic Management Bureau, Civil Aviation Administration of China for the routine maintenance of the ASC-200 instrument.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Haywood, J.M.; Abel, S.J.; Barrett, P.A.; Bellouin, N.; Blyth, A.; Bower, K.N.; Brooks, M.; Carslaw, K.; Che, H.; Coe, H.; et al. The CLoud–Aerosol–Radiation Interaction and Forcing: Year 2017 (CLARIFY-2017) measurement campaign. Atmos. Chem. Phys. 2021, 21, 1049–1084.
2. Zhou, C.; Zelinka, M.D.; Klein, S.A. Impact of decadal cloud variations on the Earth's energy budget. Nat. Geosci. 2016, 9, 871–874.
3. Klebe, D.I.; Blatherwick, R.D.; Morris, V.R. Ground-based all-sky mid-infrared and visible imagery for purposes of characterizing cloud properties. Atmos. Meas. Tech. 2014, 7, 637–645.
4. Shields, J.E.; Karr, M.E.; Johnson, R.W.; Burden, A.R. Day/night whole sky imagers for 24-h cloud and sky assessment: History and overview. Appl. Opt. 2013, 52, 1605–1616.
5. Long, C.; Slater, D.; Tooman, T. Total Sky Imager Model 880 Status and Testing Results; DOE/SC-ARM/TR-006; PNNL, DOE Office of Science Atmospheric Radiation Measurement (ARM) Program: Richland, WA, USA, 2001.
6. Souza-Echer, M.P.; Pereira, E.B.; Bins, L.S.; Andrade, M.A.R. A Simple Method for the Assessment of the Cloud Cover State in High-Latitude Regions by a Ground-Based Digital Camera. J. Atmos. Ocean. Technol. 2006, 23, 437–447.
7. Long, C.N.; Sabburg, J.M.; Calbó, J.; Pagès, D. Retrieving Cloud Characteristics from Ground-Based Daytime Color All-Sky Images. J. Atmos. Ocean. Technol. 2006, 23, 633–652.
8. Aebi, C.; Gröbner, J.; Kämpfer, N. Cloud fraction determined by thermal infrared and visible all-sky cameras. Atmos. Meas. Tech. 2018, 11, 5549–5563.
9. Alonso-Montesinos, J. Real-Time Automatic Cloud Detection Using a Low-Cost Sky Camera. Remote Sens. 2020, 12, 1382.
10. Werkmeister, A.; Lockhoff, M.; Schrempf, M.; Tohsing, K.; Liley, B.; Seckmeyer, G. Comparing satellite- to ground-based automated and manual cloud coverage observations—A case study. Atmos. Meas. Tech. 2015, 8, 2001–2015.
11. Yamashita, M.; Yoshimura, M. Ground-Based Cloud Observation for Satellite-Based Cloud Discrimination and its Validation. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2012, 39B8, 137.
12. Tohsing, K.; Schrempf, M.; Riechelmann, S.; Seckmeyer, G. Validation of spectral sky radiance derived from all-sky camera images—A case study. Atmos. Meas. Tech. 2014, 7, 2137–2146.
13. Román, R.; Antón, M.; Cazorla, A.; de Miguel, A.; Olmo, F.J.; Bilbao, J.; Alados-Arboledas, L. Calibration of an all-sky camera for obtaining sky radiance at three wavelengths. Atmos. Meas. Tech. 2012, 5, 2013–2024.
14. Urquhart, B.; Kurtz, B.; Dahlin, E.; Ghonima, M.; Shields, J.E.; Kleissl, J. Development of a sky imaging system for short-term solar power forecasting. Atmos. Meas. Tech. 2015, 8, 875–890.
15. Schmidt, T.; Kalisch, J.; Lorenz, E.; Heinemann, D. Evaluating the spatio-temporal performance of sky-imager-based solar irradiance analysis and forecasts. Atmos. Chem. Phys. 2016, 16, 3399–3412.
16. Olmo, F.J.; Cazorla, A.; Alados-Arboledas, L.; López-Álvarez, M.A.; Hernández-Andrés, J.; Romero, J. Retrieval of the optical depth using an all-sky CCD camera. Appl. Opt. 2008, 47, H182–H189.
17. Cazorla, A.; Shields, J.E.; Karr, M.E.; Olmo, F.J.; Burden, A.; Alados-Arboledas, L. Technical Note: Determination of aerosol optical properties by a calibrated sky imager. Atmos. Chem. Phys. 2009, 9, 6417–6427.
18. Tsai, R. A versatile camera calibration technique for high-accuracy 3D machine vision metrology using off-the-shelf TV cameras and lenses. IEEE J. Robot. Autom. 1987, 3, 323–344.
19. Zhang, Z. A Flexible New Technique for Camera Calibration. IEEE Trans. Pattern Anal. Mach. Intell. 2000, 22, 1330–1334.
20. Scaramuzza, D.; Martinelli, A.; Siegwart, R. A Toolbox for Easily Calibrating Omnidirectional Cameras. In Proceedings of the 2006 IEEE/RSJ International Conference on Intelligent Robots and Systems, Beijing, China, 9–15 October 2006; pp. 5695–5701.
21. Crispel, P.; Roberts, G. All-sky photogrammetry techniques to georeference a cloud field. Atmos. Meas. Tech. 2018, 11, 593–609.
22. Hensel, S.; Marinov, M.B.; Schwarz, R. Fisheye Camera Calibration and Distortion Correction for Ground Based Sky Imagery. In Proceedings of the 2018 IEEE XXVII International Scientific Conference Electronics—ET, Sozopol, Bulgaria, 13–15 September 2018; pp. 1–4.
23. Yiu, M.; Harry, N.; Du, R. Acquisition of 3D surface temperature distribution of a car body. In Proceedings of the 2005 IEEE International Conference on Information Acquisition, Hong Kong, China, 27 June–3 July 2005; p. 5.
24. Shinko, Y.C.; Sangho, P.; Trivedi, M.M. Multiperspective Thermal IR and Video Arrays for 3D Body Tracking and Driver Activity Analysis. In Proceedings of the 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR'05)—Workshops, San Diego, CA, USA, 21–23 September 2005; p. 3.
25. Yahyanejad, S.; Misiorny, J.; Rinner, B. Lens distortion correction for thermal cameras to improve aerial imaging with small-scale UAVs. In Proceedings of the 2011 IEEE International Symposium on Robotic and Sensors Environments (ROSE), Montreal, QC, Canada, 17–18 September 2011; pp. 231–236.
26. Xicai, L.; Qinqin, W.; Yuanqing, W. Binocular vision calibration method for a long-wavelength infrared camera and a visible spectrum camera with different resolutions. Opt. Express 2021, 29, 3855–3872.
27. Klaus, A.; Bauer, J.; Karner, K.; Elbischger, P.; Perko, R.; Bischof, H. Camera calibration from a single night sky image. In Proceedings of the 2004 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR 2004), Washington, DC, USA, 27 June–2 July 2004.
28. Urquhart, B.; Kurtz, B.; Kleissl, J. Sky camera geometric calibration using solar observations. Atmos. Meas. Tech. 2016, 9, 4279–4294.
29. Richardson, W.; Krishnaswami, H.; Vega, R.; Cervantes, M. A Low Cost, Edge Computing, All-Sky Imager for Cloud Tracking and Intra-Hour Irradiance Forecasting. Sustainability 2017, 9, 482.
30. Wang, Y.; Liu, D.; Xie, W.; Yang, M.; Gao, Z.; Ling, X.; Huang, Y.; Li, C.; Liu, Y.; Xia, Y. Day and Night Clouds Detection Using a Thermal-Infrared All-Sky-View Camera. Remote Sens. 2021, 13, 1852.
31. Xie, W.; Liu, D.; Yang, M.; Chen, S.; Wang, B.; Wang, Z.; Xia, Y.; Liu, Y.; Wang, Y.; Zhang, C. SegCloud: A novel cloud image segmentation model using a deep convolutional neural network for ground-based all-sky-view camera observation. Atmos. Meas. Tech. 2020, 13, 1953–1961.
32. Fa, T.; Xie, W.; Wang, Y.; Xia, Y. Development of an all-sky imaging system for cloud cover assessment. Appl. Opt. 2019, 58, 5516–5524.
33. Debevec, P.E.; Malik, J. Recovering high dynamic range radiance maps from photographs. In Proceedings of the ACM SIGGRAPH 2008 Classes, Los Angeles, CA, USA, 11–15 August 2008; Article 31.
34. Reda, I.; Andreas, A. Solar position algorithm for solar radiation applications. Solar Energy 2004, 76, 577–589.
35. Kannala, J.; Brandt, S.S. A generic camera model and calibration method for conventional, wide-angle, and fish-eye lenses. IEEE Trans. Pattern Anal. Mach. Intell. 2006, 28, 1335–1340.
36. Zhang, H.P.; Hu, Z.J.; Hu, Y.G.; Zhou, C. Calibration and verification of all-sky auroral image parameters by star maps. Chin. J. Geophys. 2020, 63, 401–411. (In Chinese)
Figure 1. (a) All-sky camera 200 (ASC-200). (b) Two sets of visible (left) and infrared (right) all-sky images captured by ASC-200.
Figure 2. Center detection results of the sun area by using Hough circle transformation for (a) visible all-sky image captured with lowest exposure time and (b) infrared all-sky images. The parts of the sun area are magnified in the green boxes.
Figure 3. Projection diagram of an all-sky-view camera with fisheye lens.
Figure 4. Illustration of the rotation angle with respect to the north for an infrared all-sky image.
Figure 5. Reprojection errors for the visible and infrared all-sky-view cameras.
Figure 6. Synthetic sun trajectory (red line) on infrared and visible all-sky images captured on (a) 2 August 2020 and (b) 13 August 2020 (all are local time). The green arrows point at the sun’s positions that are unobvious in the infrared all-sky images.
Figure 7. Back projection errors of the visible and infrared all-sky-view cameras.
Figure 8. Registration results of the four visible and infrared all-sky image pairs. (a) Original visible all-sky images. These original images are warped and registered with infrared images. (b,c) present the horizontal and vertical comparison between the infrared images and the warped visible images, respectively. The gray lines are plotted to better compare the results.
Table 1. Estimated geometric imaging parameters of the visible and infrared binocular all-sky-view cameras of the ASC-200 instrument. The units are in pixels for zenith coordinates u and v , in pixels per degree for focal length f , and in degrees for rotation angle δ .
| Camera | Value | u | v | f | δ |
|---|---|---|---|---|---|
| Visible camera | Initial value | 972 | 1000 | 10 | None |
| Visible camera | Optimized value | 1005.42 | 996.97 | 10.24 | 25.45 |
| Infrared camera | Initial value | 258 | 270 | 3 | None |
| Infrared camera | Optimized value | 243.86 | 277.15 | 3.06 | 27.29 |
Table 2. Quantitative comparison of the back projection errors of the visible and infrared all-sky-view cameras using the RMSE, NRMSE, MAE, NMAE, and SD. The units of RMSE, MAE, and SD are degrees (°), and those of NRMSE and NMAE are percent (%).

| Camera | Angle | RMSE | MAE | SD | NRMSE | NMAE |
|---|---|---|---|---|---|---|
| Visible camera | Azimuth angle | 0.2122 | 0.1527 | 0.2527 | 0.059 | 0.042 |
| Visible camera | Zenith angle | 0.2669 | 0.2150 | 0.2074 | 0.297 | 0.239 |
| Infrared camera | Azimuth angle | 0.1113 | 0.0914 | 0.1113 | 0.031 | 0.025 |
| Infrared camera | Zenith angle | 0.2831 | 0.2557 | 0.2723 | 0.315 | 0.284 |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Share and Cite

MDPI and ACS Style

Xie, W.; Wang, Y.; Xia, Y.; Gao, Z.; Liu, D. Angular Calibration of Visible and Infrared Binocular All-Sky-View Cameras Using Sun Positions. Remote Sens. 2021, 13, 2455. https://doi.org/10.3390/rs13132455

