Article

Does the Rational Function Model’s Accuracy for GF1 and GF6 WFV Images Satisfy Practical Requirements?

Xiaojun Shan 1,* and Jingyi Zhang 2
1 Aerospace Information Research Institute, Chinese Academy of Sciences, Beijing 100094, China
2 School of Science, China University of Geosciences (Beijing), Beijing 100083, China
* Author to whom correspondence should be addressed.
Remote Sens. 2023, 15(11), 2820; https://doi.org/10.3390/rs15112820
Submission received: 9 May 2023 / Revised: 24 May 2023 / Accepted: 25 May 2023 / Published: 29 May 2023
(This article belongs to the Special Issue Gaofen 16m Analysis Ready Data)

Abstract

The Gaofen-1 (GF-1) and Gaofen-6 (GF-6) satellites have acquired a large number of wide-field-view (WFV) images, which China has made freely available worldwide. GF-1 WFV (GF-1) and GF-6 WFV (GF-6) images are distributed with rational polynomial coefficients (RPCs), and in practical applications their RPC corrections are completed using the rational function model (RFM). However, can the accuracy of the RFM satisfy practical application requirements? To address this question, a geometric accuracy evaluation method is proposed in this paper for the RFM of GF-1 and GF-6 images. First, RPC corrections were completed using the RFM and the refined RFM, respectively. The RFM was constructed using the RPCs and the Shuttle Radar Topography Mission (SRTM) 90 m DEM; it was refined via affine transformation based on control points (CPs), which resulted in a refined RFM. Then, an automatic matching method was proposed to automatically match GF-1/GF-6 images with reference images, which yielded a large number of uniformly distributed CPs. Finally, these CPs were used to evaluate the geometric accuracy of the RFM and the refined RFM. Level-14 Google images of the corresponding areas were used as reference images. In the experiments, the advantages and disadvantages of BRIEF, SIFT, and the proposed method were first compared. Then, the root mean square error (RMSE) values of 10,561 Chinese, French, and Brazilian GF-1 and GF-6 images were calculated and statistically analyzed, and the local geometric distortions of the GF-1 and GF-6 images were evaluated; these results were used to assess the accuracy of the RFM. Lastly, the accuracy of the refined RFM was evaluated using eight GF-1 and GF-6 images. The experimental results indicate that the accuracy of the RFM for most GF-1 and GF-6 images cannot meet the practical requirement of being better than 1.0 pixel, the accuracy of the refined RFM for GF-1 images also cannot meet this requirement, and the accuracy of the refined RFM for most GF-6 images does meet it. However, even the RMSE values that meet the requirement lie between 0.9 and 1.0 pixels, so the geometric accuracy can still be further improved.

1. Introduction

The Gaofen-1 satellite (GF-1) was launched in China on 26 April 2013. It carries two 2 m resolution panchromatic/8 m resolution multispectral cameras and four 16 m resolution wide-field-view (WFV) multispectral cameras, with a combined swath of 800 km and a swing angle of 35°. The GF-1 WFV images (GF-1 images) contain four bands with respective spectral ranges of 0.45–0.52 μm, 0.52–0.59 μm, 0.63–0.69 μm, and 0.77–0.89 μm. The Gaofen-6 satellite (GF-6) was launched in China on 2 June 2018. It carries a 2 m panchromatic/8 m multispectral high-resolution camera and a 16 m multispectral medium-resolution wide-field-view (WFV) camera with a swath of 800 km. The GF-6 WFV images (GF-6 images) consist of eight bands, with the first four bands having the same spectral range as the GF-1 images and the remaining four bands having spectral ranges of 0.69–0.73 μm, 0.73–0.77 μm, 0.40–0.45 μm, and 0.59–0.63 μm, respectively. At present, a large number of GF-1 and GF-6 images have been obtained and are widely used in many industries. For example, on 7 February 2021, a catastrophic mass flow descended in Chamoli, Uttarakhand, India, with over 200 people killed or missing [1]. In similar disasters, GF-1 and GF-6 images before and after the disaster can be used to quickly analyze and evaluate the damage. China has made GF-1 and GF-6 images freely available to the entire world.
The rational function model (RFM) can fit various physical sensor models with high accuracy, and its interpolation calculation is stable and accurate. In addition, the RFM achieves almost the same accuracy as the physical sensor model [2,3]. Space Imaging was the first to employ the RFM, for its IKONOS satellite images, and the Open Geospatial Consortium (OGC) has adopted rational polynomial coefficients (RPCs) as an ancillary image parameter standard since 1999 [4]. The RFM is now widely used in the production of various high-resolution satellite images, and all publicly released L1-level images include RPC files.
GF-1/GF-6 images also have their own RPC files. For GF-1/GF-6 images, can the accuracy of RFM meet practical requirements?
At present, several studies have examined the positioning accuracy of GF-6 images. Yin et al. (2023) [5] used an error-compensated RPC model for the accurate orientation of GF-6 WFV stereo images, and a precise RPC model was obtained using traditional bundle adjustment. However, their geometric accuracy evaluation results only show the relative accuracy between the two GF-6 images, not the absolute geometric accuracy of the GF-6 images. In other research [6], the direct geometric accuracy and the geometric accuracy obtained with different adjustment models were evaluated. The direct geometric accuracy is approximately one ground sample distance, and the image geometric accuracy is 0.5–1.0 pixels. However, that paper only briefly describes the evaluation method, and only a small number of GF-6 images with high direct geometric accuracy were evaluated. Wang et al. (2019) [7] proposed a sensor correction method based on a virtual CMOS with distortion for the GF-6 WFV camera, which can improve the internal relative accuracy and the registration accuracy. However, the camera parameters are hard to obtain.
Therefore, in order to more comprehensively evaluate the geometric accuracy of GF-1 and GF-6 images, a geometric accuracy evaluation method is proposed in this paper to address the issue. The proposed method begins with RPC correction, and an RFM and a refined RFM are constructed and used to complete RPC correction. The RFM is refined via affine transformation based on control points (CPs), which results in a refined RFM. Then, an automatic matching method is proposed to generate a large number of uniformly distributed control points (CPs). Finally, these CPs are used to complete the geometric accuracy evaluation of the RFM and refined RFM of GF-1/GF-6 images.
A crucial step in evaluating the geometric accuracy of the RFM is obtaining many uniformly distributed CPs using an automatic matching method with high accuracy and speed.
Automatic matching based on point features has been widely used in remote sensing image matching, from the early Förstner [8], Harris [9], and SUSAN [10] operators to SIFT [11], SURF [12], and other improved methods with rotation and scale invariance. After feature points are extracted, automatic feature matching must be completed. Automatic feature matching primarily concerns how feature points are described and how their similarity is measured. Two kinds of descriptions and similarity measures are typically used: (1) template-based similarity measures, such as the correlation coefficient, phase correlation, and mutual information; and (2) invariant feature descriptors that form feature vectors, whose similarity is usually measured with the Euclidean distance or other distances. Methods such as SIFT and SURF have feature descriptors with good rotation and scale invariance, but they are computationally slow. Therefore, faster feature matching methods, such as BRIEF [13] and ORB [14], have been proposed.
SIFT and improved methods are widely used for remote sensing image matching. Due to the large size of a single GF-1/GF-6 image and the medium spatial resolution of GF-1/GF-6 images, SIFT and improved methods have the following problems: high hardware resource consumption, few or no CPs in areas with inconspicuous features, and a slow computation speed. Due to the use of more grayscale information, template matching based on normalized correlation coefficients has a high success rate and strong stability for medium-spatial-resolution remote sensing images with minimal rotation and scale differences.
On the basis of research on existing methods, an automatic matching method based on Harris, BRIEF, and template matching is proposed, which can generate a large number of uniformly distributed and high-accuracy CPs.
Google Earth images have been orthorectified, and research on the geometric accuracy of Google Earth images (Google images) [15,16,17] indicates that their average positional error is on the order of 3 m, which is small relative to the 16 m resolution of GF-1/GF-6 images. Therefore, the level-14 Google images of the corresponding area are used as reference images for the geometric accuracy evaluation of GF-1/GF-6 images.
The experimental results showed that the proposed method is superior to SIFT and BRIEF in terms of the number, distribution, and accuracy of the CPs, and the processing speed of the proposed method is much better than that of SIFT.
The accuracy of the RFM and refined RFM was evaluated separately in the experiments. In order to evaluate the geometric accuracy of the RFM, 10,561 GF-1 and GF-6 images of China, Brazil, and France were obtained, their geometric accuracy was statistically analyzed according to the different countries, and the local geometric distortions of the GF-1/GF-6 images were analyzed using eight GF-1 images and eight GF-6 images. Then, the geometric accuracy of the refined RFM was evaluated using four GF-1 images and four GF-6 images.
The geometric accuracy evaluation results in this paper can not only help researchers understand the geometric accuracy of China's GF-1/GF-6 images, but also help them develop high-precision geometric processing algorithms for GF-1/GF-6 images.

2. Geometric Accuracy Evaluation Method

2.1. Workflow of Proposed Method

A geometric accuracy evaluation method is proposed in this paper to evaluate the accuracy of the RFM. The proposed method is divided into three major steps: (1) the RPC correction of GF-1/GF-6 images is completed using the RFM and the refined RFM, respectively; (2) the RPC-corrected image is automatically matched with the Google image of the corresponding area to obtain a large number of CPs; and (3) the geometric accuracy of the whole image is computed using these CPs, and the local geometric distortions of each image are analyzed, as shown in Figure 1.
1. RPC correction
The RFM was constructed using the RPCs and Shuttle Radar Topography Mission (SRTM) 90 m DEM. The RFM is usually refined via affine transformation based on some CPs [2]. This is defined as follows:
\Delta l = l' - l = a_0 + a_l\,l + a_s\,s, \qquad \Delta s = s' - s = b_0 + b_l\,l + b_s\,s \qquad (1)
where (\Delta l, \Delta s) express the discrepancies between the measured line and sample coordinates (l', s') and the RFM-projected image coordinates (l, s), and the coefficients a_0, a_l, a_s, b_0, b_l, and b_s are the adjustment parameters for each image.
In order to solve the coefficients of affine transformation, at least three pairs of CPs are required. In this paper, CPs were manually selected from GF-1/GF-6 images and reference images.
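To make this refinement step concrete, the following minimal sketch estimates the six affine compensation coefficients of Equation (1) from a set of CPs by linear least squares; the function names, argument layout, and use of NumPy are illustrative assumptions rather than the authors' implementation.

```python
import numpy as np

def fit_affine_compensation(l_rfm, s_rfm, l_meas, s_meas):
    """Estimate the coefficients (a0, al, as) and (b0, bl, bs) of Equation (1)
    by linear least squares from at least three CPs.

    l_rfm, s_rfm  : line/sample coordinates projected by the RFM
    l_meas, s_meas: measured line/sample coordinates of the same CPs
    """
    l_rfm = np.asarray(l_rfm, dtype=float)
    s_rfm = np.asarray(s_rfm, dtype=float)
    # Design matrix of the affine model: [1, l, s]
    A = np.column_stack([np.ones_like(l_rfm), l_rfm, s_rfm])
    dl = np.asarray(l_meas, dtype=float) - l_rfm   # discrepancies in line
    ds = np.asarray(s_meas, dtype=float) - s_rfm   # discrepancies in sample
    coeff_l, *_ = np.linalg.lstsq(A, dl, rcond=None)   # (a0, al, as)
    coeff_s, *_ = np.linalg.lstsq(A, ds, rcond=None)   # (b0, bl, bs)
    return coeff_l, coeff_s

def refine_projection(l_rfm, s_rfm, coeff_l, coeff_s):
    """Apply the fitted affine corrections to RFM-projected coordinates."""
    l_rfm = np.asarray(l_rfm, dtype=float)
    s_rfm = np.asarray(s_rfm, dtype=float)
    A = np.column_stack([np.ones_like(l_rfm), l_rfm, s_rfm])
    return l_rfm + A @ coeff_l, s_rfm + A @ coeff_s
```

With more than three CPs, the least-squares solution averages out individual measurement errors, which is why the evaluation in Section 4.2 varies the number of CPs used for refinement.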
Then, the RPC correction of the GF-1/GF-6 images was completed using the RFM and the refined RFM, respectively. A single GF-6 scene is divided into three images for storage, so the geometric accuracy of each was evaluated separately.
2. Automatic image matching
In this paper, an automatic matching method based on Harris, BRIEF, and template matching is proposed to complete automatic matching between an RPC-corrected GF-1/GF-6 image and a Google image of the corresponding area, which can generate a large number of uniformly distributed and high-accuracy CPs, as detailed in the description in Section 2.2.
3. Geometric accuracy analysis
The root mean square error (RMSE) is a measure of the normalized distance between the observed and the predicted data [18,19]. Here, the root mean square error (RMSE) was calculated as the geometric accuracy of the whole image using all CPs obtained from automatic matching, as shown below:
\mathrm{RMSE} = \frac{1}{d_s}\sqrt{\frac{1}{n}\sum_{i=1}^{n}\left[(x_i - x_i')^2 + (y_i - y_i')^2\right]} \qquad (2)
For each control point (CP) on the GF-1/GF-6 image, the geometric error value (GEV) was calculated as shown below:
\mathrm{GEV}_i = \frac{1}{d_s}\sqrt{(x_i - x_i')^2 + (y_i - y_i')^2} \qquad (3)
In Equations (2) and (3), n represents the total number of CPs, (x_i, y_i) represents the geographic coordinates of the i-th CP on the GF-1/GF-6 image, (x_i', y_i') represents the geographic coordinates of the same CP on the reference image, and d_s represents the resolution of the GF-1/GF-6 image.
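As a concrete reading of Equations (2) and (3), the sketch below computes the per-CP geometric error values and the whole-image RMSE with NumPy; the function names are illustrative, and the CP coordinates are assumed to be given in the same map projection.

```python
import numpy as np

def geometric_error_values(x_img, y_img, x_ref, y_ref, ds):
    """Per-CP geometric error values of Equation (3), in pixels.

    (x_img, y_img): geographic coordinates of the CPs on the GF-1/GF-6 image
    (x_ref, y_ref): geographic coordinates of the same CPs on the reference image
    ds            : ground resolution of the GF-1/GF-6 image (e.g., 16 m)
    """
    dx = np.asarray(x_img, dtype=float) - np.asarray(x_ref, dtype=float)
    dy = np.asarray(y_img, dtype=float) - np.asarray(y_ref, dtype=float)
    return np.hypot(dx, dy) / ds

def rmse(x_img, y_img, x_ref, y_ref, ds):
    """Whole-image RMSE of Equation (2), in pixels."""
    gev = geometric_error_values(x_img, y_img, x_ref, y_ref, ds)
    return float(np.sqrt(np.mean(gev ** 2)))
```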
For the GF-1/GF-6 images corrected by the RFM, the geometric accuracy of the whole image and the local geometric distortions were analyzed: (1) For many GF-1 and GF-6 images, histograms of the RMSE values were statistically analyzed according to different countries, the results of which were used to evaluate the geometric accuracy of the whole image. (2) The local geometric distortions can affect the accuracy of the refined RFM; therefore, eight GF-1 images and eight GF-6 images from different countries and with different geometric accuracies were selected to evaluate the local geometric distortions. For each image, the geometric error value of each CP was calculated; then, a plot of geometric errors and a histogram of geometric error values were obtained. Lastly, the local geometric distortions for each image were analyzed using the plot of geometric errors and the histogram of geometric error values.
For the GF-1/GF-6 images corrected by the refined RFM, the RMSE values were calculated and analyzed.

2.2. Automatic Matching Method

This paper aims to evaluate the geometric accuracy of the RFM using the CPs obtained via automatic matching for a large number of GF-1 and GF-6 images. In order to obtain more accurate evaluation results, the CPs obtained via the automatic matching method had to be distributed as evenly as possible. In addition, since a large number of images had to be processed in this paper, the processing speed of the automatic matching method had to be as fast as possible.
An automatic matching method based on Harris, BRIEF, and template matching was proposed in this paper. Harris is an algorithm for extracting feature points. BRIEF uses binary strings as an efficient feature point descriptor and is much faster than traditional descriptors such as SURF and SIFT. Template matching has a high success rate and good stability for the automatic matching of medium-spatial-resolution remote sensing images with small differences in scale and rotation.
Automatic image matching was completed based on image chunking, with the following steps: Firstly, the reference image was reprojected to the same projection and resolution as the RPC-corrected GF-1/GF-6 image, which can reduce the scale difference between the reference image and the RPC-corrected GF-1/GF-6 image. Secondly, the feature points were extracted from the original image block and the reference image block using Harris, and then, BRIEF was used to match the feature points. Lastly, for each feature point that failed to be matched, template matching was used. After all the feature points were matched, the RANSAC approach was used to detect incorrect CPs, and then, control point homogenization was carried out, as shown in Figure 2.
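The reprojection of the reference image in the first step could, for example, be done with GDAL as sketched below; the file names are placeholders, and this is only an illustration of the idea, not the authors' implementation.

```python
from osgeo import gdal

# RPC-corrected GF-1/GF-6 image and the Google reference image (paths are placeholders)
corrected = gdal.Open("gf_wfv_rpc_corrected.tif")
geo = corrected.GetGeoTransform()

# Warp the reference image to the same projection, pixel size, and extent as the
# RPC-corrected image, so that only small residual shifts remain to be matched.
gdal.Warp(
    "reference_reprojected.tif",
    "google_reference.tif",
    dstSRS=corrected.GetProjection(),
    xRes=geo[1],
    yRes=abs(geo[5]),
    outputBounds=(
        geo[0],                                   # xmin
        geo[3] + geo[5] * corrected.RasterYSize,  # ymin (geo[5] is negative)
        geo[0] + geo[1] * corrected.RasterXSize,  # xmax
        geo[3],                                   # ymax
    ),
    resampleAlg="bilinear",
)
```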
The following section describes the primary technologies involved in the automatic matching method proposed in this paper:
1. Image chunking
The GF-1/GF-6 images were evenly chunked according to a size of B × B to form several original image blocks. For each original image block, the geographic coordinates of the center point of the original image block were used to calculate the center point of the reference image block in the corresponding position. The corresponding reference image block was extracted using the calculated center point. The size of the reference image block was bigger than that of the original image block. The size of B affects the memory footprint and processing speed of the algorithm. The larger the B value, the higher the memory footprint and the faster the processing. Therefore, it was necessary to determine the size of B according to the algorithm’s memory footprint, the size of the computer memory, and the number of parallel processing images.
2. Feature point extraction using Harris
The Harris detector responds to significant changes in image gradients within a local area (window) and is widely used to extract feature points from images. In this paper, Harris was used to extract feature points from each image block.
3. Feature point matching using BRIEF
For each feature point on the original image block and the reference image block, BRIEF was used to separately generate a 512-bit binary feature string. Then, the Hamming distance was used to complete feature string matching.
4. Template matching
For each feature point that failed to be matched, a template window was extracted with the feature point at the center. Then, the corresponding position of the feature point on the reference image block was calculated based on the same geographic coordinate. The search window (where the size was larger than the template window) was extracted with this position at the center, and then, template matching was completed using Fast NCC [20].
5. Incorrect control point detection
Although the proposed method can obtain many accurate CPs, there are still some incorrect CPs. Therefore, it was necessary to remove incorrect CPs. The RANSAC [21] method is usually used to detect incorrect CPs and can obtain good results in many cases. Therefore, the RANSAC method was used to detect incorrect CPs for each image block.
6. Control point homogenization
Firstly, the original image was gridded according to a size of 400 × 400; all CPs were assigned to various grids based on the control point coordinates of the original image. Then, only the CP with the optimal matching degree was maintained for each grid. The rules used to determine the optimal matching degree were as follows: (1) if all CPs in one grid are obtained from BRIEF, the CP with the smallest Hamming distance is maintained; (2) if all CPs in one grid are obtained from template matching, the CP with the maximum NCC value is maintained; (3) if CPs in one grid are obtained from BRIEF and template matching, the CP with the smallest Hamming distance is maintained.
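The per-block matching logic described in items 2–5 above can be summarized in the following condensed sketch, which assumes OpenCV (with the contrib modules for the BRIEF descriptor) and 8-bit grayscale blocks; the thresholds, window sizes, and helper names are illustrative choices rather than the parameters used in this paper. The grid-based homogenization of item 6 would then be applied to the CPs returned for all blocks.

```python
import cv2
import numpy as np

def match_block(orig_block, ref_block, tmpl_size=31, search_size=61, ncc_min=0.8):
    """Match one original image block against its reference block.

    Both blocks are assumed to be 8-bit grayscale arrays that are already
    approximately co-registered. Returns two (N, 2) arrays of corresponding
    (x, y) pixel coordinates. Requires opencv-contrib-python for BRIEF.
    """
    def harris_corners(img, max_pts=500):
        # Harris-based corner detection
        pts = cv2.goodFeaturesToTrack(img, max_pts, 0.01, 10, useHarrisDetector=True)
        if pts is None:
            return []
        return [cv2.KeyPoint(float(x), float(y), 7) for x, y in pts[:, 0]]

    # 1. Harris feature points and BRIEF descriptors on both blocks
    brief = cv2.xfeatures2d.BriefDescriptorExtractor_create()
    kp1, des1 = brief.compute(orig_block, harris_corners(orig_block))
    kp2, des2 = brief.compute(ref_block, harris_corners(ref_block))

    # 2. Descriptor matching by Hamming distance
    matches = []
    if des1 is not None and des2 is not None:
        matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
        matches = matcher.match(des1, des2)
    src = [kp1[m.queryIdx].pt for m in matches]
    dst = [kp2[m.trainIdx].pt for m in matches]
    matched = {m.queryIdx for m in matches}

    # 3. Normalized-cross-correlation fallback for corners BRIEF failed to match
    t, s = tmpl_size // 2, search_size // 2
    for i, kp in enumerate(kp1):
        if i in matched:
            continue
        x, y = int(round(kp.pt[0])), int(round(kp.pt[1]))
        tmpl = orig_block[y - t:y + t + 1, x - t:x + t + 1]
        srch = ref_block[y - s:y + s + 1, x - s:x + s + 1]
        if tmpl.shape != (tmpl_size, tmpl_size) or srch.shape != (search_size, search_size):
            continue  # too close to the block border
        ncc = cv2.matchTemplate(srch, tmpl, cv2.TM_CCOEFF_NORMED)
        _, score, _, loc = cv2.minMaxLoc(ncc)
        if score >= ncc_min:
            src.append((x, y))
            dst.append((x - s + loc[0] + t, y - s + loc[1] + t))

    # 4. RANSAC screening of the candidate CPs with an affine model
    if len(src) < 4:
        return np.empty((0, 2)), np.empty((0, 2))
    src, dst = np.float32(src), np.float32(dst)
    _, inliers = cv2.estimateAffine2D(src, dst, method=cv2.RANSAC, ransacReprojThreshold=3.0)
    if inliers is None:
        return src, dst
    keep = inliers.ravel().astype(bool)
    return src[keep], dst[keep]
```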

3. Experiments for Geometric Accuracy Evaluation Method

3.1. Experimental Data

In this experiment, the advantages and disadvantages of BRIEF, SIFT, and the proposed method are compared from the perspectives of the number of CPs, the distribution of CPs, the accuracy of the CPs, and processing time. The RPC correction of the GF-1/GF-6 images was completed using the RFM. The experimental images were four GF-1 images; the details are shown in Table 1. In this and subsequent experiments, the value of B was 5000.

3.2. Experimental Results Regarding Number of CPs

For all experimental images, the proposed method obtains more CPs than SIFT and BRIEF. The number of CPs obtained by BRIEF is less than that of SIFT, as illustrated in Figure 3.

3.3. Experimental Results Regarding Distribution of CPs

Figure 4, Figure 5, Figure 6 and Figure 7 show the distribution of CPs for different methods. For all images, the distribution of CPs obtained from the proposed method is relatively uniform. For all images, the distribution obtained from the BRIEF method is not uniform, and some places do not have CPs. The distributions of CPs obtained from SIFT in the No. 1 image and No. 2 image are generally uniform. The distributions of CPs obtained from SIFT in the No. 3 image and No. 4 image are not uniform.

3.4. Experimental Results Regarding Processing Time

The processing times of the different methods are presented in ascending order: BRIEF, the proposed method, and SIFT. The processing time of SIFT is much higher than that of BRIEF and the proposed method, as illustrated in Figure 8.

3.5. Experimental Results Regarding Evaluation Accuracy

In order to verify the accuracy of the automatic geometric evaluation of BRIEF, SIFT, and the proposed method, a manual method was used to obtain an accurate RMSE. The steps of the manual method were as follows: first, about 60 uniformly distributed CPs were manually selected on the RPC-corrected image and reference image; then, these CPs were used to calculate the RMSE of each image. The manually selected CPs have high accuracy and a uniform distribution, so the RMSE obtained from the manual method was used to evaluate the accuracy of BRIEF, SIFT, and the proposed method.
The RMSE values obtained from the different methods used for the four experimental images are illustrated in Figure 9. For the No. 1 image, the RMSE value of SIFT is equal to that of the manual method, and the RMSE values of BRIEF and the proposed method differ very little from that of the manual method. For the No. 2 image and No. 4 image, the RMSE value of the proposed method is equal to that of the manual method. For the No. 3 image, the RMSE of the proposed method is closest to that of the manual method.
According to a comprehensive comparison of the number of CPs, the distribution of CPs, the accuracy of the CPs, and the processing time, the proposed method is superior to SIFT and BRIEF in terms of the geometric accuracy evaluation of GF-1/GF-6 images.

4. Geometric Accuracy Evaluation Results

4.1. The Results for the RFM

4.1.1. Experimental Data

The experimental images consist of 10,561 GF-1 and GF-6 images of China, Brazil, and France, including 5539 GF-1 images and 5022 GF-6 images; the GF-1 images were acquired from 2013 to 2021 and the GF-6 images from 2018 to 2021. The details of the experimental images are shown in Table 2.

4.1.2. The Geometric Accuracy of the Whole Image

1. Experimental results of GF-1 images
The RMSE values of 2990 GF-1 images of China were statistically analyzed, and the RMSE values ranged from 1 to 37, with 96.5% of the images having RMSE values between 1 and 11; detailed results are shown in Figure 10.
The RMSE values of 335 GF-1 images of Brazil were statistically analyzed. The RMSE values ranged from 5 to 22; detailed results are shown in Figure 11.
The RMSE values of 2214 GF-1 images of France were statistically analyzed. The RMSE values ranged from 2 to 45, with 96.2% of the images having RMSE values between 2 and 19; detailed results are shown in Figure 12.
By analyzing the RMSE values of GF-1 images of China, France, and Brazil, it was determined that the geometric accuracy of the GF-1 images of China is primarily distributed between 1 and 11, that of the GF-1 images of France is primarily distributed between 2 and 19, and that of the GF-1 images of Brazil is primarily distributed between 5 and 20. The geometric accuracy of the GF-1 images of China is superior to that of France and Brazil. The RMSE values for all GF-1 images cannot meet the practical requirement of being better than 1.0 pixel.
2. Experimental results of GF-6 images
The RMSE values of 1077 GF-6 images of China were statistically analyzed. The RMSE values ranged from 0 to 60, with 99.4% of the images having RMSE values between 0 and 5.0; detailed results are shown in Figure 13.
The RMSE values of 2840 GF-6 images of Brazil were statistically analyzed. The RMSE values ranged from 0 to 17, with 99.1% of the images having values between 0 and 4.0; detailed results are shown in Figure 14.
The RMSE values of 1105 GF-6 images of France were statistically analyzed. The RMSE values ranged from 0 to 9, with 99.0% of the images having values between 0 and 7; detailed results are shown in Figure 15.
The geometric accuracy of the GF-6 images of China is primarily distributed between 0 and 5, that of the GF-6 images of France between 0 and 7, and that of the GF-6 images of Brazil between 0 and 4. There is no significant difference in geometric accuracy among the images of the three countries, but only a very small number of images have RMSE values that meet the practical requirement of being better than 1.0 pixel.
Comparing the geometric accuracy of GF-6 images and GF-1 images showed that the geometric accuracy of the GF-6 images is significantly superior to that of the GF-1 images. This result shows that the RFM’s accuracy for the GF-6 images is better than that for the GF-1 images, and the accuracy of the GF-6 satellite imaging parameters is better than that of GF-1.
The results of the geometric accuracy evaluation of the whole image show that if only the RPCs of the GF-1/GF-6 images are used to build the RFM, the accuracy of most GF-1/GF-6 images cannot meet the practical requirement, and the RMSE values of some GF-1 images are even quite large. There are two main reasons: (1) the satellite imaging parameters are inaccurate, resulting in inaccuracies in the RFM, and (2) when RPC correction is performed, the RFM is not optimized using CPs.
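For reference, the per-country statistics reported above (RMSE histograms and the share of images within a given RMSE interval) can be reproduced with a few lines of NumPy; the function name and the 1-pixel bin width are illustrative assumptions.

```python
import numpy as np

def rmse_statistics(rmse_values, lo, hi, bin_width=1.0):
    """Histogram of whole-image RMSE values (in pixels) and the
    percentage of images whose RMSE lies inside [lo, hi]."""
    r = np.asarray(rmse_values, dtype=float)
    bins = np.arange(np.floor(r.min()), np.ceil(r.max()) + bin_width, bin_width)
    counts, edges = np.histogram(r, bins=bins)
    share = 100.0 * np.mean((r >= lo) & (r <= hi))
    return counts, edges, share

# Example (hypothetical input): share of Chinese GF-1 images with RMSE in [1, 11]
# counts, edges, share = rmse_statistics(china_gf1_rmse, 1.0, 11.0)
```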

4.1.3. Local Geometric Distortions

1. Experimental results of GF-1 images
Four GF-1 images of China, two GF-1 images of Brazil, and two GF-1 images of France were selected from all the experimental GF-1 images. The local geometric distortions were analyzed using geometric error values and geometric error directions.
The details of the eight experimental images are shown in Table 3.
The geometric error values of the CPs for the No. 1 image range from 0 to 4. There are significant differences in the geometric error directions of the CPs; in particular, the geometric error values and directions in the left area of the image are obviously different from those in other areas. Therefore, the No. 1 image has significant local geometric distortions, as shown in Figure 16.
The geometric error values of CPs for the No. 2 image range from 64 to 98. There are significant differences in the geometric error values of the CPs, and small differences in the geometric error direction of the CPs. Therefore, the No. 2 image has significant local geometric distortions, as shown in Figure 17.
The geometric error values of the CPs for the No. 3 image range from 1 to 6, and 98.89% of the geometric error values are between 2 and 5. There are small differences in the geometric error values of the CPs, and significant differences in the geometric error directions of the CPs. Therefore, the No. 3 image has significant local geometric distortions, as shown in Figure 18.
The geometric error values of the CPs for the No. 4 image range from 3 to 8. There are significant differences in the geometric error directions of the CPs, especially the error directions on the right side of the image, which are significantly different from the other parts. Therefore, the No. 4 image has significant local geometric distortions, as shown in Figure 19.
The geometric error values of the CPs for the No. 5 image range from 9 to 13. There are small differences in the geometric error values and directions of the CPs. Therefore, the No. 5 image has small local geometric distortions, as shown in Figure 20.
The geometric error values of the CPs for the No. 6 image range from 8 to 13. There are small differences in the geometric error values and directions of the CPs. Therefore, the No. 6 image has small local geometric distortions, as shown in Figure 21.
The geometric error values of the CPs for the No. 7 image range from 1 to 5. There are small differences in the geometric error values of the CPs, and significant differences in the geometric error directions of the CPs. Therefore, the No. 7 image has significant local geometric distortions, as shown in Figure 22.
The geometric error values of the CPs for the No. 8 image range from 2 to 6. There are small differences in the geometric error values of the CPs, and significant differences in the geometric error directions of the CPs, especially in the lower part of the image. Therefore, the No. 8 image has significant local geometric distortions, as shown in Figure 23.
The results of the analysis of the local geometric distortions for the eight GF-1 images indicate that most of the GF-1 images have significant local geometric distortions, and a few images have small local geometric distortions.
2. Experimental results of GF-6 images
Four GF-6 images of China, two GF-6 images of Brazil, and two GF-6 images of France were selected from all the experimental GF-6 images. The local geometric distortions were analyzed using the geometric error values and directions.
The details of the eight experimental images are shown in Table 4.
The geometric error values of the CPs for the No. 1 image range from 1 to 5; there are significant differences in the geometric error directions of the CPs, but very small differences in the geometric error values of the CPs. Therefore, the No. 1 image has small local geometric distortions, as shown in Figure 24.
The geometric error values of the CPs for the No. 2 image range from 0 to 4; there are significant differences in the geometric error directions of the CPs, but very small differences in the geometric error values of the CPs. Therefore, the No. 2 image has small local geometric distortions, as shown in Figure 25.
The geometric error values of the CPs for the No. 3 image range from 0 to 4; there are significant differences in the geometric error directions of the CPs, but very small differences in the geometric error values of the CPs. Therefore, the No. 3 image has small local geometric distortions, as shown in Figure 26.
The geometric error values of the CPs for the No. 4 image range between 1 and 5, and 85.38% of the CPs fall within a range of 2 to 4. There are significant differences in the geometric error directions of the CPs, but very small differences in the geometric error values of the CPs. Therefore, the No. 4 image has small local geometric distortions, as shown in Figure 27.
The geometric error values of the CPs for the No. 5 image range from 15 to 19, with 85.96% of the CPs falling within a range of 16 to 18. There are small differences in the geometric error directions and values of the CPs. Therefore, the No. 5 image has very small local geometric distortions, as shown in Figure 28.
The geometric error values of the CPs for the No. 6 image range from 1 to 4, with 91.4% of the CPs falling within a range of 2 to 3. There are significant differences in the geometric error directions of the CPs, but very small differences in the geometric error values of the CPs. Therefore, the No. 6 image has small local geometric distortions, as shown in Figure 29.
The geometric error values of the CPs for the No. 7 image range from 4 to 11; there are small differences in the geometric error directions of the CPs, and relatively large differences in the geometric error values of the CPs. Therefore, the No. 7 image has relatively large local geometric distortions, as shown in Figure 30.
The geometric error values of the CPs for the No. 8 image range from 0 to 3; there are significant differences in the geometric error directions of the CPs, but very small differences in the geometric error values of the CPs. Therefore, the No. 8 image has small local geometric distortions, as shown in Figure 31.
The analysis of the local geometric distortions for the eight GF-6 images showed that the majority of the GF-6 images have significant differences in their geometric error directions, but very small differences in their geometric error values. Therefore, the majority of the images have small local geometric distortions.
Comparing the local geometric distortions of the GF-6 images and the GF-1 images showed that the local geometric distortions of the GF-6 images are smaller than those of the GF-1 images.

4.2. The Results for the Refined RFM

4.2.1. Experimental Data

The experimental images consist of four GF-1 images and four GF-6 images; the details of the eight experimental images are shown in Table 5. Although most of the experimental GF-1 images cover China, they have different RFM geometric accuracies and cover regions of China with different features, so this does not affect the geometric accuracy evaluation results.

4.2.2. Experimental Results

In this experiment, for each experimental image, multiple refined rational function models were constructed using different numbers of CPs; then, the geometric accuracy of each refined RFM was evaluated using the proposed method. The results of the geometric accuracy evaluation are shown in Table 6.
For each image, the RMSE value decreases as the number of CPs increases, but the decrease is no longer noticeable once the number of CPs reaches 20. The minimum RMSE value of each GF-1 image is greater than 1.0 and cannot meet the practical requirement of being better than 1.0 pixel. The reason for these results is that most GF-1 images have significant local geometric distortions, which cannot be corrected by a refined RFM. The minimum RMSE values of most GF-6 images are less than 1.0; however, they are all greater than 0.9, and further processing is recommended to improve the geometric accuracy of the GF-6 images.

5. Conclusions

An automatic geometric accuracy evaluation method is proposed in this paper, which can be used to evaluate the geometric accuracy of the RFM for GF-1 and GF-6 images. First, RPC correction is completed using the RFM and refined RFM, respectively. The RFM is refined using some CPs, which results in a refined RFM. Second, an automatic matching method based on Harris, BRIEF, and template matching is proposed in this paper to obtain many evenly distributed CPs. SIFT and its improved methods can only obtain a few or no CPs in areas with inconspicuous features, and SIFT and most of its improved methods have a slow computational speed. The proposed method is superior to SIFT and BRIEF in terms of the number, distribution, and accuracy of the CPs, and the processing speed of the proposed method is much better than that of SIFT. Finally, the RFM and refined RFM are evaluated using many GF-1 and GF-6 images.
The geometric accuracy of the RFM was evaluated using 10,561 GF-1/GF-6 images of China, Brazil, and France. The experimental results indicate that the geometric accuracy of the GF-1 images is mainly distributed between 0 and 20 pixels, whereas that of the GF-6 images is mainly distributed between 0 and 7 pixels, so the geometric accuracy of the GF-6 images is obviously superior to that of the GF-1 images. Local geometric distortion analysis was performed using eight GF-1 images and eight GF-6 images. The results showed that most GF-1 images have significant local geometric distortions, while most GF-6 images have small local geometric distortions. According to the RMSE values and the local geometric distortions of the GF-1/GF-6 images, the RFM accuracy of most GF-1 and GF-6 images does not meet the practical requirement of being better than 1.0 pixel. The reason for these results is that the satellite imaging parameters are inaccurate and the RFM is not optimized using CPs.
The accuracy of the refined RFM was evaluated using four GF-1 images and four GF-6 images. The accuracy of the refined RFM of all the GF-1 experimental images does not meet the practical requirements, and the reason for these results is that most GF-1 images have significant local geometric distortions, which cannot be corrected by a refined RFM. When 20 CPs are used to construct a refined RFM, the accuracy of the refined RFM of most GF-6 experimental images can meet the practical requirement of being better than 1.0 pixel; however, the RMSE values are all greater than 0.9, and further processing is recommended to improve the geometric accuracy of the GF-6 images. In future work, the automatic matching method will be used to obtain CPs to refine the RFM of GF-1 and GF-6 images, so that an accuracy evaluation of a large number of images can be completed. The image registration method should be used to further improve the geometric accuracy of GF-1/GF-6 images.
The geometric accuracy evaluation results presented in this paper can help researchers to develop a high-precision geometric processing algorithm for GF-1/GF-6 images. The proposed method can also be used to evaluate the geometric accuracy of other Chinese satellite images.

Author Contributions

Conceptualization, X.S.; methodology, X.S.; software, X.S.; validation, J.Z.; formal analysis, J.Z.; resources, J.Z.; data curation, J.Z.; writing—original draft preparation, X.S.; writing—review and editing, X.S.; visualization, X.S. and J.Z. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National Key Research and Development Program of China (grant number 2019YFE0197800).

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Shugar, D.H.; Jacquemart, M.; Shean, D.; Bhushan, S.; Upadhyay, K.; Sattar, A.; Schwanghart, W.; McBride, S.K.; Van Wyk de Vries, M.; Mergili, M.; et al. A massive rock and ice avalanche caused the 2021 disaster at Chamoli, Indian Himalaya. Science 2021, 373, 300–306. [Google Scholar] [CrossRef] [PubMed]
  2. Zhang, R.; Zhou, G.; Zhang, G.; Zhou, X.; Huang, J. RPC-Based Orthorectification for Satellite Images Using FPGA. Sensors 2018, 18, 2511. [Google Scholar] [CrossRef] [PubMed]
  3. Hu, Y.; Tao, V.; Croitoru, A. Understanding the rational function model: Methods and applications. Int. Arch. Photogramm. Remote Sens. 2004, 20, 119–124. [Google Scholar]
  4. Open Geospatial Consortium. The Open GIS Abstract Specification-Topic 7. Available online: https://www.ogc.org/standards/as/ (accessed on 8 May 2023).
  5. Yin, S.; Zhu, Y.; Hong, H.; Yang, T.; Chen, Y.; Tian, Y. DSM Extraction Based on Gaofen-6 Satellite High-Resolution Cross-Track Images with Wide Field of View. Sensors 2023, 23, 3497. [Google Scholar] [CrossRef] [PubMed]
  6. Zhao, L.; Fu, X.; Dou, X. Analysis of Geometric Performances of the Gaofen-6 WFV Camera. In Proceedings of the 6th China High Resolution Earth Observation Conference (CHREOC 2019), Chengdu, China, 1 September 2019; Lecture Notes in Electrical Engineering; Wang, L., Wu, Y., Gong, J., Eds.; Springer: Berlin/Heidelberg, Germany, 2020; Volume 657. [Google Scholar]
  7. Wang, M.; Cheng, Y.; Guo, B.; Jin, S. Parameters determination and sensor correction method based on virtual CMOS with distortion for the GaoFen6 WFV camera. ISPRS J. Photogramm. Remote Sens. 2019, 156, 51–62. [Google Scholar] [CrossRef]
  8. Förstner, W.; Gülch, E. A fast operator for detection and precise location of distinct points, corners and circular features. In Proceedings of the ISPRS Workshop on Fast Processing of Photogrammetric Data, Interlaken, Switzerland, 2–4 June 1987; pp. 281–305. [Google Scholar]
  9. Noble, A. Finding corners. Image Vis. Comput. 1988, 6, 121–128. [Google Scholar] [CrossRef]
  10. Smith, S.M.; Brady, J.M. SUSAN—A new approach to low level image processing. Int. J. Comput. Vis. 1997, 23, 45–78. [Google Scholar] [CrossRef]
  11. Lowe, D.G. Distinctive image features from scale-invariant keypoints. Int. J. Comput. Vis. 2004, 60, 91–110. [Google Scholar] [CrossRef]
  12. Bay, H.; Tuytelaars, T.; Van Gool, L. SURF: Speeded up robust features. In Proceedings of the Ninth European Conference on Computer Vision, Graz, Austria, 7–13 May 2006. [Google Scholar]
  13. Calonder, M.; Lepetit, V.; Strecha, C.; Fua, P. BRIEF: Binary Robust Independent Elementary Features. In Proceedings of the Computer Vision—ECCV 2010, Heraklion, Greece, 5–11 September 2010; Lecture Notes in Computer Science; Daniilidis, K., Maragos, P., Paragios, N., Eds.; Springer: Berlin/Heidelberg, Germany, 2010; Volume 6314. [Google Scholar] [CrossRef]
  14. Rublee, E.; Rabaud, V.; Konolige, K.; Bradski, G. ORB: An efficient alternative to SIFT or SURF. In Proceedings of the 2011 International Conference on Computer Vision, Barcelona, Spain, 6–13 November 2011; pp. 2564–2571. [Google Scholar] [CrossRef]
  15. Farah, A.; Algarni, D. Positional accuracy assessment of googleearth in riyadh. Artif. Satell. 2014, 49, 101–106. [Google Scholar] [CrossRef]
  16. Pulighe, G.; Baiocchi, V.; Lupia, F. Horizontal accuracy assessment of very high resolution google earth images in the city of rome, italy. Int. J. Digit. Earth 2015, 9, 342–362. [Google Scholar] [CrossRef]
  17. Nwilo, P.C.; Okolie, C.J.; Onyegbula, J.C.; Arungwa, I.D.; Ayoade, O.Q.; Daramola, O.E.; Orji, M.J.; Maduako, I.D.; Uyo, I.I. Positional accuracy assessment of historical Google Earth imagery in Lagos State, Nigeria. Appl. Geomat. 2022, 14, 545–568. [Google Scholar] [CrossRef]
  18. Hastie, T.; Friedman, J.; Tibshirani, R. The Elements of Statistical Learning: Data Mining, Inference, and Prediction; Springer: New York, NY, USA, 2009. [Google Scholar]
  19. Muhuri, A.; Gascoin, S.; Menzel, L.; Kostadinov, T.S.; Harpold, A.A.; Sanmiguel-Vallelado, A.; López-Moreno, J.I. Performance Assessment of Optical Satellite-Based Operational Snow Cover Monitoring Algorithms in Forested Landscapes. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2021, 14, 7159–7178. [Google Scholar] [CrossRef]
  20. Lewis, J.P. Fast Normalized Cross-Correlation. 1995. Available online: http://scribblethink.org/Work/nvisionInterface/nip.pdf (accessed on 8 May 2023).
  21. Fischler, M.A.; Bolles, R.C. Random sample consensus: A paradigm for model fitting with applications to image analysis and automated cartography. Commun. Assoc. Comput. Mach. 1981, 24, 381–395. [Google Scholar] [CrossRef]
Figure 1. Workflow of proposed method.
Figure 2. Workflow of automatic matching method.
Figure 3. Histogram of number of CPs for different methods.
Figure 4. The distribution of CPs of different methods for No. 1 image. (a) BRIEF. (b) SIFT. (c) Proposed method.
Figure 5. The distribution of CPs of different methods for No. 2 image. (a) BRIEF. (b) SIFT. (c) Proposed method.
Figure 6. The distribution of CPs of different methods for No. 3 image. (a) BRIEF. (b) SIFT. (c) Proposed method.
Figure 7. The distribution of CPs of different methods for No. 4 image. (a) BRIEF. (b) SIFT. (c) Proposed method.
Figure 8. Histogram of processing times for different methods.
Figure 9. The RMSE values obtained from the different methods used for the four experimental images. (a) No. 1 image. (b) No. 2 image. (c) No. 3 image. (d) No. 4 image.
Figure 10. RMSE histogram of GF-1 images of China.
Figure 11. RMSE histogram of GF-1 images of Brazil.
Figure 12. RMSE histogram of GF-1 images of France.
Figure 13. RMSE histogram of GF-6 images of China.
Figure 14. RMSE histogram of GF-6 images of Brazil.
Figure 15. RMSE histogram of GF-6 images of France.
Figure 16. Analysis results of local geometric distortions in No. 1 image: (a) plot of geometric errors, (b) histogram of geometric error values.
Figure 17. Analysis results of local geometric distortions in No. 2 image: (a) plot of geometric errors, (b) histogram of geometric error values.
Figure 18. Analysis results of local geometric distortions in No. 3 image: (a) plot of geometric errors, (b) histogram of geometric error values.
Figure 19. Analysis results of local geometric distortions in No. 4 image: (a) plot of geometric errors, (b) histogram of geometric error values.
Figure 20. Analysis results of local geometric distortions in No. 5 image: (a) plot of geometric errors, (b) histogram of geometric error values.
Figure 21. Analysis results of local geometric distortions in No. 6 image: (a) plot of geometric errors, (b) histogram of geometric error values.
Figure 22. Analysis results of local geometric distortions in No. 7 image: (a) plot of geometric errors, (b) histogram of geometric error values.
Figure 23. Analysis results of local geometric distortions in No. 8 image: (a) plot of geometric errors, (b) histogram of geometric error values.
Figure 24. Analysis results of local geometric distortions in No. 1 image: (a) plot of geometric errors, (b) histogram of geometric error values.
Figure 25. Analysis results of local geometric distortions in No. 2 image: (a) plot of geometric errors, (b) histogram of geometric error values.
Figure 26. Analysis results of local geometric distortions in No. 3 image: (a) plot of geometric errors, (b) histogram of geometric error values.
Figure 27. Analysis results of local geometric distortions in No. 4 image: (a) plot of geometric errors, (b) histogram of geometric error values.
Figure 28. Analysis results of local geometric distortions in No. 5 image: (a) plot of geometric errors, (b) histogram of geometric error values.
Figure 29. Analysis results of local geometric distortions in No. 6 image: (a) plot of geometric errors, (b) histogram of geometric error values.
Figure 30. Analysis results of local geometric distortions in No. 7 image: (a) plot of geometric errors, (b) histogram of geometric error values.
Figure 31. Analysis results of local geometric distortions in No. 8 image: (a) plot of geometric errors, (b) histogram of geometric error values.
Table 1. Experimental GF-1 image information for geometric accuracy evaluation method.
Number | Sensor | Imaging Time | Country
No. 1 | GF-1 WFV1 | 21 February 2018 | China
No. 2 | GF-1 WFV1 | 23 December 2018 | China
No. 3 | GF-1 WFV1 | 17 March 2014 | China
No. 4 | GF-1 WFV2 | 23 April 2021 | France
Table 2. Experimental GF-1/GF-6 image information for geometric accuracy evaluation.
Country | Sensor | Number of Images | Imaging Time Range
China | GF-1 WFV | 2990 | May 2013–June 2021
China | GF-6 WFV | 1077 | November 2018–June 2021
Brazil | GF-1 WFV | 335 | May 2014–September 2020
Brazil | GF-6 WFV | 2840 | July 2018–June 2021
France | GF-1 WFV | 2214 | May 2013–June 2021
France | GF-6 WFV | 1105 | August 2018–June 2021
Table 3. Experimental GF-1 image information for analysis of local geometric distortions.
Number | Sensor | Imaging Time | Country | RMSE
No. 1 | GF-1 WFV1 | 23 December 2018 | China | 2.3
No. 2 | GF-1 WFV1 | 17 March 2014 | China | 78
No. 3 | GF-1 WFV1 | 12 January 2021 | China | 3.4
No. 4 | GF-1 WFV3 | 30 August 2019 | China | 7.1
No. 5 | GF-1 WFV4 | 20 July 2017 | Brazil | 11.9
No. 6 | GF-1 WFV3 | 20 July 2017 | Brazil | 10.8
No. 7 | GF-1 WFV1 | 27 April 2021 | France | 10.5
No. 8 | GF-1 WFV2 | 23 April 2021 | France | 4.7
Table 4. Experimental GF-6 image information for analysis of local geometric distortions.
Number | Sensor | Imaging Time | Country | RMSE
No. 1 | GF-6 WFV | 26 November 2018 | China | 2.7
No. 2 | GF-6 WFV | 26 January 2019 | China | 1.9
No. 3 | GF-6 WFV | 19 November 2020 | China | 1.9
No. 4 | GF-6 WFV | 14 December 2020 | China | 2.8
No. 5 | GF-6 WFV | 14 September 2020 | Brazil | 16.8
No. 6 | GF-6 WFV | 5 March 2021 | Brazil | 2.2
No. 7 | GF-6 WFV | 15 February 2019 | France | 6.7
No. 8 | GF-6 WFV | 31 March 2021 | France | 1.5
Table 5. Experimental GF-1/GF-6 image information for geometric accuracy evaluation.
Number | Sensor | Imaging Time | Country
No. 1 | GF-1 WFV1 | 17 March 2014 | China
No. 2 | GF-1 WFV3 | 23 December 2018 | China
No. 3 | GF-1 WFV1 | 30 August 2019 | China
No. 4 | GF-1 WFV2 | 23 April 2021 | France
No. 5 | GF-6 WFV | 26 January 2019 | China
No. 6 | GF-6 WFV | 19 November 2020 | China
No. 7 | GF-6 WFV | 14 September 2020 | Brazil
No. 8 | GF-6 WFV | 5 March 2021 | Brazil
Table 6. Experimental results of refined RFM with different numbers of CPs (cell values are RMSE, in pixels).
Number of CPs | No. 1 | No. 2 | No. 3 | No. 4 | No. 5 | No. 6 | No. 7 | No. 8
0 | 78 | 2.3 | 7.1 | 4.7 | 1.9 | 1.9 | 16.8 | 2.2
6 | 3.12 | 1.42 | 1.59 | 1.43 | 1.19 | 1.09 | 1.21 | 0.98
12 | 2.98 | 1.14 | 1.46 | 1.42 | 0.97 | 1.06 | 1.16 | 0.97
16 | 2.71 | 1.16 | 1.45 | 1.38 | 0.96 | 0.98 | 1.13 | 0.95
20 | 2.73 | 1.12 | 1.42 | 1.33 | 0.93 | 0.95 | 1.14 | 0.91