# Stepwise Fusion of Hyperspectral, Multispectral and Panchromatic Images with Spectral Grouping Strategy: A Comparative Study Using GF5 and GF1 Images


## Abstract


## 1. Introduction

## 2. Materials

#### 2.1. Study Area and Image Data

#### 2.2. Data Preprocessing

## 3. Methods

#### 3.1. Grouping Fusion Framework

The MSI is first split band by band into Q single-band images MSI$_x$ (x = 1, 2, 3, …, Q). According to the spectral correspondences between the HSI and the MSI, the P bands of the HSI are divided into Q spectral intervals h$_y$ (y = 1, 2, 3, …, Q), each of which is a multi-band image. The spectral intervals where the HSI and the MSI do not overlap (the gray region in Figure 2) do not participate in the image fusion process.

The multiple MSI bands (MSI$_1$, MSI$_2$, …, MSI$_n$) and the multiple HSI groups (HSI$_1$, HSI$_2$, …, HSI$_n$) are then fused group by group to obtain multiple sets of fused images (HSI_MSI$_1$, HSI_MSI$_2$, …, HSI_MSI$_n$). Finally, the multiple sets of fused images are stacked by wavelength to obtain the final fused image (HSI_MSI). This grouping fusion framework decomposes the HSI–MSI fusion into several smaller fusion tasks, each of which fuses a multi-band image with a single-band image. Such a task can be readily implemented using traditional fusion methods, such as the six fusion algorithms compared in this study.
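The grouping fusion framework can be sketched as follows. This is an illustrative outline, not the authors' implementation: it assumes the HSI has already been resampled to the MSI grid, and `fuse_band` is a hypothetical stand-in for any single-band sharpening algorithm such as the six compared in this study.

```python
import numpy as np

def grouped_fusion(hsi, msi, groups, fuse_band):
    """Fuse an HSI cube with an MSI following the spectral grouping framework.

    hsi:    (rows, cols, P) hyperspectral cube, assumed resampled to the MSI grid.
    msi:    (rows, cols, Q) multispectral image.
    groups: list of Q index arrays; groups[x] holds the HSI band indices whose
            wavelengths fall inside the spectral interval of MSI band x.
            HSI bands outside every interval simply do not appear in any group.
    fuse_band: callable f(hsi_group, single_band) -> fused group, standing in
               for a traditional multi-band/single-band fusion method.
    """
    fused_groups = []
    for x, idx in enumerate(groups):
        hsi_x = hsi[:, :, idx]   # HSI group h_x for spectral interval x
        msi_x = msi[:, :, x]     # matching single MSI band
        fused_groups.append(fuse_band(hsi_x, msi_x))
    # stack the fused groups back in wavelength order
    return np.concatenate(fused_groups, axis=2)
```

With a trivial `fuse_band`, a (rows, cols, 6) cube grouped into two triples against a 2-band MSI comes back with the same 6-band shape.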

#### 3.2. Stepwise Fusion Approach

**(1) Strategy (HM)P, (HSI+MSI)+PAN**: A low-spatial-resolution HSI and a medium-spatial-resolution MSI are first fused using the grouping fusion framework described above. The resulting medium-spatial-resolution HSI is then further fused with the high-spatial-resolution PAN image to obtain a high-spatial-resolution HSI.

**(2) Strategy H(MP), HSI+(MSI+PAN)**: A medium-spatial-resolution MSI and a high-spatial-resolution PAN image are first fused. The resulting high-spatial-resolution MSI is then further fused with the low-spatial-resolution HSI using spectral grouping, yielding a high-spatial-resolution HSI.

**(3) Strategy HP, HSI+PAN**: The most common and traditional approach is direct HSI–PAN fusion, which can be implemented directly by many image fusion algorithms.
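The three strategies differ only in how the two fusion steps are composed. In this minimal sketch, `grouped_fuse` and `sharpen` are hypothetical stand-ins for the grouping fusion framework and a generic sharpening algorithm, respectively:

```python
def strategy_hm_p(hsi, msi, pan, grouped_fuse, sharpen):
    # (HM)P: grouped HSI+MSI fusion first, then sharpen the result with PAN
    hsi_m = grouped_fuse(hsi, msi)
    return sharpen(hsi_m, pan)

def strategy_h_mp(hsi, msi, pan, grouped_fuse, sharpen):
    # H(MP): sharpen MSI with PAN first, then grouped HSI+MSI fusion
    msi_p = sharpen(msi, pan)
    return grouped_fuse(hsi, msi_p)

def strategy_hp(hsi, pan, sharpen):
    # HP: direct HSI-PAN fusion, the traditional baseline
    return sharpen(hsi, pan)
```

Any of the six algorithms evaluated in this study can, in principle, be slotted in for either step.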

#### 3.3. Image Fusion Algorithms

#### 3.3.1. CS-Based Methods: BDSD_PC and PRACS

#### 3.3.2. MRA-Based Methods: MTF_GLP and MF

#### 3.3.3. Subspace-Based Methods: CNMF and PWMBF

#### 3.4. Evaluation of Image Fusion Performances

#### 3.4.1. Spectral Metrics

where **v** is the reference image, $\widehat{\mathbf{v}}$ is the fused image, and $\mathbf{v}_{\mathrm{j}} \in \mathbb{R}^{n \times 1}$ and ${\widehat{\mathbf{v}}}_{\mathrm{j}} \in \mathbb{R}^{n \times 1}$ represent the spectral signatures of the j-th pixel in the reference image and the fused image, respectively. A larger SAM means a more severe spectral distortion of the fused images; a SAM value equal to zero denotes the absence of spectral distortion [47].
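As a minimal sketch (not the authors' code), SAM can be computed with NumPy as the mean angle between corresponding pixel spectra; the function name `sam` and the (rows, cols, bands) cube layout are assumptions:

```python
import numpy as np

def sam(ref, fused, eps=1e-12):
    """Mean spectral angle in degrees between two (rows, cols, bands) cubes.

    Each pixel contributes the angle between its reference and fused spectra;
    0 indicates no spectral distortion.
    """
    v = ref.reshape(-1, ref.shape[-1]).astype(float)
    w = fused.reshape(-1, fused.shape[-1]).astype(float)
    num = np.sum(v * w, axis=1)
    den = np.linalg.norm(v, axis=1) * np.linalg.norm(w, axis=1) + eps
    # clip guards against rounding just outside [-1, 1] before arccos
    ang = np.arccos(np.clip(num / den, -1.0, 1.0))
    return np.degrees(ang).mean()
```

Because the angle ignores magnitude, a fused cube that merely rescales every spectrum still scores near zero.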

where $\mathbf{x}_{\mathrm{i}} \in \mathbb{R}^{n \times 1}$ and ${\widehat{\mathbf{x}}}_{\mathrm{i}} \in \mathbb{R}^{n \times 1}$ represent the i-th band of the reference image and the fused image, respectively, d is the ratio of the spatial resolution between the HSI and PAN images, and n is the number of pixels in the images. A larger ERGAS means greater spectral distortion.
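A sketch of ERGAS under the usual convention (the exact placement of the resolution ratio d varies across papers, so treat it as an assumption):

```python
import numpy as np

def ergas(ref, fused, d):
    """ERGAS spectral-distortion index for (rows, cols, bands) cubes.

    d is the spatial-resolution ratio between the HSI and PAN images, as in
    the text; rmse and the band means are computed per spectral band.
    Lower is better; identical cubes give 0.
    """
    x = ref.reshape(-1, ref.shape[-1]).astype(float)
    y = fused.reshape(-1, fused.shape[-1]).astype(float)
    rmse = np.sqrt(np.mean((x - y) ** 2, axis=0))   # per-band RMSE
    mu = x.mean(axis=0)                             # per-band reference mean
    return 100.0 * d * np.sqrt(np.mean((rmse / mu) ** 2))
```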

where max($\mathbf{x}_{\mathrm{i}}$) is the maximum pixel value in the i-th reference band image. A larger PSNR value indicates a better result.
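A per-band PSNR sketch matching this definition, with the peak taken from each reference band (function name and cube layout are illustrative):

```python
import numpy as np

def psnr_per_band(ref, fused):
    """Per-band PSNR in dB for (rows, cols, bands) cubes.

    Uses each reference band's maximum as the peak value, as in the text.
    Larger values indicate a better fusion result.
    """
    x = ref.reshape(-1, ref.shape[-1]).astype(float)
    y = fused.reshape(-1, fused.shape[-1]).astype(float)
    mse = np.mean((x - y) ** 2, axis=0)   # per-band mean squared error
    peak = x.max(axis=0)                  # per-band max(x_i)
    return 10.0 * np.log10(peak ** 2 / mse)
```

For example, a constant band with peak 10 and a uniform error of 1 gives 10·log10(100/1) = 20 dB.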

#### 3.4.2. Spatial Metrics

where **M** and **N** are the reference edge image and the edge image to be evaluated, $\mathbf{M}_{\mathrm{i}}$ and $\mathbf{N}_{\mathrm{i}}$ are samples of **M** and **N**, e is the total number of samples, and $\stackrel{\_}{\mathbf{M}}$ and $\stackrel{\_}{\mathbf{N}}$ are the means of **M** and **N**. The range of SCC is between 0 and 1, and a high SCC indicates that the similarity of spatial information between the fused image and the reference image is high.
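A sketch of the SCC as a correlation coefficient between two edge images; the preceding edge extraction (e.g., a high-pass filter applied to both images) is assumed to have been done already and is not shown:

```python
import numpy as np

def scc(edge_ref, edge_fused):
    """Correlation coefficient between a reference edge image M and an
    evaluated edge image N. Values near 1 mean the fused image's spatial
    detail closely matches the reference."""
    m = edge_ref.ravel().astype(float)
    n = edge_fused.ravel().astype(float)
    m0 = m - m.mean()   # deviations from the mean of M
    n0 = n - n.mean()   # deviations from the mean of N
    return np.sum(m0 * n0) / np.sqrt(np.sum(m0 ** 2) * np.sum(n0 ** 2))
```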

#### 3.4.3. Computational Efficiency Metrics

## 4. Experimental Results

#### 4.1. Experimental Setup

#### 4.2. Performance over the Whole Image

#### 4.2.1. Performance Using Simulated Images

**Visual evaluation of spectral and spatial distortions:** Comparing the results of the three rows in Figure 5, the strategies (HM)P and H(MP) are generally better than the HP strategy in terms of spectral and spatial fidelity, and the results of the (HM)P and H(MP) strategies are visually similar. In our visual comparison, the spatial details of the BDSD_PC, MF and CNMF algorithms are much better than those of the other algorithms. The results of these algorithms are similar in terms of spectral fidelity; only the vegetation areas in the BDSD_PC results are slightly brighter.

**Quantitative evaluation of spectral and spatial distortions:** The quantitative evaluation further confirmed that under most algorithms, strategies (HM)P and H(MP) perform better than HP, with only a few exceptions. For example, using the HP strategy, the results of the MF and CNMF algorithms are slightly better than those of (HM)P and H(MP) in terms of spatial fidelity. For all algorithms, the performances of the (HM)P and H(MP) strategies are equivalent. In terms of spectral fidelity, as can be seen from the last column of Table 4, Table 5 and Table 6, the BDSD_PC algorithm performed the worst, while the performances of the other five algorithms did not differ much. It can be seen from the last column of Table 7 that the results using CNMF and MF have the best spatial fidelity, followed by PWMBF, BDSD_PC and MTF_GLP, with PRACS being the worst.

**Computational efficiency evaluation:** For four algorithms (MTF_GLP, MF, CNMF and PWMBF), the computational efficiency of strategy HP is better than those of strategies (HM)P and H(MP), but the difference is small. It is worth noting that the efficiency of H(MP) is much higher than that of (HM)P and HP when using the BDSD_PC and PRACS algorithms. As can be seen from Table 8, the fusion of a single-band image and a 77-band image in HP and in the second step of (HM)P consumes considerable time. With the fewer bands involved in each fusion of the H(MP) strategy, however, the computational time decreased dramatically, which indicates that the time complexity of CS-based algorithms may be related to the number of bands in the image. The efficiency of the H(MP) strategy is generally better than that of (HM)P, except for the CNMF and PWMBF algorithms. It can be seen from the last column of Table 8 that the ranking of computational efficiency from high to low is PWMBF, MF, MTF_GLP, BDSD_PC, CNMF and PRACS.

**Summary:** In most cases, the spatial and spectral fidelity of strategies (HM)P and H(MP) is better than that of HP, and the performance of (HM)P and H(MP) is similar. From the algorithm point of view, BDSD_PC has the worst spectral fidelity, and the other five algorithms are similar; the spatial fidelity, ranked from best to worst, is CNMF, MF, PWMBF, BDSD_PC, MTF_GLP, PRACS. In most cases, the stepwise strategies do not significantly increase the computational load even though they fuse one more image than HP, and H(MP) reduces the time complexity compared to HP and (HM)P when using CS-based algorithms. The efficiency of the H(MP) strategy is better than that of (HM)P. From the perspective of the algorithms, the computational efficiency ranked from high to low is PWMBF, MF, MTF_GLP, BDSD_PC, CNMF, PRACS.

#### 4.2.2. Performances Using Real Images

**Visual evaluation of spectral and spatial distortions:** For most algorithms, the spectral distortions of the results using the proposed stepwise strategies are smaller than those of the HP strategy. However, the (HM)P and H(MP) strategies and the HP strategy perform differently in terms of spatial fidelity. The spectral and spatial fidelities of the results using the (HM)P and H(MP) strategies are visually similar. Across the three strategies, the results of the CNMF algorithm differ significantly, indicating that this algorithm is more sensitive to the choice of strategy. From the perspective of spatial distortion, BDSD_PC, MF and CNMF performed well, followed by MTF_GLP, while PRACS and PWMBF performed poorly.

**Quantitative evaluation of spatial distortions:** The (HM)P and H(MP) strategies did not always outperform the HP strategy in terms of spatial fidelity. As shown in Table 9, the stepwise approaches outperformed HP when using the PRACS, CNMF and PWMBF algorithms; with the other three algorithms, however, they performed comparably to the HP strategy. The two stepwise strategies performed similarly in terms of spatial fidelity, according to both our visual comparison and the quantitative metrics in Table 9. It can be seen from the last column of Table 9 that the spatial fidelity of the results using BDSD_PC and MF is the best, followed by CNMF and MTF_GLP, while PRACS and PWMBF are the worst.

**Computational efficiency:** Under the MTF_GLP, CNMF and PWMBF algorithms, the computational efficiency of strategy HP is better than that of strategies (HM)P and H(MP), while the opposite holds for the other three algorithms. In particular, when using the BDSD_PC and PRACS algorithms, the fusion of a single-band image and a 77-band image in HP and in the second step of (HM)P consumes considerable time, which results in a much higher efficiency of H(MP) than of (HM)P and HP. This illustrates that the complexity of CS-based algorithms may increase as the number of bands to be fused increases. The efficiency of strategy H(MP) is generally better than that of strategy (HM)P, except for the CNMF and PWMBF algorithms. It can be seen from the last column of Table 10 that the computational efficiency from high to low is PWMBF, MF, BDSD_PC, MTF_GLP, CNMF and PRACS.

**Summary:** The spectral fidelity of strategies (HM)P and H(MP) is better than that of strategy HP, while their spatial fidelity is not always better than HP and is in some cases slightly worse. The spectral and spatial performances of (HM)P and H(MP) are similar. From the algorithm point of view, the spectral fidelity of the CNMF algorithm is very poor, and the other five algorithms are similar; the spatial fidelity, ranked from best to worst, is BDSD_PC, MF, CNMF, MTF_GLP, PRACS, PWMBF. The computational efficiency of HP and the stepwise strategies is comparable, and strategy H(MP) is better than strategy (HM)P. From the perspective of the algorithms, the computational efficiency ranked from high to low is PWMBF, MF, BDSD_PC, MTF_GLP, CNMF, PRACS.

#### 4.3. Performances over Vegetation Areas

#### 4.3.1. Performances Using Simulated Images

#### 4.3.2. Performances Using Real Images

#### 4.3.3. Summary of Performance on Vegetation

**Summary:**On vegetation areas, the spectral information of strategies (HM)P and H(MP) is generally better than that of strategy HP, and (HM)P and H(MP) are similar. From the algorithm point of view, BDSD_PC and CNMF have severe spectral distortion, and the other four algorithms perform similarly.

**Suggestions for selecting fusion strategies and algorithms:** Overall, from the experiments based on both simulated and real data, the strategies (HM)P and H(MP) are the better choices. For the regions where spectral information restoration matters most, e.g., vegetation regions, the MF, MTF_GLP, PRACS and PWMBF algorithms achieved better image fusion results; from the visual evaluation, only the MF and MTF_GLP algorithms achieved high performance in spatial detail restoration.

#### 4.4. Performances over Built-Up Areas

#### 4.4.1. Performances Using Simulated Images

#### 4.4.2. Performance Using Real Images

#### 4.4.3. Summary of Performances over Built-Up Areas

**Suggestions for selecting fusion strategies and algorithms:** Comparing the performances over built-up areas in both simulated and real images, we suggest spending more effort on selecting a better fusion algorithm rather than on selecting the strategy. MF, CNMF and BDSD_PC achieved better image fusion results in terms of spatial fidelity. From the visual evaluation, only the MF algorithm achieved high performance in spectral detail restoration.

## 5. Discussion

#### 5.1. Comparison of Strategies (HM)P, H(MP) and HP

**Spectral fidelity:**Generally, the spectral fidelity performance using the proposed strategies (HM)P and H(MP) was better than that of strategy HP. This is because MSI acted as a bridge, which enabled better integration of spectral information. It also illustrated the effectiveness of the stepwise and spectral grouping approach.

**Spatial fidelity:**From the perspective of spatial fidelity, the stepwise approach was not always better than the traditional strategy. In some cases, it was even slightly worse than the traditional HP approach. Nevertheless, combining all the experimental results, we still found that the stepwise strategies were significantly better than the traditional one.

**Computational efficiency:** In most circumstances, the efficiency of strategy HP was better than that of strategies (HM)P and H(MP), but the difference was small, which may be because (HM)P and H(MP) fuse one more image than HP. However, when using CS-based algorithms, we found that H(MP) was more efficient than HP and (HM)P, indicating that the stepwise fusion strategy may reduce the time complexity of the CS-based methods. In most cases, the efficiency of strategy H(MP) was better than that of strategy (HM)P.

**Summary:** Considering spectral and spatial fidelity comprehensively, we found that the stepwise approaches are better than the traditional one. Moreover, the stepwise fusion approach did not significantly increase the time complexity compared to the traditional method, and strategy H(MP) reduced the time complexity compared to HP when using CS-based algorithms (BDSD_PC and PRACS). We also found from the experimental results that the two stepwise approaches produce comparable results for most algorithms and images. Therefore, we suggest fusing HSI, MSI and PAN images using stepwise and spectral grouping strategies to obtain better results.

#### 5.2. Comparison of Fusion Algorithms Using Stepwise and Spectral Grouping Strategy

**Spectral and spatial fidelity:** As can be seen from Table 16, the MF algorithm performed well in terms of spectral and spatial fidelity across different images and scenes. The MTF_GLP algorithm performed slightly worse than MF for some areas and images; however, its worst scores were still As, so its final score was A. The other four algorithms had obvious defects in spectral or spatial fidelity, which lowered their scores and ultimately resulted in final scores of P.

**Computational efficiency:** As can be seen from Section 4.2, from an algorithmic point of view, the computational efficiency of simulated data fusion is ranked as PWMBF > MF > MTF_GLP > BDSD_PC > CNMF > PRACS, and that of real data fusion is ranked as PWMBF > MF > BDSD_PC > MTF_GLP > CNMF > PRACS, so the computational efficiency of the six algorithms can be divided into three levels in Table 17. As shown in Table 17, in terms of computational efficiency, MF and PWMBF are good, BDSD_PC and MTF_GLP are acceptable, and PRACS and CNMF are poor. From the perspective of the types of fusion algorithms, the CS-based algorithms have the lowest computational efficiency in general, while the MRA-based and subspace-based algorithms have better and roughly equivalent computational efficiency.

#### 5.3. Issues to Be Further Investigated

## 6. Conclusions

**(1) Image fusion performances of different strategies:**Compared with the traditional fusion strategy HP, the results of the stepwise fusion strategy (HM)P and H(MP) have better spectral fidelity. However, from the perspective of spatial fidelity, the strategies (HM)P and H(MP) do not always outperform the strategy HP. Nevertheless, considering all the experimental results, we still found that the stepwise strategies were better than the traditional one, while the spectral and spatial fidelity of the stepwise strategies (HM)P and H(MP) were comparable.

**(2) Image fusion performances of different algorithms:** Six algorithms were evaluated with the stepwise fusion strategy. The spectral and spatial fidelity of the results of the MF algorithm was the best, followed by MTF_GLP. Although the results of BDSD_PC and CNMF had good spatial fidelity, their spectral fidelity was poor, whilst PRACS and PWMBF had better spectral fidelity but poor spatial fidelity.

**(3) Computational efficiency of the fusion strategies:** The stepwise strategies do not significantly increase the computational load even though they fuse one more image than HP, and stepwise strategy H(MP) reduces the time complexity compared with HP when using CS-based algorithms. Under most algorithms, strategy H(MP) is more computationally efficient than strategy (HM)P.

**(4) Computational efficiency of the fusion algorithms:** From the algorithm point of view, PWMBF and MF have the highest computational efficiency, followed by MTF_GLP and BDSD_PC, while CNMF and PRACS are the worst. From the perspective of the types of fusion algorithms, the CS-based algorithms have the lowest computational efficiency in general, while the MRA-based and subspace-based algorithms have better and roughly equivalent computational efficiency.

## Author Contributions

## Funding

## Data Availability Statement

## Acknowledgments

## Conflicts of Interest

## References

- Liu, Y.-N.; Sun, D.-X.; Hu, X.-N.; Ye, X.; Li, Y.-D.; Liu, S.-F.; Cao, K.-Q.; Chai, M.-Y.; Zhou, W.-Y.-N.; Zhang, J.; et al. The Advanced Hyperspectral Imager Aboard China’s GaoFen-5 satellite. IEEE Geosci. Remote Sens. Mag.
**2019**, 7, 23–32. [Google Scholar] [CrossRef] - Grohnfeldt, C.; Zhu, X.X.; Bamler, R. Splitting the Hyperspectral-Multispectral Image Fusion Problem Autonomously into Weighted Pan-Sharpening Tasks-The Spectral Grouping Concept. In Proceedings of the 7th Workshop on Hyperspectral Image and Signal Processing—Evolution in Remote Sensing (WHISPERS), Tokyo, Japan, 2–5 June 2015. [Google Scholar]
- Yang, D.; Luo, Y.; Zeng, Y.; Si, F.; Xi, L.; Zhou, H.; Liu, W. Tropospheric NO
_{2}Pollution Monitoring with the GF-5 Satellite Environmental Trace Gases Monitoring Instrument over the North China Plain during Winter 2018–2019. Atmosphere**2021**, 12, 398. [Google Scholar] [CrossRef] - Tang, B.-H. Nonlinear Split-Window Algorithms for Estimating Land and Sea Surface Temperatures From Simulated Chinese Gaofen-5 Satellite Data. IEEE Trans. Geosci. Remote Sens.
**2018**, 56, 6280–6289. [Google Scholar] [CrossRef] - Ye, X.; Ren, H.; Liu, R.; Qin, Q.; Liu, Y.; Dong, J. Land Surface Temperature Estimate From Chinese Gaofen-5 Satellite Data Using Split-Window Algorithm. IEEE Trans. Geosci. Remote Sens.
**2017**, 55, 5877–5888. [Google Scholar] [CrossRef] - Wang, F.; Gao, J.; Zha, Y. Hyperspectral sensing of heavy metals in soil and vegetation: Feasibility and challenges. Isprs J. Photogramm. Remote Sens.
**2018**, 136, 73–84. [Google Scholar] [CrossRef] - Giardino, C.; Brando, V.E.; Dekker, A.G.; Strombeck, N.; Candiani, G. Assessment of water quality in Lake Garda (Italy) using Hyperion. Remote Sens. Environ.
**2007**, 109, 183–195. [Google Scholar] [CrossRef] - Xia, J.S.; Du, P.J.; He, X.Y.; Chanussot, J. Hyperspectral Remote Sensing Image Classification Based on Rotation Forest. IEEE Geosci. Remote Sens. Lett.
**2014**, 11, 239–243. [Google Scholar] [CrossRef] [Green Version] - Demir, B.; Erturk, S. Hyperspectral image classification using relevance vector machines. IEEE Geosci. Remote Sens. Lett.
**2007**, 4, 586–590. [Google Scholar] [CrossRef] - Ye, B.; Tian, S.; Cheng, Q.; Ge, Y. Application of Lithological Mapping Based on Advanced Hyperspectral Imager (AHSI) Imagery Onboard Gaofen-5 (GF-5) Satellite. Remote Sens.
**2020**, 12, 3990. [Google Scholar] [CrossRef] - Vivone, G.; Alparone, L.; Chanussot, J.; Dalla Mura, M.; Garzelli, A.; Licciardi, G.A.; Restaino, R.; Wald, L. A Critical Comparison Among Pansharpening Algorithms. IEEE Trans. Geosci. Remote Sens.
**2015**, 53, 2565–2586. [Google Scholar] [CrossRef] - Hardie, R.C.; Eismann, M.T.; Wilson, G.L. MAP estimation for hyperspectral image resolution enhancement using an auxiliary sensor. IEEE Trans. Image Process.
**2004**, 13, 1174–1184. [Google Scholar] [CrossRef] [PubMed] - Palsson, F.; Sveinsson, J.R.; Ulfarsson, M.O. Multispectral and Hyperspectral Image Fusion Using a 3-D-Convolutional Neural Network. IEEE Geosci. Remote Sens. Lett.
**2017**, 14, 639–643. [Google Scholar] [CrossRef] [Green Version] - Wei, Q.; Bioucas-Dias, J.; Dobigeon, N.; Tourneret, J.-Y. Hyperspectral and Multispectral Image Fusion Based on a Sparse Representation. IEEE Trans. Geosci. Remote Sens.
**2015**, 53, 3658–3668. [Google Scholar] [CrossRef] [Green Version] - Chen, Z.; Pu, H.; Wang, B.; Jiang, G.-M. Fusion of Hyperspectral and Multispectral Images: A Novel Framework Based on Generalization of Pan-Sharpening Methods. IEEE Geosci. Remote Sens. Lett.
**2014**, 11, 1418–1422. [Google Scholar] [CrossRef] - Selva, M.; Aiazzi, B.; Butera, F.; Chiarantini, L.; Baronti, S. Hyper-Sharpening: A First Approach on SIM-GA Data. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens.
**2015**, 8, 3008–3024. [Google Scholar] [CrossRef] - Simoes, M.; Bioucas-Dias, J.; Almeida, L.B.; Chanussot, J. A Convex Formulation for Hyperspectral Image Superresolution via Subspace-Based Regularization. IEEE Trans. Geosci. Remote Sens.
**2015**, 53, 3373–3388. [Google Scholar] [CrossRef] [Green Version] - Zhang, Y.; De Backer, S.; Scheunders, P. Noise-Resistant Wavelet-Based Bayesian Fusion of Multispectral and Hyperspectral Images. IEEE Trans. Geosci. Remote Sens.
**2009**, 47, 3834–3843. [Google Scholar] [CrossRef] - Lanaras, C.; Baltsavias, E.; Schindler, K. Hyperspectral Super-Resolution by Coupled Spectral Unmixing. In Proceedings of the IEEE International Conference on Computer Vision, Santiago, Chile, 11–18 December 2015; pp. 3586–3594. [Google Scholar]
- Bieniarz, J.; Mueller, R.; Zhu, X.X.; Reinartz, P. Hyperspectral Image Resolution Enhancement Based on Joint Sparsity Spectral Unmixing. In Proceedings of the IEEE Joint International Geoscience and Remote Sensing Symposium (IGARSS)/35th Canadian Symposium on Remote Sensing, Quebec City, QC, Canada, 13–18 July 2014; pp. 2645–2648. [Google Scholar]
- Yokoya, N.; Yairi, T.; Iwasaki, A. Coupled Nonnegative Matrix Factorization Unmixing for Hyperspectral and Multispectral Data Fusion. IEEE Trans. Geosci. Remote Sens.
**2012**, 50, 528–537. [Google Scholar] [CrossRef] - Qu, J.H.; Lei, J.; Li, Y.S.; Dong, W.Q.; Zeng, Z.Y.; Chen, D.Y. Structure Tensor-Based Algorithm for Hyperspectral and Panchromatic Images Fusion. Remote Sens.
**2018**, 10, 373. [Google Scholar] [CrossRef] [Green Version] - Cetin, M.; Musaoglu, N. Merging hyperspectral and panchromatic image data: Qualitative and quantitative analysis. Int. J. Remote Sens.
**2009**, 30, 1779–1804. [Google Scholar] [CrossRef] - Qu, J.H.; Li, Y.S.; Du, Q.; Xia, H.M. Hyperspectral and Panchromatic Image Fusion via Adaptive Tensor and Multi-Scale Retinex Algorithm. IEEE Access
**2020**, 8, 30522–30532. [Google Scholar] [CrossRef] - Dong, W.; Xiao, S.; Liang, J.; Qu, J. Fusion of hyperspectral and panchromatic images using structure tensor and matting model. Neurocomputing
**2020**, 399, 237–246. [Google Scholar] [CrossRef] - Bioucas-Dias, J.M.; Plaza, A.; Camps-Valls, G.; Scheunders, P.; Nasrabadi, N.M.; Chanussot, J. Hyperspectral Remote Sensing Data Analysis and Future Challenges. IEEE Geosci. Remote Sens. Mag.
**2013**, 1, 6–36. [Google Scholar] [CrossRef] [Green Version] - Meng, X.; Sun, W.; Ren, K.; Yang, G.; Shao, F.; Fu, R. Spatial-spectral fusion of GF-5/GF-1 remote sensing images based on multiresolution analysis. J. Remote Sens.
**2020**, 24, 379–387. [Google Scholar] - Shen, H. Integrated Fusion Method for Multiple Temporal-Spatial-Spectral Images. In Proceedings of the 22nd Congress of the International-Society-for-Photogrammetry-and-Remote-Sensing, Melbourne, Australia, 25 August–1 September 2012; pp. 407–410. [Google Scholar]
- Zhong, Y.; Wang, X.; Wang, S.; Zhang, L. Advances in spaceborne hyperspectral remote sensing in China. Geo-Spatial Inf. Sci.
**2021**, 24, 95–120. [Google Scholar] [CrossRef] - Li, J.; Feng, L.; Pang, X.P.; Gong, W.S.; Zhao, X. Radiometric cross Calibration of Gaofen-1 WFV Cameras Using Landsat-8 OLI Images: A Simple Image-Based Method. Remote Sens.
**2016**, 8, 411. [Google Scholar] [CrossRef] [Green Version] - Hao, P.; Wang, L.; Niu, Z. Potential of multitemporal Gaofen-1 panchromatic/multispectral images for crop classification: Case study in Xinjiang Uygur Autonomous Region, China. J. Appl. Remote Sens.
**2015**, 9, 096035. [Google Scholar] [CrossRef] - Mookambiga, A.; Gomathi, V. Comprehensive review on fusion techniques for spatial information enhancement in hyperspectral imagery. Multidimens. Syst. Signal Process.
**2016**, 27, 863–889. [Google Scholar] [CrossRef] - Li, X.; Yuan, Y.; Wang, Q. Hyperspectral and Multispectral Image Fusion Based on Band Simulation. IEEE Geosci. Remote Sens. Lett.
**2020**, 17, 479–483. [Google Scholar] [CrossRef] - Luo, S.; Zhou, S.; Qiang, B. A novel adaptive fast IHS transform fusion method driven by regional spectral characteristics for Gaofen-2 imagery. Int. J. Remote Sens.
**2020**, 41, 1321–1337. [Google Scholar] [CrossRef] - Ren, K.; Sun, W.; Meng, X.; Yang, G.; Du, Q. Fusing China GF-5 Hyperspectral Data with GF-1, GF-2 and Sentinel-2A Multispectral Data: Which Methods Should Be Used? Remote Sens.
**2020**, 12, 882. [Google Scholar] [CrossRef] [Green Version] - Yokoya, N.; Grohnfeldt, C.; Chanussot, J. Hyperspectral and Multispectral Data Fusion A comparative review of the recent literature. IEEE Geosci. Remote Sens. Mag.
**2017**, 5, 29–56. [Google Scholar] [CrossRef] - Vivone, G. Robust Band-Dependent Spatial-Detail Approaches for Panchromatic Sharpening. IEEE Trans. Geosci. Remote Sens.
**2019**, 57, 6421–6433. [Google Scholar] [CrossRef] - Choi, J.; Yu, K.; Kim, Y. A New Adaptive Component-Substitution-Based Satellite Image Fusion by Using Partial Replacement. IEEE Trans. Geosci. Remote Sens.
**2011**, 49, 295–309. [Google Scholar] [CrossRef] - Aiazzi, B.; Alparone, L.; Baronti, S.; Garzelli, A.; Selva, M. MTF-tailored multiscale fusion of high-resolution MS and pan imagery. Photogramm. Eng. Remote Sens.
**2006**, 72, 591–596. [Google Scholar] [CrossRef] - Restaino, R.; Vivone, G.; Dalla Mura, M.; Chanussot, J. Fusion of Multispectral and Panchromatic Images Based on Morphological Operators. IEEE Trans. Image Process.
**2016**, 25, 2882–2895. [Google Scholar] [CrossRef] [Green Version] - Palsson, F.; Sveinsson, J.R.; Ulfarsson, M.O.; Benediktsson, J.A. Model-Based Fusion of Multi-and Hyperspectral Images Using PCA and Wavelets. IEEE Trans. Geosci. Remote Sens.
**2015**, 53, 2652–2663. [Google Scholar] [CrossRef] - Ghassemian, H. A review of remote sensing image fusion methods. Inf. Fusion
**2016**, 32, 75–89. [Google Scholar] [CrossRef] - Garzelli, A.; Nencini, F.; Capobianco, L. Optimal MMSE pan sharpening of very high resolution multispectral images. IEEE Trans. Geosci. Remote Sens.
**2008**, 46, 228–236. [Google Scholar] [CrossRef] - Vivone, G.; Dalla Mura, M.; Garzelli, A.; Restaino, R.; Scarpa, G.; Ulfarsson, M.O.; Alparone, L.; Chanussot, J. A New Benchmark Based on Recent Advances in Multispectral Pansharpening: Revisiting Pansharpening With Classical and Emerging Pansharpening Methods. IEEE Geosci. Remote Sens. Mag.
**2021**, 9, 53–81. [Google Scholar] [CrossRef] - Lee, D.D.; Seung, H.S. Learning the parts of objects by non-negative matrix factorization. Nature
**1999**, 401, 788–791. [Google Scholar] [CrossRef] [PubMed] - Loncan, L.; Almeida, L.B.; Bioucas-Dias, J.M.; Briottet, X.; Chanussot, J.; Dobigeon, N.; Fabre, S.; Liao, W.; Licciardi, G.A.; Simoes, M.; et al. Hyperspectral Pansharpening: A Review. IEEE Geosci. Remote Sens. Mag.
**2015**, 3, 27–46. [Google Scholar] [CrossRef] [Green Version] - Alparone, L.; Wald, L.; Chanussot, J.; Thomas, C.; Gamba, P.; Bruce, L.M. Comparison of pansharpening algorithms: Outcome of the 2006 GRS-S data-fusion contest. IEEE Trans. Geosci. Remote Sens.
**2007**, 45, 3012–3021. [Google Scholar] [CrossRef] [Green Version] - Ranchin, T.; Wald, L. Fusion of high spatial and spectral resolution images: The ARSIS concept and its implementation. Photogramm. Eng. Remote Sens.
**2000**, 66, 49–61. [Google Scholar] - Wald, L.; Ranchin, T.; Mangolini, M. Fusion of satellite images of different spatial resolutions: Assessing the quality of resulting images. Photogramm. Eng. Remote Sens.
**1997**, 63, 691–699. [Google Scholar]

**Figure 1.**Geographical location of study area. On the right is the 30 m GF-5 HSI with true color display (R: 639 nm; G: 549 nm; B: 472 nm).

**Figure 4.**Flowchart of three different fusion strategies. The yellow, red, and black arrows indicate the steps of three different strategies, respectively.

**Figure 6.** (**a**–**r**) Fusion results of real GF5 and GF1 images. Some poor results are marked in the figures.

**Table 1.** Parameters of the GF-5 HSI and GF-1 MSI/PAN images.

Image Data | GF-5 HSI | GF-1 MSI | GF-1 PAN |
---|---|---|---|
Launch time of sensors | 9 May 2018 | 26 April 2013 | 26 April 2013 |
Spectral range/nm | 400–2500 | 450–520, 520–590, 630–690, 770–890 | 450–900 |
Number of bands | 330 | 4 | 1 |
Spectral resolution/nm | 5 (VNIR), 10 (SWIR) | - | - |
Spatial resolution/m | 30 | 8 | 2 |
Acquisition time of images | 5 October 2018 | 2 October 2018 | 2 October 2018 |

**Table 2.** Spectral grouping of GF-5 HSI bands corresponding to GF-1 MSI bands.

Group | Spectral Interval (nm) | GF-1 MSI Band Index | GF-5 HSI Band Index |
---|---|---|---|
1 | 450–520 | 1 | 15–31 |
2 | 520–590 | 2 | 32–47 |
3 | 630–690 | 3 | 57–71 |
4 | 770–890 | 4 | 90–118 |

**Table 3.** Spatial resolution and size of the simulated and real images.

Data | Multi-Source Images | Spatial Resolution (m) | Image Size |
---|---|---|---|
Simulated Data | GF-5 HSI | 512 | 50 × 50 |
| GF-1 MSI | 128 | 200 × 200 |
| GF-1 PAN | 32 | 800 × 800 |
Real Data | GF-5 HSI | 32 | 80 × 80 |
| GF-1 MSI | 8 | 320 × 320 |
| GF-1 PAN | 2 | 1280 × 1280 |

**Table 4.**SAM index of fusion results using simulated GF5 and GF1 images. A smaller value indicates better performance.

Algorithm | (HM)P | H(MP) | HP | Mean |
---|---|---|---|---|
BDSD_PC | 11.112 | 10.722 | 8.816 | 10.217 |
PRACS | 4.787 | 4.674 | 5.298 | 4.920 |
MTF_GLP | 5.242 | 4.905 | 5.806 | 5.318 |
MF | 4.652 | 3.866 | 4.964 | 4.494 |
CNMF | 3.900 | 4.181 | 4.353 | 4.145 |
PWMBF | 4.840 | 5.424 | 6.290 | 5.518 |

**Table 5.**ERGAS index of fusion results using simulated GF5 and GF1 images. A smaller value indicates better performance.

Algorithm | (HM)P | H(MP) | HP | Mean |
---|---|---|---|---|
BDSD_PC | 2.052 | 2.080 | 2.272 | 2.135 |
PRACS | 1.584 | 1.585 | 1.873 | 1.681 |
MTF_GLP | 1.531 | 1.521 | 1.830 | 1.628 |
MF | 1.698 | 1.574 | 1.831 | 1.701 |
CNMF | 1.604 | 1.627 | 1.803 | 1.678 |
PWMBF | 1.534 | 1.593 | 1.863 | 1.663 |

**Table 6.**PSNR index of fusion results using simulated GF5 and GF1 images. A larger value indicates better performance.

Algorithm | (HM)P | H(MP) | HP | Mean |
---|---|---|---|---|
BDSD_PC | 61.335 | 61.130 | 60.051 | 60.838 |
PRACS | 66.731 | 66.689 | 63.621 | 65.680 |
MTF_GLP | 67.417 | 67.519 | 63.819 | 66.252 |
MF | 65.451 | 67.037 | 63.785 | 65.424 |
CNMF | 66.215 | 65.880 | 63.964 | 65.353 |
PWMBF | 67.465 | 66.738 | 63.477 | 65.893 |

**Table 7.**SCC index of fusion results using simulated GF5 and GF1 images. A larger value indicates better performance.

Algorithm | (HM)P | H(MP) | HP | Mean |
---|---|---|---|---|
BDSD_PC | 0.823 | 0.835 | 0.583 | 0.747 |
PRACS | 0.658 | 0.657 | 0.520 | 0.612 |
MTF_GLP | 0.701 | 0.694 | 0.576 | 0.657 |
MF | 0.908 | 0.886 | 0.945 | 0.913 |
CNMF | 0.905 | 0.914 | 0.934 | 0.918 |
PWMBF | 0.830 | 0.852 | 0.722 | 0.801 |

**Table 8.** Computational time (in seconds) of fusing simulated GF5 and GF1 images. For strategies (HM)P and H(MP), "Total" is the combined running time of steps 1 and 2; strategy HP is a single fusion step. The Mean column averages the totals of the three strategies.

Algorithm | (HM)P Step 1 | (HM)P Step 2 | (HM)P Total | H(MP) Step 1 | H(MP) Step 2 | H(MP) Total | HP | Mean |
---|---|---|---|---|---|---|---|---|
BDSD_PC | 14.852 | 13.856 | 28.708 | 0.740 | 5.246 | 5.986 | 12.976 | 15.890 |
PRACS | 54.654 | 171.978 | 226.633 | 0.839 | 52.129 | 52.967 | 202.296 | 160.632 |
MTF_GLP | 2.590 | 25.044 | 27.633 | 1.534 | 8.867 | 10.400 | 8.404 | 15.479 |
MF | 6.207 | 6.348 | 12.556 | 0.455 | 9.031 | 9.486 | 8.264 | 10.102 |
CNMF | 5.802 | 33.179 | 38.981 | 4.734 | 48.049 | 52.783 | 19.015 | 36.926 |
PWMBF | 1.021 | 3.940 | 4.961 | 1.652 | 8.101 | 9.753 | 3.929 | 6.214 |
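
The per-step timings above simply sum to each strategy's total. A hedged sketch of how such two-step timings could be collected in Python (the fusion functions named in the comments are hypothetical placeholders, not the authors' code):

```python
import time

def timed(fn, *args, **kwargs):
    """Run fn and return (result, elapsed wall-clock seconds)."""
    t0 = time.perf_counter()
    result = fn(*args, **kwargs)
    return result, time.perf_counter() - t0

# Hypothetical strategy (HM)P pipeline:
# hsi_msi, t1 = timed(fuse_hsi_msi, hsi, msi)   # step 1: grouped HSI-MSI fusion
# fused, t2 = timed(fuse_with_pan, hsi_msi, pan) # step 2: fusion with PAN
# total = t1 + t2
```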

**Table 9.** SCC index of fusion results using real GF5 and GF1 images. A larger value indicates better performance.

Algorithm | (HM)P | H(MP) | HP | Mean |
---|---|---|---|---|
BDSD_PC | 0.983 | 0.982 | 0.997 | 0.987 |
PRACS | 0.751 | 0.726 | 0.653 | 0.710 |
MTF_GLP | 0.913 | 0.900 | 0.977 | 0.930 |
MF | 0.962 | 0.961 | 0.976 | 0.966 |
CNMF | 0.978 | 0.960 | 0.911 | 0.950 |
PWMBF | 0.670 | 0.724 | 0.675 | 0.690 |

**Table 10.** Computational time (in seconds) of fusing real GF5 and GF1 images. For strategies (HM)P and H(MP), "Total" is the combined running time of steps 1 and 2; strategy HP is a single fusion step. The Mean column averages the totals of the three strategies.

Algorithm | (HM)P Step 1 | (HM)P Step 2 | (HM)P Total | H(MP) Step 1 | H(MP) Step 2 | H(MP) Total | HP | Mean |
---|---|---|---|---|---|---|---|---|
BDSD_PC | 21.295 | 34.911 | 56.205 | 1.215 | 11.641 | 12.857 | 31.491 | 33.518 |
PRACS | 145.614 | 592.949 | 738.563 | 1.818 | 173.443 | 175.262 | 580.711 | 498.179 |
MTF_GLP | 5.013 | 63.237 | 68.250 | 3.540 | 22.332 | 25.872 | 21.134 | 38.419 |
MF | 21.902 | 15.999 | 37.901 | 0.916 | 21.326 | 22.242 | 25.915 | 28.686 |
CNMF | 10.059 | 86.835 | 96.894 | 11.588 | 105.678 | 117.266 | 52.481 | 88.880 |
PWMBF | 1.594 | 11.001 | 12.595 | 4.254 | 22.999 | 27.253 | 11.180 | 17.009 |

**Table 11.** SAM index of vegetation area fusion results using simulated GF5 and GF1 images. A smaller value indicates better performance.

Algorithm | (HM)P | H(MP) | HP | Mean |
---|---|---|---|---|
BDSD_PC | 2.360 | 2.403 | 1.956 | 2.240 |
PRACS | 1.476 | 1.414 | 1.690 | 1.527 |
MTF_GLP | 1.145 | 1.151 | 1.478 | 1.258 |
MF | 1.315 | 1.195 | 1.409 | 1.306 |
CNMF | 1.379 | 1.410 | 1.653 | 1.481 |
PWMBF | 1.318 | 1.343 | 1.504 | 1.388 |

**Table 12.** ERGAS index of vegetation area fusion results using simulated GF5 and GF1 images. A smaller value indicates better performance.

Algorithm | (HM)P | H(MP) | HP | Mean |
---|---|---|---|---|
BDSD_PC | 1.378 | 1.339 | 1.376 | 1.364 |
PRACS | 0.785 | 0.772 | 0.856 | 0.804 |
MTF_GLP | 0.652 | 0.658 | 0.795 | 0.701 |
MF | 0.772 | 0.693 | 0.786 | 0.750 |
CNMF | 0.820 | 0.802 | 0.941 | 0.854 |
PWMBF | 0.755 | 0.749 | 0.818 | 0.774 |

**Table 13.** PSNR index of vegetation area fusion results using simulated GF5 and GF1 images. A larger value indicates better performance.

Algorithm | (HM)P | H(MP) | HP | Mean |
---|---|---|---|---|
BDSD_PC | 51.743 | 51.909 | 51.828 | 51.827 |
PRACS | 62.255 | 62.590 | 60.795 | 61.880 |
MTF_GLP | 66.265 | 66.019 | 62.563 | 64.949 |
MF | 62.780 | 64.844 | 62.712 | 63.445 |
CNMF | 61.298 | 61.680 | 58.714 | 60.564 |
PWMBF | 63.791 | 64.162 | 62.103 | 63.352 |

**Table 14.** SCC index of built-up area fusion results using simulated GF5 and GF1 images. A larger value indicates better performance.

Algorithm | (HM)P | H(MP) | HP | Mean |
---|---|---|---|---|
BDSD_PC | 0.941 | 0.943 | 0.850 | 0.911 |
PRACS | 0.865 | 0.865 | 0.843 | 0.858 |
MTF_GLP | 0.891 | 0.887 | 0.850 | 0.876 |
MF | 0.985 | 0.986 | 0.989 | 0.987 |
CNMF | 0.985 | 0.982 | 0.988 | 0.985 |
PWMBF | 0.929 | 0.933 | 0.881 | 0.914 |

**Table 15.** SCC index of built-up area fusion results using real GF5 and GF1 images. A larger value indicates better performance.

Algorithm | (HM)P | H(MP) | HP | Mean |
---|---|---|---|---|
BDSD_PC | 0.981 | 0.981 | 0.994 | 0.986 |
PRACS | 0.706 | 0.690 | 0.597 | 0.664 |
MTF_GLP | 0.880 | 0.867 | 0.917 | 0.888 |
MF | 0.987 | 0.985 | 0.996 | 0.989 |
CNMF | 0.995 | 0.980 | 0.923 | 0.966 |
PWMBF | 0.609 | 0.628 | 0.590 | 0.609 |

**Table 16.** Quantified performance of the different algorithms in spectral and spatial fidelity (G: good, A: acceptable, P: poor).

Scenes | Images | Figure/Table | BDSD_PC | PRACS | MTF_GLP | MF | CNMF | PWMBF |
---|---|---|---|---|---|---|---|---|
Full Image (Spectral/Spatial) | Simulated | Figure 5/Table 4, Table 5, Table 6 and Table 7 | P/A | G/P | G/A | G/G | G/G | G/A |
Full Image (Spectral/Spatial) | Real | Figure 6/Table 9 | G/G | G/P | G/A | G/G | A/G | G/P |
Vegetation Area (Spectral) | Simulated | Figure 7/Table 11, Table 12 and Table 13 | P | G | G | G | P | G |
Vegetation Area (Spectral) | Real | Figure 8 | G | G | G | G | P | G |
Built-up Area (Spatial) | Simulated | Figure 9/Table 14 | A | P | A | G | G | A |
Built-up Area (Spatial) | Real | Figure 10/Table 15 | G | P | A | G | G | P |
Overall Score | | | P | P | A | G | P | P |

Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Huang, L.; Hu, Z.; Luo, X.; Zhang, Q.; Wang, J.; Wu, G.
Stepwise Fusion of Hyperspectral, Multispectral and Panchromatic Images with Spectral Grouping Strategy: A Comparative Study Using GF5 and GF1 Images. *Remote Sens.* **2022**, *14*, 1021.
https://doi.org/10.3390/rs14041021
