Communication

A Color Restoration Algorithm for Diffractive Optical Images of Membrane Camera

Yanlei Du, Xiaofeng Yang, Yiping Ma and Chunxue Xu
1 State Key Laboratory of Remote Sensing Science, Aerospace Information Research Institute, Chinese Academy of Sciences, Beijing 100101, China
2 Department of Electronic Engineering, Tsinghua University, Beijing 100084, China
3 Key Laboratory of Earth Observation of Hainan Province, Sanya 572029, China
4 Beijing Municipal Commission of Planning and Natural Resources, Beijing 101160, China
5 College of Earth, Ocean, and Atmospheric Sciences, Oregon State University, Corvallis, OR 97331, USA
* Authors to whom correspondence should be addressed.
Sensors 2021, 21(4), 1053; https://doi.org/10.3390/s21041053
Submission received: 15 December 2020 / Revised: 20 January 2021 / Accepted: 29 January 2021 / Published: 4 February 2021
(This article belongs to the Section Physical Sensors)

Abstract

In order to verify the membrane diffractive imaging technology intended for China's next-generation geostationary earth orbit (GEO) satellites, a series of ground experiments was carried out using a membrane optical camera with an 80 mm aperture (Φ80) lens. The inherent chromatic aberration of diffractive imaging appears in the obtained data. To address this issue, an effective color restoration algorithm framework based on matching, tailoring, and non-linearly stretching the image histograms is proposed in this letter. Experimental results show that the proposed approach restores the color of the diffractive optical images better than previous methods. The effectiveness and robustness of the algorithm are also quantitatively assessed using various color deviation indexes. The results indicate that about 85% of the chromatic aberration of the diffractive images can be effectively removed. The proposed method also offers reasonable computational efficiency.

1. Introduction

Geostationary earth orbit (GEO) satellites can provide remote sensing image data with a wide observation range and high temporal resolution [1,2]. However, because of the high orbital altitude, achieving high spatial resolution remains a challenge for GEO optical systems. Theoretically, to achieve meter-level resolution at the GEO altitude, the camera aperture needs to be as large as 20 m [3,4]. Together with the support and control systems, the size and weight of such a satellite would be extremely large, which is prohibitive for the engineering fabrication and deployment of a traditional optical camera.
To alleviate these technical contradictions of the GEO remote sensing system, a novel membrane optical system has been proposed and has attracted increasing interest in relevant studies [5,6,7,8,9]. Unlike traditional reflective optics, the membrane camera adopts a transmissive diffractive imaging mechanism. The diffractive primary is manufactured from a macromolecule polymer material, which has the merits of light weight, low cost, and high flexibility. Thus, a large-aperture yet lightweight optical imaging system can be achieved. Moreover, the easy deployment (light and packable) of the membrane optical system virtually eliminates the tight surface-shape tolerances and significantly reduces the complexity of the control architecture faced by conventional large reflecting apertures [9,10]. Several missions equipped with diffractive membrane elements include the “Eyeglass” telescope mission, the Membrane Optical Imager Real-time Exploitation (MOIRE) mission, and the FalconSAT-7 mission [9,11,12,13]. However, the inherent spectral dispersion and wavefront distortion of the diffractive primary lead to prominent degradations of the diffractive images, e.g., image blurring, image hazing, and color distortion [14,15]. In addition, considering the long optical path of GEO observation, the wavelength dependence of atmospheric scattering would also cause image chromatic aberration.
China's next-generation GEO remote sensing satellites are planned to carry a diffractive membrane optical system for earth observation with ultra-high spatial resolution. Ground experiments have been carried out for technique verification using a membrane camera with an 80 mm aperture (Φ80) primary. The image degradations mentioned above also appear in the acquired data. In particular, without a chromatic aberration correction (CAC) system, the color distortion of the data is severe. For the development of a real-time CAC processing system and the follow-up remote sensing analyses, an effective and efficient color restoration algorithm is in imperative demand.
Previous studies of image restoration for diffractive imaging mainly focused on improving image definition using single-band data [14,16,17]. Image color restoration, or white balance, has been extensively studied and applied to conventional reflective optical images [18]. Classic and efficient color restoration algorithms include the gray world algorithm (GWA), the perfect reflection method (PRM), and the white patch Retinex (WPR) [19,20]. These methods attempt to achieve human visual color constancy on the basis of various assumptions. The GWA assumes that the average reflectance of each channel in a scene under a neutral light source should be achromatic [21]. The PRM utilizes the maximum value of each image channel to compensate the color cast [22]. The WPR is based on the Retinex theory, which argues that perceived white is associated with the maximum cone signals [19,23,24]. However, these classic approaches may not work satisfactorily under certain conditions [25]. Thus, several state-of-the-art white balance methods have been developed over the last two decades [25,26,27,28,29,30,31]. Huo et al. proposed a robust automatic white balance (RAWB) algorithm based on color temperature estimation by extracting gray color points [25]. Limare et al. put forward the simplest color balance (SCB) method, which stretches the digital number (DN) values of all channels as much as possible [26]. Recently, Mahmoud Afifi and Michael S. Brown published two novel white balance methods, i.e., the interactive white balancing (IWB) method [29,31] and the deep white balance editing (DWBE) method, at CIC 2020 and CVPR 2020, respectively [30]. The IWB is based on nonlinear color-mapping functions [31] and enables interactive white balance manipulation by letting the user select colors from images [29]. The implementation of DWBE is based on a novel deep learning framework. The above-mentioned color restoration methods were carefully tested and assessed on the diffractive images with chromatic aberration.
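To make the two classic assumptions concrete, the following minimal NumPy sketch (our illustration, not the reference implementations compared later) applies gray world and white patch corrections to an 8-bit RGB image; the percentile-based maximum in the white patch variant is an added robustness assumption.

```python
import numpy as np

def gray_world(img):
    """Gray world: scale each channel so its mean matches the global mean DN."""
    img = img.astype(np.float64)
    means = img.reshape(-1, 3).mean(axis=0)          # per-channel mean DN values
    gains = means.mean() / np.maximum(means, 1e-6)   # push the channel averages toward gray
    return np.clip(img * gains, 0, 255).astype(np.uint8)

def white_patch(img, percentile=99.0):
    """Perfect reflector / white patch: map the brightest responses of each channel to white."""
    img = img.astype(np.float64)
    maxima = np.percentile(img.reshape(-1, 3), percentile, axis=0)
    gains = 255.0 / np.maximum(maxima, 1e-6)
    return np.clip(img * gains, 0, 255).astype(np.uint8)
```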
In this letter, we propose an effective and tractable color restoration algorithm for the diffractive optical images of the membrane camera. Guided by the definitions of the color deviation indexes, the algorithm exploits the statistical information of the band histograms and the spectral characteristics of the diffractive images. It incorporates operations of histogram matching, tailoring, and stretching; we therefore name it the HMTS method. The performance of the approach in color restoration is evaluated qualitatively and quantitatively. Experiments indicate that the proposed method is tractable and robust in the chromatic aberration correction of diffractive images. This paper provides technical support for the deployment of a future large-aperture diffractive membrane camera in geostationary orbit and the related earth observation applications.

2. Ground Experiments and Diffractive Image Data

In order to validate the imaging capability of the diffractive membrane camera, several ground experiments were carried out from September 2017 to October 2019 under the sponsorship of the Chinese Ministry of Science and Technology (CMST). A diffractive membrane camera (DMC) with a Φ80 mm primary, shown in Figure 1a, was fabricated and used in the experiments. The focal length of the DMC is 639.7 mm. It can achieve color imaging within the spectral range from 0.486 μm to 0.65 μm and at distances from 100 m to infinity. In laboratory measurements, the DMC has a diffraction efficiency of about 25%. Before the ground experiments, the sensor of the DMC was strictly calibrated in the laboratory to ensure that systematic biases were eliminated. The DMC and the imaging targets were loaded on a folding-jib overhead working truck or a meteorological observation tower for imaging at various distances and under various conditions. Figure 1b shows the bar target loaded on the overhead working truck. Figure 1c shows the work scene of the overhead working truck during the ground experiments at the Huailai remote sensing comprehensive experimental site in Hebei province, China. Figure 1d presents the meteorological tower of the Institute of Atmospheric Physics, Chinese Academy of Sciences, used for loading the DMC in the third ground experiment. A traditional lens with a focal length of 647 mm and a Φ80 mm aperture was also employed synchronously for reference imaging; the reference camera used the same back-end image processing system as the DMC. In particular, in the third ground experiment, to simulate the imaging conditions in GEO, the DMC was loaded on the meteorological tower, which is more than 300 m high, and the targets were placed on the ground. The meteorological tower can only accommodate one researcher working aloft. Considering the weight of the camera systems and the manual operations at high elevation, it was not safe to frequently switch between the DMC and the reference camera. As a result, very few reference images were obtained synchronously in the third ground experiment.
During the ground experiments, more than 20,000 diffractive images were obtained with various imaging parameter settings. The obtained diffractive images can basically be divided into two categories: images of artificial targets (including various bar and resolution targets, toy models, and printed images) and images of natural scenes (random scenes around the experimental site). Based on these data, the imaging quality of the DMC was assessed using the spatial resolution, the modulation transfer function (MTF), the signal-to-noise ratio (SNR), etc. Numerical results indicate that image degradations appear in the single-band DMC images, which is expected and similar to the MOIRE mission [9]. Previous studies have suggested various empirical and physics-based restoration methods for diffractive images [14,15]. After applying certain restoration approaches to the degraded single-band images, we found that the quality of the DMC images basically meets the mission requirements in terms of the spatial resolution, the MTF, and the SNR. However, due to the absence of a chromatic aberration correction system in the DMC, serious color distortions are observed in the color diffractive images. As interpreted previously, this results from the spectral dispersion of the diffractive primary and the wavelength dependence of atmospheric scattering. The chromatic aberration can exceed 60%. Figure 2 shows two color-distorted images obtained in the ground experiments and the corresponding histograms; it is seen that the RGB histograms disperse remarkably across the grayscale. Notably, in the third ground experiment, because of the misuse of an optical filter, all produced images present a greenish color. Such color deviations can therefore be seen as the superimposition of an artificial distortion and the inherent chromatic aberration of diffractive imaging. The greenish distortion depends on the imaging circumstances and cannot be removed systematically. In addition, the atmospheric scattering and the relatively low diffraction efficiency also cause a hazing effect that diminishes the visual quality of the images. Therefore, it is essential to perform color restoration on the diffractive images before remote sensing applications.

3. Methodology

In this section, we present a color restoration algorithm for the diffractive images with color distortion. The aim of the color restoration is to make the images conform to the human visual system, so as to improve their interpretation value. We first introduce several color deviation indexes as the theoretical basis and evaluation criteria of the image color restoration.

3.1. Color Deviation Indexes

3.1.1. Mean Dispersion Index

The classic gray world assumption argues that the average digital number (DN) values of the R, G, and B channels over the entire image are nearly equal. Based on the GWA hypothesis, we propose a mean dispersion index (MDI) to evaluate the image color deviation. For an image with no color deviation, the GWA considers that the differences among the mean DN values of the channels are much smaller than the mean DN value of the whole image [19]. Thus, we define the MDI as
$$\mathrm{MDI} = \left|\frac{I_R}{I_S}-\frac{1}{3}\right| + \left|\frac{I_G}{I_S}-\frac{1}{3}\right| + \left|\frac{I_B}{I_S}-\frac{1}{3}\right| \tag{1}$$
where $I$ represents the DN value at each pixel; $I_R$, $I_G$, and $I_B$ represent the mean DN values of each channel in the RGB color space; and $I_S$ represents the sum of the mean DN values of the three channels. They are defined as
$$I_k = \frac{1}{M}\sum_{i=1}^{M} I_k[i], \quad k = R, G, B \tag{2}$$
$$I_S = I_R + I_G + I_B \tag{3}$$
where M denotes the number of image pixels. From the definition, one can see the MDI is normalized between 0 and 1. For a color image, the closer the MDI is to 0, the less color deviation the image has.
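For reference, the MDI of Equations (1)-(3) can be computed directly from the channel means; the short NumPy sketch below assumes an (H, W, 3) RGB array as input.

```python
import numpy as np

def mean_dispersion_index(img):
    """MDI of Eqs. (1)-(3): dispersion of the per-channel mean DN values."""
    means = img.reshape(-1, 3).astype(np.float64).mean(axis=0)  # I_R, I_G, I_B
    i_s = means.sum()                                           # I_S
    return float(np.abs(means / i_s - 1.0 / 3.0).sum())
```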

3.1.2. Histogram Overlap Area

The histogram overlap area (HOA) is a commonly used index for evaluating the color deviation of an image [32]. It describes the consistency of the three channels in the RGB color space and is defined as
$$\mathrm{HOA} = \sum_{i=0}^{255} \min\left(h_R[i],\, h_G[i],\, h_B[i]\right) \tag{4}$$
with
$$h_k[i] = \frac{H_k[i]}{M}, \quad k = R, G, B \tag{5}$$
$$H_k[i] = \mathrm{number}(I_k = i), \quad k = R, G, B \tag{6}$$
where $h_k$ and $H_k$ represent the probability density histogram and the histogram of the corresponding channel, respectively, and $\mathrm{number}(\cdot)$ is the counting function. Unlike the MDI, an image with an HOA close to 1 has less color deviation.
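A matching NumPy sketch of Equations (4)-(6), assuming an 8-bit (H, W, 3) image, may help make the index concrete.

```python
import numpy as np

def histogram_overlap_area(img):
    """HOA of Eqs. (4)-(6): overlap of the normalized RGB channel histograms."""
    m = img.shape[0] * img.shape[1]                              # number of pixels M
    h = [np.bincount(img[..., k].ravel(), minlength=256) / m    # h_k[i] = H_k[i] / M
         for k in range(3)]
    return float(np.minimum(np.minimum(h[0], h[1]), h[2]).sum())
```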

3.1.3. CIEDE2000 Chromatic Aberration Coefficient

CIEDE2000 is the latest standard chromatic aberration coefficient proposed by the International Commission on Illumination (CIE). It can quantitatively evaluate the difference between two colors with high accuracy, and the evaluation result is closer to the human visual system than previous coefficients. Thus, we employ CIEDE2000 to assess the color restoration effect of the proposed algorithm. The computation of CIEDE2000 is summarized in [33].
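The CIEDE2000 formula itself is lengthy, so in practice an existing implementation can be reused. The sketch below relies on scikit-image's deltaE_ciede2000 (our choice of library, not necessarily the one used in this study) to obtain the mean color difference between a restored image and a reference image of the same size.

```python
import numpy as np
from skimage.color import rgb2lab, deltaE_ciede2000

def mean_ciede2000(img, ref):
    """Mean per-pixel CIEDE2000 difference between two same-sized 8-bit RGB images."""
    lab_img = rgb2lab(img.astype(np.float64) / 255.0)   # convert sRGB to CIELAB
    lab_ref = rgb2lab(ref.astype(np.float64) / 255.0)
    return float(deltaE_ciede2000(lab_ref, lab_img).mean())
```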

3.2. The HMTS Algorithm

To address the chromatic aberration issue of the DMC images shown in Figure 2, we propose a color restoration algorithm framework in this letter. It incorporates the operations of matching, tailoring, and non-linearly stretching the image histograms; we therefore name it the HMTS algorithm. The main idea of the algorithm is to exploit the spectral and statistical information of the diffractive images to achieve a large HOA and a small MDI for the restored images. Specifically, the histogram matching (HM) improves the channel consistency, in compliance with the gray world assumption and human visual perception. Tailoring and stretching (TS) of the channel histograms suppress the noise induced by the HM and improve the image contrast. The implementing steps of the HMTS algorithm are listed as follows; a code-level sketch of the whole procedure is given after the list.
  • Input the diffractive image with color deviation;
  • Compute the mean DN values of each channel (i.e., $I_R$, $I_G$, and $I_B$) using (2).
  • Perform histogram matching of the other two channels to the channel with the medium average, which is selected as the reference. Histogram matching is a commonly used image enhancement method that matches an image histogram to that of a reference image. In this work, the inter-channel HM can significantly eliminate the luminance deviations among the channels of the diffractive images. Since the HM is a well-developed image processing algorithm, one can refer to [18] for its details. For simplicity, here we show the results of the inter-channel HM for one of the test images in Figure 3. From the figure, one can see that the brightness of the original blue channel is significantly increased after the HM to the red channel, and the histogram of the blue channel is reshaped to match that of the red one. However, as shown in Figure 3, the image noise of the blue channel is also amplified by the HM operation. This phenomenon was also reported in [18]. These noise signals generally have very large or very small DN values; in other words, they concentrate at the two ends of the grayscale histogram. A typical way to remove such noise is filtering. However, image filtering would be time consuming, particularly for remote sensing images. Considering our motivation of supporting a real-time chromatic aberration correction system for the membrane diffractive camera, we instead tailor the histogram at its two ends to efficiently suppress the noise. Then, for the tailored histogram, we implement a non-linear histogram stretching procedure to improve the image contrast reduced by the hazing effect of diffractive imaging. The details of histogram tailoring and stretching are given in the following step.
  • Tailor and non-linearly stretch the histogram of each channel using
    $$g_k(x,y) = \begin{cases} 0, & 0 < f_k(x,y) < I_{k,bm} \\ \mathrm{HistogramStretching}, & I_{k,bm} \le f_k(x,y) \le I_{k,tp} \\ 255, & I_{k,tp} < f_k(x,y) < 255 \end{cases} \quad k = R, G, B \tag{7}$$
    where $f_k(x,y)$ and $g_k(x,y)$ denote the DN values of the pixel at $(x,y)$ in the original and restored images for the corresponding channel. $I_{k,bm}$ and $I_{k,tp}$ are the bottom and top DN cutoffs, which are generally determined by a cutting percentage between 0.01% and 2%. Here, the cutting percentage is defined as the number of pixels with DN smaller than $I_{k,bm}$ or larger than $I_{k,tp}$ divided by the total number of pixels. Thus, the DN cutoffs can be obtained from the gray cumulative histogram for a specific cutting percentage. For the non-linear stretching of the tailored histogram between the DN cutoffs, we follow [34] and use a quadratic transformation function, described as follows
    $$I_{k,str}(I_k) = \alpha I_k^2 + \beta I_k + \gamma, \quad k = R, G, B \tag{8}$$
    where $I_{k,str}$ denotes the new DN value after the gray transformation, and $\alpha$, $\beta$, and $\gamma$ are coefficients computed from
    $$\begin{cases} I_{k,str}(I_{k,bm}) = \alpha I_{k,bm}^2 + \beta I_{k,bm} + \gamma = 0 \\ I_{k,str}(I_{k,mn}) = \alpha I_{k,mn}^2 + \beta I_{k,mn} + \gamma = \eta\,\frac{255}{2} + (1-\eta)\, I_{k,mn} \\ I_{k,str}(I_{k,tp}) = \alpha I_{k,tp}^2 + \beta I_{k,tp} + \gamma = 255 \end{cases} \tag{9}$$
    $$\begin{bmatrix} \alpha \\ \beta \\ \gamma \end{bmatrix} = \begin{bmatrix} I_{k,bm}^2 & I_{k,bm} & 1 \\ I_{k,mn}^2 & I_{k,mn} & 1 \\ I_{k,tp}^2 & I_{k,tp} & 1 \end{bmatrix}^{-1} \begin{bmatrix} 0 \\ \eta\,\frac{255}{2} + (1-\eta)\, I_{k,mn} \\ 255 \end{bmatrix} \tag{10}$$
    where $I_{k,mn}$ is the mean DN value of the corresponding channel, and $\eta \in [0,1]$ is the brightness weight, selected as 0.4 in this study.
    There are two reasons for the histogram tailoring and stretching. Firstly, in the previous histogram matching step, the image noise of the matched channel could be amplified by matching to the reference channel. Those noise signals typically have extreme values, so tailoring the two ends of the grayscale histogram helps to suppress the noise. Secondly, the histogram tailoring also aims to improve the image contrast and definition in order to remove the hazing effect. From (7), it is noted that both histogram tailoring and stretching depend on the bottom and top DN cutoffs of the histogram, i.e., $I_{k,bm}$ and $I_{k,tp}$. The selection of the DN cutoffs determines the final color restoration effect. Theoretically, removing the hazing effect increases the image depth of field, so the image information is increased. On the other hand, excessive histogram tailoring reduces the image information. Thus, the basic principle for selecting the histogram cutoffs is to reduce the image noise and enhance the contrast as much as possible while losing as little image information as possible. As a result, we repeatedly perform histogram tailoring and non-linear stretching with various cutoffs to obtain the image with maximum information. The initial cutting percentage is set to 0.01%. The information entropy (IE) is employed to evaluate the image information and serves as the stopping criterion. The IE is defined as [35]
    $$E = -\sum_{i=0}^{255} p_i \log_2 p_i \tag{11}$$
    where $p_i$ denotes the probability of grayscale $i$, which can be calculated from the image histograms (see (5) and (6)) as
    $$p_i = \frac{\sum_{k} \mathrm{number}(I_k = i)}{3 \times M}, \quad k = R, G, B \tag{12}$$
    By repeating this tailoring and non-linear stretching procedure over a grid of cutting percentages, the image with the largest IE is obtained. The corresponding cutting percentage removes the noise while retaining the useful information. The detailed algorithm of histogram tailoring and stretching is described as follows (Algorithm 1).
    Algorithm 1. Histogram tailoring and stretching.
    Inputs: diffractive image after inter-channel HM,
         initial cutting percentage cp = 0.1%,
         initial information entropy ie = 0;
    Outputs: color restored image.
    1: for iteration i do
    2:  if cp ≤ 2% and ie(i) ≥ ie(i−1) then
    3:   compute the gray cumulative histogram (GCH) and $I_{k,mn}$ of each channel;
    4:   compute $I_{k,bm}$ and $I_{k,tp}$ using cp and the GCH for each channel;
    5:   compute α, β and γ using (10);
    6:   do the gray transformation using (8);
    7:   compute the image information entropy ie;
    8:   cp ← cp + 0.1%;
    9:  else
    10:   break;
    11:  end if
    12: end for
    13: return color restored image
  • Output the restored image.
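The following NumPy sketch puts the steps above together. It is a hedged re-implementation for illustration only, not the authors' code: the helper names, the CDF-based histogram matching, and the use of np.percentile for the DN cutoffs are our own choices; the quadratic stretch follows Equations (8)-(10) and the loop follows Algorithm 1 (cutting percentage stepped by 0.1% up to 2%).

```python
import numpy as np

def match_histogram(src, ref):
    """Match the histogram of channel `src` to that of channel `ref` (both uint8)."""
    s_val, s_cnt = np.unique(src.ravel(), return_counts=True)
    r_val, r_cnt = np.unique(ref.ravel(), return_counts=True)
    s_cdf = np.cumsum(s_cnt) / src.size
    r_cdf = np.cumsum(r_cnt) / ref.size
    mapping = np.interp(s_cdf, r_cdf, r_val)               # source CDF -> reference DN
    return np.interp(src.ravel(), s_val, mapping).reshape(src.shape)

def stretch_channel(ch, cp, eta=0.4):
    """Tailor one channel by cutting percentage `cp` and apply the quadratic
    stretch of Eqs. (8)-(10); `ch` is a float array in [0, 255]."""
    bm, tp = np.percentile(ch, [100.0 * cp, 100.0 * (1.0 - cp)])   # I_bm, I_tp
    mn = ch.mean()                                                 # I_mn
    A = np.array([[bm**2, bm, 1.0], [mn**2, mn, 1.0], [tp**2, tp, 1.0]])
    b = np.array([0.0, eta * 255.0 / 2.0 + (1.0 - eta) * mn, 255.0])
    alpha, beta, gamma = np.linalg.solve(A, b)                     # Eq. (10)
    out = alpha * ch**2 + beta * ch + gamma                        # Eq. (8)
    out[ch < bm] = 0.0                                             # Eq. (7), tailoring
    out[ch > tp] = 255.0
    return np.clip(out, 0.0, 255.0)

def entropy(img):
    """Information entropy of Eq. (11) over all three channels of an 8-bit image."""
    hist = np.bincount(img.astype(np.uint8).ravel(), minlength=256)
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def hmts(img, eta=0.4):
    """Hedged sketch of the HMTS pipeline (uint8 RGB in, uint8 RGB out)."""
    img = img.astype(np.float64)
    order = np.argsort(img.reshape(-1, 3).mean(axis=0))   # channels sorted by mean DN
    ref = order[1]                                         # medium-average channel as reference
    for k in (order[0], order[2]):                         # inter-channel histogram matching
        img[..., k] = match_histogram(img[..., k].astype(np.uint8),
                                      img[..., ref].astype(np.uint8))
    best, best_ie, cp = img, -np.inf, 0.001                # Algorithm 1: grid over cutting %
    while cp <= 0.02:
        out = np.dstack([stretch_channel(img[..., k], cp, eta) for k in range(3)])
        ie = entropy(out)
        if ie < best_ie:                                   # entropy stopped increasing
            break
        best, best_ie, cp = out, ie, cp + 0.001
    return np.clip(best, 0, 255).astype(np.uint8)
```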

4. Results and Evaluations

In Figure 4, the proposed HMTS algorithm is applied to the color-distorted DMC images. The results are compared with those of the GWA, the PRM, the WPR, the RAWB, the SCB, the IWB, and the DWBE. In particular, the automatic white-balance correction module of the IWB was used. The DWBE was trained on the dataset available at http://cvil.eecs.yorku.ca/projects/public_html/sRGB_WB_correction/dataset.html (accessed on 1 February 2021). To comprehensively assess the performance of the various methods in restoring the chromatic aberration of diffractive images, a total of 8 test diffractive images covering all three ground experiments are used. For a fair and objective evaluation, we also present the corresponding reference images. However, for test image 1 (a warning board in the field) obtained in the first experiment, the reference camera unfortunately failed to capture a corresponding image. In addition, as introduced in Section 2, due to the limitations of the experimental environment, very few reference images were obtained synchronously in the third ground experiment. Thus, for those diffractive images with no reference image, we used other images to show the original colors of the targets. Notably, such a reference image is the digital image used for printing, whose colors differ slightly from those of the printed target that was imaged. Table 1 presents the MDI and HOA indexes, shown in blue and red, respectively, of the original images and of the color-restored images obtained with the various algorithms. For the convenience of evaluation and discussion, the smallest two MDIs and the largest two HOAs are emphasized in black bold in Table 1.
From Figure 4, one can see that the proposed HMTS method generally has the best color restoration performance for the test diffractive images, and the visual effects are close to those of the reference images. This is also illustrated in Table 1, where the results of HMTS basically have the smallest MDIs and the largest HOAs. In addition, by accounting for the hazing effect caused by the low diffraction efficiency and atmospheric scattering, the restored images of HMTS present large image contrast and depth of field. For test images 2 and 3, considering the cloudy and hazy weather of the imaging circumstances (note this in the reference images), the diffractive images after color restoration are slightly overexposed due to the sky background. Among the comparison methods, the performance of the GWA is barely satisfactory. The chromatic aberrations in test images 1, 6, 7, and 8 are mostly removed by the GWA, which is also indicated by the color deviation indexes in Table 1. Notably, since the definition of the MDI obeys the basic assumption of the GWA, all the images restored by the GWA have very small MDIs. Based on the visual evaluation and the quantitative indexes, the PRM and the WPR also perform relatively well for test images 1 and 7 and for test images 4, 7, and 8, respectively. The RAWB method yields acceptable results for test images 1 and 4; however, it gives almost no improvement to the diffractive images with greenish color deviation obtained in the third ground experiment. The SCB method stretches the histograms of all channels as much as possible, so all of its restored images have very high contrast; in terms of the MDI, the SCB has relatively good performance for test images 6, 7, and 8. The IWB and the DWBE also perform differently on the various test images: the IWB has better color restoration effects on test images 6 and 8, while the DWBE performs well on test images 2 and 3. In addition, for test image 5 (the white tower), it should be noted that almost all comparison methods fail to give satisfactory results, whereas according to the HOA index the restored result of the HMTS has a much higher HOA than those of the other approaches.
In Table 2, we present the CPU time consumption of the various algorithms to compare their computational efficiency. The color restoration experiments were carried out on a device equipped with an Intel Core i7-8700 CPU at 3.20 GHz and 32 GB of memory. According to the average time consumption, the WPR is the fastest algorithm, followed closely by the PRM and our proposed HMTS, while the DWBE and the RAWB have the lowest computational efficiency. Considering the goals of good color restoration performance and real-time processing capability for the CAC system development, the proposed HMTS algorithm has reasonable computational efficiency.
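Per-image timings of this kind can be collected with a simple harness such as the sketch below; time.process_time is used here as a proxy for CPU time, and restore_fn stands for any of the compared algorithms (an assumed callable, e.g. the hmts sketch above).

```python
import time

def benchmark(restore_fn, images, repeats=3):
    """Report the mean CPU seconds restore_fn needs per image.

    images: iterable of (name, uint8 RGB ndarray) pairs.
    """
    for name, img in images:
        start = time.process_time()
        for _ in range(repeats):
            restore_fn(img)
        elapsed = (time.process_time() - start) / repeats
        print(f"{name}: {elapsed:.2f} s")
```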
To further verify the robustness of the HMTS algorithm, more experiments were conducted on the seriously color-distorted diffractive images obtained in the third ground experiment. In Figure 5, the HMTS algorithm is applied to various resolution targets and to image targets of ocean and land scenes, and the corresponding MDI and HOA of the images are computed as well. By comparing the images before and after processing, one can see that the colors of the diffractive images are well restored and the visual effects are significantly improved. It is also noted that the HOAs increase and the MDIs decrease for all test images after applying the HMTS algorithm. Considering all experimental images shown in Figure 4 and Figure 5 and taking the statistical averages of the color deviation indexes, the average MDI decreases from 0.349 to 0.037 and the average HOA increases from 0.154 to 0.776 after the color restoration with HMTS. These results indicate that the HMTS algorithm has good performance and strong robustness in the color restoration of diffractive images of various scenes and targets.
Moreover, we utilize a standard color-checker (SCC) chart produced by X-Rite to further assess the effect of the HMTS in color restoration. The CIEDE2000 coefficients are computed with reference to the SCC image taken by the reference camera. In addition, several comparison methods, i.e., the GWA, the WPR, the SCB, and the IWB, which have relatively good color restoration performance for the images obtained in the third ground experiment, are also included in the comparison. Table 3 compares the SCC images and the corresponding CIEDE2000 coefficients before and after color restoration. As shown in the table, compared with the other four methods, the SCC images restored by the HMTS have visual effects closer to those taken by the reference camera. The mean CIEDE2000 coefficient of the SCC images after color restoration using the HMTS is only about one-seventh of that before color restoration and is the smallest among the comparison methods. This indicates that about 85% of the color deviation in the diffractive images has been removed.

5. Conclusions

This letter presents a color restoration algorithm, the HMTS, for the diffractive optical images of a membrane camera, which is planned to be carried on China's next-generation GEO remote sensing satellites. The HMTS algorithm incorporates the steps of matching, tailoring, and non-linearly stretching the image histograms to achieve color restoration. Experimental results show that the images restored by the proposed algorithm comply well with human visual perception. The color deviation indexes also quantitatively validate the good performance and robustness of the HMTS in restoring the chromatic aberration of diffractive images. In addition, the proposed method has reasonable computational efficiency. Note that the HMTS method is theoretically based on channel consistency, which makes it a global automatic white balance (AWB) method. Thus, it would be invalid for images of homogeneous scenes whose spectral properties differ considerably between channels. In fact, as stated in [25], this can be seen as an intrinsic limitation of global AWB methods for color-distorted images dominated by only one or two colors. Yet, this situation is unlikely for the earth observation of a GEO satellite, because the wide imaging range would include diverse targets and scenes. With the ongoing development of the CAC system, this study provides technical support for the deployment of future GEO diffractive optical systems and the relevant remote sensing applications.

Author Contributions

Conceptualization, Y.D. and X.Y.; methodology, Y.D. and Y.M.; investigation, Y.D. and Y.M.; data curation, Y.M.; writing—original draft preparation, Y.D.; writing—review and editing, Y.D., X.Y., Y.M., and C.X.; funding acquisition, X.Y. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded in part by The National Key R&D Program of China, grant number 2016YFB0500200 and 2016YFB0500204, in part by the China Postdoctoral Science Foundation, grant number 2020M680554 and in part by the Open Fund of State Key Laboratory of Remote Sensing Science, grant number OFSLRSS202009.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data presented in this study are available on request from the corresponding authors. The data are not publicly available due to the funding policy.

Acknowledgments

The authors would like to thank the editors and the anonymous reviewers for their constructive comments and suggestions that substantially improved this letter. The codes of the IWB and the DWBE algorithms were provided by Mahmoud Afifi from York University and are available at https://github.com/mahmoudnafifi/Interactive_WB_correction and https://github.com/mahmoudnafifi/Deep_White_Balance.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Flohrer, T.; Schildknecht, T.; Musci, R.; Stoveken, E. Performance estimation for GEO space surveillance. Adv. Space Res. 2005, 35, 1226–1235.
  2. Bonino, L.; Bresciani, F.; Piasini, G.; Pisani, M.; Cabral, A.; Rebordão, J.; Musso, F. An interferometer for high-resolution optical surveillance from GEO—Internal metrology breadboard. In Proceedings of the International Conference on Space Optics 2006, Noordwijk, The Netherlands, 27–30 June 2006.
  3. Whiteaker, L.; Marshalek, K.G.; Domber, R.L. Large Aperture Diffractive Receiver for Deep Space Optical Communications. In Proceedings of the Applications of Lasers for Sensing and Free Space Communications, Arlington, VA, USA, 7–11 June 2015.
  4. Lightsey, P.A.; Atkinson, C.; Clampin, M.; Feinberg, L.D. James Webb Space Telescope: Large deployable cryogenic telescope in space. Opt. Eng. 2012, 51.
  5. Atcheson, P.; Domber, J.; Whiteaker, K.; Britten, J.A.; Dixit, S.N.; Farmer, B. MOIRE—Ground Demonstration of a Large Aperture Diffractive Transmissive Telescope. In Space Telescopes and Instrumentation 2014: Optical, Infrared, and Millimeter Wave; International Society for Optics and Photonics: Bellingham, WA, USA, 2014; Volume 9143.
  6. Jiao, J.C.; Wang, B.H.; Wang, C.; Zhang, Y.; Jin, J.G.; Liu, Z.K.; Su, Y.; Ruan, N.J. Study on high resolution membrane-based diffractive optical imaging on geostationary orbit. Int. Arch. Photogramm. 2017, 42, 371.
  7. Zhang, H.L.; Liu, H.; Xu, W.B.; Lu, Z.W. Large aperture diffractive optical telescope: A review. Opt. Laser Technol. 2020, 130.
  8. Zhao, W.; Wang, X.; Liu, H.; Lu, Z.F.; Lu, Z.W. Development of space-based diffractive telescopes. Front. Inform. Technol. Electron. Eng. 2020, 21, 884–902.
  9. Hyde, R.; Dixit, S.; Weisberg, A.; Rushford, M. Eyeglass: A very large aperture diffractive space telescope. Highly Innov. Space Telesc. Concepts 2002, 4849, 28–39.
  10. Wang, R.Q.; Zhang, Z.Y.; Guo, C.L.; Xue, D.L.; Zhang, X.J. Effects of fabrication errors on diffraction efficiency for a diffractive membrane. Chin. Opt. Lett. 2016, 14.
  11. Atcheson, P.; Stewart, C.; Domber, J.; Whiteaker, K.; Cole, J.; Spuhler, P.; Seltzer, A.; Britten, J.A.; Dixit, S.N.; Farmer, B.; et al. MOIRE—Initial Demonstration of a Transmissive Diffractive Membrane Optic for Large Lightweight Optical Telescopes. In Space Telescopes and Instrumentation 2012: Optical, Infrared, and Millimeter Wave; SPIE: Amsterdam, The Netherlands, 2012; Volume 8442.
  12. Andersen, G.; Asmolova, O.; Dearborn, M.E.; McHarg, M.G. FalconSAT-7: A Membrane Photon Sieve CubeSat Solar Telescope. In Space Telescopes and Instrumentation 2012: Optical, Infrared, and Millimeter Wave; SPIE: Amsterdam, The Netherlands, 2012; Volume 8442.
  13. Andersen, G.P.; Asmolova, O. FalconSAT-7: A membrane space telescope. In Space Telescopes and Instrumentation 2014: Optical, Infrared, and Millimeter Wave; SPIE: Montréal, QC, Canada, 2014; Volume 9143.
  14. Zhi, X.Y.; Jiang, S.K.; Zhang, W.; Wang, D.W.; Li, Y. Image degradation characteristics and restoration based on regularization for diffractive imaging. Infrared Phys. Technol. 2017, 86, 226–238.
  15. MacEwen, H.A.; Breckinridge, J.B. Large diffractive/refractive apertures for space and airborne telescopes. In Sensors and Systems for Space Applications VI; SPIE: Baltimore, MD, USA, 2013; Volume 8739.
  16. Asmolova, O.; Andersen, G.; Dearborn, M.E.; McHarg, M.G.; Quiller, T.; Dickinson, T.W. Optical testing of a membrane diffractive optic for space-based solar. In Practical Holography XXVIII: Materials and Applications; SPIE: San Francisco, CA, USA, 2014; Volume 9006.
  17. Jiang, S.K.; Zhi, X.Y.; Dong, Y.; Zhang, W.; Wang, D.W. Inversion restoration for space diffractive membrane imaging system. Opt. Laser Eng. 2020, 125.
  18. Gonzalez, R.C.; Woods, R.E. Digital Image Processing; Pearson Education: London, UK, 2011.
  19. Barnard, K.; Martin, L.; Coath, A.; Funt, B. A comparison of computational color constancy algorithms—Part II: Experiments with image data. IEEE Trans. Image Process. 2002, 11, 985–996.
  20. Ebner, M. Combining white-patch retinex and the gray world assumption to achieve color constancy for multiple illuminants. Pattern Recognit. Proc. 2003, 2781, 60–67.
  21. Gijsenij, A.; Gevers, T.; van de Weijer, J. Computational Color Constancy: Survey and Experiments. IEEE Trans. Image Process. 2011, 20, 2475–2489.
  22. Agarwal, V.; Abidi, B.R.; Koschan, A.; Abidi, M.A. An Overview of Color Constancy Algorithms. J. Pattern Recognit. Res. 2006, 1, 42–54.
  23. Brainard, D.H.; Wandell, B.A. Analysis of the Retinex Theory of Color-Vision. J. Opt. Soc. Am. A 1986, 3, 1651–1661.
  24. Lam, E.Y. Combining gray world and Retinex theory for automatic white balance in digital photography. In Proceedings of the Ninth International Symposium on Consumer Electronics, Macau SAR, China, 14–16 June 2005; pp. 134–139.
  25. Huo, J.Y.; Chang, Y.L.; Wang, J.; Wei, X.X. Robust automatic white balance algorithm using gray color points in images. IEEE Trans. Consum. Electron. 2006, 52, 541–546.
  26. Limare, N.; Lisani, J.-L.; Morel, J.-M.; Petro, A.B.; Sbert, C. Simplest Color Balance. Image Process. Line 2011, 1, 297–315.
  27. Barron, J.T. Convolutional Color Constancy. In Proceedings of the 2015 IEEE International Conference on Computer Vision (ICCV), Santiago, Chile, 7–13 December 2015.
  28. Cheng, D.L.; Prasad, D.K.; Brown, M.S. Illuminant estimation for color constancy: Why spatial-domain methods work and the role of the color distribution. J. Opt. Soc. Am. A 2014, 31, 1049–1058.
  29. Afifi, M.; Brown, M.S. Interactive White Balancing for Camera-Rendered Images. arXiv 2020, arXiv:2009.12632.
  30. Afifi, M.; Brown, M.S. Deep White-Balance Editing. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Seattle, WA, USA, 14–19 June 2020.
  31. Afifi, M.; Price, B.; Cohen, S.; Brown, M.S. When Color Constancy Goes Wrong: Correcting Improperly White-Balanced Images. In Proceedings of the 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Long Beach, CA, USA, 16–20 June 2019.
  32. Jiang, T.; Nguyen, D.; Kuhnert, K.D. Auto White Balance Using the Coincidence of Chromaticity Histograms. In Proceedings of the 8th International Conference on Signal Image Technology & Internet Based Systems (SITIS 2012), Sorrento, Italy, 25–29 November 2012; pp. 201–208.
  33. Luo, M.R.; Cui, G.; Rigg, B. The development of the CIE 2000 color-difference formula: CIEDE2000. Color Res. Appl. 2001, 26, 340–350.
  34. Qiao, K.; Zhi, X.Y.; Jiang, S.K.; Zhang, L.; Yin, Z.K. Image inversion and quality enhancement for space large aperture diffractive imaging system. Opt. Precis. Eng. 2019, 7, 1465–1472.
  35. Gonzalez, R.C.; Woods, R.E.; Eddins, S.L. Digital Image Processing Using MATLAB; Pearson Prentice Hall: Chennai, India, 2004.
Figure 1. (a) Φ80 mm diffractive membrane camera. (b) Bar target loaded on the overhead working truck. (c) Work scene of using the overhead working truck and (d) the meteorological tower for loading the diffractive membrane camera (DMC) during the ground experiments.
Figure 2. Color distorted images of the DMC and the corresponding histograms. (a) Test image 1. (b) Test image 2.
Figure 3. Histogram matching between channels of the diffractive image with chromatic aberration.
Figure 4. Comparison of color restoration effect of various algorithms for diffractive images.
Figure 5. Results of the proposed HMTS for diffractive images with chromatic aberration in the third ground experiment. Blue and red numbers denote the MDI and HOA of the corresponding images.
Table 1. Comparison of the mean dispersion index (MDI) and histogram overlap area (HOA) of color restoration images using various algorithms. Blue and red numbers denote the MDIs and HOAs of the corresponding results with the smallest two MDIs and the largest two HOAs being emphasized in black bold.
Method (MDI/HOA) | Test Image 1 | Test Image 2 | Test Image 3 | Test Image 4 | Test Image 5 | Test Image 6 | Test Image 7 | Test Image 8
Original image | 0.172/0.132 | 0.055/0.366 | 0.044/0.299 | 0.217/0.166 | 0.450/3.976 × 10−6 | 0.449/0.012 | 0.473/0.186 | 0.411/0.012
GWA | 2.887 × 10−4/0.699 | 1.887 × 10−4/0.415 | 0.002/0.431 | 6.948 × 10−4/0.161 | 0.003/0.594 | 2.803 × 10−4/0.549 | 4.980 × 10−4/0.864 | 5.080 × 10−4/0.556
PRM | 0.016/0.607 | 0.025/0.439 | 0.033/0.380 | 0.067/0.230 | 0.128/0.171 | 0.121/0.237 | 0.035/0.831 | 0.044/0.403
WPR | 0.045/0.456 | 0.022/0.484 | 0.037/0.433 | 0.090/0.389 | 0.070/0.335 | 0.047/0.302 | 0.022/0.837 | 0.029/0.619
RAWB | 0.017/0.779 | 0.005/0.348 | 0.005/0.369 | 0.088/0.372 | 0.450/3.975 × 10−6 | 0.450/0.012 | 0.474/0.185 | 0.411/0.012
SCB | 0.159/0.106 | 0.059/0.139 | 0.084/0.122 | 0.129/0.083 | 0.205/0.050 | 0.027/0.105 | 0.031/0.055 | 0.036/0.075
IWB | 0.107/0.236 | 0.061/0.406 | 0.052/0.298 | 0.110/0.269 | 0.103/0.280 | 0.010/0.416 | 0.083/0.680 | 0.047/0.353
DWBE | 0.088/0.288 | 0.013/0.448 | 0.011/0.518 | 0.145/0.154 | 0.167/0.087 | 0.188/0.112 | 0.110/0.670 | 0.330/0.171
HMTS | 0.005/0.923 | 9.301 × 10−4/0.769 | 0.002/0.781 | 0.043/0.385 | 0.029/0.847 | 0.037/0.663 | 0.006/0.904 | 0.001/0.862
Reference image | — | 0.015/0.682 | 0.008/0.634 | 0.066/0.193 | — | 0.017/0.432 | 0.004/0.887 | —
Table 2. Comparison of the CPU time consumptions of various algorithms in color restorations for the experimental diffractive images.
Method | Test Image 1 | Test Image 2 | Test Image 3 | Test Image 4 | Test Image 5 | Test Image 6 | Test Image 7 | Test Image 8 | Average
GWA | 0.89 | 0.61 | 0.67 | 0.80 | 0.83 | 0.70 | 0.67 | 0.70 | 0.73
PRM | 0.27 | 0.17 | 0.14 | 0.55 | 0.33 | 0.19 | 0.16 | 0.38 | 0.27
WPR | 0.08 | 0.01 | 0.02 | 0.08 | 0.09 | 0.02 | 0.02 | 0.03 | 0.04
RAWB | 7.34 | 3.68 | 4.52 | 5.05 | 4.20 | 2.52 | 2.38 | 2.26 | 3.99
SCB | 2.11 | 0.55 | 0.56 | 1.53 | 1.98 | 1.03 | 0.98 | 0.80 | 1.19
IWB | 1.75 | 0.83 | 0.80 | 1.03 | 1.20 | 0.91 | 0.90 | 1.28 | 1.08
DWBE | 7.20 | 1.60 | 1.56 | 4.01 | 4.96 | 3.18 | 3.08 | 5.05 | 3.83
HMTS | 0.69 | 0.04 | 0.14 | 0.59 | 0.67 | 0.33 | 0.22 | 0.23 | 0.36
Table 3. Comparisons of the colorimetric chart images and the corresponding CIEDE2000 coefficients before and after implementing color restoration using various methods.
(The SCC image thumbnails taken by the reference camera and produced by each method are omitted here; only the CIEDE2000 coefficients are listed.)
SCC Chart | Before Color Restoration | GWA | WPR | SCB | IWB | HMTS
1 | 26.4894 | 9.7787 | 2.4682 | 2.4583 | 13.6672 | 1.3466
2 | 26.9623 | 9.0362 | 26.6412 | 16.3484 | 11.4082 | 6.8833
3 | 18.6755 | 7.5564 | 18.0491 | 8.1791 | 2.4199 | 2.7182
4 | 31.0801 | 6.3244 | 18.6522 | 8.0410 | 10.7433 | 1.5030
5 | 35.8459 | 12.3313 | 26.3896 | 13.7142 | 17.8162 | 6.2752
6 | 33.9223 | 9.0784 | 17.9978 | 7.6340 | 9.0625 | 3.5805
7 | 28.3517 | 6.5519 | 25.3978 | 13.9728 | 14.0102 | 7.9623
Mean Value | 28.7610 | 8.6653 | 19.3708 | 10.0497 | 11.3039 | 4.3242
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
