Article

Hierarchical Feature Enhancement Algorithm for Multispectral Infrared Images of Dark and Weak Targets

1 School of Optoelectronic Engineering, Changchun University of Science and Technology, Changchun 130022, China
2 Jilin Provincial Key Laboratory of Space Optoelectronics Technology, Changchun 130022, China
* Author to whom correspondence should be addressed.
Photonics 2023, 10(7), 805; https://doi.org/10.3390/photonics10070805
Submission received: 6 June 2023 / Revised: 27 June 2023 / Accepted: 7 July 2023 / Published: 11 July 2023

Abstract

A multispectral infrared zoom optical system design and a single-frame hierarchical guided filtering image enhancement algorithm are proposed to address the technical problems of single-spectrum infrared imaging of faint targets: low contrast, blurred edges, and weak signal strength, which leave the target easily drowned out by noise. The multispectral infrared zoom optical system, based on complex achromatic theory and mechanical positive-group compensation, can simultaneously acquire multispectral image information for faint targets. The single-frame hierarchical guided filtering image enhancement algorithm extracts the background features and detail features of faint targets layer by layer and then fuses them with weights, effectively enhancing the target while suppressing interference from complex backgrounds and noise; it also avoids the increased data storage and real-time processing burden of multi-frame methods. The proposed optical system design and image enhancement algorithm were verified experimentally, both separately and in combination, and the enhancement was significant: used together, they improve Mean Square Error (MSE) by 14.32, Signal-to-Noise Ratio (SNR) by 11.64, Peak Signal-to-Noise Ratio (PSNR) by 12.78, and Structural Similarity (SSIM) by 14.0% compared to guided filtering. This research lays a theoretical foundation for infrared detection and tracking of clusters of faint targets.

1. Introduction

Passive infrared thermal imaging technology does not require a radiation source and is mainly based on the temperature-difference response: background infrared radiation is absorbed by the gas and produces an energy difference with the surrounding background, which can be imaged directly by the detector [1]. Single-spectrum infrared detection obtains relatively little information about the target: the infrared image contrast is low, target details are not obvious, and the target is easily drowned out by noise and background information. Therefore, multispectral high-sensitivity infrared target detection methods and infrared image enhancement algorithms have gradually become a research focus around the world.
Multispectral infrared optical systems combine two wavelengths, the mid-wave infrared and the long-wave infrared, in the same channel, giving the detector more detection channels, more comprehensive and accurate information on faint targets, higher sensitivity, and a more compact system structure. Infrared optical systems are mostly fixed-focus systems due to the special characteristics of the material and the detector itself, or the focal length of the system can only be switched between a few fixed focal lengths, resulting in discontinuous changes in the field of view [2]. This is why the study of continuous-zoom optical systems in the infrared is of particular importance. The combination of optical system design and the application of infrared image enhancement algorithms can effectively improve the detection and identification of faint targets and solve the problems of easy concealment and difficult identification of faint targets in the detection process [3].
Dmitry Reshidko et al. designed a dual-field common-path zoom optical system using a cooled wide-band (1–5 μm) focal plane detector in order to achieve simultaneous imaging of multiple spectral bands [4]. Yang et al. designed an uncooled infrared dual-band continuous zoom optical system, introducing diffractive surfaces and an even aspheric surface for aberration correction, using a common aperture to combine the mid-wave and long-wave infrared bands into the same optical path and then imaging them separately through a beam splitter [5]. Yang-Yang Su et al. designed a cooled infrared dual-band zoom optical system using a secondary imaging approach to achieve 100% cold-stop efficiency and aberration correction for the mid- and long-wave bands with a common optical path [6]. Deng Chunhua et al. [7] proposed an infrared image enhancement algorithm based on low-frequency redistribution and edge enhancement that effectively improves the contrast and detail smoothing of infrared images. Ge Peng et al. [8] proposed an infrared image detail enhancement algorithm based on guided-filter image layering, which has better edge retention and effectively avoids "pseudo-edge" artifacts after enhancement. However, no research has been conducted on the design of uncooled multispectral infrared optical systems combined with the application of image enhancement algorithms.
This paper addresses the technical challenges of low contrast, blurred edges, and weak signal strength in infrared images of faint targets acquired in complex backgrounds, the high false alarm rate of single-frame detection algorithms, and the increased data storage and computational load of multi-frame processing [9,10]. We propose a multispectral infrared zoom optical system and a single-frame hierarchical guided filtering image enhancement algorithm. The optical system design acquires more effective information about the target, and the algorithm improves the overall detection and recognition performance. The technical problems of difficult identification of faint targets, blurred edges, weak signal strength, and low contrast between target and background are thereby solved, while the increased data storage and real-time problems of multi-frame processing are avoided.
This paper discusses the research in five parts. First, the introduction describes the research problem, motivation, and value in detail and briefly reviews common infrared optical structures, target enhancement algorithms, and the problems to be solved. Second, the theoretical derivation and workflow of the algorithm are presented in the Methods section. Third, the effect of combining the optical structure design with the algorithm is discussed in detail in the simulations. The proposed method is then experimentally validated in the experimental section and compared with current algorithms, with favourable processing results. Finally, conclusions are drawn.

2. Methods

2.1. Model Overview

The algorithm studied in this paper is a multispectral single-frame infrared image processing algorithm, whose overall structure is shown in Figure 1. First, the acquired infrared image is decomposed by bit plane to obtain the image information of the background and target on different layers. Second, an improved layered guided filtering algorithm is used to extract features from the background and the target details, and the effective target information components are obtained through the target feature operator. Next, the layered target detail feature information is fused with the higher-level image information, and the fused image is enhanced. Finally, the background feature information is weighted and fused with the enhanced detail information to achieve the enhancement of dark and weak targets.

2.2. Improved Guided Filtering Algorithm

When the target is far from the detector, it appears dark and weak, and in a complex background the noise strongly affects the image detail information. The amount of target information obtained through an ordinary infrared lens is relatively small, and when a single frame is processed, the useful target information is drowned out by complex background or noise information, making it difficult to obtain further feature information about the target. The improved hierarchical guided filtering algorithm proposed in this paper is based on the theory of infrared image layering: it obtains the target background feature information and the target detail feature information and uses the guided filtering algorithm to extract and fuse the layered feature information. The detailed algorithm flow is shown in Figure 2.
The principal derivation of the improved hierarchical guided filtering algorithm proceeds as follows:
p_i = q_i + e_i + \sum_{i=4}^{7} m_i
where p_i is the original image, q_i is the image profile (background) information, e_i is the image detail and noise information, and \sum_{i=4}^{7} m_i is the high-level target detail information of the current frame.
The raw infrared image is decomposed in layers according to its bit depth: the lower four bit planes contain mainly background feature information, while the upper four bit planes contain mainly target detail feature information. Target features are then extracted from each layer using the corresponding target feature operator.
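As a concrete illustration of this layering step, the following sketch splits an 8-bit infrared frame into its lower and upper four bit planes with NumPy; the function name and the random stand-in frame are illustrative only and not part of the paper's implementation.

```python
import numpy as np

def bitplane_split(img8):
    """Split an 8-bit greyscale image into lower and upper bit-plane groups.

    Returns (background, detail):
      background -- reconstruction from bit planes 0-3, which carry mostly
                    background feature information
      detail     -- reconstruction from bit planes 4-7, which carry mostly
                    target detail feature information
    """
    img8 = np.asarray(img8, dtype=np.uint8)
    planes = [((img8 >> k) & 1) << k for k in range(8)]  # weighted bit planes
    background = np.sum(planes[:4], axis=0).astype(np.uint8)
    detail = np.sum(planes[4:], axis=0).astype(np.uint8)
    return background, detail

# Example: a random stand-in for a 640 x 512 infrared frame
frame = np.random.randint(0, 256, (512, 640), dtype=np.uint8)
bg, det = bitplane_split(frame)
```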
\sum_{i=0}^{7} q_i = a \sum_{i=0}^{7} I_i + b
where p_i is the input image, I_i is the guidance image, and q_i is the filtered output image. The guided filter defines a linear filtering process in which the filtered output at pixel position i is a weighted average:
q_i = \sum_j W_{ij}(I) \, p_j
where i and j denote pixel indices and W_{ij} is the filter kernel, which depends only on the guidance image I; the filter is therefore linear with respect to p_j. A key assumption of the guided filter is that there is a local linear relationship between the output image q_i and the guidance image I_i over the filter window ω_k:
q_i = a_k I_i + b_k, \quad \forall i \in \omega_k
Here a_k and b_k are constant coefficients over the window ω_k; for a given window of radius r, (a_k, b_k) is uniquely determined. To ensure the effectiveness of the layered guided filtering, the minimum value of r is taken as 85 μm for a mid-/long-wave detector with a pixel size of 17 μm × 17 μm; at r = 85 μm, the minimum number of pixels involved in the calculation is 96. We now seek the coefficients that minimize the difference between the output image q and the input image p. The cost function is defined as follows:
E(a_k, b_k) = \sum_{i \in \omega_k} \left[ (a_k I_i + b_k - p_i)^2 + \varepsilon a_k^2 \right]
Here, ε is a regularization parameter that penalizes large values of a_k. Differentiating the cost function with respect to a_k gives
\frac{\partial E}{\partial a_k} = 2 \sum_{i \in \omega_k} \left[ (a_k I_i + b_k - p_i) I_i + \varepsilon a_k \right]
Setting the derivative of the cost function with respect to b_k to zero yields
b_k = \bar{p}_k - a_k \mu_k
where p̄_k and μ_k are the mean values of p and I within the window ω_k, respectively. Having obtained b_k, we then solve for a_k by setting the derivative with respect to a_k to zero:
2 \sum_{i \in \omega_k} \left[ (a_k I_i + b_k - p_i) I_i + \varepsilon a_k \right] = 0
The final calculation gives:
a_k = \frac{\mathrm{Cov}(p, I)}{\mathrm{Var}(I) + N\varepsilon} = \frac{\mathrm{Cov}(p, I)}{\sigma_k^2 + \varepsilon}
In the improved hierarchical guided filter algorithm of this paper, the regularization parameter ε is used as an adaptive operator, and two limiting cases are analyzed. When ε = 0, the minimum of E(a_k, b_k) is a = 1 and b = 0; the guided filter then has no effect, and the output equals the input. When ε > 0 and is large, a ≈ 0 and b ≈ p̄_k, which corresponds to a weighted mean filter. When ε varies considerably, a ≈ 1 and b ≈ 0, which gives the desired edge-preserving effect. The window ω_k size remains unchanged while ε adapts within the defined range, so image edges and details are processed as intended. The algorithm runtime was tested on a PC with an Intel Core i7 1.6 GHz CPU and 16 GB of RAM by filtering 1000 sets of infrared greyscale images; the mean guided-filter runtime was approximately 60 ms per megapixel when I ≠ q and 32 ms per megapixel when I = q.
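For reference, a compact box-filter implementation of the linear-coefficient solution above is sketched below. This is a generic guided filter, not the paper's hierarchical, adaptive-ε variant; the window radius r is given in pixels, and all names and default parameters are assumptions made for the example.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def guided_filter(I, p, r=5, eps=1e-3):
    """Generic guided filter: q = mean(a) * I + mean(b), where
    a_k = Cov(I, p) / (Var(I) + eps) and b_k = mean(p) - a_k * mean(I),
    both computed over square windows of radius r (box filter of size 2r+1)."""
    I = I.astype(np.float64)
    p = p.astype(np.float64)
    size = 2 * r + 1

    mean_I = uniform_filter(I, size)
    mean_p = uniform_filter(p, size)
    mean_Ip = uniform_filter(I * p, size)
    mean_II = uniform_filter(I * I, size)

    cov_Ip = mean_Ip - mean_I * mean_p   # Cov(I, p) per window
    var_I = mean_II - mean_I * mean_I    # Var(I) = sigma_k^2 per window

    a = cov_Ip / (var_I + eps)           # coefficient a_k
    b = mean_p - a * mean_I              # coefficient b_k

    mean_a = uniform_filter(a, size)     # average the coefficients over windows
    mean_b = uniform_filter(b, size)
    return mean_a * I + mean_b
```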

2.3. Morphological Enhancement and Image Denoising Algorithms

(1) Morphological image enhancement algorithms.
Morphological image enhancement algorithms are mainly based on a combination of two operations, dilation and erosion, to extract target feature information and remove noise. Dilation merges all background points in contact with an object into that object, expanding the boundary outward; with dilation, small holes in the image and small depressions at object edges can be filled. Erosion and dilation are dual operations. Erosion eliminates boundary points, shrinking the boundary inward; with erosion, small and meaningless objects can be eliminated [11].
Suppose two objects participate in the operation: the image A (the target of interest) and a set B called the structuring element. The structuring element is usually a disk, but it can in fact be any shape.
Let A and B be subsets of Z^2. The translation of the image A by the vector x, denoted A + x or A_x, is defined by:
A_x = \{ c : c = a + x,\ a \in A \}
The reflection of the structuring element B, denoted -B or \hat{B}, is defined as:
\hat{B} = \{ x : x = -b,\ b \in B \}
The complement of A, denoted \bar{A} or A^c, is defined as:
A^c = \{ x : x \notin A \}
The dilation of the image A by the structuring element B, denoted A \oplus B, is defined as:
A \oplus B = \{ x : \hat{B}_x \cap A \neq \varnothing \}
The erosion of the set A by the structuring element B, denoted A \ominus B, is defined as:
A \ominus B = \{ x : B_x \subseteq A \}
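A minimal illustration of these operations on a binary target mask, using OpenCV's standard morphology routines, is given below; the kernel size and the toy mask are assumptions made for the example and do not come from the paper.

```python
import numpy as np
import cv2

# Toy binary mask standing in for a segmented target region
mask = np.zeros((64, 64), dtype=np.uint8)
mask[28:36, 28:36] = 255

# Disk-shaped structuring element B (a disk is the typical choice)
B = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))

dilated = cv2.dilate(mask, B)   # dilation: expands boundaries, fills small holes
eroded = cv2.erode(mask, B)     # erosion: shrinks boundaries, removes small objects

# Opening (erosion followed by dilation) suppresses isolated noise points
opened = cv2.morphologyEx(mask, cv2.MORPH_OPEN, B)
```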
(2) Image denoising and enhancement algorithms.
Infrared images contain a large amount of noise. This paper uses a bilateral filtering algorithm to preserve the edge information of the image target: the spatial proximity function serves as the spatial-domain weighting factor for estimating the grey value of each target pixel, which preserves the grey-level information of flat regions while strengthening target edges, thereby smoothing the image and removing noise. A histogram equalization algorithm is then used to enhance the contrast between target and background pixels and to increase the dynamic range of grey values in the image [12,13]. Finally, the enhanced image is sharpened by a gamma transformation and a Laplacian operator, so that the image's response to exposure intensity is closer to that perceived by the human eye and the image contrast is improved.
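The denoising and contrast steps described above can be sketched with common OpenCV and NumPy calls as follows; the filter parameters and the gamma value are illustrative assumptions, not the settings used in the paper.

```python
import numpy as np
import cv2

def denoise_and_enhance(img8, d=9, sigma_color=50, sigma_space=50, gamma=0.8):
    """Bilateral filtering -> histogram equalization -> gamma correction
    -> Laplacian sharpening, for an 8-bit greyscale infrared image."""
    # Edge-preserving smoothing using joint spatial and range weights
    smoothed = cv2.bilateralFilter(img8, d, sigma_color, sigma_space)

    # Stretch the grey-level distribution to raise target/background contrast
    equalized = cv2.equalizeHist(smoothed)

    # Gamma transform approximating the eye's non-linear intensity response
    lut = np.array([(i / 255.0) ** gamma * 255 for i in range(256)], dtype=np.uint8)
    gamma_corrected = cv2.LUT(equalized, lut)

    # Classic Laplacian sharpening: subtract a fraction of the Laplacian
    lap = cv2.Laplacian(gamma_corrected.astype(np.float32), cv2.CV_32F, ksize=3)
    sharpened = np.clip(gamma_corrected.astype(np.float32) - 0.5 * lap, 0, 255)
    return sharpened.astype(np.uint8)
```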

2.4. Multi-Layer Image Fusion Algorithm

By layering a single frame of an infrared image, three image components are obtained: the target feature information component E(x, y) based on background feature enhancement, the background component B(x, y), and the detail component D(x, y). The background and detail components are first combined with appropriate fusion coefficients:
R(x, y) = \beta_1 B(x, y) + \beta_2 D(x, y)
β_1 and β_2 are the fusion coefficients of B(x, y) and D(x, y), respectively, used to adjust the merging ratio of the two layers; they are adjusted according to the actual infrared image being processed. B(x, y) is the pixel value of the background-layer feature component, each pixel of which carries a large amount of background feature information. β_1 is taken in the range 0–1; if this value is too small, background feature information is lost, so to retain as much background feature information as possible after extraction, β_1 is set to 1 in this paper. D(x, y) is the pixel value of the target detail feature component. In theory, the larger β_2 the better, but since the infrared detector provides 8-bit image data, too large a β_2 causes the grey values of the feature pixels to overflow and the target feature information to be lost. In the experiments, the maximum grey value of the target is about 50, so β_2 is taken in the range 3–5, which guarantees that the weighted fusion does not overflow while preserving the target feature information to the greatest extent. R(x, y) is the enhanced image. Finally, the target information component E(x, y) and the image R(x, y) are fused to obtain a high-contrast infrared image I(x, y) with well-defined target edges:
I(x, y) = \exp\left\{ \ln[R(x, y)] + \ln[\beta_3 E(x, y)] \right\}
where β_3 is the adjustment factor for E(x, y), used to adjust the intensity of the component image in the final output I(x, y). In theory, the larger β_3 (within the range 0–1) the better; however, after enhancement the maximum pixel value of the target feature component E(x, y) is close to 255, and if β_3 = 1 were used, pixels adjacent to abrupt grey-level changes would be treated as noise. β_3 is therefore taken in the range 0.2–0.5 to effectively retain the target feature pixel information.
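A direct transcription of these two fusion steps might look like the following sketch; the β values follow the ranges discussed above, the component images are assumed to be arrays on a 0–255 scale, and the small offset added before the logarithm is an implementation detail assumed here to avoid log(0).

```python
import numpy as np

def fuse_layers(B, D, E, beta1=1.0, beta2=4.0, beta3=0.3):
    """Weighted fusion of background (B), detail (D) and target-feature (E)
    components: R = beta1*B + beta2*D, then I = exp(ln R + ln(beta3*E))."""
    B = B.astype(np.float64)
    D = D.astype(np.float64)
    E = E.astype(np.float64)

    R = beta1 * B + beta2 * D                 # background + detail fusion

    # Logarithmic-domain fusion; mathematically this equals R * (beta3 * E)
    eps = 1e-6
    I = np.exp(np.log(R + eps) + np.log(beta3 * E + eps))

    return np.clip(I, 0, 255).astype(np.uint8)
```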

3. Simulation and Analysis

3.1. Technical Specifications and Design

The detector used in this system is the Xcore LA6110 uncooled infrared focal plane detector, whose relative spectral response curve is shown in Figure 3. The selected wavelength range for this system is 3 to 12 μm. The detector has 640 × 512 pixels, a pixel size of 17 μm × 17 μm, and a target-surface diagonal length of 13.94 mm. For the image enhancement algorithm to generalize well and achieve the desired enhancement, the target must occupy no fewer than 96 pixels in the acquired image, a requirement that conventional mid- and long-wave infrared lenses on the market do not meet. To improve the capability of multispectral infrared detection, we therefore designed and simulated a dedicated infrared optical system, since multispectral and single-spectrum infrared optical systems differ in structure. Few optical materials can satisfy a multispectral design covering both mid-wave and long-wave infrared, owing to properties such as low transmittance and strong absorption. Chromatic aberration over the multispectral range is eliminated by using materials with a low dispersion difference between bands or by controlling the angle of incidence of light on the element surfaces, among other methods. Material availability was then considered, and the system was designed with three materials: sulfur-based glass (AMTIR-1), germanium, and zinc sulfide.
Based on the above design principles and methods, a multispectral uncooled infrared zoom optical system was designed. According to the actual experimental conditions, the faint target size is W × H = 0.2 m × 0.2 m, its geometric mean size is √(W × H) = 0.2 m, and the imaging system is required to observe the target at distances between 3 m and 9 m. The system focal length can be calculated according to Equation (17).
f = \frac{n \times N_{pixel} \times L}{\sqrt{W \times H}}
where f is the focal length of the system; n is the required number of pixels across the target, n = 98; N_{pixel} is the pixel size; and L is the distance from the imaging system to the faint target. The focal length of the system is calculated to be in the range of 25–75 mm.
The diagonal length of the detector target surface, l = 13.94 mm, gives, at short focal lengths, the field of view:
\omega_s = 2 \arctan\frac{l/2}{25} = 31.2°
Field of view at telephoto:
\omega_L = 2 \arctan\frac{l/2}{75} = 10.6°
The detector image element size is 17 μm, and the cut-off frequency of the detector according to Nyquist’s sampling theorem is
f_v = \frac{1000}{2d} = \frac{1000}{2 \times 17} \approx 30\ \mathrm{lp/mm}
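The three design calculations above can be checked numerically; the short script below reproduces the focal length range, the fields of view, and the Nyquist cut-off frequency from the stated detector and target parameters (it is a verification aid only, not part of the design toolchain).

```python
import math

n = 98            # required number of pixels across the target
pixel = 17e-6     # detector pixel size N_pixel (m)
gm_size = 0.2     # geometric mean target size sqrt(W * H) (m)
l = 13.94e-3      # detector target-surface diagonal (m)

# Focal length f = n * N_pixel * L / sqrt(W * H) at the distance limits
for L in (3.0, 9.0):
    f = n * pixel * L / gm_size
    print(f"L = {L} m -> f = {f * 1e3:.1f} mm")        # ~25 mm and ~75 mm

# Fields of view at the short and long focal lengths
for f_mm in (25.0, 75.0):
    fov = 2 * math.degrees(math.atan((l * 1e3 / 2) / f_mm))
    print(f"f = {f_mm} mm -> FOV = {fov:.1f} deg")      # ~31.2 and ~10.6 deg

# Nyquist cut-off frequency of the detector, f_v = 1000 / (2 d) in lp/mm
d_um = 17.0
print(f"cut-off = {1000 / (2 * d_um):.1f} lp/mm")       # ~29.4, taken as 30
```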
The specific design indicators for the system are shown in Table 1.

3.2. Design Results

Based on the theoretical analysis and calculations for the design of the zoom system, the final values of the spacing of each group element at different focal lengths were obtained and are shown in Table 2.
The calculated initial structure is optimized by setting variables and reasonable optimization parameters using the ZEMAX optical simulation software. As the refractive index of the sulfur glass (AMTIR-1) is the lowest among the three materials selected, AMTIR-1 should be placed in the middle as a negative lens in order to better correct for chromatic aberrations and field curvature. Due to the large relative aperture of the system and taking into account the requirements of achromatic and thermal aberrations, germanium and zinc sulfide, which have a higher refractive index, should be used as the front group of lenses.
The final wide-band uncooled infrared zoom optical system, shown in Figure 4, uses a common optical path for imaging. The materials used are germanium, zinc sulfide, and sulfur-based glass (AMTIR-1), which meet the design requirements over the wide wavelength range and eliminate chromatic aberration across it. In the optimization process, two aspheric surfaces were introduced to better correct the system's higher-order aberrations and to simplify the system structure; the aspheric coefficients are shown in Table 3, and the total system length is 160 mm.

3.3. Image Quality Analysis

The results of the overall optical system design were evaluated using MTF curves, spot diagrams, and grid distortion, as shown in Figure 5. The MTF curves for each focal length in Figure 5 show that the MTF at the system cut-off frequency of 30 lp/mm approaches the diffraction limit for every focal length. Column 2 of Figure 5 shows the spot diagrams at each focal length of the optical system; the root-mean-square spot radius at each focal length is smaller than the detector pixel size of 17 μm, indicating good image quality. Column 3 of Figure 5 shows the grid distortion at each focal length; the distortion at every focal length is less than 0.5%.

3.4. Cam Curve Drawing

The cam curve describes the position and movement relationship between the zoom group and the compensation group in a zoom system. If the cam curve is not smooth enough and contains inflection points or breakpoints, the system may become stuck during zooming [14]. The cam curve is therefore plotted to verify that the motion between the two elements of the zoom group and the compensation group is reasonable. Based on mechanical positive-group zoom theory and the equation of motion, the displacement of the two elements at different focal lengths was calculated, taking the short focal length as the starting position. Based on dynamic optics theory, the relevant data were input into MATLAB and modelled to obtain the corresponding motion relationship between the zoom group and the compensation group. As shown in Figure 6, the zoom group moves linearly and the compensation group moves non-linearly, and the motion curve is sufficiently smooth, without any breakpoints.

3.5. Tolerance Analysis

The optical system obtained from the optical simulation software represents imaging only under ideal conditions. The optical components are prone to errors during machining and assembly, and these errors reduce the imaging quality of the manufactured system. Tolerance analysis is therefore used, before the system is put into production, to predict the extent to which the various types of error will affect the imaging quality during machining and assembly. These errors are controlled within a reasonable range of values so that the manufactured optical system still meets the design requirements.
Before setting tolerances, it is important to understand the main sources and influencing factors of systematic errors. According to the different causes of system errors, they can be divided into four main categories: process errors, material errors, mechanical errors, and environmental errors. Tolerance analysis is generally divided into three methods: sensitivity analysis (for the effect of individual tolerances on the optical system); inverse sensitivity analysis (based on a given performance target, the tolerance range to meet that target is automatically calculated); and Monte Carlo analysis (which simulates actual production to assess the overall effect of tolerances).
A detailed understanding of the accuracy requirements of the various optical processes is then required to give the current system a standard that is easy to achieve, easy to process, and easy to produce, and to set the back working distance of the system as a compensation parameter. The specific tolerance levels are shown in Table 4.
When carrying out a tolerance analysis of a zoom system, a tolerance of the Q4 class is usually chosen, as this reduces machining costs and set-up times while also ensuring the basic performance of the system. If a higher grade of tolerance is chosen, although the quality of the system can be improved, it also increases the machining difficulty and set-up complexity, thus increasing the cost and risk.
First, the tolerance operand editor is opened in the tolerance analysis dialog, and the tolerance ranges for the radius, thickness, eccentricity, and tilt of each lens in the system are entered. Next, the mean DIFF.MTF is selected as the evaluation criterion, and the number of Monte Carlo runs is set to 100. Finally, the tolerance analysis is run to obtain the performance distribution and sensitivity analysis of the system; the results are shown in Table 5 and Table 6.
Table 5 and Table 6 show that, within the given Q4 tolerance range and according to evaluation criteria such as the mean DIFF.MTF, the imaging quality of the system reaches the expected level. An IL300 lathe with spindle runout within 20 nm and full-stroke accuracy of 0.1 μm was selected for machining. This equipment can achieve the accuracy required by the design, and the assembly level can essentially meet the design requirements of the system without excessive optimization and adjustment.

3.6. Simulation Results

To verify the effect of the proposed optical system design and image enhancement algorithm on the same target, the results are shown in Figure 7. Figure 7a is the target image acquired with an ordinary infrared lens, in which only a faint outline of the target can be seen. Figure 7c is the target image acquired with the multispectral zoom optical system designed in this paper, in which the target outline is clearly visible. Figure 7b shows the result of applying the proposed image enhancement algorithm to the original target image in Figure 7a. Figure 7d shows the result of applying the proposed algorithm to the original target image in Figure 7c; after enhancement, the target outline is clear and the contrast is high. Figure 7a1–d1 show the histogram statistics of the original and enhanced images, Figure 7a2–d2 show three-dimensional statistics of their grey-level information, and Figure 7a3–d3 show statistics of their intensity information. Comparative analysis of the experimental results shows that the proposed optical structure is compact, covers a wide wavelength range, has good imaging quality, and can acquire multispectral target information simultaneously. The proposed image enhancement algorithm shows strong generalization and robustness and enhances images well for both ordinary optical systems and multispectral optical detection systems.

4. Experimental and Results

4.1. Experimental Equipment and Conditions

We performed outfield experiments and applications; the following description of the experimental environment and equipment should help other researchers with verification and analysis. In the experiments, a mid- and long-wave infrared camera acquires target images through the designed optical zoom lens, and the target detection point is magnified three times to facilitate observation and analysis of the data. The target is a rectangular metal plate with a 0.5 mm thick surface coating and a circular plaster patch approximately 10 cm in diameter. The distance between the mid-/long-wave infrared detector and the target was 9 m, and the experimental images were acquired at 21:00 with grass, buildings, and sky in the background of the target. The hardware and parameters used in the experiments are listed in Table 7.

4.2. Analysis of Experimental Data

In the above experiments, 1000 raw mid- and long-wave infrared images were acquired for each of the grass, building, and sky backgrounds, for a total of 3000 target images. A single raw infrared image was selected at random; in this case the background of the extracted image was grass, as shown in Figure 8a, with the target area magnified three times and placed in the upper-left corner of the raw image. The histogram of the raw infrared image is shown in Figure 8a1, the three-dimensional grey-level statistics in Figure 8a2, and the two-dimensional intensity statistics in Figure 8a3.
Comparing the two groups of results in Figure 8, the target can be acquired with the infrared optical system designed in this paper and the image edges are clear, but the contrast is low. After processing with the proposed enhancement algorithm, the target is clear, the image contrast is high, and the target intensity information is enhanced. The histogram statistics in Figure 8b1 versus Figure 8a1 show that the accumulated count of the dominant grey levels increases to 3000. In the three-dimensional grey-level statistics of Figure 8b2 versus Figure 8a2, the target contrast is enhanced by nearly 250 grey levels and the interference clutter in the background is removed. The analysis of these indicators shows that the proposed image enhancement algorithm achieves the desired effect.

4.3. Comparison to State-of-the-Art Methods

To further verify the accuracy and robustness of the experimental data, target images obtained in the previous experiments were randomly extracted; the background of the extracted images was the sky. The currently popular image enhancement algorithms CLAHE, VWGIF, and GWGIF and the algorithm proposed in this paper were compared and analyzed, the analysis was repeated for ten random experiments, and the evaluation data were recorded. The results of the ten random experimental evaluations are shown in Table 8. One randomly selected group of comparative data is shown in Figure 9.
Figure 9a is the target image acquired at the front end with a single-spectrum infrared lens; the target outline is difficult to identify visually, although it can be distinguished from the histogram statistics in Figure 9a1, the three-dimensional contrast and grey-level statistics in Figure 9a2, and the two-dimensional intensity statistics in Figure 9a3, but the interference is severe and the outline edges are unclear. Figure 9b is an image acquired with the multispectral optical zoom lens designed in this paper, in which the target and background can be distinguished visually. Figure 9c–f show the results of enhancing Figure 9b with the currently popular algorithms and with the algorithm proposed in this paper, respectively. The results of the four image enhancement algorithms differ significantly from the original image in both visual appearance and data comparison. The algorithm proposed in this paper yields uniform luminance information, as seen in the histogram in Figure 9f1. Comparative analysis of the three-dimensional grey-level statistics in Figure 9f2 and the two-dimensional intensity statistics in Figure 9f3 shows that the proposed algorithm gives the best results among the four algorithms, with an increase in contrast of about 33% and an increase in intensity information of about 30%. After enhancement by the proposed algorithm, the target edges are clear, the contrast between target and background is high, and the target detail characteristics are enhanced.

4.4. Evaluation Metrics

Objective evaluation indexes of image target characteristics can accurately measure the quality of images processed by infrared image enhancement algorithms. In this paper, four metrics were used to analyze and evaluate the enhancement effect: mean square error (MSE), signal-to-noise ratio (SNR), peak signal-to-noise ratio (PSNR), and structural similarity (SSIM) [15,16,17,18,19].
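For completeness, straightforward NumPy and scikit-image implementations of the four metrics are sketched below, assuming 8-bit reference and enhanced images of equal size; the exact metric definitions and reference images used in the paper may differ.

```python
import numpy as np
from skimage.metrics import structural_similarity

def evaluate(reference, enhanced):
    """Compute MSE, SNR (dB), PSNR (dB) and SSIM for 8-bit greyscale images."""
    ref = reference.astype(np.float64)
    enh = enhanced.astype(np.float64)

    mse = np.mean((ref - enh) ** 2)
    snr = 10 * np.log10(np.sum(ref ** 2) / np.sum((ref - enh) ** 2))
    psnr = 10 * np.log10(255.0 ** 2 / mse)
    ssim = structural_similarity(reference, enhanced, data_range=255)
    return mse, snr, psnr, ssim
```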
In order to ensure the objectivity and accuracy of the evaluation process, ten sets of data were randomly selected from the 3000 infrared raw images of the previous experiment and enhanced with the CLAHE algorithm, GIF algorithm, VWGIF algorithm, GWGIF algorithm, and the algorithm proposed in this paper to obtain MSE, SNR, PSNR, and SSIM indexes, and the average value of the ten sets of evaluation indexes was taken as the final evaluation criteria. The experimental data are shown in Table 8.
For the MSE evaluation index, the smaller the better; for the SNR, PSNR, and SSIM evaluation indexes, the larger the better. Through statistical analysis of the evaluation indexes of the images, the algorithm proposed in this paper improves MSE by 14.32, SNR by 11.64, PSNR by 12.78, and SSIM by 14.0% compared to guided filtering. The average fluctuation of the MSE metrics in the ten groups was 17.10% better than the current optimal algorithm, and the average fluctuation of the SNR metrics in the ten groups was 12.90% better than the current optimal algorithm. It can be seen that the algorithm has strong robustness and generalizability.

4.5. Discussion

Through the previous theoretical design and analysis of experimental results, this paper proposes a multispectral infrared zoom optical system design and a single-frame hierarchical guided filtering image enhancement algorithm. The optical structure is designed to improve the amount of image information acquisition data, and a dedicated multispectral infrared zoom optical system is designed with a focal length of 25~75 mm, an operating wavelength of 3~12 μm, an F-number of 1.8, an uncooled dual-color infrared focal plane detector with 640 × 512 resolution, a pixel size of 17 μm, and a total system length of 160 mm. The system has good image quality with a modulation transfer function close to the diffraction limit at 30 lp/mm for each field of view in the full focal length. Through simulation and experimental data analysis, it is verified that the optical system design and image enhancement algorithm proposed in this paper have the ideal experimental effect, and the objective evaluation index of the image target characteristics shows that MSE improves by 14.32, SNR improves by 11.64, PSNR improves by 11.78, and SSIM improves by 14.0%. The average fluctuation of the absolute value of the MSE index in the ten groups is better than the current optimal algorithm by 17.10%, and the average fluctuation of the absolute value of the SNR index in the ten groups is better than the current optimal algorithm by 12.90%. The algorithm proved to have strong robustness and generalizability.
This paper proposes a multispectral infrared zoom optical system design for a fixed detector model and parameters, so it is not applicable to all infrared detectors and has certain limitations in use, but the theoretical design method is universal. The current optical system design covers only two spectral bands, mid-wave and long-wave, and it is planned to include the short-wave band in a later design so that more raw information about the target and background can be obtained through the optical system. The optical system requires the target to be observed at a distance of 3 m to 9 m from the imaging system, which suits close-range target detection; a design for long-range target detection is planned to expand the application area. After acquiring a large amount of multispectral infrared image data, random-sampling experiments verified that the optical system and image enhancement algorithm proposed in this paper have good generalization, stability, and robustness, whether applied individually or in combination. Because the system designed here covers only the medium-wave and long-wave infrared bands, adding the short-wave band would make the acquired target data richer and more valuable for target identification and tracking; at the same time, introducing more spectral bands also brings richer noise information into the image, which requires continued research on image enhancement algorithms to obtain high-quality images. Infrared image enhancement algorithms therefore need further in-depth study of the characteristics of infrared targets so that more optimized algorithms can be proposed, which will be a key direction for future research.
The optical system and image enhancement algorithm designed in this paper have the desired effect when applied alone and the optimal effect when applied in combination. This extends the method proposed in this paper to a wide range of application environments, currently applicable to firefighting, security, medical [20], high-speed rail, and other fields. In the future, through further refinement of the design and optimization, it will be applicable to important areas such as border security, aerospace, military detection, and remote sensing.

5. Conclusions

This paper proposes a multispectral infrared zoom optical system design and a single-frame hierarchical guided filtering image enhancement algorithm. The method solves the problem of detecting faint targets in complex backgrounds, where the target information is easily drowned out by noise or background, improves the edge information of faint-target imaging, and enhances the detail characteristics of the target. It also addresses the high false alarm rate of single-frame image enhancement algorithms and avoids the growth in data storage and loss of real-time performance caused by multi-frame processing. A 3× wide-band uncooled infrared zoom optical system is designed based on complex achromatic theory and mechanical positive-group compensation. The system adopts positive-group mechanical compensation to achieve continuous zoom from 25 mm to 75 mm with free switching over a field of view of 10.6° to 31.2°. All focal lengths of the system are close to the diffraction limit at the cut-off frequency (30 lp/mm), and the maximum distortion is less than 0.5%. The proposed single-frame hierarchical guided filtering image enhancement algorithm, based on hierarchical feature extraction and fusion of faint-target data, effectively enhances the target, suppresses interference from complex backgrounds and noise, extracts useful target-feature data during hierarchical feature extraction, removes interference such as noise and background, reduces data storage requirements, and improves computing timeliness. The optical system design and image enhancement algorithm have a clear effect when used alone, compared with the original configuration, and an even stronger effect when used in combination, improving not only the clarity of the faint target's edge contours and the contrast of the image, but also the detectability of the faint target. Visually, dark and weak targets are easy to distinguish from the background, and the method shows stronger generalization ability and robustness than currently popular algorithms. In terms of image evaluation metrics, the proposed algorithm improves MSE by 14.32, SNR by 11.64, PSNR by 11.78, and SSIM by 14.0% compared with guided filtering. The average fluctuation of the MSE metric over the ten groups was 17.10% better than the current optimal algorithm, and the average fluctuation of the SNR metric was 12.90% better. The proposed algorithm is applicable to the enhancement and recognition of faint and weak targets in dynamic infrared images and lays a research foundation for the detection and tracking of dynamic clusters of small targets.

Author Contributions

Conceptualization, S.Y. and Z.Z.; methodology, S.Y.; software, Z.Z.; validation, H.S. and Q.F.; data curation, H.S. and S.Y.; writing—original draft preparation, S.Y., Z.Z. and Y.L.; writing—review and editing, S.Y. and Z.Z. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Jilin Province Science and Technology Development Plan (DZJ202301ZYTS417), the National Natural Science Foundation of China (NSFC) (No. 61890964, No. 62127813), and the Changchun Science and Technology Development Plan (No. 21ZY36).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Zhang, X.; Ji, W.; Li, L.; Wang, X.; Qin, C. Research progress on passive infrared imaging detection technology and system performance evaluation of natural gas leakage. Infrared Laser Eng. 2019, 9, 805–812. [Google Scholar] [CrossRef]
  2. Yang, W.; Liu, J.; Han, P.; Shao, X.; Zhao, X. Design of infrared zoom imaging system based on concentric spherical lens with wide Fov and high resolution. J. Infrared Millim Waves 2019, 12, S204001-1–S204001-13. [Google Scholar]
  3. Reshidko, D.; Reshidko, P.; Ran, C. Optical design study and prototyping of a dual-field zoom lens imaging in the 1-5 micron infrared waveband. In SPIE Conference on Zoom Lenses; SPIE: Bellingham, WA, USA, 2015. [Google Scholar]
  4. Malka, D.; Berke, B.A.; Tischler, Y.; Zalevsky, Z. Improving Raman spectra of pure silicon using super-resolved method. J. Opt. 2019, 2, 75801–75812. [Google Scholar] [CrossRef]
  5. Yang, H.; Yang, X.; Mei, C.; Chen, W. Design of hybrid refractive-diffractive infrared dual-band zoom optical system. Infrared Laser Eng. 2020, 10, 20200036-1–20200036-8. [Google Scholar]
  6. Li, Y.; Yang, J.; Peng, J.; Liu, L. Design of cooled infrared dual-band wide angle athermal optical system. Laser Infraed 2023, 5, 712–715. [Google Scholar]
  7. Deng, C.; Zhou, Y. Infrared image enhancement algorithm based on low frequency redistribution and edge enhancement. Laser Infrared 2023, 1, 146–152. [Google Scholar] [CrossRef]
  8. Ge, P.; Yang, B.; Han, Q.; Liu, P.; Chen, S.; Hu, D.; Zhang, Q. Infrared Image Detail Enhancement Algorithm Based on Hierarchical Processing by Guided Image Filter. Infrared Technol. 2018, 12, 1161–1169. [Google Scholar]
  9. Soliman, N.F.; Alabdulkreem, E.A.; Algarni, A.D.; Banby, G.M. Efficient Deep Learning Modalities for Object Detection from Infrared Images. Comput. Mater. Contin. 2022, 2, 2546–2563. [Google Scholar] [CrossRef]
  10. Zhou, H.; Li, B.; Wang, H.; Bian, C. Fast and accurate detection of infrared dim small target in low altitude complex scenes. J. Natl. Univ. Def. Technol. 2023, 2, 74–85. [Google Scholar] [CrossRef]
  11. Zhang, Y.; Zhu, H.; Cheng, H.; Zhang, J.; Zhang, Q. Infrared small target detection algorithm based on frequency domain saliency analysis and morphological filtering. Laser Infrared 2022, 10, 1488–1492. [Google Scholar]
  12. Li, X. Infrared image filtering and enhancement processing method based upon image processing technology. J. Electron. Imaging 2022, 5, 051408. [Google Scholar] [CrossRef]
  13. Hua, X.; Ge, Y. Image enhancement algorithm for the dim targets in infrared images. Electron. Des. Eng. 2016, 12, 148–150. [Google Scholar] [CrossRef]
  14. Cao, C.; Liao, Z.; Bai, Y.; Liao, S.; Fan, Z. Design of high zoom ratio LWIR zoom system with large relative aperture. J. Appl. Opt. 2018, 11, 773–779. [Google Scholar] [CrossRef]
  15. Zhang, Q.; Xu, X.; Pan, Y.; Hu, M. Design of projection optical system for dynamic star simulator with long exit pupil distance. J. Chang. Univ. Sci. Technol. (Nat. Sci. Ed.) 2021, 12, 13–18. [Google Scholar]
  16. Shi, C.; Lin, Y. Objective image quality assessment based on image color appearance and gradient features. Acta Phys. Sin. 2020, 22, 228701. [Google Scholar] [CrossRef]
  17. Yang, B.; Pan, D.; Jiang, Z.; Huang, J.; Gui, W. A Cross-Scale Decomposition Method for Low-Light Image Enhancement. Signal Process. 2022, 8, 108752. [Google Scholar] [CrossRef]
  18. Ma, T.; Yang, Z.; Luo, Y.; Zhuang, C. Infrared dim small target detection method based on depth feature fusion. J. Zhengzhou Univ. (Nat. Sci. Ed.) 2023, 5, 65–72. [Google Scholar] [CrossRef]
  19. Wang, H.; Cao, D.; Zhao, Y.; Yang, Y. Survey of infrared dim small target detection algorithm based on deep learning. Laser Infrared 2022, 9, 1275–1279. [Google Scholar] [CrossRef]
  20. Malka, D.; Vegerhof, A.; Cohen, E.; Rayhshtat, M.; Libenson, A.; Aviv Shalev, M.; Zalevsky, Z. Improved Diagnostic Process of Multiple Sclerosis Using Automated Detection and Selection Process in Magnetic Resonance Imaging. Appl. Sci. 2017, 7, 831. [Google Scholar] [CrossRef] [Green Version]
Figure 1. Schematic diagram of the structure of the improved guided filtering algorithm. (Colours are used to distinguish different modules, with the same colour indicating the same function.)
Figure 2. Improved guided filtering algorithm flow.
Figure 3. Detector relative spectral response curves.
Figure 4. Results of optimizing the wide-band uncooled infrared zoom optical system: (a) short focus (25 mm); (b) central focus (50 mm); (c) long focus (75 mm). (The three colours indicate different real image heights: red 7 mm, green 5 mm, blue 0 mm.)
Figure 5. Image quality of the multispectral uncooled infrared zoom optical system. (a) Short-focus MTF. (b) Central-focus MTF. (c) Long-focus MTF. (d) Short-focus spot diagram. (e) Central-focus spot diagram. (f) Long-focus spot diagram. (g) Short-focus grid distortion. (h) Central-focus grid distortion. (i) Long-focus grid distortion.
Figure 6. Zoom system cam curve.
Figure 7. Experimental results. (a) Raw image from a single-spectrum lens. (b) Effect of our algorithm's enhancement. (c) Raw image from the multispectral lens. (d) Effect of our algorithm's enhancement. (a1–d1) Histogram statistics. (a2–d2) Three-dimensional statistics of grey-level information. (a3–d3) Statistics of intensity information.
Figure 8. Experimental results. (a) Raw image from the multispectral lens. (b) Effect of our algorithm's enhancement. (a1,b1) Histogram statistics. (a2,b2) Three-dimensional statistics of grey-level information. (a3,b3) Statistics of intensity information.
Figure 9. Comparison of the results of different algorithms for target enhancement. (a) Raw image from a single-spectrum lens. (b) Raw image from the multispectral lens. (c) CLAHE algorithm. (d) VWGIF algorithm. (e) GWGIF algorithm. (f) Our algorithm. (a1–f1) Histogram statistics. (a2–f2) Three-dimensional statistics of grey-level information. (a3–f3) Statistics of intensity information.
Table 1. System design indicators.

Parameters | Design Indicators
Wavelength/μm | 3~12
Focal length/mm | 25~75
F/# | 1.8
Field of view (°) | 10.6~31.2
Aberrations/% | ≤0.5
Total system length/mm | ≤160
Cut-off frequency requirement | MTF ≥ 0.3 at 30 lp/mm
Table 2. Interval values for each group element at different focal lengths.

Air Spacing | Short Focus/mm | Central Focus/mm | Long Focus/mm
d12 | 2.16 | 36.98 | 51.41
d23 | 71.65 | 28.48 | 5.93
d34 | 5.35 | 15.23 | 24.73
Table 3. Aspheric coefficients.

Aspheric Surface | Cone Factor | 4th-Order Coefficient | 6th-Order Coefficient | 8th-Order Coefficient
5 | 4.499 | 2.282 × 10⁻⁶ | 3.075 × 10⁻⁸ | −7.677 × 10⁻¹⁰
13 | 0.362 | −5.486 × 10⁻⁶ | 5.038 × 10⁻⁸ | −1.594 × 10⁻¹⁰
Table 4. Tolerance scale.

Grade | Refractive Index | Number of Apertures | Thickness/mm | Irregularity | Lens Prism/(′) | Abbe Error/% | Lens Tilt/(′) | Lens Eccentricity/mm | Mirror Set Position/mm | Mirror Set Tilt/(′) | Mirror Set Eccentricity/mm
Q1 | 0.0001 | 0.5 | 0.01 | 0.1 | 0.17 | 0.01 | 0.17 | 0.001 | 0.01 | 0.17 | 0.001
Q2 | 0.0003 | 1 | 0.01 | 0.1 | 0.3 | 0.03 | 0.3 | 0.003 | 0.01 | 0.3 | 0.003
Q3 | 0.0005 | 1 | 0.0125 | 0.25 | 0.5 | 0.05 | 0.5 | 0.005 | 0.0125 | 0.5 | 0.005
Q4 | 0.0008 | 2 | 0.025 | 0.25 | 0.8 | 0.08 | 0.8 | 0.008 | 0.025 | 0.8 | 0.008
Q5 | 0.001 | 2 | 0.0375 | 0.5 | 1 | 0.1 | 1 | 0.01 | 0.0375 | 1 | 0.01
Q6 | 0.003 | 3 | 0.05 | 0.5 | 1.5 | 0.3 | 1.5 | 0.03 | 0.05 | 1.5 | 0.03
Q7 | 0.008 | 3 | 0.075 | 1 | 2 | 0.5 | 2 | 0.05 | 0.075 | 2 | 0.05
Q8 | 0.008 | 3 | 0.1 | 2 | 3 | 0.8 | 3 | 0.08 | 0.1 | 3 | 0.08
Table 5. Tolerance analysis results.

Name | Short Focus | Central Focus | Long Focus
Nominal (mm) | 0.286 | 0.307 | 0.382
Best (mm) | 0.496 | 0.502 | 0.531
Worst (mm) | 0.239 | 0.286 | 0.335
Average (mm) | 0.324 | 0.358 | 0.401

Compensator statistics: post-focus change
Minimum (mm) | −0.821 | −0.685 | −0.316
Maximum (mm) | 1.719 | 1.536 | 0.447
Average (mm) | −0.025 | −0.061 | 0.056
Standard deviation (mm) | 0.412 | 0.325 | 0.294
Table 6. Monte Carlo analysis results.

Monte Carlo Analysis | Short-Focus DIFF.MTF Mean | Central-Focus DIFF.MTF Mean | Long-Focus DIFF.MTF Mean
90% > | 0.25019852 | 0.30114021 | 0.36342636
80% > | 0.25635879 | 0.30890703 | 0.38552938
50% > | 0.26025144 | 0.31151765 | 0.38924014
20% > | 0.28580429 | 0.32224147 | 0.41818301
10% > | 0.30251004 | 0.34857056 | 0.44751512
Table 7. Experimental environment and equipment parameter configuration.

Configuration Type | Specific Configuration | Configuration Data
Hardware | CPU | Intel(R) Core i7
Hardware | Frequency | 1.8 GHz
Hardware | RAM | 32 G
Software | Operating system | Win 10
Software | Development tools | IRController_V3.02
Camera | Operating bands | 3~12 μm
Camera | Pixel resolution | 640 × 512
Camera | Pixel spacing | 17 μm
Camera | NETD | ≤60 mK@25 °C
Camera | Frame rate | 50 Hz
Experimental environment | Ambient temperature | 21.7 °C
Experimental environment | Ambient humidity | 67%
Table 8. Four evaluation indicators for the processing results of the various enhancement algorithms.

Evaluation Indicator | Algorithm | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | Average
MSE | Ours | 37.64 | 36.28 | 37.07 | 37.6 | 34.24 | 36.54 | 36.72 | 34.61 | 37.26 | 33.26 | 36.12
MSE | VWGIF | 89.25 | 87.51 | 89.64 | 87.36 | 90.28 | 89.37 | 89.39 | 90.08 | 90.57 | 91.32 | 89.48
MSE | GIF | 108.64 | 110.28 | 109.36 | 114.32 | 110.08 | 108.35 | 107.64 | 107.29 | 119.37 | 108.29 | 110.36
MSE | GWGIF | 47.57 | 50.06 | 49.37 | 51.08 | 48.25 | 51.27 | 51.36 | 49.58 | 52.17 | 53.66 | 50.44
MSE | CLAHE | 133.48 | 134.54 | 135.67 | 132.18 | 134.72 | 132.19 | 130.58 | 130.61 | 134.28 | 132.19 | 133.04
SNR | Ours | 31.26 | 30.18 | 29.89 | 31.24 | 30.67 | 31.05 | 32.01 | 31.89 | 31.78 | 31.47 | 31.14
SNR | VWGIF | 22.67 | 20.18 | 18.79 | 18.96 | 19.27 | 18.74 | 19.06 | 19.18 | 18.11 | 19.85 | 19.48
SNR | GIF | 18.64 | 18.05 | 17.98 | 16.87 | 17.02 | 15.46 | 16.02 | 16.07 | 16.74 | 16.26 | 16.91
SNR | GWGIF | 21.03 | 20.18 | 19.72 | 18.73 | 19.06 | 19.24 | 19.07 | 17.62 | 20.18 | 20.18 | 19.50
SNR | CLAHE | 18.43 | 17.62 | 18.17 | 18.34 | 16.28 | 17.26 | 16.98 | 16.05 | 19.07 | 17.25 | 17.55
PSNR | Ours | 38.26 | 37.89 | 37.06 | 39.27 | 41.03 | 40.65 | 39.76 | 38.79 | 41.28 | 40.85 | 39.48
PSNR | VWGIF | 28.15 | 26.38 | 26.04 | 26.64 | 27.01 | 25.37 | 26.31 | 27.11 | 25.63 | 24.31 | 26.30
PSNR | GIF | 26.15 | 25.49 | 24.06 | 25.03 | 24.87 | 25.06 | 24.67 | 25.08 | 23.87 | 25.09 | 24.94
PSNR | GWGIF | 28.66 | 27.62 | 26.35 | 27.52 | 25.64 | 27.16 | 26.48 | 25.01 | 25.89 | 26.75 | 26.70
PSNR | CLAHE | 21.87 | 18.87 | 20.86 | 20.18 | 21.65 | 22.85 | 22.06 | 20.83 | 21.65 | 22.06 | 21.29
SSIM | Ours | 0.95 | 0.94 | 0.95 | 0.96 | 0.95 | 0.94 | 0.93 | 0.94 | 0.95 | 0.96 | 0.95
SSIM | VWGIF | 0.76 | 0.72 | 0.75 | 0.79 | 0.74 | 0.76 | 0.78 | 0.81 | 0.79 | 0.78 | 0.77
SSIM | GIF | 0.71 | 0.73 | 0.69 | 0.72 | 0.3 | 0.68 | 0.74 | 0.67 | 0.66 | 0.69 | 0.66
SSIM | GWGIF | 0.87 | 0.85 | 0.81 | 0.82 | 0.79 | 0.81 | 0.79 | 0.81 | 0.79 | 0.78 | 0.81
SSIM | CLAHE | 0.52 | 0.51 | 0.49 | 0.48 | 0.43 | 0.42 | 0.51 | 0.53 | 0.49 | 0.47 | 0.49
