Article

Image Enhancement Using Modified Histogram and Log-Exp Transformation

Liyun Zhuang and Yepeng Guan
1 School of Communication and Information Engineering, Shanghai University, Shanghai 200444, China
2 Faculty of Electronic and Information Engineering, Huaiyin Institute of Technology, Huaian 223003, China
* Author to whom correspondence should be addressed.
Symmetry 2019, 11(8), 1062; https://doi.org/10.3390/sym11081062
Submission received: 11 July 2019 / Revised: 13 August 2019 / Accepted: 13 August 2019 / Published: 20 August 2019

Abstract

An effective method to enhance the contrast of digital images is proposed in this paper. A histogram function is developed to make the histogram curve smoother, which helps to avoid the loss of information in the processed image. In addition to the histogram function, an adaptive gamma correction for the histogram is proposed to stretch the brightness contrast. Moreover, a log-exp transformation strategy is presented to progressively increase the low intensities while suppressing the decrement of the high intensities. In order to further widen the dynamic range of the image, a nonlinear normalization transformation is put forward to make the output image more natural and clearer. In experiments on non-uniform illumination images, the average contrast per pixel (CPP), root mean square (RMS), and discrete entropy (DE) metrics of the developed approach are shown to be superior to those of selected state-of-the-art methods.

1. Introduction

Image enhancement is regarded as one of the most fundamental problems in computer vision. It is widely used in many applications such as monitoring, imaging systems, human–computer interaction, and so on [1,2,3]. In many cases, the contrast in digital video or images is poor, which can be caused by many circumstances, including a lack of operator expertise and the inadequacy of the image acquisition equipment. In most cases, the captured scenes are under unfavorable environmental conditions, for example, the presence of clouds, a lack of sunlight, backlighting, or indoor lighting. These conditions can also reduce contrast quality. In view of the above problems, many researchers still concentrate on poor-contrast image enhancement [4,5]. Generally speaking, the enhancement techniques for poor-contrast images fall broadly into two categories: direct enhancement [6,7,8] and indirect enhancement [9,10,11]. The direct enhancement methods [6,7,8] define the image contrast directly by a specific contrast term. Most of these metrics, however, cannot simultaneously measure the contrast of simple and complex patterns in images. Various image enhancement techniques have been presented [12,13,14,15,16] to enhance the contrast of an image. For instance, histogram equalization (HE) is a widely used technique that is simple and easily implemented [13]. Unfortunately, HE [13] does not always give satisfactory results, since it might cause annoying artifacts and intensity saturation effects [16].
Scholars have proposed many other HE-based methods, such as brightness preserving bi-histogram equalization (BBHE) [15], dualistic sub-image histogram equalization (DSIHE) [17], and minimum mean brightness error bi-histogram equalization (MMBEBHE) [18]. BBHE [15] divides the histogram at the image mean, whereas DSIHE [17] partitions it at the image median. Lim et al. [19] introduced six threshold plateau limits for a segmented histogram to improve on BBHE. In the MMBEBHE [18] method, the image histogram is recursively segmented into multiple groups based on the mean brightness error (MBE). The methods mentioned above have enabled great progress in the field. However, they have drawbacks, including failure on images with non-symmetric distributions [15], the loss of structural information [16], issues with the preservation of the mean brightness [17], and the production of annoying side effects [18]. Therefore, these algorithms have failed to achieve the desired improvements in contrast enhancement. Moreover, the difference between the input and output images can be minimal when using these methods [20].
To overcome the limitations mentioned above, recursive mean-separate histogram equalization (RMSHE) [21] was proposed for the recursive division of histograms based on the local mean. The average brightness of the processed image is close to that of the input image. Adaptively modified histogram equalization (AMHE) [22] was proposed to modify the probability density function (PDF) of the grayscale. AMHE [22] does not produce any degradation. However, it darkens the bright areas and fails to boost the brightness of the dark regions. Recursive sub-image histogram equalization (RSIHE) [23] based on contrast enhancement was developed to improve DSIHE [17], and the recursive segmentation was introduced in a similar manner to that reported in [21]. Hasikin et al. [24] proposed an adaptive fuzzy intensity measurement method that can selectively enhance the dark area of the image without brightening the bright area.
Some other methods based on histogram equalization have been introduced to enhance the contrast of images while controlling brightness. Dynamic histogram specification [25] was developed to preserve the shape of the input image histogram, but without significant enhancement. Contrast enhancement algorithms for color images were developed in [26,27]. Adaptive gamma correction with weighting distribution (AGCWD) [28] was presented to enhance brightness and preserve the available histogram information. Although AGCWD [28] enhances the contrast and preserves the overall brightness of an image, it may not give the desired results when the input image lacks bright pixels, because the maximum intensity of the output image is limited by the maximum intensity of the input image [29]. An image enhancement technique called exposure sub-image histogram equalization (ESIHE) [30] was developed; it divides the clipped histogram into two parts according to a pre-calculated exposure threshold [31]. Intensity exposure-based histogram segmentation before histogram clipping was also developed to enhance images [32]: a threshold value computed from the intensity exposure of the image is employed to divide the input histogram, which can yield certain enhancement results. However, it usually causes under-enhancement. Bi-histogram equalization in modified histogram bins (BHEMHB) [33] was developed to segment the input histogram based on the median brightness of an image, and it alters the histogram bins before HE [13]. Unfortunately, it yields only a limited improvement in contrast.
In this paper, a novel image enhancement method is proposed based on adaptive gamma correction with a modified histogram function. This method can raise the low intensities of the image while avoiding a significant decrement of the high intensities; thus, the dynamic range of the image is effectively extended. The main contributions of this paper are as follows. Firstly, a novel histogram function is developed to modify the histogram of the image, which makes the resultant transformation curve smoother and thereby avoids the loss of information in the processed image. Secondly, a log-exp transformation (LET) is proposed to assign higher weights to the dark regions of the image and lower weights to the bright regions. In addition, a nonlinear normalization transformation (NNT) is put forward to obtain a wider dynamic range for the brightness component of the output image, which makes the output image more natural and clearer. The third contribution is that the image can be enhanced on ordinary hardware without any prior assumptions about the scene content. Extensive experimental comparisons with state-of-the-art methods show the superior performance of the proposed method.
An overall flowchart for image enhancement is shown in Figure 1.
The rest of the paper is organized as follows. The modified histogram-based enhancement is described in Section 2. The experimental results and analyses are discussed in Section 3. Some conclusions follow in Section 4.

2. Modified Histogram-Based Enhancement

The HSV (Hue, Saturation, Value) color space is selected for color image processing, since it offers a color representation close to human perception and separates the color information from the brightness (or lightness) information [34]. In the HSV color space, the color content is represented by H (hue) and S (saturation), whereas the luminance intensity is represented by V (luminance value). To eliminate the side effects of false colors, the proposed method only enhances the V component, while the H and S components remain unchanged.
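To make this pipeline concrete, the following minimal Python/NumPy sketch (illustrative only; the authors' experiments were run in MATLAB) converts a color image to HSV with OpenCV, enhances only the V channel through a caller-supplied function, and converts back. The helper name enhance_v and the use of OpenCV are assumptions for illustration, not part of the original paper.

```python
# Illustrative sketch: enhance only the V channel of an HSV image, as described
# above. The callable `enhance_v` stands in for the AGCMH -> LET -> NNT chain
# developed in the following subsections (hypothetical helper, not from the paper).
import numpy as np
import cv2


def enhance_color_image(bgr: np.ndarray, enhance_v) -> np.ndarray:
    """Enhance a BGR uint8 image by processing only the V channel in HSV space."""
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
    h, s, v = cv2.split(hsv)
    v_new = np.clip(enhance_v(v.astype(np.float64)), 0, 255).astype(np.uint8)
    return cv2.cvtColor(cv2.merge([h, s, v_new]), cv2.COLOR_HSV2BGR)
```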

2.1. Adaptive Gamma Correction with Modified Histogram

Gamma correction methods constitute a family of general histogram modification techniques obtained simply by using a variable adaptive parameter γ. A simple form of gamma correction based on a power-law transformation is described as:
$$T(l) = l_{\max} \left( \frac{l}{l_{\max}} \right)^{\gamma} \tag{1}$$
where l_max denotes the maximum intensity of the input image. After applying Formula (1), the intensity l of each pixel in the input image is converted to T(l). However, when the contrast is directly modified by gamma correction with a fixed parameter, different images exhibit the same intensity changes. Fortunately, this problem can be solved by calculating the probability density of each intensity level in the digital image.
The AGCWD [28] algorithm modifies the statistical histogram and lessens the generation of adverse effects by applying the weighting distribution (WD) function. The WD function is defined as:
$$pdf_w(l) = pdf_{\max} \left( \frac{pdf(l) - pdf_{\min}}{pdf_{\max} - pdf_{\min}} \right)^{\alpha} \tag{2}$$
where pdf_max is the maximum of the probability density function, pdf_min is its minimum, and α is an adjustment parameter, which is set to 0.8 in [28].
The weighted cumulative distribution function cdf_w(l) is approximated as:
$$cdf_w(l) = \sum_{k=0}^{l} pdf_w(k) \Big/ \sum pdf_w \tag{3}$$
where cdf denotes the cumulative distribution function, and the sum of pdf_w is calculated as:
$$\sum pdf_w = \sum_{l=0}^{l_{\max}} pdf_w(l) \tag{4}$$
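A minimal NumPy sketch of Equations (2)-(4) is given below, assuming an 8-bit luminance channel; it follows the AGCWD weighting with α = 0.8, but it is only an illustration, not the reference implementation of [28].

```python
# Sketch of the weighting distribution (Eq. 2) and the weighted cdf (Eqs. 3-4),
# assuming an 8-bit luminance image v with values in [0, 255].
import numpy as np


def weighted_cdf(v: np.ndarray, alpha: float = 0.8, l_max: int = 255) -> np.ndarray:
    hist = np.bincount(v.astype(np.int64).ravel(), minlength=l_max + 1)
    pdf = hist / hist.sum()                                              # probability density
    pdf_min, pdf_max = pdf.min(), pdf.max()
    pdf_w = pdf_max * ((pdf - pdf_min) / (pdf_max - pdf_min)) ** alpha   # Eq. (2)
    return np.cumsum(pdf_w) / pdf_w.sum()                                # Eqs. (3)-(4)
```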
The resultant cdf curves of AGCWD [28] with α = 0.8 for front lighting #1 and side lighting are shown in Figure 2. It can be seen that the resultant transformation curves of AGCWD [28] with α = 0.8 do not increase smoothly.
To illustrate this point further (front lighting #1, α = 0.8), we mark two intervals on the x-axis ((l1, l2) and (l3, l4)) and their corresponding intervals on the y-axis ((c1, c2) and (c3, c4), respectively). The small interval (l1, l2) is transformed to the considerably larger interval (c1, c2); correspondingly large weights are assigned in (3) in this case. When the large interval (l3, l4) is transformed to the small interval (c3, c4), only a small weight is assigned. This may lead to the loss of information in the processed image.
A modified histogram function is proposed to solve this problem as follows:
$$pdf_{wm}(l) = \frac{std(L)}{std(L) \cdot mean(L)} \times pdf_w(l)^{\delta} + \left( mean(L) + 255 \right) \times pdf_w(l) \tag{5}$$
where std(L) represents the standard deviation of the image luminance component, mean(L) is the mean value of the image luminance component, and δ is an adjustment parameter (discussed later).
The modified cumulative distribution function cdf_wm(l) is calculated by:
$$cdf_{wm}(l) = \sum_{k=0}^{l} pdf_{wm}(k) \Big/ \sum pdf_{wm} \tag{6}$$
Figure 3 shows the cumulative distribution function curves in AGCWD [28] and the proposed method for δ = 0.5, respectively.
In Figure 3, it can be observed that the proposed method avoids under-enhancement and over-enhancement in the low-intensity region by smoothing the fluctuating parts of the curve. On the other hand, the high-intensity levels are only slightly decreased, so the contrast is enhanced with intensity preservation.
Adaptive gamma correction with modified histogram (AGCMH) is given correspondingly as:
$$L_{AGC} = T(l) = l_{\max} \left( \frac{l}{l_{\max}} \right)^{\gamma} \tag{7}$$
where $\gamma = 1 - cdf_{wm}(l)$.
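The following sketch assembles Equations (5)-(7) into one AGCMH routine. It is a hedged illustration: Equation (5) is coded exactly as reconstructed above (its typeset form should be checked against the published paper), δ = 0.5 and α = 0.8 follow Section 3.1 and [28], and the per-level gamma is applied through a lookup table.

```python
# Hedged sketch of AGCMH (Eqs. 5-7): per-level gamma = 1 - cdf_wm(l), applied
# to an 8-bit luminance channel v. Eq. (5) is used as reconstructed in the text.
import numpy as np


def agcmh(v: np.ndarray, delta: float = 0.5, alpha: float = 0.8, l_max: int = 255) -> np.ndarray:
    hist = np.bincount(v.astype(np.int64).ravel(), minlength=l_max + 1)
    pdf = hist / hist.sum()
    pdf_min, pdf_max = pdf.min(), pdf.max()
    pdf_w = pdf_max * ((pdf - pdf_min) / (pdf_max - pdf_min)) ** alpha    # Eq. (2)
    s, m = v.std(), v.mean()
    pdf_wm = (s / (s * m)) * pdf_w ** delta + (m + 255) * pdf_w           # Eq. (5), as reconstructed
    cdf_wm = np.cumsum(pdf_wm) / pdf_wm.sum()                             # Eq. (6)
    gamma = 1.0 - cdf_wm                                                  # exponent in Eq. (7)
    lut = l_max * (np.arange(l_max + 1) / l_max) ** gamma                 # Eq. (7), per intensity level
    return lut[v.astype(np.int64)]                                        # L_AGC as a float array
```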

2.2. Log-Exp Transformation

Although the brightness contrast is stretched, some regions of the processed image still look dark, as shown in Figure 1. These dark regions should be processed further. A log-exp transformation (LET) is developed to compensate the illumination of the image while preserving its naturalness, assigning a higher weight to the dark regions and a lower weight to the bright regions. The LET is defined as follows:
$$T_{LET} = x \left( \log_b \left( L_{AGC} + a \right) + c \right) \tag{8}$$
where $T_{LET}$ maps an $N \times N$ intensity matrix to an $N \times N$ matrix, $x = \max(L_{AGC}) \cdot \big( 1 / \max\left( \log_b (L_{AGC} + a) + c \right) \big)$, the function max(·) returns the maximum value of a matrix, $a = mean(L_{AGC})$, $b = std(L_{AGC})$, and $c = mean(L_{AGC})/std(L_{AGC})$. The element of the intensity matrix $L_{AGC}$ is denoted as $l_{ij}$.
It can be seen from (8) that T_LET is an increasing function. After applying (8), a small value of l_ij is transformed to T_LET(l_ij) with a relatively large increment, while a large value of l_ij is transformed with a smaller increment. When l_ij reaches the maximum of L_AGC, the value of T_LET(l_ij) is equal to l_ij, as shown in Figure 4.
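A short sketch of Equation (8) is given below; the base-b logarithm is computed with the change-of-base identity, and the scale factor x is chosen exactly as defined above so that the maximum intensity is preserved. It assumes std(L_AGC) > 1, which holds for typical 8-bit images.

```python
# Sketch of the log-exp transformation (Eq. 8) applied elementwise to L_AGC.
import numpy as np


def log_exp_transform(L_agc: np.ndarray) -> np.ndarray:
    a = L_agc.mean()                               # a = mean(L_AGC)
    b = L_agc.std()                                # b = std(L_AGC), assumed > 1
    c = a / b                                      # c = mean / std
    log_b = np.log(L_agc + a) / np.log(b)          # log base b via change of base
    x = L_agc.max() / (log_b.max() + c)            # preserves the maximum intensity
    return x * (log_b + c)                         # T_LET
```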
The transformation would cause the image to blur to a certain degree, as shown in Figure 1. A nonlinear normalization is proposed to deal with this effect in the next section.

2.3. Nonlinear Normalization Transformation

Nonlinear normalization transformation (NNT) was developed to deal with the blur problem exhibited by LET as follows:
$$L_{NNT} = T_{NNT}(l_{ij}) = \chi \times l_{ij}^{\mu} + \lambda \tag{9}$$
where μ is a variable parameter; when it is set to 1, (9) reduces to a linear normalization function. χ and λ are parameters that constrain the intensity of L_NNT to [0, 255] according to:
$$\begin{cases} \chi \times l_{\max}^{\mu} + \lambda = 255 \\ \chi \times l_{\min}^{\mu} + \lambda = 0 \end{cases} \tag{10}$$
After the nonlinear normalization transformation, the luminance range is within [0, 255].
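A minimal sketch of Equations (9)-(10) follows: χ and λ are solved from the two boundary conditions so that the output exactly spans [0, 255], and μ = 0.7 is the value selected in Section 3.1.

```python
# Sketch of the nonlinear normalization (Eqs. 9-10) for the LET output L_let.
import numpy as np


def nonlinear_normalize(L_let: np.ndarray, mu: float = 0.7) -> np.ndarray:
    p = L_let ** mu                        # l_ij ** mu in Eq. (9)
    p_min, p_max = p.min(), p.max()
    chi = 255.0 / (p_max - p_min)          # solve the system in Eq. (10)
    lam = -chi * p_min
    return chi * p + lam                   # L_NNT, guaranteed to lie in [0, 255]
```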
To highlight the effect of each stage, an example is given in Figure 5 for AGCMH, LET, and NNT. It can be seen that AGCMH stretches the image histogram but leaves some dark regions (Figure 5b). The dark regions are brightened, with some blurring, after LET, as shown in Figure 5c. After NNT, the dynamic range of the image is widened and the output image looks natural and clear (Figure 5d).

3. Experimental Results and Analyses

To test the performance of the proposed approach, state-of-the-art methods were selected including RMSHE [21], AMHE [22], RSIHE [23], AGCWD [28], ESIHE [30], and BHEMHB [33]. The comparison was performed addressing both subjective and objective evaluations.

3.1. Parameter Analysis

To build a fair comparison, the parameters δ in (5) and μ in (9) were set to constant values so that they were less influenced by the different scene contents. To determine reasonable values of δ and μ, 2000 images from [35,36,37,38] were selected, and the enhancement performance was tested using the average values of contrast per pixel (CPP) [39], root mean square (RMS) [40], and discrete entropy (DE) [41]. Figure 6 shows the statistical results for different values of δ and μ. It can be seen from Figure 6 that excellent performance was obtained when δ was 0.5 and μ was 0.7. Therefore, δ and μ were set to 0.5 and 0.7, respectively, and maintained at these values throughout the experiments.
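The grid search described here can be sketched as follows; enhance (taking delta and mu keyword arguments) and the metric callables are hypothetical stand-ins for the proposed pipeline and for CPP, RMS, and DE, and the loop simply averages each metric over the test images for every (δ, μ) pair.

```python
# Hedged sketch of the parameter sweep over delta and mu; `enhance` and the
# metric callables are assumed to be provided by the caller.
import itertools
import numpy as np


def sweep_parameters(images, enhance, metrics, deltas, mus):
    results = {}
    for delta, mu in itertools.product(deltas, mus):
        per_image = [
            {name: fn(enhance(img, delta=delta, mu=mu)) for name, fn in metrics.items()}
            for img in images
        ]
        results[(delta, mu)] = {
            name: float(np.mean([scores[name] for scores in per_image])) for name in metrics
        }
    return results  # e.g. results[(0.5, 0.7)]["CPP"]
```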

3.2. Subjective Evaluation

For the subjective evaluation, the test images were selected from DIP3/e Book Images [35], the Caltech faces 1999 dataset [36], NASA release images [37], and the Extended Yale B database [38]. The chosen images were classified as dim, backlighting, front lighting, daytime, side lighting, or gray. There were non-uniform illuminations for most of the selected images.
Some results of the investigated methods for dim #1 are given in Figure 7. Both RMSHE [21] and RSIHE [23] improved the dark area but lost some gray levels in the bright area, as shown in Figure 7b,d, respectively. AMHE [22] equalized the original histogram without changing the overall luminance, as shown in Figure 7c. Both AGCWD [28] and ESIHE [30] obtained a superior dynamic contrast compared with the abovementioned methods, although some dark regions remained, as shown in Figure 7e,f, respectively. The result of BHEMHB [33] was more natural than those of AGCWD [28] and ESIHE [30], but some unnatural regions remained, as shown in Figure 7g. In contrast, the proposed method exhibited an excellent performance (Figure 7h).
Another example for dim #2 is given in Figure 8. RMSHE [21], AMHE [22], and RSIHE [23] improved the dark region only to a low level of brightness, providing weak enhancement, as shown in Figure 8b–d, respectively. AGCWD [28] produced an outstanding result, apart from some regions that remained dark. ESIHE [30] and BHEMHB [33] enhanced the contrast of the image and preserved its original features, but the results looked somewhat unnatural, as shown in Figure 8f,g, respectively. A clearer result was achieved by the proposed method, as shown in Figure 8h.
Some enhancement results for a backlighting image using different methods are shown in Figure 9.
The image shown in Figure 9a is poorly illuminated due to the backlighting. RMSHE [21], AMHE [22], and RSIHE [23] achieved improvements in the facial region, but some dark and unnatural regions remained, especially in the facial areas, as shown in Figure 9b–d, respectively. Some information was lost with AGCWD [28] due to its unreasonable cdf curve, as shown in Figure 9e. Both ESIHE [30] and BHEMHB [33] enhanced the contrast and preserved the color information to some extent, but the results still looked somewhat dark, as shown in Figure 9f,g, respectively. Meanwhile, the proposed method achieved a more natural and clearer result (Figure 9h).
Some enhancement results for front lighting #1 using different methods are shown in Figure 10. Both RMSHE [21] and RSIHE [23] improved the background but lost some gray-level information in the facial region, as shown in Figure 10b,d, respectively. AMHE [22] compressed the dynamic range of the image and produced a blurred result, as shown in Figure 10c. AGCWD [28] also improved the background but left an oversaturated area in the facial region, as shown in Figure 10e. Both ESIHE [30] and BHEMHB [33] obtained a superior dynamic range, although some dark regions remained, as shown in Figure 10f,g. In contrast, the proposed method achieved excellent visual quality, as shown in Figure 10h.
Some enhancement results for front lighting #2 with different methods are shown in Figure 11. RMSHE [21], AMHE [22], and RSIHE [23] improved the contrast sketch, as shown in Figure 11b–d, respectively. AGCWD [28] achieved a fine dynamic range, although some dark regions are still apparent in Figure 11e. Both ESIHE [30] and BHEMHB [33] showed high visibility but over-enhanced the facial regions, as shown in Figure 11f,g. The proposed method gave a more natural and clearer image, as shown in Figure 11h.
Some enhancements for both daytime #1 and #2 using different methods are shown in Figure 12 and Figure 13, respectively. It can be seen that a superior enhancement performance was given by the proposed method as compared to the other methods in Figure 12 and Figure 13.
Some enhancements for side lighting are shown in Figure 14. RMSHE [21], AMHE [22], and RSIHE [23] improved the shadow areas but produced inconsistent luminance, and some dark regions remained, as shown in Figure 14b–d. AGCWD [28] achieved a fine dynamic brightness range, yet the image looked both unnatural and unclear to human vision, as shown in Figure 14e. Both ESIHE [30] and BHEMHB [33] improved the visibility and enhanced the details, but the dynamic range was compressed, as shown in Figure 14f,g, respectively. Meanwhile, with the proposed method, the transition between brightness and darkness was smooth and the processed image texture became clearer, as shown in Figure 14h.
Another enhancement example for a gray image is shown in Figure 15. RMSHE [21] led to over-enhancement in the bright region, as shown in Figure 15b. Both AMHE [22] and RSIHE [23] obtained limited improvements in the dark region, as shown in Figure 15c,d, respectively. AGCWD [28] improved the visibility to some extent. The image was a little blurred since this method compressed the dynamic range of brightness, as shown in Figure 15e. Both ESIHE [30] and BHEMHB [33] caused some details to stand out. However, the image still looked dark due to its limited improvement in brightness. The proposed method had the best visibility and naturalness, as shown in Figure 15h.

3.3. Objective Evaluations

To further test the enhancement performance quantitatively, CPP [39], RMS [40], and DE [41] were employed to compare the proposed method with the state-of-the-art approaches mentioned above.

3.3.1. Contrast Per Pixel (CPP)

CPP [39] indicates the degree of variation among surrounding pixels, and its value is proportional to the contrast. It measures the degree of detail expressed in an M × N image as follows:
$$CPP = \frac{\displaystyle \sum_{i=0}^{M-1} \sum_{j=0}^{N-1} \left( \frac{1}{9} \sum_{m=-1}^{1} \sum_{n=-1}^{1} \left| \bar{l}(i,j) - \bar{l}(i+m, j+n) \right| \right)}{MN} \tag{11}$$
where l indicates the luminance element of the image.
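A direct NumPy transcription of Equation (11) is sketched below; border pixels are handled by edge replication, which is an implementation choice not specified in the text.

```python
# Sketch of CPP (Eq. 11): mean absolute difference between each pixel and its
# 3x3 neighbourhood, averaged over an M x N luminance image.
import numpy as np


def contrast_per_pixel(l: np.ndarray) -> float:
    l = l.astype(np.float64)
    padded = np.pad(l, 1, mode="edge")            # replicate borders (assumption)
    diff_sum = np.zeros_like(l)
    for m in (-1, 0, 1):
        for n in (-1, 0, 1):
            shifted = padded[1 + m:1 + m + l.shape[0], 1 + n:1 + n + l.shape[1]]
            diff_sum += np.abs(l - shifted)
    return float((diff_sum / 9.0).mean())
```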
Some results are given in Table 1 for CPP [39] obtained by the investigated methods mentioned above.
It can be seen from Table 1 that the proposed method outperforms the comparison algorithms in CPP [39] as a whole.
To further highlight the performance of the proposed method, 2000 images were divided into five groups. Each group contained 400 images. The groups included backlight (G1), daytime (G2), front lighting (G3), dim (G4), and side lighting (G5) images. We computed their corresponding CPP values, as given in Table 2.
It can be further found from Table 2 that among the methods investigated for comparison, the best enhancement performance was achieved by the developed method.

3.3.2. Root Mean Square (RMS)

RMS is a common way to measure the contrast of an image [40]; a higher value indicates better visibility of the output image. The RMS [40] is defined as:
$$RMS = \sqrt{ \frac{1}{MN} \sum_{i=0}^{M-1} \sum_{j=0}^{N-1} \left( \mu - l(i,j) \right)^2 } \tag{12}$$
where μ represents the mean of the input image, l indicates the luminance element of the input image, and M and N represent the width and height of the image, respectively.
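In code, Equation (12) is simply the standard deviation of the luminance values; a minimal sketch:

```python
# Sketch of RMS contrast (Eq. 12) for a luminance image l.
import numpy as np


def rms_contrast(l: np.ndarray) -> float:
    l = l.astype(np.float64)
    return float(np.sqrt(np.mean((l.mean() - l) ** 2)))
```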
Some results for RMS [40] are given in Table 3 and Table 4 for the investigated methods mentioned above. As indicated in Table 3, the RMSHE method obtained the best result for the dim #2 image, and AGCWD reached the highest RMS value for the front lighting #1 and side lighting images; the proposed algorithm performed best on all other images. In Table 4, the proposed method acquired the highest value in all groups except G5, and it also obtained the highest average value.
Overall, Table 3 and Table 4 show that the developed method achieved the best enhancement performance among the investigated methods.

3.3.3. Discrete Entropy (DE)

According to information theory, the richness of the information in an image can be measured by the Shannon entropy: the greater the entropy value, the more information the output image contains [41]. The discrete entropy (DE) over the gray levels l of the whole image can be defined as:
$$DE = - \sum_{l=0}^{L-1} p(l) \log_2 p(l) \tag{13}$$
Only if p(0) = p(1) = ... = p(L − 1) = 1/L can the entropy of the image achieve its maximum. This is the case when the probability distribution of image intensity values is uniform. Some results are given in Table 5 and Table 6 for the values of DE achieved using the investigated methods mentioned above.
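For completeness, a minimal sketch of the DE computation of Equation (13) is given below (an illustrative NumPy version, not the authors' code); empty histogram bins are skipped so that 0·log 0 is treated as 0.

```python
# Sketch of discrete entropy (Eq. 13) over the grey-level histogram of l.
import numpy as np


def discrete_entropy(l: np.ndarray, levels: int = 256) -> float:
    hist = np.bincount(l.astype(np.int64).ravel(), minlength=levels)
    p = hist / hist.sum()
    p = p[p > 0]                          # drop empty bins: 0 * log2(0) -> 0
    return float(-np.sum(p * np.log2(p)))
```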
It can be further seen from Table 5 and Table 6 that the developed method exhibits the best enhancement performance compared with the other investigated approaches.
We note that the proposed method can effectively improve the visibility, contrast, and brightness of non-uniform illumination images while preserving image details. The method was shown to match or exceed the performance of the other selected state-of-the-art methods.
To test its performance in real-time processing, images with a size of 896 × 592 pixels were used for further testing. All the experiments were run in MATLAB 2016 on a PC with a 3.3 GHz CPU and 4 GB of RAM. Table 7 shows the average running time of the different methods. The proposed method obtained superior real-time performance compared with most of the other algorithms.
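As a rough illustration of how such per-image averages can be measured (the original timings in Table 7 were obtained in MATLAB; this Python harness is only a sketch under that caveat):

```python
# Minimal timing harness: average wall-clock time of `method` per image.
import time


def average_runtime(method, images, repeats: int = 10) -> float:
    start = time.perf_counter()
    for _ in range(repeats):
        for img in images:
            method(img)
    return (time.perf_counter() - start) / (repeats * len(images))
```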

4. Conclusions

In this paper, an effective method to enhance the contrast of digital images was presented. The histogram of the input image is modified by a novel function, which avoids the loss of information during processing. The log-exp transformation was developed to assign higher weights to the dark regions of the image and lower weights to the bright regions. In order to obtain a wider dynamic range for the brightness component of the output image, a nonlinear normalization transformation was put forward. The clarity of non-uniform illumination images was noticeably improved by the proposed method. The experimental results show that the proposed method delivers better image enhancement performance than other state-of-the-art methods.

Author Contributions

L.Z. implemented the core algorithm and drafted the manuscript. Y.G. reviewed and edited the manuscript. All authors discussed the results and implications, commented on the manuscript at all stages, and approved the final version.

Funding

This work was supported in part by the Natural Science Foundation of China (grant nos. 11176016 and 60872117).

Conflicts of Interest

Liyun Zhuang and Yepeng Guan declare that there are no conflicts of interest regarding the publication of this paper.

References

  1. Du, S.; Ward, R.K. Adaptive region-based image enhancement method for robust face recognition under variable illumination conditions. IEEE Trans. Circuits Syst. Video Technol. 2010, 20, 1165–1175. [Google Scholar] [CrossRef]
  2. Sun, G.; Liu, S.; Wang, W.; Chen, Z. Dynamic range compression and detail enhancement algorithm for infrared image. Appl. Opt. 2014, 53, 6013–6029. [Google Scholar] [CrossRef] [PubMed]
  3. Huang, T.; Shih, K.; Yeh, S.; Chen, H. Enhancement of backlight-scaled images. IEEE Trans. Image Process. 2013, 22, 4587–4597. [Google Scholar] [CrossRef] [PubMed]
  4. Atta, R.; Ghanbari, M. Low-contrast satellite images enhancement using discrete cosine transform pyramid and singular value decomposition. IET Image Process. 2013, 7, 472–483. [Google Scholar] [CrossRef]
  5. Han, H.; Sohn, K. Automatic illumination and color compensation using mean shift and sigma filter. IEEE Trans. Consum. Electr. 2009, 55, 978–986. [Google Scholar] [CrossRef]
  6. Beghdadi, A.; Le Negrate, A. Contrast enhancement technique based on local detection of edges. Comput. Vis. Image Und. 1989, 46, 162–174. [Google Scholar]
  7. Cheng, H.D.; Xu, H. A novel fuzzy logic approach to contrast enhancement. Pattern Recognit. 1989, 33, 809–819. [Google Scholar] [CrossRef]
  8. Tang, J.; Liu, X.; Sun, Q. A direct image contrast enhancement algorithm in the wavelet domain for screening mammograms. IEEE J. Sel. Top. Signal Process. 2009, 3, 74–80. [Google Scholar] [CrossRef]
  9. Sherrier, R.H.; Johnson, G.A. Regionally adaptive histogram equalization of the chest. IEEE Trans. Med. Imaging 1987, 6, 1–7. [Google Scholar] [CrossRef]
  10. Polesel, A.; Ramponi, G.; Mathews, V.J. Image enhancement via adaptive unsharp masking. IEEE Trans. Image Process. 2000, 9, 505–510. [Google Scholar] [CrossRef] [Green Version]
  11. Chiu, Y.S.; Cheng, F.C.; Huang, S.C. Efficient contrast enhancement using adaptive gamma correction and cumulative intensity distribution. In Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics, Anchorage, AK, USA, 9–12 October 2011; pp. 2946–2950. [Google Scholar]
  12. Cheng, H.D.; Shi, X.J. A simple and effective histogram equalization approach to image enhancement. Digit Signal Process. 2004, 14, 158–170. [Google Scholar] [CrossRef]
  13. Coltuc, D.; Bolon, P.; Chassery, J.M. Exact histogram specification. IEEE Trans. Image Process. 2006, 15, 1143–1152. [Google Scholar] [CrossRef] [PubMed]
  14. Hussain, K.; Rahman, S.; Khaled, S.M.; Abdullah-Al-Wadud, M.; Shoyaib, M. Dark image enhancement by locally transformed histogram. In Proceedings of the IEEE International Conference on Software, Knowledge, Information Management and Applications, Dhaka, Bangladesh, 18–20 December 2014; pp. 1–7. [Google Scholar]
  15. Kim, Y.T. Contrast enhancement using brightness preserving bihistogram equalization. IEEE Trans. Consum. Electr. 1997, 43, 1–8. [Google Scholar]
  16. Kim, M.; Chung, M.G. Recursively separated and weighted histogram equalization for brightness preservation and contrast enhancement. IEEE Trans. Consum. Electr. 2008, 54, 1389–1397. [Google Scholar] [CrossRef]
  17. Wang, Y.; Chen, Q.; Zhang, B. Image enhancement based on equal area dualistic sub-image histogram equalization method. IEEE Trans. Consum. Elect. 1999, 45, 68–75. [Google Scholar] [CrossRef]
  18. Chen, S.D.; Ramli, A.R. Minimum mean brightness error bihistogram equalization in contrast enhancement. IEEE Trans. Consum. Electr. 2003, 49, 1310–1319. [Google Scholar] [CrossRef]
  19. Lim, S.H.; Isa, N.A.M.; Ooi, C.H.; Toh, K.K.V. A new histogram equalization method for digital image enhancement and brightness preservation. Signal Image Video Process. 2015, 9, 675–689. [Google Scholar] [CrossRef]
  20. Wang, C.; Ye, Z. Brightness preserving histogram equalization with maximum entropy: A variational perspective. IEEE Trans. Consum. Electr. 2005, 51, 1326–1334. [Google Scholar] [CrossRef]
  21. Chen, S.D.; Ramli, A.R. Contrast enhancement using recursive mean separate histogram equalization for scalable brightness preservation. IEEE Trans. Consum. Electr. 2003, 49, 1301–1309. [Google Scholar] [CrossRef]
  22. Kim, H.J.; Lee, J.M.; Lee, J.A.; Oh, S.G.; Kim, W.Y. Contrast enhancement using adaptively modified histogram equalization. In Pacific-Rim Symposium on Image and Video Technology; Springer: Berlin/Heidelberg, Germany, 2006. [Google Scholar]
  23. Sim, K.S.; Tso, C.P.; Tan, Y.Y. Recursive sub-image histogram equalization applied to gray scale images. Pattern Recogn. Lett. 2007, 28, 1209–1221. [Google Scholar] [CrossRef]
  24. Hasikin, K.; Isa, N.A.M. Adaptive fuzzy intensity measure enhancement technique for non-uniform illumination and low-contrast images. Signal Image Video Process. 2015, 9, 1419–1442. [Google Scholar] [CrossRef]
  25. Sun, C.C.; Ruan, S.J.; Shie, M.C.; Pai, T.W. Dynamic contrast enhancement based on histogram specification. IEEE Trans. Consum. Electr. 2005, 51, 1300–1305. [Google Scholar]
  26. Tsai, C.M.; Yeh, Z.M. Contrast enhancement by automatic and parameter-free piecewise linear transformation for color images. IEEE Trans. Consum. Electr. 2008, 54, 213–219. [Google Scholar] [CrossRef] [Green Version]
  27. Tsai, C.M.; Yeh, Z.M.; Wang, Y.F. Decision tree-based contrast enhancement for various color images. Mach. Vis. Appl. 2011, 22, 21–37. [Google Scholar] [CrossRef]
  28. Huang, S.C.; Cheng, F.C.; Chiu, Y.S. Efficient contrast enhancement using adaptive gamma correction with weighting distribution. IEEE Trans. Image Process. 2013, 22, 1032–1041. [Google Scholar] [CrossRef] [PubMed]
  29. Rahman, S.; Rahman, M.M.; Hussain, K.; Khaled, S.M.; Shoyaib, M. Image enhancement in spatial domain: A comprehensive study. In Proceedings of the 2014 17th International Conference on Computer and Information Technology, Dhaka, Bangladesh, 22–23 December 2014; pp. 368–373. [Google Scholar]
  30. Singh, K.; Kapoor, R. Image enhancement using exposure based sub image histogram equalization. Pattern Recogn. Lett. 2014, 36, 10–14. [Google Scholar] [CrossRef]
  31. Hasikin, K.; Isa, N.A.M. Adaptive fuzzy contrast factor enhancement technique for low contrast and nonuniform illumination images. Signal Image Video Process. 2014, 8, 1591–1603. [Google Scholar] [CrossRef]
  32. Tang, J.R.; Isa, N.A.M. Intensity exposure-based bi-histogram equalization for image enhancement. Turk. J. Electr. Eng. Comput. Sci. 2016, 24, 3564–3585. [Google Scholar] [CrossRef]
  33. Tang, J.R.; Isa, N.A.M. Bi-histogram equalization using modified histogram bins. Appl. Soft Comput. 2017, 55, 31–43. [Google Scholar] [CrossRef]
  34. Hanmandlu, M.; Verma, O.P.; Kumar, N.K.; Kulkarni, M. A novel optimal fuzzy system for color image enhancement using bacterial foraging. IEEE Trans. Instrum. Meas. 2009, 58, 2867–2879. [Google Scholar] [CrossRef]
  35. Gonzalez, R.C.; Woods, R.E. DIP3/e Book Images. Available online: http://www.imageprocessingplace.com/DIP3E/dip3e-book-images-downloads.htm (accessed on 7 January 2019).
  36. Woods, M. Frontal Face Dataset. Available online: http://www.vision.caltech.edu/htmlles/archive.html (accessed on 7 January 2019).
  37. Braukus, M.; Henry, K. NASA Technology Helps Weekend Photographers Look Like Pros. Available online: http://dragon.larc.nasa.gov/retinex/pao/news/ (accessed on 7 January 2019).
  38. Peli, E. Contrast in complex images. J. Opt. Soc. Am. A 1990, 7, 2032–2040. [Google Scholar] [CrossRef] [PubMed]
  39. Lee, K.C. The Extended Yale Face Database B. Available online: http://vision.ucsd.edu/~iskwak/ExtYaleDatabase/ExtYaleB.html (accessed on 7 January 2019).
  40. Wang, Z.; Bovik, A.C.; Sheikh, H.R.; Simoncelli, E.P. Image quality assessment: From error visibility to structural similarity. IEEE Trans. Image Process. 2004, 13, 600–612. [Google Scholar] [CrossRef] [PubMed]
  41. Gull, S.F.; Skilling, J. Maximum entropy method in image processing. IEE Proc. 1984, 131, 646–659. [Google Scholar] [CrossRef]
Figure 1. An overall flowchart of the proposed approach for image enhancement.
Figure 2. (a) Front lighting #1 (left) and side lighting (right). (b) Resultant transformation curves of adaptive gamma correction with weighting distribution (AGCWD) [28] after applying the weighting distribution function.
Figure 3. Transformation curves after applying the modified histogram function for front lighting #1 and side lighting in Figure 2.
Figure 4. Comparisons of processed results (top) and the corresponding 3D model of the luminance component (bottom). (a) Adaptive gamma correction with modified histogram (AGCMH), (b) log-exp transformation (LET).
Figure 5. (a) Original V (luminance value) component (top) and corresponding histogram (bottom), (b) AGCMH, (c) LET, and (d) nonlinear normalization transformation (NNT).
Figure 6. Enhancement performance for different values of δ and μ. (a) Contrast per pixel (CPP) [39], (b) root mean square (RMS) [40], (c) discrete entropy (DE) [41].
Figure 7. Results for dim #1. (a) Original image, (b) recursive mean-separate histogram equalization (RMSHE) [21], (c) AMHE [22], (d) recursive sub-image histogram equalization (RSIHE) [23], (e) AGCWD [28], (f) exposure sub-image histogram equalization (ESIHE) [30], (g) BHEMHB [33], and (h) the proposed method.
Figure 8. Results for dim #2. (a) Original image, (b) RMSHE [21], (c) AMHE [22], (d) RSIHE [23], (e) AGCWD [28], (f) ESIHE [30], (g) BHEMHB [33], and (h) the proposed method.
Figure 9. Results for backlighting. (a) Original image, (b) RMSHE [21], (c) AMHE [22], (d) RSIHE [23], (e) AGCWD [28], (f) ESIHE [30], (g) BHEMHB [33], and (h) the proposed method.
Figure 10. Results for front lighting #1. (a) Original image, (b) RMSHE [21], (c) AMHE [22], (d) RSIHE [23], (e) AGCWD [28], (f) ESIHE [30], (g) BHEMHB [33], and (h) the proposed method.
Figure 11. Results for front lighting #2. (a) Original image, (b) RMSHE [21], (c) AMHE [22], (d) RSIHE [23], (e) AGCWD [28], (f) ESIHE [30], (g) BHEMHB [33], and (h) the proposed method.
Figure 12. Results for daytime #1. (a) Original image, (b) RMSHE [21], (c) AMHE [22], (d) RSIHE [23], (e) AGCWD [28], (f) ESIHE [30], (g) BHEMHB [33], and (h) the proposed method.
Figure 13. Results for daytime #2. (a) Original image, (b) RMSHE [21], (c) AMHE [22], (d) RSIHE [23], (e) AGCWD [28], (f) ESIHE [30], (g) BHEMHB [33], and (h) the proposed method.
Figure 14. Results for side lighting. (a) Original image, (b) RMSHE [21], (c) AMHE [22], (d) RSIHE [23], (e) AGCWD [28], (f) ESIHE [30], (g) BHEMHB [33], and (h) the proposed method.
Figure 15. Results for gray. (a) Original image, (b) RMSHE [21], (c) AMHE [22], (d) RSIHE [23], (e) AGCWD [28], (f) ESIHE [30], (g) BHEMHB [33], and (h) the proposed method.
Table 1. Objective evaluation results in terms of CPP.

| Image | RMSHE [21] | AMHE [22] | RSIHE [23] | AGCWD [28] | ESIHE [30] | BHEMHB [33] |
|---|---|---|---|---|---|---|
| Dim #1 | 6.92 | 3.32 | 7.55 | 5.33 | 4.16 | 5.11 |
| Dim #2 | 11.06 | 6.64 | 9.66 | 10.92 | 9.42 | 7.27 |
| Backlighting | 16.85 | 15.77 | 16.84 | 17.13 | 15.72 | 15.21 |
| Front Lighting #1 | 6.17 | 4.50 | 5.60 | 4.95 | 5.01 | 6.15 |
| Front Lighting #2 | 5.87 | 5.98 | 7.16 | 7.69 | 7.05 | 7.45 |
| Daytime #1 | 9.41 | 8.08 | 9.30 | 7.45 | 8.95 | 9.56 |
| Daytime #2 | 13.13 | 12.38 | 13.36 | 11.33 | 12.35 | 12.67 |
| Side Lighting | 9.60 | 10.13 | 9.30 | 11.57 | 9.49 | 9.66 |
| Gray | 9.30 | 8.38 | 9.24 | 7.36 | 8.13 | 9.60 |
| Average | 9.81 | 8.35 | 9.81 | 9.30 | 8.92 | 9.19 |
Table 2. CPP performance for the five image groups with different methods.

| Method | G1 | G2 | G3 | G4 | G5 | Average |
|---|---|---|---|---|---|---|
| RMSHE [21] | 8.68 | 9.20 | 7.35 | 4.79 | 5.45 | 7.09 |
| AMHE [22] | 7.99 | 7.35 | 6.48 | 3.57 | 4.37 | 5.95 |
| RSIHE [23] | 8.58 | 8.09 | 6.96 | 5.58 | 5.00 | 6.84 |
| AGCWD [28] | 8.20 | 7.85 | 8.04 | 4.62 | 5.17 | 6.78 |
| ESIHE [30] | 8.59 | 8.73 | 7.62 | 4.28 | 4.44 | 6.73 |
| BHEMHB [33] | 8.79 | 8.77 | 7.89 | 4.66 | 5.13 | 7.05 |
| Proposed | 8.84 | 8.58 | 8.48 | 7.68 | 6.03 | 7.92 |
Table 3. Objective evaluation results in terms of RMS.

| Image | RMSHE [21] | AMHE [22] | RSIHE [23] | AGCWD [28] | ESIHE [30] | BHEMHB [33] | Proposed |
|---|---|---|---|---|---|---|---|
| Dim #1 | 72.41 | 45.30 | 72.95 | 77.78 | 62.98 | 43.57 | 79.12 |
| Dim #2 | 72.38 | 38.55 | 63.32 | 64.84 | 54.41 | 38.48 | 72.22 |
| Backlighting | 82.58 | 78.46 | 81.48 | 82.18 | 81.29 | 77.51 | 84.45 |
| Front Lighting #1 | 74.33 | 72.05 | 76.37 | 88.05 | 75.44 | 73.65 | 83.59 |
| Front Lighting #2 | 69.07 | 79.08 | 82.26 | 87.49 | 76.70 | 68.40 | 87.91 |
| Daytime #1 | 74.13 | 58.19 | 67.12 | 58.55 | 65.27 | 71.15 | 75.70 |
| Daytime #2 | 71.23 | 60.88 | 64.10 | 63.40 | 61.38 | 60.82 | 72.11 |
| Side Lighting | 73.94 | 73.33 | 72.30 | 82.01 | 71.51 | 64.7 | 77.86 |
| Gray | 67.71 | 58.55 | 66.08 | 54.95 | 56.76 | 62.08 | 70.47 |
| Average | 73.09 | 62.71 | 71.78 | 73.25 | 67.30 | 62.26 | 78.16 |
Table 4. RMS performance for the five image groups with different methods.

| Method | G1 | G2 | G3 | G4 | G5 | Average |
|---|---|---|---|---|---|---|
| RMSHE [21] | 77.57 | 71.75 | 73.23 | 58.47 | 69.32 | 70.07 |
| AMHE [22] | 80.47 | 62.97 | 65.71 | 39.82 | 67.73 | 63.34 |
| RSIHE [23] | 83.92 | 66.51 | 68.14 | 63.25 | 71.72 | 70.71 |
| AGCWD [28] | 83.44 | 66.84 | 75.08 | 58.74 | 88.16 | 74.45 |
| ESIHE [30] | 73.97 | 69.27 | 68.85 | 68.67 | 68.57 | 69.87 |
| BHEMHB [33] | 73.89 | 31.34 | 68.85 | 40.91 | 57.67 | 54.53 |
| Proposed | 84.24 | 72.45 | 75.18 | 73.90 | 83.95 | 77.94 |
Table 5. Objective evaluation results in terms of DE.

| Image | RMSHE [21] | AMHE [22] | RSIHE [23] | AGCWD [28] | ESIHE [30] | BHEMHB [33] | Proposed |
|---|---|---|---|---|---|---|---|
| Dim #1 | 6.83 | 6.75 | 7.19 | 7.35 | 7.14 | 7.25 | 7.91 |
| Dim #2 | 7.20 | 7.02 | 6.98 | 7.53 | 7.38 | 7.14 | 7.90 |
| Backlighting | 7.35 | 7.29 | 7.34 | 7.30 | 7.22 | 7.26 | 7.38 |
| Front Lighting #1 | 7.98 | 7.74 | 7.89 | 7.60 | 7.86 | 7.96 | 7.79 |
| Front Lighting #2 | 7.60 | 7.69 | 7.66 | 7.50 | 7.84 | 7.83 | 7.86 |
| Daytime #1 | 7.80 | 7.62 | 7.77 | 7.34 | 7.75 | 7.67 | 7.87 |
| Daytime #2 | 7.67 | 7.71 | 7.65 | 7.10 | 7.64 | 7.67 | 7.69 |
| Side Lighting | 7.62 | 7.69 | 7.60 | 7.67 | 7.68 | 7.71 | 7.75 |
| Gray | 7.35 | 7.45 | 7.35 | 6.47 | 7.25 | 7.56 | 7.61 |
| Average | 7.49 | 7.44 | 7.49 | 7.32 | 7.53 | 7.56 | 7.75 |
Table 6. DE performance for the five image groups with different methods.

| Method | G1 | G2 | G3 | G4 | G5 | Average |
|---|---|---|---|---|---|---|
| RMSHE [21] | 7.47 | 7.87 | 7.65 | 5.87 | 7.36 | 7.24 |
| AMHE [22] | 7.47 | 7.72 | 7.73 | 6.52 | 7.07 | 7.30 |
| RSIHE [23] | 7.52 | 7.75 | 7.67 | 5.83 | 6.97 | 7.15 |
| AGCWD [28] | 7.22 | 7.39 | 7.63 | 6.59 | 7.06 | 7.18 |
| ESIHE [30] | 7.51 | 7.85 | 7.61 | 6.51 | 6.87 | 7.27 |
| BHEMHB [33] | 7.53 | 7.84 | 7.76 | 6.75 | 7.20 | 7.42 |
| Proposed | 7.68 | 7.66 | 7.76 | 7.24 | 7.37 | 7.54 |
Table 7. Average running time (s).

| RMSHE [21] | AMHE [22] | RSIHE [23] | AGCWD [28] | ESIHE [30] | BHEMHB [33] | Proposed |
|---|---|---|---|---|---|---|
| 0.21 | 0.26 | 0.23 | 0.30 | 0.18 | 0.19 | 0.17 |
