Article

Colorimetric Characterization of Color Imaging System Based on Kernel Partial Least Squares

Siyu Zhao, Lu Liu, Zibing Feng, Ningfang Liao, Qiang Liu and Xufen Xie

1 School of Information Science and Engineering, Dalian Polytechnic University, Dalian 116034, China
2 National Key Lab of Colour Science and Engineering, Beijing Institute of Technology, Beijing 100081, China
3 Research Center of Graphic Communication, Printing and Packaging, Wuhan University, Wuhan 430079, China
* Authors to whom correspondence should be addressed.
Sensors 2023, 23(12), 5706; https://doi.org/10.3390/s23125706
Submission received: 2 May 2023 / Revised: 5 June 2023 / Accepted: 15 June 2023 / Published: 19 June 2023
(This article belongs to the Special Issue Recent Trends and Advances in Color and Spectral Sensors)

Abstract

Colorimetric characterization is the basis of color information management in color imaging systems. In this paper, we propose a colorimetric characterization method based on kernel partial least squares (KPLS) for color imaging systems. The method takes the kernel-function expansion of the three-channel response values (RGB) in the device-dependent space of the imaging system as the input feature vectors and the CIE 1931 XYZ tristimulus values as the output vectors. We first establish a KPLS color-characterization model for color imaging systems, then determine its hyperparameters via nested cross-validation and grid search, and thereby realize the color space transformation model. The proposed model is validated experimentally, with the CIELAB, CIELUV and CIEDE2000 color differences used as evaluation metrics. The results of the nested cross-validation test on the ColorChecker SG chart show that the proposed model is superior to the weighted nonlinear regression model and the neural network model, and that the method has good prediction accuracy.

1. Introduction

Color imaging systems have been widely used in industrial manufacturing, medicine, geological exploration, art and other fields. Colorimetric characterization [1,2,3] is an important means of evaluating the color characteristics of a color imaging system; it establishes the transformation between the device-dependent RGB color space and a device-independent color space, and ensures color consistency from color information acquisition to color information output [4]. Colorimetric characterization is essential for high-fidelity image display, colorimetric measurement, color reproduction, color analysis, color management across devices [5] and color appearance prediction [6], and the requirements on its accuracy and stability keep increasing across applications. Color analysis [7,8,9,10] also plays an important role in evaluating the performance of many technologies, including paper-based devices that rely on colorimetric reactions, for which researchers have worked in different color spaces such as weighted RGB and greyscale.
According to international standards, color characterization methods can generally be divided into two categories: spectral response-based and target color-based methods [11,12,13]. The spectral response-based method finds the relationship between the spectral response and the CIE color-matching functions when the three-channel spectral response of the color imaging system is known, and thereby establishes the transformation between the device-dependent RGB color space and the device-independent CIEXYZ color space [14]. However, measuring the three-channel spectral response of an imaging system requires not only special experimental and measurement equipment but also professional measurement procedures [15]. Monochromators and radiometers are among the instruments used, and experts are required to manage the operation in the laboratory [16]; the procedure is difficult for general device users. The advantage is that, once the spectral response is known, the tristimulus values of objects under any light source with a known spectral power distribution can be predicted, and estimation methods for camera spectral response characteristics are also being explored [17]. The target color-based method establishes the mapping between the input and output color spaces from measured samples, using interpolation methods, regression models and so on; it places few demands on the experimental setup and is widely used in practice. In the early stages of color characterization, interpolation methods were widely used [18], and one of the common color-characterization methods at that time was the three-dimensional lookup table method [19,20]. Based on different regression models, common target color-based methods include the least-squares polynomial regression method [21,22], neural networks [23,24,25,26,27], support vector machines [28], etc.; from the perspective of the input features of the model, these methods can also incorporate different feature-selection and optimization techniques. RGB cross-polynomial features have been widely used in color measurement, color information management and color correction because of their high conversion accuracy and low computational cost [29,30,31,32,33]. Essentially, this technique exploits nonlinear combinations of the input features to increase their dimensionality. Besides polynomial expansion, other kernel-based dimensionality-expansion techniques still need further exploration in the field of colorimetric characterization. Many studies have been based on ordinary least squares with RGB cross-polynomial expansion. In 2000, Hong et al. proposed a polynomial regression method [34] that uses different combinations of R, G and B cross-terms as inputs. In 2007, Bianco et al. proposed a pattern search optimization algorithm [35] for the conversion from RGB to CIEXYZ space, which applies pattern search optimization to the least-squares 3 × 3 RGB-to-XYZ transformation. Meanwhile, for the problems of spectral reflectance reconstruction and training sample validity in color measurement, various improved least-squares algorithms have been applied in different studies. In terms of spectral reconstruction, in 2010, Shen et al. proposed that partial least squares (PLS) regression can also be used to build a regression model based on the correlation between response values and spectral reflectance [36]. In 2013, Heikkinen et al. proposed a kernel ridge regression (KRR) method for spectral reflectance estimation [37]; essentially, KRR nonlinearly transforms the low-dimensional camera responses into a high-dimensional feature space and performs regularized least-squares regression on the reflectance data in that space. In 2019, Xiao et al. proposed a new method for spectral reflectance reconstruction based on kernel partial least squares regression (KPLS) [38]. The problem of colorimetric characterization differs from that of spectral reflectance reconstruction: although the spectral reflectance of an object surface is a high-resolution description of its color, it must be combined with a light source to yield color information and cannot be used directly for colorimetric characterization and color space conversion; moreover, that method tunes its parameters with RMSE rather than a visual color difference, and therefore cannot optimize the model for perceived color error. In terms of training sample validity, in 2018, Amiri et al. proposed a weighted nonlinear regression method (WT-NONLIN) that enables commercial digital RGB cameras to be used for spectral and colorimetric color reproduction [39].
The accuracy of color space conversion is constrained by the regression model, the training samples and the feature extraction; the effectiveness of feature extraction is key to improving accuracy. We propose a new method for the colorimetric characterization of color imaging systems based on KPLS. The method establishes the mapping between the kernel-expanded three-channel responses of the color imaging system and the CIE 1931 tristimulus values. Through canonical correlation analysis, the direction vectors with the strongest correlation between input and output are determined, and a multivariate nonlinear regression model is built on these direction vectors. Because the method considers the relevance between the input and output vectors, it improves predictive ability once the feature vectors are selected. It can effectively solve the multicollinearity problem caused by kernel function expansion and eliminate irrelevant input variables, making the model more concise and stable. Nested cross-validation experiments prove that the proposed approach is an effective colorimetric characterization method.

2. Theory and Method

2.1. Colorimetric Characterization of Color Imaging System Based on KPLS

The basic idea of the colorimetric characterization of a color imaging system is to establish the mapping between the device-dependent RGB space and the device-independent XYZ space, so as to realize the transformation model from RGB to XYZ. To obtain this mapping, it is usually necessary to use a standard color chart as the measurement sample and to measure its three-channel response values and tristimulus values at the same time, yielding a series of paired RGB and XYZ calibration data. In this study, the KPLS method is used for modeling. First, the RGB dataset is expanded via the kernel function and the dataset is divided. Next, the PLS regression model is trained. Finally, with color differences as the evaluation indicators, the trained model predicts the test set and the performance of the algorithm is evaluated. The colorimetric characterization method based on KPLS proposed in this paper is shown in Figure 1.

2.2. Kernel Expansion of the RGB Color Value

A series of data pairs of $(R, G, B)$ and $(X, Y, Z)$ is obtained by simultaneously measuring the three-channel response values and the tristimulus values. The input matrix $N = \left( n_{ij} \right)_{m \times 3}$ and the output matrix $U = \left( u_{ij} \right)_{m \times 3}$ are composed of $m$ groups of data, where $n_i = \left( R_i, G_i, B_i \right)$ and $u_i = \left( X_i, Y_i, Z_i \right)$, $i = 1, \ldots, m$. In order to deal with the nonlinear characteristics between the variables, we use a nonlinear mapping $\phi$ to map the input matrix into the matrix $P$ in a high-dimensional feature space, that is, $n_i \mapsto \phi\left( n_i \right)$. The mapped matrix is shown in Equation (1):
$$P = \left[ \phi\left( n_1 \right), \phi\left( n_2 \right), \ldots, \phi\left( n_m \right) \right]^{T} \tag{1}$$
where $\phi(\cdot)$ denotes the basis function.
Since the dimension of $\phi$ can be arbitrarily large or even infinite, we usually define the following kernel matrix $K$ to avoid using $\phi$ explicitly:
$$K = P P^{T} \tag{2}$$
where the element $k_{i,j}$ in the $i$-th row and $j$-th column of $K$ is
$$k_{i,j} = \left\langle \phi\left( n_i \right), \phi\left( n_j \right) \right\rangle = f_{\ker}\left( n_i, n_j \right) \tag{3}$$
where $\langle \cdot, \cdot \rangle$ denotes the inner product and $f_{\ker}$ is the kernel function. In this paper, we use three kernel functions, namely the polynomial kernel, the root polynomial kernel and the Gaussian kernel, to perform the dimensionality expansion; they are shown in Equations (4)–(6):
$$f_{\ker}^{P}\left( n_i, n_j \right) = \sum_{s=0}^{d} \frac{d!}{s!\,(d-s)!} \left\langle n_i, n_j \right\rangle^{s} + s \sum_{t=1}^{s-1} \frac{\left\langle n_i, n_j \right\rangle^{t}}{t!} + 1 \tag{4}$$
$$f_{\ker}^{R}\left( n_i, n_j \right) = \left[ \sum_{s=0}^{d} \frac{d!}{s!\,(d-s)!} \left\langle n_i, n_j \right\rangle^{s} + s \sum_{t=1}^{s-1} \frac{\left\langle n_i, n_j \right\rangle^{t}}{t!} + 1 \right]^{1/s} \tag{5}$$
$$f_{\ker}^{G}\left( n_i, n_j \right) = \exp\left( -\frac{\left\| n_i - n_j \right\|^{2}}{2 \sigma^{2}} \right), \quad i, j \le m \tag{6}$$
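As an illustration, the kernel matrices used throughout the paper can be built directly from the m × 3 matrix of RGB responses. The sketch below (Python/NumPy, with function names of our own) implements the Gaussian kernel of Equation (6) and, as a simplified stand-in for the polynomial expansion of Equation (4), the standard inhomogeneous polynomial kernel $(\langle n_i, n_j \rangle + 1)^d$; it is a minimal sketch rather than the authors' exact implementation.

```python
import numpy as np

def gaussian_kernel(A, B, sigma):
    """Gaussian kernel of Equation (6): k_ij = exp(-||a_i - b_j||^2 / (2 sigma^2)).

    A: (p, 3) and B: (q, 3) arrays of camera RGB responses; returns a (p, q) matrix.
    """
    d2 = (np.sum(A ** 2, axis=1)[:, None]
          + np.sum(B ** 2, axis=1)[None, :]
          - 2.0 * A @ B.T)                      # squared Euclidean distances
    return np.exp(-np.maximum(d2, 0.0) / (2.0 * sigma ** 2))

def polynomial_kernel(A, B, degree):
    """Inhomogeneous polynomial kernel (<a_i, b_j> + 1)^degree, a simplified
    stand-in for the expansion written out in Equation (4)."""
    return (A @ B.T + 1.0) ** degree
```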

2.3. Color Space Conversion Based on KPLS

To enhance the accuracy of the conversion model between the source color space (RGB) and the target color space (XYZ), we use a kernel function to expand the feature vector for regression. Taking into account the correlation among the expanded feature vectors, we apply the PLS method. PLS regression is related to principal component regression. However, instead of finding a hyperplane of maximum variance between the response and independent variables, it finds a linear regression model by projecting both the predicted and observed variables to a new space. First, the data are normalized.
$$E_0 = \left[ \frac{n_{ij} - \bar{n}_j}{S_{n_j}} \right]_{m \times m}, \qquad F_0 = \left[ \frac{u_{ij} - \bar{u}_j}{S_{u_j}} \right]_{m \times 3} \tag{7}$$
$P$ is normalized to $E_0$, and $U$ is normalized to $F_0$. To consider the correlation between the input and output vectors, we decompose them and model the pair of direction vectors with the highest correlation [40]. The first component $t_1$ is extracted from the independent variable matrix $E_0$, and the first component $u_1$ is extracted from the dependent variable matrix $F_0$; namely, $t_1 = E_0 w_1$, where $w_1$ is the direction vector. Because $E_0$ is normalized, $w_1$ is a unit vector; it denotes the direction of the first axis in the decomposition of $E_0$, that is, $w_1^{T} w_1 = 1$. Similarly, $u_1 = F_0 c_1$ with $c_1^{T} c_1 = 1$. The objective function can be described as in Equation (8):
$$\max_{w_1, c_1} \left\langle E_0 w_1, F_0 c_1 \right\rangle, \quad \text{s.t.} \quad w_1^{T} w_1 = 1, \; c_1^{T} c_1 = 1 \tag{8}$$
Solving with the Lagrange multiplier method, we obtain $w_1$ and $c_1$, and hence the components $t_1 = E_0 w_1$ and $u_1 = F_0 c_1$. We then establish the regression equations of $E_0$ and $F_0$ with respect to $t_1$:
$$E_0 = t_1 h_1^{T} + E_1 \tag{9}$$
$$F_0 = t_1 r_1^{T} + F_1 \tag{10}$$
where $E_1$ and $F_1$ are the residual matrices of the two regression equations, and the regression coefficient vectors are
$$h_1 = \frac{E_0^{T} t_1}{\left\| t_1 \right\|^{2}} \tag{11}$$
$$r_1 = \frac{F_0^{T} t_1}{\left\| t_1 \right\|^{2}} \tag{12}$$
To meet the accuracy requirement, we continue searching for the next pair of direction vectors with the highest correlation: the residual matrices $E_1$ and $F_1$ replace $E_0$ and $F_0$, the second axis directions $w_2$, $c_2$ are found from $E_1$ and $F_1$, and the second components $t_2$, $u_2$ are obtained. Continuing this procedure, if the rank of $P$ is $a$, then Equations (13) and (14) are eventually obtained:
$$E_0 = t_1 h_1^{T} + t_2 h_2^{T} + \cdots + t_a h_a^{T} \tag{13}$$
$$F_0 = t_1 r_1^{T} + t_2 r_2^{T} + \cdots + t_a r_a^{T} + F_a \tag{14}$$
where $h_1, h_2, \ldots, h_a$ and $r_1, r_2, \ldots, r_a$ are the regression coefficient vectors and $F_a$ is the final residual matrix.
$t_1, t_2, \ldots, t_a$ can be expressed as linear combinations of the columns of the standardized matrix $E_0$. Based on Equation (14), we obtain the regression form
$$F_0 = E_0 w_1^{*} r_1^{T} + \cdots + E_0 w_a^{*} r_a^{T} + F_a \tag{15}$$
where $w_k^{*} = \prod_{j=1}^{k-1} \left( I - w_j h_j^{T} \right) w_k$ and $I$ is the identity matrix. Then we obtain
$$F_0 = \sum_{j} \lambda_j x_j + F_a \tag{16}$$
where $\lambda_j = \sum_{k=1}^{a} r_k w_{kj}^{*}$ and $x_j$ denotes the $j$-th column of $E_0$. Finally, following the inverse of the standardization process, the regression equation on the standardized variables is converted back into the regression equation from $P$ to $U$.
The color space conversion algorithm based on KPLS is shown in Figure 2.
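The decomposition above is the standard PLS algorithm applied to kernel-expanded inputs, so an equivalent model can be obtained with an off-the-shelf PLS implementation. The following sketch (wrapper names are ours; scikit-learn's PLSRegression is assumed to be available) fits the regression from the training kernel matrix to XYZ and predicts test samples through their kernel rows against the training set; it is a minimal illustration, not the authors' code.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

def fit_kpls(K_train, XYZ_train, n_components):
    """Fit PLS on kernel-expanded features: K_train is (m, m), XYZ_train is (m, 3).

    PLSRegression standardizes the columns internally by default, which plays
    the role of the normalization in Equation (7)."""
    model = PLSRegression(n_components=n_components)
    model.fit(K_train, XYZ_train)
    return model

def predict_kpls(model, K_test):
    """Predict XYZ for test samples; K_test holds the kernel values between each
    test RGB and every training RGB, i.e. an (n_test, m) matrix."""
    return model.predict(K_test)
```

With the Gaussian kernel sketched in Section 2.2, for example, the test-side kernel matrix would be computed as gaussian_kernel(RGB_test, RGB_train, sigma).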

2.4. Evaluation Metrics

We chose three different color difference formulas according to different application domains to evaluate our model. CIEDE2000 is a widely used color difference formula that considers the needs of different fields [41]. CIELAB color space can better describe the psychological effects of object colors, and it is suitable for materials such as dyes, pigments and inks when the color difference is larger than the visual recognition threshold but smaller than the color difference between adjacent colors in the Munsell system [42,43]. CIELUV is also a relatively perceptually uniform color space, and it is suitable for applications of spectral colors and color vision. The CIELUV color difference formula performs better than other methods when predicting the color difference of illumination stimuli, especially when using a black background [44,45,46]. We used CIELAB, CIELUV and CIEDE2000 color differences to evaluate the generalization ability and prediction accuracy of the KPLS model [47,48,49].
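For reference, the CIELAB color difference between measured and predicted tristimulus values reduces to a Euclidean distance after the XYZ-to-Lab conversion. The sketch below assumes the D65 white point of the light booth (with XYZ scaled so that the white has Y = 100) and uses function names of our own; CIEDE2000 and CIELUV follow the corresponding CIE formulae and are omitted here for brevity.

```python
import numpy as np

# CIE D65 reference white (2-degree observer), scaled so that Yn = 100 (assumed).
D65_WHITE = np.array([95.047, 100.0, 108.883])

def xyz_to_lab(xyz, white=D65_WHITE):
    """Convert CIE XYZ (same scale as the white point) to CIELAB."""
    t = np.asarray(xyz, dtype=float) / white
    delta = 6.0 / 29.0
    f = np.where(t > delta ** 3, np.cbrt(t), t / (3.0 * delta ** 2) + 4.0 / 29.0)
    L = 116.0 * f[..., 1] - 16.0
    a = 500.0 * (f[..., 0] - f[..., 1])
    b = 200.0 * (f[..., 1] - f[..., 2])
    return np.stack([L, a, b], axis=-1)

def delta_e_lab(xyz_measured, xyz_predicted):
    """CIELAB color difference (Delta E*ab) between measured and predicted XYZ."""
    return np.linalg.norm(xyz_to_lab(xyz_measured) - xyz_to_lab(xyz_predicted), axis=-1)
```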

3. Experiment

3.1. Experimental Scheme

To validate the proposed model, we first constructed a dataset using the RGB and XYZ values of the color samples. To verify the colorimetric characterization accuracy of the proposed model, we further divided the dataset into training and testing sets, trained different colorimetric characterization models on the training set, evaluated them on the testing set, and finally compared the colorimetric characterization performance of the different models. The experimental scheme is shown in Figure 3. The color samples used in the experiment were selected from the internationally used ColorChecker SG chart (X-Rite, Grand Rapids, MI, USA); a D65 light booth (Datacolor, Lawrenceville, GA, USA) was used as the light source. A PR715 spectroradiometer (Photo Research, Syracuse, NY, USA) was used to measure the tristimulus values (CIE XYZ) of the 96 non-neutral color patches in the SG chart. A Canon EOS 1000D was used in manual operation mode; its resolution was 3888 × 2592 (22.2 mm × 14.8 mm), the ISO was 800 and the F-number was 10. The color patch images were captured in RAW format to obtain the RGB values. A total of 96 data pairs were obtained in the experiment, and the 96-sample dataset was divided into training and testing sets in accordance with ten-fold cross-validation.
The measured sample dataset is shown in Figure 4. Figure 4a,b show the sample distribution, and Figure 4c,d show the chromaticity distribution.

3.2. Correlation Analysis of Input and Output Vectors

The input vector is the kernel function expansion of RGB; therefore, there is multicollinearity among the input feature variables. The variance inflation factor (VIF) represents how much the variance of a parameter estimate is inflated under multicollinearity compared with the case of no collinearity. It can be calculated with Equation (17):
$$V_j = \frac{1}{1 - R_j^{2}} \tag{17}$$
where $V_j$ represents the VIF of the $j$-th variable and $R_j^{2}$ is the coefficient of determination of the regression of the independent variable $x_j$ on the other independent variables $x_i$. The VIF values of the input feature variables are shown in Figure 5. When the VIF is more than 100 times that of the non-collinear case, the confidence intervals are too wide and the significance tests are not credible. These results reflect the severe collinearity of the R, G and B kernel function expansion.
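As a concrete illustration of Equation (17), the VIF of each kernel-expanded feature can be obtained by regressing that feature on all the others. The helper below (name of our own, using scikit-learn's LinearRegression) is a minimal sketch.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

def variance_inflation_factors(X):
    """VIF of every column of X via Equation (17): V_j = 1 / (1 - R_j^2)."""
    X = np.asarray(X, dtype=float)
    vifs = []
    for j in range(X.shape[1]):
        others = np.delete(X, j, axis=1)          # regress column j on the rest
        r2 = LinearRegression().fit(others, X[:, j]).score(others, X[:, j])
        vifs.append(1.0 / max(1.0 - r2, 1e-12))   # guard against R_j^2 -> 1
    return np.array(vifs)
```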
The correlation between the input and output vectors directly affects the accuracy of the regression model; therefore, we analyze the correlation between the RGB kernel function expansion and X, Y and Z. The absolute value of the correlation coefficient reflects the degree of correlation; when it is greater than 0.6, the two vectors are considered highly correlated. The Pearson correlation coefficients between each term and X, Y and Z are shown in Figure 6, Figure 7 and Figure 8a–c, respectively. We can see that the input vectors are highly correlated with X, Y and Z. Moreover, we applied principal component analysis (PCA) to reduce the multicollinearity among the terms of the input vectors; the VIF of each component after PCA is approximately 1. The Pearson correlation coefficients between each component after PCA and X, Y and Z are shown in Figure 6, Figure 7 and Figure 8d–f, respectively. Only one component has an absolute correlation coefficient greater than 0.6, which suggests that most principal components have a low correlation with X, Y and Z. As shown in Figure 5, Figure 6, Figure 7 and Figure 8, the RGB kernel function expansion has a high degree of information overlap and low information validity. This indicates strong multicollinearity, which affects the accuracy of parameter estimation and the robustness of the model. Although PCA can eliminate collinearity, it also reduces the correlation with the dependent variables.
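The Pearson correlations reported in Figures 6–8, and the same correlations recomputed after PCA, can be reproduced along the following lines. This is a sketch under the assumption that K holds the kernel-expanded features and XYZ the measured tristimulus values; the function names are ours.

```python
import numpy as np
from sklearn.decomposition import PCA

def feature_target_correlations(F, XYZ):
    """Pearson correlation of each feature column of F with X, Y and Z."""
    Fz = (F - F.mean(axis=0)) / (F.std(axis=0) + 1e-12)
    Tz = (XYZ - XYZ.mean(axis=0)) / (XYZ.std(axis=0) + 1e-12)
    return Fz.T @ Tz / len(F)            # (n_features, 3) correlation matrix

def pca_target_correlations(F, XYZ):
    """Same correlations after projecting the features onto principal components,
    whose VIF is approximately 1 because the scores are uncorrelated."""
    scores = PCA().fit_transform(F)
    return feature_target_correlations(scores, XYZ)
```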

3.3. Hyperparameter Selection

In this paper, we perform kernel expansion on the input features of the KPLS model using the polynomial kernel function, the Gaussian kernel function and the root polynomial kernel function, and we optimize and evaluate their hyperparameters. The hyperparameters include the order of the polynomial, the width parameter σ of the radial basis function and the number of components of the PLS model. Unlike the traditional k-fold cross-validation method, we use nested cross-validation to select the optimal hyperparameters and evaluate the performance of the model at the same time. We perform two layers of cross-validation: first, we split the data into training and test sets; then, each training set is further split into a training subset and a validation set. The validation set is used for hyperparameter selection, and the test set is used for accuracy evaluation. The 10-fold cross-validation procedure for hyperparameter optimization is nested inside the 10-fold cross-validation procedure for accuracy evaluation. The inner cross-validation loop applies a grid search over the preset parameter ranges to minimize the average CIEDE2000 color difference on the validation set; the selected optimal hyperparameters are then used to evaluate the model on the test set in the outer loop. Finally, we obtain the CIELAB, CIELUV and CIEDE2000 color differences on the test set. Table 1 shows the optimal hyperparameter combinations selected by the KPLS model based on polynomial kernel, Gaussian kernel and root polynomial kernel expansion on the ten training sets. Figure 9 shows the change in the CIEDE2000 color difference on the test set under different hyperparameter combinations on the first training set. Figure 10 shows the correlation coefficients between the dependent and independent variables obtained via the KPLS model based on polynomial kernel, Gaussian kernel and root polynomial kernel expansion, respectively. It can be seen that the KPLS model effectively eliminates the multicollinearity between the independent variables.
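The nested scheme described above can be organized as two KFold loops: the inner loop grid-searches the kernel settings and the number of PLS components by minimizing the mean color difference on the validation folds, and the outer loop scores the selected combination on the held-out test fold. The sketch below is structural only (function and argument names are ours, scikit-learn is assumed); the color_diff callback stands in for the CIEDE2000 computation.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import KFold

def nested_cv_kpls(RGB, XYZ, kernel, param_grid, color_diff, n_splits=10, seed=0):
    """Nested 10-fold cross-validation for a KPLS pipeline (structural sketch).

    kernel(A, B, params) -> kernel matrix between the rows of A and B;
    color_diff(xyz_ref, xyz_pred) -> per-sample color differences (e.g. CIEDE2000);
    param_grid -> list of dicts holding kernel settings plus "n_components".
    """
    outer = KFold(n_splits=n_splits, shuffle=True, random_state=seed)
    outer_scores = []
    for tr, te in outer.split(RGB):
        best_params, best_err = None, np.inf
        inner = KFold(n_splits=n_splits, shuffle=True, random_state=seed)
        for params in param_grid:                 # grid search on the training folds only
            errs = []
            for itr, iva in inner.split(tr):
                a, b = tr[itr], tr[iva]
                pls = PLSRegression(n_components=params["n_components"])
                pls.fit(kernel(RGB[a], RGB[a], params), XYZ[a])
                pred = pls.predict(kernel(RGB[b], RGB[a], params))
                errs.append(color_diff(XYZ[b], pred).mean())
            if np.mean(errs) < best_err:
                best_params, best_err = params, np.mean(errs)
        # Refit on the whole outer training fold and evaluate on the held-out test fold.
        pls = PLSRegression(n_components=best_params["n_components"])
        pls.fit(kernel(RGB[tr], RGB[tr], best_params), XYZ[tr])
        pred = pls.predict(kernel(RGB[te], RGB[tr], best_params))
        outer_scores.append(color_diff(XYZ[te], pred).mean())
    return np.array(outer_scores)
```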

3.4. Experimental Results of This Paper

To assess the performance of the KPLS model in colorimetric characterization, we use three different color difference formulas (CIEDE2000, CIELAB and CIELUV) to calculate the average color difference on the test set of the outer loop. We collect the test set data from each fold of the cross-validation and perform a regression analysis of the 96 pairs of measured and predicted data. Figure 11, Figure 12 and Figure 13 show the regression results of the KPLS model based on polynomial kernel, Gaussian kernel and root polynomial kernel expansion on the test sets of the 10-fold cross-validation. Table 2 gives the average color difference of the model prediction on the test set of each fold.

3.5. Comparison with Other Methods

Polynomial regression models [21,22] and neural network models [23,24,25,26,27] are commonly used for colorimetric characterization. We compare the KPLS model with the weighted nonlinear polynomial regression model, the neural network model and the root polynomial regression model [32]. The comparison results are shown in Figure 14 and Figure 15. MLP denotes the neural network model; in the MLP model, the hidden layer has 10 neurons, the learning rate is 0.1 and the training goal is 0.00001. WT-NONLIN_1, WT-NONLIN_2, WT-NONLIN_3 and WT-NONLIN_4 denote the weighted regression model with four different formulae for calculating the distance [39]. RP-OLS denotes the root polynomial regression model, for which we choose 3 as the degree of the polynomial. The same dataset is used in this experiment; after dividing it by nested cross-validation, the color differences predicted by each model are compared. In Figure 14, the ten test sets are pooled, giving 96 samples, and we compare the color differences of the different models on the same samples. In Figure 15, we compare the average color differences of the different models on each fold of the test set. The CIELAB, CIELUV and CIEDE2000 color differences of the test set are calculated and compared. The experimental results are shown in Table 3, which gives the average color difference of each algorithm's prediction on the test sets.

4. Discussion

We observed a sample with a large color difference in the colorimetric characterization results of both the KPLS model and the weighted nonlinear polynomial regression model. This sample is the 60th data pair, corresponding to the 6th fold of the ten-fold split. The distribution of the training and test sets for this fold is shown in Figure 16, where the red points are the training set, the blue points are the test set and the blue square is the 60th data pair. We can see that this test sample lies far from the training samples; it belongs to the extrapolation range of the model and differs substantially from the training data, so it cannot be predicted accurately. Therefore, how to enlarge the color gamut coverage of the training dataset, or how to improve the accuracy of the model so as to enhance its prediction ability for extrapolated points, is the direction of our future research.

5. Conclusions

This paper studies a colorimetric characterization method for color imaging systems based on KPLS. The method uses the device-dependent RGB space as input features and expands the feature vectors with kernel functions. Based on feature analysis and canonical correlation analysis, a KPLS colorimetric characterization model for color imaging systems is established. The hyperparameters of the model are determined via nested cross-validation and grid search, and the color space conversion model is constructed. Experimental verification is carried out by photographing the ColorChecker SG chart with a Canon EOS 1000D commercial digital camera, and the CIELAB, CIELUV and CIEDE2000 color differences are used for evaluation. The results show that the color differences of the KPLS colorimetric characterization model based on root polynomial kernel expansion are smaller than those of the weighted nonlinear regression model, the neural network model and the root polynomial regression model. The colorimetric characterization method for color imaging systems based on KPLS is an effective method with good prediction accuracy and nonlinear fitting ability, and it can well support the cross-media color management of color imaging systems.

Author Contributions

Conceptualization, N.L. and Q.L.; methodology, S.Z. and X.X.; software, S.Z., Z.F. and L.L.; validation, S.Z. and X.X.; writing—original draft, S.Z. and X.X.; writing—review and editing, S.Z. and X.X. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National Natural Science Foundation of China (No. 61975012).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data that support the findings of this study are available on request from the corresponding author.

Acknowledgments

We are particularly grateful to the reviewers for their rigor and erudition. The proposal of the root kernel function in this paper was inspired by the reviewer.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Mou, T.; Shen, H. Colorimetric characterization of imaging device by total color difference minimization. J. Zhejiang Univ. Sci. A 2006, 7, 1041–1045.
2. Ma, Y.; Liu, H.; Liu, X. A research on the color characterization of digital camera. J. Beijing Inst. Graph. Commun. 2006, 14, 9–12.
3. Zhang, X. Study of Color Reproduction Theory and Method for Digital Image; Zhejiang University: Hangzhou, China, 2010.
4. Fu, S.; Cui, C.; Zhang, R. Colorimetric characterization modeling software for digital imaging device. Opto. Electron. Eng. 2010, 37, 88–92.
5. Green, P. Color Management: Understanding and Using ICC Profiles; Wiley: Hoboken, NJ, USA, 2010; pp. 20–34.
6. Danny, C.R. Publication CIE 159: A colour appearance model for colour management systems: CIECAM02. Color Res. Appl. 2006, 31, 156–159.
7. Charbaji, A.; Heidari-Bafroui, H.; Rahmani, N.; Anagnostopoulos, C.; Faghri, M. Colorimetric Determination of Nitrate after Reduction to Nitrite in a Paper-Based Dip Strip. Chem. Proc. 2021, 5, 9.
8. Berlina, A.N.; Ragozina, M.Y.; Komova, N.S.; Serebrennikova, K.V.; Zherdev, A.V.; Dzantiev, B.B. Development of Lateral Flow Test-System for the Immunoassay of Dibutyl Phthalate in Natural Waters. Biosensors 2022, 12, 1002.
9. Kim, J.-H.; Lee, Y.-J.; Ahn, Y.-J.; Kim, M.; Lee, G.-J. In situ detection of hydrogen sulfide in 3D-cultured, live prostate cancer cells using a paper-integrated analytical device. Chemosensors 2022, 10, 27.
10. Pomili, T.; Gatto, F.; Pompa, P.P. A Lateral Flow Device for Point-of-Care Detection of Doxorubicin. Biosensors 2022, 12, 896.
11. ISO 17321-1:2012; Graphic Technology and Photography—Colour Characterisation of Digital Still Cameras (DSCs)—Part 1: Stimuli, Metrology, and Test Procedures. International Organization for Standardization: Geneva, Switzerland, 5 November 2012.
12. ISO 17321-2:2012; Graphic Technology and Photography—Colour Characterization of Digital Still Cameras (DSCs)—Part 2: Methods for Determining Transforms from Raw DSC to Scene-Referred. International Organization for Standardization: Geneva, Switzerland, 12 October 2012.
13. IEC 61966-9:200; Colour Measurement and Management—Multimedia Systems and Equipment—Part 9: Digital Cameras. International Electrotechnical Commission: Geneva, Switzerland, November 2003.
14. Verdu, F.M.; Pujol, J.; Capilla, P. Calculation of the color matching functions of digital cameras from their complete spectral sensitivities. J. Imaging Sci. Technol. 2002, 46, 15–25.
15. Chouikha, M.B.; Placais, B.; Pouleau, G. Benefits and drawbacks of two methods for characterizing digital cameras. In Proceedings of the IS&T CGIV 2006 3rd European Conference on Colour in Graphics, Imaging, and Vision, Leeds, UK, 19–22 June 2006; Society for Imaging Science and Technology: Springfield, VA, USA, 2006; pp. 185–188.
16. Lee, S.H.; Choi, J.S. Design and implementation of color correction system for images captured by digital camera. IEEE Trans. Consum. Electron. 2008, 54, 268–276.
17. Rump, M.; Zinke, A.; Klein, R. Practical spectral characterization of trichromatic cameras. ACM Trans. Graph. 2011, 30, 170.
18. Hung, P.C. Colorimetric calibration in electronic imaging devices using a look-up-table model and interpolations. J. Electron. Imaging 1993, 2, 53–61.
19. Balasubramanian, R. Reducing the cost of look up table based color transformations. J. Imaging Sci. Technol. 2000, 44, 321–327.
20. Johnson, T. Methods for characterizing colour scanners and digital cameras. Displays 1996, 16, 183–191.
21. Rowlands, D.A. Color conversion matrices in digital cameras: A tutorial. Opt. Eng. 2020, 59, 110801.
22. Jing, J.; Fang, S.; Shi, Z.; Xia, Q.; Li, Y. An efficient nonlinear polynomial color characterization method based on interrelations of color spaces. Color Res. Appl. 2020, 45, 1023–1039.
23. Liu, Y.; Yu, H.; Shi, J. Camera characterization using back-propagation artificial neutral network based on Munsell system. Proc. SPIE 2008, 6621, 66210A.
24. Li, Y.; Liao, N.; Li, H.; Lv, N.; Wu, W. Colorimetric characterization of the wide-color-gamut camera using the multilayer artificial neural network. J. Opt. Soc. Am. 2023, 40, 629–636.
25. Liu, L.; Xie, X.; Zhang, Y.; Cao, F.; Liang, J.; Liao, N. Colorimetric characterization of color imaging systems using a multi-input PSO-BP neural network. Color Res. Appl. 2022, 47, 855–865.
26. Miao, H.; Zhang, L. The color characteristic model based on optimized BP neural network. China Acad. Conf. Printing Packaging 2016, 369, 55–63.
27. Wang, P.; Chou, J.; Tseng, C. Colorimetric characterization of color image sensors based on convolutional neural network modeling. Sens. Mater. 2019, 31, 1513–1522.
28. Yang, B.; Chou, H.; Yang, T. Color reproduction method by support vector regression for color computer vision. Optik 2013, 124, 5649–5656.
29. Gong, R.; Wang, Q.; Shao, X.; Liu, J.J. A color calibration method between different digital cameras. Optik 2016, 127, 3281–3285.
30. Wu, X.; Fang, J.; Xu, H.; Wang, Z. High dynamic range image reconstruction in device-independent color space based on camera colorimetric characterization. Optik 2017, 140, 776–785.
31. Molada-Tebar, A.; Lerma, J.L.; Marques-Mateu, Á. Camera characterization for improving color archaeological documentation. Color Res. Appl. 2018, 43, 47–57.
32. Finlayson, G.; Mackiewicz, M.; Hurlbert, A. Color correction using root-polynomial regression. IEEE Trans. Image Process. 2015, 24, 1460–1470.
33. Yamakabe, R.; Monno, Y.; Tanaka, M. Tunable color correction for noisy images. J. Electron. Imaging 2020, 29, 033012.
34. Hong, G.; Luo, M.; Rhodes, P.A. A study of digital camera colorimetric characterization based on polynomial modeling. Color Res. Appl. 2001, 26, 76–84.
35. Bianco, S.; Gasparini, F.; Russo, A.; Schettini, R. A new method for RGB to XYZ transformation based on pattern search optimization. IEEE Trans. Consum. Electron. 2007, 53, 1020–1028.
36. Shen, H.; Wan, H.; Zhang, Z. Estimating reflectance from multispectral camera responses based on partial least-squares regression. J. Electron. Imaging 2010, 19, 020501.
37. Heikkinen, V.; Mirhashemi, A.; Alho, J. Link functions and Matérn kernel in the estimation of reflectance spectra from RGB responses. J. Opt. Soc. Am. A 2013, 30, 2444–2454.
38. Xiao, G.; Wan, X.; Wang, L.; Liu, S. Reflectance spectra reconstruction from trichromatic camera based on kernel partial least square method. Opt. Express 2019, 27.
39. Amiri, M.M.; Fairchild, M.D. A strategy toward spectral and colorimetric color reproduction using ordinary digital cameras. Color Res. Appl. 2018, 43, 675–684.
40. Abdi, H.; Williams, L.J. Partial least squares methods: Partial least squares correlation and partial least square regression. In Computational Toxicology: Volume II; Springer: Berlin/Heidelberg, Germany, 2013; pp. 549–579.
41. Georgoula, M. Assessing Colour Differences of Lighting Stimuli Using a Visual Display. Ph.D. Thesis, University of Leeds, Leeds, UK, 2015.
42. Commission Internationale de l'Eclairage (CIE). Recommendations on Uniform Color Spaces—Color Difference Equations, Psychometric Color Terms; CIE Publication: Vienna, Austria, 1978.
43. Liu, H.; Cui, G.; Huang, M. Color-difference threshold for printed images. Appl. Mech. Mater. 2013, 469, 236–239.
44. Luo, M.; Cui, G.; Georgoula, M. Colour difference evaluation for white light sources. Light Res. Technol. 2015, 47, 360–369.
45. Wen, S. P-46: A color space derived from CIELUV for display color management. SID Symp. Dig. Tech. Pap. 2012, 42, 1269–1272.
46. Schanda, J. CIE u′, v′ uniform chromaticity scale diagram and CIELUV color space. In Encyclopedia of Color Science and Technology; Luo, M.R., Ed.; Springer: Berlin/Heidelberg, Germany, 2015.
47. Hill, B.; Fw, V.; Roger, T. Comparative analysis of the quantization of color spaces on the basis of the CIELAB color difference formula. ACM Trans. Graph. 1997, 16, 109–154.
48. Melgosa, M.; Trémeau, A.; Cui, G. Colour Difference Evaluation; Springer: Berlin/Heidelberg, Germany, 2013; pp. 59–77.
49. Zhang, X.; Qiang, W.; Li, J. Estimating spectral reflectance from camera responses based on CIE XYZ tristimulus values under multi-illuminants. Color Res. Appl. 2017, 42, 68–75.
Figure 1. Colorimetric characterization method based on KPLS.
Figure 2. Color space conversion algorithm flow based on KPLS.
Figure 3. Experimental scheme.
Figure 4. Color and colorimetric distributions of 96 patches collected in our experiment: (a) RGB distribution of 96 datasets; (b) CIE 1931 XYZ tristimulus value distribution; (c) device-dependent RGB color space chromaticity distribution; (d) CIE 1931 XYZ chromaticity distribution.
Figure 5. VIF of input features: (a) Based on polynomial kernel function expansion; (b) Based on Gaussian kernel function expansion; (c) Based on root polynomial kernel function expansion.
Figure 6. The correlation between XYZ and the polynomial kernel function expansion of RGB: (a) Correlation between X and features; (b) Correlation between Y and features; (c) Correlation between Z and features; (d) Correlation between X and features after principal component analysis; (e) Correlation between Y and features after principal component analysis; (f) Correlation between Z and features after principal component analysis.
Figure 7. The correlation between XYZ and the Gaussian kernel function expansion of RGB: (a) Correlation between X and features; (b) Correlation between Y and features; (c) Correlation between Z and features; (d) Correlation between X and features after principal component analysis; (e) Correlation between Y and features after principal component analysis; (f) Correlation between Z and features after principal component analysis.
Figure 8. The correlation between XYZ and the root polynomial function expansion of RGB: (a) Correlation between X and features; (b) Correlation between Y and features; (c) Correlation between Z and features; (d) Correlation between X and features after principal component analysis; (e) Correlation between Y and features after principal component analysis; (f) Correlation between Z and features after principal component analysis.
Figure 9. Color difference of CIEDE2000 based on KPLS model under different hyperparameter combinations: (a) Based on polynomial kernel function expansion; (b) Based on Gaussian kernel function expansion; (c) Based on root polynomial kernel function expansion.
Figure 10. The correlation coefficient between the components of the KPLS model: (a) Based on polynomial kernel function expansion; (b) Based on Gaussian kernel function expansion; (c) Based on root polynomial kernel function expansion.
Figure 11. KPLS regression based on polynomial kernel expansion: (a) X test data and forecast data pairs; (b) Y test data and forecast data pairs; (c) Z test data and forecast data pairs.
Figure 12. KPLS regression based on Gaussian kernel expansion: (a) X test data and forecast data pairs; (b) Y test data and forecast data pairs; (c) Z test data and forecast data pairs.
Figure 13. KPLS regression based on root polynomial kernel expansion: (a) X test data and forecast data pairs; (b) Y test data and forecast data pairs; (c) Z test data and forecast data pairs.
Figure 14. Comparison of colorimetric characteristics of different algorithms in the cross-validation method: (a) 96 non-neutral color prediction CIELAB color difference; (b) 96 non-neutral color prediction CIELUV color difference; (c) 96 non-neutral color prediction CIEDE2000 color difference.
Figure 15. Comparison of average color difference of different algorithms: (a) Average CIELAB color difference of ten-fold test data; (b) Average CIELUV color difference of ten-fold test data; (c) Average CIEDE2000 color difference of ten-fold test data.
Figure 16. Test set distribution where the outliers are located.
Table 1. The hyperparameter optimization results of the KPLS model in cross-validation.

| Fold | P-Kernel Order | P-Kernel Components | P-Kernel DE2000 | G-Kernel σ | G-Kernel Components | G-Kernel DE2000 | RP-Kernel Order | RP-Kernel Components | RP-Kernel DE2000 |
|---|---|---|---|---|---|---|---|---|---|
| 1 | 3 | 17 | 0.862 | 176 | 21 | 0.863 | 3 | 8 | 0.752 |
| 2 | 3 | 18 | 0.895 | 285 | 22 | 0.858 | 3 | 8 | 0.756 |
| 3 | 3 | 17 | 0.897 | 196 | 21 | 0.880 | 3 | 8 | 0.742 |
| 4 | 3 | 18 | 0.947 | 400 | 20 | 0.940 | 5 | 11 | 0.759 |
| 5 | 3 | 17 | 0.941 | 267 | 22 | 0.896 | 5 | 12 | 0.764 |
| 6 | 3 | 17 | 0.794 | 236 | 23 | 0.882 | 5 | 8 | 0.769 |
| 7 | 4 | 18 | 0.825 | 230 | 22 | 0.773 | 3 | 8 | 0.670 |
| 8 | 3 | 18 | 0.838 | 226 | 22 | 0.749 | 3 | 9 | 0.699 |
| 9 | 3 | 18 | 0.898 | 225 | 22 | 0.861 | 5 | 11 | 0.720 |
| 10 | 4 | 17 | 0.840 | 245 | 23 | 0.738 | 5 | 8 | 0.707 |
Table 2. Model average color difference on the test set of each fold.

| Fold | P-Kernel LAB | P-Kernel LUV | P-Kernel DE2000 | G-Kernel LAB | G-Kernel LUV | G-Kernel DE2000 | RP-Kernel LAB | RP-Kernel LUV | RP-Kernel DE2000 |
|---|---|---|---|---|---|---|---|---|---|
| 1 | 0.907 | 0.975 | 0.666 | 0.855 | 0.933 | 0.613 | 0.746 | 0.848 | 0.527 |
| 2 | 1.583 | 1.558 | 0.801 | 1.294 | 1.401 | 0.676 | 1.273 | 1.344 | 0.597 |
| 3 | 1.535 | 1.351 | 0.640 | 1.755 | 1.444 | 0.844 | 2.130 | 1.718 | 0.918 |
| 4 | 1.429 | 1.207 | 0.635 | 1.772 | 1.784 | 1.150 | 1.376 | 1.221 | 0.665 |
| 5 | 0.681 | 0.798 | 0.541 | 0.834 | 0.835 | 0.631 | 0.581 | 0.714 | 0.442 |
| 6 | 2.550 | 1.666 | 1.206 | 1.823 | 1.372 | 1.053 | 1.007 | 0.753 | 0.603 |
| 7 | 1.542 | 1.452 | 1.188 | 1.517 | 1.429 | 1.144 | 1.749 | 1.557 | 1.311 |
| 8 | 1.655 | 1.471 | 0.898 | 1.146 | 1.164 | 0.711 | 1.258 | 1.391 | 0.799 |
| 9 | 1.154 | 1.010 | 0.665 | 1.099 | 1.056 | 0.696 | 1.194 | 1.232 | 0.733 |
| 10 | 2.411 | 1.847 | 1.116 | 2.684 | 1.954 | 1.113 | 2.522 | 1.865 | 1.003 |
Table 3. The average color difference of different algorithms.

| Model | CIELAB Color Difference | CIELUV Color Difference | CIEDE2000 Color Difference |
|---|---|---|---|
| KPLS P-kernel | 1.5447 | 1.3335 | 0.8356 |
| KPLS G-kernel | 1.4779 | 1.3372 | 0.8631 |
| KPLS RP-kernel | 1.3836 | 1.2643 | 0.7598 |
| RP-OLS [32] | 1.4221 | 1.2933 | 0.7775 |
| MLP | 4.052 | 4.4166 | 2.8895 |
| WT-NONLIN formula 1 [39] | 1.7977 | 1.5839 | 1.0207 |
| WT-NONLIN formula 2 [39] | 1.7272 | 1.5343 | 0.9858 |
| WT-NONLIN formula 3 [39] | 1.7242 | 1.5429 | 0.9799 |
| WT-NONLIN formula 4 [39] | 1.5878 | 1.4017 | 0.8847 |


Back to TopTop