Article

Color and Texture Analysis of Textiles Using Image Acquisition and Spectral Analysis in Calibrated Sphere Imaging System-II

Nibedita Rout, Jinlian Hu, George Baciu, Priyabrata Pattanaik, K. Nakkeeran and Asimananda Khandual
1 School of Electronics, ITER, S’O’A University, Bhubaneswar 751030, Odisha, India
2 Laboratory of Wearable Materials for Healthcare, City University, Hong Kong, China
3 Department of Computing, The Hong Kong Polytechnic University, Hung Hom, Kowloon, Hong Kong, China
4 School of Engineering, Fraser Noble Building, University of Aberdeen, Aberdeen AB24 3UE, UK
5 Department of Textile Engineering, OUTR (Former CET), Bhubaneswar 751029, Odisha, India
* Authors to whom correspondence should be addressed.
Electronics 2023, 12(9), 2135; https://doi.org/10.3390/electronics12092135
Submission received: 12 March 2023 / Revised: 24 April 2023 / Accepted: 4 May 2023 / Published: 6 May 2023
(This article belongs to the Collection Image and Video Analysis and Understanding)

Abstract
The application of device-dependent vision systems is growing exponentially, but these systems face challenges in precisely imitating the human perception models established by the device-independent systems of the Commission Internationale de l’Éclairage (CIE). We previously discussed the theoretical treatment and experimental validation of developing a calibrated integrating sphere imaging system to imitate the visible spectroscopy environment, and derived an RGB polynomial function to obtain a meaningful interpretation of color features. In this study, we dyed three different types of textured materials in the same bath with a yellow reactive dye at incremental concentrations to see how their color difference profiles compared. Three typical cotton textures were then dyed with three Ultra-RGB Remozol reactive dyes and their combinations. Concentrations of 1%, 2%, 3%, and 4% were chosen for each dye, followed by their binary and ternary mixtures. The aim was to verify the fundamental spectral feature mapping in various imaging color spaces and spectral domains. The findings are quite interesting and help us understand the ground truth of working in the two domains. In addition, the trends of color mixing, CIE color difference, CIExy (chromaticity) color gamut, and RGB gamut, and their distinguishing features, were verified. Human perception accuracy was also compared in both domains to clarify the influence of texture. These fundamental experiments and observations on human perception and calibrated imaging color space clarify the precision that can be expected in both domains.

1. Introduction

We discussed the significance of this study, the prior art, and the theoretical treatment of imaging from an integrating sphere in our previous paper [1]. We experimented with textile texture and color by varying these factors in a controlled manner (red, blue, yellow, and cyan dyes at various concentrations: 0.25%, 0.5%, 0.75%, 1%, 1.5%, 2%, 3%, 4%, 5%, and 6%). In addition, a simple calibration technique was proposed and validated that describes how unique digital color signatures can be derived from calibrated RGB to extract the best features for color and texture. This alter ego of the reflectance function, missing in the imaging domain, was experimentally validated for visualization, identification, and qualitative and quantitative color–texture analysis [1].
The present investigation aims to conduct a qualitative and quantitative analysis of color perception in terms of DERGB and DE precision using our proposed method, along with a study of color combination. Further, we examined how varied texture and color combinations are represented across various RGB spaces.
Many applications, such as spectral measurement, image processing, and human vision, require congruent and precise results in real-world problems. Complex operations, such as calibration protocols, device profiling, illumination uniformity, viewing geometry, and device characterization, are used to bridge the gap between device-dependent and device-independent color representations. In most cases, the characterization or prediction models are prone to theoretical assumption errors [1,2,3] as well as practical imperfections or limitations [4,5]. The illumination source’s properties and its uniformity over the material are likely the main offenders and play a crucial role in the precise estimation of color and texture qualities [6,7].
Most of these challenges were discussed in our previous article, and it is worth mentioning some conclusive inferences from recent research. Nie et al. (2023) [8] reported that serious interference from specular reflections in endoscopic images contributes majorly to errors in computer vision algorithms. Abdulateef and Hasoon (2023) [9] studied the limitations of image analysis, which essentially needs a clear, bright, shadow-free RGB image to obtain accurate results. As stated by Lin and Finlayson (2023) [10], “Surprisingly, we show that all compared algorithms—regardless of their model complexity—degrade to broadly the same level of performance.”
As a matter of fact, device-dependent RGB color spaces were deliberately designed with a compressed gamut relative to the CIE system and were widely promoted as tools for easy communication, real-time application, and business as the number of users of computers, phones, and other digital media grew [1,3]. Even today, it is a big challenge for AI systems and advanced algorithms to be trained accurately with prior domain knowledge for better identification, classification, and prediction of subjects or materials of interest. Nature presents many colors, and how they arise physically in our color perception is not fully understood. In physical terms, reflection, transmission, absorption, and scattering of visible light govern the appearance of numerous materials, such as dyes, pigments, and biomaterials, across varied applications.
Each material shows unique light reflectance and absorption properties over the visible range. If a substrate is dyed at increasing dye concentrations, a family of reflectance curves emerges (e.g., Figure 1). Interestingly, textile materials are quite suitable for experimenting with varied textures and dyes.
We dyed three different types of textured materials with a yellow reactive dye at incremental concentrations in the same bath. Dyeing experiments with three Ultra-RGB Remozol reactive dyes and their combinations (single, binary, and ternary mixtures) were then carried out on the same substrate over four concentration ranges to validate the fundamental spectral feature mapping in both domains and in varied imaging color spaces. The motivation behind these ground truth experiments was to analyze, with a simplified explanation, the critical issue linking human perception and computer vision: the device-independent CIE perception model versus current progress in digital image processing.

2. Materials and Methods

In our previous study, we discussed the development of a novel integrating sphere imaging system with a theoretical explanation. We experimented with textile properties by varying the texture in a controlled way and coloring the samples with red, blue, yellow, and cyan dyes at various concentrations (0.25%, 0.5%, 0.75%, 1%, 1.5%, 2%, 3%, 4%, 5%, and 6%). We derived calibrated RGB polynomials and compared them with spectral measurement profiles. Procedures to precisely define the qualitative and quantitative influences of color and texture, as well as color prediction capabilities, were also planned and investigated. The present investigation aims to conduct a qualitative and quantitative analysis of the precision of our proposed method, along with an analysis of color combination.
Three types of bleached cotton fabrics with plain, twill, and modified twill weaves were initially dyed in the same dye bath with a yellow reactive dye (Levafix Brilliant Yellow 4GL; vinyl sulphone class) in incremental concentrations, i.e., 0.25%, 0.5%, 0.75%, 1%, 1.5%, 2%, 2.5%, 3%, 4%, and 5% (10 shades for each of the three textures). Further, dyeing was carried out with three kinds of Ultra-RGB Remozol reactive dyes and their combinations (single, binary, and ternary mixtures) on the same substrate in four concentration ranges: 1%, 2%, 3%, and 4% (Ultra-RGB Carmen (Dye A), Navy Blue (Dye B), and Red (Dye C)). The dyes were provided by Dystar, Hong Kong.
A Minolta 2600D spectrophotometer was used to measure reflectance (360 to 740 nm with a 10 nm gap), XYZ, CIEL*a*b*, and color difference (DE). Images of plain, twill, and modified twill samples of the same percentage shade were taken together (Figure 2) and, after calibrating the diffused imaging system with a white tile, their average RGB values were computed in MATLAB. All detailed measurement values are provided in the Supplementary File.
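The average RGB computation described above was performed in MATLAB. As a minimal illustration of the same calibration idea, the following Python sketch (hypothetical function, file, and variable names; a simple channel-wise white-tile normalization is assumed, not the authors’ exact procedure) computes the mean calibrated RGB of a sample patch:

```python
import numpy as np
from PIL import Image

def average_rgb(image_path, white_tile_rgb, patch):
    """Mean calibrated RGB of a fabric patch (illustrative sketch only).

    white_tile_rgb -- mean RGB measured from the white calibration tile.
    patch -- (row0, row1, col0, col1) region of the sample in the image.
    """
    img = np.asarray(Image.open(image_path), dtype=float)
    r0, r1, c0, c1 = patch
    raw = img[r0:r1, c0:c1, :3].reshape(-1, 3).mean(axis=0)
    # Normalize channel-wise against the white tile so that a perfect
    # white reflector maps to 255 in every channel.
    return raw / np.asarray(white_tile_rgb) * 255.0
```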

3. Results and Discussion

3.1. Three Types of Textures with Incremental Yellow Color Variations

Textile samples with three different textures were dyed in the same bath so that color uptake could be verified in both domains and a quantitative and qualitative analysis of color and the color difference between textiles could be conducted effectively. The integrated sphere imaging system was initially calibrated with the white plate, and all three kinds of textures dyed in the same bath were measured together (Figure 2) to ensure clarity in the determination of RGB values, denoted as Canon D450 (Camera) RGB.
Figure 3 shows the sample images, and Table 1 lists the experimental RGB values and the color differences DE (Euclidean distance in CIE L*a*b*, i.e., the square root of the sum of the squared differences in L*, a*, and b*) and DERGB (the corresponding Euclidean distance in R, G, and B), using the first plain weave sample Y42 as a reference.
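Both measures are straightforward to compute. The following sketch (our Python illustration, not the authors’ MATLAB code) reproduces the Table 1 entries for Y45 against the reference Y42:

```python
import numpy as np

def delta_e_lab(lab1, lab2):
    """CIE76 color difference DE: Euclidean distance in CIE L*a*b*."""
    return float(np.linalg.norm(np.asarray(lab1) - np.asarray(lab2)))

def delta_e_rgb(rgb1, rgb2):
    """DERGB: Euclidean distance between calibrated RGB triplets."""
    return float(np.linalg.norm(np.asarray(rgb1) - np.asarray(rgb2)))

# Check against Table 1 (Y45 vs. reference Y42):
print(delta_e_lab((81.522, -11.025, 52.999), (82.264, -10.591, 40.672)))   # ~12.357
print(delta_e_rgb((208.447, 207.940, 135.769), (204.533, 209.373, 157.184)))  # ~21.817
```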
In both domains, the spectral curves and the derived imaging RGB polynomials show distinct family trends. A visual difference is observed in the higher wavelength ranges of red and green (1-r, 2-g, and so on in the RGB polynomial graph in Figure 4).

3.2. Color Difference for Human Perception

From Figure 4 and Table 1, it can be observed that the textures can be visually distinguished into three families: plain, twill, and modified twill. M. Twill was highest (top), followed by Twill (middle), and plain was lowest (bottom). The same trend can be seen in the CIE L* values in the spectral domain as well as in the intensity values ((R+G+B)/3) in the calibrated imaging domain in Figure 7. The color difference can be distinguished much better in terms of DERGB from the proposed system than in terms of DE in CIEL*a*b* (Figure 5). Both the DERGB and DE readings follow obvious trends as the color concentration increases within each of the three textures. Further, we ranked all samples to compare the perception in both domains in terms of DE and DERGB.
Experimental RGBs and the color differences DE and DERGB of the textures (Y42 as reference) are given in Table 1 and plotted in Figure 5. More discrimination is noticed in the case of DERGB compared with DE. From the color perception point of view, images were ranked in terms of measured DE and estimated DERGB from images, using Y42's RGB as reference. The congruent ranks are highlighted in italics with thick borders, and the flipped reading sets are in bold (their color difference ranks flip between DE and DERGB). It can be observed that the flipped sets (Y46, Y48), (Y53, Y55), and (Y65, Y70) have close DE values within each pair (0.022, 0.287, and 0.485, respectively).
If the color difference (DE) is less than 0.5, the colors can be perceived as the same, and this tolerance is accepted for industrial applications as well [11,12]. On careful observation of this set of samples (shown in Figure 6), it can easily be noticed that the left-side samples Y46, Y53, and Y65 may be perceived as more deeply colored than the samples Y48, Y55, and Y70, respectively. In fact, texture clearly has a greater impact on imaging systems, whereas spectral reflectance measurement was designed for color alone (reflection spectra in the visible wavelength range).
Critically, these color perception mismatches, or zones of confusion, are always debated, and the errors are profoundly significant. This is because, in most cases, the imaging domain RGB parameters are derived with many assumptions and a lack of calibration, illumination information, viewing geometry, etc., while being transformed from or to CIE spaces. Human perception models only accept the CIE system because standardization of an RGB color space is nearly impossible [3,13]. These theoretical errors have been reported for decades, and it is evident that the images obtained from this calibrated imaging system could be closer to our color perception and appearance under texture change. In addition, a color image with calibrated RGBs can be much more useful than spectral data for real-world applications. Complex algorithms for domain transfer from device-dependent to device-independent systems are progressively being used for various important applications, including medical imaging; however, these conversions may cause serious errors and outliers when generalized. Sciuto et al. (2017) [14] and Lo Sciuto et al. (2021) [15] reported improved network classifiers and feature extraction algorithms for recognizing organic solar cell defects.
The typical RGB values of standard illuminants in different device-dependent RGB spaces are given in Table 2. These values were computed with the spectral calculator spreadsheet by Bruce Justin Lindbloom [16], which was also used later for RGB calculations and visualizations in various RGB color spaces. The application of device-dependent systems has been growing exponentially in AI, cloud computing, virtual reality, and more. Critically, device-independent systems such as CIEL*a*b* are always associated with their illuminant and observer pairs; however, image processing researchers rarely consider this and incorrectly assume many parameters when computing RGB associations for empirical models with complex algorithms.
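Transforms between CIE XYZ and such device-dependent spaces follow the standard matrix-plus-transfer-function recipe. As a hedged illustration (using the published IEC 61966-2-1 sRGB matrix and gamma, which are standard values rather than anything specific to this study):

```python
import numpy as np

# Standard XYZ (D65) -> linear sRGB matrix (IEC 61966-2-1).
XYZ_TO_SRGB = np.array([
    [ 3.2406, -1.5372, -0.4986],
    [-0.9689,  1.8758,  0.0415],
    [ 0.0557, -0.2040,  1.0570],
])

def xyz_to_srgb(xyz):
    """Convert a CIE XYZ triple (Y normalized to 1) to 8-bit sRGB."""
    rgb = XYZ_TO_SRGB @ np.asarray(xyz, dtype=float)
    rgb = np.clip(rgb, 0.0, 1.0)
    # sRGB transfer function (gamma encoding).
    rgb = np.where(rgb <= 0.0031308, 12.92 * rgb, 1.055 * rgb ** (1 / 2.4) - 0.055)
    return np.round(rgb * 255).astype(int)

print(xyz_to_srgb((0.9505, 1.0000, 1.0890)))  # D65 white -> ~[255, 255, 255]
```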

3.3. Color Intensity for Quantitative Evaluation

The intensity values were computed as (R+G+B)/3 for the 10 incremental dye concentrations (Table 3). It can easily be noticed that the estimated intensity decreases as the dye concentration increases, which implies lower surface reflection with higher dye absorption.
The relationship between color concentration and intensity was established using quantitative evaluation (Figure 7). Here, it is relevant to investigate whether the calibrated image intensity can imitate absorbance or dye uptake (K/S), i.e., whether the intensity can be estimated reasonably well when the concentration is known. The measured intensities of the three textures (30 in total) were plotted against the 10 incremental concentrations, as shown in Figure 7. The Hoerl model (y = a·b^x·x^c) was one of the simplest and best-fitting models we tried, and the estimated parameters were as follows:
  • PLAIN: a = 181.97, b = 1.0034, c = −0.04891; R2 = 0.996
  • TWILL: a = 182.54, b = 1.008, c = −0.060; R2 = 0.9958
  • M.TWILL: a = 183.04, b = 1.0084, c = −0.0626; R2 = 0.9942
A high coefficient of determination was observed in all cases (R2 > 0.99), confirming good prediction capability from calibrated imaging.
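A minimal sketch of such a fit, using SciPy’s curve_fit (our illustration, not the authors’ tooling) with the plain-weave intensities from Table 3:

```python
import numpy as np
from scipy.optimize import curve_fit

def hoerl(x, a, b, c):
    """Hoerl model y = a * b**x * x**c used in Figure 7."""
    return a * np.power(b, x) * np.power(x, c)

# Dye concentration (%) and plain-weave intensities from Table 3.
conc = np.array([0.25, 0.5, 0.75, 1, 1.5, 2, 3, 4, 5, 6])
intensity = np.array([194.656, 188.669, 185.828, 182.218, 179.798,
                      175.681, 174.650, 173.182, 170.434, 170.315])

params, _ = curve_fit(hoerl, conc, intensity, p0=(180.0, 1.0, -0.05))
print(params)  # roughly a ~ 182, b ~ 1.003, c ~ -0.05, per the reported fit
```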

3.4. Color Combinations and Verification of Various Color Space and CIE Chromaticity Visualizations

Further, we conducted ground truth experiments with three dyes using equal proportions of their primary, secondary, and ternary mixtures (Table 4 and Figure 8). Various color space RGB representations of the dye mixtures (BC at 1, 2, 3, and 4%) were computed [16] and are presented in Figure 9.
The reflectance measurement was conducted using spectrophotometry, and calibrated imaging RGBs were calculated in MATLAB. The spreadsheet by Bruce Justin Lindbloom was used to calculate the various device-dependent RGB standards, and all raw data are provided in an Excel file. Typical binary combinations of 1, 2, 3, and 4% of Dye B and Dye C in three major color space RGB representations (Apple RGB, Adobe RGB, and ProPhoto RGB) are given in Figure 9. It is evident that they are well represented in both qualitative and quantitative analyses (curves of the same family have similar spectral reflectance). In addition, we investigated the similarity of their representations in the CIE chromaticity (CIExy) diagram (Figure 10) and the 3D representation of the linearity of individual dyes and dye mixtures in both domains (Figure 11). The physical significance of this is that a two-dye mixture in a given proportion will lie on a straight line until it becomes saturated.
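The chromaticity coordinates plotted in Figure 10 follow directly from the tristimulus values; a minimal sketch (the D65 check value is a standard reference, not measured data from this study):

```python
def xyz_to_xy(X, Y, Z):
    """CIE xy chromaticity coordinates from tristimulus values."""
    s = X + Y + Z
    return X / s, Y / s

# Sanity check with the D65 white point: expect approximately (0.3127, 0.3290).
print(xyz_to_xy(0.9505, 1.0000, 1.0890))
```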
These experimental findings demonstrate that both qualitative and quantitative evaluations are possible in the calibrated digital domain and that they are comparable to spectral or device-independent systems. The calibrated red, green, blue, RG, RB, GB, and RGB polynomial expansion can be treated as an alter ego of spectral responses for any color space, including dye mixtures, for practical applications. The CIE XYZ and 3D RGB trends revealed the distinct dye and combination profiles.

3.5. Reflectance Prediction in Terms of Calibrated RGB Polynomial Regression

Having discussed earlier how the proposed RGB polynomial could serve as an alter ego of the reflectance function, we investigated how well the reflectance function (31 values over the visible wavelengths of 400–700 nm with a 10 nm gap) could be predicted. It is logical that mixtures of the three primary dyes generate a wide gamut of colors. We treated the 27 dye mixture samples in a similar fashion to how a database is prepared for computer color matching. This can be denoted mathematically as follows:
Rλ = f(RGB polynomial)
We used 3, 8, 11, 20, and 23 argument coefficients here, and did not try more arguments, as they added complexity without improving prediction in our earlier investigation [5,17,18]. The predicted reflectance values were then converted into theoretical CIEL*a*b* for various illuminant–observer pairs to calculate the predicted DE. All the experimental data (actual and predicted reflectance, coefficients, and DE) for the various illuminant–observer pairs are provided in the Supplementary File. Figure 12 clearly shows that the reflectance function is well predicted by the 23-argument polynomial RGB. The arguments of the five polynomial RGB functions are listed below, followed by an illustrative regression sketch. Table 5 summarizes the color difference results derived from the reflectance values predicted by these five models. In Figure 12a, the experimental reflectance is plotted as a continuous line and the reflectance predicted by the 23-argument model is marked with asterisks.
  • 3: R, G, B
  • 8: R, G, B, R×G×B, R×G, R×B, G×B, 1
  • 11: R, G, B, R×G×B, R×G, R×B, G×B, R², G², B², 1
  • 20: R, G, B, R×G×B, R×G, R×B, G×B, R², G², B², R³, G³, B³, G×R², B×G², R×B², B×R², R×G², G×B², 1
  • 23: R, G, B, R×G×B, R×G, R×B, G×B, R², G², B², R³, G³, B³, G×R², B×G², R×B², B×R², R×G², G×B², G×B×R², B×R×G², R×G×B², 1

4. Conclusions

In fact, exact spectral reconstruction is even more difficult to achieve. These challenges can be easily understood if we revisit the development of the CIE systems themselves, with their defined illuminants, observer functions (cleverly designed with real-time inputs from expert human observers), and viewing geometry set-ups. For example, we need to predict 31 values (%R from 400–700 nm in a 10 nm gap) from three colorimetric readings of CIE L*, a*, and b*. The systems required to do this are well established and utilized today for specific domain applications, unlike RGB systems, which have evolved mainly to produce pleasing images with little colorimetric know-how.
In current practice, the transformation of device-dependent imaging parameters into device-independent CIE parameters is mandatory, as human perception models only accept the CIE system. Domain knowledge is a definite prerequisite before pre- or post-processing color images in a compressed digital color space, or before mapping specific RGBs to spectrophotometric or colorimetric readings under a certain illuminant and observer. Specifically, the color and appearance perception mismatches, or zones of confusion, are being critically debated in real-world applications. In fact, humans can differentiate more colors than the CIE systems represent. The fundamental cause of these errors is profoundly significant for critical decision-making applications when imaging domain parameters, derived with many assumptions and a lack of calibration, illumination information, uniformity, etc., are transformed to CIE spaces.
Previously, we proposed an alternative reflectance function obtained from the calibrated sphere imaging system, with a theoretical explanation, to analyze and validate the close interplay of texture and color. Here, with three different textures and incremental color depths of three color combinations, we experimented with and validated the qualitative analysis of color and texture in both domains. The concentration of a particular color can be estimated from the calibrated image intensity. The human perception of color differences and its ambiguity are explained. The color difference in terms of DERGB is perceived well, and texture has a large influence on it. The reflectance predictions from polynomial regression RGB models were found to be reasonably accurate. The various RGB spaces and CIExyz representations of color combinations were found to be congruent. Finally, precision can be ensured if an image is well calibrated under diffused and uniform illumination.

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/electronics12092135/s1. All experimental data are provided in the Supplementary File.

Author Contributions

N.R. and A.K. conceptualized the current investigation, conducted the experiments, analyzed the data, and prepared the manuscript under the supervision of P.P., J.H., G.B., and K.N., who provided experimental feedback and reviewed the manuscript. All authors have read and agreed to the published version of the manuscript.

Funding

We are thankful for the support from the New Fiber Science and IoT Lab, OUTR, sponsored by TEQIP-3 seed money and MODROB (/9-34/RIFDMO DPOLICY-1/2018-19).

Data Availability Statement

Provided in 2 Supplementary Files.

Conflicts of Interest

The authors declare no conflict of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

References

  1. Rout, N.; Baciu, G.; Pattanaik, P.; Nakkeeran, K.; Khandual, A. Color and Texture Analysis of Textiles Using Image Acquisition and Spectral Analysis in Calibrated Sphere Imaging System-I. Electronics 2022, 11, 3887. [Google Scholar] [CrossRef]
  2. Yao, P. Advanced Textile Image Analysis Based on Multispectral Color Reproduction. Ph.D. Dissertation, Hong Kong Polytechnic University, Hong Kong, China, 2022. [Google Scholar]
  3. Khandual, A.; Baciu, G.; Rout, N. Colorimetric processing of digital color image! Int. J. Adv. Res. Comp. Sc. Soft. Eng. 2013, 3, 103–107. [Google Scholar]
  4. Zhang, J.; Su, R.; Fu, Q.; Ren, W.; Heide, F.; Nie, Y. A survey on computational spectral reconstruction methods from RGB to hyperspectral imaging. Sci. Rep. 2022, 12, 11905. [Google Scholar] [CrossRef] [PubMed]
  5. Khandual, A.; Baciu, G.; Hu, J.; Zeng, E. Color Characterization for Scanners: Dpi and Color Co-Ordinate Issues. Int. J. Adv. Res. Comput. Sci. Softw. Eng. 2012, 2, 354–365. [Google Scholar]
  6. Ershov, E.; Savchik, A.; Shepelev, D.; Banić, N.; Brown, M.S.; Timofte, R.; Mudenagudi, U. NTIRE 2022 challenge on night photography rendering. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, New Orleans, LA, USA, 18–24 June 2022; pp. 1287–1300. [Google Scholar]
  7. Nayak, S.; Khandual, A.; Mishra, J. Ground truth study on fractal dimension of color images of similar texture. J. Text. Inst. 2018, 109, 1159–1167. [Google Scholar] [CrossRef]
  8. Nie, C.; Xu, C.; Li, Z.; Chu, L.; Hu, Y. Specular reflections detection and removal for endoscopic images based on brightness classification. Sensors 2023, 23, 974. [Google Scholar] [CrossRef] [PubMed]
  9. Abdulateef, S.K.; Hasoon, A.N. Comparison of the components of different color spaces to enhanced image representation. J. Image Process. Intell. Remote Sens. 2023, 3, 11–17. [Google Scholar]
  10. Lin, Y.T.; Finlayson, G.D. An investigation on worst-case spectral reconstruction from RGB images via Radiance Mondrian World assumption. Color Res. Appl. 2023, 48, 230–242. [Google Scholar] [CrossRef]
  11. Gupte, V.C. Color Technology: Tools, Techniques and Applications; Woodhead Publishing: Sawston, UK, 2008. [Google Scholar]
  12. Kandi, S.G. The Effect of Spectrophotometer Geometry on the Measured Colors for Textile Samples with Different Textures. J. Eng. Fibers Fabr. 2011, 6, 70–78. [Google Scholar] [CrossRef]
  13. Süsstrunk, S.; Buckley, R.; Swen, S. Standard RGB color spaces. In Proceedings of the IS&T/SID 7th Color Imaging Conference, Lausanne, Switzerland, 16–19 November 1999; pp. 127–134. [Google Scholar]
  14. Sciuto, G.L.; Capizzi, G.; Gotleyb, D.; Linde, S.; Shikler, R.; Woźniak, M.; Połap, D. Combining SVD and co-occurrence matrix information to recognize organic solar cells defects with a elliptical basis function network classifier. In Proceedings of the Artificial Intelligence and Soft Computing: 16th International Conference, ICAISC 2017, Zakopane, Poland, 11–15 June 2017; Proceedings, Part II; Springer International Publishing: Berlin/Heidelberg, Germany; pp. 518–532. [Google Scholar]
  15. Lo Sciuto, G.; Capizzi, G.; Shikler, R.; Napoli, C. Organic solar cells defects classification by using a new feature extraction algorithm and an EBNN with an innovative pruning algorithm. Int. J. Intell. Syst. 2021, 36, 2443–2464. [Google Scholar] [CrossRef]
  16. Bruce Justin Lindbloom’s Spectral Calculator Spreadsheet. Available online: http://www.brucelindbloom.com/ (accessed on 1 March 2023).
  17. Khandual, A.; Baciu, G.; Hu, J.; Zheng, D. Colour characterization for scanners: Validations on Textiles & Paints. Int. J. Adv. Res. Comput. Sci. Softw. Eng. 2001, 3, 1008–1013. [Google Scholar]
  18. Baciu, G.; Khandual, A.; Hu, J.; Xin, B. Device and Method for Testing Fabric Color. CN 101788341 B Patent 1 October 2012. [Google Scholar]
Figure 1. A typical % reflectance and K/S profile of ultra orange RGB dye.
Figure 2. Measurement of three textures together in a calibrated integrated sphere imaging system.
Figure 3. Incremental yellow dyed samples for three textures.
Figure 4. Spectral and RGB polynomial representations of all 30 yellow samples.
Figure 5. DERGB and DE of three textures, dyed with yellow in the same bath.
Figure 6. Image ranking according to the color difference DE and DERGB for all 30 samples.
Figure 7. Intensity average vs. dye concentrations (a) of all 30 yellow samples; Hoerl model fit for (b) plain weave, (c) twill weave, and (d) modified twill weave.
Figure 8. Images of the dyed samples; three dyes: primary, secondary, and ternary mixture.
Figure 9. Various color space RGB representations of the dye mixture (BC at 1, 2, 3, and 4%).
Figure 10. CIExy diagram for all 27 color combinations.
Figure 11. 3D Plots of CIExyz and RGBs of all 27 dyed samples.
Figure 12. (a) Reflectance functions of 27 samples predicted from a 23-argument RGB polynomial. It is noteworthy that these reflectance models would be more accurate in cases where the same substrate and dye combinations are used, and it is highly probable that their predictions would ensure non-metameric matches for all sets of ASTM illuminant–observer pairs. The experimental and predicted CIE L*a*b* are plotted in (b–d): (b) CIE L* vs. Lp*, (c) CIE a* vs. ap*, and (d) CIE b* vs. bp*.
Table 1. Experimental RGBs, color difference DE, and DERGB of three textures (Y42 as reference).
Image (Plain)   R         G         B         L*       a*        b*       DERGB    DE
Y42             204.533   209.373   157.184   82.264   −10.591   40.672   0.000    0.000
Y45             208.447   207.940   135.769   81.522   −11.025   52.999   21.817   12.357
Y48             209.246   207.691   130.600   81.427   −10.942   56.523   27.051   15.877
Y51             210.326   206.055   121.697   80.914   −10.662   61.406   36.109   20.778
Y54             211.899   205.708   116.833   80.870   −10.024   64.534   41.182   23.910
Y55             217.369   207.660   109.919   82.483   −9.235    70.781   49.007   30.141
Y58             219.379   205.177   99.177    82.355   −8.232    75.616   60.023   35.024
Y61             219.503   203.863   95.189    82.013   −7.573    76.966   64.015   36.420
Y64             219.910   201.003   88.265    81.746   −7.075    79.624   71.108   39.114
Y67             221.463   201.337   85.567    81.241   −6.406    81.147   74.028   40.704
Y70             221.886   199.651   80.806    80.811   −5.926    82.224   78.926   41.839
Image (Twill)   R         G         B         L*       a*        b*       DERGB    DE
Y43             213.043   212.648   157.694   84.994   −10.574   43.934   9.132    4.254
Y46             214.214   211.979   136.878   84.005   −10.878   56.473   22.646   15.899
Y49             213.269   208.123   126.858   83.784   −10.610   62.748   31.584   22.129
Y52             215.311   209.160   120.162   82.911   −10.050   66.032   38.559   25.374
Y56             221.901   208.694   107.559   84.147   −7.277    75.269   52.581   34.807
Y59             225.645   204.784   91.718    83.460   −5.961    80.494   68.939   40.108
Y62             224.473   202.668   88.074    83.045   −5.239    82.006   72.241   41.686
Y65             226.816   201.740   84.951    82.349   −4.162    82.504   75.977   42.324
Y68             227.116   198.948   81.704    82.181   −3.247    82.999   79.473   42.960
Y71             226.662   195.977   75.430    81.522   −2.684    85.341   85.749   45.370
Image (M. Twill)  R       G         B         L*       a*        b*       DERGB    DE
Y44             217.647   216.200   152.434   87.016   −10.600   48.633   15.528   9.271
Y47             218.018   213.410   129.578   85.588   −9.775    63.200   30.987   22.787
Y50             218.583   212.677   124.088   85.232   −9.178    68.215   36.107   27.739
Y53             219.456   210.864   115.429   84.960   −8.427    70.903   44.367   30.428
Y57             213.761   202.186   102.455   80.302   −8.790    71.804   55.965   31.246
Y60             213.628   200.362   96.945    79.861   −8.503    73.383   61.585   32.866
Y63             215.491   201.274   95.167    79.881   −8.519    73.122   63.496   32.604
Y66             215.868   199.699   89.600    79.372   −7.628    76.591   69.208   36.157
Y69             214.343   195.389   84.453    78.937   −7.086    78.145   74.710   37.784
Table 2. Typical RGB values of standard illuminants in different device-dependent RGB spaces.
Working Space        Reference Illuminant   Red [0–1]   Green [0–1]   Blue [0–1]   Red [0–255]   Green [0–255]   Blue [0–255]
Adobe RGB (1998)     D65                    0.4947      0.4219        0.4095       126           108             104
Apple RGB            D65                    0.4440      0.3459        0.3347       113           88              85
Best RGB             D50                    0.4933      0.4333        0.4120       126           110             105
Beta RGB             D50                    0.4913      0.4270        0.4115       125           109             105
Bruce RGB            D65                    0.5099      0.4219        0.4095       130           108             104
CIE RGB              E                      0.5120      0.4322        0.4130       131           110             105
ColorMatch RGB       D50                    0.4444      0.3454        0.3333       113           88              85
Don RGB 4            D50                    0.4934      0.4285        0.4115       126           109             105
ECI RGB v2           D50                    0.5319      0.4586        0.4445       136           117             113
Ekta Space PS5 RGB   D50                    0.4983      0.4275        0.4114       127           109             105
NTSC RGB             C                      0.4922      0.4251        0.4122       126           108             105
PAL/SECAM RGB        D65                    0.5166      0.4219        0.4088       132           108             104
ProPhoto RGB         D50                    0.4071      0.3599        0.3386       104           92              86
SMPTE-C RGB          D65                    0.5261      0.4199        0.4092       134           107             104
sRGB                 D65                    0.5246      0.4233        0.4098       134           108             105
Wide Gamut RGB       D50                    0.4875      0.4326        0.4109       124           110             105
Table 3. Intensity of all three textured samples with respect to 10 incremental concentration ranges.
Dye Conc. (%)   Plain     Twill     M. Twill
0.25            194.656   199.113   200.631
0.5             188.669   191.246   191.700
0.75            185.828   186.492   186.726
1               182.218   183.356   184.989
1.5             179.798   180.277   180.537
2               175.681   176.884   176.851
3               174.650   176.792   177.010
4               173.182   173.664   173.835
5               170.434   172.941   173.575
6               170.315   171.017   170.746
(Values are mean intensity, (R+G+B)/3.)
Table 4. Experimental RGBs of dyed samples; three dyes: primary, secondary, and ternary mixture.
Expt. R    Expt. G      Expt. B     Sample              Dye A (%)   Dye B (%)   Dye C (%)
244.94     245.19       245.07      White plate start   0           0           0
212.67     215.70       220.67      Plain               0           0           0
221.82     224.30       229.30      Twill               0           0           0
222.30     223.41       228.29      M. Twill            0           0           0
225.8854   183.092725   161.5266                        1           –           –
228.9698   171.01615    145.9466                        2           –           –
231.4409   157.77435    128.4438                        3           –           –
229.0542   151.03805    120.8743                        4           –           –
132.9036   187.883875   199.5574                        –           1           –
103.2905   171.487225   186.7164                        –           2           –
81.85803   154.90475    171.7595                        –           3           –
68.36905   147.47215    165.4398                        –           4           –
226.5088   143.528725   168.8932                        –           –           1
237.1594   109.479525   143.0099                        –           –           2
237.0265   93.90495     128.7294                        –           –           3
237.1948   76.637475    114.6642                        –           –           4
66.98825   86.420775    68.52315                        1           1           –
42.9088    54.638625    40.33238                        2           2           –
32.21978   40.2767      31.8978                         3           3           –
33.49718   37.416775    31.14613                        4           4           –
236.8776   47.72975     53.10033                        1           –           1
228.1448   32.008575    40.05553                        2           –           2
219.2528   22.077875    32.17638                        3           –           3
216.452    20.80095     30.01298                        4           –           4
75.47473   56.261325    81.50173                        –           1           1
51.27668   37.812475    56.64295                        –           2           2
40.11753   31.79935     45.60925                        –           3           3
34.06043   26.35805     38.14755                        –           4           4
122.5145   90.82425     96.30858                        0.5         0.5         0.5
88.75988   60.102025    62.1925                         1           1           1
28.35833   33.9254      37.27445                        2           2           2
245.04     245.13       245.36      White plate end
Dye A: Ultra-RGB Carmen; Dye B: Ultra-RGB Navy Blue; Dye C: Ultra-RGB Red.
Table 5. DE from predicted reflectance for D65_64.
D65_64   3 coeff   8 coeff   11 coeff   20 coeff   23 coeff
DE Avg   2.3401    1.5       1.3221     0.4106     0.3072
DE Max   4.0535    2.72      2.9029     1.0776     1.0373
DE Min   0.8291    0.4039    0.0873     0.1007     0.0595
DE Std   0.848     0.7361    0.7566     0.2691     0.2486
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
