Article

Developing a Quality Evaluation System for Color Reproduction of Color 3D Printing Based on MATLAB Multi-Metrics

Liru Wang, Jiangping Yuan, Qinghua Wu and Guangxue Chen
1 State Key Laboratory of Pulp and Paper Engineering, South China University of Technology, Guangzhou 510640, China
2 College of Communication and Art Design, University of Shanghai for Science and Technology, Shanghai 200093, China
* Authors to whom correspondence should be addressed.
These authors contributed equally to this work.
Materials 2023, 16(6), 2424; https://doi.org/10.3390/ma16062424
Submission received: 1 February 2023 / Revised: 10 March 2023 / Accepted: 14 March 2023 / Published: 18 March 2023
(This article belongs to the Special Issue Recent Advances in Color 3D Printing)

Abstract

Color 3D printing has been widely used in fields such as culture, medicine, industry, and food. The color reproduction accuracy demanded of 3D printed products in these fields is increasing, which calls for more reproduction methods and practical tools. At present, most color 3D printing devices use a single quantitative index, the color difference, to predict color reproduction quality directly. However, this single index is not optimal for the curved surfaces of 3D printed color objects. Based on color evaluation principles, this study proposes five new quantitative metrics for comparison, consisting of a color gamut comparison index, a color SSIM index, a color FSIM index, an iCID index, and subjective scaling values, and the corresponding GUI design and code implementation of the new color quality evaluation system are carried out in MATLAB. Moreover, comprehensive color assessment of color 3D printed products is achieved using standard image acquisition and microscopic imaging methods that are not limited by printing materials or sampling locations. The system is validated to operate interactively, simply, and efficiently. As a result, it can provide valuable new feedback for the color separation and output calibration of color 3D printing devices.

1. Introduction

In recent years, 3D printing has become a flexible and powerful technology in advanced manufacturing [1]. It has been widely used in many countries for personalized manufacturing and smart fabrication [2,3], and has driven major changes in industrial design [4,5], manufacturing methods [6,7], and digital workflows [8]. With the diversification of materials and the digitization of processes, high-fidelity manufacturing of both shape and performance by 3D printing is becoming increasingly achievable [9,10,11]. One of the key challenges that 3D printing still needs to address is color reproduction and its control, which is attracting increasing attention in multidisciplinary applications [12]. As a result, color 3D printing has become an important and rapidly growing branch of the 3D printing industry [13].
Currently, color reproduction in 3D printing basically relies on the textured color of the material or adhesive, which is affected by the color conversion algorithm and the digital transmission model. For example, Wang et al. explored the color change properties of natural pH-sensitive pigments using an electrolytic method to achieve hue control in 3D printing by adjusting the potential [14]. Meanwhile, Wei et al. presented a response surface methodology to determine the optimal color specification to compensate for the deviation between the measured color of a PolyJet 3D printed sample and the target digital color [15]. Subsequently, Lee et al. explored the effects of layer thickness and print orientation on the color stability of 3D printed objects using a spectrophotometric method [16]. Clearly, the current research literature on color 3D printing focuses on offline color reproduction quality evaluation and optimization, and lacks practical online inspection and evaluation tools.
In parallel, color gamut and chromaticity attributes have emerged as color reproduction quality evaluation indexes in studies of the factors influencing color reproduction in color 3D printing [17,18]. This is a useful inspiration for the current 3D printing community, which relies heavily on a single color difference value as the indicator of color reproduction quality. Furthermore, some preliminary studies have confirmed that chromatic metrics do not precisely match human vision in the color representation of color 3D printed objects, and more evaluation metrics are needed [19,20]. For this reason, image-based metrics such as structural similarity, feature similarity, and image difference, which are full-reference image quality comparison metrics, have gradually been adapted for color 3D printing [21,22,23]. These image-based metrics are implemented statistically, extending from grayscale map comparison to color map comparison.
In addition, studies on the quality evaluation of the color reproduction of 3D printed products with different substrates have been discussed as part of the objective metrics described above. Previously, Liu et al. developed an initial 3D color reproduction system combining 3D scanning and 3D printing, using a polynomial regression color management approach to reduce its color difference [24]. In fact, current analyses of the challenges of accurate color reproduction in color 3D printing also indirectly illustrate the lack of color accuracy measurement methods in existing standards organizations, and provide an extended framework for improving color reproduction evaluation accuracy that can be applied online [25]. At the same time, color reproduction evaluation is not embedded in color 3D printing systems, and there are few available online analysis tools that include these objective metrics. Therefore, the development of a color reproduction quality evaluation system that can automatically analyze data is particularly critical for color 3D printing.
In this paper, a quality evaluation system for the color reproduction of color 3D prints is designed and implemented with MATLAB multi-metrics assigned to different modules. These metrics comprise the color difference index, color gamut comparison index, color SSIM index, color FSIM index, iCID index, and subjective scaling values, which are analyzed with different indicators and statistical methods to provide more flexible evaluation options. Section 2 illustrates the basic correlation and algorithmic framework of the color quality evaluation metrics for color 3D printing. Section 3 presents the guided interface design and key code of this evaluation system in detail. Section 4 tests the effectiveness of the developed system and discusses ideas for optimizing its color quality evaluation.

2. Quality Evaluation Metrics for Color Reproduction of Color 3D Prints

2.1. Correlation of Color Quality Evaluation Metrics

The number of color quality evaluation indicators has a great impact on the accuracy and efficiency of the color reproduction quality evaluation of color 3D printed parts. Color difference is the most commonly used quantitative metric, but it does not evaluate arbitrary colored areas of color 3D printed parts very well. As a result, this study proposes three color image-based metrics and a subjective scaling metric for correlation analysis, which can more fully reflect the color reproduction characteristics of color 3D printed samples. The three color image-based metrics are the color SSIM, color FSIM, and iCID values, obtained under standard imaging conditions with precise devices.
The color difference value is usually calculated with one of two formulas, CIEDE1976 or CIEDE2000, the former being concise and the latter highly accurate. The image-based metrics count all colors at a given viewpoint, allowing sample points in regions of different curvature to be compared. Existing image-based metrics convert images to grayscale maps for statistical analysis, but the image-based metrics used in this study are computed directly on color maps to evaluate the color differences. Because the features and weights used by different color image-based metrics differ, three typical metrics are selected for testing and analysis in this study. As acquiring the standard image required by the color image-based metrics is more complicated than acquiring the color values of sample points, this system performs a numerical correlation analysis between the color difference and the color image-based metrics on the printed sample data. The surface coloring efficiency (SCE) metric is also based on physical measurements of the sampled surfaces of the color 3D prints in order to provide a new objective assessment. Moreover, the subjective scaling metric is a standard observer's perceived evaluation of the surface color of color 3D printed parts, and is the basis on which all of the current objective evaluation methods were developed.
Overall, this evaluation system contains most key metrics currently relevant to the color quality of color 3D prints, from objective indicators to subjective indicators.

2.2. Algorithmic Framework of Color Quality Evaluation Metrics

Based on the multiple metrics required for the color quality evaluation described above, the solution process was decomposed step by step and compiled into specific functional modules for color 3D printing of different materials using MATLAB R2016a.
(1) Color difference algorithm
There are many formulas that can be used for point-by-point color difference analysis, but the classical color difference formulas are CIEDE1976 and CIEDE2000 [23].
\Delta E_{76}^{*}(O,I)=\sqrt{(L_{o}^{*}-L_{i}^{*})^{2}+(a_{o}^{*}-a_{i}^{*})^{2}+(b_{o}^{*}-b_{i}^{*})^{2}} \quad (1)

\Delta E_{00}^{*}(O,I)=\sqrt{\left(\frac{L_{o}^{*}-L_{i}^{*}}{K_{L}S_{L}}\right)^{2}+\left(\frac{C_{o}^{*}-C_{i}^{*}}{K_{C}S_{C}}\right)^{2}+\left(\frac{H_{o}^{*}-H_{i}^{*}}{K_{H}S_{H}}\right)^{2}+R_{T}\left(\frac{C_{o}^{*}-C_{i}^{*}}{K_{C}S_{C}}\right)\left(\frac{H_{o}^{*}-H_{i}^{*}}{K_{H}S_{H}}\right)} \quad (2)
where ΔE*76(O, I) in Equation (1) is the color difference of all color samples of two color 3D reference plates calculated with the CIEDE1976 color difference formula, and, similarly, ΔE*00(O, I) in Equation (2) is the color difference of all color samples of two color 3D reference plates calculated with the CIEDE2000 color difference formula. L*, a*, and b* are the chromaticity values of the test sample; C* and H* are the corresponding chroma and hue terms; K_L, K_C, K_H and S_L, S_C, S_H are the parametric and weighting factors; and R_T is the rotation term.
In order to provide multiple evaluations of the color differences of color samples on the surface of color 3D printed parts, all of them are compiled into the quality evaluation system in this study. Hence, Figure 1 shows the specific compilation flow corresponding to the two algorithms.
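For illustration, a minimal MATLAB sketch of the point-by-point CIEDE1976 calculation in Equation (1) is given below; the variable names and numeric values are placeholders rather than the authors' compiled code, and the CIEDE2000 case of Equation (2) would be wrapped in its own function in the same way.

```matlab
% Minimal sketch of the CIEDE1976 color difference in Equation (1).
% labRef and labTest are N-by-3 CIEL*a*b* matrices for the reference plate
% and the printed sample (illustrative data only).
labRef  = [52.1 48.3 -14.2; 86.5 -1.2 4.8];
labTest = [50.4 46.9 -12.8; 84.9 -0.6 6.1];

dE76 = sqrt(sum((labRef - labTest).^2, 2));  % per-patch color difference
meanDE76 = mean(dE76);                       % overall average used by Module 4
fprintf('Mean Delta E76 = %.2f\n', meanDE76);
```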
(2) FSIM algorithm
The FSIM algorithm is a full-reference algorithm commonly used for the objective comparison of image quality, and it is used here to quantify color differences in feature areas. This study utilizes this metric to quantify the color quality of the main feature region of a standard acquired image taken from a given viewpoint of a color 3D printed part; its corresponding compilation framework is shown in Figure 2. The FSIM algorithm is compiled as a MATLAB function that operates on grayscale maps for the difference calculation. In the function module of this system, the color channels of the test images are therefore separated and each channel is quantified by the FSIM statistics, which are then combined in the main function.
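As a rough illustration of this per-channel strategy, the following sketch splits the RGB planes and averages a grayscale FSIM score over them; fsim_gray is a hypothetical stand-in for the grayscale FSIM function compiled in Figure 2, and the file names are placeholders.

```matlab
% Per-channel "color FSIM" sketch: separate the color channels and average
% the grayscale FSIM scores. fsim_gray is a hypothetical placeholder for the
% grayscale FSIM function compiled in Figure 2.
ref  = im2double(imread('benchmark_view.png'));   % placeholder file names
test = im2double(imread('printed_view.png'));

scores = zeros(1, 3);
for c = 1:3
    scores(c) = fsim_gray(ref(:, :, c), test(:, :, c));  % one score per channel
end
colorFSIM = mean(scores);   % channel average reported by the module
```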
(3) SSIM algorithm
The SSIM algorithm serves a similar purpose to the FSIM algorithm, but its statistics focus on image structure differences, which are important for color 3D feature comparisons. Figure 3 shows its compilation flow; to obtain a color SSIM value for color images, the calling code is provided in the corresponding module of this system.
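A minimal sketch of such a per-channel color SSIM call is shown below, using the built-in ssim function from the Image Processing Toolbox; the simple channel averaging is an assumption about how the module combines the scores, and the file names are placeholders.

```matlab
% Color SSIM sketch: apply the built-in ssim (Image Processing Toolbox) to
% each color channel and average the three scores. File names are placeholders.
ref  = im2double(imread('benchmark_view.png'));
test = im2double(imread('printed_view.png'));

ssimRGB   = arrayfun(@(c) ssim(test(:, :, c), ref(:, :, c)), 1:3);
colorSSIM = mean(ssimRGB);
```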
(4) iCID algorithm
The iCID algorithm is a full-reference metric that statistically analyzes the perception of color differences. Figure 4 shows its compilation framework and numerical statistics. This algorithm is likewise compiled as a MATLAB function that is called by the main program.
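A hedged sketch of the corresponding call is given below; icid_metric is a hypothetical wrapper name for whichever iCID implementation Figure 4 compiles, assumed to return a difference score where 0 means identical images, and the file names are placeholders.

```matlab
% iCID call sketch. icid_metric is a hypothetical wrapper around the iCID
% function compiled in Figure 4; file names are placeholders.
ref  = im2double(imread('benchmark_view.png'));
test = im2double(imread('printed_view.png'));
icidValue = icid_metric(ref, test);   % larger value = larger perceived difference
```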
(5) SCE algorithm
The SCE algorithm is used to quantify the surface coloring efficiency of color 3D printed parts, mainly considering whiteness, roughness, and transparency. At the same time, the weights of the corresponding surface properties can be set flexibly in the system to match different printing materials, as shown in Figure 5. It is compiled into the SCE_3d function, which is called by the main program.
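The sketch below illustrates the general shape of such a weighted score; the normalization and weight values are assumptions for demonstration only, not the coefficients of the authors' SCE_3d function.

```matlab
% Illustrative weighted surface coloring efficiency (SCE) score. The actual
% SCE_3d formula and its coefficients are defined by the authors; the
% normalization and weights below are assumptions for demonstration only.
whiteness = 78.2;  roughness = 3.4;  gloss = 12.6;   % example measurements
w = [0.5, 0.3, 0.2];                                 % user-set weights (Module 6)

attrs = [whiteness / 100, 1 - roughness / 10, gloss / 100];  % assumed normalization
SCE = sum(w .* attrs);
```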
(6) Color gamut comparison algorithm
The ICC recommends one formula for color gamut comparison, but this study provides three formulas in conjunction with the team's previous research [25]. Figure 6 shows the corresponding numerical statistics and compilation framework. In addition, the Venn function is used to give an intuitive visual demonstration of the values calculated by the above formulas and can be called in the main program.
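As a rough illustration of gamut comparison, the sketch below estimates each gamut volume as the convex hull volume of the measured CIEL*a*b* points and forms a simple volume ratio; this ratio is only an assumption for demonstration and is not one of the GCI/GSI/GDI formulas provided by the module.

```matlab
% Gamut volume comparison sketch: estimate each gamut as the convex hull
% volume of its CIEL*a*b* points and compare the volumes. The ratio index is
% illustrative only, not the GCI/GSI/GDI formulas used in Module 5.
% labRef and labTest are placeholder random point clouds standing in for data.
labRef  = bsxfun(@plus, bsxfun(@times, randn(200, 3), [20 40 40]), [60 0 0]);
labTest = bsxfun(@plus, bsxfun(@times, randn(200, 3), [18 35 35]), [58 0 0]);

[~, volRef]  = convhulln(labRef);    % second output is the hull volume
[~, volTest] = convhulln(labTest);
gamutIndex = volTest / volRef;       % >1 means a larger gamut than the reference
```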
(7) Objective correlation algorithm
The integrated evaluation of multiple metrics can achieve a specific balance between efficiency and accuracy. The correlation properties between the five current objective metrics need to be validated, and alternative fitted metrics that can replace the color difference metric are identified. Figure 7 demonstrates the corresponding compilation framework and provides an optimization reference for setting the linear fit parameters of the image-based metrics.
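A minimal sketch of one such fit is given below, using polyfit for the linear regression and the usual coefficient-of-determination formula for R²; the data values are illustrative, not measurements from the paper.

```matlab
% Linear fit between the overall average color difference and one image-based
% metric, with R^2 reported as in Tables 9 and 10. Data values are illustrative.
meanDE76  = [4.8 6.1 3.2 5.5 7.0];          % overall average color differences
meanMSSIM = [0.81 0.75 0.88 0.78 0.71];     % corresponding color SSIM averages

p = polyfit(meanMSSIM, meanDE76, 1);        % Delta E76 = p(1)*MSSIM + p(2)
fitDE = polyval(p, meanMSSIM);
R2 = 1 - sum((meanDE76 - fitDE).^2) / sum((meanDE76 - mean(meanDE76)).^2);
```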
(8) Subjective scaling algorithm
Importantly, the subjective scaling algorithm is used for the quantification of color perception differences on physical surfaces and is commonly characterized by the mean opinion score (MOS). Figure 8 illustrates the compiled framework of the current study for the analysis of visual perceptual differences on the surface of colored 3D printed parts.
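A minimal sketch of the MOS computation is shown below; the score matrix and rating scale are illustrative.

```matlab
% Mean opinion score (MOS) sketch: each row is one observer, each column one
% printed sample; scores and scale are illustrative.
scores = [4 3 5 2;
          5 3 4 2;
          4 4 5 3];          % 3 observers x 4 samples, 1-5 rating scale
MOS = mean(scores, 1);       % one mean opinion score per sample
```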
The current section shows, in detail, the compilation framework of the color quality evaluation algorithms and gives ideas for optimization in color image applications. Some of them are commonly used algorithms and some are innovative metrics of this research team [22,23]; all of them can be compiled into MATLAB functions for different function module calls.

3. Developing a Color Quality Evaluation System for Color 3D Printing

Based on the above-mentioned metrics and their compilation frameworks, this section elaborates on the development of a quality evaluation system for the color reproduction of color 3D printing. The system is divided into eight functional modules, each corresponding to one or more of the above metric calculations, and its structural framework is shown in Figure 9. Modules 1 and 2 serve as data input ports, and the other modules perform the corresponding metric analyses based on their data. Generally, the development of each functional module includes the interface design and the corresponding implementation code.

3.1. Interface Design and Demonstration of Module 1

There are essential structural differences between color 3D printed benchmarks and graphic printed benchmarks, which is the core value of developing this system for the 3D printing community. Module 1 is a relatively simple input module that gives the user the flexibility to select test reference samples. Figure 10 shows the initial interface of Module 1 and a demonstration of importing the Sample1 benchmark file. Table 1 gives details of the types, tags, and roles of the controls used in its interface.
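As a rough sketch of how such an import control can be wired up in a GUIDE-style callback, the fragment below lets the user pick a benchmark view, displays it in the axes, and writes its file name to the edit box; the handle field names are taken from the control tags in Table 1 with spaces removed, and the logic is an assumed minimal version rather than the authors' exact code.

```matlab
% Sketch of the Module 1 import callback (GUIDE style). Handle field names
% follow the control tags in Table 1 (spaces removed); this is an assumed
% minimal version, not the authors' exact implementation.
function show_Callback(hObject, eventdata, handles)
[fileName, pathName] = uigetfile({'*.png;*.jpg;*.tif', 'Image files'});
if isequal(fileName, 0)
    return;                                   % user cancelled the dialog
end
img = imread(fullfile(pathName, fileName));
axes(handles.Figure1);                        % axes tagged "Figure 1" in Table 1
imshow(img);
set(handles.edit1, 'String', fileName);       % edit box tagged "edit 1" in Table 1
```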

3.2. Interface Design and Demonstration of Module 2

After the benchmark test model is selected in Module 1, the raw measurement data of the color 3D printed samples can be imported into Module 2, item by item, for preliminary analysis and visualization. This module accepts five data sets and two image sets for import and supports the renaming, data visualization, and classified storage of each type of test data, as shown in Figure 11. In addition, Table 2 gives details about the types, tags, and roles of the controls used in its interface.
After any category of data is imported, a new window pops up showing five subplots, each containing the measured values of all the color patches on the benchmark test model or their average values. Figure 12a shows the CIEL*a*b* test data, where each subplot corresponds to a color 3D printed test sample, while Figure 12b,c shows detailed examples of importing the two types of image sets. This functional module provides image displays of both macroscopic and microscopic features, allowing analysis of the state of coloring material accumulation as well as differences in the overall surface light radiation properties. This is another innovative point of the current quality evaluation system.

3.3. Interface Design and Demonstration of Module 3

After all of the image set inputs and global variable calls are set up in Module 2, Module 3 can compute the image-based metrics and generate image quality difference maps. This module provides the above three image-based metric comparison operations for the two types of image sets, and supports renaming, data visualization, and classified storage of the computed image-based metrics; its initial interface and key codes are shown in Figure 13. In addition, Table 3 gives details about the types, tags, and roles of the controls used in the interface.

3.4. Interface Design and Demonstration of Module 4

Starting with Module 4, the function of the modules of this evaluation system is metric analysis and quantitative correlation. The current module performs the overall color difference analysis and provides processing code for two representative color difference formulas (CIEDE1976 and CIEDE2000), as shown in Figure 14. Specifically, the control at the bottom of the interface can also provide a color difference analysis for the colors of the same height area on each printed benchmark. In addition, Table 4 gives details of the types, tags, and roles of the controls used in its interface.

3.5. Interface Design and Demonstration of Module 5

Module 5 also calls the saved data of Module 2 for color gamut volume calculation, difference comparison, and interactive visualization. The comparison order follows the order of the image set saved in Module 3, and the reference sample name defaults to the tested image of the first benchmark. Uniquely, the module provides operating code for three color gamut contrast formulas (GCI, GSI, GDI) for the user to select, as shown in Figure 15. In addition, Table 5 gives details of the types, tags, and roles of the controls used in its interface.

3.6. Interface Design and Demonstration of Module 6

The SCE metric in this module is not a fixed metric. Module 6 calls up the whiteness, roughness, and gloss data saved in Module 2, combines them with the defined coefficients of the surface coloring efficiency formula, and then calculates the surface coloring efficiency values and visualizes their trends. The module provides the same order of surface coloring efficiency comparison operations as Module 3, and the reference sample name defaults to the first benchmark. Practically, the module provides a mixed selection function for the three attribute types of whiteness, roughness, and gloss, as shown in Figure 16, and allows the weight coefficient of each attribute type to be set flexibly. In addition, Table 6 gives details of the types, tags, and roles of the controls used in its interface.

3.7. Interface Design and Demonstration of Module 7

The function of Module 7 is to construct a linear relationship between the overall averages of the image-based metrics saved in Module 3 and the overall average color difference saved in Module 4, and to visualize them one by one. Since the order of the overall average color difference operations in Module 4 is the same as the order of the image set saved in Module 3, the two objective measures can be matched to each other. The module provides a flexible choice of horizontal and vertical data for the linear fitting of objective metrics via drop-down menus. Interestingly, when the horizontal coordinate data used for subsequent linear fits are the same, the newly selected image-based metric data are displayed superimposed in the same coordinate system by the codes shown in Figure 17. In addition, Table 7 gives details of the types, tags, and roles of the controls used in its interface.
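A small sketch of this superimposed display is shown below; the data values are illustrative, and hold on is used to overlay each newly selected metric in the same axes.

```matlab
% Superimposed fitting display sketch for Module 7: with the same horizontal
% data (color difference), each newly selected image-based metric is plotted
% into the same axes using hold on. Data values are illustrative.
meanDE76  = [4.8 6.1 3.2 5.5 7.0];
meanMSSIM = [0.81 0.75 0.88 0.78 0.71];
meanFSIMc = [0.84 0.79 0.90 0.82 0.76];

figure; hold on;
plot(meanDE76, meanMSSIM, 'o-');            % first selected metric
plot(meanDE76, meanFSIMc, 's--');           % next selection, superimposed
xlabel('Overall average \DeltaE_{76}');
ylabel('Image-based metric value');
legend('Color SSIM', 'Color FSIM');
hold off;
```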

3.8. Interface Design and Demonstration of Module 8

The function of Module 8 is to quantify, visualize, and save the correlation between subjective and objective metrics through four control buttons, as shown in Figure 18. The data used in this module for the “Subjective index” are the data saved in Module 2, from which the overall average of the subjective perceptions of each benchmark sample is derived. The data for the “Objective index” are derived from the coefficient values obtained by linear regression between the three image-based metrics and the color difference metric, which are used to construct an integrated image-based metric whose trend is then quantified in the same coordinate system as the subjective metric. Based on the color 3D printed samples, Figure 18 also shows their subjective and objective correlation features under the two color difference calculation formulas. In addition, Table 8 gives details of the types, tags, and roles of the controls used in its interface.

4. Results and Discussion

4.1. System Operation Test

Figure 19 shows the initial interface details of the developed system, and Figure 20 shows the terminal interface details based on the real data of our published research article using a paper-based color 3D printer [26]. This system provides a very clear view of the output characteristics of each module, and each module is independent and complementary, providing detailed data trends in terms of the color reproduction quality evaluation. Critically, clicking the control button “Save all” stores the current interface state of the entire system, which provides an external reference for color quality evaluation of the color 3D printing sample.
This evaluation system runs smoothly on a Windows 10 PC and comprises more than 5100 lines of code in total. Importantly, the system is modeled on current industrial practice for evaluating color reproduction in color 3D printing, and its overall function can also be embedded in an online quality inspection system for color 3D printing. In terms of functional implementation, one aspect is the acquisition of subjective and objective features and their operational flows; the other is sound architectural and GUI design. The architecture is designed to ensure the smooth transfer of data flow and logic flow between the modules, which is a key factor affecting computing efficiency and the presentation of results. Thus, the overall interface is designed with eight functional modules that satisfy user aesthetics while providing sufficiently detailed feature information for evaluation.

4.2. Discussion on the Practicality of the Current System

This system is capable of test benchmark input, measurement data import, analysis result visualization, and interface status output, and its test results based on the benchmark data of published research papers match those obtained by manual inference [22,23]. Thus, this subsection further discusses the derived functions of some modules to expand its application to different color 3D printing devices.
(1) Integrated analysis of image-based metrics using the weighted parameter method
There are many image quality attributes, and the quality evaluation index developed for a particular class of attributes does not accurately apply to all image quality evaluations. This leads to the same problem for the evaluation of high-definition acquired images for specific views of color 3D printed entities. Based on the current color 3D printed benchmark and its measurement data, Table 9 and Table 10 show the linear regression equation and the goodness of fit between the overall color difference and the corresponding image-based metric for the two color difference modes, respectively. It can be found that a change in the mode of color difference calculation leads to a change in the weight coefficients of the constructed linear regression equations, but their numerical expressions are in the same form. In addition, artificial intelligence algorithms may be able to improve the fit when the color difference is associated with fewer image-based metrics for the evaluation analysis.
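A minimal sketch of the multi-metric regression behind these tables is given below, using an ordinary least-squares design matrix; the numeric values are illustrative placeholders, not the measurements behind Tables 9 and 10.

```matlab
% Multi-metric regression sketch corresponding to Tables 9 and 10: the overall
% average color difference is fitted against several image-based metrics at
% once by ordinary least squares. All numeric values are illustrative.
dE76  = [4.8; 6.1; 3.2; 5.5; 7.0];
MSSIM = [0.81; 0.75; 0.88; 0.78; 0.71];
FSIMc = [0.84; 0.79; 0.90; 0.82; 0.76];
iCID  = [0.21; 0.27; 0.14; 0.24; 0.31];

X = [MSSIM, FSIMc, iCID, ones(size(dE76))];  % design matrix with intercept
coeff = X \ dE76;                            % least-squares coefficients
R2 = 1 - sum((dE76 - X*coeff).^2) / sum((dE76 - mean(dE76)).^2);
```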
(2) Optimization of the surface coloring efficiency formula based on microscopic imaging
The setting of the weights of the variable parameters for the evaluation of the surface coloring efficiency in this system can be optimized by quantifying the microscopic imaging features of the benchmark substrate surface. Starting from the measurement data of the substrate surface properties makes it possible to circumvent the harsh conditions required for accurate color measurement and also offers the possibility of adjusting the coloring properties of the non-printed layer during printing material distribution. Based on the current evaluation system, the combination of image-based metrics from Module 3 can quantitatively characterize the set of microscopic images imported in Module 2 to guide the setting of the weight coefficients for each attribute in the surface coloring efficiency formula in Module 6. Microscopic image analysis of sampling points on the substrate can qualitatively identify numerical fluctuations in the above surface property measurements, but there is as yet no specific numerical model to guide exactly how to adjust them.
(3) Comprehensive analysis of evaluation results for multiple objective metrics
The current evaluation system provides three types of objective evaluations: image-based metrics in Module 3, chromaticity (color difference and color gamut) in Module 4 and Module 5, and surface coloring efficiency in Module 6, which are quantitatively correlated in Module 7 and Module 8. For color 3D printing applications with uncomplicated color features, the color reproduction quality of the 3D printed parts can be evaluated quickly and accurately using only the color difference and color gamut metrics, which the current evaluation system fully supports. However, for color 3D printed parts with particularly demanding color reproduction requirements, after acquiring the measurement data required to calculate the above three types of objective metrics, it is necessary to reconsider the consistency and preference of the assessment conclusions of each type of objective metric. The goal of the current evaluation system is to be a handy and efficient tool without autonomous judgment; it is entirely up to trained color 3D printing engineers to draw comprehensive inferences from the system's output. These inferences can be reported directly to the material assignment system of the color 3D printing device, which in turn can adjust the color sorting parameters to optimize its color reproduction quality. As a result, when the evaluation inferences of the above three types of objective metrics disagree, the current evaluation system refers first to chromaticity, then to the image-based metrics, and finally to the surface coloring efficiency.

5. Conclusions

This study comprehensively elaborates on the interface design and code compilation required for an automated evaluation system of color reproduction quality for color 3D printing, and discusses in detail the practicality of the current system for the color evaluation of actual 3D printed parts. The evaluation system takes into account the longitudinal height variation of printed color samples, which fundamentally distinguishes it from current color reproduction quality evaluation systems in the field of graphic printing in terms of evaluating the geometric features of the printed object. The current system runs efficiently, and the next step is to use artificial intelligence algorithms for accurate fitting of multi-material data. The number of colors tested by this system is still limited, and subsequent testing of large color gamut entities is also worth exploring in order to improve its evaluation accuracy for multi-material color 3D printing.

Author Contributions

Conceptualization, J.Y. and G.C.; methodology, L.W. and Q.W.; software and data curation, L.W.; validation, L.W. and J.Y.; visualization, L.W. and J.Y.; formal analysis, L.W. and J.Y.; investigation, Q.W. and L.W.; resources, G.C.; writing—original draft preparation, Q.W. and L.W.; writing—review and editing, J.Y. and G.C.; supervision, G.C.; project administration, J.Y.; funding acquisition, G.C. and J.Y. All authors have read and agreed to the published version of the manuscript.

Funding

This work has been financially supported by the Natural Science Foundation of China (Grant No. 61973127), Guangdong Provincial Science and Technology Program (Grant No. 2022A1515011416), and Young Scholars Launching Fund (Grant No. 10-00-309-008-01).

Institutional Review Board Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Larson, N.; Mueller, J.; Chortos, A.; Davidson, Z.S.; Clarke, D.R.; Lewis, J.A. Rotational multimaterial printing of filaments with subvoxel control. Nature 2023, 613, 682–688.
2. Muflikhun, M.A.; Sentanu, D.A. Characteristics and performance of carabiner remodeling using 3D printing with graded filler and different orientation methods. Eng. Fail. Anal. 2021, 130, 105795.
3. Ren, Z.; Gao, L.; Clark, S.J.; Fezzaa, K.; Shevchenko, P.; Choi, A.; Everhart, W.; Rollett, A.D.; Chen, L.; Sun, T. Machine learning-aided real-time detection of keyhole pore generation in laser powder bed fusion. Science 2023, 379, 89–94.
4. Dalpadulo, E.; Petruccioli, A.; Gherardini, F.; Leali, F. A Review of Automotive Spare-Part Reconstruction Based on Additive Manufacturing. J. Manuf. Mater. Process. 2022, 6, 133.
5. Khorasani, M.; Ghasemi, A.; Rolfe, B.; Gibson, I. Additive manufacturing a powerful tool for the aerospace industry. Rapid Prototyp. J. 2021, 28, 87–100.
6. Url, P.; Stampfl, D.; Tödtling, M.; Vorraber, W. Challenges of an additive manufacturing service platform for medical applications. Procedia CIRP 2022, 112, 400–405.
7. Derossi, A.; Corradini, M.; Caporizzi, R.; Oral, M.O.; Severini, C. Accelerating the process development of innovative food products by prototyping through 3D printing technology. Food Biosci. 2023, 52, 102417.
8. Zhang, F.; Deng, K. Innovative application of 3D printing technology in Fashion design. J. Phys. Conf. Ser. 2021, 1790, 012030.
9. Singh, B. Role of Additive Manufacturing in Development of Forming Tools and Dies for Sheet Metal Forming: A Review. Key Eng. Mater. 2022, 6584, 119–128.
10. Alammar, A.; Kois, J.; Revilla-León, M.; Att, W. Additive Manufacturing Technologies: Current Status and Future Perspectives. J. Prosthodont. 2022, 31 (Suppl. S1), 4–12.
11. Nascimento, R.; Martins, I.; Dutra, T.; Moreira, L. Computer Vision Based Quality Control for Additive Manufacturing Parts. Int. J. Adv. Manuf. Technol. 2022, 124, 3241–3256.
12. Golhin, A.; Sole, S.A.; Strandlie, A. Color appearance in rotational material jetting. Int. J. Adv. Manuf. Technol. 2022, 124, 1183–1198.
13. Pagac, M.; Hajnys, J.; Ma, Q.; Jansa, J.; Stefek, P.; Mesicek, J. A Review of Vat Photopolymerization Technology: Materials, Applications, Challenges, and Future Trends of 3D Printing. Polymers 2021, 13, 598.
14. Wang, R.; Li, Z.; Shi, J.; Holmes, M.; Wang, X.; Zhang, J.; Zhai, X.; Huang, X.; Zou, X. Color 3D printing of pulped Yam utilizing a natural pH sensitive pigment. Addit. Manuf. 2021, 46, 102062.
15. Wei, X.; Bhardwaj, A.; Zeng, L.; Pei, Z. Prediction and Compensation of Color Deviation by Response Surface Methodology for PolyJet 3D Printing. J. Manuf. Mater. Process. 2021, 5, 131.
16. Lee, E.-H.; Ahn, J.-S.; Lim, Y.-J.; Kwon, H.-B.; Kim, M.-J. Effect of layer thickness and printing orientation on the color stability and stainability of a 3D-printed resin material. J. Prosthet. Dent. 2022, 127, 784.e1–784.e7.
17. Zheng, L.; Li, C.; Yang, S. Analysis of Color Gamut in Color 3D Printing. Lect. Notes Electr. Eng. 2020, 600, 148–155.
18. Li, C.; Zheng, L.; Xiao, Y. Study on the Influencing Factors of Color Reproduction in Color 3D Printing. Lect. Notes Electr. Eng. 2020, 600, 156–163.
19. Huang, M.; Pan, J.; Wang, Y.; Li, Y.; Hu, X.; Li, X.; Xiang, D.; Hemingray, C.; Xiao, K. Influences of shape, size, and gloss on the perceived color difference of 3D printed objects. J. Opt. Soc. Am. A Opt. Image Sci. Vis. 2022, 39, 916–926.
20. Han, X.; Ren, Y.; Wang, Y.; Wang, J.; Xiong, Y.; Wu, G.; Li, X. Color Reproduction Analysis for 3D Printing Based on Photosensitive Resin. Interdiscip. Res. Print. Packag. 2022, 896, 54–58.
21. Yao, D.; Yuan, J.; Tian, J.; Wang, L.; Chen, G. Pigment Penetration Characterization of Colored Boundaries in Powder-Based Color 3D Printing. Materials 2022, 15, 3245.
22. Yuan, J.; Tian, J.; Chen, C.; Chen, G. Experimental Investigation of Color Reproduction Quality of Color 3D Printing Based on Colored Layer Features. Molecules 2020, 25, 2909.
23. Tian, J.; Yuan, J.; Li, H.; Yao, D.; Chen, G. Advanced Surface Color Quality Assessment in Paper-Based Full-Color 3D Printing. Materials 2021, 14, 736.
24. Liu, Y.; Zhang, R.; Ye, H.; Wang, S.; Wang, K.-P.; Liu, Y.; Zhou, Y. The development of a 3D colour reproduction system of digital impressions with an intraoral scanner and a 3D printer: A preliminary study. Sci. Rep. 2019, 9, 20052.
25. Yuan, J.; Chen, G.; Li, H.; Prautzsch, H.; Xiao, K. Accurate and Computational: A Review of Color Reproduction in Full-color 3D Printing. Mater. Des. 2021, 209, 109943.
26. Yuan, J.; Chen, C.; Tian, J.; Yao, D.Y.; Li, H.; Chen, G.X. Color Reproduction Evaluation of Paper-Based Full-Color 3D Printing based on Image Quality Metrics. Digit. Print. 2020, 5, 26–34.
Figure 1. Compilation framework for color difference algorithms: (a) CIEDE1976 and (b) CIEDE2000.
Figure 2. Compilation framework for the grayscale FSIM algorithm.
Figure 3. Compilation framework for the grayscale SSIM algorithm.
Figure 4. Compilation framework for the iCID algorithm.
Figure 5. Compilation framework for the proposed SCE algorithm.
Figure 6. Compilation framework for color gamut comparison algorithms.
Figure 7. Compilation framework for the objective correlation algorithm.
Figure 8. Compilation framework for the subjective scaling algorithm.
Figure 9. The basic architecture of the proposed system.
Figure 10. Initial interface and sample demonstration of Module 1.
Figure 11. Initial interface of Module 2.
Figure 12. Demonstration of numerical and image import: (a) CIEL*a*b* values of five areas; (b) high resolution acquired images of Sample 3; (c) microscopic imaging of magenta block.
Figure 13. Interface design and demonstration of its calling codes in Module 3: (a) initial interface; (b) color SSIM metric; (c) iCID metric; (d) color FSIM metric.
Figure 14. Interface design and demonstration of its calling codes in Module 4.
Figure 15. Interface design and demonstration of its calling codes in Module 5.
Figure 16. Interface design and demonstration of its calling codes in Module 6.
Figure 17. Interface design and demonstration of the calling codes in Module 7: (a) initialization; (b) terminal; (c) code show.
Figure 18. Interface design and its case demonstration in Module 8.
Figure 19. Overall interface design and initialization of the current system.
Figure 20. A test case of the current system based on Sample1 and real data in [26].
Table 1. Types, labels, and roles of controls used in the Module 1 interface.
Type of Controls | Tag of Controls | Role of Controls
pushbutton | show | Importing model views from folders, displaying the images and their names in the corresponding positions
axes | Figure 1 | Displaying the imported model view in the coordinate system
edit | edit 1 | Displaying the name of the imported model view in the text box
Table 2. Types, labels, and roles of the controls used in the Module 2 interface.
Type of Controls | Tag of Controls | Role of Controls
pushbutton | lab_value | Display of the CIEL*a*b* values for the specified benchmark test model
pushbutton | whiteness | Display of the whiteness of the specified benchmark test model
pushbutton | glossiness | Displaying the glossiness of the specified benchmark test model
pushbutton | roughness | Display of the roughness of the specified benchmark test model
pushbutton | Subjective_scales | Display of the subjective perception values of the specified benchmark test model
pushbutton | hd_image | Importing a high-definition acquisition image of the specified benchmark test model
pushbutton | mi_image | Importing a microscopic image of the specified benchmark test model
pushbutton | image_display | Making Figure 11 display the image selected in image_type
popupmenu | image_type | Displaying in the drop-down menu the image type just imported
popupmenu | each_image | Displaying the names of all images under the image type selected by image_type in the drop-down menu
axes | Figure 11 | Displaying the selected image in the coordinate system
Table 3. Types, labels, and roles of controls used in the Module 3 interface.
Type of Controls | Tag of Controls | Role of Controls
radiobutton | radiobutton1 | Selecting the target image set and updating the drop-down menu Reference_image with Control_image
radiobutton | radiobutton8 | Selecting other image sets and updating the drop-down menu Reference_image with Control_image
popupmenu | Reference_image | Display all image names of the specified image set in the drop-down menu and specify the reference image
popupmenu | Control_image | Displaying all image names of the specified image set in the drop-down menu and specifying the control image
popupmenu | Metric_selection | Selection of the target image base metric algorithm in the drop-down menu
pushbutton | Calculation | Performing operations according to the options in the preceding drop-down menu
pushbutton | pushbutton12 | Calculation of the image basis metric solution for the reference image and other control images in the drop-down menu and saving it in the specified excel file
edit | edit3 | Display of the values of the most recent operation of the button Calculation
axes | Figure 13 | Displaying the corresponding mapping of the image basis solution in the coordinate system
Table 4. Types, labels and roles of controls used in the Module 4 interface.
Type of Controls | Tag of Controls | Role of Controls
radiobutton | CIEDE_1976 | Calling the CIEDE1976 color difference algorithm to solve for the average color difference value of the coloring steps and the overall average color difference value
radiobutton | CIEDE_2000 | Calling the CIEDE2000 color difference algorithm and solving for the coloring step average color difference value and the overall average color difference value
pushbutton | Delta_E_analysis | Displaying the overall average color difference value in Figure 14 and the coloring step average color difference value in a new pop-up window
axes | Figure 14 | Displaying the corresponding overall average color difference value in the coordinate system
Table 5. Types, labels and roles of controls used in the Module 5 interface.
Type of Controls | Tag of Controls | Role of Controls
popupmenu | Vx | Designating as a reference sample
popupmenu | Vy | Designating as a control sample
radiobutton | GCI | Displaying the color gamut interaction mapping map and contrast values based on the GCI algorithm in Figure 15
radiobutton | GSI | Displaying the color gamut interaction mapping map and contrast values based on the GSI algorithm in Figure 15
radiobutton | GDI | Displaying the color gamut interaction mapping map and contrast values based on the GDI algorithm in Figure 15
axes | Figure 15 | Display the corresponding color gamut interaction map and values in the coordinate system
Table 6. Types, labels and roles of controls used in the Module 6 interface.
Type of Controls | Tag of Controls | Role of Controls
checkbox | checkbox1 | Whether the SCE formula includes a weighting factor for roughness
checkbox | checkbox2 | Whether the SCE formula contains a weighting factor for glossiness
checkbox | checkbox3 | Whether the SCE formula contains a weighting factor for whiteness
edit | edit4 | Radio checkbox1 to update the weight coefficients of roughness
edit | edit5 | Radio checkbox2 to update the weight coefficients for glossiness
edit | edit6 | Radio checkbox3 to update the weight coefficients for whiteness
pushbutton | pushbutton14 | Display of the overall average SCE difference in Figure 16 and a new window showing the base white block SCE values
axes | Figure 16 | Display the overall average SCE difference within the coordinate system
Table 7. Types, labels, and roles of controls used in the Module 7 interface.
Type of Controls | Tag of Controls | Role of Controls
popupmenu | horizontal_axes | The first drop-down menu provides a choice of color difference calculation methods
popupmenu | vertical_axes | The second drop-down menu provides the image basis metric method and displays the linear fit results and their storage in the specified excel
axes | Figure 17 | Display multiple sets of linear fit results within the coordinate system
Table 8. Types, labels, and roles of controls used in the Module 8 interface.
Type of Controls | Tag of Controls | Role of Controls
pushbutton | pushbutton16 | Linear regression with image base metric data according to the color difference selected by the drop-down menu horizontal_axes
pushbutton | pushbutton17 | Importing subjective perceptual metric data
pushbutton | pushbutton18 | Causing a scatter plot of the subjective and objective data to be displayed in Figure 18, displaying the regression equation under the plot
pushbutton | pushbutton19 | Saving the current interface of the system
axes | Figure 18 | Display the scatter plot of subjective and objective data in the coordinate system
Table 9. Linear regression equation using the CIEDE1976 color difference mode.
Element | Linear Regression Equation | R²
MSSIM | Delta E76 = −22.761 MSSIM + 23.128 | 0.678
FSIMc | Delta E76 = −28.595 FSIMc + 28.986 | 0.661
iCID | Delta E76 = 21.948 iCID + 0.363 | 0.641
MSSIM + FSIMc | Delta E76 = −80.983 MSSIM + 74.230 FSIMc + 7.116 | 0.697
MSSIM + iCID | Delta E76 = −198.963 MSSIM − 175.109 iCID + 199.734 | 0.817
FSIMc + iCID | Delta E76 = −532.992 FSIMc − 393.479 iCID + 534.338 | 0.839
MSSIM + FSIMc + iCID | Delta E76 = −190.382 MSSIM − 512.662 FSIMc − 566.191 iCID + 704.742 | 1.000
Table 10. Linear regression equation using the CIEDE2000 color difference mode.
Element | Linear Regression Equation | R²
MSSIM | Delta E00 = −16.768 MSSIM + 17.059 | 0.663
FSIMc | Delta E00 = −20.999 FSIMc + 21.312 | 0.643
iCID | Delta E00 = 16.122 iCID + 0.293 | 0.623
MSSIM + FSIMc | Delta E00 = −72.394 MSSIM + 70.920 FSIMc + 1.762 | 0.693
MSSIM + iCID | Delta E00 = −160.525 MSSIM − 142.865 iCID + 161.147 | 0.829
FSIMc + iCID | Delta E00 = −385.248 FSIMc − 284.150 iCID + 386.250 | 0.809
MSSIM + FSIMc + iCID | Delta E00 = −154.353 MSSIM − 368.765 FSIMc − 424.176 iCID + 524.406 | 1.000
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
