Article

Impact of Imperfect Artefacts and the Modus Operandi on Uncertainty Quantification Using Virtual Instruments

Gertjan Kok 1,*, Gerd Wübbeler 2 and Clemens Elster 2
1 Van Swinden Laboratorium, Thijsseweg 11, 2629 JA Delft, The Netherlands
2 Physikalisch-Technische Bundesanstalt, Abbestraße 2-12, 10587 Berlin, Germany
* Author to whom correspondence should be addressed.
Metrology 2022, 2(2), 311-319; https://doi.org/10.3390/metrology2020019
Submission received: 18 March 2022 / Revised: 6 May 2022 / Accepted: 8 June 2022 / Published: 12 June 2022
(This article belongs to the Special Issue Virtual Measuring Systems and Digital Twins)

Abstract:
The use of virtual instruments (VIs) to analyze measurements and calculate uncertainties is increasing. Well-known examples are virtual coordinate measuring machines (VCMMs), which are widely used and even commercially offered to assess the measurement uncertainty of CMMs. A more recent application of the VI concept is the modeling of scatterometers. Such VIs can be used to assess the measurement uncertainty either after the measurement has been performed, based on the real measurement data, or before the measurement, to predict the measurement uncertainty from simulated measurement data. The research question addressed in this paper is whether this predicted uncertainty is similar in magnitude to the uncertainty calculated from the measurement data. It turns out that this is not necessarily the case. The main observation of this paper is that the uncertainty predicted by a VI can be highly sensitive to the chosen way of operating the VI. To remedy this situation, a simple procedure is proposed that can be applied prior to performing the real measurement and that is believed to produce a conservative prediction of the measurement uncertainty in most cases. This was verified in a case study involving the measurement of the asphericity of an imperfect sphere using a CMM, with the uncertainty calculated by means of a VCMM.

1. Introduction

A prerequisite for measurements in a metrological context is to evaluate the uncertainty of the measurement result. The mainstream document for performing such uncertainty evaluations is the Guide to the Expression of Uncertainty in Measurement (GUM) [1] and its supplements. The GUM requires specifying a mathematical model. This is readily done for relatively simple measurements but requires more work for more involved measurement tasks. For complex instruments, a correspondingly complex mathematical model can be constructed, from which the uncertainty of the quantity of interest, called the measurand in the GUM, can be calculated. A virtual instrument (VI) can be part of this model or even constitute the model completely. In several publications [2,3,4,5], the concept of simulating the measurement instrument and using a Monte Carlo approach to calculate the uncertainty associated with an estimate of the measurand is proposed, and it continues to be a lively area of research [6,7,8]. In this process, all uncertain model parameters are randomly varied by sampling parameter values from appropriate probability distributions, and, subsequently, the mathematical model is evaluated. This results in a distribution of possible values for the measurand, from which an uncertainty can be calculated.
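The Monte Carlo procedure just described can be sketched in a few lines. This is a minimal, generic illustration (not the VCMM implementation): a toy model with two Gaussian inputs, for which the analytical standard uncertainty is sqrt(0.03² + 0.04²) = 0.05.

```python
import numpy as np

def monte_carlo_uncertainty(model, sample_inputs, n_runs=10_000, seed=0):
    """Propagate input distributions through `model` (GUM-S1-style sketch).

    model         : function mapping one input draw to the measurand value
    sample_inputs : function(rng) returning one random draw of the inputs
    Returns the mean of the measurand distribution and its standard
    deviation, taken as the standard uncertainty.
    """
    rng = np.random.default_rng(seed)
    values = np.array([model(sample_inputs(rng)) for _ in range(n_runs)])
    return values.mean(), values.std(ddof=1)

# Toy model Y = X1 + X2 with X1 ~ N(1, 0.03^2) and X2 ~ N(2, 0.04^2).
mean, u = monte_carlo_uncertainty(
    model=lambda x: x[0] + x[1],
    sample_inputs=lambda rng: (rng.normal(1.0, 0.03), rng.normal(2.0, 0.04)),
)
```

For a linear model like this one, the Monte Carlo result agrees with the usual GUM law of propagation of uncertainty; the interesting effects discussed later in this paper arise precisely when the model is nonlinear.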
In the past, uncertainty evaluation using VIs has been applied to various instruments, but the virtual coordinate measuring machine (VCMM) is probably the best-known example. In various scientific publications, attention has been paid to modeling the CMM more faithfully and to integrating additional evaluation routines beyond the well-known evaluation of the parameters of standard geometric elements such as circles, spheres, and planes. The use of VCMMs has been standardized in ISO/TS 15530-4:2008 [9], which claims to provide a GUM-compliant uncertainty evaluation. VCMMs have been implemented in various commercial solutions, some of which are discussed in the Good Practice Guide [10]. This guide also explains that the uncertainty can be calculated after performing the measurement using real measurement data, or before the measurement takes place in order to predict the measurement uncertainty of the real measurement. In the latter case, the artefact and the associated measurement data have to be simulated, and this can be done in different ways. However, there is typically no guidance on how this should be done in detail. A straightforward approach is to use the nominal shape of the artefact. However, we show that this approach does not always provide adequate results.
The fact that a VI can be operated in multiple ways is not unique to VCMMs but is quite general. For example, in current research in the EMPIR project ATMOC [11], a VI is being developed for a scatterometer, and similar questions are of importance in that case. Furthermore, since the mathematical model of the GUM and the model represented by a VI can differ, reaching a GUM-compliant uncertainty assessment for real data using a VI is not straightforward, and the development of appropriate methods is required [12]. The procedure applied in this paper for uncertainty evaluation is based on a Monte Carlo procedure similar to that of Supplement 1 to the GUM [13].
The research question addressed was whether predicted uncertainties based on simulated data are of similar magnitude to uncertainties based on real measurement data. As it turned out that this is not automatically the case, the second question was how one can operate a VI prior to performing the actual measurement so as to produce a safe, conservative prediction of the uncertainty of the real measurement. The presentation in the following sections is accompanied by an example: the measurement of the asphericity of a sphere by means of a CMM.
The structure of the remainder of the paper is as follows: in Section 2, the employed CMM and VCMM and the measured artefact are described; in Section 3, the results of our investigation are presented; in Section 4, the results are discussed; the paper ends with the conclusion in Section 5.

2. Materials and Methods

To assess the effect of the choices mentioned in Section 1, a numerical simulation model of a CMM (i.e., a VCMM) was used; it is presented in the next section. In Section 2.2, the particular dataset used is described, together with the research design.

2.1. Numerical Simulation Model

For the numerical example, a VCMM was used that was developed some years ago in several research projects [14,15]. It models a highly accurate Zeiss F25 CMM that achieves submicrometer measurement uncertainty when measuring individual points. The machine was designed to measure microparts in a measurement volume of 100 × 100 × 100 mm. The F25 CMM itself was jointly developed by Carl Zeiss, TU Eindhoven, and VSL. Figure 1 shows the F25 CMM at the VSL length laboratory, where it was also regularly calibrated, mainly using interferometric methods. The smallest probe tip used had a diameter of only 120 μm.
The VCMM models geometric machine errors, probe errors, and errors arising from the ambient temperature. There are fully random contributions that are independent for every measured (x,y,z)-point, as well as systematic contributions that are ‘frozen’ in the CMM’s imperfect mechanical structure and imperfect probe roundness. These systematic contributions are the same for every measurement, and they generally induce a correlated error structure across all measured (x,y,z)-points. For example, all points with the same y-coordinate may share the same error due to an unknown error in the ruler of the y-axis. The VCMM was implemented in MATLAB [16]. Parameter values were determined by calibrating the various error sources of the CMM. The VCMM can use real or simulated measurement data as input, and all individual errors can be controlled and thus included in or excluded from a calculation. A substantial number of geometric evaluations have been implemented, ranging from least-squares and minimum-zone geometric elements to the determination of aspheric lens parameters and the Zernike decomposition. The VCMM was validated by measuring several reference artefacts and evaluating the consistency of the estimates for the measurand in terms of the calculated uncertainties.
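The distinction between random and systematic contributions can be made concrete with a toy sketch (all distributions and magnitudes here are illustrative assumptions, not the calibrated F25 parameters). A single ruler error for the y-axis is drawn once per simulated measurement and shared by all points, producing the fully correlated error structure described above, while the random noise is drawn independently per coordinate.

```python
import numpy as np

rng = np.random.default_rng(1)

# 25 nominal (x, y, z) points in mm (an arbitrary grid for illustration).
pts = rng.uniform(0.0, 100.0, size=(25, 3))

def simulate_measurement(points, rng, u_ruler=0.05e-3, u_random=0.02e-3):
    """One simulated CMM measurement (toy error model, values in mm).

    Systematic part: a single unknown y-ruler offset, drawn once and
    shared by all points (fully correlated across the point set).
    Random part: independent noise on every coordinate of every point.
    """
    measured = points.copy()
    ruler_error_y = rng.normal(0.0, u_ruler)   # 'frozen' for this measurement
    measured[:, 1] += ruler_error_y            # same error for every y-value
    measured += rng.normal(0.0, u_random, size=points.shape)
    return measured

m = simulate_measurement(pts, rng)
```

Because the systematic term is redrawn between (simulated) measurements but constant within one, repeating this function many times reproduces the kind of correlated error ensemble that the VCMM propagates through its geometric evaluations.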

2.2. Dataset and Research Design

In this paper, the VCMM was applied to the measurement data of 25 points on the upper half of a sphere with a diameter of 4 mm, in the way specified by ISO 10360-2 [17]; see Figure 2.
Realistic machine parameters were used, and a minimum zone sphere fit algorithm was applied. The asphericity was then quantified by the peak-to-valley (PV) value of the residuals of the best-fit minimum zone sphere, i.e., the sphere that had the smallest maximum absolute value of the fit residuals. The number of repetitions used in the VCMM calculation was 10,000.
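A minimum zone sphere fit can be sketched as a minimax optimization over the sphere center: find the center that minimizes the width of the radial residual band; that width is the PV value. The sketch below uses a generic Nelder-Mead search, which is an assumption for illustration only (dedicated Chebyshev fitting algorithms are used in practice), and checks it on points sampled from a perfect sphere, for which the PV value must vanish.

```python
import numpy as np
from scipy.optimize import minimize

def minimum_zone_sphere(points):
    """Minimum zone sphere fit (illustrative sketch, not the VCMM's algorithm).

    Finds the center that minimizes the width of the radial residual band;
    the peak-to-valley (PV) value of that band quantifies the asphericity.
    """
    def band_width(center):
        r = np.linalg.norm(points - center, axis=1)
        return r.max() - r.min()

    c0 = points.mean(axis=0)                       # crude starting center
    res = minimize(band_width, c0, method="Nelder-Mead",
                   options={"xatol": 1e-10, "fatol": 1e-12,
                            "maxiter": 5000, "maxfev": 10000})
    center = res.x
    r = np.linalg.norm(points - center, axis=1)
    radius = 0.5 * (r.max() + r.min())
    pv = r.max() - r.min()
    return center, radius, pv

# Perfect sphere of radius 2 mm sampled on the upper hemisphere -> PV ~ 0.
theta = np.linspace(0.0, 2.0 * np.pi, 12, endpoint=False)
phi = np.array([np.pi / 6.0, np.pi / 3.0])
pts = np.array([[2.0 * np.sin(p) * np.cos(t),
                 2.0 * np.sin(p) * np.sin(t),
                 2.0 * np.cos(p)] for p in phi for t in theta])
center, radius, pv = minimum_zone_sphere(pts)
```

For imperfect data the residual band has a positive width, and it is this PV value whose uncertainty is studied in the remainder of the paper.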
The research design was to apply the VCMM to the following types of data and to compare the calculated uncertainties:
  • Real measurement data of a measured sphere;
  • Simulated data of the nominal measurement points;
  • Simulated data of an imperfect sphere with a similar PV value—version 1;
  • Simulated data of an imperfect sphere with a similar PV value—version 2.
The choice of scenarios was made to compare the difference in results when using real and simulated data. Furthermore, since different ways of operating are possible for the latter case, three different variants were considered: one that was based on a perfect artefact and two versions of an imperfect sphere.

3. Results

3.1. Uncertainty Calculation

In this section, the employed Monte Carlo method for calculating uncertainties is specified. In addition, we explain why different results can be expected when dealing with measured and simulated data. Furthermore, we argue that there is some inherent arbitrariness in the case of simulated data due to the assumptions that need to be made regarding the underlying ground truth.
The measurement process was modeled by adding systematic errors δXsys and random errors δXran to the ground truth Xtrue, leading to the simulated measurement data Xmeas:
Xmeas = Xtrue + δXsys + δXran.    (1)
In the case of a CMM, all these variables can be seen as matrices whose rows contain the (x,y,z)-coordinates of the measured points. The measurand Y (e.g., the asphericity) is defined by applying the evaluation function f to the ground truth Xtrue:
Y = f(Xtrue).    (2)
When simulating data, the assumed ground truth Xtrue and, hence, the measurand are known, while in the case of measured data the measurand generally remains unknown. For real data, an estimate of the measurand is obtained by inserting Xmeas into the evaluation function (2). The uncertainty associated with this estimate was taken as the standard deviation of the distribution obtained by repeatedly adding randomly drawn systematic errors δXsys and random errors δXran to Xmeas, followed by applying the evaluation function f in (2). For simulated data, the standard deviation of the distribution obtained by repeatedly adding randomly drawn systematic errors δXsys and random errors δXran to some assumed ground truth Xsim, followed by applying the evaluation function f, is taken as the standard uncertainty predicted for the real measurement. The choice of Xsim ought to be made in light of what one expects Xtrue to be.
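The procedure described above can be condensed into a small routine (a toy sketch: a 1-D profile with a PV evaluation function and assumed error distributions stands in for the CMM data and the VCMM error model). The same routine serves both cases: X is either measured data Xmeas or an assumed ground truth Xsim.

```python
import numpy as np

def vi_uncertainty(X, f, draw_sys, draw_ran, n_runs=10_000, seed=0):
    """Standard uncertainty by repeatedly perturbing the data X (measured
    or simulated) with freshly drawn systematic and random errors, then
    applying the evaluation function f. Distributions are illustrative."""
    rng = np.random.default_rng(seed)
    vals = np.array([f(X + draw_sys(rng, X.shape) + draw_ran(rng, X.shape))
                     for _ in range(n_runs)])
    return vals.mean(), vals.std(ddof=1)

n = 25
X_sim = np.zeros(n)                 # assumed ground truth: perfect profile

def draw_sys(rng, shape):
    # 'frozen' linear trend with a random slope: correlated across points
    slope = rng.normal(0.0, 1.0)
    return slope * np.linspace(-0.5, 0.5, shape[0])

def draw_ran(rng, shape):
    # independent noise on every point
    return rng.normal(0.0, 0.2, size=shape)

mean_pv, u_pv = vi_uncertainty(X_sim, np.ptp, draw_sys, draw_ran)
```

Note that even though the assumed ground truth is perfectly flat (true PV = 0), the mean of the Monte Carlo PV distribution is strictly positive, which already hints at the coverage issue discussed in Section 3.2.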
Two differences emerge between the two cases. First, measured data are contaminated by systematic and random errors, so by adding randomly drawn samples of (further) simulated errors, the unknown ground truth Xtrue is in effect corrupted twice before the evaluation function f is applied, whereas in the case of simulated data, the assumed ground truth Xsim is corrupted only once. As a consequence, at least the mean of the distribution of measurand values obtained in the course of the Monte Carlo uncertainty procedure can be expected to differ from the result of applying the evaluation function f to the measured data; due to the nonlinearities of f, the spread of the distributions can differ as well.
Second, for simulated data one has to choose a ground truth Xsim, and that choice will generally differ from the unknown actual ground truth Xtrue of the real experiment. Furthermore, different choices are possible, and each of them can affect the results.

3.2. Numerical Example Related to a Sphere with Form Deviation in a VCMM

In Table 1, the effect of selecting different options for the input dataset and the enabled uncertainty sources is shown. The simulated imperfect spheres 1 and 2 were created by starting from a perfect sphere and then changing some coordinates until the fitted PV value was the same as for the measured sphere. This was done in a different way for each of the two spheres.
When no uncertainty sources were enabled, the calculated uncertainty was 0 nm, as expected. Furthermore, it can be seen that when adding uncertainty, the mean PV value increased by a few nanometers for the imperfect spheres, and even by 47 nm for the perfect sphere. Note that in the latter case, a 95% probabilistically symmetric coverage interval would not cover the true value. The fact that the VCMM did not seem to properly quantify this uncertainty is not a flaw or error in the VCMM model. Any single perturbation of the data by δXsys and δXran makes the data imperfect and shifts the PV value to a positive value; it can thus happen that the calculated probabilistically symmetric coverage interval does not cover the true value, PV = 0 nm, as 0 is the smallest of all possible values.
This phenomenon is not unique to this particular model; other models, such as Y = X1² + X2², can suffer from it as well. See [18,19] for an explanation and further analysis of this issue.
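The coverage failure for Y = X1² + X2² is easy to reproduce numerically. With true values X1 = X2 = 0, the true measurand is Y = 0, yet every Monte Carlo draw yields a strictly positive value (Y follows a chi-squared distribution with 2 degrees of freedom here), so the probabilistically symmetric 95% interval excludes the true value.

```python
import numpy as np

rng = np.random.default_rng(0)

# Measurand Y = X1^2 + X2^2 with true values X1 = X2 = 0, so true Y = 0.
u_x = 1.0
draws = rng.normal(0.0, u_x, size=(100_000, 2))
Y = (draws ** 2).sum(axis=1)

# Probabilistically symmetric 95% coverage interval from the MC sample.
lo, hi = np.quantile(Y, [0.025, 0.975])
# lo > 0: the interval [lo, hi] does not cover the true value Y = 0,
# because 0 is the smallest possible value of the measurand.
```

This mirrors the perfect-sphere case in Table 1, where PV = 0 nm is the smallest possible value and every perturbation shifts the evaluation upward.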
Furthermore, it can be seen that, according to the employed model, the systematic contributions to the uncertainty were much more influential than the random components. This means that if the machine could be characterized more accurately (which is not easy to realize in practice), the measurement uncertainty could be reduced. If one were to really pursue this, the separation between random and systematic contributions might need to be assessed in more detail than the coarse separation made for the purposes of this paper.
The example shows that the choice of input data for a virtual instrument significantly affects the calculated uncertainty. If, prior to the measurement of a real artefact, the uncertainty were assessed using a simulated perfect artefact, the uncertainty would be underestimated by 40% in our example: a calculated standard uncertainty of 9 nm instead of 15 nm. Using a simulated artefact with the same asphericity value, the standard uncertainty could be either under- or overestimated. The overestimation here was 23%: a calculated uncertainty of 19 nm instead of 15 nm. Depending on the application and the target uncertainty, it can be a serious problem if the uncertainty of a real measurement is significantly higher than what was expected a priori.
The answer to the first research question, whether the predicted uncertainty will always be of similar size to the uncertainty based on the real measurement data, is thus that this is clearly not the case. When the nominal, perfectly shaped artefact is used for the prediction, the uncertainty can be considerably underestimated.

3.3. Improved Method for Predicting Uncertainties with a VI

In Section 3.2, it was shown that it matters significantly how a VI is operated. In this section, we present a simple method that is expected to produce a safe, conservative estimate of the standard uncertainty obtained with real measurement data. The idea is to apply the VI not to a single simulated artefact but to a significant number of simulated imperfect artefacts, and to take the largest uncertainty found as a conservative prediction of the uncertainty for the real case. The imperfect artefacts can simply be created by starting from the nominal, perfect artefact and adding random noise. The size of the random noise should be representative of the form deviation expected in the real case. In our example, we applied Gaussian noise with a standard deviation of 100 nm to each of the measurement points. A histogram of the calculated PV values and the associated standard uncertainties is shown in Figure 3. The PV values ranged from 165 to 561 nm and thus encompass a large range around the true value of 310 nm. The maximum expected standard uncertainty was 19 nm, which is indeed a conservative upper bound for the value of 15 nm obtained for the real artefact. If more information is available (e.g., that the PV value should lie between 250 and 350 nm), then only simulated spheres with a PV value in this range can be retained in the analysis. We performed this analysis; however, the maximum observed uncertainty was still 19 nm, so this selection of simulated artefacts did not lower the upper bound. Similarly, the lowest observed uncertainty of 7 nm did not increase.
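The proposed procedure can be sketched as a double loop: an outer loop over simulated imperfect artefacts (nominal shape plus form noise) and an inner Monte Carlo loop evaluating the VI uncertainty for each artefact, keeping the maximum. The sketch below uses a toy 1-D profile with a PV evaluation and purely random measurement noise; all shapes and noise levels are illustrative assumptions, not the sphere example of the paper.

```python
import numpy as np

def predicted_uncertainty(f, nominal, perturb, form_noise,
                          n_artefacts=200, n_mc=500, seed=0):
    """Proposed conservative prediction (toy sketch): simulate many
    imperfect artefacts around the nominal shape, evaluate the VI
    uncertainty for each, and report the largest value found."""
    rng = np.random.default_rng(seed)
    worst = 0.0
    for _ in range(n_artefacts):
        # one imperfect artefact: nominal shape plus random form deviation
        artefact = nominal + rng.normal(0.0, form_noise, size=nominal.shape)
        # inner Monte Carlo: perturb with measurement errors, evaluate f
        vals = [f(perturb(artefact, rng)) for _ in range(n_mc)]
        worst = max(worst, float(np.std(vals, ddof=1)))
    return worst

# Toy VI: 1-D profile of 25 points, PV evaluation, random noise only.
nominal = np.zeros(25)
perturb = lambda a, r: a + r.normal(0.0, 0.1, size=a.shape)
u_max = predicted_uncertainty(np.ptp, nominal, perturb, form_noise=0.3)
```

In this toy setting, the nominal (perfect) artefact typically yields a smaller uncertainty than the imperfect ones, mirroring the underestimation observed for the perfect sphere in Table 1.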
In Table 2, a summary is given of the results of various ways of operating a VCMM. The target was to predict an upper bound for the uncertainty of the PV value of the measured data in the second row. It can be seen in the third row that using the nominal artefact shape led to an underestimation; simulating only one imperfect artefact with a similar shape was not safe either, as it may also lead to a considerable underestimation of the uncertainty. In the proposed approach, 1000 imperfect artefacts were simulated, and the largest uncertainty was retained. This method indeed provided an upper bound on the uncertainty for the real measurement. This will be the case as long as the true artefact does not possess an extreme, worst-case shape.

4. Discussion

In this section, we discuss the results presented in Section 3 from a theoretical perspective.
As can be seen from Table 1, the mean value of the distribution of PV values obtained by the Monte Carlo method used for the uncertainty analysis was 3 nm higher than the result obtained when evaluating the measurement data. In this case, this was insignificant in view of the standard uncertainty of 15 nm of the asphericity, but the difference could be more significant in other cases. Intuitively, this increase can be understood in the following way. The measurement data contained errors, and in the Monte Carlo uncertainty analysis, additional artificial errors were added prior to applying the evaluation function f. As the asphericity increases with the level of added errors, because such errors do not generally average out, the majority of the individual Monte Carlo runs had a higher PV value than the PV value obtained for the measurement data. As an estimate of the asphericity, it therefore seems advisable to evaluate the asphericity based on the measurement data alone, as these contain the measurement errors only once.
Nonlinear models can be highly sensitive to the values of the input data. This may apply not only to the calculated value of the measurand Y but also to the calculated uncertainty. The values in Table 1 show that for the two slightly different inputs X1 and X2 (i.e., the coordinate data of the two imperfect spheres), the resulting value of Y was very similar, i.e., the PV value was almost the same, but the standard uncertainty was quite different. We attribute this difference to the nonlinear nature of the VCMM.
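This effect, the same measurand value but a different uncertainty, can be reproduced in a toy setting (the profiles below are assumed shapes for illustration, not the spheres of the paper). Two 1-D profiles with exactly the same PV value of 1.0 are perturbed with the same iid measurement noise, and the resulting standard deviations of the PV value differ, because the PV of an isolated peak responds to the noise differently than the PV of a broad plateau.

```python
import numpy as np

rng = np.random.default_rng(0)

def pv_uncertainty(profile, sigma=0.1, n_runs=20_000):
    """Std. deviation of the PV value under iid measurement noise."""
    noise = rng.normal(0.0, sigma, size=(n_runs, profile.size))
    return np.ptp(profile + noise, axis=1).std(ddof=1)

# Two toy profiles with exactly the same PV value of 1.0 ...
profile_a = np.zeros(25)
profile_a[-1] = 1.0                                      # one isolated peak
profile_b = np.concatenate([np.zeros(13), np.ones(12)])  # broad plateau

u_a = pv_uncertainty(profile_a)
u_b = pv_uncertainty(profile_b)
# ... but noticeably different uncertainties of that PV value.
```

For the plateau, several points compete for the extreme values, which damps the variability of the PV; for the isolated peak, the noise on a single point feeds through directly. The same mechanism plausibly underlies the different u(PV) values for the two imperfect spheres in Table 1.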

5. Conclusions

Virtual instruments are helpful tools for modeling complex physical measurements and can be used, for example, to assess the impact of different sources of uncertainty. In this work, we showed that significantly different uncertainties can emerge when a virtual CMM is operated in different ways. More specifically, we showed that when predicting the measurement uncertainty for a specific measurement using a simulated artefact that resembles the true artefact, the calculated uncertainty can still be considerably different. This can cause serious technical problems and/or considerable additional costs in a practical situation if the value of the measurement uncertainty is of critical importance. This sensitivity of the output of a VCMM to the exact input values is commonly not highlighted in guidance documents, which can be considered a deficit. One should therefore be careful in how a VI is used to support an uncertainty quantification. Ideally, one would operate a VCMM using the actual ground truth Xtrue as Xsim. However, Xtrue is not known, and an alternative approach therefore has to be followed. For this purpose, we proposed performing a large number of repeated runs of the VCMM using a variety of imperfect artefacts that can reasonably be expected, and taking the largest uncertainty found as a conservative prediction of the uncertainty that might be obtained in the real measurement.

Author Contributions

Conceptualization, G.K.; methodology, G.K.; software, G.K.; formal analysis, G.K., G.W. and C.E.; investigation, G.K.; writing—original draft preparation, G.K.; writing—review and editing, G.K., G.W. and C.E. All authors have read and agreed to the published version of the manuscript.

Funding

This project (20IND04 ATMOC) received funding from the EMPIR program co-financed by the Participating States and from the European Union’s Horizon 2020 Research and Innovation Programme.

Institutional Review Board Statement

Not Applicable.

Data Availability Statement

The data that support the findings of this study are available upon reasonable request from the authors.

Conflicts of Interest

The authors declare no conflict of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript, or in the decision to publish the results.

References

  1. BIPM; IEC; IFCC; ILAC; ISO; IUPAC; IUPAP; OIML. Evaluation of Measurement Data—Guide to the Expression of Uncertainty in Measurement, JCGM 100:2008, GUM 1995 with Minor Corrections. 2008; Available online: https://www.bipm.org/documents/20126/2071204/JCGM_100_2008_E.pdf/cb0ef43f-baa5-11cf-3f85-4dcd86f77bd6 (accessed on 17 March 2022).
  2. Trapet, E.; Waldele, F. The Virtual CMM Concept. In Advanced Mathematical Tools in Metrology; World Scientific: Singapore, 1996; Volume 2, pp. 238–247. [Google Scholar]
  3. Balsamo, A.; Di Ciommo, M.; Mugno, R.; Rebaglia, B.; Ricci, E.; Grella, R. Evaluation of CMM uncertainty through Monte Carlo simulations. CIRP Ann. 1999, 48, 425–428. [Google Scholar] [CrossRef]
  4. Haitjema, H.; van Dorp, B.W.; Morel, M.A.A.; Schellekens, P.H.J. Uncertainty estimation by the concept of virtual instruments. In Proceedings of the SPIE 4401, Recent Developments in Traceable Dimensional Measurements, Munich, Germany, 22 October 2001. [Google Scholar] [CrossRef]
  5. Aggogeri, F.; Barbato, G.; Barini, E.M.; Genta, G.; Levi, R. Measurement uncertainty assessment of coordinate measuring machines by simulation and planned experimentation. CIRP J. Manuf. Sci. Technol. 2011, 4, 51–56. [Google Scholar] [CrossRef]
  6. Gąska, A.; Harmatys, W.; Gąska, P.; Gruza, M.; Gromczak, K.; Ostrowska, K. Virtual CMM-based model for uncertainty estimation of coordinate measurements performed in industrial conditions. Measurement 2017, 98, 361–371. [Google Scholar] [CrossRef]
  7. Heißelmann, D.; Franke, M.; Rost, K.; Wendt, K.; Kistner, T.; Schwehn, C. Determination of measurement uncertainty by Monte Carlo simulation. In Advanced Mathematical and Computational Tools in Metrology and Testing XI; World Scientific: Singapore, 2019; pp. 192–202. [Google Scholar]
  8. Vlaeyen, M.; Haitjema, H.; Dewulf, W. Digital Twin of an Optical Measurement System. Sensors 2021, 21, 6638. [Google Scholar] [CrossRef] [PubMed]
  9. ISO/TS 15530-4:2008; Geometrical Product Specifications (GPS). Coordinate Measuring Machines (CMM): Part 4: Evaluating Task-Specific Measurement Uncertainty Using Simulation. ISO: Geneva, Switzerland, 2008.
  10. Flack, D. NPL Good Practice Guide No. 130, Co-Ordinate Measuring Machine Task-Specific Measurement Uncertainties; National Physical Laboratory: Teddington, UK, 2013. [Google Scholar]
  11. EMPIR Project 20IND04 ATMOC. Traceable Metrology of Soft X-ray to IR Optical Constants and Nanofilms for Advanced Manufacturing, 2021–2024. Available online: https://www.euramet.org/research-innovation/search-research-projects/details/project/traceable-metrology-of-soft-x-ray-to-ir-optical-constants-and-nanofilms-for-advanced-manufacturing (accessed on 17 March 2022).
  12. Wübbeler, G.; Marschall, M.; Kniel, K.; Heißelmann, D.; Härtig, F.; Elster, C. GUM-Compliant Uncertainty Evaluation Using Virtual Experiments. Metrology 2022, 2, 114–127. [Google Scholar] [CrossRef]
  13. BIPM; IEC; IFCC; ILAC; ISO; IUPAC; IUPAP; OIML. Evaluation of Measurement Data—Guide to the Expression of Uncertainty in Measurement—Supplement 1 to the “Guide to the Expression of Uncertainty in Measurement”—Propagation of Distributions Using a Monte Carlo Method, JCGM 101:2008; 2008; Available online: https://www.bipm.org/documents/20126/2071204/JCGM_101_2008_E.pdf/325dcaad-c15a-407c-1105-8b7f322d651c (accessed on 17 March 2022).
  14. EMRP Project IND10. Optical and Tactile Metrology for Absolute form Characterisation, 2011–2014. Available online: https://www.ptb.de/emrp/ind10-home.html (accessed on 17 March 2022).
  15. EMRP Project IND59. Multi-Sensor Metrology for Microparts in Innovative Industrial Products, 2013–2016. Available online: https://www.ptb.de/emrp/microparts-project.html (accessed on 17 March 2022).
  16. Matlab Software. The Mathworks. Available online: https://www.mathworks.com (accessed on 17 March 2022).
  17. ISO 10360-2:2009; Geometrical Product Specifications (GPS)—Acceptance and Reverification Tests for Coordinate Measuring Machines (CMM)—Part 2: CMMs Used for Measuring Linear Dimensions. ISO: Geneva, Switzerland, 2009.
  18. Giaquinto, N.; Fabbiano, L. Examples of S1 coverage intervals with very good and very bad long-run success rate. Metrologia 2016, 53, S65. [Google Scholar] [CrossRef]
  19. Wübbeler, G.; Elster, C. On the transferability of the GUM-S1 type A uncertainty. Metrologia 2020, 57, 015005. [Google Scholar] [CrossRef]
Figure 1. (a) Zeiss F25 CMM at the VSL laboratory; (b) close-up image of the tactile probe tip; (c) interferometric calibration of specific machine errors.
Figure 2. Nominal location of the measured 25 points of the spheres: (a) projection on the (x,y)-plane; (b) projection on the (x,z)-plane. The azimuth angles θ varied from 0° to 90° in steps of 22.5° and are plotted in different colors. The colors on the left and right images correspond to the same sets of points.
Figure 3. (a) Histogram with PV values of the simulated imperfect spheres; (b) histogram with the standard uncertainties of the simulated imperfect spheres. In both histograms, the values calculated for the measurement data are indicated by a vertical red bar.
Table 1. Effect of the input dataset and the enabled uncertainty sources on the asphericity, as quantified by the PV value, and on the calculated standard uncertainty of the asphericity, u(PV).
Input Dataset                  Uncertainty Sources   PV (nm)   u(PV) (nm)
Measured sphere                none                  310       0
Measured sphere                all                   313       15
Measured sphere                random                310       6
Measured sphere                systematic            313       14
Simulated perfect sphere       none                  0         0
Simulated perfect sphere       all                   47        9
Simulated perfect sphere       random                16        2
Simulated perfect sphere       systematic            44        9
Simulated imperfect sphere 1   none                  310       0
Simulated imperfect sphere 1   all                   310       19
Simulated imperfect sphere 1   random                310       6
Simulated imperfect sphere 1   systematic            310       13
Simulated imperfect sphere 2   none                  310       0
Simulated imperfect sphere 2   all                   311       16
Simulated imperfect sphere 2   random                311       10
Simulated imperfect sphere 2   systematic            311       15
Table 2. Calculated uncertainty for various ways of operating the VCMM.
Modus Operandi of VCMM                                                                               u(PV) (nm)
Uncertainty for real measurement data                                                                15
Predicted uncertainty based on nominal data                                                          10
Predicted uncertainty based on a single imperfect artefact (lowest value)                            7
Predicted uncertainty based on simulating a large number of imperfect artefacts (proposed approach)  19

Share and Cite

Kok, G.; Wübbeler, G.; Elster, C. Impact of Imperfect Artefacts and the Modus Operandi on Uncertainty Quantification Using Virtual Instruments. Metrology 2022, 2, 311-319. https://doi.org/10.3390/metrology2020019
