Metrology doi: 10.3390/metrology4020010
Authors: Clement Moreau Julie Lemesle David Páez Margarit François Blateyron Maxence Bigerelle
Stitching methods allow a wider surface to be measured without loss of resolution, making it possible to observe small details with a better topographical representation. However, stitching methods may introduce errors or aberrations into the reconstructed topography. A device combining confocal microscopy (CM), focus variation (FV), and coherence scanning interferometry (CSI) instrument modes was used to follow, over time, the drifts and repositioning errors in stitched topographies. Following a complex measurement plan, an extensive measurement campaign was performed on TA6V specimens ground with two neighboring SiC FEPA grit papers (P#80 and P#120). Using four indicators (quality, drift, stability, and relevance indexes), no measurement drift was found in the system, indicating controlled stitching and repositioning processes for interferometry, confocal microscopy, and focus variation. Measurements show commendable stability, with interferometric microscopy being the most robust, followed by confocal microscopy and then focus variation. Despite variations, robustness remains constant for each grinding grit, minimizing interpretation biases. A bootstrap analysis reveals time-dependent robustness for confocal microscopy, potentially linked to human presence. Despite discrepancies in Sa values, all three metrologies consistently discriminate between grinding grits, highlighting the reliability of the proposed methodology.
Metrology doi: 10.3390/metrology4010009
Authors: Yatao Huang Zihan Su Di Chang Yunke Sun Jiubin Tan
A calibration system was designed to evaluate the accuracy of linear optical encoders at the micron level in a fast and economical manner. The system uses a commercial interferometer and motor stage as the calibrator and moving platform. Error analysis is necessary to prove the effectiveness and identify areas for optimization. A fixture was designed for the scale and interferometer target to meet the Abbe principle. A five-degree-of-freedom manual stage was utilized to adjust the reading head in optimal or suboptimal working conditions, such as working distance, offset, and angular misalignment. The results indicate that the calibration system has an accuracy of ±2.2 μm. The geometric errors of the calibration system, including mounting errors and non-ideal motions, are analyzed in detail. The system could be an inexpensive solution for encoder manufacturers and customers to calibrate a linear optical encoder or test its performance.
Metrology doi: 10.3390/metrology4010008
Authors: Federico Cavedo Parisa Esmaili Michele Norgia
Frequency estimation is often the basis of various measurement techniques, among which optical distance measurement stands out. One of the most widely used techniques is the interpolated fast Fourier transform (FFT), owing to its simplicity combined with good performance. In this work, we study the limits of this technique for real signals, with reference to a particular interferometric technique known as self-mixing interferometry. The aim of this research is a better understanding of frequency estimation performance in real applications, together with guidance on how to improve it in specific optical measurement techniques. An optical rangefinder based on self-mixing interferometry has been realized and characterized. The simulation results allow us to explain the limits of the interpolated FFT applied to the realized instrument. Finally, a method for overcoming these limits is proposed, based on decorrelating the errors between measurements, which can provide a guideline for the design of frequency-modulated interferometric distance meters.
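As a concrete illustration of the interpolated FFT discussed above, the following minimal Python sketch estimates a tone frequency by parabolic interpolation around the FFT magnitude peak, one common variant of the technique; the sampling rate and test signal are illustrative, not the instrument's parameters.

    import numpy as np

    def interpolated_fft_frequency(x, fs):
        """Estimate the dominant frequency of x by parabolic interpolation
        around the FFT magnitude peak (a common 'interpolated FFT' variant)."""
        n = len(x)
        spectrum = np.abs(np.fft.rfft(x * np.hanning(n)))
        k = int(np.argmax(spectrum[1:])) + 1        # skip the DC bin
        a, b, c = np.log(spectrum[k - 1:k + 2])     # peak and its neighbours
        delta = 0.5 * (a - c) / (a - 2 * b + c)     # fractional-bin offset
        return (k + delta) * fs / n

    fs = 100_000.0                                  # illustrative sampling rate (Hz)
    t = np.arange(4096) / fs
    x = np.sin(2 * np.pi * 12_345.6 * t) + 0.05 * np.random.randn(t.size)
    print(interpolated_fft_frequency(x, fs))        # ~12345.6 Hz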
Metrology doi: 10.3390/metrology4010007
Authors: André Bülau Daniela Walter André Zimmermann
Voltage standards are widely used to transfer the volt from Josephson voltage standards (JVSs) at national metrology institutes (NMIs) into calibration labs, where it is maintained and transferred onward to test equipment on production lines. For this purpose, commercial voltage standards based on Zener diodes are used. At the Metrology Meeting 2021, Eric Modica of Analog Devices Inc. (San Jose, CA, USA) introduced the ADR1000KHZ, a successor to the legendary LTZ1000; the first production samples were already available prior to this event. In this article, this new temperature-stabilized Zener diode is compared to several others on the basis of datasheet specifications. Motivated by the superior parameters, a 10 V transfer standard prototype for laboratory use was built with commercial off-the-shelf components such as resistor networks and chopper amplifiers, and the effort required to reach the given parameters was investigated. This paper describes how the reference was set up to operate at its zero-temperature-coefficient (z.t.c.) temperature and to lower the requirements for oven stability. Furthermore, it is shown how the overall temperature coefficient (t.c.) of the circuit was reduced. For the buffered Zener voltage, a t.c. of almost zero was achieved, and with amplification to 10 V, a t.c. of <0.01 µV/V/K was achieved over a temperature span of 15 to 31 °C. For the buffered Zener voltage, a noise of ~584 nVp-p was obtained, and for the 10 V output, ~805 nVp-p. Finally, 850 days of drift data were taken by comparing the transfer standard prototype to two Fluke 7000 voltage standards according to the method described in NBS Technical Note 430. The drift specification was, however, not met.
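For reference, the temperature coefficient quoted above follows from a simple difference quotient, t.c. = ΔV / (V_nominal · ΔT); a minimal Python sketch with illustrative voltages rather than measured data:

    # t.c. in V/V/K from two readings at the span limits; numbers are illustrative.
    def temp_coefficient(v1, t1, v2, t2, v_nominal=10.0):
        return (v2 - v1) / (v_nominal * (t2 - t1))

    tc = temp_coefficient(v1=10.0000010, t1=15.0, v2=10.0000022, t2=31.0)
    print(f"{tc * 1e6:.4f} uV/V/K")   # 0.0075 -> below the 0.01 uV/V/K figure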
Metrology doi: 10.3390/metrology4010006
Authors: Jake Sundet Jake Merrell Maxwell Tree Trevor Christensen Stephen Schultz
Nano-composite piezo-responsive foam (NCPF) is an inexpensive foam that can be used to measure a static load while still providing a comfortable interface. The purpose of this study was to create a modularized foam-based pressure measurement system. A measurement system was developed that uses an interdigitated electrode applied to the NCPF. Applied pressure changes the impedance of the NCPF, which, in turn, is converted into a voltage using a voltage divider. A modular measurement system is described that uses an ATtiny 1627 microcontroller to measure the pressure at nine electrodes. The nine electrode modules are controlled by an ESP32 microcontroller that aggregates the data and wirelessly transmits the data to a tablet. The modular system was demonstrated with 1008 individual electrodes. The characterization of the electrode combined with the NCPF is presented, along with optimization of the electrode geometry.
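A minimal sketch of the divider readout described above, assuming a grounded fixed resistor and an assumed power-law pressure calibration; the supply voltage, resistor value and calibration constants are illustrative, not values from the paper:

    V_SUPPLY = 3.3       # excitation voltage (V), illustrative
    R_FIXED = 10_000.0   # fixed divider resistor (ohm), illustrative

    def foam_resistance(v_out):
        """Invert the divider V_out = V_SUPPLY * R_FIXED / (R_FIXED + R_foam)."""
        return R_FIXED * (V_SUPPLY - v_out) / v_out

    def pressure_from_voltage(v_out, k=5.0e6, n=-1.2):
        """Map foam resistance to pressure via an assumed power-law calibration."""
        return k * foam_resistance(v_out) ** n

    print(pressure_from_voltage(1.1))   # pressure in arbitrary calibration units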
Metrology doi: 10.3390/metrology4010005
Authors: Mthabisi Adriano Nyathi Jiping Bai Ian David Wilson
Concrete structures inevitably experience cracking, which is a common form of damage. If cracks are left undetected and allowed to worsen, catastrophic failures, with costly implications for human life and the economy, can occur. Traditional image processing techniques for crack detection and measurement have several limitations, including complex parameter selection and the restriction to measuring cracks in pixels rather than in more practical units of millimetres. This paper presents a three-stage approach that utilises deep learning and image processing for crack classification, segmentation and measurement. In the first two stages, custom CNN and U-Net models were employed for crack classification and segmentation. The final stage involved measuring crack width in millimetres using a novel laser calibration method. The classification and segmentation models achieved 99.22% and 96.54% accuracy, respectively, while the mean absolute error observed for crack width measurement was 0.16 mm. The results demonstrate the adequacy of the developed crack detection and measurement method and show that the combination of deep learning and laser calibration promotes safer, quicker inspections that are less prone to human error. The method’s ability to measure cracks in millimetres provides a more insightful assessment of structural damage and is a significant improvement over traditional pixel-based measurement methods for practical field applications.
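A minimal sketch of the pixel-to-millimetre step implied by the laser calibration above: two projected laser dots with a known physical spacing give the image scale used to convert a segmented crack width from pixels to millimetres; the dot spacing and coordinates are illustrative assumptions, not the paper's set-up.

    # Scale factor from two laser dots of known physical spacing (illustrative).
    LASER_DOT_SPACING_MM = 50.0

    def mm_per_pixel(dot_a, dot_b):
        dist_px = ((dot_a[0] - dot_b[0]) ** 2 + (dot_a[1] - dot_b[1]) ** 2) ** 0.5
        return LASER_DOT_SPACING_MM / dist_px

    def crack_width_mm(width_px, dot_a, dot_b):
        return width_px * mm_per_pixel(dot_a, dot_b)

    print(crack_width_mm(12.0, (410, 300), (910, 302)))   # ~1.2 mm at ~0.1 mm/px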
Metrology doi: 10.3390/metrology4010004
Authors: Jan-Hauke Bartels Ronghua Xu Chongjie Kang Ralf Herrmann Steffen Marx
Acceleration sensors are vital for assessing engineering structures by measuring properties like natural frequencies. In practice, engineering structures often have low natural frequencies and face harsh environmental conditions. Understanding sensor behavior on such structures is crucial for reliable measurements. The research focus is on understanding the behavior of acceleration sensors in harsh environmental conditions within the low-frequency acceleration range. The main question is how to distinguish sensor behavior from structural influences to minimize errors in assessing engineering structure conditions. To investigate this, the sensors are tested using a long-stroke calibration unit under varying temperature and humidity conditions. Additionally, a mini-monitoring system configured with four IEPE sensors is applied to a small-scale support structure within a climate chamber. For the evaluation, a signal-energy approach is employed to distinguish sensor behavior from structural behavior. The findings show that IEPE sensors display temperature-dependent nonlinear transmission behavior within the low-frequency acceleration range, with humidity having negligible impact. To ensure accurate engineering structure assessment, it is crucial to separate sensor behavior from structural influences using signal energy in the time domain. This study underscores the need to compensate for systematic effects, preventing the underestimation of vibration energy at low temperatures and overestimation at higher temperatures when using IEPE sensors for engineering structure monitoring.
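A minimal sketch of the time-domain signal-energy quantity referred to above, used to compare sensor output against structural response; the sampling rate and accelerations are illustrative:

    import numpy as np

    def signal_energy(acc, fs):
        """Discrete time-domain signal energy, E = sum(a[k]^2) / fs."""
        acc = np.asarray(acc, dtype=float)
        return float(np.sum(acc ** 2) / fs)

    fs = 1000.0                                  # illustrative sampling rate (Hz)
    t = np.arange(0, 10, 1 / fs)
    acc = 0.02 * np.sin(2 * np.pi * 0.5 * t)     # low-frequency test acceleration
    print(signal_energy(acc, fs))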
Metrology doi: 10.3390/metrology4010003
Authors: Muhammad Ayaz Syed Muhammad Hur Rizvi Muhammad Akbar
Conservation Voltage Reduction (CVR) is a potential energy management approach for increasing computer system energy efficiency. This study uniquely contributes to the field by thoroughly investigating the impact of CVR on computing devices, filling a significant gap in the existing literature. The research employs a novel experimental approach, considering the temporal variations in energy use behavior, and presents a comprehensive benchmark analysis of desktop PCs and laptops. Notable gains in processing efficiency are observed, with specific instances such as Desktop 1’s 1.53% Single-Core performance improvement and Desktop 3’s 3.19% total performance boost. Despite variations, the thermal performance of CVR-equipped devices, particularly Desktop 3 and Laptop 3, consistently demonstrates lower temperatures, indicating thermal management enhancements of 3.19% and 1.35%, respectively. Additionally, the study introduces the CVR Performance Enhancement Ratio (%), providing a unique metric for evaluating the trade-offs between energy efficiency and system performance. This research highlights the dual impact of CVR on thermal and computational elements, emphasizing its broad advantages. Integrating CVR emerges as a viable strategy for developing more durable, efficient, and sustainable computing devices, setting the stage for advancements in voltage regulation.
Metrology doi: 10.3390/metrology4010002
Authors: Mauro Alves Correa de Camargo Gabriela Knippelberg Bifano Manea Elcio Cruz de Oliveira
Viscosity is a physicochemical property that evaluates the resistance a fuel offers to flow, influencing the engine’s operation and combustion process. Its control is aimed at good fuel atomization and the preservation of lubricating characteristics, as changes in viscosity can lead to wear on various parts of the engine. Viscometers typically measure the viscosity of fuels in the oil and gas industry. These instruments can measure the time it takes for a fluid to move a given distance through a pipe, or the time it takes for an object of a given size and density to pass through the liquid. The traditional test method, ASTM D445, differentiates the procedure for opaque liquids from that for transparent ones: for opaque liquids, it requires warming the sample to between 60 °C and 65 °C for 1 h. This additional step can overload laboratory routines, although it is not guaranteed to have a metrologically significant effect on the final result. Thus, this study evaluated the relevance of complying with this step in the test method for the kinematic viscosity of opaque liquids using a 3² factorial experimental design. Based on the F-test, p-values, confidence intervals, and the percentage contribution of the sums of squares in the regression analysis, it was concluded that the warm-up time was not a relevant factor in the kinematic viscosity, specifically of very low sulphur fuel oil, Brazilian fuel oil, and atmospheric residue diluted with diesel oil, which are fluids at room temperature.
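A minimal sketch of testing one factor of a 3² design with an F-test, here using statsmodels; the viscosity readings are illustrative stand-ins, not the study's data:

    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    # 3^2 design: two factors (warm-up time, temperature) at three levels each.
    df = pd.DataFrame({
        "warmup_h": [0, 0, 0, 0.5, 0.5, 0.5, 1, 1, 1],
        "temp_C":   [60, 62.5, 65] * 3,
        "visc_cSt": [380.1, 377.9, 375.8, 380.3, 378.0, 375.7, 380.2, 377.8, 375.9],
    })
    model = smf.ols("visc_cSt ~ C(warmup_h) + C(temp_C)", data=df).fit()
    print(sm.stats.anova_lm(model, typ=2))   # F statistic and p-value per factor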
Metrology doi: 10.3390/metrology4010001
Authors: Mohamed Ouameur Renata Vasconcellos Mohamed Agazar
For this study, a substitution principle-based impedance bridge has been developed to calibrate AC resistors in a four-terminal-pair (4TP) configuration. The calibration is performed in the full complex plane for resistances ranging from 100 mΩ to 400 Ω and frequencies between 50 Hz and 20 kHz. The automated bridge is based on four resistors associated with two high-impedance stages, and it is balanced automatically by means of PXI modules using a downhill simplex algorithm. The new bridge is primarily used for the measurement chain of AC standard resistors defined in a 4TP configuration at LNE, which are used for routine customer calibrations. The traceability of LNE’s standard resistors defined in a 4TP configuration is ensured by a measurement chain from a 1 kΩ reference resistor using the new bridge; this reference resistor was previously calibrated via comparison with a calculable resistor up to 20 kHz. The bridge was validated via comparison with calibration results obtained in 1983 and 2009. For a resistor of 1 Ω at 1 kHz, the uncertainties of the series resistance variation and the phase shift are less than 6 µΩ/Ω (k = 1) and 6 µrad (k = 1), respectively.
Metrology doi: 10.3390/metrology3040024
Authors: Jacek Dominik Skibicki Roksana Licow Natalia Karosińska-Brzozowska Karol Daliga Piotr Chrostowski Andrzej Wilk Krzysztof Karwowski Marek Szafrański Tadeusz Widerski Leszek Jarzebowicz Slawomir Judek Michał Michna Sławomir Grulkowski Julia Omilianowicz
Environmental noise pollution is nowadays one of the most serious health threats. The impact of noise on the human body depends not only on the sound level but also on its spectral distribution. Reliable measurements of the environmental noise spectrum are often hampered by the very high price of top-quality measuring devices. This paper explores the possibility of using much cheaper audio recorders for frequency analysis. Comparative research was performed in laboratory and field conditions, which showed that, with some limitations, these devices can be useful for the frequency analysis of environmental noise. This offers an opportunity to reduce the cost of experimental noise analysis work.
Metrology doi: 10.3390/metrology3040023
Authors: Stephen Kyle Stuart Robson Ben Hughes
In the context of the journal Metrology, portable 3D measurement is focused on manufacturing applications where there are typically demands for high-accuracy 3D data, with uncertainties in the range of a few tens of micrometres to a few tenths of a millimetre [...]
Metrology doi: 10.3390/metrology3040022
Authors: Yi-Sha Ku Chun-Wei Lo Cheng-Kang Lee Chia-Hung Cho Wen-Qii Cheah Po-Wen Chou
The main challenges in 3D metrology involve measuring through-silicon vias (TSVs) etched with very high aspect ratios, where the via depth-to-diameter ratio approaches 10:1–20:1. In this paper, we introduce an innovative approach to enhance our in-house spectroscopic reflectometer module by integrating aperture technology, resulting in a substantial amplification of interference signals. Our system offers the flexibility to conduct measurements averaged over a number of TSVs, on individual TSVs, or on specific periodic arrays of TSVs. Additionally, we demonstrate the utility of the spectroscopic reflectometer as a non-destructive, high-speed metrology solution for in-line monitoring of TSV etch uniformity. Through a series of experimental trials in a reactive ion etch (RIE) process, we show that leveraging feedback data from the reflectometer leads to marked improvements in etch depth uniformity.
Metrology doi: 10.3390/metrology3040021
Authors: Oscar Lobato-Nostroza Gerardo Marx Chávez-Campos Antony Morales-Cervantes Yvo Marcelo Chiaradia-Masselli Rafael Lara-Hernández Adriana del Carmen Téllez-Anguiano Miguelangel Fraga-Aguilar
Weather disturbances pose a significant challenge when estimating the energy production of photovoltaic panel systems. Energy production and forecasting models have recently been used to improve energy estimations and maintenance tasks. However, these models often rely on environmental measurements from meteorological units far from the photovoltaic systems. To enhance the accuracy of the developed model, an Internet of Things (IoT) measurement prototype was developed in this study, which collects on-site voltage and current measurements from the panel, as well as the environmental factors of lighting, temperature, and humidity in the system’s proximity. The measurements were then subjected to correlation analysis, and various artificial neural networks (ANNs) were implemented to develop energy estimation and forecasting models. The most effective model utilizes lighting, temperature, and humidity, and achieves a root mean squared error (RMSE) of 0.2553. The ANN models are compared to a multiple linear regression (MLR) model using the same data. Using previous power measurements and actual weather data, a non-autoregressive neural network (Non-AR-NN) model forecasts future output power values. The best Non-AR-NN model produces an RMSE of 0.1160, resulting in accurate predictions based on the IoT device.
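For reference, the RMSE figures quoted above follow the standard definition; a minimal sketch with illustrative values:

    import numpy as np

    def rmse(y_true, y_pred):
        """Root mean squared error, as reported for the ANN and Non-AR-NN models."""
        y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
        return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

    print(rmse([10.2, 11.0, 9.7], [10.0, 11.3, 9.9]))   # illustrative power values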
Metrology doi: 10.3390/metrology3040020
Authors: Mihai R. Gherase
Human bones store elements such as calcium, phosphorus, and strontium, and accumulate toxic elements such as lead. In vivo measurements of elemental bone concentration can be performed using X-ray fluorescence (XRF) techniques. Monte Carlo (MC) simulations of X-ray interactions have predominantly been employed in this field to develop calibration methods that link XRF measurements to concentrations. Here, a simple and fast two-dimensional K-shell X-ray fluorescence model was developed to compute the KXRF signal of elements in bone and overlying soft tissue samples. The model is an alternative to MC methods and can guide future bone XRF studies. Contours of bone and soft tissue cross sections were elliptical, and only KXRF signals from the absorption of primary photons were considered. Predictions of the model were compared to Sr KXRF measurements using a bare lamb bone (LB) and the LB with overlying leather. The XRF experiments used a small X-ray beam, a silicon X-ray detector, and three positioning stages. Linear attenuation coefficients of the leather and LB were measured and used in the model. Measured and model-derived values of the leather attenuation of Sr X-rays and of the Sr Kβ/Kα ratio agreed, but the estimated bone Sr concentrations were likely overestimated. Results, approximations, future work directions, and applications are discussed.
Metrology doi: 10.3390/metrology3030019
Authors: Richard J. C. Brown Bernd Güttler Pavel Neyezhmakov Michael Stock Robert I. Wielgosz Stefan Kück Konstantina Vasilatou
This article provides a report of the recent workshop on “The metrology of quantities which can be counted”, organised jointly by the International Committee for Weights and Measures’ Consultative Committees for Amount of Substance (CCQM) and for Units (CCU). The workshop aimed to trigger a discussion on counting and number quantities across the metrological community, so that a common understanding of counting and a common nomenclature could be achieved and the differences between these increasingly important concepts clarified. This article details the background to the workshop and provides a summary of the presentations given and the discussions on the topics raised. It also reports the conclusions, agreed actions and next steps resulting from the workshop.
Metrology doi: 10.3390/metrology3030018
Authors: Paolo Vigo Andrea Frattolillo
Time and frequency are quantities that have seen a proliferation and diffusion of measuring tools unimaginable until a few decades ago, and their applications are multiplying in a digital society now characterized by an absence of temporal and spatial limits. Today’s world requires perfect synchronization of human activities, both to identify with certainty the moment of commercial transactions and to accurately describe biological phenomena that affect the social life of individuals, with repercussions on issues such as safety and the organization of production and manufacturing. In this regard, the recent award of the Nobel Prize in Medicine for the discovery of the gene capable of controlling our internal biological clock is significant. This paper describes the social implications of time measurement, analyzing some very original applications, ranging from the typical cadences of production activities to sports, and highlighting an apparent anomaly: unlike all other physical quantities, time is measured on duodecimal and/or sexagesimal scales. Real time and perceived time can both converge and diverge, and this is almost never objectifiable, as it varies from individual to individual according to personal experiences and sensitivities. This paper is a point of reflection, attempting to understand how the chronology of major historical events influenced the organization of time as it is known today, and how today’s measuring instruments came to be so accurate and so interconnected with the social sphere. The evolution of calendars and of instruments for measuring relative time is described in terms of their specificity.
Metrology doi: 10.3390/metrology3030017
Authors: Han Haitjema
Our journal ‘Metrology’ has been up and running for a few years now, with interesting and ground-breaking publications covering the wide field that the concept of ‘metrology’ encompasses [...]
Metrology doi: 10.3390/metrology3030016
Authors: Clemens Sulz Friedrich Bleicher
The rate of automation in European industry is increasing continuously. In production metrology, the trend is shifting from measurement laboratories towards the integration of metrology into the production process. Increasing levels of automation and the current skills shortage are driving demand for autonomous production systems. In this project, a roughness measurement system was developed that is fully integrated into machine tools and enables fully automatic roughness measurement of part surfaces during the machining process. Using a skidless measurement system, it was possible to obtain measured roughness values comparable to those obtained in measuring rooms under optimal conditions. The present paper shows the development process of the prototype and provides an overview of different application scenarios for in situ measurement in machine tools. In situ roughness measurement has high potential for the future of metrology in industrial applications: not only can surfaces be measured directly in the process, but sub-processes can also be triggered based on the measured values, allowing the production process to react flexibly to actual conditions. Potential improvements in metrology and significant optimizations of the entire production chain are highlighted in this paper.
Metrology doi: 10.3390/metrology3030015
Authors: Mehran Saadabadi Mahdieh Samimi Hassan Hosseinlaghab
Engine noise, as a source of sound pollution for humans and the environment, can be reduced by designing a high-performance muffler. This study presents a novel, organized design process for such a muffler, with the KTM390 engine as a case study. The acoustic simulation analysis is performed in COMSOL, and the aerodynamic analysis in ANSYS Fluent. The features of the muffler considered in this design process are the overall length of the muffler, the presence of baffles and related parameters (baffle distance, baffle hole diameter, and baffle hole offset), and the effects of extended tubes. In order to evaluate the acoustic performance of the muffler, an objective function is defined and evaluated over two frequency ranges, 75–300 Hz and 300–1500 Hz. To evaluate the aerodynamic performance, the backpressure is analyzed, targeting a maximum of 3.3 kPa for this muffler. The selection of the appropriate parameters involves comparing the resulting transmission loss curves and quantitatively evaluating the objective functions (for transmission loss) and the backpressure. This organized design process (i.e., tree diagram) increases the efficiency of muffler design (for example, a 41.2% improvement in backpressure).
Metrology doi: 10.3390/metrology3030014
Authors: Ian M. Hobbs Joey A. Charboneau Todd L. Jacobsen
In this paper, the development of an instrument for fission-gas collection and physical analysis is proposed for small-volume determination. Analysis specifications require a design capable of accurately and repeatably determining volumes in the range of 0.07–2.5 mL. This system relies on a series of gas expansions originating from a cylinder with a known internal volume, and the combined gas law is used to derive the unknown volumes from these expansions. Initial system designs included one of two known volumes, 11.85 ± 0.34 mL and 5.807 ± 0.078 mL, with a manifold volume of 32 mL. Results obtained from modeling this system’s operation showed that a volume of 0.07 mL could only be determined with a relative expanded uncertainty greater than 300% (k = 2) for a single replicate, which was unacceptable for the proposed experimental design. Initial modeling showed that the volume connecting the known volume and the rodlet, i.e., the manifold volume, and the sensitivity of the pressure sensor were key contributors to the expanded uncertainty of the measured rodlet volume. The system’s design limited the available options for pressure sensors, so emphasis was placed on the design of the manifold volume, which the final system design reduced to 17 mL. These design changes, combined with replicate analysis, reduced the relative expanded uncertainty to ±12% (k = 2) for the 0.07 mL volume.
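A minimal sketch of the volume determination from a single expansion via the combined gas law, P₁V₁/T₁ = P₂(V₁ + V_manifold + V_unknown)/T₂; all numbers are illustrative, not the instrument's calibration data:

    # Gas initially confined to the known volume expands into manifold + rodlet.
    def unknown_volume(p1, t1, v_known, p2, t2, v_manifold):
        v_total = p1 * v_known * t2 / (p2 * t1)   # total volume after expansion
        return v_total - v_known - v_manifold

    v = unknown_volume(p1=200.0, t1=293.15, v_known=5.807,
                       p2=48.0, t2=293.15, v_manifold=17.0)
    print(v)   # ~1.39 mL for these illustrative pressures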
Metrology doi: 10.3390/metrology3020013
Authors: Adam Thompson Lewis Newton Richard Leach
As metal additive manufacturing has been increasingly accepted as a viable method of industrial manufacture, there has been a significant uptake in manufacturers wishing to verify and test their parts through analysis of part surfaces. However, various studies have shown that metal additive surfaces tend to exhibit highly complex features and thus represent a challenge to those wishing to undertake measurement and characterisation. Over the past decade, good practice in metal additive surface measurement and characterisation has been developed, ultimately resulting in the creation of a new standard guide, ASTM F3624-23, which summarises that good practice. Here, we explain the background and rationale for the creation of this standard and provide an overview of its contents. An example case study is then presented, showing the good practice guidance worked through a metal additive surface measurement and characterisation task, namely a comparative measurement of an example surface using two different instruments. Finally, considerations for future versions of the standard are presented, explaining the need to develop further good practice for novel instruments and to focus on feature-based characterisation approaches.
Metrology doi: 10.3390/metrology3020012
Authors: Viktor Witkovský
The Tsallis q-Gaussian distribution is a powerful generalization of the standard Gaussian distribution and is commonly used in various fields, including non-extensive statistical mechanics, financial markets and image processing. It belongs to the q-distribution family, which is characterized by a non-additive entropy. Due to their versatility and practicality, q-Gaussians are a natural choice for modeling input quantities in measurement models. This paper presents the characteristic function of a linear combination of independent q-Gaussian random variables and proposes a numerical method for its inversion. The proposed technique makes it possible to determine the exact probability distribution of the output quantity in linear measurement models, with the input quantities modeled as independent q-Gaussian random variables. It provides an alternative computational procedure to the Monte Carlo method for uncertainty analysis through the propagation of distributions.
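A minimal sketch of the numerical inversion idea via the Gil-Pelaez formula, F(x) = 1/2 − (1/π) ∫₀^∞ Im[e^(−itx) φ(t)]/t dt; the standard Gaussian characteristic function below is a placeholder (the q → 1 limit) for the q-Gaussian characteristic function derived in the paper:

    import numpy as np

    def cf_gauss(t, sigma=1.0):
        return np.exp(-0.5 * (sigma * t) ** 2)   # placeholder CF (q -> 1 limit)

    def cdf_gil_pelaez(x, cf, t_max=50.0, n=20_000):
        t = np.linspace(1e-9, t_max, n)          # avoid the t = 0 singularity
        integrand = np.imag(np.exp(-1j * t * x) * cf(t)) / t
        return 0.5 - np.trapz(integrand, t) / np.pi

    print(cdf_gil_pelaez(1.0, cf_gauss))   # ~0.8413 for a standard normal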
Metrology doi: 10.3390/metrology3020011
Authors: Lewis Newton Aditi Thanki Carlos Bermudez Roger Artigas Adam Thompson Han Haitjema Richard Leach
Additive manufactured surfaces, especially metal powder bed fusion surfaces, present unique challenges for measurement because of their complex topographies. To address these measurement challenges, optimisation of the measurement process is required. Using a statistical approach, sensitivity analyses were performed on measurement settings found on a commercial programmable array scanning confocal microscope. The instrument measurement process parameters were compared by their effects on three quality indicators: the areal surface texture parameter Sa, measurement noise, and number of non-measured points. An analysis was performed using a full factorial design of experiments for both the top and side surfaces of test surfaces made from Inconel 718 and Ti-6Al-4V using powder bed fusion. The results indicated that measurements of metal additive surfaces are robust to changes in the measurement control parameters for Sa, with variations within 5% of the mean parameter value for the same objective, surface, and measured area. The number of non-measured points and the measurement noise were more varied and were affected by the choice of measurement control parameters, but such changes could be predicted by the statistical models. The contribution offered by this work is an increased understanding of imaging confocal microscopy measurement of metal additive surfaces, along with the establishment of good practice guidance for measurements.
Metrology doi: 10.3390/metrology3020010
Authors: Niklas Grambow Lennart Hinz Christian Bonk Jörg Krüger Eduard Reithmeier
The increasing demand for electric drives challenges conventional powertrain designs and requires new technologies to increase production efficiency. Hairpin stator manufacturing technology enables full automation, and quality control within the process is particularly important for increasing process capacity, avoiding rejects, and meeting safety-related requirements. Due to the complex, free-form geometries of hairpin stators and the required short inspection times, inline reconstruction and accurate quantification of relevant features are of particular importance. In this study, we propose a novel method to estimate the creepage distance, a feature that is crucial to the safety standards of hairpin stators and that, until now, could be determined neither automatically nor accurately. The data acquisition is based on fringe projection profilometry and a robot positioning system for a highly complete surface reconstruction. After alignment, the wire pairs are clustered with a density-based method so that computations can be parallelized for each cluster and an analysis of partial geometries is enabled. In several further steps, stripping edges are segmented automatically using a novel approach of spatially asymmetric windowed local surface normal variation, and the creepage distances are subsequently estimated using a geodesic path algorithm. Finally, the approach is examined and discussed for an entire stator, and a methodology is presented that enables the identification of implausible estimated creepage distances.
Metrology doi: 10.3390/metrology3020009
Authors: Tom Hovell Jon Petzing Wen Guo Connor Gill Laura Justham Niels Lohse Peter Kinnell
Non-destructive measurements of high-aspect-ratio microscale features, especially those with internal geometries such as micro-holes, remain a challenging metrology problem that is growing in difficulty due to the demand for greater complexity and tighter tolerances in such structures. Additionally, there is a growing use of functional surface texturing for improving characteristics such as heat transfer and wettability. As a result, measurement techniques capable of providing dimensional form and surface finish for these features are of intense interest. This review explores the state-of-the-art inspection methodologies compatible with high-aspect-ratio structures and their suitability for extracting three-dimensional surface data, based on identified high-aspect-ratio structure types. The abilities, limitations, challenges, and future requirements for the practical implementation and acceptance of these measurement techniques are presented.
Metrology doi: 10.3390/metrology3020008
Authors: Eugênio Benevides dos Santos Elcio Cruz de Oliveira
Brazilian regulation requires that the test methods for the shrinkage factor and the solubility ratio of crude oils be applied under the measurement conditions used for appropriation. Since these physicochemical parameters depend upon the density, a Brazilian oil company proposed an adapted and more user-friendly methodology for adjusting the digital density meter under high pressure and temperature conditions. This study aimed to evaluate the metrological compatibility of this proposal by comparing it with the fit model presented by a manufacturer of a digital densimeter and with tabulated reference values of fluid density. Since the density data presented non-normal distributions, the Wilcoxon signed-rank test was used; it showed metrological compatibility between the approaches studied in the pressure range from 0 psi to 1200 psi (≈8.27 MPa) and the temperature range from 5 °C to 70 °C.
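A minimal sketch of the non-parametric comparison described above, using scipy; the paired density readings are illustrative numbers, not the study's data:

    from scipy.stats import wilcoxon

    proposed  = [0.9231, 0.9187, 0.9150, 0.9102, 0.9068, 0.9015, 0.8970]
    reference = [0.9229, 0.9190, 0.9148, 0.9105, 0.9065, 0.9018, 0.8972]

    stat, p = wilcoxon(proposed, reference)   # paired test for non-normal data
    print(p)   # p > 0.05 -> no significant difference -> compatible approaches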
Metrology doi: 10.3390/metrology3020007
Authors: Alex Krummenauer Victor Emmanuel de Oliveira Gomes Vitor Camargo Nardelli
The need to track the real-time location of assets is increasingly relevant worldwide. Ultra-wideband (UWB) technology is an IoT solution for real-time locating systems (RTLS). The location of an asset is obtained by the signal exchange between a wireless tag (on the asset) and fixed anchors: the tag interacts with the fixed anchors, defining its position through the distances obtained by trilateration, and these data are sent to the server through a gateway. It is well known that this process has several sources of error, and the assessment of the measurement uncertainty of UWB technology is therefore an important topic for defining its scope of use. This paper presents a task-specific measurement uncertainty evaluation for a UWB positioning system, according to the ISO GUM, and proposes a method to support decision-making regarding the possible uses of UWB technology. The position provided by the UWB system is compared with reference points, using Cartesian coordinates measured with a total station to provide metrological reliability. Using the estimated uncertainty, one can define the minimum tolerance interval associated with UWB technology for a given use. A case study demonstrates the method.
Metrology doi: 10.3390/metrology3010006
Authors: Marco Agustoni Paolo Castello Guglielmo Frigo Giacomo Gallus
Modern power systems are rapidly transitioning towards a fully digital substation paradigm. Based on IEC 61850, a common communication protocol between the different intelligent electronic devices (IEDs) promises a significant enhancement in terms of efficiency and interoperability. In this context, synchronization represents a crucial aspect, as it allows measurements taken at the same time in different locations to be rigorously compared. In this paper, we consider a measurement chain for synchrophasor estimation based on digital inputs: an instrument transformer, a stand-alone merging unit (SAMU) and a phasor measurement unit (PMU), where both the SAMU and the PMU are equipped with independent synchronization sources. If the SAMU loses its synchronization, the final measurement result would normally be considered invalid until the SAMU synchronization status is fully restored. In view of a longer continuity of operation, this paper proposes an alternative approach that evaluates the PMU Time Quality in real time, allowing crucial monitoring and control operations, such as state estimation and fault detection, to continue even in the presence of a temporary loss of synchronization. A characterization in both simulated and experimental conditions proves the potential and reliability of the proposed approach. In the considered test case, the return to a sufficient time quality is correctly detected in less than 200 s, whereas waiting for the full restoration of the SAMU time reference would cost several minutes.
Metrology doi: 10.3390/metrology3010005
Authors: Qiankun Song Jigou Liu Yang Liu
This paper presents a novel low-speed measuring method that uses the analog sine and square waves of Hall effect speed sensors, coupled with correlative digital signal processing algorithms implemented on a signal processing unit. The frequency of the initial signal is estimated by a square wave period measuring method (SWPM). On the basis of the initially measured frequency, a recursive self-correction (RSC) algorithm is used to perform the low-frequency measurement using the discrete sinusoid wave. The low-speed signal frequency can be derived continuously from the phase difference of the discrete sine wave, where the RSC algorithm is used to achieve high measuring accuracy. Compared to using the SWPM algorithm alone, this novel method measures faster and achieves sufficient real-time performance. Simulation analyses and experiments verified the effectiveness of the proposed low-speed measuring method.
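A minimal sketch of a square-wave period measurement in the spirit of the SWPM step above: the frequency is the reciprocal of the mean spacing between rising edges; the sample data are illustrative, not the sensor's output:

    import numpy as np

    def frequency_from_square_wave(sq, fs):
        rising = np.flatnonzero((sq[1:] > 0) & (sq[:-1] <= 0))   # rising edges
        periods = np.diff(rising) / fs                           # seconds
        return 1.0 / periods.mean()

    fs = 10_000.0
    t = np.arange(0, 1.0, 1 / fs)
    sq = np.sign(np.sin(2 * np.pi * 7.3 * t))    # synthetic Hall square wave
    print(frequency_from_square_wave(sq, fs))    # ~7.3 Hz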
Metrology doi: 10.3390/metrology3010004
Authors: Metrology Editorial Office
High-quality academic publishing is built on rigorous peer review [...]
Metrology doi: 10.3390/metrology3010003
Authors: Jean-Laurent Hippolyte Marina Romanchikova Maurizio Bevilacqua Paul Duncan Samuel E. Hunt Federico Grasso Toro Anne-Sophie Piette Julia Neumann
Achieving the highest levels of compliance with the FAIR (findable, accessible, interoperable, reusable) principles for scientific data management and stewardship requires machine-actionable semantic representations of data and metadata. Human and machine interpretation and reuse of measurement datasets rely on metrological information that is often specified inconsistently or cannot be inferred automatically; while several ontologies to capture this metrological information are available, practical implementation examples are few. This work aims to close this gap by discussing how standardised measurement data and metadata could be represented using semantic web technologies. The examples provided in this paper are machine-actionable descriptions of Earth observation and bathymetry measurement datasets, based on two ontologies of quantities and units of measurement selected for their prominence in the semantic web. The selected ontologies demonstrated good coverage of the concepts related to quantities, dimensions, and individual units as well as systems of units, but showed variations and gaps in the coverage, completeness and traceability of other metrology concept representations such as standard uncertainty, expanded uncertainty, combined uncertainty, coverage factor, and probability distribution. These results highlight the need for both (I) user-friendly tools for semantic representations of measurement datasets and (II) the establishment of good practices within each scientific community. Further work will consequently investigate how to support ontology modelling for measurement uncertainty and associated concepts.
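A minimal sketch of the kind of machine-actionable description discussed above, attaching a numeric value, a unit and a standard uncertainty to an observation with rdflib and the QUDT vocabulary (one prominent quantities-and-units ontology); the dataset IRI and numbers are illustrative, and the property choice reflects the QUDT schema rather than the paper's exact mapping:

    from rdflib import Graph, Literal, Namespace, RDF

    QUDT = Namespace("http://qudt.org/schema/qudt/")
    UNIT = Namespace("http://qudt.org/vocab/unit/")
    EX = Namespace("http://example.org/obs/")       # illustrative dataset IRI

    g = Graph()
    g.add((EX.depth1, RDF.type, QUDT.QuantityValue))
    g.add((EX.depth1, QUDT.numericValue, Literal(42.7)))        # bathymetric depth
    g.add((EX.depth1, QUDT.unit, UNIT.M))                       # metres
    g.add((EX.depth1, QUDT.standardUncertainty, Literal(0.3)))  # u = 0.3 m
    print(g.serialize(format="turtle"))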
Metrology doi: 10.3390/metrology3010002
Authors: Richard P. Lindqvist Daniel Strand Mikael Nilsson Victor Collins Johan Torstensson Jonas Kressin Domenico Spensieri Andreas Archenti
New automated laser radar measurement systems at the Saab Inc. facility in West Lafayette, USA, will make airframe assembly of the aft body for the new eT7-A aircraft a quicker, more cost-efficient process. Digital twin concepts realized through simulation and off-line programming show advantageous results when studying future-state scenarios or investigating how a current large-volume dimensional metrology system behaves. The aim of this exploration has been to examine how to facilitate the design and programming of automated laser radar concepts by means of novel simulation-based software. High-speed computing algorithms efficiently solve task and sequence problems involving many statistical and combinatorial possibilities in the calculations. However, this approach requires accurate and reliable models and digital twins that are continuously updated with real-world data and information. The main contributions of this paper are procedures to define the dimensional metrology workflow at Saab and to model and simulate the laser radar process, enhancing and tailoring existing offline programming software with specific new functionalities. A case study conducted at the Saab Aeronautics premises in Linköping acted as a clinical laboratory to generate our research findings. The exploratory work indicates that a reliable simulation-based development method can be used advantageously in the early-stage design layout of automated dimensional metrology systems to verify and guarantee the line-of-sight of, e.g., a laser light path and its allowed inclinations to a specific geometrical feature to be measured, extracted, and evaluated.
Metrology doi: 10.3390/metrology3010001
Authors: Benjamin Winkler Claudia Nagel Nando Farchmin Sebastian Heidenreich Axel Loewe Olaf Dössel Markus Bär
The numerical modeling of cardiac electrophysiology has reached a mature and advanced state that allows for quantitative modeling of many clinically relevant processes. As a result, complex computational tasks such as the creation of a variety of electrocardiograms (ECGs) from virtual cohorts of models representing biological variation are within reach. This requires a correct representation of the variability of a population by suitable distributions of a number of input parameters. Hence, the assessment of the dependence and variation of model outputs by sensitivity analysis and uncertainty quantification becomes crucial. Since the standard metrological approach of using Monte Carlo simulations is computationally prohibitive, we use a nonintrusive polynomial chaos-based approximation of the forward model used for obtaining the atrial contribution to a realistic electrocardiogram. The surrogate increases the speed of computations for varying parameters by orders of magnitude and thereby greatly enhances the versatility of uncertainty quantification. It further allows for the quantification of parameter influences via Sobol indices for the time series of 12-lead ECGs and provides bounds for the accuracy of the obtained sensitivities derived from an estimation of the surrogate approximation error. Thus, it is capable of supporting and improving the creation of synthetic databases of ECGs from a virtual cohort mapping a representative sample of the human population based on physiologically and anatomically realistic three-dimensional models.
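A minimal sketch of the surrogate idea above: fit a low-degree polynomial chaos style expansion (Hermite terms in Gaussian inputs) to a handful of expensive model runs, then propagate uncertainty with cheap surrogate evaluations; the forward model here is an illustrative stand-in, not the electrophysiology simulator:

    import numpy as np

    def forward_model(theta):                  # expensive simulator (stand-in)
        return np.sin(theta[:, 0]) + 0.5 * theta[:, 1] ** 2

    def basis(theta):                          # degree-2 Hermite basis, 2 inputs
        x1, x2 = theta[:, 0], theta[:, 1]
        return np.column_stack([np.ones_like(x1), x1, x2,
                                x1 ** 2 - 1, x1 * x2, x2 ** 2 - 1])

    rng = np.random.default_rng(0)
    train = rng.normal(size=(50, 2))           # few expensive training runs
    coef, *_ = np.linalg.lstsq(basis(train), forward_model(train), rcond=None)

    samples = rng.normal(size=(100_000, 2))    # cheap surrogate evaluations
    y_hat = basis(samples) @ coef
    print(y_hat.mean(), y_hat.std())           # propagated mean and spread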
Metrology doi: 10.3390/metrology2040029
Authors: Simona Salicone
Metrology is the science of measurements [...]
Metrology doi: 10.3390/metrology2040028
Authors: Michiel Vlaeyen Han Haitjema Wim Dewulf
This study proposes an algorithm to autonomously generate the scan path for a laser line scanner mounted on a coordinate measuring machine. The scan path is determined based on task-specific measurement uncertainty in order to prove conformance to specified tolerances. The novelty of the algorithm is the integration of measurement uncertainty, a development made possible by recent advances in digital twins of optical measurement systems. Furthermore, the algorithm takes all the constraints of this optical measurement system into account. The proposed algorithm is validated on different objects with different surface characteristics; the validation is performed experimentally with a physical measurement system and virtually with an in-house developed digital twin. The validation proves that theoretically coverable areas are measured properly and that the method, applied to the equipment used, leads to adequate measurement paths that give measurement results with sufficiently small measurement uncertainty to prove conformance to specifications.
Metrology doi: 10.3390/metrology2040027
Authors: Fabio Cacais José Ubiratan Delgado Victor Loayza Johnny Rangel
The preparation of high-accuracy source standards in radionuclide metrology is based on a properly described and reliable weighing procedure able to achieve relative standard uncertainties below 0.1%. However, the results of the uncertainty budget comparison CCRI(II)-S7 called into question the ability of the established pycnometer and substitution weighing methods to attain this goal. As a result, a question arises about the validation of mass measurements performed with the elimination weighing method when appropriate uncertainties are required. In order to address this problem, a comprehensive in situ validation methodology is proposed for the results of the pycnometer, substitution, elimination and modified elimination (MEM) methods. Mass comparisons are applied to evaluate the compatibility between the results of the weighing methods. This is made possible by a purpose-developed weighing sequence, which allows all methods to be performed with only one drop deposition in the mass range from 10 mg to 200 mg. As a result, a high degree of compatibility between the MEM and the elimination method was achieved for uncertainties below 0.1%, as well as with the pycnometer and substitution methods at higher uncertainties. Numerical simulations indicate that the validation results remain valid for improved technical implementations of these last two methods.
Metrology doi: 10.3390/metrology2040026
Authors: Gorka Kortaberria Unai Mutilba Jon Eguskiza Joel Martins
Major aircraft manufacturers expect the commercial aircraft market to exceed pre-COVID levels by 2025, which demands an increase in the production rate. However, aeronautical product assembly processes are still mainly performed manually, with a low level of automation. Moreover, the current industry digitalization trend offers the possibility to develop faster, smarter and more flexible manufacturing processes, aiming at a higher production rate and product customization. Here, the integration of metrology within the manufacturing processes offers the possibility to supply reliable data to constantly adjust the assembly process parameters, aiming at zero-defect, more digital and more highly automated manufacturing processes. In this context, this article introduces virtual metrology as an assistant to the assembly process of the Advanced Rear-End fuselage component. It describes how the assembly process CAD model is used by simulation tools to design, set up and perform the virtual commissioning of the new metrology-driven assembly methods, moving from a dedicated tooling approach to a more flexible and reconfigurable metrology-aided design. Preliminary results show that portable metrology solutions are fit for purpose even for poorly accessible geometries and fulfil the current accuracy demands. Moreover, the simulation environment ensures user-friendly assembly process interaction, providing further set-up time reduction.
Metrology doi: 10.3390/metrology2040025
Authors: Nikhil Padhye
Reverse transcription polymerase chain reaction (RT-PCR) targeting select genes of the SARS-CoV-2 RNA has been the main diagnostic tool in the global response to the COVID-19 pandemic. It took several months after the development of these molecular tests to assess their diagnostic performance in the population. The objective of this study is to demonstrate that it was possible to measure the diagnostic accuracy of the RT-PCR test at an early stage of the pandemic despite the absence of a gold standard. The study design is a secondary analysis of published data on 1014 patients in Wuhan, China, of whom 59.3% tested positive for COVID-19 in RT-PCR tests and 87.6% tested positive in chest computerized tomography (CT) exams. Previously ignored expert opinions in the form of verbal probability classifications of patients with conflicting test results have been utilized here to derive the informative prior distribution of the infected proportion. A Bayesian implementation of the Dawid-Skene model, typically used in the context of crowd-sourced data, was used to reconstruct the sensitivity and specificity of the diagnostic tests without the need for specifying a gold standard. The sensitivity of the RT-PCR diagnostic test developed by China CDC was estimated to be 0.707 (95% Cr I: 0.664, 0.753), while the specificity was 0.861 (95% Cr I: 0.781, 0.956). In contrast, chest CT was found to have high sensitivity (95% Cr I: 0.969, 1.000) but low specificity (95% Cr I: 0.477, 0.742). This estimate is similar to estimates that were found later in studies designed specifically for measuring the diagnostic performance of the RT-PCR test. The developed methods could be applied to assess diagnostic accuracy of new variants of SARS-CoV-2 in the future.
Metrology doi: 10.3390/metrology2040024
Authors: Gorka Kortaberria Unai Mutilba Sergio Gomez Brahim Ahmed
Data-driven manufacturing in Industry 4.0 demands digital metrology not only to drive in-process quality assurance of manufactured products but also to supply reliable data for constantly adjusting manufacturing process parameters towards zero-defect manufacturing. Better quality, improved productivity, and increased flexibility of manufacturing processes are obtained by combining intelligent production systems and advanced information technologies, in which in-process metrology plays a significant role. While traditional coordinate measuring machines offer strengths in performance, accuracy, and precision, they are not the most appropriate in-process measurement solutions when fast, non-contact and fully automated metrology is needed. Non-contact optical 3D metrology tackles these limitations and offers additional key advantages for deploying fully integrated 3D metrology capability to collect reliable data for use in intelligent decision-making. However, the full adoption of 3D optical metrology in the manufacturing process depends on the establishment of metrological traceability. Thus, this article presents a practical approach to realising task-specific uncertainty assessment for dense point cloud measurements. Finally, it introduces an experimental exercise in which data-driven 3D point cloud automatic data acquisition and evaluation are performed through a model-based definition measurement strategy.
Metrology doi: 10.3390/metrology2030023
Authors: Stathis C. Stiros
Understanding the length and subdivisions of ancient length units is necessary for archaeology, architecture, and engineering, among other fields. These metrological units derive from anthropocentric concepts (fathom, cubit, foot, finger, etc.), and hence their metrological characteristics are variable and unknown for various ancient civilizations. The Roman length units are well determined, but the ancient Greek units are not. A rule sculpted in a metrological relief recently permitted the recognition of the Doric foot as having a length of 327 mm, but the broader use and divisions of this length unit remain unknown. In this article we present evidence of the use of the Doric foot from the modeling of an ancient, atypical small theatre of the 4th–3rd century B.C. at Makyneia, in western mainland Greece. It was found that this structure was designed using the Doric foot and its division into 24 (or even 12) digits. This result from a small provincial town indicates that the Doric foot was in broad use in the architectural and engineering works of the ancient Greek world, and it may be used to solve various problems of that era.
Metrology doi: 10.3390/metrology2030022
Authors: Fabio Fuiano Andrea Scorza Salvatore Andrea Sciuto
Arterial simulators are a useful tool for simulating the cardiovascular system in many different fields of application and for carrying out in vitro tests that would be dangerous to perform in vivo. A rich series of in vitro experimental set-up examples can be found in the literature. Nevertheless, in the current scientific panorama on this topic, systematic research from a metrological and functional perspective is still lacking. In this regard, the present review aims to contribute by analyzing and classifying, from a metrological and functional point of view, the main concerns for the cardiovascular simulators proposed in the literature, according to their field of application, as well as for the transducers in the arterial experimental set-ups that measure the main hemodynamic quantities, in order to study their trends under specific testing conditions and to estimate parameters or indicators of interest to the scientific community.
Metrology doi: 10.3390/metrology2030021
Authors: Martin Straka Andreas Weissenbrunner Christian Koglin Christian Höhne Sonja Schmelter
Ultrasonic clamp-on meters have become an established technology for non-invasive flow measurements. Under disturbed flow conditions, their measurement values must be adjusted with corresponding fluid mechanical calibration factors. Due to the variety of flow disturbances and installation positions, the experimental determination of these factors often needs to be complemented by computational fluid dynamics (CFD) simulations. From a metrological perspective, substituting experiments with simulation results raises the question of how confidence in a so-called virtual measurement can be ensured. While there are well-established methods to estimate errors in CFD predictions in general, strategies to meet metrological requirements for CFD-based virtual meters have yet to be developed. In this paper, a framework for assessing the overall uncertainty of a virtual flow meter is proposed. In analogy to the evaluation of measurement uncertainty, the approach is based on the utilization of an expanded simulation uncertainty representing the entirety of the computational domain. The study was conducted using the example of an ultrasonic clamp-on meter downstream of a double bend out-of-plane. Nevertheless, the proposed method applies to other flow disturbances and different types of virtual meters. The comparison between laboratory experiments and simulation results with different turbulence modeling approaches demonstrates a clear superiority of hybrid RANS-LES models over the industry standard RANS. With an expanded simulation uncertainty of 1.44 × 10⁻², the virtual measurement obtained with a hybrid model allows for a continuous determination of calibration factors applicable to the relevant mounting positions of a real meter at a satisfactory level of confidence.
Metrology doi: 10.3390/metrology2030020
Authors: Pablo Puerto Daniel Heißelmann Simon Müller Alberto Mendikute
The increased relevance of large-volume metrology (LVM) in industrial applications entails certain challenges: measurements must be cost-efficient and the technologies must be easy to use while ensuring accuracy and reliability. Portable photogrammetry shows great potential to overcome such challenges, but industrial users do not yet rely on its accuracy for large scenarios (3 to 64 m), especially when mass-market cameras are not conceived of as industrial metrology instruments. Furthermore, the measurement results might also depend on the operator’s skills and knowledge of the key process variables. In this work, a methodology was designed so that the measurement uncertainty of portable photogrammetry can be evaluated under controlled conditions for LVM. To do so, PTB’s reference wall, which was designed to assess laser-based methods applied to large volumes, was used as a reference artefact to study the measurement performance under different conditions, enabling an analysis of the relative influence of two process variables: the spatial arrangement of the optical instruments on the scene, and the relative camera poses for an accurate triangulation. According to these variables, different measuring conditions were designed (Monte Carlo analysis), and experimentally evaluated and reported (LME, length measuring errors), analysing the performance figures expected from both unskilled and expert users.
]]>Metrology doi: 10.3390/metrology2020019
Authors: Gertjan Kok Gerd Wübbeler Clemens Elster
The usage of virtual instruments (VIs) to analyze measurements and calculate uncertainties is increasing. Well-known examples are virtual coordinate measuring machines (VCMMs), which are often used, and even commercially offered, to assess the measurement uncertainties of CMMs. A more recent application of the VI concept is the modeling of scatterometers. These VIs can be used to assess the measurement uncertainty after the measurement has been performed, based on the real measurement data, or prior to the measurement, to predict the measurement uncertainty using simulated measurement data. The research question addressed in this paper is whether this predicted uncertainty is similar in magnitude to the uncertainty calculated from the measurement data. It turns out that this is not necessarily the case. The main observation of this paper is that the uncertainty predicted by a VI can be highly sensitive to the chosen way of operating the VI. To remedy this, a simple procedure is proposed that can be applied prior to performing the real measurement and that is believed to produce a conservative prediction of the measurement uncertainty in most cases. This was verified in a case study involving the measurement of the asphericity of an imperfect sphere using a CMM, with the uncertainty calculated by means of a VCMM.
]]>Metrology doi: 10.3390/metrology2020018
Authors: Yang Liu Jigou Liu Ralph Kennel
Accurate frequency measurement plays an important role in many industrial and robotic systems. However, various influences from the application environment introduce signal noise, which complicates frequency measurement. In rough environments, small signals are heavily disturbed by noise, so even negative signal-to-noise ratios (SNRs) are possible in practice. Frequency measurement methods that can handle low-SNR signals are therefore in great demand. In previous work, the cross-correlation spectrum method was developed as an alternative to the fast Fourier transform and the continuous wavelet transform. It can determine the frequencies of a signal under strong noise and is not affected by Heisenberg’s uncertainty principle. In its current version, however, its computation is very intensive, which limits its application to real-time operation. In this article, a new way to create the cross-correlation spectrum is presented. It reduces the calculation time by 89% without significant loss of accuracy. In simulations, it achieves an average deviation of less than 0.1% on sinusoidal signals with an SNR of −14 dB and a signal length of 2000 data points. When applied to self-mixing interferometry signals, the method reaches a normalized root-mean-square error of 0.21% with the aid of an estimation method and an averaging algorithm. Therefore, further research on the method is recommended.
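The abstract does not spell out how the cross-correlation spectrum is built, but the underlying idea of correlating before transforming can be illustrated with a short correlogram-style sketch: the autocorrelation concentrates uncorrelated noise at lag zero, so the spectral peak survives at SNRs where a raw periodogram struggles. All names and parameters below are illustrative, not the authors' implementation.

```python
import numpy as np

def correlogram_frequency(x, fs):
    """Estimate the dominant frequency of a noisy tone from the FFT of
    its autocorrelation (Wiener-Khinchin) instead of the raw signal.
    A simplified stand-in for a cross-correlation-spectrum approach,
    not the authors' algorithm."""
    x = np.asarray(x, dtype=float)
    x -= x.mean()
    n = len(x)
    X = np.fft.rfft(x, 2 * n)                    # zero-pad to avoid wrap-around
    acf = np.fft.irfft(np.abs(X) ** 2)[:n] / n   # biased autocorrelation
    acf[0] = acf[1]                              # damp the zero-lag noise spike
    spec = np.abs(np.fft.rfft(acf * np.hanning(n)))
    k = np.argmax(spec[1:]) + 1                  # skip the DC bin
    return k * fs / n

# Demo at roughly -14 dB SNR, the level quoted in the abstract.
fs, n, f0 = 10_000.0, 2000, 1234.0
t = np.arange(n) / fs
rng = np.random.default_rng(0)
x = np.sin(2 * np.pi * f0 * t) + 3.5 * rng.standard_normal(n)
print(correlogram_frequency(x, fs))              # -> close to 1234 Hz
```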
]]>Metrology doi: 10.3390/metrology2020017
Authors: Prakash Jamakatel Maximilian Eberhardt Florian Kerber
Modern manufacturing processes are characterized by growing product diversities and complexities alike. As a result, the demand for fast and flexible process automation is ever increasing. However, higher individuality and smaller batch sizes hamper the use of standard robotic automation systems, which are well suited for repetitive tasks but struggle in unknown environments. Modern manipulators, such as collaborative industrial robots, provide extended capabilities for flexible automation. In this paper, an adaptive ROS-based end-to-end toolchain for vision-guided robotic process automation is presented. The processing steps comprise several consecutive tasks: CAD-based object registration, pose generation for sensor-guided applications, trajectory generation for the robotic manipulator, the execution of sensor-guided robotic processes, and the testing and evaluation of the results. The main benefits of the ROS framework are readily applicable tools for digital twin functionalities and established interfaces for various manipulator systems. To prove the validity of this approach, an application example for surface reconstruction was implemented with a 3D vision system. In this example, feature extraction forms the basis for viewpoint generation, which, in turn, defines the robotic trajectories to perform the inspection task. Two different feature point extraction algorithms, using neural networks and Voronoi covariance measures, respectively, were implemented and evaluated to demonstrate the versatility of the proposed toolchain. The results showed that complex geometries can be automatically reconstructed and that both algorithms outperformed a standard method used as a reference. Hence, extensions to other vision-controlled applications appear feasible.
]]>Metrology doi: 10.3390/metrology2020016
Authors: Xin Xu Sebastian Hagemeier Peter Lehmann
Rough surfaces, such as those produced by metal additive manufacturing, are quite challenging to measure. Artifacts caused by irregular and difficult-to-measure geometries are inevitable, yet removing all the artifacts would discard part of the surface information. In contrast to previous works, the postprocessing in this paper includes an additional step that eliminates artifacts based on the autocorrelation functions of particular subimages instead of simply removing them. This increases the accuracy with respect to surface roughness and provides a more comprehensive view of the topography. In addition, a dome-shaped LED-array ring light is proposed to provide all-round illumination, given the high degree of irregularity of workpiece surfaces. The experimental results obtained from focus variation microscopy (FVM) are validated against the given roughness values of a Rubert Microsurf 329 comparator test panel as well as measurements of a metal additive workpiece made with a confocal microscope.
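The abstract leaves the artifact-elimination rule itself open. As a minimal sketch, the autocorrelation function of a subimage can be computed via the Wiener-Khinchin theorem, after which artifact-dominated tiles (whose ACF collapses within a pixel or two) can be distinguished from genuine texture (slowly decaying or periodic ACF). The thresholding below is purely illustrative, not the paper's method.

```python
import numpy as np

def subimage_acf(z):
    """Normalized 2D autocorrelation of a topography subimage via the
    Wiener-Khinchin theorem (inverse FFT of the power spectrum).
    Non-measured points are assumed to be NaN and are zeroed after
    mean removal."""
    z = z - np.nanmean(z)
    z = np.nan_to_num(z)
    acf = np.fft.fftshift(np.fft.ifft2(np.abs(np.fft.fft2(z)) ** 2).real)
    return acf / acf.max()

def looks_like_artifact(z, radius=2, threshold=0.2):
    """Illustrative test only: if the ACF has already fallen below
    `threshold` within `radius` pixels of the origin, the subimage
    carries no lateral structure and is likely artifact-dominated."""
    acf = subimage_acf(z)
    cy, cx = np.array(acf.shape) // 2
    ring = acf[cy - radius:cy + radius + 1, cx - radius:cx + radius + 1].copy()
    ring[radius, radius] = 0.0                   # ignore the origin itself
    return ring.max() < threshold
```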
]]>Metrology doi: 10.3390/metrology2020015
Authors: Joffray Guillory Daniel Truong Jean-Pierre Wallerand
Large-volume metrology is essential to many high-value industries and contributes to the factories of the future. In this context, we have developed a three-dimensional coordinate measurement system based on a multilateration technique with self-calibration. In practice, an absolute distance meter, traceable to the SI metre, is shared between four measurement heads by fibre-optic links. From these stations, multiple distance measurements of several target positions are performed to ultimately determine the coordinates of these targets. The uncertainty of these distance measurements has been determined with a consistent metrological approach and is better than 5 µm. However, propagating this uncertainty into the measured positions is not a trivial task. In this paper, an analytical solution for the uncertainty assessment of the positions of both targets and heads under a multilateration scenario with self-calibration is provided. The proposed solution is then compared to Monte Carlo simulations and to experimental measurements: all three approaches agree well, which suggests that the proposed analytical model is accurate. The confidence ellipsoids provided by the analytical solution describe the geometry of the errors well.
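The analytical solution itself is not reproduced in the abstract; as a sketch of the standard first-order propagation behind such confidence ellipsoids, assuming independent distance errors of equal variance, the stacked residuals and parameter covariance read:

```latex
r_{ij} = \lVert \mathbf{p}_i - \mathbf{t}_j \rVert - d_{ij},
\qquad
\Sigma \approx \sigma^2 \left( J^{\mathsf{T}} J \right)^{-1},
```

where d_ij is the distance measured from head i at p_i to target j at t_j, sigma^2 is the variance of the distance errors, and J is the Jacobian of the stacked residuals with respect to the unknown head and target coordinates; the 3×3 diagonal blocks of the covariance matrix describe the position ellipsoids.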
]]>Metrology doi: 10.3390/metrology2020014
Authors: Jacek Dominik Skibicki Roksana Licow
This paper concerns the assessment of railway track surface conditions with respect to the degree of weed infestation. It conceptually describes the proposed method, which uses a vision system to analyse the level of weed infestation, and proposes image analysis software for weed detection. This new measurement method allows for a mobile assessment of the track’s weed infestation status. Validation of the assessment method under real conditions will allow the system to be further extended with new shades of green from the RAL palette and will take into account a more extensive and detailed assessment of weed infestation on the track in accordance with applicable railway regulations.
]]>Metrology doi: 10.3390/metrology2020013
Authors: Salvatore Dello Iacono Giuseppe Di Leo Consolatina Liguori Vincenzo Paciello
Spectral analysis is successfully adopted in several fields. However, the requirements and constraints of different cases vary so widely that not only the tuning of the analysis parameters but also the choice of the most suitable technique can be a difficult task. For this reason, it is important that the designer of a measurement system for spectral analysis knows how the different techniques behave under different operating conditions. The case considered here is the realization of a numerical instrument for the real-time measurement of the spectral characteristics of a multi-tone signal (amplitude, frequency, and phase). For this purpose, different signal processing techniques can be used, which can be classified as parametric or non-parametric methods. The first class includes methods that exploit a priori knowledge about signal parameters, such as the spectral shape of the signal to be processed. Thus, a self-configuring procedure based on a parametric algorithm should include a preliminary evaluation of the number of components. The choice of the right method among the several proposals in the literature is fundamental for any designer and, in particular, for developers of spectral analysis software for real-time applications and embedded devices, where time and reliability constraints are arduous to fulfil. Different aspects should be considered: the desired level of accuracy, the available processing resources (memory depth and processing speed), and the signal parameters. The present paper details a comparison of some of the most effective methods available in the literature for the spectral analysis of signals: three interpolated-FFT methods (IFFT-2p, IFFT-3p, and IFFTc), all based on an FFT algorithm while improving the spectral resolution of the DFT through interpolation techniques, and three parametric algorithms (MUSIC, ESPRIT, and IWPA). The methods considered in the comparison are briefly described, with references to the literature for each of them. Their behaviour is then analysed in terms of the systematic contribution and uncertainty on the evaluated frequencies of the spectral tones of signals composed of superimposed sinusoids and white Gaussian noise.
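As a flavour of the interpolation-based class, here is a minimal two-point interpolated-FFT estimator for a single tone with a rectangular window; the IFFT-2p/IFFT-3p/IFFTc methods compared in the paper differ in window choice and correction formula, so this is a generic sketch rather than any of them.

```python
import numpy as np

def ipfft_2p(x, fs):
    """Two-point interpolated-FFT frequency estimate for a single tone,
    assuming a rectangular window: the fractional bin offset is
    recovered from the ratio of the peak bin and its larger neighbour.
    A generic sketch of the technique, not a specific published method."""
    n = len(x)
    X = np.abs(np.fft.rfft(x))
    k = np.argmax(X[1:-1]) + 1                 # coarse peak bin (skip DC/Nyquist)
    if X[k + 1] >= X[k - 1]:                   # interpolate toward larger neighbour
        delta = X[k + 1] / (X[k] + X[k + 1])
    else:
        delta = -X[k - 1] / (X[k] + X[k - 1])
    return (k + delta) * fs / n

fs, n, f0 = 5000.0, 1024, 440.7
t = np.arange(n) / fs
print(ipfft_2p(np.sin(2 * np.pi * f0 * t), fs))  # approximately 440.7 Hz
```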
]]>Metrology doi: 10.3390/metrology2020012
Authors: Christoph-Alexander Holst Volker Lohweg
One of the main challenges in designing information fusion systems is deciding on the structure and order in which information is aggregated. The key criteria by which topologies are constructed include the associativity of fusion rules as well as the consistency and redundancy of information sources. Fusion topologies built around these criteria are flexible in design, produce maximally specific information, and are robust against unreliable or defective sources. In this article, an automated, data-driven design approach for possibilistic information fusion topologies is detailed that explicitly considers associativity, consistency, and redundancy. The proposed design is intended to handle epistemic uncertainty—that is, to result in robust topologies even when training data are lacking. The fusion design approach is evaluated on selected publicly available real-world datasets obtained from technical systems. Epistemic uncertainty is simulated by withholding parts of the training data. It is shown that, in this context, consistency as the sole design criterion results in topologies that are not robust. Including a redundancy metric leads to improved robustness in the case of epistemic uncertainty.
]]>Metrology doi: 10.3390/metrology2020011
Authors: Guglielmo Frigo
In modern power systems, the integration of renewable energy sources relies on dedicated inverters whose power electronic circuitry switches at high frequencies and causes conducted emissions in the supraharmonic range, i.e., from 9 to 150 kHz. In this regard, the normative framework still lacks a reference measurement method as well as a set of emission limits and performance requirements. From a metrological point of view, it is important to evaluate whether some of the power quality indices adopted for radiated emissions could also be transposed to this context. In particular, the paper considers a recent algorithm for the identification of supraharmonic components and discusses how its estimates affect the estimation of quasi-peak values. To this end, the paper describes the implementation of a fully digital approach and validates the results by means of an experimental comparison against a traditional quasi-peak detector. The proposed analysis confirms the potential of the considered approach and provides some interesting insights into the reliability of quasi-peak estimation in the supraharmonic range.
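The paper's quasi-peak implementation is not detailed in the abstract; conceptually, a quasi-peak detector is a peak detector with asymmetric charge and discharge time constants applied to the demodulated envelope. A minimal digital sketch follows; the default time constants are the commonly cited CISPR band-A values and serve only as placeholders, not as a claim about the paper's settings.

```python
import numpy as np

def quasi_peak(envelope, fs, tau_charge=45e-3, tau_discharge=500e-3):
    """First-order charge/discharge quasi-peak detector applied to a
    demodulated envelope sampled at fs. The asymmetric time constants
    make the output weight repetitive disturbances more heavily than
    isolated spikes, which is the point of quasi-peak weighting."""
    envelope = np.asarray(envelope, dtype=float)
    out = np.zeros_like(envelope)
    v = 0.0
    a_c = 1.0 - np.exp(-1.0 / (fs * tau_charge))     # per-sample charge step
    a_d = 1.0 - np.exp(-1.0 / (fs * tau_discharge))  # per-sample discharge step
    for i, e in enumerate(envelope):
        v += a_c * (e - v) if e > v else -a_d * v
        out[i] = v
    return out
```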
]]>Metrology doi: 10.3390/metrology2020010
Authors: Aleksandr Bystrov Yi Wang Peter Gardner
Ensuring high accuracy when measuring the parameters of devices under test is an important task when conducting research in the terahertz frequency range. The purpose of this paper is a practical study of the thermal drift errors of a vector network analyzer using low-terahertz-frequency extender modules. For this, the change in the measurement error over time was analysed using a system based on a Keysight N5247B vector network analyzer and covering the frequency ranges of 220–330 GHz, 500–750 GHz, and 750–1100 GHz. The results of our experiment showed that the measurement error decreased rapidly during the first half hour of warm-up and stabilized about 3 h after the equipment was turned on. These results allow the necessary warm-up time to be estimated from the accuracy requirements of the measurement, making it possible to optimize the experiment and reduce its duration.
]]>Metrology doi: 10.3390/metrology2010009
Authors: Blair D. Hall
There is currently interest in the digitalisation of metrology because technologies that can measure, analyse, and make critical decisions autonomously are beginning to emerge. The notions of metrological traceability and measurement uncertainty should be supported in such systems, following the recommendations of the Guide to the Expression of Uncertainty in Measurement (GUM); however, the GUM offers no specific guidance for digital systems. Here, we report on a Python package that implements algorithmic data processing using ‘uncertain numbers’, which satisfy the general criteria given in the GUM for an ideal format to express uncertainty. An uncertain number can represent a physical quantity that has not been determined exactly. Using uncertain numbers, measurement models can be expressed clearly and succinctly in terms of the quantities involved. The algorithms and simple data structures we use provide an example of how metrological traceability can be supported in digital systems. In particular, uncertain numbers provide a format for capturing and propagating detailed information about the quantities that influence a measurement along the various stages of a traceability chain. This more detailed information about influence quantities can be exploited to extract more value from results for users at the end of a traceability chain.
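The abstract does not describe the package's API, but the essence of an uncertain number, a value plus a record of first-order uncertainty components keyed by the influence quantities they stem from, can be sketched in a few lines; it is precisely this per-component record that lets traceability information survive along the chain. The class below is a toy under those assumptions, not the published package.

```python
import math

class UncertainNumber:
    """Value plus a record of first-order uncertainty components,
    keyed by the influence quantity they stem from. A toy sketch of
    the 'uncertain number' idea, not the published package's API."""
    def __init__(self, value, components):
        self.value = value
        self.components = dict(components)   # label -> uncertainty contribution

    @property
    def u(self):
        # Combined standard uncertainty: root-sum-square of components.
        return math.hypot(*self.components.values())

    def __mul__(self, other):
        # First-order (GUM) propagation for y = a * b; components with
        # the same label (same physical source) add linearly.
        comps = {k: other.value * u for k, u in self.components.items()}
        for k, u in other.components.items():
            comps[k] = comps.get(k, 0.0) + self.value * u
        return UncertainNumber(self.value * other.value, comps)

# y = a * b keeps track of which influence dominates u(y):
a = UncertainNumber(2.5, {"cal_lab_A": 0.01})
b = UncertainNumber(4.0, {"cal_lab_B": 0.02, "temperature": 0.005})
y = a * b
print(y.value, y.u, y.components)
```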
]]>Metrology doi: 10.3390/metrology2010008
Authors: Gerd Wübbeler Manuel Marschall Karin Kniel Daniel Heißelmann Frank Härtig Clemens Elster
A virtual experiment simulates a real measurement process by means of a numerical model. The numerical model produces virtual data whose properties reflect those of the data observed in the real experiment. In this work, we explore how the results of a virtual experiment can be employed in the context of uncertainty evaluation for a corresponding real experiment. The uncertainty evaluation was based on the Guide to the Expression of Uncertainty in Measurement (GUM), which defines the de facto standard for uncertainty evaluation in metrology. We show that, under specific assumptions about model structure and variance of the data, virtual experiments in combination with a Monte Carlo method lead to an uncertainty evaluation for the real experiment that is in line with Supplement 1 to the GUM. In the general case, a GUM-compliant uncertainty evaluation in the context of a real experiment can no longer be based on a corresponding virtual experiment in a simple way. Nevertheless, virtual experiments are still useful in order to increase the reliability of an uncertainty analysis. Simple generic examples as well as the case study of a virtual coordinate measuring machine are presented to illustrate the treatment.
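A minimal sketch of the GUM Supplement 1 pattern referred to here: draw the input quantities from their assigned distributions, push each draw through the virtual experiment (the numerical forward model), and summarize the resulting output sample. The forward model, the expansion coefficient, and all numbers below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

def virtual_experiment(length, temp_error):
    """Hypothetical forward model of a length measurement perturbed by
    a thermal-expansion effect (purely illustrative, not from the paper)."""
    alpha = 11.5e-6                          # assumed expansion coefficient, 1/K
    return length * (1.0 + alpha * temp_error)

# GUM-S1-style Monte Carlo: propagate input distributions through the
# virtual experiment and summarize the output sample.
M = 200_000
lengths = rng.normal(100.0, 0.002, M)        # mm, calibrated value and u
temps = rng.uniform(-0.5, 0.5, M)            # K, lab temperature error
y = virtual_experiment(lengths, temps)
print(y.mean(), y.std(ddof=1))               # estimate and standard uncertainty
```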
]]>Metrology doi: 10.3390/metrology2010007
Authors: Tobias Eckhardt Oliver Gerberding
Laser interferometers that operate over a dynamic range exceeding one wavelength are used as compact displacement sensors for gravitational wave detectors and inertial sensors, and in a variety of other high-precision applications. A number of approaches are available to extract the phase from such interferometers by implementing so-called phasemeters: algorithms that provide a linearised phase estimate. While many noise sources have to be considered for any given scheme, all schemes are fundamentally limited by additive noise in the readout, such as electronic readout noise, digitisation noise, and shot noise, which manifests as an effective white phase noise in the phasemeter output. We calculated and compared the Cramér–Rao lower bound for phasemeters of some state-of-the-art two-beam interferometer schemes and derived their noise limitations for sub-fringe operation and for multi-fringe readout schemes. From this, we derived achievable noise performance levels for one of these interferometer techniques, deep frequency modulation interferometry. We then applied our analysis to optical resonators and show that frequency scanning techniques can, in theory, benefit from such resonant enhancement, indicating that sensitivities can be improved in future sensors.
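For orientation, the textbook form of the bound in the simplest setting, a single tone s[n] = A cos(ωn + φ) + w[n] with additive white Gaussian noise of variance σ² and with A and ω known, is:

```latex
I(\varphi) = \frac{A^2}{\sigma^2} \sum_{n=0}^{N-1} \sin^2(\omega n + \varphi)
\approx \frac{N A^2}{2\sigma^2}
\quad\Longrightarrow\quad
\operatorname{var}(\hat{\varphi}) \;\ge\; \frac{2\sigma^2}{N A^2}.
```

The scheme-specific bounds derived in the paper refine this basic 1/(N·SNR) scaling for full interferometric readout schemes.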
]]>Metrology doi: 10.3390/metrology2010006
Authors: Gregor Scholz Ines Fortmeier Manuel Marschall Manuel Stavridis Michael Schulz Clemens Elster
The tilted-wave interferometer (TWI) is a recent and promising technique for optically measuring aspheres and freeform surfaces and combines an elaborate experimental setup with sophisticated data analysis algorithms. There are, however, many parameters that influence its performance, and greater knowledge about the behavior of the TWI is needed before it can be established as a measurement standard. Virtual experiments are an appropriate tool for this purpose, and in this paper we present a digital twin of the TWI that was carefully designed for such experiments. The expensive numerical calculations involved, combined with the existence of multiple influencing parameters, limit the number of virtual experiments that are feasible, which poses a challenge to researchers. Experimental design is a statistical technique that allows virtual experiments to be planned so as to maximize information gain. We applied experimental design to virtual TWI experiments with the goal of identifying the main sources of uncertainty. The results from this work are presented here.
]]>Metrology doi: 10.3390/metrology2010005
Authors: Jinsun Lee Md Shahjahan Hossain Mohammad Taheri Awse Jameel Manas Lakshmipathy Hossein Taheri
The layering deposition methodology in metal additive manufacturing (AM) and the influence of processing parameters, such as energy source level and deposition speed, which change the melt pool condition, are known to be important factors affecting the properties of components fabricated via AM. The effect of melt pool conditions and geometry on the properties and quality of fabricated AM components has been widely studied through experimental and simulation techniques. However, a better understanding is needed of how the topography of a solidified melt pool influences the characteristics of the next deposition layer, particularly for complex surfaces with sparse topographical features, such as those occurring in AM deposition layers. The topography of deposited layers in metal AM strongly affects the bonding condition between the layers and the defect generation mechanism, so characterizing the topographic features of AM deposition layers offers a new perspective on investigating defect generation mechanisms and evaluating the quality of AM components. In this work, a feature-based topography study is proposed for assessing the influence of process parameters on the topography of AM deposition layers and on the defect generation mechanism. Titanium alloy (Ti6Al4V) samples deposited on a steel substrate by the direct energy deposition (DED) AM technique under different process conditions were used for the assessment. Topography datasets were acquired, the shapes and sizes of the relevant topographic features were analysed, and the potential defect generation mechanism under the different process parameters was discussed. The assessment of the topography features was also correlated with previously published in-situ monitoring and quality evaluation results, where useful information was obtained by relating signature topographic formations to in-situ acoustic process monitoring as indicators of the manufacturing process behavior and performance.
]]>Metrology doi: 10.3390/metrology2010004
Authors: Masaki Michihata
Micro-coordinate measuring machines (micro-CMMs) for measuring microcomponents require a probe system with a probe tip diameter of several tens to several hundreds of micrometers. Scale effects come into play for such small probe tips: the tip tends to stick to the measurement surface via surface adhesion forces, which significantly deteriorate probing resolution and repeatability. Therefore, to realize micro-CMMs, many researchers have proposed microprobe systems that use surface-sensing principles different from those of conventional CMM probes. This review focuses on the surface-sensing principles of microprobe systems and their characteristics. First, the proposed microprobe systems are summarized, and trends in probe performance are identified. Then, the individual microprobe systems with different sensing principles are described to clarify the performance of each sensing principle. By comprehensively summarizing multiple types of probe systems and discussing their characteristics, this study contributes to identifying the performance limitations of the proposed microprobe systems. Finally, the future development of micro-CMM probes is discussed.
]]>Metrology doi: 10.3390/metrology2010003
Authors: Ian Smith Yuhui Luo Daniel Hutzschenreuter
Supplement 1 to the ‘Guide to the expression of uncertainty in measurement’ describes a Monte Carlo method as a general numerical approach to uncertainty evaluation. Application of the approach typically delivers a large number of values of the output quantity of interest, from which summary information such as an estimate of the quantity, its associated standard uncertainty, and a coverage interval for the quantity can be obtained and reported. This paper considers the use of a Monte Carlo method for uncertainty evaluation in calibration, using two examples to demonstrate how so-called ‘digital calibration certificates’ can allow the complete set of results of a Monte Carlo calculation to be reported.
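To make the reporting idea concrete: after a Monte Carlo run, the usual summary (estimate, standard uncertainty, probabilistically symmetric 95% coverage interval) can be computed from the output sample, and a digital certificate could additionally carry the sample itself. The JSON fragment below is a hypothetical structure for illustration, not the DCC schema used in the paper.

```python
import json
import numpy as np

rng = np.random.default_rng(7)
samples = rng.normal(9.81, 0.03, 100_000)     # stand-in Monte Carlo output sample

est = float(np.mean(samples))
u = float(np.std(samples, ddof=1))
lo, hi = (float(q) for q in np.quantile(samples, [0.025, 0.975]))

# Hypothetical certificate fragment: summary values plus a decimated
# copy of the full sample so downstream users can redo the analysis.
record = {
    "estimate": est,
    "standard_uncertainty": u,
    "coverage_interval_95": [lo, hi],
    "mc_sample": samples[::100].round(6).tolist(),
}
print(json.dumps(record)[:120], "...")
```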
]]>Metrology doi: 10.3390/metrology2010002
Authors: Janik Schaude Andreas Christian Gröschl Tino Hausotte
The article presents the determination of the topographic spatial resolution of an optical point sensor. It is quantified by the lateral period limit D_LIM measured on a type ASG material measure, also called a (topographic) Siemens star, with a confocal sensor, following both a radial measurement and evaluation, as proposed by ISO 25178-70, and the measurement and subsequent evaluation of two line scans, as proposed by the NPL Good Practice Guide. As will be shown, for the latter, even a slightly misidentified target centre of the Siemens star leads to quite significant errors in the determined D_LIM. Remarkably, a misidentified target centre does not necessarily result in an overestimation of D_LIM; lower values might also be obtained. Therefore, a modified Good Practice Guide procedure is proposed to determine D_LIM more accurately, as it also includes a thorough determination of the centre of the Siemens star. While the measurement and evaluation effort is slightly greater than for the NPL Good Practice Guide, it is still much faster than a complete radial measurement and evaluation.
]]>Metrology doi: 10.3390/metrology2010001
Authors: Nikolay V. Kornilov Vladimir G. Pronyaev Steven M. Grimes
Each experiment provides new information about the value of some physical quantity. However, not only the measured values but also the uncertainties assigned to them are an important part of the results. The metrological guides provide recommendations for presenting the uncertainties of measurement results: the statistical and systematic components of the uncertainties should be explained, estimated, and presented separately as part of the results. The experimental set-ups and the models used to derive physical values from the primary measured quantities are products of human activity, which makes this a rather subjective field. A Systematic Distortion Factor (SDF) may exist in any experiment. It leads to a bias of the measured value from an unknown “true” value, and it appears as a real physical effect if it is not removed by additional measurements or analysis. For a set of measured data with the best evaluated true value, differences beyond the stated uncertainties can be explained by the presence of Unrecognized Sources of Uncertainty (USU) in the data. We can link the presence of USU in the data with the presence of SDF in the measurement results. The paper demonstrates the existence of SDFs in measurements of Prompt Fission Neutron Spectra (PFNS), of fission cross sections, and of Maxwellian-spectrum-averaged neutron capture cross sections for astrophysical applications. The paper discusses introducing and accounting for USU in data evaluation in cases where the SDF cannot be eliminated. As an example, the model case of the ²³⁸U(n,f)/²³⁵U(n,f) cross-section ratio evaluation is demonstrated.
]]>Metrology doi: 10.3390/metrology1020011
Authors: Blair D. Hall Annette Koo
This paper considers a future scenario in which digital reporting of measurement results is ubiquitous and digital calibration certificates (DCCs) contain information about the components of uncertainty in a measurement result. The task of linking international measurement comparisons is used as a case study to look at the benefits of digitalization. Comparison linking provides a context in which correlations are important, so the benefit of passing a digital record of contributions to uncertainty along a traceability chain can be examined. The International Committee for Weights and Measures (CIPM) uses a program of international “key comparisons” to establish the extent to which measurements of a particular quantity may be considered equivalent when made in different economies. To obtain good international coverage, the results of the comparisons may be linked together: a number of regional metrology organization (RMO) key comparisons can be linked back to an initial CIPM key comparison. Specific information about systematic effects in participants’ results must be available during linking to allow correct treatment of the correlations. However, the conventional calibration certificate formats used today do not provide this: participants must submit additional data, and the report of an initial comparison must anticipate the requirements for future linking. Special handling of additional data can be laborious and prone to error. An uncertain-number digital reporting format was considered in this case study, which caters to all the information required and would simplify the comparison analysis, reporting, and linking; the format would also enable a more informative presentation of comparison results. The uncertain-number format would be useful more generally, in measurement scenarios where correlations arise, so its incorporation into DCCs should be considered. A full dataset supported by open-source software is available.
]]>Metrology doi: 10.3390/metrology1020010
Authors: Paul A. Solomon Anna-Marie Hyatt Anthony D. A. Hansen James J. Schauer Nicole P. Hyslop John G. Watson Prakash Doraiswamy Paige Presler-Jur
A simple method that reproducibly creates validation/reference materials for comparison of methods that measure the carbonaceous content of atmospheric particulate matter deposited on filter media at concentrations relevant to atmospheric levels has been developed and evaluated. Commonly used methods to determine the major carbonaceous components of particles collected on filters include optical attenuation for “Black” (BC) and “Brown” (BrC) carbon, thermal-optical analysis (TOA) for “Elemental” (EC) and “Organic” (OC) carbon, and total combustion for “Total” carbon (TC). The new method uses a commercial inkjet printer to deposit ink containing both organic and inorganic components onto filter substrates at programmable print densities (print levels, as specified by the printer–software combination). A variety of filter media were evaluated. The optical attenuation (ATN) of the deposited sample was determined at 880 nm and 370 nm. Reproducibility or precision (as standard deviation or in percent as coefficient of variation) in ATN for Teflon-coated glass-fiber, Teflon, and cellulose substrates was better than 5%. Reproducibility for other substrates was better than 15%. EC and OC measured on quartz-fiber filters (QFF) compared to ATN measured at 880 nm and 370 nm on either QFF or Teflon-coated glass-fiber yielded R² > 0.92 and R² > 0.97, respectively. Four independent laboratories participated in a round robin study together with the reference laboratory. The propagated standard deviation among the five groups across all print levels was <2.2 ATN at 880 nm and <2.7 ATN at 370 nm with a coefficient of variation of <2% at ~100 ATN.
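The abstract does not restate the definition of ATN; in the convention commonly used for filter-based black-carbon instruments (an assumption here, not stated in the abstract), it is

```latex
\mathrm{ATN} = 100 \,\ln\!\left(\frac{I_0}{I}\right),
```

where I₀ is the light intensity transmitted through a blank portion of the filter and I that transmitted through the deposit; the precision figures above are then coefficients of variation, CV = (s / x̄) × 100%.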
]]>Metrology doi: 10.3390/metrology1020009
Authors: Peter Lehmann Sebastian Hagemeier Tobias Pahl
Three-dimensional transfer functions (3D TFs) are generally assumed to fully describe the transfer behavior of optical topography measuring instruments, such as coherence scanning interferometers, in the spatial frequency domain. Therefore, 3D TFs are supposed to be independent of the surface under investigation, resulting in a clear separation of surface properties and transfer characteristics. In this paper, we show that the 3D TF of an interference microscope differs depending on whether the object is specularly reflecting or consists of point scatterers. In addition to the 3D TF of a point scatterer, we derive an analytical expression for the 3D TF corresponding to specular surfaces and demonstrate that the latter is most relevant in practical applications of coherence scanning interferometry (CSI). We additionally study the effects of temporal coherence and show that in conventional CSI, temporal coherence effects dominate. However, narrowband light sources are advantageous if high-spatial-frequency components of weak phase objects are to be resolved, whereas for low-frequency phase objects of higher amplitude, temporal coherence has less effect. Finally, we present an approach that explains the different transfer characteristics of coherence peak and phase detection in CSI signal analysis.
]]>Metrology doi: 10.3390/metrology1020008
Authors: Aleksandr N. Grekov Nikolay A. Grekov Evgeniy N. Sychov
Determining the amount of solute mass in seawater from in situ measurements in seas and oceans is currently an unresolved problem. Solving it requires the development of both new measurement methods and new instruments. The authors of this article analyzed methods for the indirect measurement of salinity and density using parameters that can be measured in situ, including relative electrical conductivity, speed of sound, temperature, and hydrostatic pressure. The authors propose an electrical conductivity sensor design that allows data on solid suspensions to be obtained while measuring the impedance of the electrodes at various alternating-current frequencies. The authors analyzed a joint measurement technique using Conductivity-Temperature-Depth (CTD) and Sound Velocity Profiler (SVP) devices in a marine testing area. Based on the results of the joint measurements, the authors present tests of water samples of various salt compositions for the presence of solid suspensions.
]]>Metrology doi: 10.3390/metrology1020007
Authors: Guglielmo Frigo Marco Agustoni
In this paper, we consider the calibration of measuring bridges for non-conventional instrument transformers with digital output. In this context, the main challenge is the need to synchronize the analog and digital outputs. To this end, we propose a measurement setup that allows the main quantities of interest to be monitored and quantified. A possible laboratory implementation is presented, and the main sources of uncertainty are discussed. From a metrological point of view, technical specifications and statistical analysis are employed to draw up a rigorous uncertainty budget for the calibration setup. An experimental validation is also provided through a thorough characterization of the measurement accuracy of a commercial device in use at the METAS laboratories. The proposed analysis shows that the calibration of measuring bridges for non-conventional instrument transformers requires ad hoc measurement setups, and it identifies possible room for improvement, particularly in terms of output synchronization and the flexibility of the generation process.
]]>Metrology doi: 10.3390/metrology1020006
Authors: Simona Salicone Harsha Vardhana Jetti
The concept of measurement uncertainty was introduced in the 1990s by the “Guide to the expression of uncertainty in measurement”, known as the GUM. The word uncertainty has a lexical meaning and reflects the lack of exact, or complete, knowledge about the value of the measurand. Thanks to the suggestions in the GUM, and following the mathematical probabilistic approaches proposed therein, an uncertainty value can be found and associated with the measured value. In recent decades, however, other methods have been proposed in the literature that try to encompass the definitions of the GUM while overcoming its limitations. Some of these methods are based on possibility theory, such as the one known as the RFV method. The aim of this paper is to briefly recall the RFV method, starting from its initial motivations, and to summarize the most relevant results obtained so far in a single paper.
]]>Metrology doi: 10.3390/metrology1010005
Authors: Han Haitjema
It is with sincere pleasure that we welcome you to the inaugural issue of Metrology [...]
]]>Metrology doi: 10.3390/metrology1010004
Authors: Ellie Molloy Annette Koo Blair D. Hall Rebecca Harding
The validity of calibration and measurement capability (CMC) claims by national metrology institutes is supported by the results of international measurement comparisons. Many methods of comparison analysis are described in the literature and some have been recommended by CIPM Consultative Committees. However, the power of various methods to correctly identify biased results is not well understood. In this work, the statistical power and confidence of some methods of interest to the CIPM Consultative Committees were assessed using synthetic data sets with known properties. Our results show that the common mean model with largest consistent subset delivers the highest statistical power under conditions likely to prevail in mature technical fields, where most participants are in agreement and CMC claims can reasonably be supported by the results of the comparison. Our approach to testing methods is easily applicable to other comparison scenarios or analysis methods and will help the metrology community to choose appropriate analysis methods for comparisons in mature technical fields.
]]>Metrology doi: 10.3390/metrology1010003
Authors: Harsha Vardhana Jetti Simona Salicone
The Kalman filter is a concept that has existed for decades and is widely used in numerous areas. It provides a prediction of the system states as well as the uncertainty associated with them. The original Kalman filter cannot propagate uncertainty correctly when the variables are not normally distributed, when the measurements are correlated, or when there is a systematic error in the measurements. For these reasons, numerous variations of the original Kalman filter have been proposed, most of them based, like the original, on probability theory. Some of these variations do introduce improvements, but without being completely successful. To deal with these problems, Kalman filters have more recently also been defined using random-fuzzy variables (RFVs). These filters can also propagate distributions that are not normal and propagate systematic contributions to uncertainty, thus providing the overall measurement uncertainty associated with the state predictions. In this paper, the authors take another step forward by defining a possibilistic Kalman filter using random-fuzzy variables that not only considers and propagates both random and systematic contributions to uncertainty, but also reduces the overall uncertainty associated with the state predictions by compensating for the unknown residual systematic contributions.
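For reference, the probabilistic baseline that the RFV-based filters generalize is the standard predict/update recursion; a scalar sketch follows (the possibilistic version replaces these variances with random-fuzzy variables, which is not shown here).

```python
def kalman_step(x, P, z, q, r, a=1.0, h=1.0):
    """One predict/update cycle of the standard (probabilistic) scalar
    Kalman filter:
    x, P  prior state estimate and its variance;
    z     new measurement;
    q, r  process and measurement noise variances;
    a, h  state-transition and observation coefficients."""
    # Predict: propagate the state and grow its variance by the process noise.
    x_pred = a * x
    P_pred = a * P * a + q
    # Update: blend prediction and measurement according to the Kalman gain.
    K = P_pred * h / (h * P_pred * h + r)
    x_new = x_pred + K * (z - h * x_pred)
    P_new = (1.0 - K * h) * P_pred
    return x_new, P_new

# Usage: filter a noisy constant signal.
x, P = 0.0, 1.0
for z in [1.2, 0.9, 1.1, 1.0]:
    x, P = kalman_step(x, P, z, q=1e-4, r=0.04)
print(x, P)
```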
]]>Metrology doi: 10.3390/metrology1010002
Authors: Nandeesh Hiremath Vaibhav Kumar Nicholas Motahari Dhwanil Shukla
To make progress in the area of aeroacoustics, experimental measurements are necessary. They are required not only for engineering applications in acoustics and noise engineering but also for developing models of the acoustic phenomena around us. One measurement of particular importance is acoustic impedance, a measure of the opposition to acoustic flow arising from the acoustic pressure. It indicates how much sound pressure is generated by the vibration of the molecules of a particular acoustic medium at a given frequency and can be a characteristic of the medium. The aim of the present paper is to give a synthetic overview of the literature on impedance measurements and to discuss the advantages and disadvantages of each measurement technique. We examine the three main categories of impedance measurement techniques, namely reverberation chamber techniques, impedance tube techniques, and far-field techniques. The theoretical principles of each technique are provided, along with a discussion of its historical development and recent advancements.
]]>Metrology doi: 10.3390/metrology1010001
Authors: Sergey Dedyulin Elena Timakova Dan Grobnic Cyril Hnatovsky Andrew D. W. Todd Stephen J. Mihailov
Fiber Bragg gratings (FBGs) are extensively used to perform high-temperature measurements in harsh environments; however, the drift of the characteristic Bragg wavelength affects their long-term stability, resulting in erroneous temperature measurements. Herein, we report the most precise and accurate measurements of wavelength drift available to date for high-temperature FBGs. The measurements were performed with a set of packaged π-phase-shifted FBGs for high wavelength resolution, in caesium and sodium pressure-controlled heat pipes for a stable temperature environment, and with a tunable laser for stable wavelength measurements with 0.1 pm resolution. Using this dataset, we outline the experimental caveats that can lead to inconsistent results and confusion in measuring wavelength drifts, namely: the influence of packaging; the interchangeability of FBGs produced under identical conditions; the birefringence of π-phase-shifted FBGs; the initial transient behaviour of FBGs at constant temperature; and the dependence on the previous thermal history of FBGs. In addition, we observe that the wavelength stability of π-phase-shifted gratings at lower temperature is significantly improved by annealing at higher temperature. The lowest wavelength drift we obtain is +0.014 pm·h⁻¹ at 600 °C (corresponding to +0.001 °C·h⁻¹) after annealing for 400 h at 1000 °C, the longest annealing time we have tried. The annealing time required to achieve this small drift rate is FBG-specific.
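For context, the characteristic wavelength mentioned above follows the Bragg condition, and the two quoted drift figures are linked by the grating's temperature sensitivity:

```latex
\lambda_B = 2\, n_{\mathrm{eff}}\, \Lambda,
\qquad
\Delta\lambda_B \approx S_T\, \Delta T,
```

where n_eff is the effective refractive index of the fiber core and Λ the grating period. The quoted correspondence (+0.014 pm·h⁻¹ ↔ +0.001 °C·h⁻¹) implies a sensitivity of S_T ≈ 14 pm/°C, a typical value for silica gratings in the 1550 nm band (the operating wavelength is our assumption; the abstract does not state it).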
]]>