Article

High-Accuracy Three-Dimensional Deformation Measurement System Based on Fringe Projection and Speckle Correlation

School of Science, Nanjing University of Science and Technology, Nanjing 210094, China
* Author to whom correspondence should be addressed.
Sensors 2023, 23(2), 680; https://doi.org/10.3390/s23020680
Submission received: 5 December 2022 / Revised: 1 January 2023 / Accepted: 5 January 2023 / Published: 6 January 2023
(This article belongs to the Special Issue Advances in 3D Measurement Technology and Sensors)

Abstract

Fringe projection profilometry (FPP) and digital image correlation (DIC) are widely applied in three-dimensional (3D) measurements, and their combination can effectively overcome their respective shortcomings. However, the speckle on the surface of an object seriously degrades the quality and modulation of the fringe images captured by the camera, which leads to non-negligible errors in the measurement results. In this paper, we propose a fringe image extraction method based on deep learning, which transforms speckle-embedded fringe images into speckle-free fringe images. The principle of the proposed method, the 3D coordinate calculation and the deformation measurements are introduced. Experimental results, compared with the traditional 3D-DIC method, show that the proposed method is effective and precise.

1. Introduction

Optical metrology is widely used in biomedicine, reverse engineering, bridge monitoring and other fields [1,2,3] because of its non-contact nature, speed and high accuracy [4,5,6,7]. Fringe projection profilometry (FPP) [8,9] and digital image correlation (DIC) [10,11] are two common non-interferometric measurement methods.
FPP uses a projector to project fringes onto the measured object [12]. The phase information [13,14] can be solved from the deformed fringe images, and the three-dimensional (3D) shape of the object can be reconstructed from the phase. In deformation measurements, however, FPP is only sensitive to the out-of-plane displacement of the measured object and cannot track individual object points.
DIC employs a speckle texture on the surface of the measured object as the deformation information carrier [15]. Two-dimensional DIC (2D-DIC) [16] adopts a single camera to capture images, making it easy to operate, but it can only measure in-plane deformation. Three-dimensional DIC (3D-DIC) [17] can measure the 3D deformation of objects with high accuracy, but it requires synchronous triggering of multiple cameras. Furthermore, the filtering effect of the subset window lowers the accuracy for non-uniform deformation fields.
The combination of DIC and FPP can overcome the disadvantages of each method, but separating the fringe and speckle images is the key step. Shi et al. [18] obtained a surface texture image of a measured object from phase-shifting fringe images. Because the grey gradient of the natural surface texture changes little, the fringe images did not need to be separated to combine DIC and FPP for measurement. In practice, however, improving the accuracy of DIC requires spraying or transferring a speckle pattern with large grey-gradient changes onto the measured surface. This degrades the quality of the captured fringe images, which then cannot be used directly for calculation, so the fringe and speckle images must be separated. Felipe-Sesé et al. [19] used a multi-sensor camera and structured laser illumination to separate colour-encoded fringe and speckle patterns and thereby measured 3D displacement. However, the measurement errors of this method are large, and it is not suitable for high-speed measurements.
Therefore, in this paper, a fringe and speckle image separation method based on deep learning is proposed, which significantly improves the modulation of the fringe images [20] and the accuracy of 3D deformation measurements. A set of three-step phase-shifting speckle-embedded fringe images is converted into speckle-free fringe images using a convolutional neural network (CNN). Once trained, the CNN model produces speckle-free images automatically and efficiently.

2. Principle

A flow chart of the proposed measurement method is shown in Figure 1. First, speckle images are extracted from the background grey intensity of the phase-shifting fringe images, and DIC is applied to calculate the sub-pixel displacement of the measured object before and after deformation. At the same time, high-quality fringe images are separated from the speckle-embedded fringe images using the CNN, and the 3D coordinates of the measured object before and after deformation are obtained via FPP and the calibration parameters of the system. Second, the least-squares method is used to obtain the 3D coordinates of the integer pixels before deformation and of the corresponding sub-pixel positions after deformation; the 3D displacement is obtained by subtracting the corresponding 3D coordinates. Finally, the difference method is used to solve the full-field strain: local coordinate systems are established, the 3D coordinates before deformation and the 3D displacement are converted into the corresponding local coordinate systems, and the strain tensor is obtained by fitting the 3D displacement in each local system. These steps are detailed in the following four subsections.

2.1. Analysis of the Influence of Speckle

The phase-shifting profilometry (PSP) method calculates the phase information of the surface of the measured object from fringe images. Ideally, a set of sinusoidal fringes is projected by the projector, and the intensity of the image captured by the camera can be expressed as:
$$I_n(x, y) = A(x, y) + B(x, y)\cos\left[\varphi(x, y) + \Delta\varphi_n\right] \quad (1)$$
where $I_n(x, y) \in [0, 255]$ is the grey intensity of the standard sinusoidal fringe images captured by the camera, $A(x, y)$ is the background grey intensity, $B(x, y)$ is the surface reflectivity, $\varphi(x, y)$ is the phase to be obtained, which contains the 3D information of the object, $\Delta\varphi_n$ is the phase shift of the $n$-th step, and $n = 1, 2, 3, \ldots, N$ indexes the phase-shifting steps.
The least-squares phase solution method for N-step phase-shifting can be expressed as [21]:
$$\varphi(x, y) = \arctan\left[\frac{\sum_{n=1}^{N} I_n \sin(\Delta\varphi_n)}{\sum_{n=1}^{N} I_n \cos(\Delta\varphi_n)}\right] \quad (2)$$
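To make Equations (1) and (2) concrete, the following minimal NumPy sketch simulates a set of N-step phase-shifted fringes and recovers the wrapped phase by least squares. The image size, fringe period and intensity values are illustrative assumptions, and the sign of the numerator follows the $\cos(\varphi + \Delta\varphi_n)$ convention used above.

```python
import numpy as np

H, W, N = 256, 256, 3
period = 32.0                                   # fringe period in pixels (assumed)
x = np.tile(np.arange(W, dtype=float), (H, 1))  # column-coordinate map

A, B = 128.0, 100.0                             # background and modulation (assumed)
phi_true = 2 * np.pi * x / period               # phase encoding the shape
shifts = 2 * np.pi * np.arange(N) / N           # equal phase shifts, Equation (1)
I = [A + B * np.cos(phi_true + d) for d in shifts]

# Least-squares phase of Equation (2); the minus sign in the numerator matches
# the cos(phi + delta) shift convention used in Equation (1).
num = sum(In * np.sin(d) for In, d in zip(I, shifts))
den = sum(In * np.cos(d) for In, d in zip(I, shifts))
phi = np.arctan2(-num, den)                     # wrapped phase in (-pi, pi]

# Sanity check: recovered phase equals the true phase modulo 2*pi
assert np.allclose(np.angle(np.exp(1j * (phi - phi_true))), 0, atol=1e-6)
```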
To establish the speckle model of the images, randomly generated speckles [22] are added to $I_n(x, y)$. The grey intensity of the speckle-embedded fringe images is then expressed as:
$$I_n'(x, y) = \begin{cases} I_n(x, y), & S_n(x, y) = 255 \\ 0, & S_n(x, y) = 0 \end{cases} \quad (3)$$
where $I_n'(x, y)$ is the grey intensity of the speckle-embedded fringe images, and $S_n(x, y)$ is the randomly generated speckle pattern.
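Continuing the sketch above, a plausible way to apply Equation (3) is to build a random binary speckle mask and zero the fringes where speckles fall. The seed density, blob size and the SciPy dependency are illustrative assumptions, not the authors' settings.

```python
from scipy.ndimage import binary_dilation

rng = np.random.default_rng(0)
seeds = rng.random((H, W)) < 0.05              # ~5% random speckle seeds (assumed)
blobs = binary_dilation(seeds, iterations=2)   # grow seeds into small speckle blobs

S = np.where(blobs, 0, 255)                    # S_n = 0 on speckles, 255 elsewhere
I_speckled = [np.where(S == 255, In, 0.0) for In in I]   # Equation (3)
```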
According to Equations (1) and (3), a speckle-free fringe image and a speckle-embedded fringe image, both with an initial phase of 0 rad and generated via computer simulation, are shown in Figure 2a1 and Figure 2b1, respectively. The grey intensity distributions along the red line through the middle row of each image are shown in Figure 2a2 and Figure 2b2, respectively. The grey intensity distribution of the speckle-free fringe image is a standard sinusoidal curve, while that of the speckle-embedded fringe image is irregular. If PSP is applied directly to the image in Figure 2b1, large phase errors are inevitable. Section 2.2 therefore describes in detail how the fringe pattern is extracted using the deep learning model.

2.2. Fringe and Speckle Pattern Separation Based on Deep Learning Model

As shown in Figure 3, a U-shaped convolutional neural network (CNN) with strong feature extraction performance is constructed [23]. Three-step phase-shifting speckle-embedded fringe images are the input of the CNN, and speckle-free fringe images are its output. The U-shaped structure is divided into two parts: feature extraction and feature fusion. The feature extraction part consists of convolution (Conv) and batch normalization (BN), followed by four repetitions of Conv, BN and Dropout. The feature fusion part consists of four repetitions of transposed convolution (T-Conv), BN and rectified linear unit (ReLU); a T-Conv; and a group of residual structures followed by a Conv. To recover missing information, skip connections link the high-level and low-level feature maps, enabling pixel-level information recovery. The residual structure consists of Conv, a residual block and Conv; it alleviates the vanishing-gradient problem caused by increasing network depth.
The loss function of the CNN is shown in Equation (4):
$$\mathrm{Loss} = \frac{1}{3 \times H \times W} \sum_{n=1}^{3 \times H \times W} \left(I_n^{\mathrm{out}} - I_n\right)^2 \quad (4)$$
where $H$ and $W$ are the height and width of the image, respectively, and $I_n^{\mathrm{out}}$ and $I_n$ are the network output and the given ground-truth output, respectively, indexed over all pixels of the three fringe images.
The adaptive moment estimation (Adam) learning algorithm is applied to train the CNN. The batch size is 2, the initial learning rate is 1 × 10⁻³, and the learning rate is multiplied by 0.1 every 200 training epochs.
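As a concrete illustration of this setup, the following PyTorch sketch wires a deliberately tiny two-level U-shaped network to the MSE loss of Equation (4), the Adam optimizer, batch size 2, an initial learning rate of 1 × 10⁻³ and a ×0.1 step decay every 200 epochs. The miniature architecture is an illustrative stand-in for the deeper network of Figure 3, not the authors' exact model.

```python
import torch
import torch.nn as nn

class TinyUNet(nn.Module):
    def __init__(self):
        super().__init__()
        # Encoder: Conv + BN (+ Dropout) blocks with one downsampling step
        self.enc1 = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1),
                                  nn.BatchNorm2d(16), nn.ReLU())
        self.down = nn.Sequential(nn.Conv2d(16, 32, 3, stride=2, padding=1),
                                  nn.BatchNorm2d(32), nn.Dropout2d(0.1), nn.ReLU())
        # Decoder: T-Conv + BN + ReLU, then fusion after a skip connection
        self.up = nn.Sequential(nn.ConvTranspose2d(32, 16, 2, stride=2),
                                nn.BatchNorm2d(16), nn.ReLU())
        self.out = nn.Conv2d(32, 3, 3, padding=1)    # 3-step fringes in, 3 out

    def forward(self, x):
        e1 = self.enc1(x)
        d = self.up(self.down(e1))
        return self.out(torch.cat([e1, d], dim=1))   # skip connection

model = TinyUNet()
criterion = nn.MSELoss()                             # Equation (4)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=200, gamma=0.1)

# One illustrative training step on dummy data (batch size 2)
x = torch.rand(2, 3, 256, 256)                       # speckle-embedded fringes
y = torch.rand(2, 3, 256, 256)                       # speckle-free targets
loss = criterion(model(x), y)
optimizer.zero_grad(); loss.backward(); optimizer.step(); scheduler.step()
```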

2.3. Three-Dimensional Displacement Calculation

In the combination of DIC and FPP, DIC is mainly used to solve the sub-pixel, high-accuracy image-coordinate displacement of the object before and after deformation, so that the corresponding points before and after deformation can be found. Since DIC reaches an accuracy of around 0.01 pixels, a least-squares curve fitting algorithm is employed to solve the 3D coordinates corresponding to the sub-pixel image coordinates. A first-order polynomial is selected here; for measured objects with complex surfaces, a higher-order polynomial can be employed. The 3D coordinates (X, Y, Z) of the reference subset center and the deformed subset center can be expressed as:
$$\begin{cases} X(x, y) = a_0 + a_1 x + a_2 y \\ Y(x, y) = b_0 + b_1 x + b_2 y \\ Z(x, y) = c_0 + c_1 x + c_2 y \end{cases} \quad (5)$$
where $(x, y)$ denote the local image coordinates within the subset window, and $a_0, a_1, a_2, b_0, b_1, b_2, c_0, c_1, c_2$ are the fitting coefficients.
A (2M + 1) × (2M + 1) subset window, centered at the integer pixel nearest the calculation point, is selected for fitting:
$$P S = W \quad (6)$$
$$P = \begin{bmatrix} 1 & x_0 - M & y_0 - M \\ 1 & x_0 - M & y_0 - M + 1 \\ \vdots & \vdots & \vdots \\ 1 & x_0 + M & y_0 + M - 1 \\ 1 & x_0 + M & y_0 + M \end{bmatrix}, \qquad S = \begin{bmatrix} a_0 & b_0 & c_0 \\ a_1 & b_1 & c_1 \\ a_2 & b_2 & c_2 \end{bmatrix} \quad (7)$$
$$W = \begin{bmatrix} X(x_0 - M, y_0 - M) & Y(x_0 - M, y_0 - M) & Z(x_0 - M, y_0 - M) \\ X(x_0 - M, y_0 - M + 1) & Y(x_0 - M, y_0 - M + 1) & Z(x_0 - M, y_0 - M + 1) \\ \vdots & \vdots & \vdots \\ X(x_0 + M, y_0 + M - 1) & Y(x_0 + M, y_0 + M - 1) & Z(x_0 + M, y_0 + M - 1) \\ X(x_0 + M, y_0 + M) & Y(x_0 + M, y_0 + M) & Z(x_0 + M, y_0 + M) \end{bmatrix} \quad (8)$$
where $(x_0, y_0)$ is the integer pixel nearest $(x, y)$, $P$ is the coordinate matrix of all integer pixels in the selected subset window, $S$ is the fitting coefficient matrix, and $W$ is the matrix of 3D coordinates corresponding to all integer pixel coordinates in the selected subset window.
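A minimal NumPy sketch of this fitting step follows, assuming a coordinate map `XYZ` indexed as `XYZ[x, y]` that holds the FPP-reconstructed (X, Y, Z) at each integer pixel; the function name, the indexing convention and the default `M = 14` (a 29 × 29 window, as used in Section 3.2) are illustrative assumptions.

```python
import numpy as np

def subset_fit_xyz(XYZ, x, y, M=14):
    """3D point at sub-pixel image coordinates (x, y) via Equations (5)-(8)."""
    x0, y0 = int(round(x)), int(round(y))          # nearest integer pixel
    xs = np.arange(x0 - M, x0 + M + 1)
    ys = np.arange(y0 - M, y0 + M + 1)
    gx, gy = np.meshgrid(xs, ys, indexing="ij")

    # P S = W (Equation (6)): each row of P is [1, x, y] for one window pixel
    P = np.column_stack([np.ones(gx.size), gx.ravel(), gy.ravel()])
    Wm = XYZ[gx.ravel(), gy.ravel(), :]            # matching (X, Y, Z) rows
    S, *_ = np.linalg.lstsq(P, Wm, rcond=None)     # 3 x 3 coefficient matrix

    return np.array([1.0, x, y]) @ S               # evaluate Equation (5)
```

The 3D displacement of Equation (9) below then follows by evaluating the same fit on the deformed coordinate map at the DIC-matched sub-pixel position and subtracting, e.g. `subset_fit_xyz(XYZ_def, x + u, y + v) - subset_fit_xyz(XYZ_ref, x, y)`, where `(u, v)` is the sub-pixel DIC displacement.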
Once the fitting coefficients are calculated, the 3D coordinates of the corresponding reference and deformed object points can be obtained using Equation (5). The 3D displacement (U, V, W) in the global world coordinate system can be expressed as:
$$\begin{bmatrix} U \\ V \\ W \end{bmatrix} = \begin{bmatrix} X_2 \\ Y_2 \\ Z_2 \end{bmatrix} - \begin{bmatrix} X_1 \\ Y_1 \\ Z_1 \end{bmatrix} \quad (9)$$
where $(X_1, Y_1, Z_1)$ and $(X_2, Y_2, Z_2)$ represent the 3D coordinates of the reference subset center and the deformed subset center calculated using Equations (5)–(8), respectively.

2.4. Full-Field Strain Measurements

In Section 2.3, the 3D coordinates (X, Y, Z) of the reference subset center and the 3D displacement (U, V, W) in the global world coordinate system were calculated. To calculate the full-field strain of the measured surface, the first step is to establish a local coordinate system by fitting a plane to the local region centered on each point. The local coordinate system takes the outward normal of the fitted plane as the positive Z axis; the X and Y axes are perpendicular to the Z axis and can be specified arbitrarily. The rotation matrix R and translation vector T from the world coordinate system to the local coordinate system are then calculated, and the 3D coordinates before deformation and the 3D displacement are transformed into the corresponding local coordinate systems:
$$\begin{bmatrix} X_e \\ Y_e \\ Z_e \end{bmatrix} = R \begin{bmatrix} X \\ Y \\ Z \end{bmatrix} + T \quad (10)$$
$$\begin{bmatrix} U_e \\ V_e \\ W_e \end{bmatrix} = R \begin{bmatrix} U \\ V \\ W \end{bmatrix} + T \quad (11)$$
where $(X_e, Y_e, Z_e)$ are the 3D coordinates and $(U_e, V_e, W_e)$ is the 3D displacement in the local coordinate system.
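The sketch below shows one plausible way to construct R and T from a local plane fit by SVD; the in-plane axis choice is arbitrary, as the text allows, and this is an illustrative construction rather than the authors' specific one.

```python
import numpy as np

def local_frame(points):
    """Rotation R and translation T taking world points into a local frame
    whose Z axis is the normal of the plane fitted to `points` (a (K, 3) array
    of 3D points around the centre point)."""
    c = points.mean(axis=0)
    _, _, Vt = np.linalg.svd(points - c)       # plane fit: smallest-singular-
    z = Vt[2]                                  # value direction = plane normal
    x = np.cross([0.0, 1.0, 0.0], z)           # arbitrary in-plane X axis
    x /= np.linalg.norm(x)                     # (assumes z is not parallel to Y)
    y = np.cross(z, x)
    R = np.vstack([x, y, z])                   # world -> local rotation
    T = -R @ c                                 # the centre point maps to the origin
    return R, T

# Applying Equations (10)-(11) as written in the text:
# xyz_local = R @ xyz + T;  uvw_local = R @ uvw + T
```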
The displacement field functions are obtained via quadric surface fitting, and the strain tensor is calculated from these field functions. The fitting forms of the field functions are as follows:
$$\begin{cases} U_e = a_{1X} X_e^2 + a_{2X} Y_e^2 + a_{3X} X_e Y_e + a_{4X} X_e + a_{5X} Y_e + a_{6X} \\ V_e = a_{1Y} X_e^2 + a_{2Y} Y_e^2 + a_{3Y} X_e Y_e + a_{4Y} X_e + a_{5Y} Y_e + a_{6Y} \\ W_e = a_{1Z} X_e^2 + a_{2Z} Y_e^2 + a_{3Z} X_e Y_e + a_{4Z} X_e + a_{5Z} Y_e + a_{6Z} \end{cases} \quad (12)$$
where $(a_{1X}, \ldots, a_{6X})$, $(a_{1Y}, \ldots, a_{6Y})$ and $(a_{1Z}, \ldots, a_{6Z})$ are the coefficients of each field function. The Lagrange strain tensor components are then calculated from the following equations:
$$\begin{aligned}
\varepsilon_{xx} &= \frac{\partial U_e}{\partial X_e} + \frac{1}{2}\left[\left(\frac{\partial U_e}{\partial X_e}\right)^2 + \left(\frac{\partial V_e}{\partial X_e}\right)^2 + \left(\frac{\partial W_e}{\partial X_e}\right)^2\right] \\
\varepsilon_{yy} &= \frac{\partial V_e}{\partial Y_e} + \frac{1}{2}\left[\left(\frac{\partial U_e}{\partial Y_e}\right)^2 + \left(\frac{\partial V_e}{\partial Y_e}\right)^2 + \left(\frac{\partial W_e}{\partial Y_e}\right)^2\right] \\
\varepsilon_{zz} &= \frac{\partial W_e}{\partial Z_e} + \frac{1}{2}\left[\left(\frac{\partial U_e}{\partial Z_e}\right)^2 + \left(\frac{\partial V_e}{\partial Z_e}\right)^2 + \left(\frac{\partial W_e}{\partial Z_e}\right)^2\right] \\
\varepsilon_{xy} &= \frac{1}{2}\left(\frac{\partial U_e}{\partial Y_e} + \frac{\partial V_e}{\partial X_e}\right) + \frac{1}{2}\left(\frac{\partial U_e}{\partial X_e}\frac{\partial U_e}{\partial Y_e} + \frac{\partial V_e}{\partial X_e}\frac{\partial V_e}{\partial Y_e} + \frac{\partial W_e}{\partial X_e}\frac{\partial W_e}{\partial Y_e}\right) \\
\varepsilon_{yz} &= \frac{1}{2}\left(\frac{\partial V_e}{\partial Z_e} + \frac{\partial W_e}{\partial Y_e}\right) + \frac{1}{2}\left(\frac{\partial U_e}{\partial Y_e}\frac{\partial U_e}{\partial Z_e} + \frac{\partial V_e}{\partial Y_e}\frac{\partial V_e}{\partial Z_e} + \frac{\partial W_e}{\partial Y_e}\frac{\partial W_e}{\partial Z_e}\right) \\
\varepsilon_{zx} &= \frac{1}{2}\left(\frac{\partial W_e}{\partial X_e} + \frac{\partial U_e}{\partial Z_e}\right) + \frac{1}{2}\left(\frac{\partial U_e}{\partial Z_e}\frac{\partial U_e}{\partial X_e} + \frac{\partial V_e}{\partial Z_e}\frac{\partial V_e}{\partial X_e} + \frac{\partial W_e}{\partial Z_e}\frac{\partial W_e}{\partial X_e}\right)
\end{aligned} \quad (13)$$
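As a worked illustration, the sketch below differentiates the quadric field functions of Equation (12) analytically, evaluates the derivatives at the local origin ($X_e = Y_e = 0$), and forms the corresponding Lagrange strains of Equation (13). Since the surface fit provides no $\partial/\partial Z_e$ information, only $\varepsilon_{xx}$, $\varepsilon_{yy}$ and $\varepsilon_{xy}$ are computed, consistent with Section 3.3.

```python
import numpy as np

def lagrange_strain(aU, aV, aW):
    """aU, aV, aW: coefficient arrays [a1, a2, a3, a4, a5, a6] of Equation (12)
    for U_e, V_e, W_e. Returns (e_xx, e_yy, e_xy) at the local origin."""
    # d/dXe of a1*X^2 + a2*Y^2 + a3*X*Y + a4*X + a5*Y + a6 at (0, 0) is a4;
    # d/dYe at (0, 0) is a5.
    dU = np.array([aU[3], aU[4]])              # [dUe/dXe, dUe/dYe]
    dV = np.array([aV[3], aV[4]])
    dW = np.array([aW[3], aW[4]])

    exx = dU[0] + 0.5 * (dU[0]**2 + dV[0]**2 + dW[0]**2)
    eyy = dV[1] + 0.5 * (dU[1]**2 + dV[1]**2 + dW[1]**2)
    exy = 0.5 * (dU[1] + dV[0]) + 0.5 * (dU[0]*dU[1] + dV[0]*dV[1] + dW[0]*dW[1])
    return exx, eyy, exy
```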

3. Experiments and Results

To verify the effectiveness of the proposed method, three sets of experiments were conducted: fringe image extraction, 3D displacement measurement and full-field strain measurement.

3.1. Fringe Image Extraction

As shown in Figure 4, the experimental system includes a DLP projector (LightCrafter 4500; Texas Instruments, Dallas, TX, USA) with a resolution of 912 × 1140 pixels, and an IDS UI-3370CP camera (IDS, Obersulm, Germany) with a resolution of 2048 × 2048 pixels operating at 80 frames per second. A customized acrylic circular plate with a radius of 90 mm is employed as the measured object. The distances from the camera and the projector to the circular plate are about 750 mm, and the angle between the camera and the projector is about 30°.
To simulate the real situation, real experimental images captured by the camera serve as the training and testing datasets of the CNN. The projector projects a set of three-step phase-shifting speckle-embedded fringe images and a set of three-step phase-shifting speckle-free fringe images onto the circular plate, and 240 training image sets are captured by adjusting the projected speckle size. Next, speckles are generated using Equation (3) and transferred onto the circular plate via water transfer, as shown in Figure 4b. The projector then projects a set of three-step phase-shifting speckle-free fringe images, which the camera captures as the testing dataset. GPU (NVIDIA GeForce RTX 2080 Ti, 32 GB of RAM; NVIDIA, Santa Clara, CA, USA) and CPU (Intel Xeon Platinum, 96 GB of RAM; Intel, Santa Clara, CA, USA) multithreading techniques are employed to accelerate training.
Figure 5a shows the input fringe images of the CNN and the grey intensity along the green line; Figure 5b shows the fringe images predicted by the CNN and the grey intensity along the green line. The CNN clearly separates the fringe images and improves their quality.

3.2. Three-Dimensional Displacement Measurements

The circular plate is rotated by about 5 degrees, and speckle-embedded fringe images before and after rotation are captured. The fringe images are extracted by the CNN, and the phase values of the circular plate before and after deformation are calculated using the three-step phase-shifting algorithm. The parameters of the experimental system are calibrated using the method proposed by Zhang [24], and the coordinates of the circular plate in the X, Y and Z directions of the global world coordinate system are obtained.
Speckle images are obtained from the background grey intensity of the phase-shifting speckle-embedded fringe images, and the circular area in the middle of the circular plate is selected as the calculation area. In Figure 6, (a) shows the x-direction displacement of the circular plate along the y direction of the image, and (b) shows the y-direction displacement along the x direction of the image; the units are pixels.
The correspondence before and after rotation is found using the DIC algorithm of Section 2.3, with a fitting subset window of 29 × 29 pixels. According to Equations (5)–(8), polynomial fitting establishes the corresponding 3D coordinate relationship before and after rotation, and the displacement in the three directions is obtained by subtracting the 3D coordinates of corresponding points. The full-field displacements in the three directions are shown in Figure 7.
The total displacement of the circular plate can be expressed as:
$$D = \sqrt{D_x^2 + D_y^2 + D_z^2} \quad (14)$$
where $D$ is the total displacement, and $D_x$, $D_y$ and $D_z$ denote the displacement in the X, Y and Z directions, respectively.
Figure 8a shows a stereogram of the full-field displacement, and Figure 8b shows a planar graph of it. The full-field displacement is funnel-shaped, with nearly zero displacement in the middle and large displacement at the periphery; it increases linearly along the radial direction. Figure 9a shows the total displacement along the red dashed line; an accurate reference displacement is obtained via Fourier curve fitting. Figure 9b shows the error between the total displacement and the fitted displacement. The error values are mainly distributed within ±0.02 mm, with only a few points exceeding this range. In this experiment, the circular plate occupies about half of the camera's field of view, and the measurement error is only about 0.006%.

3.3. Full-Field Strain Measurements

Strain reflects the relative deformation of an object under stress and is an essential quantity for characterizing the mechanical properties of objects. As shown in Figure 10, camera 1 and the projector form the combined FPP–DIC measurement system, while camera 1 and camera 2 constitute a 3D-DIC measurement system; the cameras are placed symmetrically about the projector. The distance from camera 1 to the three-point bending specimen is about 750 mm, the distance from the projector to the specimen is about 850 mm, and the angle between camera 1 and the projector is about 30°. The two methods are used to measure the strain in the red dotted area of the three-point bending specimen.
Since deformation below the measured surface cannot be observed, only εxx, εxy and εyy can be calculated. Figure 11a1,b1,c1 show the full-field strain distributions calculated using 3D-DIC; Figure 11a2,b2,c2 show the full-field strain distributions calculated using the method of this paper; and Figure 11a3,b3,c3 show the strains solved using the two methods along the red dotted line. The distribution of the full-field strain calculated with the proposed method is essentially the same as that calculated using 3D-DIC, and the strain difference between the two methods is only about 1 × 10⁻⁴. The proposed method can therefore measure strain accurately.

4. Discussion of Measurement Results under Different Exposure Times

Next, the applicability of the proposed measurement method in high- and low-exposure experimental scenes is discussed.
In the experiments of this paper, the grey intensity of the image is in a reasonable range when the camera exposure time is 40 ms. The exposure time was set to 20 ms for the low-exposure experimental scene and to 80 ms for the high-exposure experimental scene. The experimental images collected under the different exposure times are shown in Figure 12.
Figure 13 compares the 3D displacement errors along a line under exposure times of 20 ms, 40 ms and 80 ms. The error reaches ±0.05 mm at high and low exposure. As shown in Table 1, the RMSE of displacement measured at low exposure (20 ms) is 0.02 mm, at normal exposure (40 ms) is 0.01 mm, and at high exposure (80 ms) is 0.03 mm. The proposed method can therefore be used in both high- and low-exposure experimental scenes with high accuracy.

5. Conclusions

In this paper, a high-accuracy 3D deformation measurement system based on fringe projection and speckle correlation is proposed, which effectively eliminates the influence of speckles on fringe image quality and solves the key difficulty in combining DIC with FPP. The method achieves high accuracy in displacement measurement, can be applied in both high- and low-exposure experimental scenes, and yields strain measurements consistent with 3D-DIC.
The measurement system must be calibrated before each measurement, and a high-accuracy calibration board should be used whenever possible. The system can also be calibrated by measuring a standard specimen or a standard deformation.

Author Contributions

Conceptualization, C.Z. and C.L.; methodology, C.Z.; software, C.L.; validation, C.Z., C.L. and Z.X.; formal analysis, C.Z.; investigation, C.Z.; resources, C.Z. and Z.X.; data curation, C.Z.; writing—original draft preparation, C.Z.; writing—review and editing, C.Z.; visualization, C.Z.; supervision, C.L.; project administration, C.L.; funding acquisition, C.Z., C.L. and Z.X. All authors have read and agreed to the published version of the manuscript.

Funding

This study was funded by the National Natural Science Foundation of China (Grant No. 11802132), the Natural Science Foundation of Jiangsu Province (Grant No. BK20180446), the China Postdoctoral Science Foundation (Grant Nos. 2020M671493 and 2019M652433), the Jiangsu Planned Projects for Postdoctoral Research Funds, and the Natural Science Foundation of Shandong Province (Grant No. ZR2018BF001).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Acknowledgments

We thank the High-Performance Computing Center of Nanjing University of Science and Technology for providing computing assistance.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Gao, Z.; Kato, T.; Takahashi, H.; Doi, A. 3D Measurement and Feature Extraction for Metal Nuts. In Advances in Networked-Based Information Systems; Barolli, L., Chen, H.C., Enokido, T., Eds.; Lecture Notes in Networks and Systems; Springer International Publishing: Cham, Switzerland, 2022; Volume 313, pp. 299–305.
  2. Midgett, D.; Thorn, S.; Ahn, S.; Uman, S.; Avendano, R.; Melvinsdottir, I.; Lysyy, T.; Kim, J.; Duncan, J.; Humphrey, J.; et al. CineCT platform for in vivo and ex vivo measurement of 3D high resolution Lagrangian strains in the left ventricle following myocardial infarction and intramyocardial delivery of theranostic hydrogel. J. Mol. Cell. Cardiol. 2022, 166, 74–90.
  3. Herráez, J.; Martínez, J.C.; Coll, E.; Martín, M.T.; Rodríguez, J. 3D modeling by means of videogrammetry and laser scanners for reverse engineering. Measurement 2016, 87, 216–227.
  4. Geng, J. Structured-light 3D surface imaging: A tutorial. Adv. Opt. Photon. 2011, 3, 128–160.
  5. Yu, C.; Ji, F.; Xue, J.; Wang, Y. Adaptive Binocular Fringe Dynamic Projection Method for High Dynamic Range Measurement. Sensors 2019, 19, 4023.
  6. Zhang, S. Recent progresses on real-time 3D shape measurement using digital fringe projection techniques. Opt. Lasers Eng. 2010, 48, 149–158.
  7. Zuo, C.; Chen, Q.; Gu, G.; Feng, S.; Feng, F. High-speed three-dimensional profilometry for multiple objects with complex shapes. Opt. Express 2012, 20, 19493.
  8. Hu, Y.; Chen, Q.; Feng, S.; Zuo, C. Microscopic fringe projection profilometry: A review. Opt. Lasers Eng. 2020, 135, 106192.
  9. Yang, S.; Huang, H.; Wu, G.; Wu, Y.; Yang, T.; Liu, F. High-speed three-dimensional shape measurement with inner shifting-phase fringe projection profilometry. Chin. Opt. Lett. 2022, 20, 112601.
  10. Yu, L.; Pan, B. Single-camera high-speed stereo-digital image correlation for full-field vibration measurement. Mech. Syst. Signal Process. 2017, 94, 374–383.
  11. Tekieli, M.; De Santis, S.; de Felice, G.; Kwiecień, A.; Roscini, F. Application of Digital Image Correlation to composite reinforcements testing. Compos. Struct. 2017, 160, 670–688.
  12. Gorthi, S.S.; Rastogi, P. Fringe projection techniques: Whither we are? Opt. Lasers Eng. 2010, 48, 133–140.
  13. Zheng, D.; Da, F.; Kemao, Q.; Seah, H.S. Phase-shifting profilometry combined with Gray-code patterns projection: Unwrapping error removal by an adaptive median filter. Opt. Express 2017, 25, 4700–4713.
  14. Wei, Y.; Lu, L.; Xi, J. Reconstruction of moving object with single fringe pattern based on phase shifting profilometry. Opt. Eng. 2021, 60, 084106.
  15. Ding, Y.; Lu, P.; He, B.; Huang, X.; Li, G.; Wang, Z.; Zhou, Y.; Zhu, Z. Speckle Deformation Measurement Based on Pixel Correlation Search Method. In Proceedings of the 2021 IEEE 5th Advanced Information Technology, Electronic and Automation Control Conference (IAEAC), Chongqing, China, 12–14 March 2021; pp. 2525–2529.
  16. Liu, Q.; Looi, D.T.-W.; Chen, H.H.; Tang, C.; Su, R.K.L. Framework to optimise two-dimensional DIC measurements at different orders of accuracy for concrete structures. Structures 2020, 28, 93–105.
  17. Zhou, K.; Lei, D.; He, J.; Zhang, P.; Bai, P.; Zhu, F. Real-time localization of micro-damage in concrete beams using DIC technology and wavelet packet analysis. Cem. Concr. Compos. 2021, 123, 104198.
  18. Shi, H.; Ji, H.; Yang, G.; He, X. Shape and deformation measurement system by combining fringe projection and digital image correlation. Opt. Lasers Eng. 2013, 51, 47–53.
  19. Felipe-Sesé, L.; López-Alba, E.; Siegmann, P.; Díaz, F.A. Integration of fringe projection and two-dimensional digital image correlation for three-dimensional displacements measurements. Opt. Eng. 2016, 55, 121711.
  20. Yu, H.; Zheng, D.; Fu, J.; Zhang, Y.; Zuo, C.; Han, J. Deep learning-based fringe modulation-enhancing method for accurate fringe projection profilometry. Opt. Express 2020, 28, 21692.
  21. Zhang, S.; Yau, S.-T. High-resolution, real-time 3D absolute coordinate measurement based on a phase-shifting method. Opt. Express 2006, 14, 2644.
  22. Yamaguchi, I. Digital simulation of speckle patterns. In Proceedings of Speckle 2018: VII International Conference on Speckle Metrology, Janów Podlaski, Poland, 9–12 September 2018; Volume 10834, p. 1083409.
  23. Ronneberger, O.; Fischer, P.; Brox, T. U-Net: Convolutional networks for biomedical image segmentation. In Medical Image Computing and Computer-Assisted Intervention—MICCAI 2015; Navab, N., Hornegger, J., Wells, W.M., Frangi, A.F., Eds.; Lecture Notes in Computer Science; Springer International Publishing: Cham, Switzerland, 2015; Volume 9351, pp. 234–241.
  24. Zhang, S.; Huang, P.S. Novel method for structured light system calibration. Opt. Eng. 2006, 45, 083601.
Figure 1. Flow chart of the proposed measurement method.
Figure 2. Effect of speckle on fringe image quality. (a1,a2) The speckle-free fringe image and grey intensity, respectively; (b1,b2) the speckle-embedded fringe image and grey intensity, respectively.
Figure 3. U-shaped CNN structure.
Figure 4. The experimental setup. (a) Circular plate without speckles; (b) circular plate with speckles; (c) the experimental setup.
Figure 5. Experimental results. (a) The input speckle-embedded fringe images of the CNN and grey intensity; (b) the output speckle-free fringe images of the CNN and grey intensity.
Figure 6. The displacement at integer pixel positions. (a) Displacement in the x direction; (b) displacement in the y direction.
Figure 7. Full-field displacement in the X, Y and Z directions.
Figure 8. The total displacement. (a) The stereogram of the full-field displacement; (b) the planar graph of the full-field displacement.
Figure 9. The displacement at the dashed line. (a) Comparison of experimental results and fitting results; (b) error.
Figure 10. Experimental setup.
Figure 11. The distribution of full-field strain. (a1,b1,c1) The full-field strain distributions calculated using 3D-DIC; (a2,b2,c2) the full-field strain distributions calculated using the method of this paper; (a3,b3,c3) the strain solved using the two methods at the red dotted line.
Figure 12. Images acquired at different exposure times. (a) Exposure time of 20 ms; (b) exposure time of 40 ms; (c) exposure time of 80 ms.
Figure 13. Comparison of displacement measurement error under different exposure times.
Table 1. Root mean square error of displacement measurement (mm).

Exposure Time    20 ms    40 ms    80 ms
RMSE             0.02     0.01     0.03