Article

Scheimpflug Camera-Based Technique for Multi-Point Displacement Monitoring of Bridges

School of Geosciences and Info-Physics, Central South University, Changsha 410083, China
*
Author to whom correspondence should be addressed.
Sensors 2022, 22(11), 4093; https://doi.org/10.3390/s22114093
Submission received: 5 May 2022 / Revised: 25 May 2022 / Accepted: 26 May 2022 / Published: 27 May 2022
(This article belongs to the Special Issue Low-Cost Optical Sensors)

Abstract
Owing to the limited field of view (FOV) and depth of field (DOF) of a conventional camera, it is quite difficult to employ a single conventional camera to simultaneously measure high-precision displacements at many points on a bridge spanning dozens or hundreds of meters. Researchers have attempted to obtain a large FOV and wide DOF with multi-camera systems; however, as the number of cameras grows, the cost, complexity and instability of such systems increase rapidly. This study proposes a multi-point displacement measurement method for bridges based on a low-cost Scheimpflug camera. The Scheimpflug camera, which meets the Scheimpflug condition, can enlarge the depth of field of the camera without reducing the lens aperture and magnification; thus, when the measurement points are aligned in the depth direction, all points can be clearly observed in a single field of view with a high-power zoom lens. To reduce the impact of camera motions, a motion compensation method for the Scheimpflug camera is proposed, exploiting the characteristic that the image plane is not perpendicular to the lens axis in the Scheimpflug camera. Several tests were conducted for performance verification under diverse settings. The results showed that the motion errors in the x and y directions were reduced by at least 62% and 92%, respectively, using the proposed method, and the measurements of the camera were highly consistent with LiDAR-based measurements.

1. Introduction

Structural health monitoring runs through the entire life cycle of civil engineering structures, and displacement measurement is an important technique in structural health monitoring. Currently, many types of sensors can be used to measure structural displacements, such as linear variable differential transformers (LVDTs) [1], laser Doppler vibrometers (LDVs) [2], global navigation satellite systems (GNSS) [3,4,5], total stations and image assisted total stations (IATS) [6,7,8]. However, the application of these sensors has certain limitations. For example, both the LVDT and LDV are limited by the measurement distance, making them impractical for large-scale field measurements. The GNSS is limited by insufficient measurement accuracy for high dynamic responses of the structure; its real-time accuracy only reaches the centimeter level [5]. The total station is a high-precision non-contact sensor and is widely recognized, but it cannot fulfill the multi-point measurement requirement. To overcome this shortcoming, the latest development of total stations, called IATS, integrates a robotic total station with image sensors, combining the advantages of high precision and multi-point measurement. However, the high cost of IATS restricts its extensive application [7].
Vision-based sensors provide a cost-effective, simple alternative for non-contact displacement measurement, and have been applied to various fields of structural displacement measurements [9,10], for example, wind tunnel tests of high-rise buildings [11], vibrational displacement measurements [12], defect detection [13] of bridges and slope deformation monitoring [14]. To determine displacements on distant bridges using vision-based sensors, scholars have proposed many approaches and made some achievements. Examples include dynamic displacement monitoring of bridges and high-rise buildings based on the grey centroid method [15], high frame rate (HFR) monitoring with artificial targets [10] and deflection measurements of bridges using a novel laser and video-based displacement transducer [16]. Recent studies have focused on improving the practicability of visual techniques [17]. For example, highly robust target localization methods have been designed to cope with complex illumination conditions [18]; additional sensors, such as the laser collimator [19] and total station [20], have been employed to compensate for camera motions [21]; some valuable investigations have been conducted to model or correct the thermal effect on camera image sensors [22,23].
Although scholars have proposed many practical methods for various challenges in the field, there are still many difficulties in applying visual techniques to bridge displacement monitoring. One of the main problems is that it is quite difficult to simultaneously measure displacements at many points on a bridge spanning dozens or hundreds of meters. Several multi-camera approaches [24,25,26,27] have been reported. Generally, a multi-camera system can be regarded as a combination of multiple single-camera systems, where each camera measures some points at different regions of the structure surface. Although a multi-camera system produces a sufficiently large effective field of view, the installation of multiple cameras is cumbersome and time-consuming. Moreover, with an increase in the number of cameras, the cost and uncertainty of the system increase. Therefore, it is still of practical significance to study single-camera methods. Aliansyah et al. [28] proposed installing a single camera at the front of the bridge to enable the measurement points fixed along the road direction of the bridge to be observed in a single FOV with a high-power zoom lens. However, for conventional cameras, the small DOF at high magnifications becomes problematic when focusing on all points that lie in or close to a plane that is not parallel to the image plane.
In this work, an alternative Scheimpflug camera-based approach for the multi-point displacement monitoring of bridges is proposed. The Scheimpflug camera, which satisfies the Scheimpflug condition by tilting the lens with respect to the image plane, extends the DOF of the camera without reducing the lens aperture and magnification, making it possible to place a single camera at the front of the bridge to capture clear images of all measurement points distributed along the road direction of the bridge. Scheimpflug cameras have been applied to 3D digital image correlation [29], line structured light [30] and other fields [31]. However, to the best of our knowledge, Scheimpflug camera-based methods are rarely used in the literature on the multi-point displacement monitoring of bridges.
The remainder of this paper is organized as follows: Section 2 describes the limitations of multi-point displacement measurement using a conventional camera in detail and introduces the configuration and algorithms of the multi-point displacement monitoring of bridges using a single Scheimpflug camera. In Section 3, three tests are included. The first test evaluates the performance of camera motion compensation in the method with the help of a slide table. The second test evaluates the robustness of the method for long-distance measurements in outdoor environments. The third test is conducted on a truss structure bridge model and demonstrates the applicability of the proposed method and Scheimpflug camera-based system. Section 4 discusses some practical issues when applying the Scheimpflug camera to actual bridge monitoring, and Section 5 concludes this paper.

2. Materials and Methods

2.1. Limitations of Multi-Point Displacement Measurement Using Conventional Camera

Vision-based measurement systems have been widely used for defect detection and displacement measurements of bridges [32]. However, the following limitations remain to be solved for the multi-point displacement monitoring of bridges.

2.1.1. The Contradiction between a Wide FOV and High-Resolution

The simplest method of acquiring multi-point displacement data on a bridge is to install the camera at the side of the bridge to capture side-view images, reducing the camera magnification so that all points are observed in a single camera view. However, because of the limited number of pixels integrated on the image sensor, reducing the camera magnification decreases the effective image resolution, thus affecting the accuracy of the displacement measurement, as illustrated in Figure 1a.

2.1.2. Narrow DOF at High Magnification

Generally, there are large inaccessible areas on the side of the bridge, whereas the front of the bridge is open and accessible along the road direction. Therefore, installing the camera at the front of the bridge and capturing measurement points arranged along the road direction in a single front-view without reducing the magnification has become the main mode of bridge displacement measurement. However, because of the narrow DOF of conventional cameras at high magnifications, capturing all measurement points clearly in a single front-view is quite difficult, as illustrated in Figure 1b. Aliansyah et al. [28] considered that lens blur does not significantly reduce the localization accuracy of the target; however, this assumption is not always practically applicable. A small lens aperture helps to extend the DOF, but it produces dark images owing to insufficient incident light. Therefore, a better approach for extending the DOF of the camera without reducing the magnification and lens aperture is necessary.

2.2. Multi-Point Displacement Measurement of Bridges Using Scheimpflug Camera

2.2.1. Scheimpflug Camera-Based Measurement System and Displacement Calculation Algorithm

The Scheimpflug principle states that the focus plane (the plane on which the camera is focused), thin lens plane and image plane intersect in a single line, which is called the Scheimpflug line (Figure 2). In this case, the DOF of the camera is extended. Based on this principle, a robust, high-precision and low-cost displacement measurement system was designed in this study, which can clearly observe all measurement points distributed along the depth direction in a single camera view with high magnification. The system contains a Scheimpflug camera, a tripod, a laptop PC for camera control and several targets; their placement in bridge monitoring is illustrated in Figure 3. The Scheimpflug camera was installed at a stable area in front of the bridge, and each target was installed outside the bridge with its pattern facing the longitudinal direction of the bridge. The pattern of the target had two cross-shaped corners; thus, the scale conversion factors (mm/pixel) could be calculated easily. The Scheimpflug camera and all targets were arranged approximately in a line. The target installed on the stable platform (usually the pier) of the bridge was used as a reference for compensating camera motions. To facilitate the description of the algorithm in Section 2.2.2, the reference targets on two adjacent piers and the measuring targets between them were defined as a measuring unit, as shown in Figure 3.
The Scheimpflug camera used in this study includes three components: an 8-bit CMOS sensor with a spatial resolution of 4096 × 2160 pixels, employed to record the target images; a telephoto lens (focal length 135 mm); and a custom-made Scheimpflug adapter. The adapter was machined by a computer numerical control (CNC) system and can tilt the sensor around the vertical axis within a range of approximately ±10°. The Scheimpflug adapter costs only about $100. The horizontal (H), vertical (V) and depth (D) directions of the Scheimpflug camera are defined as shown in Figure 4.
The displacement calculation algorithm mainly includes three steps. First, the image coordinates of the targets are detected; to improve the accuracy and robustness of the localization of the cross-shaped targets, the sub-pixel method proposed by Duda and Frese [33] is utilized in this paper. Then, the sub-pixel displacement in the image plane is obtained by calculating the difference between the centers of the targets in the continuous image sequence. Finally, the scale conversion factors in the corresponding directions are solved to convert image displacement into physical displacement; they are obtained by comparing the physical dimension of the target with its pixel dimension in the image plane. It is assumed that the camera optical axis is almost perpendicular to the target plane. Therefore, the horizontal scale conversion factor $s_x$ and the vertical conversion factor $s_y$ can be solved as in Equation (1):
$$s_x = s_y = \frac{D_{physical}}{d_{image}}.\tag{1}$$
Physical displacement (Mx, My) can be obtained by multiplying the corresponding scale conversion factors:
$$M_x = s_x \times d_{Ix},\qquad M_y = s_y \times d_{Iy},\tag{2}$$
where dIx and dIy are the horizontal and vertical displacements in the image plane.
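As a minimal illustration of Equations (1) and (2), the conversion from image displacement to physical displacement can be sketched as follows; the function names and numeric values here are our own, chosen only for illustration:

```python
# Illustrative sketch of Equations (1)-(2); names and values are ours.

def scale_factor(physical_dim_mm, image_dim_px):
    """Equation (1): s_x = s_y = D_physical / d_image (mm/pixel).
    physical_dim_mm is the known distance between the two cross-shaped
    corners on the target; image_dim_px is the same distance in pixels."""
    return physical_dim_mm / image_dim_px

def physical_displacement(d_ix_px, d_iy_px, s):
    """Equation (2): physical displacement = scale factor x image displacement."""
    return s * d_ix_px, s * d_iy_px

# Example: a 100 mm target spanning 250 px gives s = 0.4 mm/pixel,
# so a 2.5 px image shift corresponds to 1.0 mm of physical motion.
s = scale_factor(100.0, 250.0)
mx, my = physical_displacement(2.5, 0.0, s)
```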

2.2.2. Motion Compensation of Scheimpflug Camera

When a camera is installed for monitoring a full-scale structure, unexpected camera motion is unavoidable. Even if the camera is firmly fixed at a stationary point, its self-weight induces an inevitable, gradual movement of the entire system. In addition, cameras may be shaken by strong winds or ground vibrations in the field. Thus, compensation of camera motions is necessary to ensure the accuracy of the displacement measurement. At present, utilizing fixed reference targets [34,35] to compensate for camera motion is the most common and practical approach, but existing methods have not considered the case in which the image plane is not perpendicular to the lens axis, as in the Scheimpflug camera. To solve this problem, this paper uses two reference targets to build translational and rotational motion models for the Scheimpflug camera and thereby reduce the impact of camera motion.
Figure 5 depicts the camera motion, which consists of translation and rotation. The translation in the z direction of the camera can be ignored compared to the measured distance. Because the size of the camera in the z direction is larger than its size in the x and y directions, when the camera is firmly fixed, its rotation around the z-axis is very small and can also be ignored. The translation in the x and y directions causes additional displacement errors, $ct_x$ and $ct_y$, in the image coordinate. The rotation around the x-axis and y-axis causes additional displacement errors, $cr_x$ and $cr_y$, in the image coordinate. Here, the x, y and z directions correspond to the horizontal, vertical and depth directions of the Scheimpflug camera, as shown in Figure 4.
Assume that there is a measuring unit (two reference targets and a measuring target located between them) to be measured in the image plane, as shown in Figure 6. The centers of the two reference targets are A and B, and the center of the measuring target is P; each center is obtained by averaging the image coordinates of the two cross-shaped corners on the target. The scale conversion factors (mm/pixel) of the planes where the targets are located are $s_x^A$, $s_y^A$, $s_x^P$, $s_y^P$, $s_x^B$ and $s_y^B$, and the width and height of the image are W and H, respectively. Therefore, the physical displacements of targets P, A and B without camera motion compensation can be expressed by the following formulas:
$$M_x^{P,Uncorrected}=M_x^{P,Corrected}+ct_x^P\times s_x^P+cr_x^P\times s_x^P,\qquad M_y^{P,Uncorrected}=M_y^{P,Corrected}+ct_y^P\times s_y^P+cr_y^P\times s_y^P,\tag{3}$$
$$M_x^A=ct_x^A\times s_x^A+cr_x^A\times s_x^A,\qquad M_x^B=ct_x^B\times s_x^B+cr_x^B\times s_x^B,\tag{4}$$
$$M_y^A=ct_y^A\times s_y^A+cr_y^A\times s_y^A,\qquad M_y^B=ct_y^B\times s_y^B+cr_y^B\times s_y^B,\tag{5}$$
where $M_x^{P,Corrected}$ and $M_y^{P,Corrected}$ are the corrected physical displacements in the x and y directions. The displacements of the two reference targets A and B are caused only by camera motions.
1. Errors caused by camera rotation around the x-axis and y-axis
When the camera rotates around the x-axis or y-axis, it only causes an error in the y direction ($cr_y$) or the x direction ($cr_x$), respectively. First, the influence of camera rotation about the y-axis on the displacement error was analyzed, as shown in Figure 7. Assume that the camera rotates clockwise around the focal point f by an angle θ. For demonstration purposes, the rotation of the camera was replaced by rotations of the targets P, A and B; after the rotation, the positions of the targets become P′, A′ and B′. o is the center point of the image; $p(x_p, y_p)$ and $p'(x_{p'}, y_{p'})$ are the image points before and after the rotation for measuring point P; $a(x_a, y_a)$ and $a'(x_{a'}, y_{a'})$ are those for reference point A; and $b(x_b, y_b)$ and $b'(x_{b'}, y_{b'})$ are those for reference point B. When analyzing the changes in the targets induced by the rotation of the Scheimpflug camera, the tilt angle (α) of the sensor should be considered; thus, the changes in the targets are discussed under two conditions.
(1) When the image sensor tilts right (the definitions of left and right are depicted in Figure 4), the changes in the targets P, A and B in the image coordinate can be derived from Figure 7a:
$$cr_x^P=\begin{cases}\dfrac{l_{fp}\tan\theta\,\cos\angle ofp}{s\,\cos(\angle ofp+\alpha)}, & x_p-\frac{W}{2}>0\\[1mm]\dfrac{l_{fp}\tan\theta\,\cos\angle ofp}{s\,\cos(\angle ofp-\alpha)}, & x_p-\frac{W}{2}\le 0\end{cases}\quad
cr_x^A=\begin{cases}\dfrac{l_{fa}\tan\theta\,\cos\angle ofa}{s\,\cos(\angle ofa+\alpha)}, & x_a-\frac{W}{2}>0\\[1mm]\dfrac{l_{fa}\tan\theta\,\cos\angle ofa}{s\,\cos(\angle ofa-\alpha)}, & x_a-\frac{W}{2}\le 0\end{cases}\quad
cr_x^B=\begin{cases}\dfrac{l_{fb}\tan\theta\,\cos\angle ofb}{s\,\cos(\angle ofb+\alpha)}, & x_b-\frac{W}{2}>0\\[1mm]\dfrac{l_{fb}\tan\theta\,\cos\angle ofb}{s\,\cos(\angle ofb-\alpha)}, & x_b-\frac{W}{2}\le 0\end{cases}\tag{6}$$
where the rotation angle θ is treated as very small because the camera rotation is very small in practice; s is the pixel size (3.45 µm/pixel in this study), which is equal in the x and y directions; $l_{fp}=f/\cos\angle ofp$, $l_{fa}=f/\cos\angle ofa$, $l_{fb}=f/\cos\angle ofb$; $\angle ofp=\arctan((x_p-W/2)\times s/f)$, $\angle ofa=\arctan((x_a-W/2)\times s/f)$, $\angle ofb=\arctan((x_b-W/2)\times s/f)$; and f is the focal length.
Here, the proportion between the changes in the two reference targets is as follows:
$$\frac{cr_x^A}{cr_x^B}=\begin{cases}\dfrac{\cos(\angle ofb+\alpha)}{\cos(\angle ofa+\alpha)}, & x_a-\frac{W}{2}>0 \text{ and } x_b-\frac{W}{2}>0\\[1mm]\dfrac{\cos(\angle ofb-\alpha)}{\cos(\angle ofa-\alpha)}, & x_a-\frac{W}{2}\le 0 \text{ and } x_b-\frac{W}{2}\le 0\\[1mm]\dfrac{\cos(\angle ofb-\alpha)}{\cos(\angle ofa+\alpha)}, & x_a-\frac{W}{2}>0 \text{ and } x_b-\frac{W}{2}\le 0\\[1mm]\dfrac{\cos(\angle ofb+\alpha)}{\cos(\angle ofa-\alpha)}, & x_a-\frac{W}{2}\le 0 \text{ and } x_b-\frac{W}{2}>0\end{cases}\tag{7}$$
and the proportion between the changes in measuring target P and reference target A is as follows:
$$\frac{cr_x^A}{cr_x^P}=\begin{cases}\dfrac{\cos(\angle ofp+\alpha)}{\cos(\angle ofa+\alpha)}, & x_a-\frac{W}{2}>0 \text{ and } x_p-\frac{W}{2}>0\\[1mm]\dfrac{\cos(\angle ofp-\alpha)}{\cos(\angle ofa-\alpha)}, & x_a-\frac{W}{2}\le 0 \text{ and } x_p-\frac{W}{2}\le 0\\[1mm]\dfrac{\cos(\angle ofp-\alpha)}{\cos(\angle ofa+\alpha)}, & x_a-\frac{W}{2}>0 \text{ and } x_p-\frac{W}{2}\le 0\\[1mm]\dfrac{\cos(\angle ofp+\alpha)}{\cos(\angle ofa-\alpha)}, & x_a-\frac{W}{2}\le 0 \text{ and } x_p-\frac{W}{2}>0\end{cases}\tag{8}$$
(2) When the image sensor tilts left, the changes in the targets P, A and B in the image coordinate can be derived from Figure 7b.
$$cr_x^P=\begin{cases}\dfrac{l_{fp}\tan\theta\,\cos\angle ofp}{s\,\cos(\angle ofp-\alpha)}, & x_p-\frac{W}{2}>0\\[1mm]\dfrac{l_{fp}\tan\theta\,\cos\angle ofp}{s\,\cos(\angle ofp+\alpha)}, & x_p-\frac{W}{2}\le 0\end{cases}\quad
cr_x^A=\begin{cases}\dfrac{l_{fa}\tan\theta\,\cos\angle ofa}{s\,\cos(\angle ofa-\alpha)}, & x_a-\frac{W}{2}>0\\[1mm]\dfrac{l_{fa}\tan\theta\,\cos\angle ofa}{s\,\cos(\angle ofa+\alpha)}, & x_a-\frac{W}{2}\le 0\end{cases}\quad
cr_x^B=\begin{cases}\dfrac{l_{fb}\tan\theta\,\cos\angle ofb}{s\,\cos(\angle ofb-\alpha)}, & x_b-\frac{W}{2}>0\\[1mm]\dfrac{l_{fb}\tan\theta\,\cos\angle ofb}{s\,\cos(\angle ofb+\alpha)}, & x_b-\frac{W}{2}\le 0\end{cases}\tag{9}$$
in this case, the proportion between the changes of the two reference targets is expressed as:
$$\frac{cr_x^A}{cr_x^B}=\begin{cases}\dfrac{\cos(\angle ofb-\alpha)}{\cos(\angle ofa-\alpha)}, & x_a-\frac{W}{2}>0 \text{ and } x_b-\frac{W}{2}>0\\[1mm]\dfrac{\cos(\angle ofb+\alpha)}{\cos(\angle ofa+\alpha)}, & x_a-\frac{W}{2}\le 0 \text{ and } x_b-\frac{W}{2}\le 0\\[1mm]\dfrac{\cos(\angle ofb+\alpha)}{\cos(\angle ofa-\alpha)}, & x_a-\frac{W}{2}>0 \text{ and } x_b-\frac{W}{2}\le 0\\[1mm]\dfrac{\cos(\angle ofb-\alpha)}{\cos(\angle ofa+\alpha)}, & x_a-\frac{W}{2}\le 0 \text{ and } x_b-\frac{W}{2}>0\end{cases}\tag{10}$$
and the proportion between the changes in measuring target P and reference target A is as follows:
$$\frac{cr_x^A}{cr_x^P}=\begin{cases}\dfrac{\cos(\angle ofp-\alpha)}{\cos(\angle ofa-\alpha)}, & x_a-\frac{W}{2}>0 \text{ and } x_p-\frac{W}{2}>0\\[1mm]\dfrac{\cos(\angle ofp+\alpha)}{\cos(\angle ofa+\alpha)}, & x_a-\frac{W}{2}\le 0 \text{ and } x_p-\frac{W}{2}\le 0\\[1mm]\dfrac{\cos(\angle ofp+\alpha)}{\cos(\angle ofa-\alpha)}, & x_a-\frac{W}{2}>0 \text{ and } x_p-\frac{W}{2}\le 0\\[1mm]\dfrac{\cos(\angle ofp-\alpha)}{\cos(\angle ofa+\alpha)}, & x_a-\frac{W}{2}\le 0 \text{ and } x_p-\frac{W}{2}>0\end{cases}\tag{11}$$
Figure 7. Changes of the targets before and after the camera rotation about the y-axis. (a) Tilt the sensor (image plane) right. (b) Tilt the sensor (image plane) left.
Similarly, when the camera rotates around the x-axis, the proportion between the changes in the targets in the image coordinate can be derived.
(1) When the image sensor tilts to the right:
$$\frac{cr_y^A}{cr_y^B}=\begin{cases}\dfrac{\cos(\angle ofb+\alpha)}{\cos(\angle ofa+\alpha)}, & x_a-\frac{W}{2}>0 \text{ and } x_b-\frac{W}{2}>0\\[1mm]\dfrac{\cos(\angle ofb-\alpha)}{\cos(\angle ofa-\alpha)}, & x_a-\frac{W}{2}\le 0 \text{ and } x_b-\frac{W}{2}\le 0\\[1mm]\dfrac{\cos(\angle ofb-\alpha)}{\cos(\angle ofa+\alpha)}, & x_a-\frac{W}{2}>0 \text{ and } x_b-\frac{W}{2}\le 0\\[1mm]\dfrac{\cos(\angle ofb+\alpha)}{\cos(\angle ofa-\alpha)}, & x_a-\frac{W}{2}\le 0 \text{ and } x_b-\frac{W}{2}>0\end{cases}\tag{12}$$
$$\frac{cr_y^A}{cr_y^P}=\begin{cases}\dfrac{\cos(\angle ofp+\alpha)}{\cos(\angle ofa+\alpha)}, & x_a-\frac{W}{2}>0 \text{ and } x_p-\frac{W}{2}>0\\[1mm]\dfrac{\cos(\angle ofp-\alpha)}{\cos(\angle ofa-\alpha)}, & x_a-\frac{W}{2}\le 0 \text{ and } x_p-\frac{W}{2}\le 0\\[1mm]\dfrac{\cos(\angle ofp-\alpha)}{\cos(\angle ofa+\alpha)}, & x_a-\frac{W}{2}>0 \text{ and } x_p-\frac{W}{2}\le 0\\[1mm]\dfrac{\cos(\angle ofp+\alpha)}{\cos(\angle ofa-\alpha)}, & x_a-\frac{W}{2}\le 0 \text{ and } x_p-\frac{W}{2}>0\end{cases}\tag{13}$$
where $\angle ofp=\arctan((y_p-H/2)\times s/f)$, $\angle ofa=\arctan((y_a-H/2)\times s/f)$, $\angle ofb=\arctan((y_b-H/2)\times s/f)$.
(2) When the image sensor tilts to the left:
$$\frac{cr_y^A}{cr_y^B}=\begin{cases}\dfrac{\cos(\angle ofb-\alpha)}{\cos(\angle ofa-\alpha)}, & x_a-\frac{W}{2}>0 \text{ and } x_b-\frac{W}{2}>0\\[1mm]\dfrac{\cos(\angle ofb+\alpha)}{\cos(\angle ofa+\alpha)}, & x_a-\frac{W}{2}\le 0 \text{ and } x_b-\frac{W}{2}\le 0\\[1mm]\dfrac{\cos(\angle ofb+\alpha)}{\cos(\angle ofa-\alpha)}, & x_a-\frac{W}{2}>0 \text{ and } x_b-\frac{W}{2}\le 0\\[1mm]\dfrac{\cos(\angle ofb-\alpha)}{\cos(\angle ofa+\alpha)}, & x_a-\frac{W}{2}\le 0 \text{ and } x_b-\frac{W}{2}>0\end{cases}\tag{14}$$
$$\frac{cr_y^A}{cr_y^P}=\begin{cases}\dfrac{\cos(\angle ofp-\alpha)}{\cos(\angle ofa-\alpha)}, & x_a-\frac{W}{2}>0 \text{ and } x_p-\frac{W}{2}>0\\[1mm]\dfrac{\cos(\angle ofp+\alpha)}{\cos(\angle ofa+\alpha)}, & x_a-\frac{W}{2}\le 0 \text{ and } x_p-\frac{W}{2}\le 0\\[1mm]\dfrac{\cos(\angle ofp+\alpha)}{\cos(\angle ofa-\alpha)}, & x_a-\frac{W}{2}>0 \text{ and } x_p-\frac{W}{2}\le 0\\[1mm]\dfrac{\cos(\angle ofp-\alpha)}{\cos(\angle ofa+\alpha)}, & x_a-\frac{W}{2}\le 0 \text{ and } x_p-\frac{W}{2}>0\end{cases}\tag{15}$$
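The rotation-error model above can be sketched in a few lines of code. The following is our own illustrative implementation of the tilt-right branch of Equation (6), using the focal length (135 mm) and pixel size (3.45 µm/pixel) reported for the camera in this study; the function name and example values are our assumptions, not the authors' code:

```python
import math

# Illustrative sketch of Equation (6), tilt-right case: the pixel error
# cr_x induced at image column x_px by a small camera rotation theta
# about the y-axis, with sensor tilt angle alpha. Angles in radians.

F = 135.0      # focal length used in this study, mm
S = 0.00345    # pixel size, mm/pixel
W = 4096       # image width, pixels

def rotation_error_x(x_px, theta, alpha):
    """Equation (6):
       angle_ofp = arctan((x - W/2) * s / f),  l_fp = f / cos(angle_ofp),
       cr_x = l_fp * tan(theta) * cos(angle_ofp) / (s * cos(angle_ofp +/- alpha)),
    with +alpha when x - W/2 > 0 and -alpha otherwise (tilt right)."""
    angle = math.atan((x_px - W / 2) * S / F)
    l_fp = F / math.cos(angle)
    signed = angle + alpha if (x_px - W / 2) > 0 else angle - alpha
    return l_fp * math.tan(theta) * math.cos(angle) / (S * math.cos(signed))

# Even a 0.05 degree rotation shifts a centered point by tens of pixels,
# which is why motion compensation is indispensable at long range.
err_center = rotation_error_x(W / 2, math.radians(0.05), math.radians(5.1))
```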
2. Errors caused by camera translation along the x and y directions
When the camera translates along the x or y direction, it only causes an error in the x direction ($ct_x$) or the y direction ($ct_y$), respectively. First, the influence of camera translation in the y direction on the displacement error was analyzed, as shown in Figure 8. Assume that the translation of the camera along the y direction is Δt; the translation of the camera is again replaced by translations of the targets P, A and B. After the translation, the positions of the targets become P′, A′ and B′, and their changes in the image coordinate can be derived from Figure 8:
$$ct_y^P=\overline{op'}-\overline{op},\qquad ct_y^A=\overline{oa'}-\overline{oa},\qquad ct_y^B=\overline{ob'}-\overline{ob}.\tag{16}$$
Several experiments show that the error induced by camera translation is significantly smaller than that induced by rotation, and thus the influence of the tilt angle (α) on camera translation can be ignored. That is, assuming that the image plane is perpendicular to the lens axis, the proportion between the changes in the targets can be approximately expressed as Equation (17):
$$\frac{ct_y^A}{ct_y^B}=\frac{d_B}{d_A},\qquad \frac{ct_y^A}{ct_y^P}=\frac{d_P}{d_A},\tag{17}$$
where $d_P$, $d_A$ and $d_B$ represent the physical distances between the corresponding targets and the camera. However, it is difficult to measure the target-camera distance directly; therefore, the scale conversion factor of the plane where each target is located is used in its place. Since the two quantities are linearly and positively correlated, the following formulas can be obtained:
$$\frac{ct_y^A}{ct_y^B}=\frac{d_B}{d_A}=\frac{s_y^B}{s_y^A},\tag{18}$$
$$\frac{ct_y^A}{ct_y^P}=\frac{d_P}{d_A}=\frac{s_y^P}{s_y^A}.\tag{19}$$
Similarly, when the camera translates along the x direction, the changes in the targets in the image coordinate have the following proportions:
$$\frac{ct_x^A}{ct_x^B}=\frac{s_x^B}{s_x^A},\tag{20}$$
$$\frac{ct_x^A}{ct_x^P}=\frac{s_x^P}{s_x^A}.\tag{21}$$
Figure 8. Changes of the targets before and after the camera translation in the y direction.
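The proportions in Equations (18)-(21) mean that a translation-induced pixel error measured at a reference target can be propagated to any other target through the ratio of their scale conversion factors. A minimal sketch, with names and numbers of our own choosing:

```python
# Illustrative sketch of Equations (18)-(21): translation-induced pixel
# errors are inversely proportional to the target's scale conversion
# factor (mm/pixel), which stands in for the target-camera distance.

def translation_error_at(ct_ref_px, s_ref, s_tgt):
    """e.g. Equation (19) rearranged: ct_y^P = ct_y^A * s_y^A / s_y^P."""
    return ct_ref_px * s_ref / s_tgt

# A camera shift appears as more pixels at near targets (small s)
# and fewer pixels at far targets (large s).
ct_same = translation_error_at(10.0, 0.4, 0.4)  # same plane: unchanged
ct_far = translation_error_at(10.0, 0.4, 0.8)   # twice as distant: halved
```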
3. Displacement calculation with camera motion compensation
First, Equations (4) and (5) can be simplified by using Equations (18) and (20):
M x B M x A = c r x B × s x B c r x A × s x A M y B M y A = c r y B × s y B c r y A × s y A   ,
According to the tilt direction of the image sensor, different equations are used to calculate the error components of targets A and B: (1) when the image sensor tilts right, $cr_x^A$, $cr_y^A$, $cr_x^B$ and $cr_y^B$ can be calculated by substituting Equations (7) and (12) into Equation (22); or (2) when the image sensor tilts left, they can be calculated by substituting Equations (10) and (14) into Equation (22). Secondly, $ct_x^A$, $ct_x^B$, $ct_y^A$ and $ct_y^B$ can be computed easily through Equations (4) and (5). Then, the error components of the target P ($ct_x^P$, $cr_x^P$, $ct_y^P$, $cr_y^P$) can be calculated using Equations (8), (13), (19) and (21) or Equations (11), (15), (19) and (21), according to the tilt direction of the image sensor. Finally, Equation (3) can be used to compute the corrected physical displacement of target P.
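Under the assumptions above, the whole correction chain for the x direction (tilt-right case) can be sketched as follows. This is our own illustrative reading of Equations (3), (4), (7), (8), (21) and (22), not the authors' implementation; all function and variable names are ours:

```python
import math

# End-to-end sketch (x direction, tilt-right case): Equations (7) and
# (22) give the rotation errors of the references, Equation (4) gives
# their translation errors, Equations (8) and (21) propagate both error
# components to target P, and Equation (3) yields the corrected value.

F, S, W = 135.0, 0.00345, 4096  # focal length (mm), pixel size (mm/px), width (px)

def cos_term(x_px, alpha):
    """cos(angle_of +/- alpha) with the tilt-right sign rule of Eq. (6)."""
    a = math.atan((x_px - W / 2) * S / F)
    return math.cos(a + alpha) if (x_px - W / 2) > 0 else math.cos(a - alpha)

def correct_x(Mx_P_unc, Mx_A, Mx_B, x_p, x_a, x_b, s_P, s_A, s_B, alpha):
    """Motion-corrected x displacement of target P, in mm."""
    r_ab = cos_term(x_b, alpha) / cos_term(x_a, alpha)  # Eq. (7): crxA/crxB
    crx_B = (Mx_B - Mx_A) / (s_B - r_ab * s_A)          # Eq. (22)
    crx_A = r_ab * crx_B
    ctx_A = Mx_A / s_A - crx_A                          # Eq. (4)
    crx_P = crx_A * cos_term(x_a, alpha) / cos_term(x_p, alpha)  # Eq. (8)
    ctx_P = ctx_A * s_A / s_P                           # Eq. (21)
    return Mx_P_unc - (ctx_P + crx_P) * s_P             # Eq. (3)
```

Note that in Equation (22) the translation terms cancel exactly because, by Equation (20), $ct_x^A\times s_x^A=ct_x^B\times s_x^B$; this is what allows the rotation component to be isolated from the two reference targets alone.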
4. Measurement stage
(1) After installation of the Scheimpflug camera, read the tilt angle (α) of the image sensor on the Scheimpflug adapter (the resolution of the adapter is 0.1°) and determine the tilt direction. Once the measurement starts, the tilt angle and direction of the image sensor remain unchanged.
(2) The Scheimpflug camera measures $(x_p, y_p)_i$, $(x_a, y_a)_i$ and $(x_b, y_b)_i$ for each target in the i-th image.
(3) The uncorrected physical displacements $(M_x^{P,Uncorrected}, M_y^{P,Uncorrected})_i$, $(M_x^A, M_y^A)_i$ and $(M_x^B, M_y^B)_i$ in the i-th image are calculated with respect to the reference image.
(4) According to the tilt direction of the image sensor, the error components $(cr_x^A, cr_y^A, ct_x^A, ct_y^A)_i$ and $(cr_x^B, cr_y^B, ct_x^B, ct_y^B)_i$ of the reference targets A and B in the i-th image are calculated using Equations (4), (5), (7), (12) and (22), or Equations (4), (5), (10), (14) and (22).
(5) According to the tilt direction of the image sensor, the error components $(ct_x^P, cr_x^P, ct_y^P, cr_y^P)_i$ of the target P in the i-th image are calculated using Equations (8), (13), (19) and (21), or Equations (11), (15), (19) and (21).
(6) The corrected physical displacement of the target P in the i-th image is calculated using Equation (3). Note that this approach assumes that the out-of-plane motion of the target can be neglected. The displacement calculation process for the other measuring targets is the same as that for target P.

3. Experiment Validation

3.1. Validation through a Slide Table Test

The main purpose of this test was to verify the effectiveness of the proposed motion compensation method for the Scheimpflug camera. As shown in Figure 9, the Scheimpflug camera, installed on a six-axis slide table, observed four fixed targets aligned along the depth direction; the six-axis slide table was used to simulate the camera motions. The distance between the camera and the nearest target (No. 105) was 6.1 m, and the distance between two adjacent targets was 0.6 m. These four targets constituted a measuring unit as defined in Figure 3, where targets 105 and 285 were regarded as reference targets to compensate for the camera motions. When the tilt direction was to the right and the tilt angle α was approximately 5.1°, the camera could clearly capture all four targets (Figure 9c). If the image plane was parallel to the lens plane, the camera could clearly capture only one or two targets (Figure 9d).
Four acquisitions were performed in this test, and the sampling rate of the camera was set to 2 frames per second. In the first two acquisitions, the translation motions of the camera were simulated by slowly translating the slide table in the x and y directions, while in the last two acquisitions, the rotation motions of the camera were simulated by slowly rotating the slide table around its y-axis and x-axis; 150 images were collected in each acquisition. The four targets were fixed in this test, so their real displacements can be considered zero. Correspondingly, the displacements detected by the camera were precisely the displacement measurement errors induced by camera motions.
Since targets 105 and 285 were reference targets whose displacements defaulted to zeros, and the measurements of target 225 were highly consistent with those of target 165, only the displacement of target 165 is plotted in Figure 10. In the first two acquisitions, the translation of the camera was close to 10 mm, while in the last two acquisitions, the rotation of the camera exceeded 1°, and caused a displacement error of more than 25 mm. After the compensation, the maximum errors of the four results did not exceed 0.01 mm.
Considering that the resolution of the Scheimpflug adapter was only 0.1°, five different tilt angles {4.9°, 5.0°, 5.1°, 5.2°, 5.3°} were used to verify the influence of the adapter resolution on camera motion compensation. Taking the third acquisition as an example, the corrected displacements of target 165 obtained with different tilt angles are shown in Figure 11.
It can be seen from Figure 11 that the corrected displacements obtained with the five different tilt angles differed little; the maximum difference was only 0.01 mm. Therefore, it can be concluded that the error caused by the limited adapter resolution has a negligible influence on camera motion compensation.

3.2. Outdoor Test Using Static Targets

Five static targets fixed on the ground were monitored in this test, as shown in Figure 12 and Figure 13. The Scheimpflug camera was installed on a 1.0 m tall tripod to enable the targets to be observed horizontally (Figure 12). The targets were well distributed along the road direction such that all the targets were collectively observed in a narrow FOV without lowering the camera magnification. Similar to the first test, targets 1 and 5 were used as reference targets for camera motion compensation. In bridge monitoring applications, these two reference targets are usually installed on two adjacent piers; the distance between targets 1 and 5 therefore corresponds to the span length of the bridge, which is one of the main factors affecting the measurement accuracy. Consequently, the distance L between the two reference targets was set to 20 m, 40 m and 80 m to cover different span lengths. In addition, the distance between the camera and the target is another key factor affecting the measurement accuracy; thus, the distance d between the camera and target 1 was set to 50 m and 80 m. Therefore, a total of six acquisitions were conducted to comprehensively evaluate the effectiveness of the system and method proposed in this study. A comparison of the imaging results between the Scheimpflug camera and the conventional camera is shown in Figure 13.
The sampling rate of the camera was set to 90 frames per second, and the duration of each acquisition was 100 s. The purpose of installing multiple targets in the test was to illustrate the capability of the multi-point displacement measurement of the proposed system. However, because the displacements of targets 2, 3 and 4 were almost identical, for space and clarity only the displacements of target 3 are shown in Figure 14.
In this test, the camera was placed beside a busy road (Figure 12); passing cars therefore caused obvious ground vibrations. Moreover, the maximum wind speed on the test day exceeded 8 m/s. Under the combined action of these two factors, there were many sudden variations in the original displacements of target 3. The original displacements of target 3 also showed a gradually decreasing trend, because the camera was prone to slow movement due to its self-weight and temperature changes. Nevertheless, these factors did not affect the effectiveness of the proposed method, and the corrected displacements of all six acquisitions achieved satisfactory accuracy.
The root mean squared errors (RMSEs) with and without compensation are listed in Table 1. After motion compensation, the RMSEs in the x and y directions did not exceed 0.54 mm, corresponding to reductions of at least 62% and 92%, respectively. Table 1 also shows that the RMSE reductions in the y direction were generally higher than those in the x direction, because the self-weight of the camera and the ground vibrations were more likely to move the camera in the y direction. In addition, increasing the camera–target distance reduced the image resolution and the localization accuracy of the targets, resulting in a poorer correction of motion-induced errors.
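The RMSE figures in Table 1 are straightforward to reproduce from displacement time series. The sketch below uses purely synthetic series for illustration (a drifting, vibrating raw record versus a compensated one, not the test data) to compute the RMSE of a static-target record and the percentage reduction achieved by compensation:

```python
import numpy as np

def rmse(displacement_mm):
    """RMSE of a displacement series about zero (the true motion of the
    static targets in this test is zero, so any reading is error)."""
    d = np.asarray(displacement_mm, dtype=float)
    return float(np.sqrt(np.mean(d ** 2)))

def reduction_percent(rmse_raw, rmse_corrected):
    """Percentage reduction in RMSE achieved by motion compensation."""
    return 100.0 * (rmse_raw - rmse_corrected) / rmse_raw

# Illustrative series: 100 s at 90 fps, with a slow drift plus noise
# in the raw record and only residual noise after compensation.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 100.0, 9000)
raw = 0.03 * t + 0.5 * rng.standard_normal(t.size)
corrected = 0.15 * rng.standard_normal(t.size)

r_raw, r_cor = rmse(raw), rmse(corrected)
print(f"raw RMSE = {r_raw:.2f} mm, corrected RMSE = {r_cor:.2f} mm, "
      f"reduction = {reduction_percent(r_raw, r_cor):.0f}%")
```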
This test verified the remote measurement performance of the proposed system and method under outdoor conditions. When the measurement distance d and the span length L were both 80 m, i.e., the farthest measurement distance was 160 m, the total RMSE and the maximum error reached about 0.6 mm and 1.0 mm, respectively. Therefore, to keep the measurement accuracy within 1.0 mm, the proposed system and method should only be applied to bridges with a span length of 160 m or less.

3.3. Bridge Model Experiment

The proposed system was implemented on a truss bridge model approximately 38.8 m long to measure its dynamic displacements. The geometric configuration of the inspected bridge model is shown in Figure 15a,b. The whole bridge model was fixed on four shake tables (STs), provided by SERVOTEST [36] and arranged in a straight line. The ST 1–ST 2 distance was 6.54 m, and the ST 2–ST 3 and ST 3–ST 4 distances were both 13.08 m. The four STs had identical specifications: each was a six-axis shake table with a table size of 4 m × 4 m, a maximum payload of 30 t, maximum displacements of 250 mm in the x and y directions and 160 mm in the z direction, a maximum speed of ±1000 mm/s in the x, y and z directions, and an operating frequency range of 0.1–50 Hz. Here, the x, y and z directions correspond to the horizontal, vertical and depth directions, respectively, as shown in Figure 15a,b. In addition, the four STs support a flexible operation mode; that is, they can be used independently or concatenated into a shake table array.
Five targets were used in this experiment: target 1 was located 0.1 m in front of the front end of the bridge model, and target 5 was located 5.2 m behind its back end. These two targets were not fixed on the bridge model; they therefore remained static during the experiment and could be used as references for compensating camera motion. The other targets were attached to the bridge model. Each target measured 300 mm × 200 mm, and the physical length between its two cross-shaped corners was 100 mm.
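Because the physical spacing between the two cross-shaped corners on each target is known (100 mm here), the pixel-to-millimetre scale conversion factor for a target can be estimated directly from its image. A minimal sketch of that conversion, with hypothetical corner coordinates and pixel shifts (not measured values):

```python
import math

def scale_factor_mm_per_px(corner_a_px, corner_b_px, physical_length_mm=100.0):
    """Scale conversion factor from the known spacing between the two
    cross-shaped corners on a target (100 mm in this experiment)."""
    dx = corner_b_px[0] - corner_a_px[0]
    dy = corner_b_px[1] - corner_a_px[1]
    pixel_length = math.hypot(dx, dy)  # corner spacing in pixels
    return physical_length_mm / pixel_length

# Hypothetical corner locations of one target in the image:
s = scale_factor_mm_per_px((412.3, 518.7), (611.9, 520.1))
pixel_shift = (1.8, -0.6)  # tracked target motion in pixels
displacement_mm = (pixel_shift[0] * s, pixel_shift[1] * s)
print(f"{s:.4f} mm/px -> displacement = "
      f"({displacement_mm[0]:+.3f}, {displacement_mm[1]:+.3f}) mm")
```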
As shown in Figure 15c, the test site was very narrow; it was therefore impractical to find an installation position from which a single conventional camera could clearly observe all targets at high magnification. In contrast, the Scheimpflug camera is well suited to such narrow sites (Figure 15c). All targets could be clearly observed by installing the Scheimpflug camera near the bridge model (Figure 15e), with a distance of 10.8 m between the camera and target 1.
This experiment simulated the impact of an earthquake on a bridge structure. Figure 16 shows the measurement results of the proposed Scheimpflug camera-based system. Images were captured at 60 frames per second. Throughout the experiment, the four STs were concatenated into a shake table array and vibrated synchronously in the x and y directions, so the displacements of targets 2, 3 and 4 exhibited the same trend. The maximum displacement amplitudes of target 3, 5.14 mm in the x direction and 1.79 mm in the y direction, were slightly larger than those of targets 2 and 4, mainly because target 3 was farthest from the shake tables. Note that slight vibrations with a maximum value of 0.36 mm remained in the y direction at target 5; this is because the oil-fired engine driving the shake tables released a significant amount of heat during operation, and the resulting hot-air turbulence caused image deformation and additional measurement errors. The engine was located under the bridge model between targets 1 and 2, so the displacement results of targets 2, 3, 4 and 5 were all affected.
To further validate the proposed Scheimpflug camera-based system, the measured displacements of target 3 were compared with the values measured by the LDV-based method. Two LDVs with an accuracy of 0.05 mm were installed near target 3, as shown in Figure 15d. Figure 17 shows that the two sets of displacements shared similar overall trends. The RMSEs of the differences between them were 0.16 mm in the x direction and 0.11 mm in the y direction, verifying the performance of the proposed system and method in measuring dynamic displacements. However, owing to the influence of hot-air turbulence, the maximum differences in the x and y directions reached 0.76 mm and 0.41 mm, respectively, which still falls short of the requirement for high-precision measurement. It is therefore preferable to capture bridge images when the difference between the air and ground temperatures is small.
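Comparing the camera with the LDV requires putting the two series on a common time base before differencing, since the camera (60 fps) and the LDVs sample at different rates. A sketch of this comparison using linear resampling, on synthetic data rather than the experiment's records (the 2 Hz oscillation, 1 kHz LDV rate and 0.1 mm offset are assumptions for illustration):

```python
import numpy as np

def difference_rmse(t_cam, d_cam, t_ldv, d_ldv):
    """RMSE and maximum absolute value of the difference between camera
    and LDV displacement series, with the LDV series linearly resampled
    onto the camera timestamps for sample-by-sample comparison."""
    d_ldv_on_cam = np.interp(t_cam, t_ldv, d_ldv)
    diff = np.asarray(d_cam) - d_ldv_on_cam
    return float(np.sqrt(np.mean(diff ** 2))), float(np.max(np.abs(diff)))

# Synthetic example: a 2 Hz oscillation sampled at 60 fps by the
# camera and 1 kHz by the LDV, with a constant 0.1 mm camera bias.
def d_true(t):
    return 5.0 * np.sin(2 * np.pi * 2.0 * t)

t_cam = np.arange(0.0, 5.0, 1 / 60)
t_ldv = np.arange(0.0, 5.0, 1 / 1000)
rmse_diff, max_diff = difference_rmse(t_cam, d_true(t_cam) + 0.1,
                                      t_ldv, d_true(t_ldv))
print(f"RMSE = {rmse_diff:.3f} mm, max |diff| = {max_diff:.3f} mm")
```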

4. Discussion

The effectiveness of the proposed system and method has been proven, but the following practical issues should be considered when applying the system to actual bridge monitoring.
(1)
Out-of-plane motion of the target
The proposed motion compensation method does not consider the out-of-plane motion of the target; that is, the displacement of the bridge along the road direction is ignored. In practical applications, however, out-of-plane motion of the target is inevitable and introduces additional errors into the scale conversion factors when high-magnification images are captured through a super-telephoto lens. This decreases the measurement accuracy of our method, which therefore needs further optimization.
(2)
Placement restrictions on camera installation
The camera must be installed close to the bridge. In Figure 15c, the shortest distance between the camera and the bridge model is approximately 1.0 m; only such a short distance ensures that all targets can be captured in a narrow camera view. When monitoring actual bridges, however, there may be insufficient installation space in front of the bridge.
(3)
Image noise, blur and deformation caused by remote measurement
As shown in Figure 14, when the span length of the bridge or the measurement distance increases, the measurement accuracy decreases significantly owing to image noise, blur and deformation. Unmanned aerial vehicles (UAVs) offer an opportunity to capture bridge images more effectively by bringing the camera closer to the bridge; a UAV equipped with the Scheimpflug camera could thus realize short-distance measurement and further improve the accuracy of the Scheimpflug camera-based technique in bridge monitoring. However, the distance between the two piers carrying the reference targets (i.e., the span length of the bridge) will still restrict the effectiveness of camera motion compensation, which makes the proposed method difficult to apply to long-span bridges such as suspension or cable-stayed bridges.
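The out-of-plane motion issue listed under (1) can be bounded with a simple pinhole approximation: the scale conversion factor is proportional to the camera–target distance, so a depth motion Δz at distance d perturbs the factor by roughly Δz/d. A rough sketch under that assumption (the 50 mm depth motion is an assumed value, not a measured one):

```python
def scale_error_percent(depth_motion_m, camera_target_distance_m):
    """Relative error (%) in the scale conversion factor when a target
    moves depth_motion_m along the optical axis but the factor is still
    computed for the original distance (pinhole approximation)."""
    return 100.0 * depth_motion_m / camera_target_distance_m

# e.g. 50 mm of longitudinal bridge motion at distances comparable to
# those used in the outdoor test (illustrative numbers only):
for d in (50.0, 80.0, 160.0):
    print(f"d = {d:5.1f} m -> scale error ≈ {scale_error_percent(0.05, d):.3f}%")
```

As the printed values suggest, the relative error shrinks with distance, but at the short ranges envisaged for UAV deployment it becomes a larger fraction of the measured displacement.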

5. Conclusions

In this study, we proposed a low-cost system based on a single Scheimpflug camera to measure the displacements of multiple artificial targets attached to a bridge, such that all targets are clearly observed in a single camera view without reducing the lens magnification. Existing camera ego-motion compensation methods using reference targets do not consider the case in which the image plane is not perpendicular to the lens axis. To solve this problem, this paper built translational and rotational models for the Scheimpflug camera to reduce the error induced by Scheimpflug camera motion, requiring only simple processing of two-dimensional images.
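The reference-target idea summarized above can be illustrated with a deliberately simplified planar sketch: the apparent motions of the two static references are interpolated at the measurement target's image position and subtracted. This omits the tilted-image-plane terms of the paper's actual Scheimpflug models, and all coordinates below are hypothetical:

```python
import numpy as np

def compensate(measured_disp, target_pos, ref1_pos, ref5_pos,
               ref1_disp, ref5_disp):
    """Simplified planar reference-target compensation: linearly
    interpolate the apparent motion of two static references at the
    target's image position and subtract it from the measured
    displacement. Inputs are image coordinates/displacements in pixels.
    (The paper's full models additionally handle the tilted image
    plane of the Scheimpflug camera.)"""
    axis = np.asarray(ref5_pos, float) - np.asarray(ref1_pos, float)
    # Interpolation weight of the target along the ref1-ref5 axis.
    w = np.dot(np.asarray(target_pos, float) - np.asarray(ref1_pos, float),
               axis) / np.dot(axis, axis)
    camera_induced = ((1.0 - w) * np.asarray(ref1_disp, float)
                      + w * np.asarray(ref5_disp, float))
    return np.asarray(measured_disp, float) - camera_induced

# Hypothetical example: the camera appears to shift all static points
# by (0.5, 0.2) px; the target additionally moves (0.7, 0.1) px.
corrected = compensate(measured_disp=(1.2, 0.3),
                       target_pos=(900.0, 512.0),
                       ref1_pos=(100.0, 510.0),
                       ref5_pos=(1800.0, 515.0),
                       ref1_disp=(0.5, 0.2),
                       ref5_disp=(0.5, 0.2))
print(corrected)  # approximately [0.7, 0.1]
```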
The proposed method was verified through three experiments. In the first experiment, a six-axis slide table was used to simulate camera motions. The maximum error induced by the slide table exceeded 25 mm and was suppressed to 0.01 mm using the proposed method. Under outdoor conditions, the performance of the method was verified for different measurement distances and span lengths. The results showed that when the span length of the bridge is no more than 160 m, the measurement accuracy of the proposed system is better than 1.0 mm. The span length (the distance between two adjacent piers) and the measurement distance are the two main factors affecting the applicability of the proposed method. Finally, a bridge model experiment demonstrated the performance of the proposed system in measuring the dynamic displacements of bridges. In future work, we plan to carry out UAV-related research to mitigate the errors of remote measurement by bringing the camera closer to the bridge.

Author Contributions

L.X.: Methodology, Software, Validation, Formal Analysis, Writing—Original Draft. W.D.: Validation, Investigation, Writing—Review and Editing. Y.Z.: Investigation, Supervision, Project Administration, Funding Acquisition. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National Natural Science Foundation of China (grant No. 41974215), and in part by the Science and Technology Research and Development Program Project of China Railway Group Limited (Major Special Project, No. 2020-Special-02).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Novacek, G. Accurate Linear Measurement Using LVDTs. Circuit Cellar Ink 1999, 106, 20–27. [Google Scholar]
  2. Nassif, H.H.; Gindy, M.; Davis, J. Comparison of Laser Doppler Vibrometer with Contact Sensors for Monitoring Bridge Deflection and Vibration. Ndt E Int. 2005, 38, 213–218. [Google Scholar] [CrossRef]
  3. Celebi, M. GPS in Dynamic Monitoring of Long-Period Structures; Springer: Dordrecht, The Netherlands, 2001. [Google Scholar]
  4. Nakamura, S.-I. GPS Measurement of Wind-Induced Suspension Bridge Girder Displacements. J. Struct. Eng. 2000, 126, 1413–1419. [Google Scholar] [CrossRef]
  5. Mayer, L.; Yanev, B.; Olson, L.D.; Smyth, A. Monitoring of the Manhattan Bridge and Interferometric Radar Systems. 2010. Available online: https://www.researchgate.net/publication/283605971_Monitoring_of_the_Manhattan_Bridge_and_Interferometric_Radar_Systems (accessed on 4 May 2022).
  6. Zschiesche, K. Image Assisted Total Stations for Structural Health Monitoring—A Review. Geomatics 2022, 2, 1–16. [Google Scholar] [CrossRef]
  7. Paar, R.; Marendic, A.; Jakopec, I.; Grgac, I. Vibration Monitoring of Civil Engineering Structures Using Contactless Vision-Based Low-Cost IATS Prototype. Sensors 2021, 21, 7952. [Google Scholar] [CrossRef]
  8. Paar, R.; Roic, M.; Marendic, A.; Miletic, S. Technological Development and Application of Photo and Video Theodolites. Appl. Sci. 2021, 11, 3893. [Google Scholar] [CrossRef]
  9. Feng, D.; Scarangello, T.; Feng, M.Q.; Ye, Q. Cable Tension Force Estimate Using Novel Noncontact Vision-Based Sensor. Measurement 2017, 99, 44–52. [Google Scholar] [CrossRef]
  10. Feng, D.; Feng, M.Q.; Ozer, E.; Fukuda, Y. A Vision-Based Sensor for Noncontact Structural Displacement Measurement. Sensors 2015, 15, 16557–16575. [Google Scholar] [CrossRef]
  11. Huang, M.; Zhang, B.; Lou, W. A Computer Vision-Based Vibration Measurement Method for Wind Tunnel Tests of High-Rise Buildings-ScienceDirect. J. Wind Eng. Ind. Aerodyn. 2018, 182, 222–234. [Google Scholar] [CrossRef]
  12. Lydon, D.; Lydon, M.; Taylor, S.; Rincon, J.M.D.; Hester, D.; Brownjohn, J. Development and Field Testing of a Vision-Based Displacement System Using a Low Cost Wireless Action Camera. Mech. Syst. Signal Process. 2019, 121, 343–358. [Google Scholar] [CrossRef] [Green Version]
  13. Dabous, S.A.; Feroz, S. Condition Monitoring of Bridges with Non-Contact Testing Technologies. Autom. Constr. 2020, 116, 103224. [Google Scholar] [CrossRef]
  14. Rau, J.; Jhan, J.-P.; Andaru, R. Landslide Deformation Monitoring by Three-Camera Imaging System. In Proceedings of the ISPRS—International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, Enschede, The Netherlands, 10–14 June 2019; Volume XLII-2/W13, pp. 559–565. [Google Scholar] [CrossRef] [Green Version]
  15. Lee, J.J.; Shinozuka, M. A Vision-Based System for Remote Sensing of Bridge Displacement. NDT E Int. 2006, 39, 425–431. [Google Scholar] [CrossRef]
  16. Miguel, V.; Dorys, G.; Jesus, M.; Thomas, S. A Novel Laser and Video-Based Displacement Transducer to Monitor Bridge Deflections. Sensors 2018, 18, 970. [Google Scholar]
  17. Xing, L.; Dai, W.; Zhang, Y. Improving Displacement Measurement Accuracy by Compensating for Camera Motion and Thermal Effect on Camera Sensor. Mech. Syst. Signal Process. 2022, 167, 108525. [Google Scholar] [CrossRef]
  18. Luo, L.; Feng, M.Q.; Wu, Z.Y. Robust Vision Sensor for Multi-Point Displacement Monitoring of Bridges in the Field. Eng. Struct. 2018, 163, 255–266. [Google Scholar] [CrossRef]
  19. Zhao, X.; Liu, H.; Yu, Y.; Xu, X.; Hu, W.; Li, M.; Ou, J. Bridge Displacement Monitoring Method Based on Laser Projection-Sensing Technology. Sensors 2015, 15, 8444–8463. [Google Scholar] [CrossRef]
  20. Ehrhart, M.; Lienhart, W. Monitoring of Civil Engineering Structures Using a State-of-the-Art Image Assisted Total Station. J. Appl. Geod. 2015, 9, 174–182. [Google Scholar] [CrossRef]
  21. Lee, J.; Lee, K.-C.; Jeong, S.; Lee, Y.-J.; Sim, S.-H. Long-Term Displacement Measurement of Full-Scale Bridges Using Camera Ego-Motion Compensation. Mech. Syst. Signal Process. 2020, 140, 106651. [Google Scholar] [CrossRef]
  22. Zhou, H.F.; Zheng, J.F.; Xie, Z.L.; Lu, L.J.; Ni, Y.Q.; Ko, J.M. Temperature Effects on Vision Measurement System in Long-Term Continuous Monitoring of Displacement. Renew. Energy 2017, 114, 968–983. [Google Scholar] [CrossRef]
  23. Daakir, M.; Zhou, Y.; Deseilligny, M.P.; Thom, C.; Martin, O.; Rupnik, E. Improvement of Photogrammetric Accuracy by Modeling and Correcting the Thermal Effect on Camera Calibration. ISPRS J. Photogramm. Remote Sens. 2019, 148, 142–155. [Google Scholar] [CrossRef] [Green Version]
  24. Lee, J.-J.; Ho, H.-N.; Lee, J.-H. A Vision-Based Dynamic Rotational Angle Measurement System for Large Civil Structures. Sensors 2012, 12, 7326–7336. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  25. Shang, Y.; Yu, Q.; Yang, Z.; Xu, Z.; Zhang, X. Displacement and Deformation Measurement for Large Structures by Camera Network. Opt. Lasers Eng. 2014, 54, 247–254. [Google Scholar] [CrossRef]
  26. Malesa, M.; Malowany, K.; Pawlicki, J.; Kujawinska, M.; Skrzypczak, P. Non-Destructive Testing of Industrial Structures with the Use of Multi-Camera Digital Image Correlation Method. Eng. Fail. Anal. 2016, 69, 122–134. [Google Scholar] [CrossRef]
  27. Malowany, K.; Malesa, M.; Kowaluk, T.; Kujawinska, M. Multi-Camera Digital Image Correlation Method with Distributed Fields of View. Opt. Lasers Eng. 2017, 98, 198–204. [Google Scholar] [CrossRef]
  28. Aliansyah, Z.; Shimasaki, K.; Jiang, M.; Takaki, T.; Ishii, I.; Yang, H.; Umemoto, C.; Matsuda, H. A Tandem Marker-Based Motion Capture Method for Dynamic Small Displacement Distribution Analysis. J. Robot. Mechatron. 2019, 31, 671–685. [Google Scholar] [CrossRef]
  29. Sun, C.; Liu, H.; Shang, Y.; Chen, S.; Yu, Q. Scheimpflug Camera-Based Stereo-Digital Image Correlation for Full-Field 3D Deformation Measurement. J. Sens. 2019, 2019, 5391827. [Google Scholar] [CrossRef] [Green Version]
  30. Li, J.; Guo, Y.; Zhu, J.; Lin, X.; Xin, Y.; Duan, K.; Tang, Q. Large Depth-of-View Portable Three-Dimensional Laser Scanner and Its Segmental Calibration for Robot Vision. Opt. Lasers Eng. 2007, 45, 1077–1087. [Google Scholar] [CrossRef]
  31. Miks, A.; Novak, J.; Novak, P. Analysis of Imaging for Laser Triangulation Sensors under Scheimpflug Rule. Opt. Express 2013, 21, 18225–18235. [Google Scholar] [CrossRef]
  32. Brownjohn, J.M.W.; Xu, Y.; Hester, D. Vision-Based Bridge Deformation Monitoring. Front. Built Environ. 2017, 3, 23. [Google Scholar] [CrossRef] [Green Version]
  33. Duda, A.; Frese, U. Accurate Detection and Localization of Checkerboard Corners for Calibration. 2018. Available online: http://bmvc2018.org/contents/papers/0508.pdf (accessed on 4 May 2022).
  34. Yoneyama, S.; Ueda, H. Bridge Deflection Measurement Using Digital Image Correlation with Camera Movement Correction. Mater. Trans. 2012, 53, 285–290. [Google Scholar] [CrossRef] [Green Version]
  35. Zhou, X.; Wang, J.; Mou, X.; Li, X.; Feng, X. Robust and High-Precision Vision System for Deflection Measurement of Crane Girder with Camera Shake Reduction. IEEE Sens. J. 2020, 21, 7478–7489. [Google Scholar] [CrossRef]
  36. Servotest—Test & Motion Simulation. Available online: https://www.servotestsystems.com/ (accessed on 4 May 2022).
Figure 1. Limitations in vision-based bridge displacement measurement using conventional camera. (a) Side-view measurement. (b) Front-view measurement.
Figure 2. Scheimpflug principle in a 2D view, where gray shade represents DOF.
Figure 3. Scheimpflug camera-based system.
Figure 4. Scheimpflug camera.
Figure 5. Camera motions during measurement.
Figure 6. Measuring unit in the image plane.
Figure 9. Experiment setup and target selection. (a) Experiment setup, where blue shade represents FOV. (b) Scheimpflug camera. (c) The imaging result of the Scheimpflug camera. (d) The imaging result of the conventional camera.
Figure 10. The measurement results obtained by the Scheimpflug camera without/with motion compensation. (a) Displacement results of camera translation in the x direction. (b) Displacement results of camera translation in the y direction. (c) Displacement results of camera rotation around the x-axis. (d) Displacement results of camera rotation around the y-axis.
Figure 11. The corrected displacements of target 165 obtained with different tilt angles.
Figure 12. Experiment setup of the outdoor test.
Figure 13. A comparison of the imaging results between the Scheimpflug camera and the conventional camera. (a) The imaging result of the Scheimpflug camera. (b) The imaging result of the conventional camera.
Figure 14. Displacement results of target 3 in the x (left column) direction and y (right column) direction. (a) d = 50 m, L = 20 m. (b) d = 50 m, L = 40 m. (c) d = 50 m, L = 80 m. (d) d = 80 m, L = 20 m. (e) d = 80 m, L = 40 m. (f) d = 80 m, L = 80 m.
Figure 15. Experiment setup of the bridge model experiment. (a) Geometric configuration in the bridge model experiment (top-view). (b) Geometric configuration in the bridge model experiment (side-view). (c) The Scheimpflug camera. (d) The LDV installed at the bottom of the model. (e) An example of images captured by the Scheimpflug camera.
Figure 16. Dynamic displacements from 5 targets. (a) Displacements along the x direction. (b) Displacements along the y direction.
Figure 17. Comparison of the displacements with the Scheimpflug camera and LDV. (a) Displacements in the x direction. (b) Displacements in the y direction.
Table 1. RMSEs with/without camera motion compensation.
Acquisition             With Compensation (mm)   Without Compensation (mm)   Reduction in RMSE (%)
                          x       y                x       y                   x       y
d = 50 m, L = 20 m        0.14    0.09             3.44    2.63                96      97
d = 50 m, L = 40 m        0.27    0.14             1.97    3.12                86      96
d = 50 m, L = 80 m        0.21    0.20             2.01    7.13                90      97
d = 80 m, L = 20 m        0.21    0.19             1.78    4.25                88      95
d = 80 m, L = 40 m        0.50    0.29             1.31    3.43                62      92
d = 80 m, L = 80 m        0.33    0.54             2.31    8.27                86      93

Share and Cite

MDPI and ACS Style

Xing, L.; Dai, W.; Zhang, Y. Scheimpflug Camera-Based Technique for Multi-Point Displacement Monitoring of Bridges. Sensors 2022, 22, 4093. https://doi.org/10.3390/s22114093

