6.2. Experimental Environment
After power-testing three different IMUs, the MPU9250 showed the lowest power consumption. Since this project mainly targets eyes for robots, the MPU6050 and WT9011G4K were eliminated, and the MPU9250 was selected for the follow-up experiments. To verify the effectiveness of the EKF fusion method, shown in Table 2 and Table 3, the verification system is described below.
The MPU9250 sensor transmits data via the IIC (I2C) protocol. In the MPU9250, each gyroscope and accelerometer measurement consists of a high byte and a low byte, which are combined to form one complete 16-bit reading. When the selected range is ±16 g, the x-axis acceleration of the sensor is calculated as:

$$ a_x = \frac{(A_{xH} \ll 8)\,|\,A_{xL}}{32768} \times 16g $$

where $a_x$ is the x-axis acceleration, $A_{xH}$ is the high byte, $A_{xL}$ is the low byte, and $g$ is the gravitational acceleration.
Similarly, the y-axis and z-axis accelerations can be calculated:

$$ a_y = \frac{(A_{yH} \ll 8)\,|\,A_{yL}}{32768} \times 16g, \qquad a_z = \frac{(A_{zH} \ll 8)\,|\,A_{zL}}{32768} \times 16g $$
When the selected range is ±2000 dps, the x-axis angular velocity of the sensor is calculated as:

$$ \omega_x = \frac{(G_{xH} \ll 8)\,|\,G_{xL}}{32768} \times 2000 $$

where $\omega_x$ is the x-axis angular velocity, $G_{xH}$ is the high byte, and $G_{xL}$ is the low byte.
Similarly, the y-axis and z-axis angular velocities can be calculated:

$$ \omega_y = \frac{(G_{yH} \ll 8)\,|\,G_{yL}}{32768} \times 2000, \qquad \omega_z = \frac{(G_{zH} \ll 8)\,|\,G_{zL}}{32768} \times 2000 $$
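As a minimal sketch of the byte-combination step described above, the following helpers reconstruct a signed 16-bit reading from its high and low bytes and scale it to physical units (the ±16 g and ±2000 dps ranges from the text; the function names are illustrative):

```python
def to_signed16(high, low):
    """Combine a high byte and a low byte into one signed 16-bit integer."""
    value = (high << 8) | low
    return value - 65536 if value & 0x8000 else value

def accel_g(high, low, full_scale=16.0):
    """Raw accelerometer bytes -> acceleration in g (±16 g range assumed)."""
    return to_signed16(high, low) / 32768.0 * full_scale

def gyro_dps(high, low, full_scale=2000.0):
    """Raw gyroscope bytes -> angular velocity in dps (±2000 dps range assumed)."""
    return to_signed16(high, low) / 32768.0 * full_scale
```

For example, raw bytes 0x40, 0x00 (count 16384, half of full scale) map to 8 g on the accelerometer, and 0x20, 0x00 to 500 dps on the gyroscope.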
The real data of the magnetic field sensor are obtained in a similar way.
Given that the magnetometer and accelerometer data are normalized in the EKF, the raw readings are simply divided by the maximum value; multiplying by the range to obtain a specific physical value is unnecessary. For example, the x-axis acceleration:

$$ a_x = \frac{(A_{xH} \ll 8)\,|\,A_{xL}}{32768} $$
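A small sketch of that shortcut: because the EKF normalizes the accelerometer and magnetometer vectors, any constant scale factor (the range multiplier) cancels, so dividing the signed raw counts by 32768 is sufficient. The function names here are illustrative:

```python
import math

def to_signed16(high, low):
    """Combine high/low bytes into a signed 16-bit integer."""
    value = (high << 8) | low
    return value - 65536 if value & 0x8000 else value

def normalized_vector(raw_bytes):
    """raw_bytes: [(AxH, AxL), (AyH, AyL), (AzH, AzL)] -> unit vector.
    Dividing by 32768 (without the range multiplier) is enough, since
    normalization removes any constant scale factor anyway."""
    v = [to_signed16(h, l) / 32768.0 for h, l in raw_bytes]
    norm = math.sqrt(sum(c * c for c in v))
    return [c / norm for c in v]
```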
The acceleration and angular velocity obtained from the IMU are processed by the data fusion algorithm, and the fused values are compared with the raw IMU readings. The STM32 applies the data fusion algorithm to the angle data of the MPU9250 and sends the fused angle to the upper computer, thereby providing attitude-angle feedback.
6.3. Detection of the Attitude of the Bionic Eye under External Interference
In this experiment, CF, GD, and EKF were applied to the original gyroscope and accelerometer data of the MPU9250 for comparison. The data fusion method of CF is:

$$ \theta(k) = K\,\theta_{a}(k) + (1 - K)\left[\theta(k-1) + \omega(k)\,dt\right] $$

where $\theta(k)$ is the output angle of the first-order filter, $K$ is the proportion coefficient, $\theta_{a}(k)$ is the angle derived from the MPU9250 acceleration at this moment, $\omega(k)$ is the gyroscope value at this moment, and $dt$ is the interval over which the gyroscope corrects the acceleration value. In this experiment, $K$ was set to 0.02 and $dt$ was set to 0.005.
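One step of the first-order complementary filter can be sketched directly from that formula, using the K = 0.02 and dt = 0.005 values stated above (the function name is illustrative):

```python
def complementary_filter(theta_prev, theta_acc, omega_gyro, K=0.02, dt=0.005):
    """One first-order complementary-filter step: blend the accelerometer-derived
    angle (weight K) with the gyro-integrated previous angle (weight 1 - K)."""
    return K * theta_acc + (1.0 - K) * (theta_prev + omega_gyro * dt)
```

With a small K, the output mostly follows the low-drift short-term gyro integration while the accelerometer slowly pulls out the long-term drift.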
The data fusion method of GD is:

$$ \theta(k) = c\left[\theta(k-1) - \mu\,\nabla f(k)\right] + (1 - c)\left[\theta(k-1) + \omega(k)\,dt\right] $$

where $\theta(k)$ is the output angle of GD, $\nabla f(k)$ is the gradient descent value computed from the MPU9250 acceleration value at the same time, $\mu$ is the learning rate, $c$ is the current-time ratio coefficient, and $\omega(k)$ is the gyroscope value at the same time. In this experiment, $c$ and $\mu$ were set to 0.5 and 0.005, respectively.
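A minimal sketch of one GD fusion step follows, assuming a simple quadratic error $f = \tfrac{1}{2}(\theta - \theta_{a})^2$ as a stand-in for the paper's gradient term (the actual objective used in the study is not spelled out here); c = 0.5 and μ = 0.005 are the values stated above:

```python
def gradient_descent_step(theta_prev, theta_acc, omega_gyro,
                          c=0.5, mu=0.005, dt=0.005):
    """One GD fusion step (sketch): a gradient-descent correction toward the
    accelerometer-derived angle, blended with gyro integration by ratio c."""
    grad = theta_prev - theta_acc          # gradient of 0.5*(theta - theta_acc)^2
    theta_gd = theta_prev - mu * grad      # gradient-descent correction
    theta_gyro = theta_prev + omega_gyro * dt
    return c * theta_gd + (1.0 - c) * theta_gyro
```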
In the present study, x and y represent the left–right and up–down movement of the bionic eye, respectively. Since the bionic eye does not move forward or backward in the test environment, the Z-axis coordinate was uniformly set to 200 mm. The upper computer then issued instructions to measure, through the MPU9250, the actual coordinates for gradient descent filtering, complementary filtering, and EKF in succession under the same environment, temperature, and motion posture, with external vibration interference. In this way, the comparison between actual and expected coordinates in Table 4 was obtained, and the methods from two peer references were added for numerical comparison. The effectiveness of each method was measured by the mean square error between the actual and expected positions of the bionic eye. The mean square error formula is shown below:
$$ MSE = \frac{1}{n}\sum_{i=1}^{n}\left[(x_i - \hat{x}_i)^2 + (y_i - \hat{y}_i)^2 + (z_i - \hat{z}_i)^2\right] $$

where $(x_i, y_i, z_i)$ is the predicted coordinate and $(\hat{x}_i, \hat{y}_i, \hat{z}_i)$ is the expected coordinate.
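The error metric above can be sketched as a short function over lists of 3-D points (names illustrative):

```python
def mean_square_error(predicted, expected):
    """MSE between predicted and expected 3-D coordinates,
    given as equal-length lists of (x, y, z) tuples."""
    n = len(predicted)
    total = 0.0
    for (x, y, z), (xe, ye, ze) in zip(predicted, expected):
        total += (x - xe) ** 2 + (y - ye) ** 2 + (z - ze) ** 2
    return total / n
```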
From the data in Table 4, the data in Table 5 were obtained by means of the mean square error formula. The average angle errors of gradient descent filtering, complementary filtering, and the EKF algorithm were 0.62°, 0.61°, and 0.43°, respectively, which shows that the EKF algorithm can effectively reduce the error of the bionic eye attitude angle.
The eye was then held level while sensor data were read continuously. Table 6 shows part of the yaw angle in the static case. The angle offset generated by EKF under the static condition is smaller than that of the other two algorithms.
Under static conditions, the gyroscope produces a large error, resulting in data fluctuation. Table 7 shows the static yaw angle errors after fusion with the three filters: 0.1017°, 0.1001°, and 0.0462°, respectively. The error of EKF is smaller than that of the other two algorithms, indicating that EKF can effectively reduce the attitude angle error of the bionic eye in a static environment.
Using the gyroscope and accelerometer data obtained from the MPU9250, the bionic eye was rotated up and down by 40° from the static state and left and right by 80°, then returned to the initial position. After gradient descent filtering correction, the estimated curve was compared with the raw gyroscope and accelerometer curves of the IMU. According to Figure 15, gradient descent filtering gradually begins to converge, but the effect is not as good as expected.
Figure 16 shows the curve after complementary filtering when the bionic eye rotates 40° up and down from the static state and 80° left and right, then returns to the initial position. Complementary filtering computes a weighted average of the new sampling value and the previous filtering result. The complementary filter is therefore strongly affected by both the accelerometer and the gyroscope, and its output is relatively delayed and noise-sensitive.
According to Figure 17, the bionic eye was rotated up and down by 40° from a stationary position and left and right by 80° before returning to its original position. The data obtained while the MPU9250 sensor moves are the result of EKF data fusion. The raw acceleration data are not smooth; nonetheless, after EKF data fusion, the process curve is smooth and the filtered data are close to the actual data, reducing the error.
According to the above experiments, EKF data fusion significantly improves the attitude detection accuracy of the bionic eye. The EKF algorithm also enhances the anti-interference ability of the bionic eye, with a clear filtering convergence and smoothing effect. The EKF takes the data measured by the gyroscope as the estimated value and the data measured by the accelerometer as the observed value. By using the accelerometer's attitude angle to correct the gyroscope's attitude angle, it mitigates the gyroscope's error and zero drift, which are easily affected by external disturbances. As a result, the precision of bionic eye movement control is improved, and the robot bionic eye can be positioned more accurately and stably, collecting images more effectively.
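The gyro-as-prediction, accelerometer-as-observation structure described above can be illustrated with a minimal scalar predict/update cycle. This is a simplified one-dimensional Kalman sketch, not the paper's full EKF; the noise values Q and R are illustrative, not the study's tuning:

```python
def kalman_step(theta, P, omega_gyro, theta_acc, dt=0.005, Q=1e-4, R=1e-2):
    """One predict/update cycle: gyro integration forms the prediction,
    the accelerometer-derived angle is the observation that corrects it."""
    # Predict: integrate the gyro rate and grow the estimate uncertainty
    theta_pred = theta + omega_gyro * dt
    P_pred = P + Q
    # Update: correct the prediction with the accelerometer angle
    K = P_pred / (P_pred + R)              # Kalman gain
    theta_new = theta_pred + K * (theta_acc - theta_pred)
    P_new = (1.0 - K) * P_pred
    return theta_new, P_new
```

Each cycle shrinks the uncertainty P, so the filter smooths the noisy accelerometer observations while the gyro prediction suppresses drift between updates.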
In addition, in this experiment, the difference between the fixed-point coordinates and the actual coordinates of the bionic eye was used to test the accuracy of the control system, in order to verify the feasibility of the proposed algorithm intuitively and accurately. A self-made binocular camera embedded in the bionic eye was used, with a focal length of 6 mm and a field of view of 90°. The reference is a black-and-white checkerboard generated in MATLAB. Checkerboard corner points were used as marking points and extracted with a subpixel-level corner method, as shown in Figure 18. The world coordinate system (WCS) is defined as follows: the X-axis is parallel to the width of the board, positive to the right; the Y-axis is parallel to the height of the board, positive downward; and the Z-axis is perpendicular to the checkerboard plane, positive facing outward from the board. The origin of the world coordinate system (0 mm, 0 mm, 0 mm) is the calibration corner point at row 1, column 1. In the presence of external vibration interference, the posture correction system can be switched on and off to test how accurately the left and right cameras of the bionic eye capture the chessboard and extract the world coordinates.
According to Figure 18, A is the original picture taken by the left and right cameras with posture correction off while the bionic eye is disturbed by external vibration. B shows the tracking and calibration result with the attitude correction system off (which is also the real-time behavior of most current binocular robot vision techniques); in this case, image processing must rely on complex neural networks or computationally heavy algorithms running on a high-performance GPU. Image C is the original picture after the posture correction system is turned on. With attitude correction enabled, picture distortion is significantly reduced and the checkerboard in the picture is no longer distorted, which shows that the attitude correction system plays a key role in picture shooting quality, target tracking, and calibration calculation. In addition, images carrying attitude data are more convenient for post-processing, image correction, and calculation.