Article

Robot Bionic Eye Motion Posture Control System

1
Jewxon Intelligent Technology Co., Ltd., No. 293, Gongren East Road, Jiaojiang District, Taizhou 318000, China
2
School of Computer Science, Semyung University, 65 Semyung-ro, Jecheon-si 27136, Republic of Korea
*
Author to whom correspondence should be addressed.
Electronics 2023, 12(3), 698; https://doi.org/10.3390/electronics12030698
Submission received: 8 January 2023 / Revised: 17 January 2023 / Accepted: 24 January 2023 / Published: 30 January 2023
(This article belongs to the Section Artificial Intelligence)

Abstract:
This paper studies the structure and control system of a robot bionic eye. Given that most robots run on battery power, the highly efficient, ultra-low-power STM32L053C8T6 was selected as the main controller. An onboard IMU acquires the bionic eye attitude quickly and accurately, and the accelerometer and gyroscope measurements are fused by an algorithm to obtain stable and accurate attitude data. The motors are then precisely controlled through a drive control system based on the PCA9685, which enhances the motion control precision of the robot bionic eye. In the present study, three IMU sensors, the MPU6050, MPU9250, and WT9011G4K, were tested, and the MPU9250, with better power consumption and adaptability, was finally selected as the attitude acquisition device of the bionic eye. In addition, three different filters, CF, GD, and EKF, were used for data fusion and compared. The experimental results showed that the dynamic mean errors of CF, GD, and EKF are 0.62°, 0.61°, and 0.43°, respectively, and the static mean errors are 0.1017°, 0.1001°, and 0.0462°, respectively. Thus, with the EKF, the robot bionic eye system designed in this paper significantly reduces the attitude angle error and effectively improves image quality. It ensures accuracy while reducing power consumption and cost, placing lower demands on hardware and making the system easier to popularize.

1. Introduction

1.1. The Research Backgrounds

Given that the vision system is an important device for machines to acquire external information, the bionic eye mimicking the human eye is an important research direction in the study of robot vision systems. The robot bionic eye consists of a camera, motors with multiple degrees of freedom, an IMU, an attitude control system, a drive system, and an image processing system. The IMU and attitude control system obtain accurate position and attitude data of the bionic eye, then control motors with different degrees of freedom through the drive system to adjust the camera acquisition direction, thus achieving simulated eye movement. A bionic eye has a high ability of autonomous perception and recognition. It can effectively improve the interactive experience between machine and human, and can be applied to humanoid service robots, binocular autonomous robots, security patrol robots, medical service robots, and others. In this design, from the perspective of machine bionics, combining the structure, motion posture, and motion mechanism of the human eye, a human eye imitation system is constructed in the hope of laying a solid foundation for the study of robot bionic vision posture correction.
An attitude detection system is an inertial measurement unit that detects the attitude changes of its carrier. In the microcontroller, data fusion and attitude solution are performed to obtain carrier attitude information such as the yaw angle, pitch angle, and roll angle [1]. An IMU is an inertial measurement device that can obtain the attitude angle of a moving vehicle in real time. IMU technology has developed rapidly and has been widely used in attitude detection systems due to its small size, low cost, high integration, and strong impact resistance [2]. Although a single IMU can be used for attitude angle measurement alone, the accuracy depends on the precision of the inertial device. It is difficult to greatly improve control accuracy only by improving the movement structure and algorithm of the bionic eye; systematic errors accumulate over time, and external interference and other factors corrupt the carrier attitude data. Hence, it is hard to obtain accurate attitude angle data with a single sensor. Semwal et al. extracted data from hips, knees, ankles, calves, thighs, and feet by capturing six different walking patterns with configured IMUs [3]. To improve the accuracy of attitude angle measurement, multi-sensor signal fusion should be considered to obtain the optimal attitude angle and reduce errors in the movement and tracking of the bionic eye, so that the master control system can give reasonable instructions and achieve accurate control. Attitude detection systems face increasing demands on power consumption and precision as their application fields continue to expand, yet research on attitude detection of the robot bionic eye under ultra-low power consumption is still blank. A low-power-consumption bionic eye motion and posture system is therefore proposed.
In this regard, research on low-power attitude detection systems based on IMU is of practical significance.

1.2. Status of Bionic Eye Research

At present, bionic eye movement is mostly driven by motors, and many research institutions have achieved fruitful results in this area. In 1996, the Active Vision Laboratory of Oxford University developed the Yorick series binocular active vision platform to achieve fast tracking of bright spots [4]. The Tokyo Institute of Technology, which started bionic eye research in 1995, analyzed the function and characteristics of signal processing based on human HR neural circuits. Through this analysis, it developed a mathematical model and experimental apparatus of the nervous system controlling horizontal eye movement of both eyes, which can realize human eye movement characteristics such as saccade, smooth pursuit, vergence movement, and vestibular movement reflex, and can further realize active motion compensation for six degrees of freedom, such as translation and rotation [5]. Notwithstanding, most of these designs can achieve only one or a few of the movements of the human eye. The eye imitation system of the Tokyo Institute of Technology can realize multiple human eye movements at the same time, but only as one-dimensional horizontal motion. Therefore, the human eye imitation system is not yet perfect [6,7,8,9,10,11,12,13,14,15,16,17]. In 2012, Yi-Chiao Lee came up with a two-degree-of-freedom camera with parallel motors. The rotation axis was changed by connecting rods, so that the parallel motors could rotate with two degrees of freedom [18,19].
In 2014, Xu Yanliang from Shandong University proposed a control method for a 4-DOF bionic eye. A single-chip microcomputer was used to control the rotation of the two bionic eyes, and subdivided current was used to drive the motor of the bionic eye [20]. In 2015, Hao Li from Shandong University made use of two stepper motors to control the movement of the bionic eye with two degrees of freedom, and PWM was used to control the current waveform to improve the motor rotation accuracy [21]. In 2018, a stator magnet hybrid stepper motor designed by Lu Binglin reduced the installation space of the bionic eye and improved its operation and positioning accuracy [22].

1.3. Research Status of Multi-Sensor Combination

In 1984, the Draper Laboratory at MIT successfully designed a miniature inertial measurement combination consisting of three gyroscopes and three accelerometers with corresponding control circuits, which was applied in practical work and provided the basis for the development of attitude detection equipment [23]. In 2005, Yang Lisha et al. designed an attitude solution system using the 6-DOF inertial measurement unit provided by Rotomotion, LLC combined with a MEGA16; however, given the immaturity of the technology, the attitude calculation function was only preliminarily realized on that platform [24]. In 2018, Chuan Liu selected the six-axis MEMS inertial measurement unit MPU6050, the integrated anisotropic magnetoresistance (AMR) three-axis magnetic sensor HMC5883, and the barometer BMP180 as attitude measurement components to design a small attitude measurement system for an autonomous sailboat [25]. At present, both AMR and GMR magnetoresistive sensor technologies are available on the market, the latter with higher sensitivity than the former; depending on the design requirements of the system, either can be adopted in an attitude detection system. Zhejiang Sci-Tech University designed a MEMS gyroscope attitude detection system [26]. An attitude detection system based on a MEMS magnetometer was proposed by Xi’an University of Science and Technology in its study on attitude detection of a tracked robot [27].
The VN-100 series attitude and heading reference system, manufactured by VectorNav Technologies, uses a three-axis gyroscope, a three-axis accelerometer, and an air pressure sensor as its main attitude sensors. Combined with a 32-bit processor and the corresponding attitude calculation algorithm, its volume is 36 × 33 × 9 mm, and an external RS-232 interface can transmit attitude data. Under static conditions, the pitch and roll angle error is 0.5°, while the yaw angle error is 2°, which is suitable for UAVs, human attitude recognition, and other occasions [28]. Advanced Navigation from Australia developed a tiny ultra-high-precision MEMS IMU called Motus using ultra-high-precision MEMS accelerometers and gyroscopes. All sensors are fully calibrated over a wide temperature range, resulting in a static pitch and roll angle error of 0.03° and a yaw angle error of 0.1° [29].

1.4. Comparison of Data Fusion Algorithms

Multi-sensor data fusion algorithms for attitude solutions in the academic field mainly include gradient descent algorithm, complementary filtering algorithm, and Kalman filter algorithm.
Kalman derived a recursive estimation algorithm called the KF by introducing the state space model into filtering theory in 1960 [30]. With advances in digital computing, the KF has been used more and more widely. In its continuous development, the EKF (extended Kalman filter) algorithm emerged: an efficient recursive filter (autoregressive filter) used when the state equation or measurement equation is nonlinear. The EKF truncates the Taylor expansion of the nonlinear function at the first-order term and ignores the remaining higher-order terms. In this way, the linear Kalman filtering algorithm can be applied to nonlinear systems, transforming the nonlinear problem into a linear one.
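To make the linearization step concrete, the following minimal scalar sketch (illustrative only, not the paper's 7-state attitude filter) applies the EKF predict/update cycle to a random-walk state observed through the nonlinear measurement z = x² + v; all numbers are made up for the example:

```python
# Illustrative scalar EKF: state x is a random walk, measurement z = x^2 + v.
def ekf_step(x_est, P, z, Q, R):
    # Predict: identity dynamics, so only the covariance grows.
    x_pred = x_est
    P_pred = P + Q
    # Linearize h(x) = x^2 around the prediction: H = dh/dx = 2x.
    H = 2.0 * x_pred
    # Kalman gain and posterior update.
    K = P_pred * H / (H * P_pred * H + R)
    x_new = x_pred + K * (z - x_pred ** 2)
    P_new = (1.0 - K * H) * P_pred
    return x_new, P_new

x, P = 1.0, 1.0
for z in [4.1, 3.9, 4.05]:   # noisy measurements of x^2 with true x = 2
    x, P = ekf_step(x, P, z, Q=0.01, R=0.1)
```

After three updates the estimate settles near the true value x = 2, while the covariance P shrinks, which is the closed-loop behaviour described in the comparison below.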
Qiang Fu et al. came up with a practical EKF method for conventional cameras and fisheye cameras. A general correspondence method was designed that can deal with any number of feature points in the EKF implementation. According to simulation and practical experiments, the proposed EKF method resists noise and occlusion better than current filtering methods [31]. Ban Chao et al. designed an improved adaptive extended Kalman filter (EKF) attitude measurement algorithm; experimental results showed that the EKF can effectively improve attitude measurement accuracy compared with the Sage-Husa adaptive filtering algorithm [32]. Guo Jie et al. established a MATLAB-Simulink attitude simulation and control model of a four-rotor UAV and used a neural network PID control algorithm to shorten the transition time from 4 s to 1 s. The static error of the system after output stabilization is only 0.5%, indicating that this control algorithm has high control precision and good static and dynamic characteristics [33]. An extended quaternion Kalman filter algorithm for attitude estimation of a small UAV was proposed by Shi Yupeng. By establishing a quaternion attitude motion model and an attitude sensor measurement model, this algorithm addresses the low precision and susceptibility to interference of MEMS-based carrier attitude measurement, thus effectively improving the accuracy of attitude estimation; its effectiveness was verified by simulation experiments [34]. By means of motion control experiments on a spherical robot, Long Ziyang et al. verified that the accuracy and robustness of the attitude solution based on the EKF are obviously improved, with a stronger ability to resist noise interference. Compared with the attitude solution of the complementary filter, the total attitude angle RMS (root-mean-square) error and the mean error decreased by 0.0601 and 0.1984, respectively [35].
The state estimation performance of the KF and EKF is excellent; however, the computation is heavy. In the early stages, the computing speed of control chips was slow and had difficulty meeting the high-speed requirements of the EKF algorithm, which limited the application of the EKF in the field of motor control. Recently, the computing speed of many MCUs has improved immensely, which lays the foundation for the application of the EKF [36].
According to Table 1, the EKF has the advantages of higher fusion precision, a faster convergence rate, and more accurate angles compared with GD and CF; however, it requires more computation. The EKF fuses observation data with estimated data to manage errors in a closed-loop manner. When the error is limited to a certain range, it does not grow with the accumulation of time, whereas open-loop estimation becomes more uncertain the longer it runs, which may result in uncontrollable errors in long-duration control. The introduction of observation data corrects the estimated data and prevents large estimation errors; integrating the observations is equivalent to closed-loop feedback management of the estimate. Although the EKF still has errors, it has the advantage of maintaining stable errors over a long time. In general, the EKF achieves a commendable filtering effect for nonlinear functions in the absence of severe interference.

1.5. Main Content and Chapter Arrangement of This Paper

The main content of this paper is to detect the motion attitude of the bionic eye, to understand the principle by which the sensor detects this motion attitude, and to describe the attitude detection process. We aim to detect the acceleration and angular velocity of the bionic eye movement by means of an IMU and to correct the data via an extended Kalman filter. The chapters of this paper are arranged as follows:
In Section 1, the development of bionic eye attitude control technology is reviewed.
In Section 2, the overall design scheme of the bionic eye system is introduced.
Section 3 introduces the design of the bionic eye attitude detection algorithm and the process of data fusion.
In Section 4, the hardware selection, circuit design, and process of the bionic eye attitude detection system are introduced.
In Section 5, the software design process and method of the bionic eye posture detection system are introduced.
In Section 6, the experimental results obtained with three different IMU sensors are presented and analyzed. By comparing GD, CF, and EKF, the anti-interference ability and accuracy of the three methods are tested.
Section 7 analyzes the problems remaining in current research technology and summarizes the research results of this paper.

2. Materials and Methods

2.1. Principle of Bionic Eye Motion Attitude Detection and Selection of Experimental Materials

Eyes are the most important sense organs for humans to acquire external information and perceive the external environment, and they are responsible for at least 80% of human information acquisition tasks. Robot eyes need to imitate the movement of the human eye to achieve the bionic effect [38]. In this design, the motion attitude of the bionic eye is detected. The bionic eye adjusts its angle via two motors to achieve different observation positions: the two motors control the rotation of the bionic eye about the X axis and Y axis, respectively. The point where the two motor shafts intersect is the center of the bionic eye, which is the installation position of the attitude sensor. The two motors controlling how the bionic eye rotates are shown in Figure 1:
Figure 1 shows the IMU sensor, which can detect three axes [39]. The attitude solution of the bionic eye is based on its attitude angle detection data. The detection data of the gyroscope and accelerometer need to be fused to ensure the accuracy of the bionic eye attitude angle data [40,41,42]. Given that the bionic eye is designed to be compact, the space occupied by the sensor should be as small as possible, and a device that occupies fewer I/O port resources should be selected. The IMU attitude detection module used in this study to detect the attitude angle of the bionic eye adopts a 16-bit ADC internally and can transmit data through IIC, so only two I/O ports are needed [43]. The measurement range of the sensor can be adjusted according to the needs of the user: the gyroscope detection range is ±250, ±500, ±1000, or ±2000 dps, and the accelerometer range is ±2, ±4, ±8, or ±16 g, enabling accurate detection of both fast and slow movements of the bionic eye [44,45]. The sensor output is programmable and should be set according to the different measurements of the bionic eye attitude.
According to Figure 2, when the IMU acquires the pose angle data of the bionic eyeball, the information is transmitted to the attitude control system through IIC; the attitude control system, after processing by the fusion algorithm, sends the attitude angle error correction information to the drive control system through USART. After obtaining the accurate attitude angle, the drive control system combines the autonomous motion algorithm to drive the corresponding motors through PWM, thus realizing flexible rotation of the eyeball up and down and left and right.
In order to enable the IMU to read the correct movement posture of the bionic eye, the IMU is fixed in the bionic eye and connected with the MCU through IIC. The rotation posture of the IMU is synchronized with that of the bionic eye, and the rotation axes of the sensor yaw and pitch angles are kept parallel with the rotation axes of the bionic eye's left-right and up-down rotation motors, as shown in Figure 3.
The bionic eye model has mechanisms that mimic the movements of the human eye. Each eyeball is equipped with a high-definition camera and IMUs. The electric control system area contains a high-performance GPU image control system, a drive control system, and an attitude control system equipped with multiple motors, realizing eye movement and eyelid blinking functions. Taking the left eye of the equipment as an example, the bionic left eye movement is driven by two motors: sub-motor 1 is installed parallel to the horizontal plane and connected to the bionic eyeball through a horizontal connecting rod to control its up-and-down movement, while sub-motor 2 is installed perpendicular to the horizontal plane and connected to the bionic eyeball by left and right horizontal connecting rods to control its left-and-right movement. The attitude control system is connected with the drive control system and used with the image control system to drive the bionic eye motor angle motion. Figure 4 shows the structure drawing and physical drawing of the bionic eye's experimental model.

2.2. Overall Scheme Design

Figure 5 shows that the IMU sensor captures the acceleration and angular velocity through the accelerometer and gyroscope, respectively, when the bionic eye rotates. The STM32 then processes the signals to obtain the attitude angle of the bionic eye and fuses the two attitude angle estimates to obtain a more stable attitude angle. The camera obtains video guided by the attitude angle signal, and the video algorithm predicts the category. When the predicted class is true, the image control system transmits a signal to shut down the drive control system, keeping the mechanical eye temporarily fixed. When the predicted category is false, the image control system does not transmit the signal, and the drive control system receives the attitude angle signal of the bionic eye to rotate the mechanical eye.

2.3. Principle of Detection Attitude Angle of Gyroscope

Gyroscopes are designed to sense and maintain direction based on the principle of conservation of angular momentum. A gyroscope mainly consists of a rotatable rotor. The rotor is mounted on the axis of the gyroscope. As the gyroscope starts to rotate, due to the angular momentum of the rotor, the gyroscope will have a tendency to resist the change of direction. As the object moves at a high speed, the object has an extremely large angular momentum, and the rotation axis of the object will always point in a stable direction. The gyroscope is a directional measurement instrument made as per this property.
A gyroscope measures angular velocity when the object rotates about its own coordinate axes. The attitude angle of the object is then obtained by integrating the angular velocity [46].
Angular velocity:
$$\boldsymbol{W} = \left[\omega_x,\ \omega_y,\ \omega_z\right]^T$$
Angle obtained after integrating the angular velocity:
$$\varphi_k = \int_{t_{k-1}}^{t_k} \omega_x \, dt$$

$$\theta_k = \int_{t_{k-1}}^{t_k} \omega_y \, dt$$

$$\psi_k = \int_{t_{k-1}}^{t_k} \omega_z \, dt$$
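In discrete time, these integrals reduce to accumulating rate samples over the sampling period. A minimal sketch with a hypothetical gyro stream (the values are made up for illustration):

```python
# Discrete-time version of the integrals above: accumulate angular-rate
# samples (deg/s) over the sampling period T (rectangular integration).
def integrate_rate(samples_dps, T):
    angle = 0.0
    for w in samples_dps:
        angle += w * T
    return angle

# Hypothetical gyro stream: a constant 10 deg/s for 0.5 s sampled at 100 Hz.
angle = integrate_rate([10.0] * 50, T=0.01)
# Any constant bias in w accumulates linearly into `angle`,
# which is the drift problem discussed in the text.
```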
As the gyroscope is prone to being affected by external factors, a zero drift error arises; as the integration time increases, the zero drift error accumulates continuously and the detection result becomes inaccurate. To this end, it is not reasonable to rely merely on the gyroscope, although it tracks changes of the attitude angle well while the object is moving. The dynamic characteristics of a gyroscope are good, but its static characteristics are poor.

2.4. Principle of Detecting Attitude Angle by Accelerometer

An accelerometer is a kind of inertial sensor which detects the deformation generated by the static acceleration component acting on the sensor, and derives the attitude angle of the object from that deformation. Typically, the reference for the accelerometer is the direction of gravitational acceleration, and the static acceleration is orthogonally decomposed onto the X and Y axes. As shown in Figure 6, the attitude angle of the object can be expressed as the inverse sine of the ratio of the acceleration component to the acceleration of gravity [47,48].
βX in Figure 6 can be indicated as:
$$\beta_X = \sin^{-1}\!\left(\frac{g_X}{g}\right)$$
wherein, gX refers to the component of the acceleration as a result of gravity on the X-axis of the coordinate system.
Therefore, βY and βZ represent:
$$\beta_Y = \sin^{-1}\!\left(\frac{g_Y}{g}\right), \qquad \beta_Z = \sin^{-1}\!\left(\frac{g_Z}{g}\right)$$
wherein gY and gZ are the components of the acceleration of gravity on the Y-axis and Z-axis of the object coordinate system.
When the object under measurement is in motion, it is subject to inertial forces other than gravity, so the measurement results are inaccurate during motion. Moreover, the yaw rotation axis is parallel to the direction of gravity, so rotating about it does not change the measured gravity vector; thus, the accelerometer cannot detect the yaw angle of the object.
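The tilt computation above can be sketched as follows; the gravity value and the sample reading are illustrative, not measured data from the paper:

```python
import math

# Tilt angle from a static accelerometer reading: beta = asin(g_axis / g).
def tilt_deg(g_axis, g=9.81):
    return math.degrees(math.asin(g_axis / g))

# Component seen on the X axis when the sensor is tilted 30 degrees.
gx = 9.81 * math.sin(math.radians(30.0))
angle = tilt_deg(gx)
```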

3. Preliminaries

3.1. EKF Design Based on Quaternion

The ENU coordinate system is adopted in this paper. The carrier coordinate system of the IMU is right-front-up (x, y, z), and the rotation order of the Euler angles is x, y, z (yaw, pitch, roll). The quaternion-based EKF mainly adopts two layers of Kalman filtering: the attitude angle is first estimated from the gyroscope and then corrected by the accelerometer and magnetometer measurement models. The acceleration is mainly involved in correcting the pitch and roll angles, and the magnetometer is mainly involved in correcting the yaw angle [49]. Figure 7 shows the attitude solution flow chart.

3.2. Calculate Initial Euler Angle

First, the initial Euler angle was calculated by the accelerometer and magnetometer. The formula is shown below:
$$x_0 = \arcsin\!\left(\frac{a_y}{\sqrt{a_x^2 + a_y^2 + a_z^2}}\right), \qquad y_0 = \operatorname{atan2}\!\left(-a_x,\ a_z\right)$$

$$\begin{bmatrix} m_x^a \\ m_y^a \end{bmatrix} = \begin{bmatrix} m_x^b \cos y_0 + m_z^b \sin y_0 \\ m_x^b \sin y_0 \sin x_0 + m_y^b \cos x_0 - m_z^b \cos y_0 \sin x_0 \end{bmatrix}, \qquad z_0 = \operatorname{atan2}\!\left(m_y^a,\ m_x^a\right)$$
wherein x0, y0, and z0 are the initial pitch angle, initial roll angle, and initial yaw angle, respectively.
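A small sketch of this initialization, assuming the axis conventions above; the sensor values are made up for illustration:

```python
import math

# Initial Euler angles from one accelerometer and one magnetometer sample.
def initial_euler(ax, ay, az, mx, my, mz):
    norm = math.sqrt(ax * ax + ay * ay + az * az)
    pitch = math.asin(ay / norm)
    roll = math.atan2(-ax, az)
    # Tilt-compensated horizontal magnetic components for the yaw angle.
    mxa = mx * math.cos(roll) + mz * math.sin(roll)
    mya = (mx * math.sin(roll) * math.sin(pitch) + my * math.cos(pitch)
           - mz * math.cos(roll) * math.sin(pitch))
    yaw = math.atan2(mya, mxa)
    return pitch, roll, yaw

# Level, north-pointing case: gravity on +z, horizontal field along x.
p, r, y = initial_euler(0.0, 0.0, 1.0, 0.3, 0.0, -0.4)
```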
Then, the Euler Angle is converted to quaternion to obtain the initial quaternion, which is used to represent the rotation. The formula is shown below:
$$\begin{aligned} q_1 &= \cos\tfrac{p}{2}\cos\tfrac{r}{2}\cos\tfrac{y}{2} - \sin\tfrac{p}{2}\sin\tfrac{r}{2}\sin\tfrac{y}{2} \\ q_2 &= \cos\tfrac{p}{2}\sin\tfrac{r}{2}\cos\tfrac{y}{2} - \sin\tfrac{p}{2}\cos\tfrac{r}{2}\sin\tfrac{y}{2} \\ q_3 &= \cos\tfrac{p}{2}\sin\tfrac{r}{2}\sin\tfrac{y}{2} + \sin\tfrac{p}{2}\cos\tfrac{r}{2}\cos\tfrac{y}{2} \\ q_4 &= \cos\tfrac{p}{2}\cos\tfrac{r}{2}\sin\tfrac{y}{2} + \sin\tfrac{p}{2}\sin\tfrac{r}{2}\cos\tfrac{y}{2} \end{aligned}$$
wherein p, r, and y represent pitch, roll, and yaw, respectively, namely the rotation angles about the x, y, and z axes of the IMU carrier coordinate system.
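The conversion can be written directly from the half-angle products above; the signs follow the reconstruction given here and should be checked against the chosen rotation order:

```python
import math

# Euler angles (pitch p, roll r, yaw y) to quaternion; q1 is the scalar part.
def euler_to_quat(p, r, y):
    cp, sp = math.cos(p / 2), math.sin(p / 2)
    cr, sr = math.cos(r / 2), math.sin(r / 2)
    cy, sy = math.cos(y / 2), math.sin(y / 2)
    q1 = cp * cr * cy - sp * sr * sy
    q2 = cp * sr * cy - sp * cr * sy
    q3 = cp * sr * sy + sp * cr * cy
    q4 = cp * cr * sy + sp * sr * cy
    return q1, q2, q3, q4

q = euler_to_quat(0.0, 0.0, 0.0)   # identity rotation
```

A quick sanity check is that zero angles give the identity quaternion and that the result always has unit norm.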

3.3. Initialize Noise

The Q process noise covariance matrix and the R observation noise covariance matrix are initialized, assuming that the noise on each axis is uncorrelated and independent. They are written in the following form:

$$Q = \begin{bmatrix} \omega_n & 0 \\ 0 & \omega_{bn} \end{bmatrix}, \qquad R = \begin{bmatrix} \omega_a & 0 \\ 0 & \omega_m \end{bmatrix}$$

wherein $\omega_n$ is the gyro noise, $\omega_{bn}$ is the gyro bias noise, $\omega_a$ is the measurement noise of the accelerometer, and $\omega_m$ is the measurement noise of the magnetometer.
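A minimal sketch of this initialization in pure Python; the variance values are placeholders, not the paper's tuned parameters:

```python
# Diagonal covariance helper: independent, uncorrelated per-axis noise.
def diag(values):
    n = len(values)
    return [[values[i] if i == j else 0.0 for j in range(n)] for i in range(n)]

# Q: gyro noise on the 4 quaternion states + bias noise on the 3 bias states.
Q = diag([1e-4] * 4 + [1e-6] * 3)
# R: accelerometer noise (3 axes) + magnetometer noise (3 axes).
R = diag([1e-2] * 3 + [1e-1] * 3)
```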

3.4. Prediction Process

The state transition equation is a quaternion differential equation. The formula is shown below:
$$\dot{q}_{nb} = \tfrac{1}{2}\,\Omega_{nb}^{n}\, q_{nb}, \qquad \begin{bmatrix} \dot{q}_0 \\ \dot{q}_1 \\ \dot{q}_2 \\ \dot{q}_3 \end{bmatrix} = \frac{1}{2} \begin{bmatrix} 0 & -\omega_x & -\omega_y & -\omega_z \\ \omega_x & 0 & \omega_z & -\omega_y \\ \omega_y & -\omega_z & 0 & \omega_x \\ \omega_z & \omega_y & -\omega_x & 0 \end{bmatrix} \begin{bmatrix} q_0 \\ q_1 \\ q_2 \\ q_3 \end{bmatrix}$$
The above is the quaternion differential equation from the navigation coordinate system to the carrier coordinate system in the continuous domain. In practical use the Kalman filter operates in the discrete domain, so the equation needs to be discretized, derived as follows:
State transition equation:
$$\hat{x}_k^- = A_k \hat{x}_{k-1} + B_k u_k$$
The quaternion differential equation gives:

$$\dot{q}_{nb} = \lim_{T \to 0} \frac{q_{nb}(t+T) - q_{nb}(t)}{T} = A_{TC}\, q_{nb}(t)$$

$$q_{nb}(t+T) = q_{nb}(t) + A_{TC}\, q_{nb}(t)\, T = \left(I + A_{TC}\, T\right) q_{nb}(t)$$

$$A_{k-1} = I + A_{TC}\, T = I + \tfrac{1}{2}\,\Omega_{nb}^{n}\, T$$

$$A_{k-1} = \begin{bmatrix} 1 & -\tfrac{T}{2}(\omega_x - \omega_{bx}) & -\tfrac{T}{2}(\omega_y - \omega_{by}) & -\tfrac{T}{2}(\omega_z - \omega_{bz}) \\ \tfrac{T}{2}(\omega_x - \omega_{bx}) & 1 & \tfrac{T}{2}(\omega_z - \omega_{bz}) & -\tfrac{T}{2}(\omega_y - \omega_{by}) \\ \tfrac{T}{2}(\omega_y - \omega_{by}) & -\tfrac{T}{2}(\omega_z - \omega_{bz}) & 1 & \tfrac{T}{2}(\omega_x - \omega_{bx}) \\ \tfrac{T}{2}(\omega_z - \omega_{bz}) & \tfrac{T}{2}(\omega_y - \omega_{by}) & -\tfrac{T}{2}(\omega_x - \omega_{bx}) & 1 \end{bmatrix}$$

$A_{k-1}$ is only part of the state transition matrix, since the gyroscope bias must also be taken into account; the state variable has dimension 7 (quaternion plus bias). The final state transition matrix is:

$$\begin{bmatrix} q_{k+1} \\ \omega_{b,k+1} \end{bmatrix} = \begin{bmatrix} I + \tfrac{1}{2}\,\Omega_{nb}^{n}\, T & 0 \\ 0 & I \end{bmatrix} \begin{bmatrix} q_k \\ \omega_{bk} \end{bmatrix}$$
The quaternion of the state transition equation is normalized.
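One discrete prediction step, i.e. propagating the quaternion with the bias-corrected rate and then normalizing it, can be sketched as follows (pure Python, sample values illustrative):

```python
import math

# One prediction step: q_k = (I + T/2 * Omega(w - w_b)) q_{k-1}, then normalize.
def predict_quat(q, w, wb, T):
    wx, wy, wz = (w[0] - wb[0], w[1] - wb[1], w[2] - wb[2])
    q0, q1, q2, q3 = q
    h = T / 2.0
    qn = [q0 + h * (-wx * q1 - wy * q2 - wz * q3),
          q1 + h * ( wx * q0 + wz * q2 - wy * q3),
          q2 + h * ( wy * q0 - wz * q1 + wx * q3),
          q3 + h * ( wz * q0 + wy * q1 - wx * q2)]
    n = math.sqrt(sum(c * c for c in qn))   # re-normalize to unit length
    return [c / n for c in qn]

# Small yaw rate of 0.1 rad/s for 10 ms, zero bias, starting at identity.
q = predict_quat([1.0, 0.0, 0.0, 0.0], w=[0.0, 0.0, 0.1], wb=[0.0] * 3, T=0.01)
```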
Set

$$\begin{bmatrix} q_{k+1} \\ \omega_{b,k+1} \end{bmatrix} = g(x_k), \qquad x_k = \begin{bmatrix} q_k \\ \omega_{bk} \end{bmatrix}$$

The Jacobian matrix of the state transition equation is:

$$G_k = \frac{\partial g(x_k)}{\partial x_k}$$

The result is:

$$G_k = \begin{bmatrix} I + \tfrac{1}{2}\,\Omega_{nb}^{n}\, T & -\tfrac{T}{2} L(q_k) \\ 0_{3 \times 4} & I_{3 \times 3} \end{bmatrix}$$
The second step of the prediction process is the covariance propagation:

$$P_k^- = G_k P_{k-1} G_k^T + Q_{k-1}$$

3.5. Calculating Measurement Equation

First, find the measurement equation of the accelerometer. The derivation process is as follows; the accelerometer output is expressed in units of g:

$$R_n^b = \begin{bmatrix} q_0^2 + q_1^2 - q_2^2 - q_3^2 & 2q_1q_2 - 2q_0q_3 & 2q_1q_3 + 2q_0q_2 \\ 2q_1q_2 + 2q_0q_3 & q_0^2 - q_1^2 + q_2^2 - q_3^2 & 2q_2q_3 - 2q_0q_1 \\ 2q_1q_3 - 2q_0q_2 & 2q_2q_3 + 2q_0q_1 & q_0^2 - q_1^2 - q_2^2 + q_3^2 \end{bmatrix}$$

$R_n^b$ is the transformation matrix from the navigation system to the carrier coordinate system; this matrix is independent of the rotation order. The projection of the gravity vector into carrier coordinates is:

$$h_1(q_k) = \left(R_n^b\right)^T \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix} = \begin{bmatrix} 2q_1q_3 - 2q_0q_2 \\ 2q_2q_3 + 2q_0q_1 \\ q_0^2 - q_1^2 - q_2^2 + q_3^2 \end{bmatrix}$$
Since the accelerometer also measures non-gravitational acceleration, the above measurement equation is only satisfied when the carrier is not moving violently. In this regard, the measurement noise of the acceleration measurement equation is adjusted adaptively according to the intensity of motion.
Let the triaxial accelerometer data be $a_x^b, a_y^b, a_z^b$, and let $\omega_a$ be the accelerometer measurement noise. The measurement equation of the accelerometer is then:

$$\begin{bmatrix} a_x^b \\ a_y^b \\ a_z^b \end{bmatrix} = \begin{bmatrix} 2q_1q_3 - 2q_0q_2 \\ 2q_2q_3 + 2q_0q_1 \\ q_0^2 - q_1^2 - q_2^2 + q_3^2 \end{bmatrix} + \omega_a$$
Next, the Jacobian matrix of the above measurement equation is obtained; the result is:

$$\frac{\partial h_1(q_k)}{\partial q_k} = 2 \begin{bmatrix} -q_2 & q_3 & -q_0 & q_1 \\ q_1 & q_0 & q_3 & q_2 \\ q_0 & -q_1 & -q_2 & q_3 \end{bmatrix}$$
The partial derivative with respect to the gyroscope bias is zero. Hence, the overall Jacobian matrix is:

$$\frac{\partial h_1(q_k)}{\partial x_k} = \begin{bmatrix} \dfrac{\partial h_1(q_k)}{\partial q_k} & 0_{3 \times 3} \end{bmatrix}$$
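The accelerometer measurement model and its quaternion Jacobian translate directly into code; this sketch follows the formulas above:

```python
# Gravity direction in the body frame, h1(q), and its 3x4 Jacobian w.r.t. q.
def h1(q):
    q0, q1, q2, q3 = q
    return [2 * q1 * q3 - 2 * q0 * q2,
            2 * q2 * q3 + 2 * q0 * q1,
            q0 * q0 - q1 * q1 - q2 * q2 + q3 * q3]

def h1_jacobian(q):
    q0, q1, q2, q3 = q
    return [[-2 * q2, 2 * q3, -2 * q0, 2 * q1],
            [ 2 * q1, 2 * q0,  2 * q3, 2 * q2],
            [ 2 * q0, -2 * q1, -2 * q2, 2 * q3]]

g_body = h1([1.0, 0.0, 0.0, 0.0])   # level attitude: gravity on the body z axis
```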
Next, find the measurement equation of the magnetometer. The magnetic field data are first projected into the navigation coordinate system using the quaternion rotation. Assume that the magnetic field data in the carrier coordinate system are $m_x^b, m_y^b, m_z^b$ and in the navigation coordinate system $m_x^n, m_y^n, m_z^n$; the magnetic field data also need to be normalized here:

$$\begin{bmatrix} 0 \\ m_x^n \\ m_y^n \\ m_z^n \end{bmatrix} = q_k \otimes \begin{bmatrix} 0 \\ m_x^b \\ m_y^b \\ m_z^b \end{bmatrix} \otimes q_k^{*}$$
In the magnetic field measurement model, the navigation-frame field is taken to have no lateral (east) component, i.e., the horizontal field is collected into the north axis. Hence, the data are processed as below:

$$\hat{m}^n = \begin{bmatrix} 0 \\ \sqrt{(m_x^n)^2 + (m_y^n)^2} \\ m_z^n \end{bmatrix}$$
Then, convert the magnetic field data back to the carrier coordinate system:

$$\hat{m}^b = \left(R_n^b\right)^T \begin{bmatrix} 0 \\ \hat{m}_y^n \\ \hat{m}_z^n \end{bmatrix} = \begin{bmatrix} \hat{m}_y^n (2q_1q_2 + 2q_0q_3) + \hat{m}_z^n (2q_1q_3 - 2q_0q_2) \\ \hat{m}_y^n (q_0^2 - q_1^2 + q_2^2 - q_3^2) + \hat{m}_z^n (2q_2q_3 + 2q_0q_1) \\ \hat{m}_y^n (2q_2q_3 - 2q_0q_1) + \hat{m}_z^n (q_0^2 - q_1^2 - q_2^2 + q_3^2) \end{bmatrix}$$
Hence, the final measurement equation is as follows, in which $\omega_m$ is the magnetometer measurement noise mentioned above:

$$\begin{bmatrix} m_x^b \\ m_y^b \\ m_z^b \end{bmatrix} = \begin{bmatrix} \hat{m}_y^n (2q_1q_2 + 2q_0q_3) + \hat{m}_z^n (2q_1q_3 - 2q_0q_2) \\ \hat{m}_y^n (q_0^2 - q_1^2 + q_2^2 - q_3^2) + \hat{m}_z^n (2q_2q_3 + 2q_0q_1) \\ \hat{m}_y^n (2q_2q_3 - 2q_0q_1) + \hat{m}_z^n (q_0^2 - q_1^2 - q_2^2 + q_3^2) \end{bmatrix} + \omega_m$$
Next, the Jacobian matrix of the magnetometer measurement equation is obtained:

$$\frac{\partial \hat{m}^b}{\partial q_k} = 2 \begin{bmatrix} \hat{m}_y^n q_3 - \hat{m}_z^n q_2 & \hat{m}_y^n q_2 + \hat{m}_z^n q_3 & \hat{m}_y^n q_1 - \hat{m}_z^n q_0 & \hat{m}_y^n q_0 + \hat{m}_z^n q_1 \\ \hat{m}_y^n q_0 + \hat{m}_z^n q_1 & \hat{m}_z^n q_0 - \hat{m}_y^n q_1 & \hat{m}_y^n q_2 + \hat{m}_z^n q_3 & \hat{m}_z^n q_2 - \hat{m}_y^n q_3 \\ \hat{m}_z^n q_0 - \hat{m}_y^n q_1 & -\hat{m}_y^n q_0 - \hat{m}_z^n q_1 & \hat{m}_y^n q_3 - \hat{m}_z^n q_2 & \hat{m}_y^n q_2 + \hat{m}_z^n q_3 \end{bmatrix}$$
The complete Jacobian matrix including the magnetic field measurement equation is:

$$H_k = \begin{bmatrix} \dfrac{\partial h_1(q_k)}{\partial x_k} \\[4pt] \dfrac{\partial h_2(q_k)}{\partial x_k} \end{bmatrix} = \begin{bmatrix} \dfrac{\partial h_1(q_k)}{\partial q_k} & 0_{3 \times 3} \\[4pt] \dfrac{\partial \hat{m}^b}{\partial q_k} & 0_{3 \times 3} \end{bmatrix}$$
The total measurement equation is:

$$h_k(\hat{x}_k^-) = \begin{bmatrix} 2q_1q_3 - 2q_0q_2 \\ 2q_2q_3 + 2q_0q_1 \\ q_0^2 - q_1^2 - q_2^2 + q_3^2 \\ \hat{m}_y^n (2q_1q_2 + 2q_0q_3) + \hat{m}_z^n (2q_1q_3 - 2q_0q_2) \\ \hat{m}_y^n (q_0^2 - q_1^2 + q_2^2 - q_3^2) + \hat{m}_z^n (2q_2q_3 + 2q_0q_1) \\ \hat{m}_y^n (2q_2q_3 - 2q_0q_1) + \hat{m}_z^n (q_0^2 - q_1^2 - q_2^2 + q_3^2) \end{bmatrix} + \begin{bmatrix} \omega_{a\,(3 \times 1)} \\ \omega_{m\,(3 \times 1)} \end{bmatrix}$$
Calculate the Kalman gain:
$$K_k = P_k^- H_k^T \left(H_k P_k^- H_k^T + R\right)^{-1}$$
Update the posterior state:
$$\hat{x}_k = \hat{x}_k^- + K_k\left(Z_k - h_k(\hat{x}_k^-)\right)$$
where $Z_k$ is the normalized acceleration and magnetic field measurement vector.
Update the posterior covariance matrix:
$$P_k = \left(I - K_k H_k\right) P_k^-$$
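The gain, state, and covariance updates above can be sketched in C. The snippet below is an illustrative sketch, not the authors' firmware: it performs a sequential (scalar-measurement) EKF update, processing one measurement row at a time so that the matrix inverse in the gain reduces to a scalar division; the 4-element state stands in for the quaternion, and all names are assumptions.

```c
#include <assert.h>
#include <stddef.h>

#define N 4  /* state dimension; here a quaternion-only state for illustration */

/* One scalar-measurement EKF update: z is the measurement, hx = h(x) the
 * predicted measurement, H one row of the Jacobian, R its noise variance.
 * Updates x and P in place per x = x + K(z - h(x)), P = (I - K H) P. */
void ekf_scalar_update(double P[N][N], const double H[N], double R,
                       double x[N], double z, double hx) {
    double PHt[N], S = R;
    for (size_t i = 0; i < N; i++) {            /* PHt = P * H^T */
        PHt[i] = 0.0;
        for (size_t j = 0; j < N; j++) PHt[i] += P[i][j] * H[j];
    }
    for (size_t i = 0; i < N; i++) S += H[i] * PHt[i];  /* S = H P H^T + R */

    double K[N];
    for (size_t i = 0; i < N; i++) K[i] = PHt[i] / S;   /* Kalman gain */

    double innov = z - hx;                      /* innovation Z - h(x) */
    for (size_t i = 0; i < N; i++) x[i] += K[i] * innov;

    double KHP[N][N];                           /* (K H) P */
    for (size_t i = 0; i < N; i++)
        for (size_t j = 0; j < N; j++) {
            KHP[i][j] = 0.0;
            for (size_t k = 0; k < N; k++) KHP[i][j] += K[i] * H[k] * P[k][j];
        }
    for (size_t i = 0; i < N; i++)              /* P = P - (K H) P */
        for (size_t j = 0; j < N; j++) P[i][j] -= KHP[i][j];
}
```

Processing the six measurement rows one at a time this way is a standard alternative to inverting the full 6×6 innovation covariance, and it fits a small MCU well.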

3.6. Quaternion-to-Euler Angle Conversion

In the extended Kalman filter, the attitude data are represented by a quaternion and need to be converted into Euler angles for convenient display. Assume that the yaw, pitch, and roll angles are z, y, and x, respectively. The conversion formula is shown below.
$$\begin{aligned} x &= \arcsin\left(2q_2q_3 + 2q_0q_1\right) \\ y &= \operatorname{atan2}\left(2q_0q_2 - 2q_1q_3,\; q_0^2 - q_1^2 - q_2^2 + q_3^2\right) \\ z &= \operatorname{atan2}\left(2q_0q_3 - 2q_1q_2,\; q_0^2 - q_1^2 + q_2^2 - q_3^2\right) \end{aligned}$$

4. Hardware System Design

4.1. MCU Selection

Given that most robots are mobile and run on battery power, the processor must be chosen with the system's miniaturization, power consumption, interfaces, universality, and expansibility in mind. After screening the mainstream miniaturized embedded MCUs on the market, we chose the STM32L053C8T6, which combines low power consumption with a rich set of interfaces [50]. As the processor of the attitude control system, the STM32L053C8T6 has 48 pins. The pin diagram is shown in Figure 8:
The STM32L053C8T6 provides high-efficiency, low-power performance for the system; in standby mode, its power consumption is as low as 0.29 μA. The chip offers multiple internal and external clock sources, and the wide performance range required by this system is achieved by combining these clock sources with several low-power modes and adaptive internal voltage scaling. In addition, the STM32L053C8T6 integrates a rich set of communication interfaces: 3 USARTs, 3 I2Cs, 1 I2S, a 7-channel DMA controller, SPI, a crystal-free USB, and a low-power UART (LPUART).
Figure 9 shows the core circuit of the attitude control system. An HT7333 low-power management module converts the 5 V supply to 3.3 V to power the STM32L053C8T6. In addition, a debugging interface for flash reprogramming and a TTL-level control interface are provided, which support recognition by and communication with the host computer. The USART interface transmits the precise attitude angle obtained after fusion.

4.2. IMU Circuit Design

This design requires detecting the attitude angle of the bionic eye robot. A variety of sensors can detect this attitude angle, such as accelerometers and gyroscopes. Nonetheless, a suitable sensor must be chosen in view of space limitations and external environmental interference. Besides, the eyes form a binocular stereoscopic pair, and a single sensor cannot fully detect the bionic eyes. This design therefore selects an appropriate gyroscope and accelerometer combination to detect the attitude angle of the bionic eye. To control overall system power consumption, the IMU must also be low power. To this end, three different sensors were prepared for comparison: MPU6050 [51], MPU9250 [52], and WT9011G4K [53]. The MPU6050, launched by InvenSense, is the world's first integrated 6-axis motion processing component, integrating a 3-axis gyroscope and a 3-axis accelerometer. The MPU9250, also launched by InvenSense, is a 9-axis motion tracking device that integrates a 3-axis gyroscope, a 3-axis accelerometer, and an AK8963 3-axis magnetometer. The WT9011G4K, a high-performance 3D motion attitude measurement system based on MEMS technology, contains a three-axis gyroscope, a three-axis accelerometer, and a three-axis electronic compass. A typical IMU circuit diagram is shown in Figure 10.

4.3. Drive Control System Circuit Design

The system uses a BaseCAM stepper motor from GYEMS. A PCA9685 motor drive chip was used to drive the motor [54]. As Figure 11 shows, the PCA9685 drive board has a high-current MOSFET H-bridge structure with two-channel circuit output, so it can drive two or more motors simultaneously.

5. Software System Design

The software design of this system is based on the STM32 MCU hardware platform and mainly implements bionic eye driving and tracking. The programming language is C, and the development tool is Keil.

5.1. Software Flow

The software flow chart of the system is shown in Figure 12. A 15 ms timer interrupt is used to ensure a stable control period. After the system is powered on, it initializes the function modules and peripherals. To prevent interrupts from disturbing the register writes performed during initialization, the controller first disables all interrupts and then initializes each module and peripheral. When everything is ready, the 15 ms periodic interrupt is enabled to end the initialization process, and the timer interrupt function then executes automatically every 15 ms. After entering the interrupt function, the IMU sensor data are read first. From these data, the master chip calculates the inclination angle, angular velocity, and other attitude data of the bionic eye. EKF is then performed to obtain accurate values, and appropriate PWM signals are output according to the accurate attitude data to control the rotation of the motor and maintain the balance of the bionic eye. If the camera captures an object, the video classification model predicts its category. If a target is predicted, a timing signal is given to continue driving the bionic eye to follow the target, thus achieving target tracking.
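The flow above can be sketched as a periodic control loop. The skeleton below is an illustrative sketch only: the hardware calls (IMU read, EKF fusion, PWM output) are stubbed as comments, all names are ours, and on the real STM32 the body would run inside the 15 ms timer ISR rather than a host-side loop.

```c
/* Skeleton of the 15 ms control loop from Figure 12, with hardware stubbed. */
int g_steps = 0;  /* counts completed control periods (for the simulation) */

void control_step(void) {
    /* 1. read IMU registers            (stub) */
    /* 2. compute inclination/attitude  (stub) */
    /* 3. EKF data fusion               (stub) */
    /* 4. output PWM to the motors      (stub) */
    g_steps++;
}

/* Host-side simulation: each tick stands in for one 15 ms timer interrupt. */
int run_control_loop(void (*step)(void), int ticks) {
    for (int t = 0; t < ticks; t++)
        step();
    return ticks;
}
```

On the target, `control_step` would be called from the timer interrupt handler after initialization enables the 15 ms period.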

5.2. PID Software Design

When the binocular bionic eyes tilt, the system uses a PID algorithm to keep them balanced: it fuses the bionic eye angle, angular speed, acceleration, and position data and outputs a PWM signal to drive the motor. For the angle and angular velocity data fusion and output control structure diagram, in which kp and kd are the PID controller parameters, refer to Figure 13.
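The angle/angular-velocity branch of Figure 13 amounts to a PD law: kp acts on the angle error and kd on the angular rate. The sketch below is illustrative only; the structure, names, and gains are assumptions, not values from the paper.

```c
/* PD balance control: kp on angle error, kd damping on angular velocity.
 * The return value would be mapped to a PWM duty command on the real system. */
typedef struct { double kp, kd; } PDController;

double pd_output(const PDController *c, double angle_err, double angular_vel) {
    return c->kp * angle_err - c->kd * angular_vel;  /* rate term damps motion */
}
```

Using the gyroscope rate directly as the derivative term (rather than differentiating the angle) is a common choice because it avoids amplifying measurement noise.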

6. Materials and Methods

6.1. Power Consumption Evaluation

Most robots are mobile, so they are usually powered by batteries. The biggest problem with battery power is components with high power consumption, which drain the battery quickly. As the main component for acquiring the robot bionic eye attitude information, the IMU should therefore also be selected for low power consumption. In this experiment, three IMUs, the MPU9250, MPU6050, and WT9011G4K, were installed in the bionic eye model in turn, and their power consumption was tested in the independent motion state. The results were as follows: the MPU9250 drew 3.28 mA, the MPU6050 drew 5 mA, and the WT9011G4K drew 11.26 mA. The test results show that the MPU9250 has the lowest power consumption. Refer to Figure 14:

6.2. Experimental Environment

After power-testing the three different IMUs, the MPU9250 has the lowest power consumption. Because this project mainly designs eyes for battery-powered robots, the MPU6050 and WT9011G4K were eliminated, and the MPU9250 was selected for the follow-up experiments. To verify the effectiveness of the EKF fusion method, shown in Table 2 and Table 3, the verification system is described below.
The MPU9250 sensor transmits data via the IIC protocol. In the MPU9250, each gyroscope and accelerometer measurement consists of a high byte and a low byte, which are combined to form a complete measurement. When the selected full-scale range is ±16 g, the x-axis acceleration of the sensor is calculated as:
$$a_x = \frac{\left(\mathrm{AxH} \ll 8\right) \,|\, \mathrm{AxL}}{32768} \times 16\ g$$
wherein $a_x$ is the x-axis acceleration, AxH is the high byte, AxL is the low byte, and $g$ is the gravitational acceleration.
Similarly, the y-axis and z-axis accelerations can be calculated:
$$a_y = \frac{\left(\mathrm{AyH} \ll 8\right) \,|\, \mathrm{AyL}}{32768} \times 16\ g$$
$$a_z = \frac{\left(\mathrm{AzH} \ll 8\right) \,|\, \mathrm{AzL}}{32768} \times 16\ g$$
When the selected full-scale range is ±2000 dps, the x-axis angular velocity of the sensor is calculated as:
$$g_x = \frac{\left(\mathrm{GxH} \ll 8\right) \,|\, \mathrm{GxL}}{32768} \times 2000$$
wherein $g_x$ is the x-axis angular velocity, GxH is the high byte, and GxL is the low byte.
Similarly, y axis and z axis angular velocities can be calculated:
$$g_y = \frac{\left(\mathrm{GyH} \ll 8\right) \,|\, \mathrm{GyL}}{32768} \times 2000$$
$$g_z = \frac{\left(\mathrm{GzH} \ll 8\right) \,|\, \mathrm{GzL}}{32768} \times 2000$$
The real data of the magnetic field sensor are obtained in a similar way.
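The byte-combining and scaling steps above can be coded compactly. The snippet below is an illustrative sketch (helper names are ours): it reassembles the signed 16-bit raw value from the high and low register bytes and applies the ±16 g and ±2000 dps full-scale factors used in this design.

```c
#include <assert.h>
#include <stdint.h>

/* Combine the high and low bytes of an axis register pair into the
 * signed 16-bit raw value. */
static int16_t combine_bytes(uint8_t high, uint8_t low) {
    return (int16_t)(((uint16_t)high << 8) | low);
}

/* Accelerometer reading in units of g, for the ±16 g full-scale range. */
double accel_g(uint8_t axh, uint8_t axl) {
    return combine_bytes(axh, axl) / 32768.0 * 16.0;
}

/* Gyroscope reading in deg/s, for the ±2000 dps full-scale range. */
double gyro_dps(uint8_t gxh, uint8_t gxl) {
    return combine_bytes(gxh, gxl) / 32768.0 * 2000.0;
}
```

The cast to `int16_t` is what makes negative readings (high bit set) come out correctly; dividing by 32768 alone, as done for the normalized EKF inputs, just omits the final range factor.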
Given that the magnetic field and acceleration data are normalized in the EKF, the raw data only need to be divided by the maximum value; multiplying by the range to obtain a physical value is unnecessary. For example, for the x-axis acceleration:
$$a_x = \frac{\left(\mathrm{AxH} \ll 8\right) \,|\, \mathrm{AxL}}{32768}$$
The acceleration and angular velocity obtained by the IMU are processed by the data fusion algorithm. The STM32 fuses the angle data of the MPU9250 through the data fusion algorithm and sends the fused angle to the upper computer, thereby providing feedback of the attitude angle.

6.3. Detection of the Bionic Eye Attitude under External Interference

In this experiment, CF, GD, and EKF were applied to the original gyroscope and accelerometer data of the MPU9250 for comparison. The data fusion method of CF is:
$$Angle(x) = K \times angle\_m + \left(1 - K\right) \times \left(Angle(x-1) + gyro\_m \times L\right)$$
wherein $Angle(x)$ is the output angle of the first-order filter, $K$ is the proportion coefficient, $angle\_m$ is the angle measured by the MPU9250 accelerometer at this moment, $gyro\_m$ is the gyroscope value at this moment, and $L$ is the degree to which the gyroscope corrects the acceleration value. In this experiment, $K$ was set to 0.02 and $L$ to 0.005.
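One step of this first-order complementary filter is a single weighted sum. The function below is an illustrative sketch of that step (the function name is ours); K = 0.02 and L = 0.005 are the values used in the experiment.

```c
/* One step of the first-order complementary filter:
 * Angle(x) = K*angle_m + (1-K)*(Angle(x-1) + gyro_m*L). */
double cf_step(double prev_angle, double angle_m, double gyro_m,
               double K, double L) {
    return K * angle_m + (1.0 - K) * (prev_angle + gyro_m * L);
}
```

With a small K, the output tracks the integrated gyroscope in the short term while the accelerometer angle slowly pulls out the drift.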
The data fusion method of GD is:
$$dK = \frac{\partial\, Angle(x)}{\partial K}$$
$$k_n = k_{n-1} - \mu \times dK$$
$$Angle(x) = k_n \times angle\_m + \left(1 - k_n\right) \times gyro\_m$$
wherein $Angle(x)$ is the output angle of GD, $dK$ is the gradient descent value, $\mu$ is the learning rate, $k_n$ is the proportion coefficient at the current moment, $angle\_m$ is the MPU9250 accelerometer value at the same moment, and $gyro\_m$ is the gyroscope value at the same moment. In this experiment, the initial $k$ and $\mu$ were set to 0.5 and 0.005, respectively.
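The GD update can be sketched as follows. This is an illustrative sketch with names of our own choosing; the gradient value dK is taken as an input here, since the paper leaves its computation from the fusion error implicit.

```c
/* Gradient-descent adaptation of the fusion coefficient k_n:
 * k_n = k_{n-1} - mu*dK, then Angle(x) = k_n*angle_m + (1-k_n)*gyro_m. */
typedef struct { double k; } GDFilter;  /* holds the current coefficient */

double gd_step(GDFilter *f, double angle_m, double gyro_m,
               double dK, double mu) {
    f->k -= mu * dK;                               /* adapt the coefficient */
    return f->k * angle_m + (1.0 - f->k) * gyro_m; /* fused output */
}
```

Unlike plain CF, the blend coefficient here drifts over time in the direction that reduces the fusion error.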
In the present study, x and y represent the left-right and up-down movement of the bionic eye, respectively. Since the bionic eye does not move forward or backward in the test environment, the z-axis coordinate was uniformly set to 200 mm. The upper computer then gave instructions to measure, through the MPU9250, the actual coordinates under gradient descent filtering, complementary filtering, and EKF in turn, under the same environment, temperature, and motion posture and with external vibration interference. In this way, the comparison between actual and expected coordinates in Table 4 was obtained, and the methods of two peer references were added for numerical comparison. The effectiveness of each method was measured by the mean square error between the actual and expected positions of the bionic eye. The mean square error formula is shown below:
$$e = \sqrt{\frac{\left(x - x'\right)^2 + \left(y - y'\right)^2 + \left(z - z'\right)^2}{3}}$$
wherein $(x, y, z)$ is the predicted coordinate and $(x', y', z')$ is the expected coordinate.
According to the data in Table 4, the data in Table 5 were obtained by means of the mean square error formula. The average angle errors of complementary filtering, gradient descent filtering, and the EKF algorithm were 0.62°, 0.61°, and 0.43°, respectively, which proves that the EKF algorithm can effectively reduce the attitude angle error of the bionic eye.
The eye was held level while the sensor data were read continuously. Table 6 shows part of the yaw angle data in the static case. The angle offset generated by EKF under static conditions is smaller than that of the other two algorithms.
In the static condition, the gyroscope produces a large error, resulting in data fluctuation. Table 7 shows the static yaw angle errors after fusion with the three filters, which are 0.1017°, 0.1001°, and 0.0462°, respectively. The error of EKF is smaller than that of the other two algorithms, indicating that EKF can effectively reduce the attitude angle error of the bionic eye in a static environment.
Using the gyroscope and accelerometer data obtained from the MPU9250, the bionic eye was rotated up and down by 40° from the static state and left and right by 80°, and then turned back to the initial position. The curve estimated after gradient descent filtering correction was compared with the raw gyroscope and accelerometer curves of the IMU. According to Figure 15, gradient descent filtering gradually begins to converge, but the effect is not as good as expected.
Figure 16 shows the curve after complementary filtering for the same motion: the bionic eye rotates 40° up and down from the static state and 80° left and right, and then turns back to the initial position. Complementary filtering computes a weighted average of the new sampling value and the previous filtering result. Consequently, the complementary filter is strongly affected by the accelerometer and gyroscope, and its output is relatively delayed and noise sensitive.
According to Figure 17, the bionic eye was rotated up and down by 40° from the stationary position and left and right by 80° before returning to its original position; the data obtained while the MPU9250 sensor moves are the result of EKF data fusion. The raw acceleration data are not smooth, but after EKF data fusion the curve is smooth, the filtered data are close to the actual data, and the error is reduced.
According to the above experiments, EKF data fusion can significantly improve the attitude detection accuracy of the bionic eye. Additionally, the EKF algorithm enhances the anti-interference ability of the bionic eye and has an obvious filtering convergence and smoothing effect. The EKF algorithm takes the data measured by the gyroscope as the estimated value and the data measured by the accelerometer as the observed value. By using the attitude angle from the accelerometer to correct the attitude angle from the gyroscope, the gyroscope's susceptibility to external disturbance and zero drift is overcome. The precision of bionic eye movement control is thereby improved, and the robot bionic eye can position itself more accurately and stably and collect images more effectively.
In addition, in this experiment, the difference between the fixed-point coordinates and the actual coordinates of the bionic eye was used to test the accuracy of the control system, in order to verify the feasibility of the proposed algorithm intuitively and accurately. A self-made binocular camera embedded in the bionic eye was used; its focal length is 6 mm and its field of view is 90°. The reference is a black and white checkerboard generated with MATLAB. Checkerboard corner points were used as marking points, and a subpixel-level corner method was used to extract them, as shown in Figure 18. The world coordinate system (WCS) is defined as follows: the X-axis is parallel to the width of the board, positive to the right; the Y-axis is parallel to the height of the board, positive downward; and the Z-axis is perpendicular to the checkerboard plane, positive out of the board. The origin of the world coordinate system (0 mm, 0 mm, 0 mm) is defined at the calibration corner point in row 1, column 1. In the presence of external vibration interference, the posture correction system was turned on and off to investigate how accurately the left and right cameras of the bionic eye capture the chessboard and extract the world coordinates.
In Figure 18, A is the original picture taken by the left and right cameras with posture correction turned off while the bionic eye is disturbed by external vibration. B shows the tracking and calibration effect with the attitude correction system off (also the real-time effect of most current binocular robot vision technology); it must rely on complex neural networks or computationally heavy algorithms running on high-end GPUs for image processing. Image C is the original picture after the posture correction system is turned on. The picture distortion is significantly improved: the checkerboard in the picture is no longer distorted, which proves that the attitude correction system plays a key role in picture shooting quality, target tracking, and calibration calculation. In addition, an image with attitude data is more convenient for post-processing, image correction, and calculation.

7. Conclusions

In recent years, machine vision has become a popular research direction in the field of artificial intelligence. However, research on the robot bionic eye is still in its infancy: most work in this subdivision stays at the level of electronic control and algorithms for bionic eyes and lacks monitoring of the eye's own data, stability, and reliability. In this design, the structure and system design of the robot bionic eye are studied, especially the attitude control system of the bionic eye. Given that most robots usually run on battery power, an MCU with high efficiency and low power consumption was selected as the main control. By carrying an IMU, the bionic eye attitude data can be acquired quickly and accurately, and the measurement data of the accelerometer and gyroscope can be fused by the algorithm to obtain stable and accurate bionic eye attitude data; the precise control of the motor achieved through the drive system then enhances the motion control precision of the robot bionic eye. To verify the reliability of the design, three IMUs commonly used in industry were selected for power testing, and the attitude angle error of gyroscope and accelerometer data fusion was tested with three kinds of filters. Finally, the IMU and filter most suitable for this system were selected. The research is summarized below:
(1)
Reviewing the history of bionic eye research and attitude control techniques, the existing problems were identified, and the overall scheme of bionic eye posture detection was designed. In the hardware system, an MCU is adopted as the main control module of the bionic eye posture detection system, and the IMU obtains the posture data of the bionic eye; the two interact through IIC.
(2)
The process of the EKF algorithm was studied. The observation data and estimated data are fused by the EKF algorithm, limiting the error to a certain range. In this study, using the ENU coordinate system and a quaternion-based EKF, the attitude angle was estimated from the gyroscope through a two-layer Kalman filter. The measurement models of the accelerometer and magnetometer were then used for secondary correction, in which the accelerometer is mainly involved in calibrating the pitch and roll angles, and the magnetometer is mainly involved in calibrating the yaw angle.
(3)
The software and hardware of the bionic eye system were designed according to the specific functions of the attitude detection system. After screening the mainstream miniaturized embedded MCUs on the market, the STM32L053C8T6, which offers both ultra-low power consumption and abundant interfaces, was chosen as the main control. Paired with the IMU, it obtains the bionic eye movement and posture data quickly and accurately, thus realizing precise control of the PCA9685 drive system and motor.
(4)
In the experimental stage, three different IMU sensors were used to verify the reliability of the bionic eye designed in this paper, and CF, GD, and EKF were compared to test the anti-interference ability and accuracy of the three methods. Experimental results showed that the dynamic mean errors of CF, GD, and EKF are 0.62°, 0.61°, and 0.43°, respectively, and the static mean errors are 0.1017°, 0.1001°, and 0.0462°, respectively. By turning the posture correction system of the bionic eye on and off and comparing the checkerboard images from the left and right cameras, the experiment found that picture distortion became significantly smaller with the system on, which proves that the posture correction system plays a key role in picture shooting quality, target tracking, and calibration calculation. Moreover, an image with attitude data is more convenient for post-processing, image correction, and other calculations.
In the present study, the traditional bionic eye driving algorithm was improved through the overall system design and algorithm fusion. The EKF algorithm adopted here has a smaller error than gradient descent filtering and complementary filtering. Further, the system is designed for low power consumption and is more suitable for autonomous, battery-powered robots. Compared with higher-end systems, this design ensures accuracy while reducing power consumption and cost; it places lower requirements on hardware and is easier to popularize. Robotic bionic vision still has a long way to go, but with continued research and exploration, robots will come ever closer to understanding the world and taking over more human tasks.

Author Contributions

Conceptualization, H.Z.; methodology, H.Z.; validation, H.Z.; formal analysis, H.Z.; investigation, H.Z.; resources, H.Z.; writing—original draft preparation, H.Z.; writing—review and editing, H.Z.; supervision, S.L.; project administration, H.Z. All authors have read and agreed to the published version of the manuscript.

Funding

This study received no external funding.

Data Availability Statement

The data presented in this study are available upon request from the corresponding author.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
IMU: Inertial measurement unit
MCU: Microcontroller unit
MEMS: Micro-Electro-Mechanical Systems
USART: Universal Synchronous/Asynchronous Receiver/Transmitter
PWM: Pulse Width Modulation
PID: Proportional Integral Derivative
DOF: Degree of freedom
KF: Kalman Filter
EKF: Extended Kalman Filter
GD: Gradient descent
CF: Complementary Filter
ADC: Analog-to-digital converter
IIC: Inter-Integrated Circuit
ENU: East-North-Up local Cartesian coordinate system
I2C: Inter-Integrated Circuit
TTL: Transistor-Transistor Logic
SPI: Serial Peripheral Interface
I2S: Inter-IC Sound

References

  1. Wang, C.; Li, Z.; Kang, Y.; Li, Y.; Xos, M.P.; Pardo, X.M. Applying SLAM Algorithm Based on Nonlinear Optimized Monocular Vision and IMU in the Positioning Method of Power Inspection Robot in Complex Environment. Math. Probl. Eng. 2022, 2022, 3378163. [Google Scholar] [CrossRef]
  2. Liang, C.; Yu, C.; Qin, Y.; Wang, Y.; Shi, Y. DualRing: Enabling Subtle and Expressive Hand Interaction with Dual IMU Rings. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 2021, 5, 1–27. [Google Scholar] [CrossRef]
  3. Semwal, V.B.; Gaud, N.; Lalwani, P.; Bijalwan, V.; Alok, A.K. Pattern identification of different human joints for different human walking styles using inertial measurement unit (IMU) sensor. Artif. Intell. Rev. 2022, 55, 1149–1169. [Google Scholar] [CrossRef]
  4. Sharkey, P.M.; Murray, D.W.; Mclauchlan, P.F.; Brooker, J.P. Hardware development of the Yorick series of active vision systems. Microprocess. Microsyst. 1998, 21, 363–375. [Google Scholar] [CrossRef]
  5. Wakamatsu, H.; Zhang, X. Optical Axis Control System as Unification of Reflex and Pursuit Eye Movements. Trans. Electron. Inf. Syst. 1997, 111–117, 1688–1695. [Google Scholar]
  6. Liu, Y.; Zhu, D.; Peng, J.; Wang, X.; Wang, L.; Chen, L.; Li, J.; Zhang, X. Real-Time Robust Stereo Visual SLAM System Based on Bionic Eyes. IEEE Trans. Med. Robot. Bionics 2020, 2, 391–398. [Google Scholar] [CrossRef]
  7. Liu, Y.; Zhu, D.; Peng, J.; Wang, X.; Wang, L.; Chen, L.; Li, J.; Zhang, X. Robust Active Visual SLAM System Based on Bionic Eyes. In Proceedings of the 2019 IEEE International Conference on Cyborg and Bionic Systems (CBS), Munich, Germany, 18–20 September 2019; pp. 340–345. [Google Scholar]
  8. Li, B.; Zhang, X.; Sato, M. Pitch angle estimation using a Vehicle-Mounted monocular camera for range measurement. In Proceedings of the 2014 12th International Conference on Signal Processing (ICSP), Hangzhou, China, 19–23 October 2014; pp. 1161–1168. [Google Scholar]
  9. Zhang, X. Novel Human Fixational Eye Movements Detection using Sclera Images of the Eyeball. Jpn. J. Appl. Physiol. 2012, 42, 143–152. [Google Scholar]
  10. Song, Y.; Zhang, X. An Integrated System for Basic Eye Movements. J. Inst. Image Inf. Telev. Eng. 2012, 66, J453–J460. [Google Scholar] [CrossRef] [Green Version]
  11. Zhang, X. Wide Area Tracking System Using Three Zoom Cameras. Ph.D. Thesis, Tokyo Institute of Technology, Tokyo, Japan, 2011. [Google Scholar]
  12. Song, Y.; Zhang, X. Translational Vestibulo-Ocular Reflex Model for Robotic Binocular Motor Control System. J. Robot. Soc. Jpn. 2009, 27, 1123–1131. [Google Scholar] [CrossRef] [Green Version]
  13. Zhang, X. A Binocular Camera System for Wide Area Surveillance. J. Inst. Image Inf. Telev. Eng. 2009, 63, 1828–1837. [Google Scholar]
  14. Zhang, X. A Mathematical Model of a Neuron with Synapses based on Physiology. Nat. Preced. 2008. [Google Scholar] [CrossRef]
  15. Zhang, X. Cooperative Movements of Binocular Motor System. In Proceedings of the 2008 IEEE International Conference on Automation Science and Engineering, Arlington, VA, USA, 23–26 August 2008. [Google Scholar]
  16. Zhang, X. A Novel Methodology for High Accuracy Fixational Eye Movements Detection. In Proceedings of the 2012 4th International Conference on Bioinformatics and Biomedical Technology, Singapore, 26–28 February 2012. [Google Scholar]
  17. Zhang, X. An Object Tracking System Based on Human Neural Pathways of Binocular Motor System. In Proceedings of the 2006 9th International Conference on Control, Automation, Robotics and Vision, Singapore, 5–8 December 2006. [Google Scholar]
  18. Li, H.; Luo, J.; Chen, J.; Liu, Z.; Xie, S. Development of Robot Bionic Eye with Spherical Parallel Manipulator Based on Oculomotor Control Model. Prz. Elektrotechniczny 2012, 88, 1–7. [Google Scholar]
  19. Lee, Y.; Lan, C.; Chu, C.; Lai, C.; Chen, Y. A Pan-Tilt Orienting Mechanism With Parallel Axes of Flexural Actuation. IEEE/ASME Trans. Mechatron. 2013, 18, 1100–1112. [Google Scholar] [CrossRef]
  20. Li, H.; Xu, Y.; Tian, H.; Lu, B.; Gao, F. Driving and Controlling System of Four Degrees of Freedom Stepping Motors Used in Biomimetic Eye. Micromotors 2014, 47, 5–34. [Google Scholar]
  21. Li, H. Research on Stepper Motor Driving Control System for Bionic Eyeball. Master’s Thesis, Shandong University, Jinan, China, 2015. [Google Scholar]
  22. He, H.; Xu, Y.; Lu, B. 2-Dimensional Equivalent Model Analysis Method of a Novel Spherical 2-DOF Hybrid Stepping Motor. Micromotors 2018, 51, 12–16. [Google Scholar] [CrossRef]
  23. Mellit, A. Inertial Technology for the Future. IEEE Trans. Aerosp. Electron. Syst. 1984, 20, 414–444. [Google Scholar]
  24. Lv, C.; Zhang, X.; Tan, J.; Chao, Z. Hardware acceleration technology for UAV attitude computation based on Cortex-M4F kernel. China Meas. Test. Technol. 2005, 44, 106–111. [Google Scholar]
  25. Liu, C.; Yang, S.; Duan, T.; Huang, J.; Wang, Z. Motion Control of an One-meter Class Autonomous Sailboat. In Proceedings of the 2018 IEEE 8th International Conference on Underwater System Technology: Theory and Applications (USYS), Wuhan, China, 1–3 December 2018; pp. 1–6. [Google Scholar]
  26. Zhao, X.; Du, P.; Li, H.; Yang, H. Attitude Estimation System based on MEMS accelerometer and gyroscope. Railw. Comput. Appl. 2012, 21, 4–15. [Google Scholar]
  27. Li, W. Research on Attitude Detection and Control of Coal mine Rescue Robot. Xi’an University of Science and Technology. Master’s Thesis, Xi’an, China, 2011. [Google Scholar]
  28. Web. VN-100. Available online: https://www.vectornav.com/products/vn-100 (accessed on 9 August 2022).
  29. Web. MOTUS. Available online: https://www.advancednavigation.com/product/motus (accessed on 9 August 2022).
  30. Kalman, R.E. A New Approach to Linear Filtering and Prediction Problems. ASME J. Basic Eng. 1960, 1960, 81–82. [Google Scholar] [CrossRef] [Green Version]
  31. Fu, Q.; Quan, Q.; Cai, K. Robust Pose Estimation for Multirotor UAVs Using Off-Board Monocular Vision. IEEE Trans. Ind. Electron. 2017, 64, 7942–7951. [Google Scholar] [CrossRef]
  32. Ban, C.; Ren, G.; Wang, B.; Chen, X.J. Research on adaptive EKF Measurement Algorithm of robot attitude based on IMU. J. Sci. Instrum. 2020, 41, 33–39. (In Chinese) [Google Scholar]
  33. Guo, J.; Jin, H.; Shen, X. Optimization control of quadrotor UAV based on neural network PID algorithm. Electron. Technol. 2021, 34, 51–55. (In Chinese) [Google Scholar]
  34. Shi, Y.; Ma, H.; Chen, B. Extended Kalman filtering algorithm for quadrotor UAV based on quaternion. Control Eng. 2021, 28, 2131–2135. (In Chinese) [Google Scholar]
  35. Long, Z.; Xiang, P.; Sui, G. Attitude calculation of spherical robot based on Extended Kalman filter. Softw. Eng. 2022, 25, 47–50. (In Chinese) [Google Scholar]
  36. Li, Z. Research on Sensorless Control System of Induction Motor Based on Adaptive EKF. Master’s Thesis, China University of Mining and Technology, Xuzhou, China, 2022. [Google Scholar]
  37. Li, J. Research on Accuracy and Frequency Response Characteristics of Attitude Detection System Based on MEMS Sensor. Hangzhou Dianzi University. Master’s Thesis, Hangzhou, China, 2022. [Google Scholar]
  38. Li, S.; Cheng, J.; Li, H. Head-eye Coordination System Based on Bionic Principle. Comput. Eng. 2016, 42, 273–278. [Google Scholar]
  39. Duan, S. Dynamic Data Acquisition System of Picking Manipulator Based on Three-axis Gyroscope. J. Agric. Mech. Res. 2022, 44, 37–41. [Google Scholar]
  40. Fu, Z.; Zhu, H.; Sun, J.; Liu, W. Study on Filtering Algorithm Based on Inertial Sensors MPU6050. Piezoelectr. Acoustoopt. 2015, 37, 821–825, 829. [Google Scholar]
  41. Zhang, C.; Li, T.; Wang, Y. Design of Quad-Rotor Aircraft Flight Control System Based on MPU6050 and Adaptive Complementary Filter. Chin. J. Sens. Actuators 2016, 29, 1011–1015. [Google Scholar]
  42. Qin, H.; Zhang, S.; Huang, H.; Cao, J. Design of concrete laser leveling machine horizontal control system based on STM32. Mod. Electron. Tech. 2020, 43, 150–153. [Google Scholar]
  43. Zheng, A.; Lin, W.; Fu, Y. IIC Driving Algorithm Based on State Machine and Its Application. Sci. Technol. Inf. 2020, 18, 1–3, 6. [Google Scholar]
  44. Guillermo García-Villamil, L.R.A.R. Influence of IMU’s Measurement Noise on the Accuracy of Stride-Length Estimation for Gait Analysis. In Proceedings of the IPIN 2021 WiP Proceedings, Lloret de Mar, Spain, 29 November–2 December 2021. [Google Scholar]
  45. Li, J.; Qiang, J. Research and Implementation of Attitude Solving Algorithm Based on STM32 and MPU6050. J. Jiamusi Univ. (Nat. Sci. Ed.) 2017, 35, 295–298, 316. [Google Scholar]
  46. Zhao, S. Research on Pipeline Robot Based on Quadcopter. Master’s Thesis, Soochow University, Suzhou, China, 2013. (In Chinese). [Google Scholar]
  47. Yoo, T.S.; Hong, S.K.; Yoon, H.M.; Park, S. Gain-scheduled complementary filter design for a MEMS based attitude and heading reference system. Sensors 2011, 11, 3816–3830. [Google Scholar] [CrossRef]
  48. Okatan, A.; Hajiyev, C.; Hajiyeva, U. Kalman Filter Innovation Sequence Based Fault Detection in LEO Satellite Attitude Determination and Control System. In Proceedings of the 2007 3rd International Conference on Recent Advances in Space Technologies, Istanbul, Turkey, 14–16 June 2007; pp. 411–416. [Google Scholar]
  49. Sabatelli, S.; Galgani, M.; Fanucci, L.; Rocchi, A. A Double-Stage Kalman Filter for Orientation Tracking With an Integrated Processor in 9-D IMU. IEEE Trans. Instrum. Meas. 2013, 62, 590–598. [Google Scholar] [CrossRef]
  50. STMicroelectronics. STM32L053C8T6. Available online: https://www.st.com/en/microcontrollers-microprocessors/stm32l053c8.html (accessed on 8 August 2022).
  51. InvenSense. MPU6050. Available online: https://invensense.tdk.com/products/motion-tracking/6-axis/mpu-6050/ (accessed on 9 August 2022).
  52. InvenSense. MPU9250. Available online: https://invensense.tdk.com/products/motion-tracking/9-axis/mpu-9250/ (accessed on 9 August 2022).
  53. WitMotion. WT9011G4K. Available online: https://wit-motion.yuque.com/wumwnr/docs/cokn95 (accessed on 9 August 2022).
  54. NXP. PCA9685. Available online: https://www.nxp.com.cn/docs/en/data-sheet/PCA9685.pdf (accessed on 16 August 2022).
  55. Liu, M.; Cai, Y. Attitude solving method of portable mobile robot based on complementary filter. In Proceedings of the CCSSTA22nd 2021, Chongqing, China, 10–14 October 2021; p. 5. (In Chinese). [Google Scholar]
  56. Gao, Y.; Li, D.; Guo, P. Attitude calculation of aircraft based on gradient descent algorithm. Electron. Des. Eng. 2021, 29, 7–10. (In Chinese) [Google Scholar]
Figure 1. IMU and diagram of the way the bionic eye rotates.
Figure 2. Principle diagram of bionic eye attitude control.
Figure 3. Principle diagram of connection of IMU and MCU.
Figure 4. (a) Experimental model of the bionic eye; (b) structure of the bionic eye model.
Figure 5. Design scheme of detection system of bionic eye attitude.
Figure 6. Solution to the angle of acceleration.
Figure 7. Attitude solution flow chart.
Figure 8. STM32L053C8T6 pin diagram.
Figure 9. Core circuit of attitude control system.
Figure 10. Two kinds of IMU circuit diagram.
Figure 11. Drive control system circuit design drawing.
Figure 12. Software flow chart.
Figure 13. PID software flow chart.
Figure 14. Actual power consumption of the three IMUs in the working state.
Figure 15. MPU9250 dynamic result after GD data fusion.
Figure 16. MPU9250 dynamic result after CF data fusion.
Figure 17. MPU9250 dynamic result after EKF data fusion.
Figure 18. Comparison of visual effects of the posture correction system under interference.
Table 1. Performance comparison of three typical data fusion algorithms [37].

Data Fusion Algorithm    Fusion Accuracy    Amount of Calculation    Rate of Convergence
GD                       General            Small                    Slow
CF                       Low                Small                    Slow
EKF                      High               Big                      Fast
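As a quick illustration of why CF remains attractive despite its lower accuracy, a first-order complementary filter needs only one multiply-accumulate per axis per sample. The sketch below is a generic textbook form, not the implementation used in this paper; the blend weight alpha = 0.98 is an assumed value.

```python
def complementary_filter(angle_prev, gyro_rate, accel_angle, dt, alpha=0.98):
    """First-order complementary filter: trust the integrated gyro angle
    at high frequency and the accelerometer angle at low frequency.
    alpha is an assumed blend weight, not a value from the paper."""
    return alpha * (angle_prev + gyro_rate * dt) + (1.0 - alpha) * accel_angle

# Toy run: stationary sensor, accelerometer reads 10 deg, gyro drifts slightly.
angle = 0.0
for _ in range(200):
    angle = complementary_filter(angle, gyro_rate=0.01, accel_angle=10.0, dt=0.005)
# angle converges toward the accelerometer's 10 deg while the gyro term
# suppresses short-term accelerometer noise
```

The single tuning knob alpha trades gyro drift against accelerometer noise, which is why Table 1 rates the amount of calculation as small but the accuracy as low.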
Table 2. System experimental parameters.

Controller        Sensor     Experimental Temperature
STM32L053C8T6     MPU9250    Room temperature, 20 °C
Table 3. Filter initialization parameters (matrices are written row by row, rows separated by semicolons).

Parameter          Value
Sampling period    0.005 s
A                  [1  0.02; 0  1]
B                  [5.02; 0]
Q                  [0.01  0; 0  0.03]
R                  0.5
X0                 [0; 0]
P0                 [0.05  0.05; 0.05  0.05]
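To make the roles of these parameters concrete, the sketch below runs a minimal linear Kalman filter seeded with the Table 3 values. How the flattened parameter row maps onto A, Q, R, X0, and P0, and the choice of a scalar measurement of the first state (H = [1, 0]), are assumptions made for illustration; the control input is omitted, so B is not used here.

```python
# Minimal linear Kalman filter (predict + scalar update) in pure Python.
# Matrix groupings are assumed from Table 3; H and the input-free model
# are illustrative choices, not taken from the paper.
A = [[1.0, 0.02], [0.0, 1.0]]      # state transition
Q = [[0.01, 0.0], [0.0, 0.03]]     # process noise covariance
R = 0.5                            # measurement noise variance (scalar)
x = [0.0, 0.0]                     # X0
P = [[0.05, 0.05], [0.05, 0.05]]   # P0

def kf_step(x, P, z):
    # Predict: x' = A x,  P' = A P A^T + Q
    xp = [A[0][0] * x[0] + A[0][1] * x[1],
          A[1][0] * x[0] + A[1][1] * x[1]]
    AP = [[sum(A[i][k] * P[k][j] for k in range(2)) for j in range(2)]
          for i in range(2)]
    Pp = [[sum(AP[i][k] * A[j][k] for k in range(2)) + Q[i][j]
           for j in range(2)] for i in range(2)]
    # Update with scalar measurement z = x[0] + noise (H = [1, 0])
    S = Pp[0][0] + R                   # innovation covariance
    K = [Pp[0][0] / S, Pp[1][0] / S]   # Kalman gain
    y = z - xp[0]                      # innovation
    xn = [xp[0] + K[0] * y, xp[1] + K[1] * y]
    Pn = [[Pp[0][j] - K[0] * Pp[0][j] for j in range(2)],
          [Pp[1][j] - K[1] * Pp[0][j] for j in range(2)]]
    return xn, Pn

for z in [70.1, 69.9, 70.05, 69.95] * 10:   # noisy yaw readings near 70 deg
    x, P = kf_step(x, P, z)
# x[0] moves toward the measured yaw as the gain adapts
```

Q and R set the balance between trusting the model and trusting the sensor, while P0 only affects how quickly the gain settles, which is why Table 3 lists them separately.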
Table 4. Motion-state coordinates after the fusion of the three filters, compared with the expected coordinates (unit: mm).

        CF [55]                    GD [56]                    EKF                       Expected Coordinates
   x       y       z          x       y       z          x       y       z          x       y      z
−25.52   11.12   203.29    −24.57   11.42   202.31    −24.56   11.44   202.59    −20     9.28   200
−33.94   10.76   210.28    −24.60   11.25   208.46    −24.56   10.74   201.51    −20    10.12   200
 −7.50    9.64   201.14    −24.56   11.85   200.37    −28.64   13.06   200.33    −20    11.53   200
−27.96   10.48   211.70    −24.55   13.09   209.12    −22.18   14.04   204.43    −20    13.22   200
−20.28   14.36   208.76    −24.55   14.09   206.07    −23.82   15.02   205.83    −20    14.62   200
−27.72   20.12   201.19    −24.55   14.51   200.73    −23.04   13.62   200.27    −20    15.75   200
−25.21   31.60   201.96    −24.42   16.74   201.25    −23.99   16.60   202.22    −20    17.16   200
−13.16   45.80   200.94    −24.06   18.30   200.33    −21.66   18.72   201.38    −20    18.84   200
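One way to condense the coordinate rows above into a single accuracy figure is the mean Euclidean distance from the expected coordinates. The snippet below applies this to the CF and EKF columns; this is an illustrative summary metric, not necessarily the computation behind the angle errors reported later.

```python
import math

# Measured coordinates (mm) from Table 4 and the expected coordinates.
cf = [(-25.52, 11.12, 203.29), (-33.94, 10.76, 210.28), (-7.50, 9.64, 201.14),
      (-27.96, 10.48, 211.70), (-20.28, 14.36, 208.76), (-27.72, 20.12, 201.19),
      (-25.21, 31.60, 201.96), (-13.16, 45.80, 200.94)]
ekf = [(-24.56, 11.44, 202.59), (-24.56, 10.74, 201.51), (-28.64, 13.06, 200.33),
       (-22.18, 14.04, 204.43), (-23.82, 15.02, 205.83), (-23.04, 13.62, 200.27),
       (-23.99, 16.60, 202.22), (-21.66, 18.72, 201.38)]
expected = [(-20, 9.28, 200), (-20, 10.12, 200), (-20, 11.53, 200), (-20, 13.22, 200),
            (-20, 14.62, 200), (-20, 15.75, 200), (-20, 17.16, 200), (-20, 18.84, 200)]

def mean_error(points, targets):
    """Mean Euclidean distance (mm) between measured and expected points."""
    return sum(math.dist(p, t) for p, t in zip(points, targets)) / len(points)

cf_err, ekf_err = mean_error(cf, expected), mean_error(ekf, expected)
# the EKF column sits markedly closer to the expected coordinates than CF
```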
Table 5. Mean attitude angle error in motion after fusion with each of the three filters.

Error of Attitude Angle    CF       GD       EKF
Angle error                0.62°    0.61°    0.43°
Table 6. Static yaw angle after fusion with each of the three filters, compared with the ideal yaw angle (unit: °).

CF Yaw    GD Yaw    EKF Yaw    Ideal Angle
69.77     69.73     69.73      70.00
69.91     69.92     70.71      70.00
70.10     70.14     70.80      70.00
71.55     69.91     70.89      70.00
70.34     70.48     70.76      70.00
71.25     70.23     70.76      70.00
71.96     70.90     70.65      70.00
72.13     71.11     70.21      70.00
Table 7. Mean static attitude angle error after fusion with each of the three filters.

Error of Attitude Angle    CF         GD         EKF
Angle error                0.1017°    0.1001°    0.0462°
Share and Cite

Zhang, H.; Lee, S. Robot Bionic Eye Motion Posture Control System. Electronics 2023, 12, 698. https://doi.org/10.3390/electronics12030698