Article

Payload Identification and Gravity/Inertial Compensation for Six-Dimensional Force/Torque Sensor with a Fast and Robust Trajectory Design Approach

by Jinjun Duan, Zhouchi Liu, Yiming Bin, Kunkun Cui and Zhendong Dai
1 College of Mechanical and Electrical Engineering, Nanjing University of Aeronautics and Astronautics, Nanjing 210016, China
2 Key Laboratory of Measurement and Control of Complex Systems of Engineering, Southeast University, Ministry of Education, Nanjing 210096, China
* Author to whom correspondence should be addressed.
Sensors 2022, 22(2), 439; https://doi.org/10.3390/s22020439
Submission received: 17 November 2021 / Revised: 23 December 2021 / Accepted: 5 January 2022 / Published: 7 January 2022
(This article belongs to the Topic Robotics and Automation in Smart Manufacturing Systems)

Abstract

In robot contact operations, the robot relies on a multi-dimensional force/torque sensor installed at its end to sense the external contact force. When the payload and the speed of the robot are large, the gravity/inertial forces they generate have a non-negligible impact on the output of the force sensor, which seriously affects the accuracy and effect of force control. In addition, existing identification algorithms often take a long time, which reduces the efficiency of force-controlled operations. In this paper, a self-developed multi-dimensional force sensor with an integrated gravity/inertial force sensing function is used to directly measure the resultant force. Furthermore, a method for the rapid identification of the payload based on an excitation trajectory is proposed. Firstly, both a gravity compensation algorithm and an inertial force compensation algorithm are introduced. Secondly, the optimal spatial identification poses are designed on the basis of the excitation trajectory, and the excitation trajectory of each joint is represented by a finite Fourier series. The least squares method is used to calculate the identification parameters of the load, the gravity, and the inertial force. Finally, the method was verified experimentally on a robot. The experimental results show that the algorithm can quickly identify the payload, and that it is faster and more accurate than other algorithms.

1. Introduction

In recent years, most industrial/collaborative robots have been used in non-contact operations, such as spraying, gluing, and handling. With the increasing demand for small-batch, flexible, and multi-variety production, the demand for contact operations such as grinding, polishing, assembly, teleoperation, and human–robot interaction has become increasingly significant. For robot contact operations, the end-effector of the robot needs to accurately sense the contact force with the environment to ensure the smooth progress of the operation. There are several ways to sense the contact force between the end-effector of the robot and the environment: (1) estimating the wrench from the motor currents [1]; (2) sensing the end contact force through the Jacobian transpose based on joint torque sensors [2]; (3) directly sensing the external contact force through a six-dimensional force sensor installed on the end-effector [3]. Accurately sensing the external contact force through current estimation requires a dynamic model, together with dynamic parameter identification and nonlinear control [4]. At present, it is difficult to achieve precise dynamic control by current estimation alone, so external force perception by this route is not practical. Adding joint torque sensors to an existing machine is also unrealistic, and control based on the full dynamics remains difficult [5].
Compared with the first two methods, the third has several advantages: no joint torque sensors need to be added, so the mechanical structure, the existing mechanism, and the control system do not need to be changed, and the sensor is easily installed on the end flange. Through an appropriate control algorithm [6,7], precise perception and application of the end force can be realized. When the robot performs an operation, a load that meets the requirements of the specific task is installed at the end of the robot, and the six-dimensional force sensor is generally installed between the flange and the workpiece. During low-speed motion of the robot, in order to accurately sense the force between the workpiece and the environment, it is necessary to eliminate the influence of the load's gravity and center of gravity on the direct measurement of the sensor. Furthermore, when the robot is moving at medium or high speed, it is also necessary to eliminate the influence of inertial forces on the measured values of the sensor.
At present, gravity compensation algorithms have been proposed in many works [8,9,10], which can be divided into three categories according to the application scenario. (1) Gravity compensation for a multi-dimensional force sensor installed at the base: a dynamic force compensation algorithm for the zero value of a six-dimensional force sensor at the base was proposed [11], which accounts for the robot's own gravity and inertia during motion. However, this scheme is only suitable for collision detection in human–robot collaboration and cannot effectively perceive the end force of the robot. (2) Gravity compensation for a multi-dimensional force sensor installed at the end-effector of the robot: to improve the robot's perception of contact force, a robot compliance control method based on gravity compensation of the force sensor was proposed [12]. By collecting force sensor data in different robot poses, the parameters of the robot load are calculated; however, since the poses were not specially selected, the identification accuracy is limited. In order to avoid improper pose selection, the gravity and center-of-gravity parameters of the load were obtained by experimentally determining how the gravity components vary with the attitude of the load end [13]. A gravity compensation optimization algorithm based on a genetic algorithm was proposed [14]: an optimal solution model is established with the goal of minimizing the sum of squared errors, and the installation deviation angle of the force sensor is obtained by the genetic algorithm. A multi-objective optimization problem that balances identification accuracy and ease of identification when selecting the identification poses was formulated [15], and a multi-objective particle swarm optimization method was used to obtain the optimal identification poses. Although the above algorithms can solve the problem of selecting the identification attitude, they only perform gravity compensation and do not realize inertial force compensation. (3) Gravity compensation of each link: the gravity moment of each joint was derived based on the principle of virtual displacement [16], and the electric drives were used to compensate the gravity of each link. In order to achieve gravity feed-forward compensation, a PD control method based on a robot model was designed [17]. To solve the problem of tracking errors caused by the gravity of each link, an adaptive neural network model was designed [18]. Furthermore, a PD gravity compensation algorithm based on maximum gravity moment estimation and trigonometric functions was proposed, starting from the spatial geometric characteristics of the manipulator and the principle of torque balance [19]. To address the factors that affect a surgeon's true perception of the feedback force, gravity compensation, friction compensation, and inertial force compensation models were established [20], and a force compensation control strategy was adopted. For this last category of compensation methods, an accurate dynamic model must be established to achieve good results.
Some ideas from the above algorithms can be applied to gravity compensation, but most of them collect only a limited number of points in space, so the compensation accuracy is relatively poor and the operating efficiency is low, which makes them unsuitable for industrial sites that require high-precision, high-tempo force control. At the same time, there are relatively few studies on inertial force compensation, because most six-dimensional force sensors do not have inertial sensing devices (such as multi-axis acceleration sensors) to measure the inertial force. This paper uses a six-dimensional force sensor that integrates gravity/inertial force sensing, and is inspired by the dynamic parameter identification method based on excitation trajectories [21]. A fast and efficient algorithm for gravity and inertial force compensation is proposed, which can meet the requirements of high-speed, high-tempo operations at industrial sites.

2. Gravity Compensation Algorithm

2.1. Analysis of the Influence of Load Gravity

In this paper, a force sensor is installed at the end of the robot to sense the contact forces with the external environment. In general, a six-dimensional force sensor is selected to measure the three orthogonal force components $(F_x, F_y, F_z)$ and the three orthogonal moment components $(T_x, T_y, T_z)$ of an arbitrary force system in space, as shown in Figure 1. In the static case, the force and moment data perceived at the end of the robot include four parts: (1) the sensor system error; (2) the load gravity; (3) the load inertial force; (4) the external contact force on the load.
The six-dimensional force sensor is generally installed between the flange and the end tool. When the load posture changes during operation, although the direction of gravity is always vertically downward, the components of the load gravity on the axes of the sensor coordinate system change. The coordinate system of the six-dimensional force sensor is denoted $FT$, the robot end coordinate system is $E$, the robot base coordinate system is $B$, the world coordinate system is $W$, and the homogeneous transformation matrix of the sensor coordinate system relative to the robot end coordinate system is ${}^{E}_{FT}T$. For any given end pose ${}^{B}_{E}T$, the gravity of the end tool has components on each axis of the $FT$ coordinate system. Only if the z-axis of the sensor is aligned with the direction of gravity does the end gravity act solely on the z-axis of the $FT$ coordinate system. The purpose of gravity compensation is to eliminate the gravity influence of the end tool, and the gravity and center-of-gravity parameters of the end load can be obtained through the gravity compensation algorithm.

2.2. Force and Torque of Six-Dimensional Force Sensor

In general, the six-dimensional force sensor has an initial zero offset due to machining and calibration errors; the zero values of the three force components are recorded as $F_{x0}$, $F_{y0}$, and $F_{z0}$, and the zero values of the three torque components are recorded as $T_{x0}$, $T_{y0}$, and $T_{z0}$. The origin of the sensor coordinate system is $O$, and the directions of the coordinate axes are consistent with those given by the sensor manufacturer. Assuming that the load gravity is $G$ and the coordinates of the load center of gravity in the sensor coordinate system are $(x, y, z)$, the force components of the load gravity in the sensor coordinate system are $G_x$, $G_y$, and $G_z$, and the corresponding gravity moments are $T_{gx}$, $T_{gy}$, and $T_{gz}$. The formula shown in Equation (1) can be obtained from the relationship between force and torque by the right-hand rule.
$$T_{gx} = G_y \cdot z - G_z \cdot y, \qquad T_{gy} = G_z \cdot x - G_x \cdot z, \qquad T_{gz} = G_x \cdot y - G_y \cdot x \tag{1}$$
The force component feedback values measured by the six-dimensional force sensor are recorded as $F_x$, $F_y$, and $F_z$, and the torque component feedback values are recorded as $T_x$, $T_y$, and $T_z$. When there is no external force at the end of the robot, the relationship among the measured values of the sensor, the force/moment components of the load gravity in the sensor frame, and the zero values of the sensor is shown in Equation (2).
$$\begin{cases} G_x = F_x - F_{x0}, \quad G_y = F_y - F_{y0}, \quad G_z = F_z - F_{z0} \\ T_{gx} = T_x - T_{x0}, \quad T_{gy} = T_y - T_{y0}, \quad T_{gz} = T_z - T_{z0} \end{cases} \tag{2}$$
Substituting Equation (2) into Equation (1) yields
$$\begin{cases} T_x = F_y \cdot z - F_z \cdot y + T_{x0} + F_{z0} \cdot y - F_{y0} \cdot z \\ T_y = F_z \cdot x - F_x \cdot z + T_{y0} + F_{x0} \cdot z - F_{z0} \cdot x \\ T_z = F_x \cdot y - F_y \cdot x + T_{z0} + F_{y0} \cdot x - F_{x0} \cdot y \end{cases} \tag{3}$$
In Equation (3), $F_{x0}$, $F_{y0}$, $F_{z0}$, $T_{x0}$, $T_{y0}$, $T_{z0}$, and $x$, $y$, $z$ are all constants, so that:
$$\begin{cases} k_1 = T_{x0} + F_{z0} \cdot y - F_{y0} \cdot z \\ k_2 = T_{y0} + F_{x0} \cdot z - F_{z0} \cdot x \\ k_3 = T_{z0} + F_{y0} \cdot x - F_{x0} \cdot y \end{cases} \tag{4}$$

2.3. Load Center of Gravity Coordinate Calculation

To obtain the load center-of-gravity coordinates, Equation (4) can be substituted into Equation (3) to obtain Equation (5):
$$\begin{bmatrix} T_x \\ T_y \\ T_z \end{bmatrix} = \begin{bmatrix} 0 & -F_z & F_y & 1 & 0 & 0 \\ F_z & 0 & -F_x & 0 & 1 & 0 \\ -F_y & F_x & 0 & 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} x \\ y \\ z \\ k_1 \\ k_2 \\ k_3 \end{bmatrix} \tag{5}$$
In order to obtain accurate identification results, $N$ sets of data are collected ($N \ge 3$), where the normal vectors of the robot end in at least three of the postures are not coplanar, that is, they are linearly independent. Equation (5) can then be rewritten as Equation (6):
$$\begin{bmatrix} T_{x1} \\ T_{y1} \\ T_{z1} \\ \vdots \\ T_{xN} \\ T_{yN} \\ T_{zN} \end{bmatrix} = \begin{bmatrix} 0 & -F_{z1} & F_{y1} & 1 & 0 & 0 \\ F_{z1} & 0 & -F_{x1} & 0 & 1 & 0 \\ -F_{y1} & F_{x1} & 0 & 0 & 0 & 1 \\ \vdots & \vdots & \vdots & \vdots & \vdots & \vdots \\ 0 & -F_{zN} & F_{yN} & 1 & 0 & 0 \\ F_{zN} & 0 & -F_{xN} & 0 & 1 & 0 \\ -F_{yN} & F_{xN} & 0 & 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} x \\ y \\ z \\ k_1 \\ k_2 \\ k_3 \end{bmatrix} \tag{6}$$
Equation (6) can be written as the linear system shown in Equation (7):
$$\tau = F \cdot p \tag{7}$$
where
$$p = \begin{bmatrix} x & y & z & k_1 & k_2 & k_3 \end{bmatrix}^T \tag{8}$$
After a finite number of elementary transformations, the matrix $F$ can be verified to have full column rank, with $\operatorname{rank}(F) = 6 < 3N$, so Equation (7) is an overdetermined system of equations, and the corresponding least squares solution is:
$$p = (F^T F)^{-1} \cdot F^T \tau \tag{9}$$
The coordinate values $(x, y, z)$ of the load center of gravity in the coordinate system of the six-dimensional force sensor and the constants $k_1$, $k_2$, $k_3$ can be solved from the above formulas.
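To make this step concrete, the following minimal Python sketch (added here for illustration, not code from the authors; the array names F_data and T_data are assumed) builds the stacked coefficient matrix of Equation (6) and solves the least squares problem of Equation (9) with a numerically stable solver.

```python
import numpy as np

def identify_cog(F_data, T_data):
    """Least-squares estimate of p = [x, y, z, k1, k2, k3]^T from Equation (9).

    F_data, T_data: (N, 3) arrays of force/torque readings taken in N static,
    non-coplanar poses with no external contact force applied.
    """
    F_data = np.asarray(F_data, dtype=float)
    T_data = np.asarray(T_data, dtype=float)
    N = F_data.shape[0]

    A = np.zeros((3 * N, 6))           # stacked coefficient matrix of Equation (6)
    tau = T_data.reshape(3 * N)        # stacked torque vector

    for i, (Fx, Fy, Fz) in enumerate(F_data):
        A[3 * i:3 * i + 3, :3] = [[0.0, -Fz,  Fy],
                                  [ Fz, 0.0, -Fx],
                                  [-Fy,  Fx, 0.0]]
        A[3 * i:3 * i + 3, 3:] = np.eye(3)

    # Equivalent to p = (F^T F)^-1 F^T tau, but numerically more robust
    p, *_ = np.linalg.lstsq(A, tau, rcond=None)
    x, y, z, k1, k2, k3 = p
    return (x, y, z), (k1, k2, k3)
```

Using np.linalg.lstsq rather than forming $(F^T F)^{-1}$ explicitly avoids numerical problems when the selected poses are close to coplanar.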

2.4. Calculation of Base Mounting Inclination, Sensor Zero Point, and Load Gravity

In the actual installation of an industrial robot, there is often a flatness error between the robot base and the ground; that is, the z-axis of the robot base coordinate system is not necessarily parallel to the direction of gravity, so an installation inclination angle exists.
The installation inclination angles $U$ and $V$ between the robot base and the world coordinate system are the deviation angles about the x-axis and y-axis directions, respectively, and satisfy Equation (10).
$${}^{W}_{B}R = \begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos U & -\sin U \\ 0 & \sin U & \cos U \end{bmatrix} \begin{bmatrix} \cos V & 0 & \sin V \\ 0 & 1 & 0 \\ -\sin V & 0 & \cos V \end{bmatrix} \tag{10}$$
Equation (11) gives the rotation matrix of the sensor coordinate system relative to the base coordinate system, where $A$, $B$, and $C$ are the rotation angles about the Z, Y, and X axes of the base coordinate system, respectively, and can be obtained from the robot control system. It is assumed that the robot tool coordinate system coincides with the sensor coordinate system.
$${}^{B}_{FT}R = R_Z(A) R_Y(B) R_X(C) = \begin{bmatrix} \cos A & -\sin A & 0 \\ \sin A & \cos A & 0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} \cos B & 0 & \sin B \\ 0 & 1 & 0 \\ -\sin B & 0 & \cos B \end{bmatrix} \begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos C & -\sin C \\ 0 & \sin C & \cos C \end{bmatrix} \tag{11}$$
The direction vector of gravity in the world coordinate system is expressed by Equation (12).
$$g^{W} = \begin{bmatrix} 0 & 0 & -1 \end{bmatrix}^T \tag{12}$$
According to the above equations, the direction vector of the load gravity in the sensor coordinate system can be obtained as Equation (13).
$$g^{FT} = {}^{FT}_{B}R \cdot {}^{B}_{W}R \cdot g^{W} = {}^{B}_{FT}R^{T} \cdot {}^{W}_{B}R^{T} \cdot g^{W} = {}^{B}_{FT}R^{T} \begin{bmatrix} \cos U \sin V \\ -\sin U \\ -\cos U \cos V \end{bmatrix} \tag{13}$$
Suppose the gravity of the load is $G$; then Equation (14) can be obtained from Equation (2):
$$\begin{bmatrix} F_x \\ F_y \\ F_z \end{bmatrix} = \begin{bmatrix} G_x \\ G_y \\ G_z \end{bmatrix} + \begin{bmatrix} F_{x0} \\ F_{y0} \\ F_{z0} \end{bmatrix} = {}^{B}_{FT}R^{T} \begin{bmatrix} G \cos U \sin V \\ -G \sin U \\ -G \cos U \cos V \end{bmatrix} + \begin{bmatrix} F_{x0} \\ F_{y0} \\ F_{z0} \end{bmatrix} \tag{14}$$
Let the constant terms be:
$$L_x = G \cos U \sin V, \qquad L_y = -G \sin U, \qquad L_z = -G \cos U \cos V \tag{15}$$
Equation (14) can be arranged as follows:
$$\begin{bmatrix} F_x \\ F_y \\ F_z \end{bmatrix} = \begin{bmatrix} {}^{B}_{FT}R^{T} & I \end{bmatrix} \begin{bmatrix} L_x \\ L_y \\ L_z \\ F_{x0} \\ F_{y0} \\ F_{z0} \end{bmatrix} \tag{16}$$
where $I$ is the $3 \times 3$ identity matrix. In order to ensure the accuracy of the calculation, the $A$, $B$, $C$ values of the robot in $N$ different postures are collected, the corresponding ${}^{B}_{FT}R^{T}$ matrices of the $N$ poses are computed, and the following equation is obtained:
$$\begin{bmatrix} F_{x1} \\ F_{y1} \\ F_{z1} \\ \vdots \\ F_{xN} \\ F_{yN} \\ F_{zN} \end{bmatrix} = \begin{bmatrix} {}^{B}_{FT}R_{1}^{T} & I \\ {}^{B}_{FT}R_{2}^{T} & I \\ \vdots & \vdots \\ {}^{B}_{FT}R_{N}^{T} & I \end{bmatrix} \begin{bmatrix} L_x \\ L_y \\ L_z \\ F_{x0} \\ F_{y0} \\ F_{z0} \end{bmatrix} \tag{17}$$
The above equation can be transformed into a system of linear equations:
$$f = R \cdot l \tag{18}$$
where
$$l = \begin{bmatrix} L_x & L_y & L_z & F_{x0} & F_{y0} & F_{z0} \end{bmatrix}^T \tag{19}$$
The least squares solution of Equation (18) is:
$$l = (R^T R)^{-1} \cdot R^T f \tag{20}$$
The zero points $F_{x0}$, $F_{y0}$, $F_{z0}$ and the constants $L_x$, $L_y$, $L_z$ of the six-dimensional force sensor can be solved from Equation (20). The zero points of the sensor moments can then be solved from Equation (4):
$$\begin{cases} T_{x0} = k_1 - F_{z0} \cdot y + F_{y0} \cdot z \\ T_{y0} = k_2 - F_{x0} \cdot z + F_{z0} \cdot x \\ T_{z0} = k_3 - F_{y0} \cdot x + F_{x0} \cdot y \end{cases} \tag{21}$$
The gravity of the load and the mounting inclination of the robot can be obtained from Equation (15):
$$G = \sqrt{L_x^2 + L_y^2 + L_z^2} \tag{22}$$
$$U = \arcsin\left(-\frac{L_y}{G}\right), \qquad V = \arctan\left(-\frac{L_x}{L_z}\right) \tag{23}$$
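The force-channel identification of this subsection can be sketched in the same illustrative style. The snippet below is not the authors' code: it assumes the rotation matrices ${}^{B}_{FT}R_i$ have already been built from the $A$, $B$, $C$ angles of Equation (11), and it follows the sign conventions of Equations (15), (22), and (23) as written above.

```python
import numpy as np

def identify_gravity_and_zero(R_list, F_data):
    """Solve Equation (20) and recover G, U, V and the force zero points.

    R_list: list of N rotation matrices ^B_FT R_i (sensor frame w.r.t. base).
    F_data: (N, 3) array of the corresponding force readings (no contact force).
    """
    N = len(R_list)
    A = np.zeros((3 * N, 6))
    f = np.asarray(F_data, dtype=float).reshape(3 * N)

    for i, R in enumerate(R_list):
        A[3 * i:3 * i + 3, :3] = np.asarray(R, dtype=float).T  # ^B_FT R^T block, Eq. (17)
        A[3 * i:3 * i + 3, 3:] = np.eye(3)                     # identity block

    l, *_ = np.linalg.lstsq(A, f, rcond=None)                  # Equation (20)
    Lx, Ly, Lz, Fx0, Fy0, Fz0 = l

    G = np.sqrt(Lx**2 + Ly**2 + Lz**2)                         # Equation (22)
    U = np.degrees(np.arcsin(-Ly / G))                         # Equation (23)
    V = np.degrees(np.arctan(-Lx / Lz))                        # Equation (23)
    return G, U, V, (Fx0, Fy0, Fz0)
```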

2.5. Calculation of External Force Perception

The three components of the load gravity in the coordinate system of the six-dimensional force sensor can be obtained from Equations (13) and (15):
$$\begin{bmatrix} G_x \\ G_y \\ G_z \end{bmatrix} = G \cdot g^{FT} = {}^{B}_{FT}R^{T} \begin{bmatrix} L_x \\ L_y \\ L_z \end{bmatrix} \tag{24}$$
Substituting Equation (24) into Equation (1), the moment components $T_{gx}$, $T_{gy}$, $T_{gz}$ of gravity about the sensor coordinate axes can be obtained.
The sensor feedback contains the zero value, the force/moment components caused by the load gravity, and the external force, so the external force/moment values can be calculated as follows:
$$\begin{cases} F_{ex} = F_x - F_{x0} - G_x, \quad F_{ey} = F_y - F_{y0} - G_y, \quad F_{ez} = F_z - F_{z0} - G_z \\ T_{ex} = T_x - T_{x0} - T_{gx}, \quad T_{ey} = T_y - T_{y0} - T_{gy}, \quad T_{ez} = T_z - T_{z0} - T_{gz} \end{cases} \tag{25}$$
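As an illustration of how Equations (24) and (25) are applied online in the static case, a hypothetical helper (parameter names assumed, not taken from the paper) could look as follows.

```python
import numpy as np

def external_wrench(F_meas, T_meas, R_BtoFT, L, F0, T0, cog):
    """Gravity-compensated external wrench in a static pose, Equations (24)-(25).

    F_meas, T_meas: measured force/torque (3-vectors).
    R_BtoFT: rotation ^B_FT R of the sensor frame with respect to the base frame.
    L: identified (Lx, Ly, Lz); F0, T0: identified zero points; cog: (x, y, z).
    """
    F_meas = np.asarray(F_meas, dtype=float)
    T_meas = np.asarray(T_meas, dtype=float)
    G_vec = np.asarray(R_BtoFT, dtype=float).T @ np.asarray(L, dtype=float)  # Eq. (24)
    x, y, z = cog
    Gx, Gy, Gz = G_vec
    # Gravity moment about the sensor origin, following Equation (1)
    T_g = np.array([Gy * z - Gz * y,
                    Gz * x - Gx * z,
                    Gx * y - Gy * x])
    F_ext = F_meas - np.asarray(F0, dtype=float) - G_vec                     # Eq. (25)
    T_ext = T_meas - np.asarray(T0, dtype=float) - T_g
    return F_ext, T_ext
```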

3. Inertial Force Compensation Algorithm

3.1. Analysis of the Influence of Load Inertia Force

When the manipulator performs an operation with a certain acceleration, the tooling mounted on the load side of the sensor (such as a fixture or actuator) produces an inertial mass effect. Under the influence of acceleration, the end load generates an inertial force $F$ and an inertial moment $T$, and both affect the force and moment perceived by the sensor; therefore, the inertial force and inertial moment need to be compensated to obtain more accurate measurement values, which provides correct force feedback for the force control algorithm.

3.2. Inertial Force Models and Compensation Algorithms

During fast motion of the robot, due to the load, an inertial force $F_{inert}$ and an inertial moment $T_{inert}$ are produced at the end of the robot under variable-speed motion, and the following equations are obtained from Newton's second law and the law of rigid-body rotation:
$$F_{inert} = m \cdot a \tag{26}$$
$$T_{inert} = J \cdot \alpha \tag{27}$$
where $m$ is the mass of the load, $a$ is the acceleration of the load movement, $J$ is the rotational inertia of the load, and $\alpha$ is the angular acceleration of the load rotation. The acceleration and angular acceleration can be read directly from the accelerometer installed in the six-dimensional force sensor, the load mass is identified by gravity compensation, and the rotational inertia is calculated according to theoretical mechanics.
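As an example of this "theoretical mechanics" step, the sketch below (added for illustration; the actual load geometry used in the paper is not specified here) computes the inertia tensor of a homogeneous cylindrical load about the sensor axes using the standard principal-inertia formulas and the parallel-axis theorem.

```python
import numpy as np

def cylinder_inertia_about_sensor(m, r, h, d):
    """Inertia tensor of a homogeneous cylinder about the sensor axes (illustrative).

    m: mass [kg]; r: radius [m]; h: height [m]; the cylinder axis is assumed along z.
    d: (dx, dy, dz) offset of the cylinder's center of mass from the sensor origin [m].
    """
    # Principal moments of inertia about the center of mass
    Ixx = Iyy = m * (3.0 * r**2 + h**2) / 12.0
    Izz = 0.5 * m * r**2
    J_com = np.diag([Ixx, Iyy, Izz])

    # Generalized parallel-axis (Steiner) shift to the sensor origin
    d = np.asarray(d, dtype=float)
    J_shift = m * (np.dot(d, d) * np.eye(3) - np.outer(d, d))
    return J_com + J_shift
```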
The measured values of the sensor ($F_{meas}$ and $T_{meas}$) are composed of two components: the desired tracking force/moment ($F_{des}$ and $T_{des}$) and the inertial force/moment:
$$F_{meas} = F_{des} + F_{inert}, \qquad T_{meas} = T_{des} + T_{inert} \tag{28}$$
The inertial force/moment is compensated by an equivalent compensation force and moment:
$$F_{meas} = F_{des} + F_{inert} + F_{comp}, \qquad T_{meas} = T_{des} + T_{inert} + T_{comp} \tag{29}$$
where the compensating force/moment is defined as follows, so that the compensation of the inertial force/moment is realized:
$$F_{comp} = -F_{inert}, \qquad T_{comp} = -T_{inert} \tag{30}$$
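A minimal sketch of Equations (26)-(30), assuming the linear and angular accelerations are read from the sensor as described above (function and variable names are illustrative only):

```python
import numpy as np

def inertial_compensation(m, a, J, alpha):
    """Compensating wrench of Equation (30), cancelling the terms of Eqs. (26)-(27).

    m: load mass [kg]; a: linear acceleration (3,) [m/s^2];
    J: 3x3 rotational inertia of the load about the sensor axes;
    alpha: angular acceleration (3,) [rad/s^2].
    """
    F_inert = m * np.asarray(a, dtype=float)                               # Equation (26)
    T_inert = np.asarray(J, dtype=float) @ np.asarray(alpha, dtype=float)  # Equation (27)
    return -F_inert, -T_inert                                              # Equation (30)
```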

4. Fast Gravity/Inertial Force Identification Method Based on Excitation Trajectories

4.1. Expression of the Combined Gravity and Inertial Forces

Based on Section 2 and Section 3, the gravity and the inertial force can each be calculated, but the identification takes a long time. In order to realize the rapid identification of gravity and inertial forces, a fast identification method based on excitation trajectories is introduced in this section.
Firstly, the expression and the solution of the resultant force of gravity and inertia are introduced. Combining Equations (14), (15), and (28), the comprehensive expression of gravity and inertial force can be obtained:
$$\begin{bmatrix} F_x \\ F_y \\ F_z \end{bmatrix} = {}^{B}_{FT}R^{T} \begin{bmatrix} G \cos U \sin V \\ -G \sin U \\ -G \cos U \cos V \end{bmatrix} + \begin{bmatrix} F_{x0} \\ F_{y0} \\ F_{z0} \end{bmatrix} + \frac{G}{g} \begin{bmatrix} a_x \\ a_y \\ a_z \end{bmatrix} \tag{31}$$
The above Equation (31) can be rearranged into the form of Equation (32):
$$\begin{bmatrix} F_x \\ F_y \\ F_z \end{bmatrix} = \begin{bmatrix} {}^{B}_{FT}R^{T} & I \end{bmatrix} \begin{bmatrix} L_x \\ L_y \\ L_z \\ F_{x0} + \frac{G}{g} a_x \\ F_{y0} + \frac{G}{g} a_y \\ F_{z0} + \frac{G}{g} a_z \end{bmatrix} \tag{32}$$
Similarly, the comprehensive expression of the gravity moment and the inertial moment can be arranged as Equation (33):
$$\begin{cases} T_x = F_z \cdot y - F_y \cdot z + F_{y0} \cdot z - F_{z0} \cdot y + T_{x0} + \frac{G}{g}\left(a_z \cdot y - a_y \cdot z\right) \\ T_y = F_x \cdot z - F_z \cdot x + F_{z0} \cdot x - F_{x0} \cdot z + T_{y0} + \frac{G}{g}\left(a_x \cdot z - a_z \cdot x\right) \\ T_z = F_y \cdot x - F_x \cdot y + F_{x0} \cdot y - F_{y0} \cdot x + T_{z0} + \frac{G}{g}\left(a_y \cdot x - a_x \cdot y\right) \end{cases} \tag{33}$$
Let $H_x = F_{y0} z - F_{z0} y + T_{x0}$, $H_y = F_{z0} x - F_{x0} z + T_{y0}$, and $H_z = F_{x0} y - F_{y0} x + T_{z0}$. Then, Equation (33) can be rearranged into Equation (34):
$$\begin{bmatrix} T_x \\ T_y \\ T_z \end{bmatrix} = \begin{bmatrix} 0 & F_z + \frac{G}{g} a_z & -\left(F_y + \frac{G}{g} a_y\right) & 1 & 0 & 0 \\ -\left(F_z + \frac{G}{g} a_z\right) & 0 & F_x + \frac{G}{g} a_x & 0 & 1 & 0 \\ F_y + \frac{G}{g} a_y & -\left(F_x + \frac{G}{g} a_x\right) & 0 & 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} x \\ y \\ z \\ H_x \\ H_y \\ H_z \end{bmatrix} \tag{34}$$
For Equations (32) and (34), sensor data are collected from the robot in $N$ different attitudes to obtain Equation (35) below and Equation (39) further below:
$$\begin{bmatrix} F_{x1} \\ F_{y1} \\ F_{z1} \\ \vdots \\ F_{xN} \\ F_{yN} \\ F_{zN} \end{bmatrix} = \begin{bmatrix} {}^{B}_{FT}R_{1}^{T} & I \\ {}^{B}_{FT}R_{2}^{T} & I \\ \vdots & \vdots \\ {}^{B}_{FT}R_{N}^{T} & I \end{bmatrix} \begin{bmatrix} L_x \\ L_y \\ L_z \\ F_{x0} + \frac{G}{g} a_x \\ F_{y0} + \frac{G}{g} a_y \\ F_{z0} + \frac{G}{g} a_z \end{bmatrix} \tag{35}$$
The above Equation (35) is transformed into a system of linear equations:
$$F = R \cdot L \tag{36}$$
where
$$L = \begin{bmatrix} L_x & L_y & L_z & F_{x0} + \frac{G}{g} a_x & F_{y0} + \frac{G}{g} a_y & F_{z0} + \frac{G}{g} a_z \end{bmatrix}^T \tag{37}$$
The corresponding least squares solution of Equation (36) is given by Equation (38):
$$L = (R^T R)^{-1} \cdot R^T F \tag{38}$$
Similarly, stacking Equation (34) for the $N$ attitudes yields Equation (39):
$$\begin{bmatrix} T_{x1} \\ T_{y1} \\ T_{z1} \\ \vdots \\ T_{xN} \\ T_{yN} \\ T_{zN} \end{bmatrix} = \begin{bmatrix} 0 & F_{z1} + \frac{G}{g} a_{z1} & -\left(F_{y1} + \frac{G}{g} a_{y1}\right) & 1 & 0 & 0 \\ -\left(F_{z1} + \frac{G}{g} a_{z1}\right) & 0 & F_{x1} + \frac{G}{g} a_{x1} & 0 & 1 & 0 \\ F_{y1} + \frac{G}{g} a_{y1} & -\left(F_{x1} + \frac{G}{g} a_{x1}\right) & 0 & 0 & 0 & 1 \\ \vdots & \vdots & \vdots & \vdots & \vdots & \vdots \\ 0 & F_{zN} + \frac{G}{g} a_{zN} & -\left(F_{yN} + \frac{G}{g} a_{yN}\right) & 1 & 0 & 0 \\ -\left(F_{zN} + \frac{G}{g} a_{zN}\right) & 0 & F_{xN} + \frac{G}{g} a_{xN} & 0 & 1 & 0 \\ F_{yN} + \frac{G}{g} a_{yN} & -\left(F_{xN} + \frac{G}{g} a_{xN}\right) & 0 & 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} x \\ y \\ z \\ H_x \\ H_y \\ H_z \end{bmatrix} \tag{39}$$
Equation (39) is transformed into a system of linear equations:
$$T = R \cdot H \tag{40}$$
where
$$H = \begin{bmatrix} x & y & z & H_x & H_y & H_z \end{bmatrix}^T \tag{41}$$
The corresponding least squares solution of Equation (40) is:
$$H = (R^T R)^{-1} \cdot R^T T \tag{42}$$
In Equations (38) and (42), the unknown physical parameters are the load gravity $G$, the base installation inclination angles $U$ and $V$, the sensor force zero points $F_{x0}$, $F_{y0}$, $F_{z0}$, the tool center-of-gravity coordinates $x$, $y$, $z$, and the torque zero points $T_{x0}$, $T_{y0}$, $T_{z0}$, a total of 12 parameters. $N$ ($N \ge 12$) groups of sensor data under different attitudes are collected, and the 12 unknown parameters can be obtained by substitution into Equations (38) and (42).
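The torque-channel least squares step of Equation (42) can be sketched as follows. This is an illustrative implementation rather than the authors' code: it assumes the load gravity $G$ has already been obtained (for example, from the force-channel solution of Equation (38) together with Equation (22)), and the array names are placeholders.

```python
import numpy as np

GRAVITY = 9.81  # gravitational acceleration [m/s^2]

def identify_torque_parameters(F_data, T_data, acc_data, G):
    """Least-squares solution of Equation (42) for H = [x, y, z, Hx, Hy, Hz]^T.

    F_data, T_data, acc_data: (N, 3) arrays of force, torque, and linear-acceleration
    samples recorded along the excitation trajectory; G: load gravity [N].
    """
    F_data = np.asarray(F_data, dtype=float)
    T_data = np.asarray(T_data, dtype=float)
    acc_data = np.asarray(acc_data, dtype=float)
    N = F_data.shape[0]
    m = G / GRAVITY                                   # mass ratio G/g

    A = np.zeros((3 * N, 6))
    t = T_data.reshape(3 * N)
    for i in range(N):
        Fx, Fy, Fz = F_data[i] + m * acc_data[i]      # F + (G/g) a, as in Equation (39)
        A[3 * i:3 * i + 3, :3] = [[0.0,  Fz, -Fy],
                                  [-Fz, 0.0,  Fx],
                                  [ Fy, -Fx, 0.0]]
        A[3 * i:3 * i + 3, 3:] = np.eye(3)

    H, *_ = np.linalg.lstsq(A, t, rcond=None)
    (x, y, z), (Hx, Hy, Hz) = H[:3], H[3:]
    return (x, y, z), (Hx, Hy, Hz)
```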

4.2. Excitation Trajectory Design

Each of the unknown gravity/inertial force parameters can be identified by changing the attitude of the robot end (that is, by moving joints 4–6). In order to make the selected attitudes represent the identification space as well as possible, an optimal joint reference trajectory needs to be designed. In order to excite the parameters related to the end joints 4–6, the equations related to these joint parameters are used; that is, the moment equations under the inertial force are taken as the excitation equations, as in Equation (43).
$$\begin{bmatrix} T_{ax} \\ T_{ay} \\ T_{az} \end{bmatrix} = \begin{bmatrix} 0 & a_z & -a_y \\ -a_z & 0 & a_x \\ a_y & -a_x & 0 \end{bmatrix} \frac{G}{g} \begin{bmatrix} x \\ y \\ z \end{bmatrix} \tag{43}$$
The acceleration of the load is the vector sum of the acceleration generated by the rotation of the load about the end coordinate system and the acceleration of the end coordinate system itself, namely:
$$\begin{bmatrix} a_x \\ a_y \\ a_z \end{bmatrix} = \begin{bmatrix} x \\ y \\ z \end{bmatrix} \times \alpha + \begin{bmatrix} a_{x0} \\ a_{y0} \\ a_{z0} \end{bmatrix} \tag{44}$$
where $\alpha$ is the angular acceleration of the robot end coordinate system and $\begin{bmatrix} a_{x0} & a_{y0} & a_{z0} \end{bmatrix}^T$ is the acceleration of the end coordinate system itself. Substituting Equation (44) into Equation (43) gives Equation (45):
$$\begin{bmatrix} T_{ax} \\ T_{ay} \\ T_{az} \end{bmatrix} = \begin{bmatrix} 0 & x\alpha_y - y\alpha_x + a_{z0} & x\alpha_z - z\alpha_x - a_{y0} \\ y\alpha_x - x\alpha_y - a_{z0} & 0 & y\alpha_z - z\alpha_y + a_{x0} \\ z\alpha_x - x\alpha_z + a_{y0} & z\alpha_y - y\alpha_z - a_{x0} & 0 \end{bmatrix} \frac{G}{g} \begin{bmatrix} x \\ y \\ z \end{bmatrix} \tag{45}$$
The above equation can be expressed as follows:
$$T_a = a \cdot p \tag{46}$$
where the coefficient matrix $a$ is a function of the joint angles $\theta_4$, $\theta_5$, and $\theta_6$, which will be denoted as $a = f(q_4, q_5, q_6)$, and $p$ denotes the remaining part of the above formula. The optimization objective is denoted as $\min f(a)$; its physical meaning is to make the robot end cover the identification space as fully as possible. The excitation trajectory of each joint is expressed by a Fourier series with a finite number of terms; then the angular displacement $q_i$, angular velocity $\dot{q}_i$, and angular acceleration $\ddot{q}_i$ of joint $i$ of the robot can be expressed as:
$$q_i(t) = \sum_{l=1}^{N} \left( \frac{a_{i,l}}{w_f l} \sin(w_f l t) - \frac{b_{i,l}}{w_f l} \cos(w_f l t) \right) \tag{47}$$
$$\dot{q}_i(t) = \sum_{l=1}^{N} \left( a_{i,l} \cos(w_f l t) + b_{i,l} \sin(w_f l t) \right) \tag{48}$$
$$\ddot{q}_i(t) = \sum_{l=1}^{N} w_f l \left( -a_{i,l} \sin(w_f l t) + b_{i,l} \cos(w_f l t) \right) \tag{49}$$
where $N$ represents the number of sine and cosine terms, $w_f = 2\pi f_f$ is the fundamental angular frequency, and $a_{i,l}$ and $b_{i,l}$ are the parameters to be determined. Since gravity compensation only involves the 4th, 5th, and 6th axes at the end of the robot arm, a total of $6N$ parameters need to be optimized, and the optimization objective is recorded as:
$$\min f(a) \tag{50}$$
To meet the constraint conditions of robot joints, the following constraint expressions are given:
$$\left| q_i(t) \right| \le \sum_{l=1}^{N} \frac{1}{w_f l} \sqrt{a_{i,l}^2 + b_{i,l}^2} \le q_{i,\max} \tag{51}$$
$$\left| \dot{q}_i(t) \right| \le \sum_{l=1}^{N} \sqrt{a_{i,l}^2 + b_{i,l}^2} \le \dot{q}_{i,\max} \tag{52}$$
$$\left| \ddot{q}_i(t) \right| \le \sum_{l=1}^{N} w_f l \sqrt{a_{i,l}^2 + b_{i,l}^2} \le \ddot{q}_{i,\max} \tag{53}$$
The start and stop constraint equations are as follows:
$$q_i(t_0) = \sum_{l=1}^{N} \left( -\frac{b_{i,l}}{w_f l} \right) = 0 \tag{54}$$
$$\dot{q}_i(t_0) = \sum_{l=1}^{N} a_{i,l} = 0 \tag{55}$$
$$\ddot{q}_i(t_0) = \sum_{l=1}^{N} w_f l \, b_{i,l} = 0 \tag{56}$$
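For reference, Equations (47)-(49) can be evaluated for one joint with a short routine such as the following sketch (coefficient and variable names are assumed for illustration):

```python
import numpy as np

def fourier_trajectory(a_coef, b_coef, w_f, t):
    """Joint position, velocity, and acceleration from Equations (47)-(49).

    a_coef, b_coef: the N Fourier coefficients a_{i,l} and b_{i,l} of one joint;
    w_f: fundamental angular frequency (2*pi*f_f); t: array of time samples [s].
    """
    a_coef = np.asarray(a_coef, dtype=float)
    b_coef = np.asarray(b_coef, dtype=float)
    t = np.asarray(t, dtype=float)
    q = np.zeros_like(t)
    qd = np.zeros_like(t)
    qdd = np.zeros_like(t)
    for l in range(1, len(a_coef) + 1):
        wl = w_f * l
        a_l, b_l = a_coef[l - 1], b_coef[l - 1]
        q += (a_l / wl) * np.sin(wl * t) - (b_l / wl) * np.cos(wl * t)   # Equation (47)
        qd += a_l * np.cos(wl * t) + b_l * np.sin(wl * t)                # Equation (48)
        qdd += wl * (-a_l * np.sin(wl * t) + b_l * np.cos(wl * t))       # Equation (49)
    return q, qd, qdd
```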
Considering factors such as "unrepresentative" identification attitudes and fluctuations of the sensor data, $T_a$, $a$, and $p$ are all perturbed, and the perturbed equation is as follows:
$$T_a + \delta T_a = (a + \delta a) \cdot (p + \delta p) \tag{57}$$
Subtracting the original equation gives:
$$\delta T_a = a \cdot \delta p + \delta a \cdot p + \delta a \cdot \delta p \tag{58}$$
which can be solved for the parameter perturbation:
$$\delta p = a^{-1} \left( \delta T_a - \delta a \cdot p - \delta a \cdot \delta p \right)$$
Therefore,
$$\| \delta p \| \le \| a^{-1} \| \left( \| \delta T_a \| + \| \delta a \| \cdot \| p \| + \| \delta a \| \cdot \| \delta p \| \right)$$
which can be rearranged to give:
$$\| \delta p \| \le \frac{\| a^{-1} \|}{1 - \| a^{-1} \| \cdot \| \delta a \|} \left( \| \delta T_a \| + \| \delta a \| \cdot \| p \| \right) \tag{59}$$
Since $T_a = a \cdot p$, it follows that $\| T_a \| \le \| a \| \cdot \| p \|$. Thus, Equation (59) can be rewritten as:
$$\| \delta p \| \le \frac{\| a^{-1} \|}{1 - \| a^{-1} \| \cdot \| \delta a \|} \left( \frac{\| \delta T_a \|}{\| T_a \|} \cdot \| a \| \cdot \| p \| + \| \delta a \| \cdot \| p \| \right) \tag{60}$$
Then the relative error of the identification parameter $p$ is:
$$\frac{\| \delta p \|}{\| p \|} \le \frac{\| a \| \cdot \| a^{-1} \|}{1 - \| a^{-1} \| \cdot \| \delta a \|} \left( \frac{\| \delta T_a \|}{\| T_a \|} + \frac{\| \delta a \|}{\| a \|} \right) \tag{61}$$
Since $\frac{\| \delta T_a \|}{\| T_a \|} + \frac{\| \delta a \|}{\| a \|}$ is a relative perturbation term with a very small value, the error in the identification parameter matrix $p$ is mainly determined by $\| a \| \cdot \| a^{-1} \|$. The condition number is therefore introduced as the optimization objective function for the excitation trajectory, that is:
$$\mathrm{Cond}(a) = \| a \| \cdot \| a^{-1} \| \tag{62}$$
The relative error of the parameter $p$ can then be expressed in terms of the condition number as:
$$\frac{\| \delta p \|}{\| p \|} \le \frac{\mathrm{Cond}(a)}{1 - \mathrm{Cond}(a) \cdot \frac{\| \delta a \|}{\| a \|}} \left( \frac{\| \delta T_a \|}{\| T_a \|} + \frac{\| \delta a \|}{\| a \|} \right) \tag{63}$$
The optimization objective function is therefore:
$$\min f(a) = \| a \| \cdot \| a^{-1} \| = \mathrm{Cond}(a) \tag{64}$$
Under the position, velocity, and acceleration constraints of each robot joint, when the optimization objective $\min f(a)$ reaches its minimum, the influence of external disturbances on the identification parameter matrix $p$ is smallest, and the corresponding parameters $a_{i,l}$ and $b_{i,l}$ are the parameters of the excitation trajectory. The purpose of the excitation trajectory is to make the posture of the robot end as diverse as possible, so the position vector of the center of mass of the workpiece in the robot end coordinate system can be assumed to be a known quantity $p$ and substituted into Equation (45) to excite representative postures of the robot.

4.3. Excitation Trajectory Curve

The number of sine and cosine terms in the Fourier series is taken as $N = 5$, the fundamental frequency $f_f$ is selected as 0.1 Hz, and the sampling frequency is taken as 100 Hz; a total of 1000 samples are used to calculate the matrix $a$. To excite each joint parameter, the fmincon function in Matlab is used: the excitation trajectory is obtained by passing the optimization objective function and the constraints to this function. The angular variations of joints 4–6 are shown in Figure 2, and the corresponding Cartesian-space end position, velocity, and acceleration curves are shown in Figure 3.
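An equivalent open-source formulation of this constrained optimization, using scipy.optimize.minimize (SLSQP) in place of fmincon, is sketched below. It is illustrative only: the joint limits are placeholder values, and regressor_matrix is a hypothetical user-supplied function that samples Equations (47)-(49), runs the robot's forward kinematics to obtain the end-frame accelerations, and stacks the matrix of Equation (45); that robot-specific step is not shown here.

```python
import numpy as np
from scipy.optimize import minimize

N_H = 5                     # harmonics per joint (N = 5 in this paper)
W_F = 2 * np.pi * 0.1       # fundamental angular frequency, f_f = 0.1 Hz
N_JOINTS = 3                # joints 4-6 are excited
Q_MAX, QD_MAX, QDD_MAX = 2.0, 1.5, 3.0   # assumed joint limits [rad, rad/s, rad/s^2]

def split(coeffs):
    """coeffs is a flat vector of length 6*N_H: (a_{i,l}, b_{i,l}) for each joint i."""
    c = coeffs.reshape(N_JOINTS, 2, N_H)
    return c[:, 0, :], c[:, 1, :]

def objective(coeffs, regressor_matrix):
    """Condition number of the stacked matrix of Equation (45), i.e. Equation (64)."""
    return np.linalg.cond(regressor_matrix(coeffs))

def build_constraints():
    cons = []
    l = np.arange(1, N_H + 1)
    for i in range(N_JOINTS):
        def bounds(coeffs, i=i):
            a, b = split(coeffs)
            amp = np.sqrt(a[i] ** 2 + b[i] ** 2)
            return np.array([Q_MAX - np.sum(amp / (W_F * l)),      # Equation (51)
                             QD_MAX - np.sum(amp),                 # Equation (52)
                             QDD_MAX - np.sum(W_F * l * amp)])     # Equation (53)
        def start_stop(coeffs, i=i):
            a, b = split(coeffs)
            return np.array([np.sum(-b[i] / (W_F * l)),            # Equation (54)
                             np.sum(a[i]),                         # Equation (55)
                             np.sum(W_F * l * b[i])])              # Equation (56)
        cons.append({"type": "ineq", "fun": bounds})
        cons.append({"type": "eq", "fun": start_stop})
    return cons

def optimize_trajectory(regressor_matrix, x0=None):
    if x0 is None:
        x0 = 0.1 * np.ones(2 * N_H * N_JOINTS)
    res = minimize(objective, x0, args=(regressor_matrix,),
                   method="SLSQP", constraints=build_constraints(),
                   options={"maxiter": 200})
    return res.x, res.fun
```

Because the amplitude bounds of Equations (51)-(53) are closed-form in the coefficients, the constraint functions do not need to sample the trajectory in time.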
According to the above excitation trajectory, the sensor data under the various attitudes are obtained, and the 12 parameters, namely the load gravity $G$, the base installation inclination angles $U$ and $V$, the sensor zero-point force values $F_{x0}$, $F_{y0}$, $F_{z0}$, the tool center-of-gravity coordinates $x$, $y$, $z$, and the torque zero points $T_{x0}$, $T_{y0}$, $T_{z0}$, can be identified through Equations (38) and (42).

5. Experimental Verification and Results

5.1. Experimental Platform Introduction and Verification

The UR10 robot system and the self-developed six-dimensional force sensor with acceleration sensing were used as the experimental platform (the sensor performance test can be found in [22]; the main specifications are also listed in Table 1). The self-developed six-dimensional force sensor can directly measure angle, angular velocity, and angular acceleration, and these data can be used directly for gravity and inertial compensation. A test load was mounted on the end of the robot; the experimental scene is shown in Figure 4. According to the excitation trajectory design scheme in Section 4.2, an identification trajectory with a movement time of 10 s was designed, as shown in Figure 2 and Figure 3. In order to realize synchronous periodic motion control of the robot and synchronous acquisition of the sensor data, an industrial computer platform with a real-time Linux operating system was built. An 8 ms communication control cycle between the industrial computer and the UR10 was realized by using the servoj instruction provided by the UR10. According to the designed identification trajectory, a total of 1250 discrete motion points were interpolated, and 1250 groups of sensor data were collected synchronously. The data set was substituted into Equations (38) and (42) to obtain the identification results, which include the load mass and center-of-gravity position, the initial force/torque values, and the installation inclination angle of the base; the results are shown in Table 2.
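For readers who want to reproduce a similar 8 ms streaming loop, the fragment below is a rough, simplified sketch of how interpolated joint targets could be sent to a UR robot with servoj over its script interface. The robot IP address, the pacing by time.sleep, and the servoj parameter values are assumptions for illustration; the authors' actual real-time implementation is not published here.

```python
import socket
import time

ROBOT_IP = "192.168.1.10"   # assumed robot address
DT = 0.008                  # 8 ms control period used in the experiment

def stream_trajectory(joint_targets):
    """Send one servoj command per 8 ms step (illustrative sketch only).

    joint_targets: iterable of 6-element joint vectors [rad], one per control step.
    """
    sock = socket.create_connection((ROBOT_IP, 30002))   # UR secondary script interface
    try:
        for q in joint_targets:
            cmd = ("servoj([{:.5f},{:.5f},{:.5f},{:.5f},{:.5f},{:.5f}], "
                   "0, 0, {:.3f}, 0.1, 300)\n").format(*q, DT)
            sock.sendall(cmd.encode("utf-8"))
            time.sleep(DT)   # crude pacing; the paper used a real-time OS for timing
    finally:
        sock.close()
```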

5.2. Experimental Results and Analysis

In order to verify whether the identification results are accurate, multiple trajectories with acceleration in Cartesian space were further designed to verify the compensation results, including straight lines, arcs, variable-posture curves, and arbitrary drag trajectories. One representative curve, an arbitrary curve with varying poses, was chosen to show the compensation results, as shown in Figure 5. The robot's motion data and the sensor values were collected synchronously, and the corresponding results after gravity and inertial force compensation were obtained by applying Equations (28)–(30). Several sets of experiments were performed, and one set of results is shown in Figure 6; the other compensation results are consistent with this set. It can be seen from Figure 6 that before compensation the force value on the z-axis exceeds 10 N and the torque value exceeds 0.5 Nm, whereas the compensated force value is within 0.5 N and the torque value is within 0.2 Nm, which meets the needs of industrial field applications.
Compared with other algorithms, one advantage of the proposed algorithm is its short identification time. The identification time of the traditional six-point method, or of the least squares method with a limited number of points, is more than 60 s, and the result of the traditional compensation algorithm is shown in Figure 7. It can be seen that the residual after compensation exceeds 1 N in the force direction and 0.4 Nm in the torque direction.
Existing algorithms are only used for gravity compensation, not for inertial force compensation, so their identification trajectories cannot be executed at high speed. The algorithm proposed in this paper can satisfy both low-speed and high-speed conditions: it not only compensates for gravity, but also compensates for the inertial effects under high-speed or heavy-load conditions. The identification of the payload can be completed simply by running the designed excitation trajectory. Because the identification is fast, the algorithm can be applied in industrial settings that require short cycle times.

6. Conclusions

For precise robot force control, gravity and inertial force compensation is particularly critical, and the effect of force control is directly affected by the identification and compensation accuracy. In this paper, the effect of the inertial force caused by high speed or a large load is considered: by using the self-developed multi-dimensional force sensor with an integrated acceleration sensor, the inertial force and inertial moment of the robot are measured directly, and the measurement data are provided directly for inertial force identification. A fast identification method based on an excitation trajectory is proposed to quickly and accurately identify the relevant parameters of the robot tool under fast motion or large loads, which shortens the identification time to 10 s and greatly improves the identification efficiency. Verification on the test trajectories shows that the force and moment compensation accuracy can be improved to better than 0.5 N and 0.2 Nm, respectively. The problems of the traditional algorithms are that the identification time is long and that only gravity compensation is considered, which is not suitable for high speeds or heavy loads. The algorithm presented in this paper is very helpful for industrial applications: it not only shortens the identification time, but also compensates for high-speed or large-load motion at the same time. Through this algorithm, the sensor can accurately sense the external force, laying the foundation for subsequent force control applications.

Author Contributions

Methodology, J.D.; validation and data curation, J.D., Z.L., and Y.B.; writing—original draft preparation, J.D. and K.C.; writing—review and editing, J.D.; visualization, J.D.; supervision, Z.D.; project administration, Z.D.; funding acquisition, Z.D. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National Key Research and Development Plan under grant number 2018YFB1305601, the Open Project Program of the Key Laboratory of Measurement and Control of Complex Systems of Engineering, Ministry of Education, Southeast University under grant number MCCSE2021A04, and the Fundamental Research Funds for the Central Universities under grant number MCCSE2021A04.

Institutional Review Board Statement

Not Applicable.

Informed Consent Statement

Not Applicable.

Data Availability Statement

Not Applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Trajin, B.; Regnier, J.; Faucher, J. Comparison Between Stator Current and Estimated Mechanical Speed for the Detection of Bearing Wear in Asynchronous Drives. IEEE Trans. Ind. Electron. 2009, 56, 4700–4709.
2. Albu-Schäffer, A.; Haddadin, S.; Ott, C.; Stemmer, A.; Wimböck, T.; Hirzinger, G. The DLR lightweight robot: Design and control concepts for robots in human environments. Ind. Robot. 2007, 34, 376–385.
3. Kim, Y.B.; Seok, D.Y.; Lee, S.Y.; Kim, J.; Kang, G.; Kim, U.; Choi, H.R. 6-Axis Force/Torque Sensor With a Novel Autonomous Weight Compensating Capability for Robotic Applications. IEEE Robot. Autom. Lett. 2020, 5, 6686–6693.
4. Kostic, D.; De Jager, B.; Steinbuch, M.; Hensen, R. Modeling and identification for high-performance robot control: An RRR-robotic arm case study. IEEE Trans. Control. Syst. Technol. 2004, 12, 904–919.
5. Ludovico, D.; Guardiani, P.; Lasagni, F.; Lee, J.; Cannella, F.; Caldwell, D.G. Design of Non-Circular Pulleys for Torque Generation: A Convex Optimisation Approach. IEEE Robot. Autom. Lett. 2021, 6, 958–965.
6. Alamdari, A.; Haghighi, R.; Krovi, V. Gravity-Balancing of Elastic Articulated-Cable Leg-Orthosis Emulator. Mech. Mach. Theory 2018, 131, 351–370.
7. Zavala-Rio, A.; Santibanez, V. A Natural Saturating Extension of the PD-With-Desired-Gravity-Compensation Control Law for Robot Manipulators With Bounded Inputs. IEEE Trans. Robot. 2007, 23, 386–391.
8. Swevers, J.; Verdonck, W.; Naumer, B.; Pieters, S.; Biber, E. An Experimental Robot Load Identification Method for Industrial Application. Int. J. Robot. Res. 2002, 21, 701–712.
9. Kelly, R.; Santibaez, V.; Reyes, F. On saturated-proportional derivative feedback with adaptive gravity compensation of robot manipulators. Int. J. Adapt. Control. Signal Process. 2015, 10, 465–479.
10. Lin, H.; Hui, C.W.V.; Wang, Y.; Deguet, A.; Kazanzides, P.; Au, K.S. A Reliable Gravity Compensation Control Strategy for dVRK Robotic Arms With Nonlinear Disturbance Forces. IEEE Robot. Autom. Lett. 2019, 4, 3892–3899.
11. Zhijun, W.; Lu, L.; Zhanxian, L.; Liwen, C. Research on dynamic force compensation for robots based on six-dimensional force sensors. Mech. Des. 2020, 37, 72–77.
12. Lingtao, H.; Lin, W.; Shui, N.; Jingsong, Y.; Tao, N. Research on robot flexibility control based on force sensor gravity compensation. J. Agric. Mach. 2020, 51, 386–393.
13. Ni, Z.; Zhang, Y.; Shen, X.; Wu, S.; Wu, Z. Payload parameter identification method of flexible space manipulator. Aircr. Eng. Aerosp. Technol. 2020, 92, 925–938.
14. Gen, L.; Pengcheng, L.; Chao, W.; Ye, S. Research on the optimization algorithm of robot load gravity compensation based on genetic algorithm. Aviat. Manuf. Technol. 2021, 64, 52–59.
15. Che, H.; Yiwen, Z.; Bi, Z.; Yingli, L.; Xingang, Z. Robotic arm load estimation based on optimal excitation pose sequences. Robot 2020, 42, 503–512.
16. Tao, W.; Bo, P.; Yili, F.; Shuguo, W. Minimally invasive surgical robot force feedback main hand gravity compensation study. Robot 2020, 42, 525–533.
17. Montalvo, W.; Escobar-Naranjo, J.; Garcia, C.A.; Garcia, M.V. Low-Cost Automation for Gravity Compensation of Robotic Arm. Appl. Sci. 2020, 10, 3823.
18. Razmi, M.; Macnab, C.J.B. Near-optimal neural-network robot control with adaptive gravity compensation. Neurocomputing 2020, 389, 83–92.
19. Morooka, Y.; Mizuuchi, I. Gravity Compensation Modular Robot: Proposal and Prototyping. J. Robot. Mechatron. 2019, 31, 697–706.
20. Jintian, Y.; Yangyang, L.; Quanli, Y.; Hongqiang, S. Research on Compensation Strategy of Additional Force of Main Manipulator with Force Feedback. China Mech. Eng. 2017, 28, 1156–1162.
21. Jin, J.; Gans, N. Parameter identification for industrial robots with a fast and robust trajectory design approach. Robot. Comput. Integr. Manuf. 2015, 31, 21–29.
22. Song, Y.; Duan, J.; Xiang, L.; Li, C.; Yao, J.; Dai, Z. Design and application of a kind of high-precision, loosely coupled six-axis force (torque) sensor. J. Nanjing Univ. Aeronaut. Astronaut. 2022; in press.
Figure 1. Diagram of gravity compensation.
Figure 2. Joint-time trajectory.
Figure 3. End-effector position, velocity, and acceleration profiles. (a) Position; (b) Velocity, acceleration.
Figure 4. Physical test scene diagram.
Figure 5. Test trajectories in Cartesian space.
Figure 6. Results after compensation of gravitational and inertial forces.
Figure 7. The result of the traditional compensation algorithm.
Table 1. The characteristic information of the sensor.
Type: GTL96003AXX (self-developed)
Range: Fx/Fy: 300 N; Fz: 500 N; Tx/Ty/Tz: 25 Nm
Weight: 700 g
Size: Ø80 mm × 40 mm
Protection level: IP65
Overload capacity: 500% FS
Resolution: 0.1 N / 0.02 Nm
Acceleration sensing accuracy: acceleration ≤ 0.01 g, angular velocity ≤ 0.05 deg/s
Table 2. Identification result.
$F_{x0}$ = −0.6672 N, $F_{y0}$ = 0.8565 N, $F_{z0}$ = 0.3538 N, $T_{x0}$ = 0.0228 Nm, $T_{y0}$ = 0.0084 Nm, $T_{z0}$ = 0.0080 Nm
x = 0.5 cm, y = 0.2 cm, z = 5.1 cm, G = 8.862 N, U = −9.8716°, V = −5.3709°
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

