Article

Proximity Sensor for Thin Wire Recognition and Manipulation

Andrea Cirillo, Gianluca Laudante and Salvatore Pirozzi *
Dipartimento di Ingegneria, Università degli Studi della Campania “Luigi Vanvitelli”, Via Roma 29, 81031 Aversa, CE, Italy
* Author to whom correspondence should be addressed.
Machines 2021, 9(9), 188; https://doi.org/10.3390/machines9090188
Submission received: 18 June 2021 / Revised: 23 August 2021 / Accepted: 1 September 2021 / Published: 3 September 2021
(This article belongs to the Special Issue State-of-Art in Sensors for Robotic Applications)

Abstract
In robotic grasping and manipulation, precise knowledge of the object pose is a key issue. This becomes even more important as the objects and, consequently, the grasping areas become smaller. This is the case of Deformable Linear Object manipulation applications, where the robot must autonomously work with thin wires whose pose and shape can be difficult to estimate, given the limited object size and possible occlusion conditions. In such applications, a vision-based system may not be sufficient to obtain an accurate pose and shape estimation. In this work the authors propose a Time-of-Flight pre-touch sensor, integrated with a previously designed tactile sensor, for the accurate estimation of thin wire pose and shape. The paper presents the design and the characterization of the proposed sensor. Moreover, a specific object scanning and shape detection algorithm is presented. Experimental results support the proposed methodology, showing good performance. The hardware design and software applications are freely accessible to the reader.

1. Introduction

Given the move from structured, safe and controlled robotic cells to unstructured, dynamic and human-shared environments, robots are being called to perform ever more complex tasks, often mimicking human beings or collaborating with them. As tasks increase in complexity, robots require more sensing abilities to autonomously determine the optimal task strategy, to enact it and to guarantee the maximum level of safety for themselves, the external environment and the surrounding humans. Nowadays robots are equipped with various and heterogeneous sensory systems, e.g., cameras, odometers, LIDARs and radars, successfully accomplishing navigation tasks without colliding with external and dynamic objects [1,2].
As for navigation tasks, in manipulation tasks robots must be able to perceive nearby objects and to estimate their pose with high accuracy, sometimes even before grasping and manipulating them. Grasping and manipulation are still challenging due to the intrinsic difficulty of accurately perceiving objects of different shapes and sizes in cluttered environments. In these types of applications, vision/depth sensors can capture positional and geometric information, but they are affected by problems related to occlusions and calibration errors. The latter may result in an imprecise estimation of object shape and pose, with severe impacts on the grasping process. Although in-hand cameras are preferred to the more conventional head-mounted ones, the classical calibration procedures still have limitations and remain imprecise even when the distance between the sensor and the object is reduced, as demonstrated in [3].
While cameras can be employed by robots to sense objects from a relatively large distance, tactile sensing can be used only in direct contact with an object. However, tactile sensors are typically installed on the robot end-effector, e.g., robot grippers or robotic hands, and could potentially sense areas close to the objects that are not accessible to cameras. By exploiting the strategic position of tactile sensors, the integration of proximity sensors on them could enable robots to rely on pre-touch sensing, easily allowing the estimation of object pose and shape during the approaching and grasping phases. Differently from vision-based and tactile sensors, pre-touch sensors operate at an intermediate range, providing the benefits of both of the aforementioned sensor classes: mounted on the robot end-effector, they are more robust against occlusion than cameras; working at a closer range, they may potentially provide more precise measurements; and, like camera/depth sensors, they do not require contact with objects.
In these terms, through a specific scanning strategy, pre-touch sensors enable robots to acquire the geometric information of an object necessary to estimate its pose and shape and, hence, to perform grasping, manipulation and re-grasping actions by exploiting more accurate data.
Pre-touch sensors have already been used in robotics, and different technologies, e.g., optical, electric field and acoustic, have been adopted in recent years. Electric field (EF) pre-touch sensors have been widely used [4,5,6]. EF sensors exploit the electric field generated between two electrodes, irradiating nearby objects with AC signals; by analyzing the alteration of the electric field, information about the irradiated objects can be reconstructed. Despite their good performance, EF sensors are limited to conductive materials with a high dielectric constant, so they are not very effective with paper or rubber objects. Acoustic pre-touch sensors have been proposed in [7,8], where the seashell effect has been studied. With this sensor technology, a cavity and a microphone are used to scan the environment, reconstructing object shape information by analyzing the changes in the resonant frequency spectrum while the robot approaches the object. Optical sensors have been demonstrated to be the most reliable technology, able to work with a wide range of materials with precise and accurate results. References [9,10,11] showed the effectiveness of the optical approach, obtaining the shape or the distance of the object of interest with high precision. All three of these approaches measure the amount of light reflected by the objects, so intrinsic problems due to the specific calibration procedures required for each color and surface reflectivity must be taken into account.
Among optical pre-touch sensors, Time-of-Flight (ToF) technology represents the solution to overcome the issues of classical optical sensors. As shown in [3,12,13], Time-of-Flight sensors do not need calibration and are sufficiently robust and accurate with a wide range of materials. In particular, Lancaster et al. [3] successfully combined a depth camera-based point cloud with pre-touch sensor information to improve the accuracy of object pose estimation, while Yang et al. [12] used similar Time-of-Flight-based sensors to improve the accuracy of grasping actions in sequential manipulation tasks during Rubik’s Cube solving. More recently, Sasaki et al. [13] successfully exploited ToF sensors to increase the robustness of robot hand positioning during grasping tasks. Because slight errors in posture and position are enough to cause grasping failure, they used proximity sensors installed on the hand fingernails, besides the fingertips, to detect both the target object and the support surface. Proximity sensor data are then used in a grasping control technique based on the relative pose between the hand and the support surface and between the hand and the target object.
For the sake of completeness, the authors remark that pre-touch sensing is not only useful for object shape and pose estimation, but can also provide a relevant contribution in safety-critical situations, e.g., Human-Robot Collaboration tasks. Some examples are reported in [14,15], where proximity sensors are used in nursing applications to prevent and avoid unintended collisions with patients, absorb the resulting impacts and perform automatic rehabilitation activities.
Considering the good performance and the limited dimensions of the new off-the-shelf ToF sensors, the authors believe that they represent the best enabling technology for pre-touch sensing. In particular, starting from the tactile sensors already developed by some of the authors over the last ten years [16,17], this paper presents a new pre-touch sensing solution, fully integrated into the pre-existing tactile sensors and able to recognize the shape of objects through a specific scanning procedure. The work focuses on Deformable Linear Object (DLO) manipulation, which represents one of the main objectives of the H2020 REMODEL project, https://remodel-project.eu/ (accessed on 2 September 2021). More in detail, during switchgear assembly the recognition of the shape and pose of the manipulated wires is fundamental for successful task execution. A precise knowledge of the wire pose obtained with pre-touch sensors in the pre-grasping phase can be used to correct the a priori knowledge of the grasping target obtained with supplementary sensors, e.g., depth cameras. In this way the robot end-effector pose can be corrected, guaranteeing a more stable wire grasp. The need for a pre-touch sensor, intended also as a complementary sensor for camera-based systems, is confirmed by supplementary studies conducted within the H2020 REMODEL project and presented by Cop et al. [18]. In particular, a benchmark between several commercial depth cameras has been conducted in order to highlight the limitations of these sensors when used to recognize very small and thin objects, i.e., wires. The camera performance has been evaluated through standard metrics, focusing carefully on object distance estimation (depth in the Z-dimension). The study shows that low- and mid-range cameras, e.g., Intel RealSense D415, Microsoft Kinect Azure, etc., are not able to provide an accurate reconstruction of wires thinner than 4.0 mm. Moreover, reconstruction accuracy is affected by the object materials and is highly degraded when transparent objects are considered.
The direct use of 2D camera images for wire shape recognition can be sufficient in constrained conditions, where 3D information is not necessary. Chapman et al. [19] recently faced the FFC cable grasping problem, approaching the challenge with a customized low-cost gripper. An in-hand RGB camera allows the recognition and 2D pose estimation of the flat cables through standard computer vision methods. However, their specific task requires solving only a 2D problem, given that the cables lie on a flat surface assumed to be parallel to the robot base. In these terms, no information about the distance between the cables and the support is needed, so the main effort is focused on the design of the gripper used to efficiently grasp and assemble the cables. More generally, wire grasping and manipulation require the estimation of the object's 3D pose. An accurate knowledge of the complete pose in Cartesian space using 2D camera images is quite challenging and computationally expensive, as shown in [20], where the authors combine multiple 2D views to reconstruct the 3D pose. This approach requires tackling problems related to the alignment of the multiple views by considering the DLO features. Once the different perspective views have been detected (not an immediate task, given that they depend also on the available lighting and on the reachable areas), object reference points must be used to merge the 2D images; in the case of DLOs, these are not easily obtainable due to the high uniformity of their surface and to the absence of shape corners, which are usually exploited for more common objects.
Although the effectiveness of ToF-based pre-touch sensors has been widely demonstrated in other works (see [12,13]) for daily-life and common objects, e.g., bottles, hammers and screwdrivers, fruits and vegetables, no evidence of their performance is reported for small and thin objects such as wires. For that reason, this work further explores the application of ToF pre-touch sensing in the DLO manipulation tasks addressed by REMODEL. More in detail, the manipulation of DLOs requires an accurate 3D reconstruction of the wires to allow a correct grasp for the execution of the subsequent task (e.g., wire insertion, wire routing). To reach this goal, this paper presents a custom hardware and software design for a ToF-based sensor. Additionally, to evaluate whether the sensor performance is adequate for the objective, a detailed characterization is also reported. Specific experiments have been defined and carried out to evaluate the proposed solution with thin wires.
The paper is organized as follows. Section 2 describes the main design features of the pre-touch sensor used in this paper. Section 3 tackles the sensor characterization by reporting the main characteristics, such as Signal-to-Noise Ratio (SNR), repeatability and hysteresis. Section 4 describes in more detail the scanning procedure used to extract shape and pose information from the pre-touch sensor data. Section 5 reports the experimental analysis and results. Finally, Section 6 highlights the conclusions and future work.

2. Sensor Technology

This section describes the mechanical, software (SW) and hardware (HW) design of the proposed prototype. The design refers entirely to the custom solution presented in this work. Only a few references to the chip datasheet are reported, in order to provide general functioning information used in Section 3 and to support a more detailed comparison between the proposed solution and a commercial one.
Starting from the tactile sensor developed and presented by the authors in [16,17], the pre-touch (proximity) sensor prototype described in this paper has been designed to be compatible with the existing tactile sensor solution from both the mechanical and the hardware/software points of view. In particular, compatibility with the mentioned tactile sensor represented the principal design constraint; e.g., the new board has been designed so that it can be installed on the rear side of the tactile sensor while respecting the electrical and software interface of the latter. The choice of a reliable, accurate, but also small-sized proximity sensor is mandatory in order not to significantly alter the shape and dimensions of the tactile sensor chassis. Given the previously listed requirements, the VL6180X Time-of-Flight sensor by STMicroelectronics has been chosen.
As said before, there are two ideas behind the integration of proximity sensors on the tactile finger: (a) to detect unknown objects in the surroundings of the sensorized finger when the robotic arm on which it is installed is performing a pick and place task; (b) to estimate the position and shape of close objects, i.e., thin wires, that the robotic gripper has to grasp and manipulate. Proximity sensors have to be placed along different axes in order to detect objects in all directions, see Figure 1. From this comes the need to design small and multiple proximity sensor modules which can be installed, with a Plug&Play procedure, on the existing tactile sensor using small and non-invasive connectors and PCBs.

2.1. Mechanical Design

The mechanical design starts from the CAD drawings of the tactile sensor chassis. In order to cover four directions around the tactile sensor with the proximity module Field of View (FoV), and given the limited space available on the pre-existing chassis of the tactile sensor, i.e., 25 × 45 mm, specific connection solutions have been identified. The chosen connectors allow for the installation of new, small PCBs at 90 degrees w.r.t. the tactile sensor chassis while guaranteeing a low height profile, i.e., the new electronics can be embedded in a 10 mm-high box. In fact, with the selected components, it is possible to design proximity sensor modules with very small PCBs, i.e., 12 × 8 mm. See Figure 2 for the CAD drawings.
For more details, the part numbers of the connectors are listed below:
  • Samtec CLP-103-02-F-DH: female horizontal low profile connector;
  • Samtec FTSH-103-04-F-DV: male vertical connector (compatible with CLP-103-02-F-DH);
  • Samtec CLP-103-02-F-D: female vertical low profile connector;
  • Samtec FTSH-103-03-F-DV: male vertical connector (compatible with CLP-103-02-F-D).

2.2. Hardware Design

This section describes the design of the system from a hardware point of view. The electronics have been developed by separating the pre-touch system into two parts:
  • An interface board, which shall be installed on the rear side of the tactile sensor chassis;
  • A self-contained sensor module that hosts the VL6180X.
According to the specific need to locally have, or not, a dedicated processing unit, two versions of the interface board have been designed. The first one (v1) represents only an adapter, a kind of bridge, between the proximity sensor modules and the interrogation board of the tactile sensor. The board, whose dimensions are 24 × 23 mm, is able to host up to four proximity sensor modules through horizontal connectors and can be connected to the tactile board via an 8-way connector, i.e., JST SM08B-SRSS-TB. Figure 3a reports the Eagle [21] view of the designed 2-layer PCB, while Table 1 reports the interface connector pinout.
The second version (v2) of the interface board provides a complete, compact (24 × 34 mm) interrogation system for the proximity sensor modules. Like the first version, it can host up to four proximity modules, but it also provides a processing unit for sensor data acquisition. The MCU is a PIC microcontroller, i.e., the Microchip PIC16F19176. The microcontroller can be externally interrogated through a serial bus via the 5-way interface connector, while it scans the proximity sensor modules via the I2C interface. Differently from v1, in v2 the middle proximity module is directly soldered on the PCB and is no longer connected through a connector. Moreover, the board provides a programming connector that can be used to flash the PIC firmware via a PICkit 3 programmer. Figure 3b reports the Eagle view of the designed 2-layer PCB, while Table 2 reports the interface and programming connector pinouts.
The proximity sensor module has been designed to be self-contained. The module PCB hosts all the components needed to properly communicate with the VL6180X device and to power it: an LDO regulator, i.e., a TPS76928 by Texas Instruments, provides the 2.8 V supply to the sensor, while two NXP BSS138 transistors are used as level shifters to adapt the logic levels of the I2C bus. The module, with dimensions of only 12 × 8 mm, can be accessed via a 6-way Samtec connector, namely the Samtec FTSH-103-04-F-DV. Figure 4 shows the Eagle view of the PCB of the proximity sensor module, while Figure 5 shows the top and bottom views of the complete proximity sensor module.
Figure 6 reports a view of the complete system, composed of the interface board (v1) and two proximity sensor modules integrated on the same finger that hosts the tactile sensor.

2.3. Software Design-MCU Side

The VL6180X is a Time-of-Flight proximity sensor that communicates with a microcontroller system through a serial I2C bus (acting as a slave device). The register access protocol is not a standard one, so a custom low-level I2C driver for read and write operations had to be developed. As described in Section 2.2, the chosen processing unit is a Microchip PIC16F19175/176 microcontroller, so the Microchip MPLAB IDE and XC8 compiler have been chosen as the firmware development toolchain. According to [22], the VL6180X has different registers that can be used to:
  • Configure the range measurement functionality;
  • Configure the ambient light sensor measurement functionality;
  • Configure the convergence time;
  • Configure the measurement period;
  • Read the sensor measurements.
In order to access these registers, the I2C protocol reported in Figure 7a,b has to be implemented, where “S” is the start bit, “As” is the acknowledge of the slave device, “Am” is the acknowledge of the master device and “P” is the stop bit. A high-level API library has been developed to allow the user to easily configure the device and read the range measurements. The APIs are listed below:
  • uint8_t VL6180X_IdentifyDevices(void): identifies the number of VL6180X devices on the I2C bus;
  • void VL6180X_SetChipEnable(uint8_t DeviceNumber): enables the device through the dedicated GPIO microcontroller pin;
  • bool VL6180X_WriteRegister(uint8_t address, uint16_t reg, uint8_t data): writes a device register;
  • uint8_t VL6180X_ReadRegister(uint8_t address, uint16_t reg, bool error): reads a device register;
  • void VL6180X_SetupRegisters(uint8_t DeviceNumber): sets up all the configuration registers of the device;
  • void VL6180X_MeasureRange(uint8_t NumDeviceToRead, uint8_t measure, uint8_t status): reads the range measurements of the devices on the I2C bus.
A typical application should call in sequence the following routines in order to properly accomplish a range measurement:
  • VL6180X_IdentifyDevices();
  • VL6180X_SetupRegisters() for each detected device;
  • VL6180X_MeasureRange() for one range measurement.
This specific SW implementation allows one to plug and unplug the proximity sensor modules without any SW modification or re-compilation: when the number of proximity modules changes, restarting the MCU is enough to automatically re-configure it.
The developed SW library is freely available to the readers at the repository reported in [23].

2.4. Software Design-PC Side

Concerning the design of the software developed to acquire and elaborate the data from the proximity sensors, two versions are available, i.e., one for Windows OS systems and one for Linux OS systems. For both versions, the Robot Operating System (ROS) has been selected as the framework for process scheduling and inter-process communication management. In this paper the Linux version is described in more detail, but the Windows version generally follows the same SW design and specifications.
The software has been designed to be as flexible as possible and independent of the type of sensor (the tactile sensor is also compatible) and of the number of sensing elements installed on it. Moreover, thanks to the particular protocol used for data exchange, it is possible to automatically detect the number and the type of sensors attached to the elaboration system, e.g., a PC-based system. This means that the designed SW, in the initialization phase, is able to detect:
  • The number of available serial ports on the PC system;
  • The sensor type attached to each serial port: tactile or proximity sensor;
  • The number of sensing elements installed on each sensor, i.e., number of tactile elements or VL6180X modules.
Once the initialization phase is completed, the proximity sensor RAW data are made available on a ROS topic named “/sensor_data”. The RAW data correspond to the distance (unfiltered signal), in millimeters, from the observed object. The distance is encoded as an unsigned 8-bit integer (uint8_t) and the maximum reading frequency reached with the presented system is 50 Hz for each proximity module. With the configuration used in this specific application, the proximity sensor is able to measure the distance from an object with millimetric accuracy in a range from 0.5 mm to 100 mm.
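As an illustration of the PC-side data flow, a minimal ROS subscriber for this topic could look like the following sketch. The message type is an assumption made here for illustration (std_msgs/UInt8MultiArray, one RAW uint8 distance per module); the actual message definition used by the authors may differ.

```python
# Minimal sketch of a consumer for the "/sensor_data" topic described above.
# Assumption: the topic carries a std_msgs/UInt8MultiArray with one RAW
# distance sample in millimeters per proximity module.
import rospy
from std_msgs.msg import UInt8MultiArray


def on_sensor_data(msg):
    distances_mm = list(msg.data)  # one uint8 reading per VL6180X module
    rospy.loginfo("proximity [mm]: %s", distances_mm)


if __name__ == "__main__":
    rospy.init_node("proximity_listener")
    rospy.Subscriber("/sensor_data", UInt8MultiArray, on_sensor_data)
    rospy.spin()  # readings arrive at up to 50 Hz per module
```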
Readers interested in more details about the PC-side SW design and implementation can refer to [17].

3. Sensor Characterization

This section reports the sensor characterization in terms of repeatability, hysteresis, SNR and sampling frequency. In order to provide a complete overview of the designed system, the developed proximity sensor and the interrogation board based on the PIC MCU have been compared with the off-the-shelf ST solution. The characterization and comparison analyses show the effectiveness of the proposed solution. The two systems under comparison are summarized in Table 3.
From a software point of view the ST solution provides:
  • A Windows-based application (VL6180X_Explorer) with a graphic interface that allows the user to visualize the sensor data at run-time, e.g., range in mm, Ambient Light Sensor (ALS) mode data and frequency, and to log the acquired data in CSV format;
  • A firmware compatible with the Windows application and the STM32F401 Nucleo Board.
To better understand how the sensor data are acquired with both systems, please refer to Figure 8. The upper diagram schematizes the operating phases of the ST solution, while the lower one reports the operating phases of the designed solution. It is clear that the ST solution does not allow one to read the RAW sensor data; in fact, at the PC application level only filtered data are available. In particular, the sensor data are processed by two filtering steps:
  • A low pass filter (LPF) is implemented at firmware level and its output is provided to the PC application;
  • A filtering process based on the SNR, implemented at application level: when the SNR of a single sample is too low, the sample is discarded.
The second filtering step is necessary in order to obtain a clean ranging signal: given the low-level driver configuration of the VL6180X device in the ST solution, the acquired data are not always correct due to high convergence times and low SNR. The upper image in Figure 9 shows:
  • the data not processed with the second filtering stage, in blue;
  • the data processed with both filtering stages, in red.
The lower image reports the SNR acquired from the ST PC application. The red curve is obtained by selecting the samples of the blue curve through SNR thresholding: any sample elaborated by the LPF with an SNR lower than 255 is discarded.
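As a minimal sketch, and assuming the LPF output is logged as (time, distance, SNR) tuples, this second filtering step can be reproduced as follows:

```python
# Application-level SNR gating, as in the ST processing chain: samples whose
# reported SNR is below the threshold are discarded.
# Assumption: lpf_samples is a list of (time, distance, snr) tuples.
SNR_THRESHOLD = 255

def snr_gate(lpf_samples):
    return [(t, d) for (t, d, snr) in lpf_samples if snr >= SNR_THRESHOLD]
```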
Concerning the custom solution, the SW does not implement any filtering stage, so all the analyses are performed on the RAW sensor signal.
Figure 10 reports the characterization setup for the two solutions. A paper box is used as the object, while a standard meter is used to measure the distance between the object and the sensor. The same setup is used for the characterization of both systems.

3.1. Sampling Frequency

Sampling frequency characterization is of high interest for a Time-of-Flight sensor such as the VL6180X. In such sensors, the sampling frequency strongly depends on the convergence time of each measurement and, hence, on the object distance. From the datasheet (see Figure 11), the total execution time of a single reading process can be calculated as the sum of:
  • Pre-Calibration Time: 3.2 ms;
  • Range Convergence Time: variable;
  • Readout Averaging Time: 4.3 ms.
The range convergence time depends on the maximum range the sensor is configured for. For example, for a maximum range of 100 mm, the convergence time is 10.73 ms with a target reflectance of 3% and 0.73 ms with a target reflectance of 88%. In these conditions, a reading operation typically takes between 8.23 ms and 18.23 ms, which corresponds to a sampling frequency range of [54.8, 121.5] Hz.
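The timing budget above can be checked with a short worked example based only on the datasheet values quoted in this section (the small difference w.r.t. the 54.8 Hz quoted above is due to rounding):

```python
# Total reading time = pre-calibration + range convergence + readout averaging.
PRE_CALIBRATION_MS = 3.2
READOUT_AVERAGING_MS = 4.3

for reflectance, convergence_ms in [("3%", 10.73), ("88%", 0.73)]:
    total_ms = PRE_CALIBRATION_MS + convergence_ms + READOUT_AVERAGING_MS
    print(f"{reflectance} reflectance: {total_ms:.2f} ms -> {1000 / total_ms:.1f} Hz")

# 3% reflectance: 18.23 ms -> 54.9 Hz
# 88% reflectance: 8.23 ms -> 121.5 Hz
```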
Figure 12a,b report the sampling frequency for the two systems. In the ST solution case, the behaviour described at the beginning of this section has to be taken into account: the sampling frequency reported in Figure 12a refers to the data available before the second filtering stage, so it is not the sampling frequency obtained at the end of the processing chain. In both cases, the sampling frequency depends on the obstacle distance: the higher the distance, the lower the sampling frequency. The custom solution shows a more stable behavior and, in general, a higher sampling frequency when the object distance is about 100 mm, e.g., 68 Hz.

3.2. Repeatability

The repeatability evaluation has been carried out considering a total of 10 measurements. During the tests, the object has been moved several times to different positions in order to obtain 10 measurements for each of the following distances: 10 mm, 20 mm, 30 mm, 50 mm, 100 mm. Figure 13a,b report the results. The repeatability error has been evaluated by considering the variance w.r.t. the mean value computed over the 10 measurements. The ST solution shows a repeatability error that increases when the object is closer to the sensor, e.g., at 10 mm the maximum percentage error is 68%, and that decreases when the object is farther, e.g., at 100 mm the maximum percentage error is 0.72%. For the custom solution, the repeatability error is quite constant over the different evaluated object distances: at 10 mm the maximum repeatability error is 16%, while at 100 mm it is 13%.
The same measurements have also been used to compute the measurement error. Green lines/stars in Figure 13 represent the expected values. In particular, the measurement error is obtained as the difference between the mean values computed over the 10 measurements and the expected values. As for the repeatability error, the ST solution shows a measurement error that increases when the object is closer to the sensor, e.g., at 10 mm the maximum percentage error is 57%. Differently, the custom solution shows a more constant measurement error of about 10%.
Table 4 summarizes the repeatability and the measurement errors in the different working conditions for the two solutions.

3.3. Hysteresis

Figure 14a,b report the hysteresis for the two solutions. The sensor behavior is quite linear for both solutions, and the tests showed the absence of hysteretic behavior for the VL6180X sensor. The hysteresis error has been computed by finding the maximum difference between the distance values on the two “sides” of the hysteresis graph corresponding to the same object distance value, according to the following equation:
$$ e_{hyst} = \frac{\max \left| d_{incr} - d_{decr} \right|}{d_{max}} \times 100, \tag{1} $$
where $d_{incr}$ and $d_{decr}$ are the distance values on the increasing and decreasing sides, respectively, and $d_{max}$ is the maximum distance value reached during the experiment. In this specific test, data from one cycle have been used to compute the error index; no relevant differences are obtained with two or more cycles. A code sketch of this index is reported after the list below.
According to the previous equation, the hysteresis errors for the two systems are:
  • ST solution hysteresis error: $e_{hyst} = 4.02\%$;
  • Custom solution hysteresis error: $e_{hyst} = 1.53\%$.
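As a sketch, the index of Equation (1) can be computed from logged data of one cycle as follows; here d_incr and d_decr are assumed to be readings resampled at the same commanded object distances on the two sides of the cycle:

```python
import numpy as np

# Hysteresis index of Equation (1): maximum gap between the increasing and
# decreasing sides, normalized by the maximum distance reached in the test.
def hysteresis_error(d_incr, d_decr):
    d_incr = np.asarray(d_incr, dtype=float)
    d_decr = np.asarray(d_decr, dtype=float)
    d_max = max(d_incr.max(), d_decr.max())
    return np.abs(d_incr - d_decr).max() / d_max * 100.0
```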

3.4. Power Spectrum

The power spectrum of the two systems has been computed to evaluate the SNR. The tests have been performed placing the object at the following distances, in order to evaluate whether the SNR depends on the object distance: 100 mm, 50 mm, 20 mm, 10 mm.
The results are reported in Figure 15a,b. Since the signal bandwidth is limited to a few hertz, the noise level is about 4–5 orders of magnitude below the signal level in all cases.
The noise level of the custom solution is higher than in the ST case; this is clearly due to the filtering stages used in the ST processing chain.

4. Wire Detection and Scanning Procedure

As said in Section 1, one of the main objectives of the H2020 REMODEL project is Deformable Linear Object manipulation, i.e., thin wires for switchgear assembly. For such a task, and for all grasping and manipulation tasks on thin and small objects where the contact area is very small, an accurate knowledge of the object pose is fundamental. To this aim, in this work the proposed pre-touch sensor has been used to improve the a priori knowledge of the object pose by reconstructing a proximity-based point cloud. In particular, a specific scanning strategy has been adopted to scan a predetermined area: the measurements of the pre-touch sensor, properly processed to be expressed w.r.t. the robot base frame, are collected in order to obtain the object point cloud. By properly processing the point cloud, the shape of the wire under analysis is estimated. The validity of the reconstruction algorithm is demonstrated by letting the robot end-effector autonomously follow the wire along its entire length through a classical position control. In the following, more details on the scanning strategy and on the estimation algorithm are reported, while Section 5 describes the experimental results.
Let $\Sigma_b$ and $\Sigma_e$ be the robot base frame and the end-effector frame (whose origin corresponds to the center of the proximity sensor), respectively. See Figure 16 for clarification of the reference frame poses. For simplicity, let us divide the scanning strategy into two main parts: (a) the scanning phase and (b) the shape estimation phase.
(a) The scanning strategy algorithm has been designed to generate a point map starting from the following parameters (refer to Figure 17):
  • $\Delta X_b$: the dimension of the area to scan along the x-axis of $\Sigma_b$, expressed in m;
  • $\Delta Y_b$: the dimension of the area to scan along the y-axis of $\Sigma_b$, expressed in m;
  • $\Delta Z_e$: the offset along the z-axis of $\Sigma_e$, expressed in m;
  • $\Delta Scan$: the distance between two successive scanning lines, expressed in m;
  • $v$: the scanning speed, expressed in m/s;
  • ScanningDirection: the main axis for the scanning map generation.
With the defined parameters, the initial point $p_i^e$ and the final point $p_f^e$ of the map are $p_i^e = [0, 0, \Delta Z_e]^T$ and $p_f^e = p_i^e + R_b^e [\Delta X_b, \Delta Y_b, 0]^T$, where $R_b^e$ is the rotation matrix expressing $\Sigma_b$ in $\Sigma_e$. In these terms, a Y-scanning point map can be obtained as in Equation (2).
$$ P_{map}^e = \begin{bmatrix} p_1^e & p_2^e & \cdots & p_n^e \\ 1 & 1 & \cdots & 1 \end{bmatrix} = \begin{bmatrix} x_i^e & x_i^e & x_i^e + \Delta Scan & x_i^e + \Delta Scan & x_i^e + 2\Delta Scan & \cdots \\ y_i^e & y_i^e + \Delta Y^e & y_i^e + \Delta Y^e & y_i^e & y_i^e & \cdots \\ \Delta Z_e & \Delta Z_e & \Delta Z_e & \Delta Z_e & \Delta Z_e & \cdots \\ 1 & 1 & 1 & 1 & 1 & \cdots \end{bmatrix} \tag{2} $$
The X-scanning map is obtained similarly. To compute the scanning trajectory in Cartesian space expressed in the base frame, Equation (3) is used:
$$ P_{map}^b = T_e^b \, P_{map}^e, \tag{3} $$
where $T_e^b$ is the homogeneous transformation matrix that expresses $\Sigma_e$ in $\Sigma_b$, computed with the direct kinematics from the initial robot joint configuration. Once the point map $P_{map}^e$ is generated, a position trajectory in Cartesian space is obtained with a constant speed profile according to the parameter $v$. Afterwards, the joint-space trajectory used to control the robot is computed with a first-order CLIK (Closed-Loop Inverse Kinematics) algorithm [24]. The point cloud is obtained by collecting the pre-touch sensor measurements, acquired at a rate of 50 Hz while the robot moves along the previously computed trajectory. In particular, the k-th point of the point cloud is stored as $pc_k^b = [x_{pc_k}^b, y_{pc_k}^b, z_{pc_k}^b]^T$, where (a code sketch of the map generation and point storage follows the list below):
  • $x_{pc_k}^b = x_e^b$ is the current x-coordinate of the end-effector;
  • $y_{pc_k}^b = y_e^b$ is the current y-coordinate of the end-effector;
  • $z_{pc_k}^b = z_e^b - m_k$ is the current z-coordinate of the end-effector minus the current proximity sensor measurement.
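The following sketch illustrates the Y-scanning map of Equation (2) and the point storage rule above; function and variable names are hypothetical placeholders for illustration, not the actual REMODEL software interface.

```python
import numpy as np

# Y-scanning map of Equation (2): columns are homogeneous points expressed
# in the end-effector frame, sweeping y back and forth while stepping along x.
def y_scanning_map(x_i, y_i, dz_e, delta_x, delta_y, delta_scan):
    cols = []
    for k, x in enumerate(np.arange(x_i, x_i + delta_x + 1e-9, delta_scan)):
        # Alternate the sweep direction along y on every scan line.
        y_pair = (y_i, y_i + delta_y) if k % 2 == 0 else (y_i + delta_y, y_i)
        cols += [[x, y_pair[0], dz_e, 1.0], [x, y_pair[1], dz_e, 1.0]]
    return np.asarray(cols).T  # 4 x n matrix, as P_map^e


# Point storage rule: x, y from the current end-effector position (base
# frame), z minus the current proximity reading (converted to meters).
def store_point(ee_position_b, range_mm):
    x_e, y_e, z_e = ee_position_b
    return np.array([x_e, y_e, z_e - range_mm / 1000.0])
```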
Figure 18 reports an example of a point cloud acquired on a 2.5 mm diameter wire at a cruise speed of 0.02 m/s.
(b) The shape estimation phase consists of approximating the scanned wire shape with a third-order polynomial. The algorithm can be described by the following steps:
  • Segmentation of the point cloud acquired during the scanning phase into separate rows (or columns), considering the points in the $P_{map}^e$ of Equation (2) two by two;
  • Selection of the point with the maximum z-coordinate for each segment;
  • Computation of the analytical form of the wire shape estimate as the third-order polynomial interpolating the points obtained in the previous step.
For the last step, supposing the wire main direction is mostly parallel to the x-axis of $\Sigma_b$, once the coordinates of the points $p_{k,max} = (x_{k,max}, y_{k,max})$ are known for each segment, it is possible to write:
$$ \begin{bmatrix} y_{1,max} \\ y_{2,max} \\ \vdots \\ y_{M,max} \end{bmatrix} = \begin{bmatrix} x_{1,max}^3 & x_{1,max}^2 & x_{1,max} & 1 \\ x_{2,max}^3 & x_{2,max}^2 & x_{2,max} & 1 \\ \vdots & \vdots & \vdots & \vdots \\ x_{M,max}^3 & x_{M,max}^2 & x_{M,max} & 1 \end{bmatrix} \begin{bmatrix} a \\ b \\ c \\ d \end{bmatrix} \tag{4} $$
with M being the number of segments. Inverting Equation (4) by using a pseudo-inverse, the parameters a, b, c and d of the polynomial are found in the minimum square error sense; a numerical sketch of this fit is reported below. If the main direction of the wire is orthogonal to the x-axis of $\Sigma_b$, the x and y coordinates in Equation (4) are exchanged.
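A compact numerical sketch of this pseudo-inverse fit, using numpy and hypothetical variable names, could be:

```python
import numpy as np

# Least-squares fit of Equation (4): x_max, y_max collect the per-segment
# maximum-z points; returns the cubic coefficients [a, b, c, d].
def fit_wire_shape(x_max, y_max):
    x = np.asarray(x_max, dtype=float)
    A = np.column_stack([x**3, x**2, x, np.ones_like(x)])  # regressor matrix
    return np.linalg.pinv(A) @ np.asarray(y_max, dtype=float)
```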
An example of shape estimation is reported in Figure 18, where the red stars are the points with the maximum z-coordinate of each row of the point cloud and the red line is the third-order polynomial approximating the wire shape.

5. Experiments

In wire manipulation, the quality of the grasp is really important to ensure the success of the whole task. For this reason, it is necessary to know precisely where to grasp the wire and, since thin wires are not easily detectable with depth cameras as pointed out in [18], other procedures are needed to compute the grasping pose with good resolution. In this work, preliminary experiments have been carried out to evaluate the performance and the limitations of the proposed procedure.
In particular, the aim of this section is to assess:
  • (a) the minimum diameter of the DLOs that is well recognizable with the proposed solution, regardless of the material;
  • (b) the minimum distance between two adjacent and parallel wires that allows one to distinguish them;
  • (c) the maximum allowed approaching distance between the sensor and the scanned wires;
  • (d) the goodness of the estimated wire shape and, thus, the possibility of using it to compute a correct grasping pose.
Referring to the parameters defined in Section 4, the experiments described below have been performed choosing $\Delta X_b$ and $\Delta Y_b$ depending on the area to scan, $\Delta Scan = 0.01$ m and ScanningDirection = X. Concerning the cruise speed $v$ used for the scanning phase, tests have been performed considering $v = \{0.005, 0.01, 0.02\}$ m/s, obtaining similar results. The scanning time depends on the above parameters: as an example, for the scanning area (about 10 × 18 cm) reported in the Video S1 attached to the paper (see details below), with $\Delta Scan = 0.01$ m and $v = 0.01$ m/s, the scanning time needed to obtain the point cloud is about 220 s.
The experimental setup (refer to Figure 16) consists of a UR5e robot manipulator by Universal Robots equipped with the Hand-e gripper by Robotiq. The proximity sensor presented in the previous sections is mounted on the tip of one of the two fingers of the aforementioned gripper. The background of the scanned area is common white paper.
The first experiment has been carried out to evaluate which DLOs, in terms of diameter and material, are recognizable with the proposed solution. In particular, four wires and a flexible hose have been taken into account (see Figure 19a), since they are the DLOs used in the REMODEL use cases. The diameters of the considered wires are (from left to right) 4 mm, 3 mm, 2 mm and 1.5 mm, while the hose diameter is 4 mm. Figure 19b reports the map scanned by the sensor and highlights how only the wire with the smallest diameter is hard to distinguish. Based on these results, the 2 mm diameter wire has been selected for the following experiments, since it is the thinnest DLO that is still well recognizable.
The second experiment is devoted to assessing whether the sensor is able to distinguish two parallel 2.0 mm diameter wires and at which distance they begin to be detected as a single wire. The results are reported in Figure 20, which shows the point clouds obtained, keeping the same end-effector height, for four different distances between the two wires: 20 mm, 15 mm, 10 mm and 5 mm. The two wires can be distinguished in the first three cases but not in the last one, i.e., with a distance of 5 mm between the wires. To partially quantify the recognition quality described above, since a ground truth for the scene is not available, the mean distance $d_{mean}$ estimated between the recognized wires is compared with the nominal one $d_{nom}$. In particular, referring to a generic case such as the one reported in Figure 21, the estimated mean distance $d_{mean}$ is obtained by computing the mean of the distances between the two points (red and black stars in the figure) corresponding to the two wires over all the “scanned segments”. Hence, an error indicator can be computed as
$$ e_d = \frac{|d_{nom} - d_{mean}|}{d_{nom}} \times 100. \tag{5} $$
The errors reported in Table 5 highlight how values greater than 30% correspond to low recognition quality.
The third experiment is similar to the second one. The aim is, again, to verify whether the sensor is able to distinguish two parallel 2.0 mm diameter wires but, this time, changing the height of the end-effector, i.e., the distance between the wires and the proximity sensor, during the scanning phase. Figure 22 reports the point clouds of two 2.0 mm diameter wires, spaced 15 mm apart, obtained with four different end-effector heights $h_e$: 20 mm, 25 mm, 30 mm and 35 mm from the wires. As in the previous experiment, the two wires can be distinguished in the first three cases but not in the last one, i.e., at a distance of 35 mm from the wires. Also in this case, to quantify the reconstruction quality, the error indicator of Equation (5) has been computed; the results are reported in Table 6.
The limitations observed in the previous experiments, i.e., the impossibility of detecting two parallel wires in the mentioned conditions, are mainly due to the Field of View (FOV) the VL6180X sensor is sensitive to. The VL6180X FOV is typically 42 degrees (half angle) in both the x and y axes. In the case of small or thin objects, the light beam used by the sensor to detect the presence of an object, and hence to measure its distance, can be reflected by surfaces belonging to two or more objects, i.e., the two wires in the experiments. In these conditions, it is not possible to properly distinguish the objects from one another. Since the FOV angle is an intrinsic parameter of the device, i.e., a non-calibratable feature, this represents a limitation of the proposed system that can be overcome only by changing the sensing element.
The last experiment is designed to show how the scanning procedure and the wire shape estimation can be used in a wire manipulation task. The goal is to demonstrate how it is possible to precisely compute the grasping pose by using the estimate of the wire shape obtained with the algorithm presented in Section 4. In the experiment, the robot end-effector follows the wire along its entire length according to the estimated shape, as shown in the video attached to the paper. To this aim, the trajectory is computed by using the x, y and z coordinates of the estimated wire for the end-effector position and the corresponding first derivative for its orientation, so as to have the fingers well positioned to grasp the wire. In particular, the first-order derivative $f'(x)$ of the polynomial $y = f(x) = ax^3 + bx^2 + cx + d$ represents, for each x, the angular coefficient of the tangent to the curve representing the wire shape; it is used, together with the x, y coordinates from the shape approximation and the z coordinate from the point cloud, to compute the pose of the robot end-effector. The latter can be described by the following homogeneous transformation matrix:
$$ T_e^b(p) = \begin{bmatrix} -\sin(\theta) & \cos(\theta) & 0 & x \\ \cos(\theta) & \sin(\theta) & 0 & y \\ 0 & 0 & -1 & z \\ 0 & 0 & 0 & 1 \end{bmatrix}, \qquad \theta = \tan^{-1}(f'(x)). \tag{6} $$
In this way, the robot end-effector follows the estimated wire shape in position and with the y-axis of $\Sigma_e$ parallel to the tangent to the wire, so as to have the fingers’ tactile pads aligned with the wire, ready to successfully grasp it; a sketch of this pose computation is reported below. Figure 23 reports some frames of the video showing the robot executing the trajectory explained above. It is important to underline that the action of the robot end-effector following the wire is not particularly useful in a real scenario; it is used here only to demonstrate the validity of the proposed shape reconstruction algorithm. In a real use case, only one grasping point is necessary and it corresponds to one of the points of the trajectory computed for this experiment.
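The pose computation can be sketched as follows; the signs follow the reconstruction of Equation (6) above, and the function name is a hypothetical placeholder:

```python
import numpy as np

# Wire-following pose: position from the estimated shape, orientation with
# the y-axis of the end-effector frame along the wire tangent and the z-axis
# pointing toward the support plane.
def wire_following_pose(x, y, z, a, b, c):
    theta = np.arctan(3 * a * x**2 + 2 * b * x + c)  # tan^-1(f'(x))
    T = np.eye(4)
    T[:3, :3] = np.array([
        [-np.sin(theta), np.cos(theta),  0.0],
        [ np.cos(theta), np.sin(theta),  0.0],
        [ 0.0,           0.0,           -1.0],
    ])
    T[:3, 3] = (x, y, z)
    return T
```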

6. Conclusions and Future Works

The paper proposed a Time-of-Flight-based proximity sensor for the detection and shape estimation of thin and small objects, presenting several contributions. From the hardware point of view, the authors developed suitable hardware for the integration of several ToF modules on board parallel gripper fingers. From the software point of view, a new library for the use of the sensors has been developed: the library improves the sensor performance with respect to the standard ST solution available with the sensor. A complete characterization of the sensor (in terms of sampling frequency, repeatability, hysteresis and SNR) has been carried out, with a comparison of the developed software solution against the standard ST one; the comparison demonstrated that the proposed solution improves the sensor performance. The discussion of the state of the art showed that alternative technologies (e.g., 2D and 3D cameras) are not able to detect thin DLOs such as the wires considered in this paper; as a consequence, the authors exploited the proposed sensor to develop a 3D scanning strategy for thin objects, in order to reconstruct the 3D pose of different wires. The estimated 3D poses can be used to grasp the DLO along its length, and different experiments have been carried out to demonstrate that the estimated poses can be used to correctly grasp the DLO at a desired point. The proposed solution for the wire scan presents limitations in some working conditions, mainly related to the Field of View of the VL6180X device, which impacts the capability to detect thinner or closer wires. In future works, further scanning and filtering strategies will be evaluated in order to improve the performance of the scan algorithm. The estimated 3D wire pose can be exploited with task planning approaches to generate optimized grasping trajectories. In addition, a deeper integration of the pre-touch sensor with the tactile one will be evaluated for the implementation of more complex grasping and manipulation tasks. For the sake of completeness, the authors make the SW packages and the HW design of the presented proximity sensor available at the GitHub link in [23].

Supplementary Materials

The following are available online at https://www.mdpi.com/article/10.3390/machines9090188/s1, Video S1: Scan Wire Demo.

Author Contributions

Hardware design, A.C. and S.P.; Software design, A.C. and G.L.; Methodology and experiments definition, all authors; Writing, all authors. All authors have read and agreed to the submitted version of the manuscript.

Funding

This work was partially supported by the European Commission within H2020 REMODEL Project (no. 870133).

Data Availability Statement

The data presented in this study are available on request from the corresponding author. The data are not publicly available since they are strictly related to the specific hardware used.

Conflicts of Interest

The authors declare no conflict of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript, or in the decision to publish the results.

References

  1. Aman, M.S.; Mahmud, M.A.; Jiang, H.; Abdelgawad, A.; Yelamarthi, K. A sensor fusion methodology for obstacle avoidance robot. In Proceedings of the 2016 IEEE International Conference on Electro Information Technology (EIT), Grand Forks, ND, USA, 19–21 May 2016; pp. 458–463.
  2. Falco, P.; Natale, C. Low-level flexible planning for mobile manipulators: A distributed perception approach. Adv. Robot. 2014, 28, 1431–1444.
  3. Lancaster, P.; Yang, B.; Smith, J.R. Improved object pose estimation via deep pre-touch sensing. In Proceedings of the 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Vancouver, BC, Canada, 24–28 September 2017; pp. 2448–2455.
  4. Wistort, R.; Smith, J.R. Electric Field Servoing for robotic manipulation. In Proceedings of the 2008 IEEE/RSJ International Conference on Intelligent Robots and Systems, Nice, France, 22–26 September 2008; pp. 494–499.
  5. Mayton, B.; LeGrand, L.; Smith, J.R. An Electric Field Pretouch system for grasping and co-manipulation. In Proceedings of the 2010 IEEE International Conference on Robotics and Automation, Anchorage, AK, USA, 3–8 May 2010; pp. 831–838.
  6. Smith, J.R.; Garcia, E.; Wistort, R.; Krishnamoorthy, G. Electric field imaging pretouch for robotic graspers. In Proceedings of the 2007 IEEE/RSJ International Conference on Intelligent Robots and Systems, San Diego, CA, USA, 29 October–2 November 2007; pp. 676–683.
  7. Jiang, L.-T.; Smith, J.R. Seashell effect pretouch sensing for robotic grasping. In Proceedings of the 2012 IEEE International Conference on Robotics and Automation, Saint Paul, MN, USA, 14–18 May 2012; pp. 2851–2858.
  8. Jiang, L.; Smith, J.R. A unified framework for grasping and shape acquisition via pretouch sensing. In Proceedings of the 2013 IEEE International Conference on Robotics and Automation, Karlsruhe, Germany, 6–10 May 2013; pp. 999–1005.
  9. Hsiao, K.; Nangeroni, P.; Huber, M.; Saxena, A.; Ng, A.Y. Reactive grasping using optical proximity sensors. In Proceedings of the 2009 IEEE International Conference on Robotics and Automation, Kobe, Japan, 12–17 May 2009; pp. 2098–2105.
  10. Maldonado, A.; Alvarez, H.; Beetz, M. Improving robot manipulation through fingertip perception. In Proceedings of the 2012 IEEE/RSJ International Conference on Intelligent Robots and Systems, Algarve, Portugal, 7–12 October 2012; pp. 2947–2954.
  11. Guo, D.; Lancaster, P.; Jiang, L.-T.; Sun, F.; Smith, J.R. Transmissive optical pretouch sensing for robotic grasping. In Proceedings of the 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Hamburg, Germany, 28 September–3 October 2015; pp. 5891–5897.
  12. Yang, B.; Lancaster, P.; Smith, J.R. Pre-touch sensing for sequential manipulation. In Proceedings of the 2017 IEEE International Conference on Robotics and Automation (ICRA), Singapore, 29 May–3 June 2017; pp. 5088–5095.
  13. Sasaki, K.; Koyama, K.; Ming, A.; Shimojo, M.; Plateaux, R.; Choley, J. Robotic Grasping Using Proximity Sensors for Detecting both Target Object and Support Surface. In Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 1–5 October 2018; pp. 2925–2932.
  14. Tsuji, S.; Kohama, T. Self-Capacitance Proximity and Tactile Skin Sensor with Shock-Absorbing Structure for a Collaborative Robot. IEEE Sens. J. 2020, 20, 15075–15084.
  15. Liang, J.; Wu, J.; Huang, H.; Xu, W.; Li, B.; Xi, F. Soft Sensitive Skin for Safety Control of a Nursing Robot Using Proximity and Tactile Sensors. IEEE Sens. J. 2020, 20, 3822–3830.
  16. De Maria, G.; Natale, C.; Pirozzi, S. Force/tactile sensor for robotic applications. Sens. Actuators A Phys. 2012, 175, 60–72.
  17. Cirillo, A.; Costanzo, M.; Laudante, G.; Pirozzi, S. Tactile Sensors for Parallel Grippers: Design and Characterization. Sensors 2021, 21, 1915.
  18. Cop, K.P.; Peters, A.; Žagar, B.L.; Hettegger, D.; Knoll, A.C. New Metrics for Industrial Depth Sensors Evaluation for Precise Robotic Applications. In Proceedings of the 2021 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Prague, Czech Republic, 27 September–1 October 2021. Available online: https://mediatum.ub.tum.de/670474?show_id=1616006 (accessed on 2 September 2021).
  19. Chapman, J.; Gorjup, G.; Dwivedi, A.; Matsunaga, S.; Mariyama, T.; MacDonald, B.; Liarokapis, M. A Locally-Adaptive, Parallel-Jaw Gripper with Clamping and Rolling Capable, Soft Fingertips for Fine Manipulation of Flexible Flat Cables. In Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), Xi’an, China, 30 May–5 June 2021.
  20. Caporali, A.; Galassi, K.; Palli, G. 3D DLO Shape Detection and Grasp Planning from Multiple 2D Views. In Proceedings of the IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM), Delft, The Netherlands, 12–16 July 2021; pp. 424–429.
  21. Autodesk EAGLE, Electronic Design Automation (EDA) Software. Available online: https://www.autodesk.com/products/eagle/ (accessed on 2 September 2021).
  22. Proximity Sensor and Ambient Light Sensing (ALS) Module. Available online: https://www.st.com/en/imaging-and-photonics-solutions/vl6180x.html (accessed on 2 September 2021).
  23. GitHub Reference for Proximity Sensor Design. Available online: https://github.com/Vanvitelli-Robotics/REMODEL_WP6_MDPI_SENSORS_2021 (accessed on 2 September 2021).
  24. Cirillo, A.; Ficuciello, F.; Natale, C.; Pirozzi, S.; Villani, L. A Conformable Force/Tactile Skin for Physical Human–Robot Interaction. IEEE Robot. Autom. Lett. 2016, 1, 41–48.
Figure 1. Sketch of the tactile sensor with proximity sensors.
Figure 2. CAD drawings of the proximity sensor modules: perspective view of the whole sensor system (a) and zoom on proximity system (b).
Figure 3. Interface board v1 (a) and v2 (b): Eagle view of the PCBs.
Figure 4. Proximity sensor module: Eagle view of the PCB.
Figure 5. Top (a) and bottom (b) view of the proximity sensor module.
Figure 6. Interface board v1 and two proximity sensor modules installed on the tactile finger: front (a) and lateral (b) view.
Figure 7. Write (a) and read (b) register operations.
Figure 8. SW operating diagrams.
Figure 9. SNR filtering stage for ST solution.
Figure 10. Experiments Setup for ST Solution (a,b) and for Custom Solution (c,d).
Figure 11. VL6180X convergence time.
Figure 12. Sampling frequency for ST solution (a) and custom solution (b).
Figure 13. Repeatability for ST solution (a) and custom solution (b). Blue bars represent the mean values computed on 10 measures, red lines report the variances on the 10 measures, green lines/stars report the expected values.
Figure 14. Hysteresis for ST Solution (a) and Custom Solution (b). Error index computed on 1 cycle.
Figure 15. PSD for ST solution (a) and custom solution (b).
Figure 16. Experimental setup and reference frames.
Figure 17. Scanning map generation.
Figure 18. Example of point cloud obtained by using the proximity sensor on a 2.5 mm diameter wire.
Figure 19. Picture of tested DLOs (a) and corresponding sensor measurements (b).
Figure 20. Experiments with fixed distance between the sensor and the wires (20 mm) and decreasing distance between the two wires (from left to right: 20 mm, 15 mm, 10 mm and 5 mm).
Figure 21. Measurement points used to estimate the recognition quality by means of Mean Distance Error.
Figure 22. Experiments with increasing distance between the sensor and the wires (from left to right: 20 mm, 25 mm, 30 mm and 35 mm) and fixed distance between the two wires (15 mm).
Figure 23. Frames of the video showing the robot end effector following the wire for its entire length.
Table 1. Connector pinout.
Pin   Signal
1     Vcc 3.3 V
2     SDA
3     SCL
4     Chip Enable 1
5     Chip Enable 2
6     Chip Enable 3
7     Chip Enable 4
8     GND
Table 2. Connectors pinout.
Pin   Interface Conn. Signal   Programming Conn. Signal
1     Vcc 24 V                 MCLR
2     Vcc 3.3 V                Vcc 3.3 V
3     GND                      GND
4     USART RX                 ICSPDAT
5     USART TX                 ICSPCLK
Table 3. Systems comparison.
                      ST Solution              Designed Solution
VL6180X module        X-NUCLEO-6180XA1         Custom VL6180X module
Interrogation board   STM32F401 Nucleo Board   Custom PIC-based board (PIC16F19176)
Table 4. Repeatability and measurement errors.
Distance [mm]   ST Repeatability [%]   ST Measurement [%]   Designed Repeatability [%]   Designed Measurement [%]
10              68.59                  −57.50               16.71                        15.55
20              10.20                  −22.50               15.92                        13.02
30              4.96                   −12.59               10.52                        11.29
50              2.36                   −6.55                4.28                         4.07
100             0.71                   −3.86                3.25                         2.94
Table 5. Recognition error of the distance between the wires for the experiment in Figure 20.
d_nom   d_mean    e_d
20 mm   21.5 mm   7.5%
15 mm   15.3 mm   2%
10 mm   10.8 mm   8%
5 mm    6.8 mm    36%
Table 6. Recognition error of the distance between the wires for the experiment in Figure 22.
h_e     d_nom   d_mean    e_d
15 mm   15 mm   14.4 mm   4%
20 mm   15 mm   13.5 mm   10%
25 mm   15 mm   11.9 mm   20.7%
35 mm   15 mm   9.6 mm    36%


