
Smart Robotics for Automation

A topical collection in Sensors (ISSN 1424-8220). This collection belongs to the section "Physical Sensors".

Viewed by 43663

Editor


Dr. Felipe N. Martins
Collection Editor
Sensors and Smart Systems Group, Institute of Engineering, Hanze University of Applied Sciences, 9747 AS Groningen, The Netherlands
Interests: robot control; mobile robotics; educational robotics; collaborative robotics

Topical Collection Information

Dear Colleagues,

In recent years, the demand for efficient automation in distribution centers has increased sharply, mainly due to the growth of e-commerce. This trend continues and is spreading to other sectors. The world’s population is aging, so more efficient automation is needed to cope with the relative reduction in the available workforce in the near future. In addition, as more people move to cities and fewer people are interested in working on farms, food production will also need to be automated more efficiently. Moreover, climate change plays an increasingly important role, for instance through the decrease in the area available for food production.

To address these important problems, this Topical Collection on “Smart Robotics for Automation” is dedicated to publishing research papers, communications, and review articles proposing solutions to increase the efficiency of automation systems with the application of smart robotics. By smart robotics we mean solutions that apply the latest developments in artificial intelligence, sensor systems (including computer vision), and control to manipulators and mobile robots. Solutions that increase the adaptability and flexibility of current robotic applications are also welcome.

Topics of interest are related to robotic automation; a non-exhaustive list follows:

- Process automation with intelligent robotics;

- Intelligent robotic applications for production systems;

- Robotics applied to precision agriculture;

- AI and machine learning systems applied to robotics;

- Robot control;

- Robot manipulation and picking;

- Mobile robot navigation, localization, and mapping;

- Interpretation of sensor data (including vision systems);

- Human–robot collaboration (including cobots);

- Multi-robot systems.

This Topical Collection focuses primarily on sensors. Papers whose main focus is automation may instead be submitted to our joint Topical Collection in the journal Automation (ISSN 2673-4052).

Dr. Felipe N. Martins
Collection Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the collection website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Control theory
  • Manufacturing systems
  • Learning systems
  • Intelligent control systems
  • Artificial neural networks
  • Motion control and sensing
  • Process automation and monitoring
  • Sensors and signal processing
  • Robotics and applications

Published Papers (15 papers)

2024


16 pages, 60291 KiB  
Article
Analytical Formalism for Data Representation and Object Detection with 2D LiDAR: Application in Mobile Robotics
by Leonardo A. Fagundes, Jr., Alexandre G. Caldeira, Matheus B. Quemelli, Felipe N. Martins and Alexandre S. Brandão
Sensors 2024, 24(7), 2284; https://doi.org/10.3390/s24072284 - 03 Apr 2024
Viewed by 463
Abstract
In mobile robotics, LASER scanners have a wide spectrum of indoor and outdoor applications, both in structured and unstructured environments, due to their accuracy and precision. Most works that use this sensor have their own data representation and their own case-specific modeling strategies, and no common formalism is adopted. To address this issue, this manuscript presents an analytical approach for the identification and localization of objects using 2D LiDARs. Our main contribution lies in formally defining LASER sensor measurements and their representation, the identification of objects, their main properties, and their location in a scene. We validate our proposal with experiments in generic semi-structured environments common in autonomous navigation, and we demonstrate its feasibility in multiple object detection and identification, strictly following its analytical representation. Finally, our proposal further encourages and facilitates the design, modeling, and implementation of other applications that use LASER scanners as a distance sensor. Full article
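
To make the representation step concrete, the minimal sketch below (not the authors' formalism; the jump-distance threshold and the synthetic scan are assumptions) converts a polar 2D LiDAR scan to Cartesian points and segments it into object candidates by splitting the ordered point sequence wherever the gap between consecutive returns exceeds a threshold:

```python
import numpy as np

def scan_to_points(ranges, angle_min, angle_increment, max_range=10.0):
    """Convert a polar 2D LiDAR scan into Cartesian points, dropping out-of-range beams."""
    angles = angle_min + angle_increment * np.arange(len(ranges))
    valid = ranges < max_range
    r, a = ranges[valid], angles[valid]
    return np.column_stack((r * np.cos(a), r * np.sin(a)))

def segment_objects(points, jump_threshold=0.30):
    """Split the ordered point sequence into clusters wherever the gap between
    consecutive points exceeds jump_threshold (metres)."""
    if len(points) == 0:
        return []
    gaps = np.linalg.norm(np.diff(points, axis=0), axis=1)
    breaks = np.where(gaps > jump_threshold)[0] + 1
    return np.split(points, breaks)

if __name__ == "__main__":
    # Synthetic 181-beam scan: a flat wall 4 m ahead plus a small post ~1.5 m away at ~60 deg.
    angles = np.linspace(-np.pi / 2, np.pi / 2, 181)
    ranges = np.full_like(angles, 12.0)             # 12 m stands for "no return"
    ranges[80:101] = 4.0 / np.cos(angles[80:101])   # wall perpendicular to the x-axis
    ranges[150:156] = 1.5                           # small cylindrical object
    pts = scan_to_points(ranges, angles[0], angles[1] - angles[0])
    for i, cluster in enumerate(segment_objects(pts)):
        print(f"object {i}: {len(cluster)} points, centroid {cluster.mean(axis=0).round(2)}")
```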

2023


17 pages, 2452 KiB  
Article
A Machine Learning Approach to Robot Localization Using Fiducial Markers in RobotAtFactory 4.0 Competition
by Luan C. Klein, João Braun, João Mendes, Vítor H. Pinto, Felipe N. Martins, Andre Schneider de Oliveira, Heinrich Wörtche, Paulo Costa and José Lima
Sensors 2023, 23(6), 3128; https://doi.org/10.3390/s23063128 - 15 Mar 2023
Cited by 4 | Viewed by 2494
Abstract
Localization is a crucial skill in mobile robotics because the robot needs to make reasonable navigation decisions to complete its mission. Many approaches exist to implement localization, but artificial intelligence can be an interesting alternative to traditional localization techniques based on model calculations. This work proposes a machine learning approach to solve the localization problem in the RobotAtFactory 4.0 competition. The idea is to obtain the relative pose of an onboard camera with respect to fiducial markers (ArUcos) and then estimate the robot pose with machine learning. The approaches were validated in a simulation. Several algorithms were tested, and the best results were obtained by using Random Forest Regressor, with an error on the millimeter scale. The proposed solution presents results as high as the analytical approach for solving the localization problem in the RobotAtFactory 4.0 scenario, with the advantage of not requiring explicit knowledge of the exact positions of the fiducial markers, as in the analytical approach. Full article
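
A minimal sketch of the learning step described above, with synthetic range/bearing observations of four markers standing in for a real camera-based ArUco detection pipeline; the marker positions, feature layout, noise level, and model settings are illustrative assumptions only:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Assumed setup: four fiducial markers at fixed positions (unknown to the model).
MARKERS = np.array([[0.0, 0.0], [2.0, 0.0], [2.0, 1.5], [0.0, 1.5]])

def observe(pose):
    """Feature vector: noisy range and bearing to every marker, standing in for the
    relative marker poses that a camera-based ArUco detector would provide."""
    x, y, th = pose
    d = MARKERS - np.array([x, y])
    ranges = np.linalg.norm(d, axis=1)
    bearings = np.arctan2(d[:, 1], d[:, 0]) - th
    feats = np.concatenate([ranges, np.cos(bearings), np.sin(bearings)])
    return feats + rng.normal(scale=0.01, size=feats.shape)

# Training poses sampled uniformly over a 2 m x 1.5 m "factory" area.
poses = np.column_stack([rng.uniform(0.0, 2.0, 5000),
                         rng.uniform(0.0, 1.5, 5000),
                         rng.uniform(-np.pi, np.pi, 5000)])
X = np.array([observe(p) for p in poses])
y = np.column_stack([poses[:, 0], poses[:, 1],
                     np.cos(poses[:, 2]), np.sin(poses[:, 2])])   # heading as cos/sin

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
err_xy = np.linalg.norm(model.predict(X_te)[:, :2] - y_te[:, :2], axis=1)
print(f"mean position error: {1000 * err_xy.mean():.1f} mm")
```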

2022


23 pages, 9512 KiB  
Article
Development of Fixed-Wing UAV 3D Coverage Paths for Urban Air Quality Profiling
by Qianyu Zhou, Li-Yu Lo, Bailun Jiang, Ching-Wei Chang, Chih-Yung Wen, Chih-Keng Chen and Weifeng Zhou
Sensors 2022, 22(10), 3630; https://doi.org/10.3390/s22103630 - 10 May 2022
Cited by 5 | Viewed by 2281
Abstract
Due to the ever-increasing industrial activity, humans and the environment suffer from deteriorating air quality, making the long-term monitoring of air particle indicators essential. The advances in unmanned aerial vehicles (UAVs) offer the potential to utilize UAVs for various forms of monitoring, of which air quality data acquisition is one. Nevertheless, most current UAV-based air monitoring suffers from a low payload, short endurance, and limited range, as they are primarily dependent on rotary aerial vehicles. In contrast, a fixed-wing UAV may be a better alternative. Additionally, one of the most critical modules for 3D profiling of a UAV system is path planning, as it directly impacts the final results of the spatial coverage and temporal efficiency. Therefore, this work focused on developing 3D coverage path planning based upon current commercial ground control software, where the method mainly depends on the Boustrophedon and Dubins paths. Furthermore, a user interface was also designed for easy accessibility, which provides a generalized tool module that links up the proposed algorithm, the ground control software, and the flight controller. Simulations were conducted to assess the proposed methods. The result showed that the proposed methods outperformed the existing coverage paths generated by ground control software, as it showed a better coverage rate with a sampling density of 50 m. Full article
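
A minimal sketch of the boustrophedon (lawn-mower) waypoint generation underlying such coverage planning; the survey area, the 50 m line spacing, the single altitude layer, and the omission of Dubins turn smoothing are assumptions for illustration:

```python
import numpy as np

def boustrophedon_waypoints(width, height, spacing, altitude):
    """Back-and-forth sweep over a width x height rectangle.

    Returns an (N, 3) array of (x, y, z) waypoints; consecutive sweep lines are
    `spacing` apart, which sets the coverage sampling density.
    """
    xs = np.arange(0.0, width + 1e-9, spacing)
    waypoints = []
    for i, x in enumerate(xs):
        ys = (0.0, height) if i % 2 == 0 else (height, 0.0)   # alternate sweep direction
        waypoints.append((x, ys[0], altitude))
        waypoints.append((x, ys[1], altitude))
    return np.array(waypoints)

if __name__ == "__main__":
    # Example: 1 km x 1 km area, 50 m line spacing, one altitude layer at 100 m.
    wp = boustrophedon_waypoints(1000.0, 1000.0, 50.0, 100.0)
    path_len = np.sum(np.linalg.norm(np.diff(wp, axis=0), axis=1))
    print(f"{len(wp)} waypoints, total path length {path_len / 1000:.2f} km")
```

Sweeping the same pattern at several altitude layers stacks such waypoint sets into a 3D coverage profile.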

23 pages, 10686 KiB  
Article
A Visual Servoing Scheme for Autonomous Aquaculture Net Pens Inspection Using ROV
by Waseem Akram, Alessandro Casavola, Nadir Kapetanović and Nikola Mišković
Sensors 2022, 22(9), 3525; https://doi.org/10.3390/s22093525 - 05 May 2022
Cited by 11 | Viewed by 2933
Abstract
Aquaculture net pens inspection and monitoring are important to ensure net stability and fish health in the fish farms. Remotely operated vehicles (ROVs) offer a low-cost and sophisticated solution for the regular inspection of the underwater fish net pens due to their ability of visual sensing and autonomy in a challenging and dynamic aquaculture environment. In this paper, we report the integration of an ROV with a visual servoing scheme for regular inspection and tracking of the net pens. We propose a vision-based positioning scheme that consists of an object detector, a pose generator, and a closed-loop controller. The system employs a modular approach that first utilizes two easily identifiable parallel ropes attached to the net for image processing through traditional computer vision methods. Second, the reference positions of the ROV relative to the net plane are extracted on the basis of a vision triangulation method. Third, a closed-loop control law is employed to instruct the vehicle to traverse from top to bottom along the net plane to inspect its status. The proposed vision-based scheme has been implemented and tested both through simulations and field experiments. The extensive experimental results have allowed the assessment of the performance of the scheme that resulted satisfactorily and can supplement the traditional aquaculture net pens inspection and tracking systems. Full article
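
The closed-loop step can be pictured as a simple proportional law that regulates the vehicle's offset and heading relative to the net plane while descending at a constant rate; the gains, sign conventions, and error definitions below are illustrative assumptions rather than the controller reported in the paper:

```python
import numpy as np

def net_following_control(lateral_offset_m, yaw_error_rad, depth_rate_mps,
                          k_sway=0.8, k_yaw=1.2):
    """Small proportional visual-servo law for net-pen following (illustrative only).

    lateral_offset_m : signed deviation from the desired stand-off distance to the
                       net plane, estimated by triangulating the two rope features.
    yaw_error_rad    : angle between the vehicle heading and the net-plane normal.
    depth_rate_mps   : constant descent speed so the ROV sweeps top to bottom.

    Returns (surge, sway, heave, yaw_rate) velocity commands.
    """
    surge = 0.0                           # stand-off regulated via sway in this sketch
    sway = -k_sway * lateral_offset_m     # push back toward the reference offset
    heave = depth_rate_mps                # traverse the net from top to bottom
    yaw_rate = -k_yaw * yaw_error_rad     # keep the camera facing the net
    return np.array([surge, sway, heave, yaw_rate])

# Example: ROV drifted 0.3 m too far from the net and is yawed 5 degrees off.
print(net_following_control(0.3, np.deg2rad(5.0), 0.1))
```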

28 pages, 22430 KiB  
Article
Heterogeneous Autonomous Robotic System in Viticulture and Mariculture: Vehicles Development and Systems Integration
by Nadir Kapetanović, Jurica Goričanec, Ivo Vatavuk, Ivan Hrabar, Dario Stuhne, Goran Vasiljević, Zdenko Kovačić, Nikola Mišković, Nenad Antolović, Marina Anić and Bernard Kozina
Sensors 2022, 22(8), 2961; https://doi.org/10.3390/s22082961 - 12 Apr 2022
Cited by 9 | Viewed by 3289
Abstract
There are activities in viticulture and mariculture that require extreme physical endurance from human workers, making them prime candidates for automation and robotization. This paper presents a novel, practical, heterogeneous, autonomous robotic system divided into two main parts, each dealing with respective scenarios in viticulture and mariculture. The robotic components and the subsystems that enable collaboration were developed as part of the ongoing HEKTOR project, and each specific scenario is presented. In viticulture, this includes vineyard surveillance, spraying and suckering with an all-terrain mobile manipulator (ATMM) and a lightweight autonomous aerial robot (LAAR) that can be used in very steep vineyards where other mechanization fails. In mariculture, scenarios include coordinated aerial and subsurface monitoring of fish net pens using the LAAR, an autonomous surface vehicle (ASV), and a remotely operated underwater vehicle (ROV). All robotic components communicate and coordinate their actions through the Robot Operating System (ROS). Field tests demonstrate the great capabilities of the HEKTOR system for the fully autonomous execution of very strenuous and hazardous work in viticulture and mariculture, while meeting the necessary conditions for the required quality and quantity of the work performed. Full article

14 pages, 9281 KiB  
Article
Motion Similarity Evaluation between Human and a Tri-Co Robot during Real-Time Imitation with a Trajectory Dynamic Time Warping Model
by Liang Gong, Binhao Chen, Wenbin Xu, Chengliang Liu, Xudong Li, Zelin Zhao and Lujie Zhao
Sensors 2022, 22(5), 1968; https://doi.org/10.3390/s22051968 - 02 Mar 2022
Cited by 11 | Viewed by 2946
Abstract
Precisely imitating human motions in real time poses a challenge for robots due to differences in their physical structures. This paper proposes a human–computer interaction method for remotely manipulating life-size humanoid robots, with a new metric for evaluating motion similarity. First, we establish a motion capture system to acquire the operator’s motion data and retarget it to a standard bone model. Second, we develop a fast mapping algorithm that maps the BVH (BioVision Hierarchy) data collected by the motion capture system to each joint motion angle of the robot to realize imitated motion control of the humanoid robot. Third, a DTW (Dynamic Time Warping)-based trajectory evaluation method is proposed to quantitatively evaluate the difference between the robot trajectory and the human motion; meanwhile, visualization terminals make it more convenient to compare the two different but simultaneous motion systems. We design a complex gesture simulation experiment to verify the feasibility and real-time performance of the control method. The proposed human-in-the-loop imitation control method addresses a prominent non-isostructural retargeting problem between human and robot, enhances robot interaction capability in a more natural way, and improves robot adaptability to uncertain and dynamic environments. Full article
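
A compact DTW distance between two trajectories, in the spirit of the trajectory evaluation described above; the Euclidean point-to-point cost and the (N + M) normalization are assumptions, not the paper's exact metric:

```python
import numpy as np

def dtw_distance(traj_a, traj_b):
    """Dynamic Time Warping distance between two trajectories.

    traj_a: (N, D) array, traj_b: (M, D) array; Euclidean point-to-point cost.
    Returns the accumulated cost of the optimal warping path, normalized by N + M.
    """
    a, b = np.asarray(traj_a, float), np.asarray(traj_b, float)
    n, m = len(a), len(b)
    cost = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=2)   # (N, M) local costs
    acc = np.full((n + 1, m + 1), np.inf)
    acc[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            acc[i, j] = cost[i - 1, j - 1] + min(acc[i - 1, j],      # insertion
                                                 acc[i, j - 1],      # deletion
                                                 acc[i - 1, j - 1])  # match
    return acc[n, m] / (n + m)

# Example: the robot trajectory lags the human motion but follows the same shape.
t = np.linspace(0, 2 * np.pi, 100)
human = np.column_stack([np.sin(t), np.cos(t)])
robot = np.column_stack([np.sin(t - 0.3), np.cos(t - 0.3)])
print(f"DTW similarity score: {dtw_distance(human, robot):.4f}")
```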

19 pages, 23003 KiB  
Article
Carved Turn Control with Gate Vision Recognition of a Humanoid Robot for Giant Slalom Skiing on Ski Slopes
by Cheonyu Park, Baekseok Kim, Yitaek Kim, Younseal Eum, Hyunjong Song, Dongkuk Yoon, Jeongin Moon and Jeakweon Han
Sensors 2022, 22(3), 816; https://doi.org/10.3390/s22030816 - 21 Jan 2022
Cited by 4 | Viewed by 3271
Abstract
The performance of humanoid robots is improving, owing in part to their participation in robot games such as the DARPA Robotics Challenge. Along with the 2018 Winter Olympics in Pyeongchang, a Skiing Robot Competition was held in which humanoid robots participated autonomously in a giant slalom alpine skiing competition. The robots were required to transit through many red or blue gates on the ski slope to reach the finish line. The course was relatively short at 100 m long and had an intermediate-level rating. A 1.23 m tall humanoid ski robot, ‘DIANA’, was developed for this skiing competition. As a humanoid robot that mimics humans, the goal was to descend the slope as fast as possible, so the robot was developed to perform a carved turn motion. The carved turn was difficult to balance compared to other turn methods. Therefore, ZMP control, which could secure the posture stability of the biped robot, was applied. Since skiing takes place outdoors, it was necessary to ensure recognition of the flags in various weather conditions. This was ensured using deep learning-based vision recognition. Thus, the performance of the humanoid robot DIANA was established using the carved turn in an experiment on an actual ski slope. The ultimate vision for humanoid robots is for them to naturally blend into human society and provide necessary services to people. Previously, there was no way for a full-sized humanoid robot to move on a snowy mountain. In this study, a humanoid robot that transcends this limitation was realized. Full article

2021


14 pages, 12576 KiB  
Article
Vibrational Transportation on a Platform Subjected to Sinusoidal Displacement Cycles Employing Dry Friction Control
by Sigitas Kilikevičius and Algimantas Fedaravičius
Sensors 2021, 21(21), 7280; https://doi.org/10.3390/s21217280 - 01 Nov 2021
Cited by 3 | Viewed by 1795
Abstract
Currently used vibrational transportation methods are usually based on asymmetries of geometric, kinematic, wave, or time types. This paper investigates the vibrational transportation of objects on a platform that is subjected to sinusoidal displacement cycles, employing periodic dynamic dry friction control. This manner of dry friction control creates an asymmetry, which is necessary to move the object. The theoretical investigation on functional capabilities and transportation regimes was carried out using a developed parametric mathematical model, and the control parameters that determine the transportation characteristics such as velocity and direction were defined. To test the functional capabilities of the proposed method, an experimental setup was developed, and experiments were carried out. The results of the presented research indicate that the proposed method ensures smooth control of the transportation velocity in a wide range and allows it to change the direction of motion. Moreover, the proposed method offers other new functional capabilities, such as a capability to move individual objects on the same platform in opposite directions and at different velocities at the same time by imposing different friction control parameters on different regions of the platform or on different objects. In addition, objects can be subjected to translation and rotation at the same time by imposing different friction control parameters on different regions of the platform. The presented research extends the classical theory of vibrational transportation and has a practical value for industries that operate manufacturing systems performing tasks such as handling and transportation, positioning, feeding, sorting, aligning, or assembling. Full article
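
A heavily simplified point-mass simulation of this idea, assuming kinetic friction only (with a smoothed sign function) and a friction coefficient switched with the phase of the platform motion; all numerical values are illustrative assumptions and the model is not the authors' parametric model:

```python
import numpy as np

# Object on a horizontally vibrating platform; the friction coefficient is high while
# the platform moves forward ("grip") and low while it moves back ("release"), so the
# object drifts forward even though the platform displacement is a plain sinusoid.
g = 9.81           # m/s^2
A, f = 0.01, 5.0   # platform amplitude (m) and frequency (Hz), assumed
mu_high, mu_low = 0.5, 0.05
omega = 2 * np.pi * f
dt, t_end = 1e-4, 4.0
eps = 1e-3         # smoothing of sign() of the relative sliding velocity

t = np.arange(0.0, t_end, dt)
v_plat = A * omega * np.cos(omega * t)     # platform velocity
x_obj = v_obj = 0.0
for vp in v_plat:
    mu = mu_high if vp > 0 else mu_low     # periodic dynamic dry friction control
    a = -mu * g * np.tanh((v_obj - vp) / eps)   # kinetic friction on the object
    v_obj += a * dt
    x_obj += v_obj * dt

print(f"object displacement after {t_end:.0f} s: {100 * x_obj:.1f} cm "
      f"(mean transport velocity {100 * x_obj / t_end:.2f} cm/s)")
```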

20 pages, 7861 KiB  
Article
Efficient and Consumer-Centered Item Detection and Classification with a Multicamera Network at High Ranges
by Nils Mandischer, Tobias Huhn, Mathias Hüsing and Burkhard Corves
Sensors 2021, 21(14), 4818; https://doi.org/10.3390/s21144818 - 14 Jul 2021
Cited by 3 | Viewed by 1916
Abstract
In the EU project SHAREWORK, methods are developed that allow humans and robots to collaborate in an industrial environment. One of the major contributions is a framework for task planning coupled with automated item detection and localization. In this work, we present the methods used for detecting and classifying items on the shop floor. Important in the context of SHAREWORK is the user-friendliness of the methodology. Thus, we renounce heavy-learning-based methods in favor of unsupervised segmentation coupled with lenient machine learning methods for classification. Our algorithm is a combination of established methods adjusted for fast and reliable item detection at high ranges of up to eight meters. In this work, we present the full pipeline from calibration, over segmentation to item classification in the industrial context. The pipeline is validated on a shop floor of 40 sqm and with up to nine different items and assemblies, reaching a mean accuracy of 84% at 0.85 Hz. Full article

20 pages, 4820 KiB  
Article
A Simple Neural Network for Collision Detection of Collaborative Robots
by Michał Czubenko and Zdzisław Kowalczuk
Sensors 2021, 21(12), 4235; https://doi.org/10.3390/s21124235 - 21 Jun 2021
Cited by 16 | Viewed by 4481
Abstract
Due to the epidemic threat, more and more companies decide to automate their production lines. Given the lack of adequate security or space, in most cases, such companies cannot use classic production robots. The solution to this problem is the use of collaborative robots (cobots). However, the required equipment (force sensors) or alternative methods of detecting a threat to humans are usually quite expensive. The article presents the practical aspect of collision detection with the use of a simple neural architecture. A virtual force and torque sensor, implemented as a neural network, may be useful in a team of collaborative robots. Four different approaches are compared in this article: auto-regressive (AR), recurrent neural network (RNN), convolutional long short-term memory (CNN-LSTM) and mixed convolutional LSTM network (MC-LSTM). These architectures are analyzed at different levels of input regression (motor current, position, speed, control velocity). This sensor was tested on the original CURA6 robot prototype (Cooperative Universal Robotic Assistant 6) by Intema. The test results indicate that the MC-LSTM architecture is the most effective with the regression level set at 12 samples (at 24 Hz). The mean absolute prediction error obtained by the MC-LSTM architecture was approximately 22 Nm. The conducted external test (72 different signals with collisions) shows that the presented architecture can be used as a collision detector. The MC-LSTM collision detection f1 score with the optimal threshold was 0.85. A well-developed virtual sensor based on such a network can be used to detect various types of collisions of cobot or other mobile or stationary systems operating on the basis of human-machine interaction. Full article
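
An illustrative stand-in for such a learned virtual torque sensor, sketched in PyTorch; the window length, layer sizes, and detection threshold are assumptions, and the network below is a plain LSTM rather than the MC-LSTM architecture evaluated in the paper:

```python
import torch
import torch.nn as nn

# A small LSTM maps a short window of proprioceptive samples (motor current, position,
# speed, control velocity per joint) to an estimate of the external joint torques; a
# collision is flagged when the estimate exceeds a threshold.
N_JOINTS, WINDOW, FEATS = 6, 12, 4 * 6     # 12 samples (at 24 Hz), 4 signals per joint

class TorqueEstimator(nn.Module):
    def __init__(self, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(input_size=FEATS, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, N_JOINTS)

    def forward(self, x):                  # x: (batch, WINDOW, FEATS)
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :])    # torque estimate from the last time step

def collision_flag(model, window, threshold_nm=25.0):
    """Return True if any estimated external joint torque exceeds the threshold."""
    with torch.no_grad():
        tau_hat = model(window.unsqueeze(0)).squeeze(0)
    return bool(tau_hat.abs().max() > threshold_nm)

if __name__ == "__main__":
    model = TorqueEstimator()              # untrained here; training data would be needed
    window = torch.randn(WINDOW, FEATS)    # placeholder for a real sensor window
    print("collision detected:", collision_flag(model, window))
```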

11 pages, 3890 KiB  
Article
Estimate the Unknown Environment with Biosonar Echoes—A Simulation Study
by Muhammad Hassan Tanveer, Antony Thomas, Waqar Ahmed and Hongxiao Zhu
Sensors 2021, 21(12), 4186; https://doi.org/10.3390/s21124186 - 18 Jun 2021
Cited by 1 | Viewed by 1739
Abstract
Unmanned aerial vehicles (UAVs) have shown great potential in various applications such as surveillance and search and rescue. To perform safe and efficient navigation, it is vitally important for a UAV to evaluate the environment accurately and promptly. In this work, we present a simulation study for the estimation of foliage distribution as a UAV equipped with biosonar navigates through a forest. Based on a simulated forest environment, foliage echoes are generated by using a bat-inspired biosonar simulator. These biosonar echoes are then used to estimate the spatial distribution of both sparsely and densely distributed tree leaves. While a simple batch processing method is able to estimate sparsely distributed leaf locations well, a wavelet scattering technique coupled with a support vector machine (SVM) classifier is shown to be effective in estimating densely distributed leaves. Our approach is validated by using multiple setups of leaf distributions in the simulated forest environment. Ninety-seven percent accuracy is obtained while estimating thickly distributed foliage. Full article

21 pages, 22608 KiB  
Article
Intelligent Parameter Identification for Robot Servo Controller Based on Improved Integration Method
by Ye Li, Dazhi Wang, Shuai Zhou and Xian Wang
Sensors 2021, 21(12), 4177; https://doi.org/10.3390/s21124177 - 18 Jun 2021
Cited by 4 | Viewed by 2248
Abstract
With the rise of smart robots in the field of industrial automation, the motion control theory of the robot servo controller has become a research hotspot. A parameter mismatch in the controller will reduce the efficiency of the equipment and, in serious cases, damage it. Compared to other servo controller parameters, the moment of inertia and the viscous friction coefficient have a significant effect on the dynamic performance in motion control; furthermore, accurate real-time identification is essential for servo controller design. In this paper, an improved integration method is proposed that increases the sampling period by redefining the update condition; it expands the applicable range of the classical method, is better suited to the working characteristics of a robot servo controller, and reduces the speed quantization error generated by the encoder. Then, an optimization approach using the incremental probabilistic neural network with an improved Gravitational Search Algorithm (IGSA-IPNN) is proposed to filter the speed error through a nonlinear process and provide more precise input for parameter identification. The identified inertia and friction coefficient are used for the PI parameter self-tuning of the speed loop. The experiments prove the validity of the proposed method and show that, compared to the classical method, it is more accurate, stable, and suitable for the robot servo controller. Full article
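
A sketch of the classical integration-method baseline that the paper improves upon, assuming a synthetic servo axis with known true parameters; the sampling rate, excitation signal, and noise level are illustrative assumptions, and neither the improved update condition nor the IGSA-IPNN filtering is reproduced here:

```python
import numpy as np

# Identify the moment of inertia J and viscous friction coefficient B from sampled
# torque and speed using the integral of J*dw/dt + B*w = T_e (load torque neglected).
J_true, B_true = 0.012, 0.004                  # kg*m^2, N*m*s/rad (assumed ground truth)
dt = 1.0 / 4000.0                              # assumed 4 kHz sampling
t = np.arange(0.0, 1.0, dt)
torque = 0.5 * np.sign(np.sin(2 * np.pi * 2 * t))   # +/-0.5 N*m square-wave excitation

# Simulate the speed response, then add encoder-like measurement noise.
speed = np.zeros_like(t)
for k in range(1, len(t)):
    speed[k] = speed[k - 1] + dt * (torque[k - 1] - B_true * speed[k - 1]) / J_true
speed_meas = speed + np.random.default_rng(0).normal(scale=0.05, size=speed.shape)

# Integrate both sides, J*(w(t) - w(0)) + B*int(w)dt = int(T_e)dt, and solve by least
# squares -- integrating avoids differentiating the noisy speed signal directly.
Phi = np.column_stack([speed_meas - speed_meas[0], np.cumsum(speed_meas) * dt])
y = np.cumsum(torque) * dt
(J_hat, B_hat), *_ = np.linalg.lstsq(Phi, y, rcond=None)
print(f"J = {J_hat:.4f} kg*m^2 (true {J_true}),  B = {B_hat:.4f} (true {B_true})")
```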

20 pages, 3391 KiB  
Article
Safe Path Planning Algorithms for Mobile Robots Based on Probabilistic Foam
by Luís B. P. Nascimento, Dennis Barrios-Aranibar, Vitor G. Santos, Diego S. Pereira, William C. Ribeiro and Pablo J. Alsina
Sensors 2021, 21(12), 4156; https://doi.org/10.3390/s21124156 - 17 Jun 2021
Cited by 3 | Viewed by 3481
Abstract
The planning of safe paths is an important issue for autonomous robot systems. The Probabilistic Foam method (PFM) is a planner that guarantees safe paths bounded by a sequence of structures called bubbles that provides safe regions. This method performs the planning by covering the free configuration space with bubbles, an approach analogous to a breadth-first search. To improve the propagation process and keep the safety, we present three algorithms based on Probabilistic Foam: Goal-biased Probabilistic Foam (GBPF), Radius-biased Probabilistic Foam (RBPF), and Heuristic-guided Probabilistic Foam (HPF); the last two are proposed in this work. The variant GBPF is fast, HPF finds short paths, and RBPF finds high-clearance paths. Some simulations were performed using four different maps to analyze the behavior and performance of the methods. Besides, the safety was analyzed considering the new propagation strategies. Full article
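
A minimal 2D illustration of the bubble-propagation idea, assuming a point robot among circular obstacles; the sampling pattern, pruning rule, and minimum radius are assumptions, and this is not the GBPF, RBPF, or HPF variant evaluated in the paper:

```python
import numpy as np
from collections import deque

# Each bubble is a free-space disc whose radius is the clearance at its centre; bubbles
# are propagated breadth-first by spawning children on the parent's boundary until one
# bubble contains the goal, giving a path of safe way-points.
OBSTACLES = np.array([[4.0, 4.0, 1.2], [6.5, 2.0, 1.0], [3.0, 7.0, 1.5]])   # x, y, radius
BOUNDS, R_MIN = (0.0, 10.0), 0.15

def clearance(p):
    """Distance from p to the nearest obstacle surface and to the workspace boundary."""
    d_obs = np.min(np.linalg.norm(OBSTACLES[:, :2] - p, axis=1) - OBSTACLES[:, 2])
    d_edge = min(p[0] - BOUNDS[0], BOUNDS[1] - p[0], p[1] - BOUNDS[0], BOUNDS[1] - p[1])
    return min(d_obs, d_edge)

def foam_plan(start, goal, n_children=12):
    start, goal = np.asarray(start, float), np.asarray(goal, float)
    bubbles = [(start, clearance(start), -1)]        # (centre, radius, parent index)
    queue = deque([0])
    while queue:
        idx = queue.popleft()
        centre, radius, _ = bubbles[idx]
        if np.linalg.norm(goal - centre) <= radius:  # the goal lies inside this bubble
            path, i = [goal], idx
            while i >= 0:
                path.append(bubbles[i][0])
                i = bubbles[i][2]
            return path[::-1]
        for ang in np.linspace(0.0, 2 * np.pi, n_children, endpoint=False):
            child = centre + radius * np.array([np.cos(ang), np.sin(ang)])
            r = clearance(child)
            if r < R_MIN:                            # too close to obstacles or the edge
                continue
            if any(np.linalg.norm(child - c) < 0.6 * rr for c, rr, _ in bubbles):
                continue                             # region already covered by the foam
            bubbles.append((child, r, idx))
            queue.append(len(bubbles) - 1)
    return None

path = foam_plan([1.0, 1.0], [9.0, 9.0])
print("safe path way-points:" if path else "no path found")
if path:
    for p in path:
        print(np.round(p, 2))
```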

18 pages, 11283 KiB  
Article
Hybrid Imitation Learning Framework for Robotic Manipulation Tasks
by Eunjin Jung and Incheol Kim
Sensors 2021, 21(10), 3409; https://doi.org/10.3390/s21103409 - 13 May 2021
Cited by 5 | Viewed by 3109
Abstract
This study proposes a novel hybrid imitation learning (HIL) framework in which behavior cloning (BC) and state cloning (SC) methods are combined in a mutually complementary manner to enhance the efficiency of robotic manipulation task learning. The proposed HIL framework efficiently combines BC and SC losses using an adaptive loss mixing method. It uses pretrained dynamics networks to enhance SC efficiency and performs stochastic state recovery to ensure stable learning of policy networks by transforming the learner’s task state into a demo state on the demo task trajectory during SC. The training efficiency and policy flexibility of the proposed HIL framework are demonstrated in a series of experiments conducted to perform major robotic manipulation tasks (pick-up, pick-and-place, and stack tasks). In the experiments, the HIL framework showed about a 2.6 times higher performance improvement than the pure BC and about a four times faster training time than the pure SC imitation learning method. In addition, the HIL framework also showed about a 1.6 times higher performance improvement and about a 2.2 times faster training time than the other hybrid learning method combining BC and reinforcement learning (BC + RL) in the experiments. Full article

25 pages, 2075 KiB  
Article
PID++: A Computationally Lightweight Humanoid Motion Control Algorithm
by Thomas F. Arciuolo and Miad Faezipour
Sensors 2021, 21(2), 456; https://doi.org/10.3390/s21020456 - 11 Jan 2021
Cited by 8 | Viewed by 3281
Abstract
Currently, robotic motion control algorithms are tedious at best to implement, are lacking in automatic situational adaptability, and tend to be static in nature. Humanoid (human-like) control is little more than a dream for all but the fastest computers. The main idea of the work presented in this paper is to define a radically new, simple, and computationally lightweight approach to humanoid motion control. A new Proportional-Integral-Derivative (PID) controller algorithm called PID++ is proposed that uses minor adjustments with basic arithmetic, based on the real-time encoder position input, to achieve a stable, precise, controlled, dynamic, and adaptive control system for linear motion control in any direction, regardless of load. With no PID coefficients initially specified, the proposed PID++ algorithm dynamically adjusts and updates the PID coefficients Kp, Ki, and Kd periodically. No database of values needs to be stored, as only the current and previous values of the sensed position, with an accurate time base, are used in the computations and overwritten in each read interval, eliminating the need to deploy much memory for storing and using vectors or matrices. Complete in its implementation, and truly dynamic and adaptive by design, this algorithm can be used by engineers in commercial, industrial, biomedical, and space applications alike. With characteristics that are unmistakably human, motion control can be feasibly implemented on even the smallest microcontrollers (MCUs) using a single command and without the need for reprogramming or reconfiguration. Full article
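
Since the abstract does not spell out the PID++ update rule, the sketch below pairs a standard discrete PID loop with a purely illustrative gain-adaptation placeholder that grows the coefficients from zero while the error is not improving; the plant model and all constants are assumptions, not the authors' algorithm:

```python
class AdaptivePID:
    def __init__(self, dt):
        self.dt = dt
        self.kp = self.ki = self.kd = 0.0       # no coefficients specified up front
        self.integral = 0.0
        self.prev_error = 0.0
        self.prev_abs_error = None

    def adapt(self, error):
        """Toy adaptation: grow the gains while the error is not improving."""
        if self.prev_abs_error is not None and abs(error) >= self.prev_abs_error:
            self.kp += 0.02
            self.ki += 0.002
            self.kd += 0.001
        self.prev_abs_error = abs(error)

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.adapt(error)
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

if __name__ == "__main__":
    # First-order plant x' = (-x + u) / tau as a stand-in for a linear actuator axis.
    dt, tau, x, setpoint = 0.01, 0.5, 0.0, 1.0
    pid = AdaptivePID(dt)
    for _ in range(600):
        u = pid.update(setpoint, x)
        x += dt * (-x + u) / tau
    print(f"final position {x:.3f} (setpoint {setpoint}), gains "
          f"kp={pid.kp:.2f} ki={pid.ki:.3f} kd={pid.kd:.3f}")
```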
