
New Advances in Robotically Enabled Sensing

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Sensors and Robotics".

Deadline for manuscript submissions: 10 September 2024 | Viewed by 2800

Special Issue Editors

Dr. Carmelo Mineo
Researcher, Institute for High-Performance Computing and Networking (ICAR), National Research Council (CNR), Via Ugo La Malfa, 153, 90146 Palermo, Italy
Interests: automated and autonomous robotic inspection systems; metrology; non-destructive testing; instrument interfacing; real-time control; data-driven robotic inspection; adaptive robotic inspections; in-process inspections; ultrasonics; phased array technology
Dr. Yashar Javadi
Centre for Ultrasonic Engineering, University of Strathclyde, Glasgow G1 1XW, UK
Interests: welding technology; NDT; residual stress; additive manufacturing; robotics

Special Issue Information

Dear Colleagues,

Humans have an immediate perception of the geometry of parts and of their surroundings through their senses and cognitive capabilities. This innate ability enables the manual inspection of objects in everyday life and in manufacturing environments, where trained inspectors combine their senses and handling skills with bespoke instrumentation. However, manual inspection can be slow for large and complex geometries, and it is prone to human error caused by tiredness, boredom, and distraction. Robotic sensing has emerged in many sectors to improve the inspection of parts and materials, enhancing data acquisition speed, part coverage, and inspection reliability. Several automated and semi-automated solutions have been proposed to enable the automated deployment of specific types of sensors. Additionally, robots can reach inspection positions that are not easily accessible to human operators, removing humans from potentially dangerous environments.

However, the perceived complexity and high cost of robotic sensing have limited the adoption of automation. As a result, the full potential of the seamless integration of robotic platforms with sensors, actuators, and software has not been fully explored; further research could revolutionise how automated sensing is performed and conceived. Recent advances in electronics, robotics, sensor technology, and software pave the way for new developments in automated and data-driven robotic inspection across several sectors, and these developments can help address the current societal challenges in this field. Robotic sensing must develop in parallel with emerging tools such as autonomous robotics, artificial intelligence, the Internet of Things, cloud computing, cybersecurity, virtual-twin simulations, augmented reality, and big data.

We invite the research community to submit contributions to this Special Issue. Manuscripts introducing novel developments in one or more of the following aspects are welcome:

  • Robotic sensing;
  • Robotic non-destructive testing;
  • Novel integrations of robotic systems for hybrid manufacturing and inspection tasks;
  • Transition from automated to autonomous robotics;
  • Modelling of robotic approaches, remote inspections, and data interpretation;
  • Real-time data monitoring and robot control;
  • Processing, management, compression and archiving of robotically collected data;
  • Machine learning, artificial intelligence, image recognition and data mining;
  • Novel data visualization and analysis approaches;
  • Human–robot interaction/communication in the operation of robotic inspection systems.

The goal of this Special Issue is to identify how robotic sensing is evolving to address the issues raised by challenging new frontiers in civil and medical fields and by Industry 4.0, which is the ongoing automation of traditional manufacturing and industrial practices using modern smart technology.

Dr. Carmelo Mineo
Dr. Yashar Javadi
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and a short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • robotic non-destructive testing
  • robotic sensing
  • remote inspection
  • adaptive inspection
  • data interpretation
  • real-time monitoring
  • data-driven autonomous inspection
  • data management, processing, compression, and storage
  • machine learning, artificial intelligence, image recognition
  • human–robot interaction

Published Papers (2 papers)


Research

21 pages, 11583 KiB  
Article
Real-Time Kinematically Synchronous Planning for Cooperative Manipulation of Multi-Arms Robot Using the Self-Organizing Competitive Neural Network
Sensors 2023, 23(11), 5120; https://doi.org/10.3390/s23115120 - 27 May 2023
Viewed by 802
Abstract
This paper presents a real-time kinematically synchronous planning method for the collaborative manipulation of a multi-arm robot with physical coupling, based on a self-organizing competitive neural network. The method defines sub-bases for the multi-arm configuration in order to obtain the Jacobian matrix of the common degrees of freedom, so that the sub-base motion converges along the direction of the total pose error of the end-effectors (EEs). This ensures the uniformity of the EE motion before the error converges completely and contributes to the collaborative manipulation of the arms. An unsupervised competitive neural network model is proposed to adaptively increase the convergence rate of the arms through online learning based on the instar rule. Combined with the defined sub-bases, the synchronous planning method achieves rapid synchronous movement of the multi-arm robot for collaborative manipulation. A theoretical analysis proves the stability of the multi-arm system via Lyapunov theory. Simulations and experiments demonstrate that the proposed method is feasible and applicable to different symmetric and asymmetric cooperative manipulation tasks.
(This article belongs to the Special Issue New Advances in Robotically Enabled Sensing)
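The planning method summarized above rests on standard differential kinematics: joint velocities are chosen so that the end-effector pose error converges toward zero. As an editorial illustration only (not the authors' neural-network method), the sketch below runs a damped least-squares velocity step for a hypothetical planar two-link arm; the link lengths, gain, and damping values are invented for the example.

```python
import numpy as np

L1, L2 = 0.5, 0.4  # hypothetical link lengths (m)

def fk(q):
    """Forward kinematics of a planar two-link arm."""
    x = L1 * np.cos(q[0]) + L2 * np.cos(q[0] + q[1])
    y = L1 * np.sin(q[0]) + L2 * np.sin(q[0] + q[1])
    return np.array([x, y])

def jacobian(q):
    """Analytic Jacobian mapping joint rates to EE velocity."""
    s1, s12 = np.sin(q[0]), np.sin(q[0] + q[1])
    c1, c12 = np.cos(q[0]), np.cos(q[0] + q[1])
    return np.array([[-L1 * s1 - L2 * s12, -L2 * s12],
                     [ L1 * c1 + L2 * c12,  L2 * c12]])

def ik_step(q, target, gain=0.5, damping=1e-3):
    """One damped least-squares step driving the EE toward target."""
    err = target - fk(q)
    J = jacobian(q)
    dq = J.T @ np.linalg.solve(J @ J.T + damping * np.eye(2), gain * err)
    return q + dq, np.linalg.norm(err)

q = np.array([0.3, 0.6])          # initial joint angles (rad)
target = np.array([0.5, 0.5])     # reachable Cartesian goal (m)
for _ in range(200):
    q, err = ik_step(q, target)
print(err)  # pose error after 200 velocity steps
```

The damping term keeps the step well conditioned near singular configurations, which is why damped least squares is a common default for this kind of velocity-level planning.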

15 pages, 3171 KiB  
Article
Transforming Industrial Manipulators via Kinesthetic Guidance for Automated Inspection of Complex Geometries
Sensors 2023, 23(7), 3757; https://doi.org/10.3390/s23073757 - 05 Apr 2023
Viewed by 1493
Abstract
The increased demand for cost-efficient manufacturing and metrology inspection solutions for complex-shaped components in High-Value Manufacturing (HVM) sectors requires increased production throughput and precision, which drives the integration of automated robotic solutions. However, current manipulators using traditional programming approaches demand specialized robotic programming knowledge, make it challenging to generate complex paths, and adapt poorly to the unique specifications of each component, resulting in an inflexible and cumbersome teaching process. This work therefore proposes a novel software system that realizes kinesthetic guidance for path planning in real-time intervals at 250 Hz, using an external off-the-shelf force–torque (FT) sensor. The approach is demonstrated on a 500 mm² near-net-shaped Wire–Arc Additive Manufacturing (WAAM) component with embedded defects: the inspection path for defect detection is taught collaboratively with a standard industrial robotic manipulator, and the kinematics are generated adaptively, resulting in the uniform coupling of the ultrasound inspection. The method proves superior in performance and speed, reducing programming time by an estimated 88% to 98% relative to online and offline approaches. The proposed work retrofits current industrial manipulators into collaborative entities, preserving human jobs and achieving flexible production.
(This article belongs to the Special Issue New Advances in Robotically Enabled Sensing)
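Kinesthetic guidance of the kind described above is commonly realized as an admittance loop: the FT sensor's measured force is mapped to a compliant Cartesian velocity at every control tick, so the operator can push the tool along the desired path. The sketch below is an editorial illustration under assumed values; the 0.002 m/s-per-newton gain, the 2 N dead-band, and the simulated sensor readings are invented, and only the 250 Hz rate comes from the abstract.

```python
import numpy as np

DT = 1.0 / 250.0    # 250 Hz control interval, as stated in the abstract
ADMITTANCE = 0.002  # assumed gain: m/s per newton of applied force
DEADBAND = 2.0      # assumed threshold (N) to reject sensor noise

def admittance_step(position, wrench):
    """One guidance step: map the measured force to a Cartesian
    velocity and integrate it over a single control period."""
    force = wrench[:3]                 # ignore torques in this sketch
    if np.linalg.norm(force) < DEADBAND:
        return position                # no intentional push detected
    velocity = ADMITTANCE * force      # compliant response to the push
    return position + velocity * DT

# Simulated session: the operator pushes the tool along +x with 10 N for 1 s.
pos = np.zeros(3)
push = np.array([10.0, 0.0, 0.0, 0.0, 0.0, 0.0])
for _ in range(250):
    pos = admittance_step(pos, push)
print(pos)  # tool has drifted roughly 0.02 m along +x
```

A real system would add gravity/payload compensation of the FT signal and a velocity limit before commanding the manipulator, but the force-to-velocity mapping above is the core of the teaching-by-pushing interaction.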
