Review

Augmented Reality and Robotic Systems for Assistance in Percutaneous Nephrolithotomy Procedures: Recent Advances and Future Perspectives

by Federica Ferraguti 1,*, Saverio Farsoni 2 and Marcello Bonfè 2

1 ARSControl Lab, Department of Sciences and Methods for Engineering, University of Modena and Reggio Emilia, 42122 Reggio Emilia, Italy
2 LIRA Lab, Department of Engineering, University of Ferrara, 44122 Ferrara, Italy
* Author to whom correspondence should be addressed.
Electronics 2022, 11(19), 2984; https://doi.org/10.3390/electronics11192984
Submission received: 15 July 2022 / Revised: 7 September 2022 / Accepted: 19 September 2022 / Published: 20 September 2022

Abstract: Percutaneous nephrolithotomy (PCNL) is the gold standard for the treatment of renal stones larger than 20 mm in diameter. Treatment outcomes depend strongly on the accuracy of the puncture step, which must establish a suitable renal access and reach the stone along a precise and direct path. Performing this puncture is therefore the most crucial and challenging step of the procedure, with the steepest learning curve. Many simulation methods and systems have been developed to help trainees achieve the required competency level, including human cadavers, animal tissues and virtual reality simulators of human patients. At the same time, the availability of pre-operative information (e.g., computed tomography or magnetic resonance imaging) and of intra-operative images (e.g., ultrasound images) has enabled solutions based on augmented reality and robotic systems that assist the surgeon during the operation and help novice surgeons shorten the learning curve considerably. In this context, real-time awareness of the 3D position and orientation of the relevant anatomical structures with respect to a common frame is fundamental. Such information must be accurately estimated by specific tracking systems that reconstruct the motion of the probe and of the tool. This review surveys the leading literature on augmented reality and robotic assistance for PCNL, with a focus on existing methods for tracking the motion of the ultrasound probe and of the surgical needle.

1. Introduction

Percutaneous nephrolithotomy (PCNL) is considered the gold standard for the treatment of patients with renal stones larger than 20 mm in diameter, as stated by the European Association of Urology and the American Urological Association [1,2]. Its popularity and acceptance among urologists and patients are largely due to the fact that it is minimally invasive and associated with low morbidity [3]. The procedure involves entering the kidney through a small incision in the lower flank or in the abdomen. Once the surgeon reaches the kidney, a nephroscope (a miniature fiberoptic camera) and other small instruments are threaded in through the incision, and the stone is removed directly or broken up and then removed. The success and treatment outcomes of PCNL depend strongly on the precision and accuracy of the puncture step, which must establish a suitable renal access and reach the stone along a precise and direct path [4]. PCNL thus has a relatively steep learning curve, and the initial puncture to gain renal access is recognized as the most challenging step of the procedure [5]. Indeed, several studies have evaluated the learning curve in PCNL [6,7,8,9,10,11,12], concluding that 45–60 PCNL operations are required to achieve competence in performing the procedure, and 105–115 operations to attain excellence. One of the main reasons why it is difficult for urologists to achieve good proficiency is that not all urology residency programs teach residents the technique for percutaneous access. As a consequence, many urologists do not perform percutaneous access routinely, and this step of the procedure is frequently performed by an interventional radiologist. A 2014 review of case logs from certifying and recertifying urologists found that only 6% had performed more than 10 PCNLs during the prior six months, and urologist-obtained access occurred in only 20% of these cases; in the remaining cases, renal access was obtained by the radiologist [13].
In recent years, the fast development of virtual and augmented reality technologies has given rise to numerous medical applications, in particular for supporting surgeons, doctors and nurses in intensive care units [14,15,16]. Furthermore, the availability of pre-operative information (e.g., computed tomography (CT) or magnetic resonance imaging (MRI)) and of intra-operative images (e.g., ultrasound images) has allowed the introduction of advanced solutions involving virtual and augmented reality and robotic assistance, which help trainees achieve competency in a shorter time and assist the surgeon during the operation. Apart from augmented and virtual reality, other interesting technologies that analyze CT or MRI scans can be found in [17,18,19], where the images are processed to detect and classify tumors. Moreover, in order to provide real-time awareness of the 3D position and orientation of the relevant anatomical structures, specific tracking systems have been introduced. These systems allow the reconstruction of the motion of the ultrasound probe and of the surgical tool (i.e., the nephroscope, but in general any surgical needle) during the execution of the operation.
This paper presents an overview of the leading literature on the innovative technologies that can be exploited to assist the surgeon in PCNL operations (virtual and augmented reality and robotic assistance), with a focus on state-of-the-art methods for tracking the motion of the ultrasound (US) probe and of the surgical needle. Indeed, probe tracking enables an accurate CT/MRI-to-US registration and the real-time alignment of the 3D reconstruction with the planned trajectory of the surgical tool, while needle tracking provides confirmation that the target point (i.e., the stone) has been reached.

2. Literature Review

2.1. Search Strategy

We performed a literature search exploiting the Web of Science (WoS) database, including the Social Science Citation Index (SSCI) and the Science Citation Index Expanded (SCI-Expanded). The following queries were used to analyze the number of contributions relating percutaneous nephrolithotomy to virtual and augmented reality systems or robotic assistance:
  • Topic = (percutaneous nephrolithotomy) AND ((virtual reality) OR (augmented reality) OR (mixed reality)).
  • Topic = (percutaneous nephrolithotomy) AND (robot*).
  • Topic = (percutaneous nephrolithotomy) AND ((virtual reality) OR (augmented reality) OR (mixed reality)) AND (robot*).
The Topic field searches titles, abstracts and keywords, and the wildcard * in robot* includes all words with the same root, e.g., robot, robotic and robotics. The search was limited to contributions from the last decade, 2012 to 2022. The results of the queries are depicted in the Venn diagram of Figure 1 (which can be reproduced from the reported counts, as in the sketch below). In particular, 73 contributions relate PCNL to robotics (query 2); most come from medical journals and study the feasibility of robot-assisted PCNL. The usage of virtual/augmented reality systems in PCNL has been investigated in 46 contributions (query 1). The intersection among the queries contains just nine entries (query 3), showing that merging robotic assistance and virtual/augmented reality for PCNL surgery is still an emerging application.
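As a cross-check, the overlap in Figure 1 can be redrawn from the counts reported above alone. A minimal sketch using the matplotlib-venn package (an assumed tool choice; any plotting library accepting disjoint region sizes would do):

```python
from matplotlib_venn import venn2
import matplotlib.pyplot as plt

# Region sizes (A-only, B-only, intersection) derived from the query
# totals reported in the text: 46 VR/AR hits, 73 robotics hits, 9 shared.
venn2(subsets=(46 - 9, 73 - 9, 9),
      set_labels=("VR/AR/MR (query 1)", "robot* (query 2)"))
plt.title("PCNL contributions, WoS 2012-2022")
plt.show()
```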

2.2. Virtual and Augmented Reality for PCNL Interventions

Virtual reality software simulating human patients and surgical procedures continues to be developed to help trainees achieve competency in a shorter time. Virtual reality simulators for percutaneous renal access training have been proposed in [20,21,22]. More recently, a review of existing training methods was presented in [23], together with the introduction of the Marion Surgical K181 system, a virtual reality surgical simulator that emulates the PCNL procedure. In more detail, the simulator provides real-time haptic feedback to users while they are immersed, by means of a VR headset, in a virtual environment in which the operating room and the patient anatomy are simulated. In [24], the authors performed a preliminary study of PCNL surgical rehearsal using the Marion K181 simulator.
Simulators integrating virtual reality have been proven to improve the performance of trainees. However, their main drawback is that they do not reproduce realistic haptic feedback, which is an important skill for trainees to acquire [25]. To this end, augmented reality can be very helpful, since the surgeon directly interacts with the patient and with the environment. The development of augmented reality devices allows surgeons to incorporate data visualization into diagnostic and treatment procedures, improving work efficiency and safety and enhancing surgical training. In augmented reality applications, the surgeon wears a headset with a see-through eye display that projects images of the patient's internal anatomy based on CT or MRI images, essentially giving them X-ray vision. Several reviews have addressed the use of augmented reality in surgery: refs. [26,27] reviewed augmented reality in surgery and in minimally invasive surgery, respectively; ref. [28] presented a review focused on virtual and augmented reality systems for renal interventions; and [4] focused directly on PCNL. A first goal of augmented reality solutions is to provide the surgeon with real-time visual localization of the anatomical structures; this information is then used to navigate the surgical instrument. In [29], an interactive medical image-guidance system using tablet-based augmented reality was developed to localize the patient's anatomical structures in 3D space and navigate the surgical tools intraoperatively. A navigation system based on optical tracking was also proposed in [30], focused specifically on PCNL and aimed at avoiding multiple punctures due to a lack of visual information. To augment the capabilities of the surgeon during the procedure, augmented reality systems supporting percutaneous access by overlaying a 3D model onto the image from a tablet camera were proposed in [31,32,33]. Then, in [34], an augmented-reality-driven medical simulator for PCNL (simPCNL) with an integrated assessment of cybersecurity was presented and validated. Finally, in [35], an aid for gaining percutaneous access was provided by superimposing the projection of the puncture tract onto fluoroscopic images.
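The tablet-based overlays of [31,32,33] rest on a standard computer-vision step: estimating the camera pose from a marker of known geometry and reprojecting the pre-operative model into the live image. A minimal sketch with OpenCV, where the calibration data, function name and green-dot rendering are illustrative assumptions rather than the cited systems' implementations:

```python
import cv2
import numpy as np

def overlay_points(model_pts, marker_corners_3d, marker_corners_px, K, frame):
    """Marker-based AR overlay: estimate the camera pose from a printed
    marker of known 3D geometry, then project the pre-operative model
    points into the live frame. K is the camera intrinsic matrix from a
    prior calibration; lens distortion is neglected here."""
    ok, rvec, tvec = cv2.solvePnP(marker_corners_3d, marker_corners_px, K, None)
    if not ok:
        return frame
    pts_px, _ = cv2.projectPoints(model_pts, rvec, tvec, K, None)
    for p in pts_px.reshape(-1, 2).astype(int):
        cv2.circle(frame, tuple(p), 2, (0, 255, 0), -1)  # draw model points
    return frame
```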
Table 1 summarizes the aforementioned papers and their main features.

2.3. Robotic Assistance in PCNL

The outcomes of PCNL surgery strongly depend on how accurately the surgeon performs the puncture for percutaneous access, and less experienced surgeons can exhibit higher complication rates due to the steep learning curve of the procedure. In this context, robotic systems can be used as skill-trainer devices. The work in [36] describes a robotic system for PCNL training: a 6-DoF manipulator holding the needle is teleoperated by means of a haptic device that provides force feedback to the trainee while the needle is inserted into a simulation manikin.
Furthermore, robotic assistance can support the surgeon during the task, freeing attention for other critical jobs. The robotic system PAKY (Percutaneous Access to Kidney), first introduced in [37] and later improved in [38], comprises a passive 7-DoF arm holding at its end-effector a radiolucent needle that can be easily detected in CT images. The needle is actuated by an electric driver that translates it along its axis. The surgeon manually aligns the instrument with the desired trajectory; once the system is locked, the needle insertion can be teleoperated.
Possible intra-operative complications may arise from displacement of the kidney or of the target stone, which can move away from the position estimated from the pre-operative CT dataset, usually acquired a few days before the surgery. Furthermore, the stone's position can also change while the needle is inserted because of the patient's respiratory movement. This challenge was addressed in [39] by developing an admittance-controlled robotic system that performs the needle insertion while holding the ultrasound probe to provide visual-servoing feedback. Reference [40] proposes the usage of a collaborative robotic system to execute a three-step procedure for PCNL surgery: first, the surgeon hand-guides the robot towards the percutaneous access; then, an adaptive algorithm with US image feedback estimates and compensates for the kidney displacement caused by the patient's breathing; in the final stage, the needle insertion is performed automatically by the robot. Recently, NDR Medical Technology brought to market ANT-X, a robot that integrates image registration software to automate the needle alignment. In [41], the authors compared the PCNL performance of ANT-X-assisted surgery with the free-hand technique.
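To make the admittance-control idea in [39] concrete, the sketch below shows a minimal one-dimensional admittance law along the insertion axis: the commanded velocity responds to the measured contact force as a virtual mass-damper. The gains, force source and robot interface are illustrative assumptions, not details of the cited system:

```python
def admittance_step(f_ext, v, dt, m=2.0, d=25.0):
    """One Euler step of the 1-DoF admittance law m*dv/dt + d*v = f_ext:
    the external force measured at the tool shapes the commanded velocity
    as if the needle were attached to a virtual mass-damper."""
    dv = (f_ext - d * v) / m
    return v + dv * dt

# Hypothetical usage in a 1 kHz control loop:
# v = 0.0
# for f in force_sensor_samples:      # wrist force/torque readings [N]
#     v = admittance_step(f, v, dt=0.001)
#     robot.set_axial_velocity(v)     # assumed robot API
```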
The block diagram in Figure 2 shows the currently available categories of robotic systems for assistance in PCNL surgery, and Table 1 summarizes the latest literature on robotic assistance in PCNL.
Finally, it is worth mentioning the da Vinci surgical robot, a teleoperated robotic system approved by the Food and Drug Administration (FDA) that allows minimally invasive laparoscopic interventions, including, among others, anatrophic nephrolithotomy (ANL). This intervention treats the same condition as PCNL but uses a laparoscopic rather than a percutaneous approach. As an example, [42] reports a study in which three patients with staghorn calculi underwent robot-assisted surgery using the da Vinci system without intra-operative complications. However, this work is not included in Table 1 because of its different approach with respect to PCNL.

2.4. Tracking the US Probe and the Surgical Needle

Tracking systems are additional tools that can be integrated to further assist the surgeon by providing real-time awareness of the 3D position and orientation of the relevant anatomical structures. This section provides a detailed description of the available methods and technologies for tracking the US probe and the surgical needle.

2.4.1. Probe Tracking

Tracking systems represent fundamental equipment in robot-assisted percutaneous interventions because of the need to accurately localize the target and the instrument with respect to the robot reference system. Since both the target and the instrument can be visualized in US images, the pose of the US probe must be estimated in real time. The typical technologies for tracking the US probe are optical and electromagnetic systems. The former category uses computer-vision software to track a set of optical markers fixed to the probe and seen by a multi-camera system. The markers are generally made of reflective material highlighted by dedicated illuminators, and multiple cameras must be placed in the intervention room to avoid occlusion problems. OptiTrack and NDI Polaris Vega are commercial optical trackers with a declared sub-millimeter accuracy, as described in [43]. Electromagnetic trackers, instead, generate an electromagnetic field in which micro-electromagnetic sensors fixed to the probe are tracked; such devices can be affected by electromagnetic disturbances from other medical equipment. A range of field generators and compatible sensors is available, for example, from Polhemus, and the work in [44] assesses the latest Polhemus devices, whose estimated accuracy is comparable to that of optical trackers.
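Once the cameras have triangulated the 3D positions of the markers, an optical tracker recovers the probe pose by fitting a rigid transform between the known marker geometry and the observed points. A minimal sketch of this step using the standard Kabsch/SVD least-squares solution (an illustration of the principle, not the proprietary algorithm of any commercial tracker):

```python
import numpy as np

def estimate_pose(model_pts, observed_pts):
    """Least-squares rigid transform (Kabsch) mapping the marker geometry
    defined in the probe frame (model_pts, Nx3) onto the 3D positions
    triangulated by the cameras (observed_pts, Nx3).
    Returns rotation R and translation t such that obs ~ R @ model + t."""
    mc = model_pts.mean(axis=0)
    oc = observed_pts.mean(axis=0)
    H = (model_pts - mc).T @ (observed_pts - oc)   # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))         # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = oc - R @ mc
    return R, t
```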
Recent research activities have investigated tracking solutions with reduced costs and simplified equipment. The training system described in [45] uses a simulation manikin and a mock-up US probe containing an inertial measurement unit (IMU) and an RFID reader, which together allow the real-time reconstruction of the probe pose. In particular, the position of the probe is discretized by placing RFID tags on the manikin at the anatomical landmarks of interest, and the orientation of the probe is estimated by processing the IMU data with a Kalman filter. The same authors investigated the performance of several IMU-based tracking methods in [46].
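The core of such IMU-based orientation estimation is sensor fusion: gyroscope integration is accurate over short intervals but drifts, while the accelerometer's gravity direction is drift-free but noisy. The cited works use a Kalman filter; the sketch below substitutes a simpler complementary filter that conveys the same idea (the blending weight and data layout are assumptions):

```python
import numpy as np

def complementary_filter(gyro, accel, dt, alpha=0.98):
    """Estimate roll and pitch by fusing integrated angular rate
    (gyro, Nx3 [rad/s]) with the tilt implied by the gravity vector
    measured by the accelerometer (accel, Nx3)."""
    roll, pitch = 0.0, 0.0
    for w, a in zip(gyro, accel):
        # short-term: propagate orientation with the gyroscope
        roll_g = roll + w[0] * dt
        pitch_g = pitch + w[1] * dt
        # long-term: tilt angles from the gravity direction
        roll_a = np.arctan2(a[1], a[2])
        pitch_a = np.arctan2(-a[0], np.hypot(a[1], a[2]))
        # blend: trust the gyro at high frequency, gravity at low frequency
        roll = alpha * roll_g + (1 - alpha) * roll_a
        pitch = alpha * pitch_g + (1 - alpha) * pitch_a
    return roll, pitch
```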
In [47], the authors designed an optical tracking system based on a single low-cost camera. The same authors proposed in [48] the usage of the OptiTrack V120-Trio to track the ultrasound probe, enhancing the tracking performance with a hemispherical marker rigid body. In [49], a robust optical tracker based on small key-dot markers attached to a miniature pickup ultrasound probe is proposed; the circular dot pattern improves performance under low illumination and with out-of-focus images during robot-assisted partial nephrectomy. An alternative to dedicated sensors is presented in [50], whose authors mount the probe on a passive manipulator with calibrated kinematics, so that the pose of the probe can be reconstructed during its motion. Finally, it is worth mentioning [51], which introduces a sensorless approach in which a convolutional neural network learns to regress the pose of the probe directly from the acquired US images.
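As an illustration of the sensorless approach of [51], the sketch below defines a small PyTorch network mapping a grayscale US frame to a 6-DoF pose vector (three translations plus three rotation parameters). The architecture, layer sizes and names are illustrative assumptions, not the network of the cited work:

```python
import torch
import torch.nn as nn

class ProbePoseNet(nn.Module):
    """Illustrative CNN regressing a 6-DoF probe pose from one
    single-channel US frame; layer sizes are arbitrary."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),          # global pooling -> (B, 64, 1, 1)
        )
        self.head = nn.Linear(64, 6)          # tx, ty, tz, rx, ry, rz

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

# pose = ProbePoseNet()(torch.randn(1, 1, 256, 256))  # -> tensor of shape (1, 6)
```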
Table 2 shows the main works related to the probe-tracking problem, highlighting for each entry the adopted method and the necessary equipment.

2.4.2. Needle Tracking

Awareness of the needle trajectory and tip position during a percutaneous image-guided insertion is of fundamental importance for preventing the needle from entering forbidden regions such as ribs, vessels or calyxes. The methods for estimating the pose of the needle can be classified into two main categories: those that require a specific tracking sensor and those that segment the needle directly in the US image with an image-processing algorithm. The first group comprises approaches based on optical tracking systems, where a stereo-camera setup is often adopted to track the needle position and orientation in three-dimensional space. The seminal work [52] describes the algorithm and the stereo-camera calibration procedure needed to track the needle from an external point of view. More recently, [53] proposed the usage of a low-resolution camera mounted directly on the ultrasound probe to visualize the external motion of the needle, assuming the needle carries the markings commonly used by surgeons as visual aids for insertion-depth control. These markings are detected within the acquired image so that the current needle trajectory can be estimated. Attaching the camera to the probe guarantees that the needle is always visualized with minimal risk of occlusion, and the single-camera design reduces costs. These methods must assume that no needle deflection or deformation occurs inside the patient's body.
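The geometric core of stereo-based tracking as in [52] is triangulation: a 3D point is recovered from its pixel coordinates in two calibrated cameras, and triangulating two points along the visible shaft yields the needle axis. A minimal linear (DLT) triangulation sketch, with the projection matrices assumed to come from a prior calibration:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one point seen in two calibrated
    cameras with 3x4 projection matrices P1, P2 and pixel coordinates
    x1, x2. Triangulating two points on the needle shaft gives its 3D
    axis; the distal one approximates the tip outside the body."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)    # null-space of A is the homogeneous point
    X = Vt[-1]
    return X[:3] / X[3]            # back to Euclidean coordinates
```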
Other methods use miniaturized devices fitted within the needle cannula. Reference [54] integrated a fiber-optic ultrasound receiver in the needle that interacts with the external probe transducer to reconstruct the pose of the needle and visualize the tip position in the US image; the method requires specific additional elements in the transducer array. Similarly, the commercial system Onvision by B. Braun and Philips, described in [55], tracks the needle tip by exploiting a piezoelectric sensor close to it. The Onvision software visualizes the tip position in real time when it lies within the US image plane, and also indicates its projection point when the needle is just near the image plane.
On the other hand, methods in the second category estimate the needle pose directly from the US image, without additional sensors. This task is particularly challenging because of the poor observability of the needle in US images. Issues that the image-processing algorithm must handle include the proper segmentation of the needle, which has to be distinguished from possible artifacts (see, e.g., [56]), and the estimation of the needle direction and tip. Indeed, the needle can be approximated as a straight line with an end-point representing the tip; this approximation holds in percutaneous nephrolithotomy because the high stiffness of the nephroscope prevents bending of the instrument. The seminal work [57] presents a real-time algorithm for detecting straight needles based on a custom version of the Hough transform, but does not solve the tip-estimation problem. Similarly, the two methods proposed in [58] use a Hough transform to detect the image points belonging to the needle; they can estimate the needle curvature, but do not provide robust tip localization. Reference [59] solves the tip-estimation problem using a probability map reconstructed by means of a Gabor filter with an automatic parameter-tuning strategy, and the same authors improved the tracking performance in [60] by exploiting the motion information captured from the whole US video stream. Finally, [61] provides a real-time procedure that first detects the needle axis inside a defined region of interest and then estimates the tip position along the axis by means of a statistical filter.
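A rough sketch of the Hough-based detection idea behind [57,58], using OpenCV: bright linear structures are enhanced and the longest detected segment is kept as the needle axis. All thresholds are illustrative assumptions and would need tuning on real B-mode data:

```python
import cv2
import numpy as np

def detect_needle_axis(us_image):
    """Detect a candidate needle axis in a grayscale B-mode image:
    smooth, extract edges, run a probabilistic Hough transform and
    keep the longest segment as the (assumed straight) needle shaft."""
    blur = cv2.GaussianBlur(us_image, (5, 5), 0)
    edges = cv2.Canny(blur, 50, 150)
    lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180,
                            threshold=60, minLineLength=80, maxLineGap=10)
    if lines is None:
        return None
    # longest segment is assumed to belong to the needle
    return max(lines[:, 0], key=lambda l: np.hypot(l[2] - l[0], l[3] - l[1]))
```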
The classification of the tracking methods for the probe and the needle is sketched in the block diagram of Figure 3, and Table 2 summarizes the works addressing the tracking problem and their main features.

3. Discussion

In the context of PCNL surgery, augmented reality visualization of the renal anatomy has been proven sufficient for experienced surgeons to estimate the correct needle angles to reach the target. Such visualization is enabled by the proper tracking and registration of the anatomical structures of the patient, reconstructed intraoperatively by means of ultrasound images. This review has analyzed the available methods for tracking the probe [43,44,45,46,47,48,49,50,51], classifying them on the basis of the adopted technology (i.e., IMU-RFID sensors, optical cameras or magnetic trackers). In addition, tracking the surgical needle is useful for obtaining feedback on the insertion and puncturing step; references [52,53,54,55,56,57,58,59,60,61] address the problem of detecting the needle and estimating the tip position, proposing sensor-based or sensorless approaches.

However, doubts persist as to whether urologists in training need additional technical assistance beyond AR support alone. To this end, the integration of augmented reality and robotic assistance can help urologists in training by reducing the learning curve and enhancing the outcomes of the intervention, supporting the surgeon during both the pre-operative and intra-operative phases, as proposed in the system developed in [62]. In more detail, as depicted in Figure 4, in the pre-operative phase the 3D reconstruction and segmentation of the patient anatomy are performed on the basis of CT/MRI scans, so that the surgeon can plan a first trajectory for inserting the surgical needle. In the intra-operative phase, the organs' poses may differ slightly from those reconstructed pre-operatively, so freehand ultrasound scans can be used to acquire the current 3D volumes of the organs, which are then aligned with the corresponding 3D segmented models by means of computer-vision algorithms. An AR device then allows the surgeon to visualize in real time the segmented 3D models of the anatomical parts of interest, reconstructed from CT imaging, directly on the patient's body, so that the insertion trajectory can easily be replanned. Furthermore, a robotic system can assist the surgeon during the puncturing and needle insertion by providing force feedback aimed at avoiding misalignment of the needle from the planned trajectory. Indeed, several robotic systems have recently been developed for assistance during PCNL surgery [36,37,38,39,40,41]: some are teleoperated robots controlled by the surgeon, while more advanced systems provide haptic feedback during hand guidance or achieve a degree of autonomy by exploiting visual servoing.
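The intra-operative alignment step just described is, at its core, a rigid registration between the pre-operative model and intra-operative US data. As a sketch of one common choice, the toy iterative-closest-point (ICP) loop below aligns two point clouds; real pipelines add outlier rejection, multi-resolution matching and often deformable refinement (the function and its parameters are illustrative, not the method of [62]):

```python
import numpy as np

def icp_align(source, target, iters=30):
    """Toy ICP: align a pre-operative surface point cloud (source, Nx3)
    to intra-operative US points (target, Mx3) via repeated
    nearest-neighbour matching and least-squares rigid updates."""
    R, t = np.eye(3), np.zeros(3)
    for _ in range(iters):
        moved = source @ R.T + t
        # brute-force nearest neighbours (fine for small clouds)
        idx = np.argmin(((moved[:, None] - target[None]) ** 2).sum(-1), axis=1)
        matched = target[idx]
        mc, oc = moved.mean(0), matched.mean(0)
        H = (moved - mc).T @ (matched - oc)
        U, _, Vt = np.linalg.svd(H)
        d = np.sign(np.linalg.det(Vt.T @ U.T))
        dR = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
        # compose the incremental update with the running transform
        R, t = dR @ R, dR @ t + (oc - dR @ mc)
    return R, t
```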

4. Conclusions and Future Work

PCNL is considered the gold standard for the treatment of kidney stones larger than 20 mm in diameter. It is performed through a small skin incision and thus, with respect to standard open-surgery procedures, minimizes incision size, pain, blood loss and blood transfusions, and shortens hospitalization. From a medical point of view, the main difficulty in executing PCNL is achieving the precision and accuracy required in the puncture step, in order to obtain a suitable renal access and reach the stone along a precise and direct path; the procedure therefore has a relatively steep learning curve. In light of the considerations presented in this review, augmented reality and robot-assisted systems have been proven to provide valuable support to novice surgeons, preventing critical mistakes and reducing their learning curve. Needle and probe tracking are additional features whose integration into an assistance system has been explored. The main challenge still open, and currently the future direction of development, is the integration of augmented reality and robotic assistance into a single system, which could provide full and complete assistance to both expert and novice surgeons. In this direction, the main difficulty to be addressed is the execution of an accurate CT/MRI-to-US registration, in order to align in real time the reconstructed model with the real body of the patient. Probe tracking can help here, and several works have started to appear that perform accurate US volume reconstruction and 3D registration (see, e.g., [63,64]).

Author Contributions

F.F. and S.F. searched the literature for relevant works and wrote the paper. M.B. wrote the paper. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Morris, D.S.; Wei, J.T.; Taub, D.A.; Dunn, R.L.; Wolf, J.S.; Hollenbeck, B.K. Temporal Trends in the Use of Percutaneous Nephrolithotomy. J. Urol. 2006, 175, 1731–1736. [Google Scholar] [CrossRef]
  2. Preminger, G.M.; Assimos, D.G.; Lingeman, J.E.; Nakada, S.Y.; Pearle, M.S.; Wolf, J.S.J. Chapter 1: AUA guideline on management of staghorn calculi: Diagnosis and treatment recommendations. J. Urol. 2005, 173, 1991–2000. [Google Scholar] [CrossRef]
  3. de la Rosette, J.; Assimos, D.; Desai, M.; Gutierrez, J.; Lingeman, J.; Scarpa, R.; Tefekli, A. The Clinical Research Office of the Endourological Society Percutaneous Nephrolithotomy Global Study: Indications, Complications, and Outcomes in 5803 Patients. J. Endourol. 2011, 25, 11–17. [Google Scholar] [CrossRef]
  4. Rodrigues, P.L.; Rodrigues, N.F.; Fonseca, J.; Lima, E.; Vilaca, J. Kidney targeting and puncturing during percutaneous nephrolithotomy: Recent advances and future perspective. J. Endourol. 2013, 27, 826–834. [Google Scholar] [CrossRef] [PubMed]
  5. Allen, D.; O’Brien, T.; Tiptaft, R.; Glass, J. Defining the learning curve for percutaneous nephrolithotomy. J. Endourol. 2005, 19, 279–282. [Google Scholar] [CrossRef] [PubMed]
  6. Tanriverdi, O.; Boylu, U.; Kendirci, M.; Kadihasanoglu, M.; Horasanli, K.; Miroglu, C. The learning curve in the training of percutaneous nephrolithotomy. Eur. Urol. 2007, 52, 206–211. [Google Scholar] [CrossRef]
  7. Ziaee, S.A.; Sichani, M.M.; Kashi, A.H.; Samzadeh, M. Evaluation of the learning curve for percutaneous nephrolithotomy. J. Urol. 2010, 7, 226–231. [Google Scholar]
  8. Ng, C.F. Training in percutaneous nephrolithotomy: The learning curve and options. Arab J. Urol. 2014, 12, 54–57. [Google Scholar] [CrossRef]
  9. Garg, A.; Yadav, S.S.; Tomar, V.; Priyadarshi, S.; Giri, V.; Vyas, N.; Agarwal, N. Prospective Evaluation of Learning Curve of Urology Residents for Percutaneous Nephrolithotomy. Urol. Pract. 2016, 3, 230–235. [Google Scholar] [CrossRef]
  10. Yu, W.; Rao, T.; Li, X.; Ruan, Y.; Yuan, R.; Li, C.; Li, H.; Cheng, F. The learning curve for access creation in solo ultrasonography-guided percutaneous nephrolithotomy and the associated skills. Int. Urol. Nephrol. 2017, 49, 419–424. [Google Scholar] [CrossRef]
  11. Song, Y.; Ma, Y.; Song, Y.; Fei, X. Evaluating the Learning Curve for Percutaneous Nephrolithotomy under Total Ultrasound Guidance. PLoS ONE 2015, 10, 1–10. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  12. Sahan, M.; Sarilar, O.; Savun, M.; Caglar, U.; Erbin, A.; Ozgor, F. Adopting for Supine Percutaneous Nephrolithotomy: Analyzing the Learning Curve of Tertiary Academic Center Urology Team. Urology 2020, 140, 22–26. [Google Scholar] [CrossRef]
  13. Borofsky, M.S.; Rivera, M.E.; Dauw, C.A.; Krambeck, A.E.; Lingeman, J.E. Electromagnetic Guided Percutaneous Renal Access Outcomes Among Surgeons and Trainees of Different Experience Levels: A Pilot Study. Urology 2020, 136, 266–271. [Google Scholar] [CrossRef]
  14. Jawed, Y.T.; Golovyan, D.; Lopez, D.; Khan, S.H.; Wang, S.; Freund, C.; Imran, S.; Hameed, U.B.; Smith, J.P.; Kok, L.; et al. Feasibility of a virtual reality intervention in the intensive care unit. Heart Lung 2021, 50, 748–753. [Google Scholar] [CrossRef]
  15. Breve, B.; Caruccio, L.; Cirillo, S.; Deufemia, V.; Polese, G. Visual ECG Analysis in Real-world Scenarios. In Proceedings of the 27th International DMS Conference on Visualization and Visual Languages (DMSVIVA), Pittsburgh, PA, USA, 29–30 June 2021; pp. 46–54. [Google Scholar]
  16. Özcan, E.; Birdja, D.; Simonse, L.; Struijs, A. Alarm in the ICU! Envisioning Patient Monitoring and Alarm Management in Future Intensive Care Units. In Service Design and Service Thinking in Healthcare and Hospital Management: Theory, Concepts, Practice; Pfannstiel, M.A., Rasche, C., Eds.; Springer International Publishing: Cham, Switzerland, 2019; pp. 421–446. [Google Scholar] [CrossRef]
  17. Vidyarthi, A.; Mittal, N. Comparative Study for Brain Tumor Classification on MR/CT Images. In Proceedings of the Third International Conference on Soft Computing for Problem Solving; Pant, M., Deep, K., Nagar, A., Bansal, J.C., Eds.; Springer India: New Delhi, India, 2014; pp. 889–897. [Google Scholar]
  18. Aamir, M.; Rahman, Z.; Dayo, Z.A.; Abro, W.A.; Uddin, M.I.; Khan, I.; Imran, A.S.; Ali, Z.; Ishfaq, M.; Guan, Y.; et al. A deep learning approach for brain tumor classification using MRI images. Comput. Electr. Eng. 2022, 101, 108105. [Google Scholar] [CrossRef]
  19. Breve, B.; Caruccio, L.; Cimino, G.; Cirillo, S.; Iuliano, G.; Polese, G. Brain tumors classification from MRI images: A comparative study between different neural networks. In Proceedings of the 28th International DMS Conference on Visualization and Visual Languages, Pittsburgh, PA, USA, 29–30 June 2022; pp. 23–30. [Google Scholar]
  20. Knudsen, B.E.; Matsumoto, E.D.; Chew, B.; Johnson, B.; Margulis, V.; Cadeddu, J.A.; Pearle, M.S.; Pautler, S.E.; Denstedt, J.D. A randomized, controlled, prospective study validating the acquisition of percutaneous renal collecting system access skills using a computer based hybrid virtual reality surgical simulator: Phase I. J. Urol. 2006, 176, 2173–2178. [Google Scholar] [CrossRef]
  21. Mishra, S.; Kurien, A.; Patel, R.; Patil, P.; Ganpule, A.; Muthu, V.; Sabnis, R.B.; Desai, M. Validation of virtual reality simulation for percutaneous renal access training. J. Endourol. 2010, 24, 635–640. [Google Scholar] [CrossRef]
  22. Papatsoris, A.G.; Shaikh, T.; Patel, D.; Bourdoumis, A.; Bach, C.; Buchholz, N.; Masood, J.; Junaid, I. Use of a Virtual Reality Simulator to Improve Percutaneous Renal Access Skills: A Prospective Study in Urology Trainees. Urol. Int. 2012, 89, 185–190. [Google Scholar] [CrossRef]
  23. Sainsbury, B.; Lacki, M.; Shahait, M.; Goldenberg, M.; Baghdadi, A.; Cavuoto, L.; Ren, J.; Green, M.; Lee, J.; Averch, T.D.; et al. Evaluation of a Virtual Reality Percutaneous Nephrolithotomy (PCNL) Surgical Simulator. Front. Robot. AI 2020, 6, 145. [Google Scholar] [CrossRef]
  24. Sainsbury, B.; Wilz, O.; Ren, J.; Green, M.; Fergie, M.; Rossa, C. Preoperative Virtual Reality Surgical Rehearsal of Renal Access during Percutaneous Nephrolithotomy: A Pilot Study. Electronics 2022, 11, 1562. [Google Scholar] [CrossRef]
  25. Pinzon, D.; Byrns, S.; Zheng, B. Prevailing Trends in Haptic Feedback Simulation for Minimally Invasive Surgery. Surg. Innov. 2016, 23, 415–421. [Google Scholar] [CrossRef]
  26. Vavra, P.; Roman, J.; Zonca, P.; Ihnat, P.; Nemec, M.; Kumar, J.; Habib, N.; El-Gendi, A. Recent Development of Augmented Reality in Surgery: A Review. J. Healthc. Eng. 2017, 2017, 4574172. [Google Scholar] [CrossRef] [PubMed]
  27. Lamata, P.; Ali, W.; Cano, A.; Cornella, J.; Declerck, J.; Elle, O.; Freudenthal, A.; Furtado, H.; Kalkofen, D.; Naerum, E.; et al. Augmented Reality for Minimally Invasive Surgery: Overview and Some Recent Advances. In Augmented Reality; IntechOpen: London, UK, 2010. [Google Scholar]
  28. Detmer, F.J.; Hettig, J.; Schindele, D.; Schostak, M.; Hansen, C. Virtual and Augmented Reality Systems for Renal Interventions: A Systematic Review. IEEE Rev. Biomed. Eng. 2017, 10, 78–94. [Google Scholar] [CrossRef] [PubMed]
  29. Wen, R.; Chng, C.B.; Chui, C.K. Augmented Reality Guidance with Multimodality Imaging Data and Depth-Perceived Interaction for Robot-Assisted Surgery. Robotics 2017, 6, 13. [Google Scholar] [CrossRef]
  30. Oliveira-Santos, T.; Peterhans, M.; Roth, B.; Reyes, M.; Nolte, L.P.; Thalmann, G.; Weber, S. Computer aided surgery for percutaneous nephrolithotomy: Clinical requirement analysis and system design. In Proceedings of the 2010 Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Buenos Aires, Argentina, 31 August–4 September 2010; pp. 442–445. [Google Scholar] [CrossRef]
  31. Rassweiler, J.J.; Muller, M.; Fangerau, M.; Klein, J.; Goezen, A.S.; Pereira, P.; Meinzer, H.P.; Teberd, D. iPad-Assisted Percutaneous Access to the Kidney Using Marker-Based Navigation: Initial Clinical Experience. Eur. Urol. 2012, 61, 628–631. [Google Scholar] [CrossRef]
  32. Akand, M.; Civcik, L.; Buyukaslan, A.; Altintas, E.; Kocer, E.; Koplay, M.; Erdogru, T. Feasibility of a novel technique using 3-dimensional modeling and augmented reality for access during percutaneous nephrolithotomy in two different ex-vivo models. Int. Urol. Nephrol. 2019, 51, 17–25. [Google Scholar] [CrossRef]
  33. Muller, M.; Rassweiler, M.C.; Klein, J.; Seitel, A.; Gondan, M.; Baumhauer, M.; Teber, D.; Rassweiler, J.J.; Meinzer, H.P.; Maier-Hein, L. Mobile augmented reality for computer-assisted percutaneous nephrolithotomy. Int. J. Comput. Assist. Radiol. Surg. 2013, 8, 663–675. [Google Scholar] [CrossRef]
  34. Tai, Y.; Wei, L.; Zhou, H.; Peng, J.; Li, Q.; Li, F.; Zhang, J.; Shi, J. Augmented-reality-driven medical simulation platform for percutaneous nephrolithotomy with cybersecurity awareness. Int. J. Distrib. Sens. Netw. 2019, 15, 1550147719840173. [Google Scholar] [CrossRef]
  35. Mozer, P.; Conort, P.; Leroy, A.; Baumann, M.; Payan, Y.; Troccaz, J.; Chartier-Kastler, E.; Richard, F. Aid to percutaneous renal access by virtual projection of the ultrasound puncture tract onto fluoroscopic images. J. Endourol. 2007, 21, 460–465. [Google Scholar] [CrossRef] [Green Version]
  36. Wilz, O.; Sainsbury, B.; Rossa, C. Constrained haptic-guided shared control for collaborative human–robot percutaneous nephrolithotomy training. Mechatronics 2021, 75, 102528. [Google Scholar] [CrossRef]
  37. Stoianovici, D.; Whitcomb, L.L.; Anderson, J.H.; Taylor, R.H.; Kavoussi, L.R. A modular surgical robotic system for image guided percutaneous procedures. In Medical Image Computing and Computer-Assisted Intervention—MICCAI’98; Wells, W.M., Colchester, A., Delp, S., Eds.; Springer: Heidelberg/Berlin, Germany, 1998; pp. 404–410. [Google Scholar]
  38. Stoianovici, D.; Jun, C.; Lim, S.; Li, P.; Petrisor, D.; Fricke, S.; Sharma, K.; Cleary, K. Multi-imager compatible, MR safe, remote center of motion needle-guide robot. IEEE Trans. Biomed. Eng. 2017, 65, 165–177. [Google Scholar] [CrossRef] [PubMed]
  39. Paranawithana, I.; Li, H.Y.; Foong, S.; Tan, U.X.; Yang, L.; Kiat Lim, T.S.; Ng, F.C. Ultrasound-Guided Involuntary Motion Compensation of Kidney Stones in Percutaneous Nephrolithotomy Surgery. In Proceedings of the 2018 IEEE 14th International Conference on Automation Science and Engineering (CASE), Munich, Germany, 20–24 August 2018; pp. 1123–1129. [Google Scholar] [CrossRef]
  40. Li, H.Y.; Paranawithana, I.; Chau, Z.H.; Yang, L.; Lim, T.S.K.; Foong, S.; Ng, F.C.; Tan, U.X. Towards to a Robotic Assisted System for Percutaneous Nephrolithotomy. In Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 1–5 October 2018; pp. 791–797. [Google Scholar]
  41. Oo, M.M.; Gandhi, H.R.; Chong, K.T.; Goh, J.Q.; Ng, K.W.; Hein, A.T.; Tan, Y.K. Automated Needle Targeting with X-ray (ANT-X)-Robot-assisted device for percutaneous nephrolithotomy (PCNL) with its first successful use in human. J. Endourol. 2021, 35, e919. [Google Scholar] [CrossRef] [PubMed]
  42. Ghani, K.R.; Rogers, C.G.; Sood, A.; Kumar, R.; Ehlert, M.; Jeong, W.; Ganpule, A.; Bhandari, M.; Desai, M.; Menon, M. Robot-assisted anatrophic nephrolithotomy with renal hypothermia for managing staghorn calculi. J. Endourol. 2013, 27, 1393–1398. [Google Scholar] [CrossRef] [PubMed]
  43. Fattori, G.; Lomax, A.J.; Weber, D.C.; Safai, S. Technical assessment of the NDI Polaris Vega optical tracking system. Radiat. Oncol. 2021, 16, 1–4. [Google Scholar] [CrossRef]
  44. Franz, A.M.; Seitel, A.; Cheray, D.; Maier-Hein, L. Polhemus EM tracked Micro Sensor for CT-guided interventions. Med. Phys. 2019, 46, 15–24. [Google Scholar] [CrossRef]
  45. Farsoni, S.; Astolfi, L.; Bonfe, M.; Spadaro, S.; Volta, C.A. A versatile ultrasound simulation system for education and training in high-fidelity emergency scenarios. IEEE J. Transl. Eng. Health Med. 2017, 5, 1–9. [Google Scholar] [CrossRef]
  46. Farsoni, S.; Bonfè, M.; Astolfi, L. A low-cost high-fidelity ultrasound simulator with the inertial tracking of the probe pose. Control Eng. Pract. 2017, 59, 183–193. [Google Scholar] [CrossRef]
  47. Cai, Q.; Peng, C.; Prieto, J.C.; Rosenbaum, A.J.; Stringer, J.S.A.; Jiang, X. A Low-Cost Camera-Based Ultrasound Probe Tracking System: Design and Prototype. In Proceedings of the 2019 IEEE International Ultrasonics Symposium (IUS), Glasgow, UK, 6–9 October 2019; pp. 997–999. [Google Scholar] [CrossRef]
  48. Cai, Q.; Peng, C.; Lu, J.Y.; Prieto, J.C.; Rosenbaum, A.J.; Stringer, J.S.; Jiang, X. Performance Enhanced Ultrasound Probe Tracking With a Hemispherical Marker Rigid Body. IEEE Trans. Ultrason. Ferroelectr. Freq. Control 2021, 68, 2155–2163. [Google Scholar] [CrossRef]
  49. Pratt, P.; Jaeger, A.; Hughes-Hallett, A.; Mayer, E.; Vale, J.; Darzi, A.; Peters, T.; Yang, G.Z. Robust ultrasound probe tracking: Initial clinical experiences during robot-assisted partial nephrectomy. Int. J. Comput. Assist. Radiol. Surg. 2015, 10, 1905–1913. [Google Scholar] [CrossRef]
  50. Neshat, H.; Cool, D.W.; Barker, K.; Gardi, L.; Kakani, N.; Fenster, A. A 3D ultrasound scanning system for image guided liver interventions. Med. Phys. 2013, 40, 112903. [Google Scholar] [CrossRef]
  51. Xue, E.Y. Sensorless Ultrasound Probe 6DoF Pose Estimation through the Use of CNNs on Image Data. Ph.D. Thesis, Massachusetts Institute of Technology, Cambridge, MA, USA, 2018. [Google Scholar]
  52. Chan, C.; Lam, F.; Rohling, R. A needle tracking device for ultrasound guided percutaneous procedures. Ultrasound Med. Biol. 2005, 31, 1469–1483. [Google Scholar] [CrossRef]
  53. Najafi, M.; Abolmaesumi, P.; Rohling, R. Single-camera closed-form real-time needle tracking for ultrasound-guided needle insertion. Ultrasound Med. Biol. 2015, 41, 2663–2676. [Google Scholar] [CrossRef]
  54. Xia, W.; West, S.J.; Finlay, M.C.; Pratt, R.; Mathews, S.; Mari, J.M.; Ourselin, S.; David, A.L.; Desjardins, A.E. Three-dimensional ultrasonic needle tip tracking with a fiber-optic ultrasound receiver. J. Vis. Exp. JoVE 2018, 138, e57207. [Google Scholar] [CrossRef]
  55. Kåsine, T.; Romundstad, L.; Rosseland, L.A.; Fagerland, M.W.; Kessler, P.; Omenås, I.N.; Holmberg, A.; Sauter, A.R. Ultrasonographic needle tip tracking for in-plane infraclavicular brachialis plexus blocks: A randomized controlled volunteer study. Reg. Anesth. Pain Med. 2020, 45, 634–639. [Google Scholar] [CrossRef]
  56. Reusz, G.; Sarkany, P.; Gal, J.; Csomos, A. Needle-related ultrasound artifacts and their importance in anaesthetic practice. Br. J. Anaesth. 2014, 112, 794–802. [Google Scholar] [CrossRef]
  57. Ding, M.; Fenster, A. A real-time biopsy needle segmentation technique using Hough Transform. Med. Phys. 2003, 30, 2222–2233. [Google Scholar] [CrossRef]
  58. Okazawa, S.H.; Ebrahimi, R.; Chuang, J.; Rohling, R.N.; Salcudean, S.E. Methods for segmenting curved needles in ultrasound images. Med. Image Anal. 2006, 10, 330–342. [Google Scholar] [CrossRef]
  59. Kaya, M.; Bebek, O. Gabor filter based localization of needles in ultrasound guided robotic interventions. In Proceedings of the 2014 IEEE International Conference on Imaging Systems and Techniques (IST) Proceedings, Santorini, Greece, 14–17 October 2014; pp. 112–117. [Google Scholar]
  60. Kaya, M.; Senel, E.; Ahmad, A.; Bebek, O. Visual needle tip tracking in 2D US guided robotic interventions. Mechatronics 2019, 57, 129–139. [Google Scholar] [CrossRef]
  61. Mathiassen, K.; Dall’Alba, D.; Muradore, R.; Fiorini, P.; Elle, O.J. Robust real-time needle tracking in 2-D ultrasound images using statistical filtering. IEEE Trans. Control Syst. Technol. 2016, 25, 966–978. [Google Scholar] [CrossRef]
  62. Ferraguti, F.; Minelli, M.; Farsoni, S.; Bazzani, S.; Bonfè, M.; Vandanjon, A.; Puliatti, S.; Bianchi, G.; Secchi, C. Augmented Reality and Robotic-Assistance for Percutaneous Nephrolithotomy. IEEE Robot. Autom. Lett. 2020, 5, 4556–4563. [Google Scholar] [CrossRef]
  63. Wein, W.; Ladikos, A.; Fuerst, B.; Shah, A.; Sharma, K.; Navab, N. Global registration of ultrasound to MRI using the LC 2 metric for enabling neurosurgical guidance. In Proceedings of the International Conference on Medical Image Computing and Computer-Assisted Intervention, Nagoya, Japan, 22–26 September 2013; pp. 34–41. [Google Scholar]
  64. Wein, W.; Karamalis, A.; Baumgartner, A.; Navab, N. Automatic bone detection and soft tissue aware ultrasound–CT registration for computer-aided orthopedic surgery. Int. J. Comput. Assist. Radiol. Surg. 2015, 10, 971–979. [Google Scholar] [CrossRef] [PubMed]
Figure 1. Query results: the number of contributions relating PCNL to virtual/augmented reality (46, blue circle) and robotics (73, yellow circle). The intersection contains 9 contributions. Source: Web of Science.
Figure 2. The categories of robotic assistance for PCNL surgery.
Figure 3. The classification of tracking methods for the ultrasound probe and for the surgical needle.
Figure 4. A possible architecture implementing the integration between augmented reality and robotic assistance for PCNL surgery.
Table 1. Latest literature on virtual and augmented reality and robotic assistance in PCNL.

Virtual and Augmented Reality

| Reference | Highlights |
| --- | --- |
| [23] | VR: Introduction of a virtual reality simulator to emulate PCNL. |
| [22] | VR: Percutaneous renal access using the PERC Mentor simulator. |
| [21] | VR: Validation of VR-based simulators for percutaneous renal access. |
| [20] | VR: Assessment of training improvement on a VR simulator. |
| [24] | VR: Assessment of the Marion K181, a PCNL simulator with haptic feedback. |
| [34] | AR: Introduction of an augmented reality simulator to emulate PCNL. |
| [29] | AR: Image guidance for localizing the structures and navigating. |
| [31,32,33] | AR: Superimposition of a 3D model onto the image from a tablet. |
| [30] | AR: Navigation system based on optical tracking in PCNL. |
| [35] | AR: Superimposition of the puncture tract onto fluoroscopic images. |

Robotic Assistance

| Reference | Highlights |
| --- | --- |
| [36] | Skill-trainer robot teleoperated via a haptic device. |
| [39] | Admittance-controlled robot with visual servoing. |
| [40] | Robotic system with automatic compensation of kidney displacement. |
| [38] | Robot with manual alignment and teleoperated insertion. |
| [41] | ANT-X robot with automated needle alignment and CT image registration. |
Table 2. Probe/needle tracking methods and features. The last two columns apply to needle-tracking methods only: ✓ = supported, × = not supported.

| Tracking | Reference | Method | Required Equipment | Needle Curvature | Tip Tracking |
| --- | --- | --- | --- | --- | --- |
| Probe | [45] | Sensor fusion | IMU and RFID | – | – |
| Probe | [47] | Optical | Single camera | – | – |
| Probe | [48,49] | Optical | Stereo camera | – | – |
| Probe | [50] | Mechanical | Manipulator | – | – |
| Probe | [51] | Neural networks | Sensorless | – | – |
| Needle | [52] | Sensor-based | Stereo camera | × | ✓ |
| Needle | [53] | Sensor-based | Single camera | × | ✓ |
| Needle | [55] | Sensor-based | Piezoelectric | × | ✓ |
| Needle | [54] | Sensor-based | Fiber-optic | × | ✓ |
| Needle | [57] | Image processing | US imaging | × | × |
| Needle | [58] | Image processing | US imaging | ✓ | Poor |
| Needle | [59] | Image processing | US imaging | × | ✓ |
| Needle | [60] | Image processing | US imaging | × | ✓ |
| Needle | [61] | Image processing | US imaging | × | ✓ |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
