Medical Augmented Reality Summer School (MARSS) 2021: Selected Contributions

A special issue of Journal of Imaging (ISSN 2313-433X). This special issue belongs to the section "Mixed, Augmented and Virtual Reality".

Deadline for manuscript submissions: closed (16 September 2022) | Viewed by 20917

Special Issue Editors

Chair of Computer Aided Medical Procedures, Technical University of Munich, Munich, Germany
Interests: medical image computing; medical augmented reality; computer-aided interventions; computer vision
Research in Orthopedic Computer Science, University Hospital Balgrist, University of Zurich, Forchstrasse 340, 8008 Zurich, Switzerland
Interests: registration; augmented reality; intraoperative data; deep and reinforcement learning; instrument tracking; user and validation studies; orthopedic surgery
Chair of Orthopedic Surgery, Balgrist University Hospital, University of Zurich, Zurich, Switzerland
Interests: entire spectrum of spine surgery; medical augmented reality
Research in Orthopedic Computer Science, Balgrist University Hospital, University of Zurich, 8008 Zurich, Switzerland
Interests: surgical navigation; medical artificial intelligence; spatial technologies in surgery; medical augmented reality; medical image analysis
1. Research in Orthopedic Computer Science, Balgrist University Hospital, University of Zurich, Zurich, Switzerland
2. Chair of Computer Aided Medical Procedures, Technical University of Munich, Munich, Germany
Interests: deep learning; acoustic sensing; medical augmented reality

Special Issue Information

Dear Colleagues,

Medical Augmented Reality is a constantly growing research field that pushes the boundaries of medical data and imaging visualization, increases the quality and safety of surgical procedures, and provides surgical guidance and additional information to medical professionals in an intuitive way. Augmented Reality (AR) education systems train the surgeons of tomorrow in critical skills and support surgical staff in clinical routine, while AR rehabilitation concepts help patients recover faster after interventions.

To teach a new generation of clinicians, scientists, engineers, and industry professionals, we initiated the Medical Augmented Reality Summer School (MARSS) program, which was successfully conducted as an on-site event in 2019 and as a fully virtual event in 2021. The program consists of keynotes, lectures, and workshops, as well as a competition in which novel ideas and concepts are fostered and developed into working prototypes and solutions by international and interdisciplinary teams.

This MDPI Journal of Imaging Special Issue showcases selected research projects initiated and conducted within the competition of the MARSS 2021 program, as well as complementary scientific articles on Medical Augmented Reality research from selected main speakers of MARSS 2021.

Prof. Dr. Nassir Navab
Prof. Dr. Philipp Fürnstahl
Prof. Dr. Mazda Farshad
Dr. Hooman Esfandiari
Matthias Seibold
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Journal of Imaging is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1800 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • medical augmented reality
  • augmented reality
  • medical imaging
  • medical visualization
  • computer aided surgery
  • computer aided interventions
  • medical/surgical education
  • medical/surgical training
  • surgical navigation
  • optical tracking
  • human computer interface
  • human robot interaction

Published Papers (10 papers)


Research


13 pages, 6623 KiB  
Article
Remote Interactive Surgery Platform (RISP): Proof of Concept for an Augmented-Reality-Based Platform for Surgical Telementoring
by Yannik Kalbas, Hoijoon Jung, John Ricklin, Ge Jin, Mingjian Li, Thomas Rauer, Shervin Dehghani, Nassir Navab, Jinman Kim, Hans-Christoph Pape and Sandro-Michael Heining
J. Imaging 2023, 9(3), 56; https://doi.org/10.3390/jimaging9030056 - 23 Feb 2023
Cited by 2 | Viewed by 1812
Abstract
The “Remote Interactive Surgery Platform” (RISP) is an augmented reality (AR)-based platform for surgical telementoring. It builds upon recent advances of mixed reality head-mounted displays (MR-HMD) and associated immersive visualization technologies to assist the surgeon during an operation. It enables an interactive, real-time collaboration with a remote consultant by sharing the operating surgeon’s field of view through the Microsoft (MS) HoloLens2 (HL2). Development of the RISP started during the Medical Augmented Reality Summer School 2021 and is currently still ongoing. It currently includes features such as three-dimensional annotations, bidirectional voice communication and interactive windows to display radiographs within the sterile field. This manuscript provides an overview of the RISP and preliminary results regarding its annotation accuracy and user experience measured with ten participants.

10 pages, 5150 KiB  
Article
Verification, Evaluation, and Validation: Which, How & Why, in Medical Augmented Reality System Design
by Roy Eagleson and Leo Joskowicz
J. Imaging 2023, 9(2), 20; https://doi.org/10.3390/jimaging9020020 - 17 Jan 2023
Cited by 2 | Viewed by 1364
Abstract
This paper presents a discussion about the fundamental principles of Analysis of Augmented and Virtual Reality (AR/VR) Systems for Medical Imaging and Computer-Assisted Interventions. The three key concepts of Analysis (Verification, Evaluation, and Validation) are introduced, illustrated with examples of systems using AR/VR, and defined. The concepts of system specifications, measurement accuracy, uncertainty, and observer variability are defined and related to the analysis principles. The concepts are illustrated with examples of AR/VR working systems.

13 pages, 1660 KiB  
Article
CAL-Tutor: A HoloLens 2 Application for Training in Obstetric Sonography and User Motion Data Recording
by Manuel Birlo, Philip J. Eddie Edwards, Soojeong Yoo, Brian Dromey, Francisco Vasconcelos, Matthew J. Clarkson and Danail Stoyanov
J. Imaging 2023, 9(1), 6; https://doi.org/10.3390/jimaging9010006 - 29 Dec 2022
Cited by 2 | Viewed by 2037
Abstract
Obstetric ultrasound (US) training teaches the relationship between foetal anatomy and the viewed US slice to enable navigation to standardised anatomical planes (head, abdomen and femur) where diagnostic measurements are taken. This process is difficult to learn, and results in considerable inter-operator variability. We propose the CAL-Tutor system for US training based on a US scanner and phantom, where a model of both the baby and the US slice are displayed to the trainee in its physical location using the HoloLens 2. The intention is that AR guidance will shorten the learning curve for US trainees and improve spatial awareness. In addition to the AR guidance, we also record many data streams to assess user motion and the learning process. The HoloLens 2 provides eye gaze, head and hand position, ARToolkit and NDI Aurora tracking gives the US probe positions and an external camera records the overall scene. These data can provide a rich source for further analysis, such as distinguishing expert from novice motion. We have demonstrated the system in a sample of engineers. Feedback suggests that the system helps novice users navigate the US probe to the standard plane. The data capture is successful and initial data visualisations show that meaningful information about user behaviour can be captured. Initial feedback is encouraging and shows improved user assessment where AR guidance is provided.

21 pages, 26921 KiB  
Article
Medical Augmented Reality: Definition, Principle Components, Domain Modeling, and Design-Development-Validation Process
by Nassir Navab, Alejandro Martin-Gomez, Matthias Seibold, Michael Sommersperger, Tianyu Song, Alexander Winkler, Kevin Yu and Ulrich Eck
J. Imaging 2023, 9(1), 4; https://doi.org/10.3390/jimaging9010004 - 23 Dec 2022
Cited by 7 | Viewed by 4152
Abstract
Three decades after the first set of work on Medical Augmented Reality (MAR) was presented to the international community, and ten years after the deployment of the first MAR solutions into operating rooms, its exact definition, basic components, systematic design, and validation still lack a detailed discussion. This paper defines the basic components of any Augmented Reality (AR) solution and extends them to exemplary Medical Augmented Reality Systems (MARS). We use some of the original MARS applications developed at the Chair for Computer Aided Medical Procedures and deployed into medical schools for teaching anatomy and into operating rooms for telemedicine and surgical guidance throughout the last decades to identify the corresponding basic components. In this regard, the paper is not discussing all past or existing solutions but only aims at defining the principle components and discussing the particular domain modeling for MAR and its design-development-validation process, and providing exemplary cases through the past in-house developments of such solutions.

8 pages, 6592 KiB  
Communication
Remote Training for Medical Staff in Low-Resource Environments Using Augmented Reality
by Austin Hale, Marc Fischer, Laura Schütz, Henry Fuchs and Christoph Leuze
J. Imaging 2022, 8(12), 319; https://doi.org/10.3390/jimaging8120319 - 29 Nov 2022
Cited by 2 | Viewed by 1737
Abstract
This work aims to leverage medical augmented reality (AR) technology to counter the shortage of medical experts in low-resource environments. We present a complete and cross-platform proof-of-concept AR system that enables remote users to teach and train medical procedures without expensive medical equipment or external sensors. By seeing the 3D viewpoint and head movements of the teacher, the student can follow the teacher’s actions on the real patient. Alternatively, it is possible to stream the 3D view of the patient from the student to the teacher, allowing the teacher to guide the student during the remote session. A pilot study of our system shows that it is easy to transfer detailed instructions through this remote teaching system and that the interface is easily accessible and intuitive for users. We provide a performant pipeline that synchronizes, compresses, and streams sensor data through parallel efficiency.

12 pages, 1925 KiB  
Article
Towards a Low-Cost Monitor-Based Augmented Reality Training Platform for At-Home Ultrasound Skill Development
by Marine Y. Shao, Tamara Vagg, Matthias Seibold and Mitchell Doughty
J. Imaging 2022, 8(11), 305; https://doi.org/10.3390/jimaging8110305 - 09 Nov 2022
Cited by 3 | Viewed by 1719
Abstract
Ultrasound education traditionally involves theoretical and practical training on patients or on simulators; however, difficulty accessing training equipment during the COVID-19 pandemic has highlighted the need for home-based training systems. Due to the prohibitive cost of ultrasound probes, few medical students have access to the equipment required for at home training. Our proof of concept study focused on the development and assessment of the technical feasibility and training performance of an at-home training solution to teach the basics of interpreting and generating ultrasound data. The training solution relies on monitor-based augmented reality for displaying virtual content and requires only a marker printed on paper and a computer with webcam. With input webcam video, we performed body pose estimation to track the student’s limbs and used surface tracking of printed fiducials to track the position of a simulated ultrasound probe. The novelty of our work is in its combination of printed markers with marker-free body pose tracking. In a small user study, four ultrasound lecturers evaluated the training quality with a questionnaire and indicated the potential of our system. The strength of our method is that it allows students to learn the manipulation of an ultrasound probe through the simulated probe combined with the tracking system and to learn how to read ultrasounds in B-mode and Doppler mode.

13 pages, 8871 KiB  
Article
HAPPY: Hip Arthroscopy Portal Placement Using Augmented Reality
by Tianyu Song, Michael Sommersperger, The Anh Baran, Matthias Seibold and Nassir Navab
J. Imaging 2022, 8(11), 302; https://doi.org/10.3390/jimaging8110302 - 06 Nov 2022
Cited by 2 | Viewed by 1629
Abstract
Correct positioning of the endoscope is crucial for successful hip arthroscopy. Only with adequate alignment can the anatomical target area be visualized and the procedure be successfully performed. Conventionally, surgeons rely on anatomical landmarks such as bone structure, and on intraoperative X-ray imaging, to correctly place the surgical trocar and insert the endoscope to gain access to the surgical site. One factor complicating the placement is deformable soft tissue, as it can obscure important anatomical landmarks. In addition, the commonly used endoscopes with an angled camera complicate hand–eye coordination and, thus, navigation to the target area. Adjusting for an incorrectly positioned endoscope prolongs surgery time, requires a further incision and increases the radiation exposure as well as the risk of infection. In this work, we propose an augmented reality system to support endoscope placement during arthroscopy. Our method comprises the augmentation of a tracked endoscope with a virtual augmented frustum to indicate the reachable working volume. This is further combined with an in situ visualization of the patient anatomy to improve perception of the target area. For this purpose, we highlight the anatomy that is visible in the endoscopic camera frustum and use an automatic colorization method to improve spatial perception. Our system was implemented and visualized on a head-mounted display. The results of our user study indicate the benefit of the proposed system compared to baseline positioning without additional support, such as an increased alignment speed, improved positioning error and reduced mental effort. The proposed approach might aid in the positioning of an angled endoscope, and may result in better access to the surgical area, reduced surgery time, less patient trauma, and less X-ray exposure during surgery.

14 pages, 5139 KiB  
Article
An Augmented Reality-Based Interaction Scheme for Robotic Pedicle Screw Placement
by Viktor Vörös, Ruixuan Li, Ayoob Davoodi, Gauthier Wybaillie, Emmanuel Vander Poorten and Kenan Niu
J. Imaging 2022, 8(10), 273; https://doi.org/10.3390/jimaging8100273 - 06 Oct 2022
Cited by 5 | Viewed by 2014
Abstract
Robot-assisted surgery is becoming popular in the operating room (OR) for, e.g., orthopedic surgery (among other surgeries). However, robotic executions related to surgical steps cannot simply rely on preoperative plans. Using pedicle screw placement as an example, extra adjustments are needed to adapt to the intraoperative changes when the preoperative planning is outdated. During surgery, adjusting a surgical plan is non-trivial and typically rather complex since the available interfaces used in current robotic systems are not always intuitive to use. Recently, thanks to technical advancements in head-mounted displays (HMD), augmented reality (AR)-based medical applications are emerging in the OR. The rendered virtual objects can be overlapped with real-world physical objects to offer intuitive displays of the surgical sites and anatomy. Moreover, the potential of combining AR with robotics is even more promising; however, it has not been fully exploited. In this paper, an innovative AR-based robotic approach is proposed and its technical feasibility in simulated pedicle screw placement is demonstrated. An approach for spatial calibration between the robot and HoloLens 2 without using an external 3D tracking system is proposed. The developed system offers an intuitive AR–robot interaction approach between the surgeon and the surgical robot by projecting the current surgical plan to the surgeon for fine-tuning and transferring the updated surgical plan immediately back to the robot side for execution. A series of bench-top experiments were conducted to evaluate system accuracy and human-related errors. A mean calibration error of 3.61 mm was found. The overall target pose error was 3.05 mm in translation and 1.12° in orientation. The average execution time for defining a target entry point intraoperatively was 26.56 s. This work offers an intuitive AR-based robotic approach, which could facilitate robotic technology in the OR and boost synergy between AR and robots for other medical applications.

16 pages, 7479 KiB  
Article
AR-Supported Supervision of Conditional Autonomous Robots: Considerations for Pedicle Screw Placement in the Future
by Josefine Schreiter, Danny Schott, Lovis Schwenderling, Christian Hansen, Florian Heinrich and Fabian Joeres
J. Imaging 2022, 8(10), 255; https://doi.org/10.3390/jimaging8100255 - 21 Sep 2022
Viewed by 1615
Abstract
Robotic assistance is applied in orthopedic interventions for pedicle screw placement (PSP). While current robots do not act autonomously, they are expected to have higher autonomy under surgeon supervision in the mid-term. Augmented reality (AR) is promising to support this supervision and to enable human–robot interaction (HRI). To outline a futuristic scenario for robotic PSP, the current workflow was analyzed through literature review and expert discussion. Based on this, a hypothetical workflow of the intervention was developed, which additionally contains the analysis of the necessary information exchange between human and robot. A video see-through AR prototype was designed and implemented. A robotic arm with an orthopedic drill mock-up simulated the robotic assistance. The AR prototype included a user interface to enable HRI. The interface provides data to facilitate understanding of the robot’s ”intentions”, e.g., patient-specific CT images, the current workflow phase, or the next planned robot motion. Two-dimensional and three-dimensional visualization illustrated patient-specific medical data and the drilling process. The findings of this work contribute a valuable approach in terms of addressing future clinical needs and highlighting the importance of AR support for HRI.

Other


10 pages, 933 KiB  
Concept Paper
Translation of Medical AR Research into Clinical Practice
by Matthias Seibold, José Miguel Spirig, Hooman Esfandiari, Mazda Farshad and Philipp Fürnstahl
J. Imaging 2023, 9(2), 44; https://doi.org/10.3390/jimaging9020044 - 14 Feb 2023
Cited by 1 | Viewed by 1411
Abstract
Translational research is aimed at turning discoveries from basic science into results that advance patient treatment. The translation of technical solutions into clinical use is a complex, iterative process that involves different stages of design, development, and validation, such as the identification of unmet clinical needs, technical conception, development, verification and validation, regulatory matters, and ethics. For this reason, many promising technical developments at the interface of technology, informatics, and medicine remain research prototypes without finding their way into clinical practice. Augmented reality is a technology that is now making its breakthrough into patient care, even though it has been available for decades. In this work, we explain the translational process for Medical AR devices and present associated challenges and opportunities. To the best knowledge of the authors, this concept paper is the first to present a guideline for the translation of medical AR research into clinical practice.