Extended Reality: A New Way of Interacting with the World

A special issue of Information (ISSN 2078-2489). This special issue belongs to the section "Information and Communications Technology".

Deadline for manuscript submissions: closed (30 November 2023) | Viewed by 6549

Special Issue Editors


Guest Editor
Dipartimento di Automatica e Informatica, Politecnico di Torino, C.so Duca degli Abruzzi 24, I-10129 Torino, Italy
Interests: computer graphics; virtual and augmented reality; human-machine interaction

Special Issue Information

Dear Colleagues,

Extended reality (XR) is a technological umbrella that brings together virtual reality (VR), augmented reality (AR) and mixed reality (MR) technologies.

This technology extends the real world by adding innovative functionalities that allow users to perceive their surrounding reality in a different, enhanced way, either simulating the real world or extending it with virtual elements. Virtual content is blended with reality, and the user can interact with both the real environment and virtual objects.

Extended reality has great potential in many sectors, ranging from medicine and surgery to cultural heritage, industry, tourism, retail and marketing, education, and gaming; its growing diffusion is demonstrated by the increasing investments made by companies.

However, several challenges must still be overcome before extended reality can become part of our daily lives.

This Special Issue solicits submissions of high-quality original research papers on any aspect and application of extended reality.

All authors of papers presented at the 1st International Conference on eXtended Reality (XR Salento 2022) are invited to submit an extended version of their work. All submitted papers will undergo the standard peer-review procedure.

Submitted manuscripts should not have been previously published, nor be under consideration for publication elsewhere. The corresponding conference paper should be cited and noted in the submitted manuscript. Authors of invited papers should be aware that the final submitted manuscript must provide a minimum of 50% new content and must not reproduce more than 30% of the proceedings paper verbatim.

Prof. Dr. Lucio Tommaso De Paolis
Prof. Dr. Andrea Sanna
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Information is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • XR in medicine and surgery
  • XR in rehabilitation
  • XR in education
  • XR in cultural heritage and arts
  • XR in industry
  • XR in the military
  • XR in marketing
  • psychological issues of integrating virtual objects into the real world

Published Papers (3 papers)

Research

19 pages, 8948 KiB  
Article
Sharing Augmented Reality between a Patient and a Clinician for Assessment and Rehabilitation in Daily Living Activities
by Mariolino De Cecco, Alessandro Luchetti, Isidro Butaslac III, Francesco Pilla, Giovanni Maria Achille Guandalini, Jacopo Bonavita, Monica Mazzucato and Kato Hirokazu
Information 2023, 14(4), 204; https://doi.org/10.3390/info14040204 - 27 Mar 2023
Cited by 1 | Viewed by 1569
Abstract
In rehabilitation settings that exploit Mixed Reality, a clinician risks losing empathy with the patient when the two are immersed in different worlds, real and/or virtual. While the patient perceives the rehabilitation stimuli in a mixed real–virtual world, the physician is only immersed in the real part. In rehabilitation, this may make it impossible for the clinician to intervene; in skill assessment, it may make evaluation difficult. To overcome this limitation, we propose an innovative Augmented Reality (AR) framework for rehabilitation and skill assessment in clinical settings. Data acquired by a distributed sensor network are used to feed a "shared AR" environment so that both therapists and end-users can effectively operate and perceive it, taking into account the specific interface requirements of each user category: (1) for patients, simplicity, immersiveness, engagement and focus on the task; (2) for clinicians/therapists, contextualization and natural interaction with the whole set of data linked to the users' performance in real time. This framework has strong potential in Occupational Therapy (OT) but also in physical, psychological, and neurological rehabilitation. Hybrid real and virtual environments may be quickly developed and personalized to match end users' abilities and emotional and physiological states and to evaluate nearly all relevant performances, thus augmenting the clinical eye of the therapist and the clinician–patient empathy. In this paper, we describe a practical exploitation of the proposed framework in OT: setting up the table for eating. Both a therapist and a user wear a Microsoft HoloLens 2. First, the therapist sets up the table with virtual furniture. Next, the user places the corresponding real objects so that they match the virtual ones (also in shape) as closely as possible. The therapist's view is augmented during the test with estimated motion, balance, and physiological cues. Once the training is completed, the therapist automatically perceives the deviations in the position and attitude of each object and the elapsed time. We used a camera-based localization algorithm achieving an accuracy of 5 mm for position and 1° for rotation, at a 95% confidence level. The framework was designed and tested in collaboration with clinical experts of the Villa Rosa rehabilitation hospital in Pergine (Italy), involving both patients and healthy users to demonstrate the effectiveness of the designed architecture and the significance of the analyzed parameters between healthy users and patients.
(This article belongs to the Special Issue Extended Reality: A New Way of Interacting with the World)
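The quantitative core of the described exercise is a per-object comparison between the pose of each real object placed by the patient and the pose of its virtual target arranged by the therapist. The following is a minimal sketch of that comparison only, not the authors' implementation; it assumes object poses are already available from a localization step, and the numeric values are hypothetical.

# Minimal sketch (not the authors' code): position/attitude deviation of a placed
# real object with respect to its virtual target, as reported to the therapist.
import numpy as np
from scipy.spatial.transform import Rotation as R

def pose_deviation(target_pos, target_quat, placed_pos, placed_quat):
    """Return (position error in metres, attitude error in degrees)."""
    pos_err = np.linalg.norm(np.asarray(placed_pos) - np.asarray(target_pos))
    # Geodesic angle of the relative rotation between placed and target attitudes.
    rel = R.from_quat(placed_quat) * R.from_quat(target_quat).inv()
    return pos_err, np.degrees(rel.magnitude())

# Hypothetical example: a virtual plate vs. the real plate the user placed
# (quaternions in x, y, z, w order, as SciPy expects).
pos_err, att_err = pose_deviation(
    target_pos=[0.30, 0.00, 0.45], target_quat=[0.0, 0.0, 0.0, 1.0],
    placed_pos=[0.31, 0.01, 0.44],
    placed_quat=R.from_euler("z", 3, degrees=True).as_quat())
print(f"position deviation: {pos_err * 1000:.1f} mm, attitude deviation: {att_err:.1f} deg")

With the reported localization accuracy of roughly 5 mm and 1°, deviations well above these thresholds can reasonably be attributed to the user's placement rather than to tracking error.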

18 pages, 5973 KiB  
Article
Comparison of Point Cloud Registration Algorithms for Mixed-Reality Cross-Device Global Localization
by Alexander Osipov, Mikhail Ostanin and Alexandr Klimchik
Information 2023, 14(3), 149; https://doi.org/10.3390/info14030149 - 24 Feb 2023
Cited by 1 | Viewed by 1634
Abstract
State-of-the-art approaches for localization and mapping are based on local features in images. Along with these features, modern augmented and mixed-reality devices enable building a mesh of the surrounding space. Using this mesh map, we can solve the problem of cross-device localization. This approach is independent of the type of feature descriptors and SLAM used onboard the AR/MR device. The mesh can be reduced to a point cloud by keeping only its vertices. We analyzed and compared different point cloud registration methods applicable to the problem. In addition, we proposed a new pipeline, the Feature Inliers Graph Registration Approach (FIGRA), for the co-localization of AR/MR devices using point clouds. The comparative analysis of Go-ICP, Bayesian-ICP, FGR, Teaser++, and FIGRA shows that feature-based methods are more robust and faster than ICP-based methods. Through an in-depth comparison of the feature-based methods with the usual fast point feature histogram and the new weighted height image descriptor, we found that FIGRA performs better due to its effective graph-theoretic base. The proposed pipeline allows one to match point clouds in complex real scenarios with low overlap and sparse point density.
(This article belongs to the Special Issue Extended Reality: A New Way of Interacting with the World)
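FIGRA itself is presented in the paper and is not reproduced here, but the feature-based global registration it is compared against (FPFH descriptors followed by Fast Global Registration) can be sketched with Open3D. The snippet below is an illustration of that baseline under the assumption that each device's mesh has already been converted to a point cloud; the voxel size and file names are placeholder values.

# Illustrative sketch of FPFH + Fast Global Registration (one of the compared
# baselines), not the FIGRA pipeline itself.
import open3d as o3d

def preprocess(pcd, voxel=0.05):
    # Downsample the mesh-derived point cloud and compute FPFH descriptors.
    down = pcd.voxel_down_sample(voxel)
    down.estimate_normals(o3d.geometry.KDTreeSearchParamHybrid(radius=2 * voxel, max_nn=30))
    fpfh = o3d.pipelines.registration.compute_fpfh_feature(
        down, o3d.geometry.KDTreeSearchParamHybrid(radius=5 * voxel, max_nn=100))
    return down, fpfh

def global_register(source, target, voxel=0.05):
    # Align one device's point cloud to another device's map (cross-device localization).
    src, src_fpfh = preprocess(source, voxel)
    tgt, tgt_fpfh = preprocess(target, voxel)
    option = o3d.pipelines.registration.FastGlobalRegistrationOption(
        maximum_correspondence_distance=1.5 * voxel)
    return o3d.pipelines.registration.registration_fgr_based_on_feature_matching(
        src, tgt, src_fpfh, tgt_fpfh, option)

# Usage (file names are hypothetical); result.transformation maps the source
# device's frame into the target device's frame.
# result = global_register(o3d.io.read_point_cloud("device_a.ply"),
#                          o3d.io.read_point_cloud("device_b.ply"))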

21 pages, 1751 KiB  
Article
Freehand Gestural Selection with Haptic Feedback in Wearable Optical See-Through Augmented Reality
by Gang Wang, Gang Ren, Xinye Hong, Xun Peng, Wenbin Li and Eamonn O’Neill
Information 2022, 13(12), 566; https://doi.org/10.3390/info13120566 - 02 Dec 2022
Cited by 2 | Viewed by 1867
Abstract
Augmented reality (AR) technologies can blend digital and physical space and serve a variety of applications intuitively and effectively. Specifically, wearable AR enabled by optical see-through (OST) AR head-mounted displays (HMDs) can provide users with a direct view of the physical environment containing digital objects. In addition, users can directly interact with three-dimensional (3D) digital artefacts using freehand gestures captured by OST HMD sensors. However, as an emerging user interaction paradigm, freehand interaction with OST AR still requires further investigation to improve user performance and satisfaction. Thus, we conducted two studies to investigate various freehand selection design aspects in OST AR, including target placement, size, distance, position, and haptic feedback on the hand and body. The user evaluation results indicated that 40 cm might be an appropriate target distance for freehand gestural selection. A large target size might lower the selection time and error rate, while a small target size could minimise selection effort. The targets positioned in the centre are the easiest to select, while those in the corners require extra time and effort. Furthermore, we discovered that haptic feedback on the body could lead to high user preference and satisfaction. Based on the research findings, we conclude with design recommendations for effective and comfortable freehand gestural interaction in OST AR.
(This article belongs to the Special Issue Extended Reality: A New Way of Interacting with the World)
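The design variables examined in the studies (target distance of about 40 cm, target size, and position in a grid) can be captured by a very simple selection model: lay spherical targets out in front of the user and test whether a tracked pinch point falls inside one. The sketch below is such an illustrative model only, not the experimental software; the grid spacing and selection radius are hypothetical values.

# Illustrative sketch (not the study's software): a 3x3 grid of spherical targets
# about 40 cm in front of the user, hit-tested against a freehand pinch point.
from dataclasses import dataclass
import numpy as np

@dataclass
class Target:
    centre: np.ndarray  # metres, in a head-anchored frame (+z pointing forward)
    radius: float       # target "size", treated here as a selection radius

def make_grid(distance=0.40, spacing=0.12, radius=0.03):
    return [Target(np.array([dx, dy, distance]), radius)
            for dy in (-spacing, 0.0, spacing)
            for dx in (-spacing, 0.0, spacing)]

def select(pinch_point, targets):
    # Return the index of the first target whose sphere contains the pinch point.
    for i, t in enumerate(targets):
        if np.linalg.norm(np.asarray(pinch_point) - t.centre) <= t.radius:
            return i
    return None  # no selection; a real system would also drive haptic feedback here

targets = make_grid()
print(select([0.01, 0.00, 0.41], targets))  # -> 4, the central (easiest) target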
