The Roles of the Collaborative eXtended Reality in the New Social Era

A special issue of Journal of Imaging (ISSN 2313-433X). This special issue belongs to the section "Mixed, Augmented and Virtual Reality".

Deadline for manuscript submissions: closed (31 January 2023) | Viewed by 5557

Special Issue Editors


Dr. Nicola Capece
Guest Editor
Department of Mathematics, Computer Science, and Economics, University of Basilicata, Viale dell’Ateneo Lucano 10, 85100 Potenza, Italy
Interests: computer graphics; deep learning; extended reality (XR); human-computer interaction

Prof. Dr. Ugo Erra
Guest Editor
Department of Mathematics, Computer Science, and Economics, University of Basilicata, Potenza, Italy
Interests: computer graphics; virtual reality; mixed reality

Prof. Dr. Lucio Tommaso De Paolis
Guest Editor

Dr. Giuseppe Caggianese
Guest Editor
Institute for High Performance Computing and Networking, National Research Council of Italy, 80131 Napoli, Italy
Interests: human-computer interaction (HCI); eXtended Reality (XR); artificial intelligence

Special Issue Information

Dear Colleagues,

The last few years have been shaped by the COVID-19 pandemic. This health emergency accelerated the evolution of technologies designed to reduce the perceived distance between people. Consequently, new collaborative tools emerged to support the social sphere from various points of view, e.g., work, leisure, and education. However, classical collaborative tools based on remote video meetings offer a poor feeling of presence. eXtended Reality (XR) has the potential to close this gap by encompassing all the technologies that remove the boundaries between the digital and the physical world.

XR has aided the development of solutions based on Virtual Reality, Augmented Reality, and Mixed Reality that have transformed how people interact, work, and communicate. Today, the term XR is closely related to the Metaverse concept, a collaborative and content-authoring platform that encompasses social media, video games, virtual meeting rooms, meeting places, and more. Furthermore, XR has allowed a paradigm such as spatial computing to evolve rapidly. Indeed, by scanning real environments with tools based on photogrammetry, laser scanning, LiDAR, RGB-D cameras, and so on, computers become increasingly aware of physical environments and can use this knowledge to blend the digital and real worlds and to connect users through innovative interaction methods, as illustrated by the sketch below.
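To make the spatial-computing idea concrete, the following TypeScript fragment is a minimal sketch of the primitive on which such environment awareness is built: back-projecting an RGB-D depth map into a camera-space point cloud with a pinhole camera model. The intrinsics, image size, and array layout are illustrative assumptions, not values from any particular sensor or from the papers in this Special Issue.

```typescript
// Minimal sketch: back-projecting a depth map into a 3D point cloud with a
// pinhole camera model -- the basic primitive spatial computing builds on.
// The intrinsics below (fx, fy, cx, cy) are illustrative, not device values.

interface Intrinsics { fx: number; fy: number; cx: number; cy: number; }

/** Convert one depth pixel (u, v) with depth z (metres) to a camera-space point. */
function backproject(u: number, v: number, z: number, k: Intrinsics): [number, number, number] {
  const x = ((u - k.cx) / k.fx) * z;
  const y = ((v - k.cy) / k.fy) * z;
  return [x, y, z];
}

/** Turn a whole depth image into a point cloud, skipping invalid (zero) depths. */
function depthToPointCloud(depth: Float32Array, width: number, height: number, k: Intrinsics) {
  const points: [number, number, number][] = [];
  for (let v = 0; v < height; v++) {
    for (let u = 0; u < width; u++) {
      const z = depth[v * width + u];
      if (z > 0) points.push(backproject(u, v, z, k));
    }
  }
  return points;
}

// Example with hypothetical intrinsics for a 640x480 RGB-D sensor.
const k: Intrinsics = { fx: 525, fy: 525, cx: 319.5, cy: 239.5 };
```

Real pipelines then register such point clouds across frames and devices to obtain a shared model of the physical space that digital content can be anchored to.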

This Special Issue aims to communicate and disseminate recent tools, methodologies, and pipelines related to collaborative and/or authoring XR-based platforms. The goal is to show how XR can radically transform social interaction in any context. We specifically welcome papers containing original and mature research on XR applications based on collaboration and content authoring. A particular focus is given to XR-based spatial computing technologies and their applicability in the most varied social contexts (e.g., work, entertainment, leisure, and education).

The proposed Special Issue aligns with several aims of the MDPI Journal of Imaging, in particular:

  • Original research studies of XR and real-time visualization
  • Computer graphics and animation, human-computer interaction
  • Gaming

The topics of interest include:

  • XR-based collaborative architectures
  • Collaborative authoring tools
  • New models to evaluate the impact of XR
  • Human-computer interaction techniques for collaborative environments
  • Spatial computing technologies for shared scene creation
  • Emerging technologies and applications for collaborative XR-based environments
  • Proof-of-concept in collaborative environments: experimental prototyping and testbeds

Dr. Nicola Capece
Prof. Dr. Ugo Erra
Prof. Dr. Lucio Tommaso De Paolis
Dr. Giuseppe Caggianese
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to the website. Once registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, as well as short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Journal of Imaging is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1800 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Published Papers (2 papers)

Research

15 pages, 27462 KiB  
Article
A Collaborative Virtual Walkthrough of Matera’s Sassi Using Photogrammetric Reconstruction and Hand Gesture Navigation
by Nicla Maria Notarangelo, Gilda Manfredi and Gabriele Gilio
J. Imaging 2023, 9(4), 88; https://doi.org/10.3390/jimaging9040088 - 21 Apr 2023
Cited by 4 | Viewed by 1544
Abstract
The COVID-19 pandemic has underscored the need for real-time, collaborative virtual tools to support remote activities across various domains, including education and cultural heritage. Virtual walkthroughs provide a potent means of exploring, learning about, and interacting with historical sites worldwide. Nonetheless, creating realistic and user-friendly applications poses a significant challenge. This study investigates the potential of collaborative virtual walkthroughs as an educational tool for cultural heritage sites, with a focus on the Sassi of Matera, a UNESCO World Heritage Site in Italy. The virtual walkthrough application, developed using RealityCapture and Unreal Engine, leveraged photogrammetric reconstruction and deep learning-based hand gesture recognition to offer an immersive and accessible experience, allowing users to interact with the virtual environment using intuitive gestures. A test with 36 participants resulted in positive feedback regarding the application’s effectiveness, intuitiveness, and user-friendliness. The findings suggest that virtual walkthroughs can provide precise representations of complex historical locations, promoting tangible and intangible aspects of heritage. Future work should focus on expanding the reconstructed site, enhancing the performance, and assessing the impact on learning outcomes. Overall, this study highlights the potential of virtual walkthrough applications as a valuable resource for architecture, cultural heritage, and environmental education. Full article
(This article belongs to the Special Issue The Roles of the Collaborative eXtended Reality in the New Social Era)
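The gesture-driven navigation described above was implemented by the authors in Unreal Engine with RealityCapture; as a rough browser-side analogue only, the sketch below uses MediaPipe Hands in TypeScript to map a detected pinch gesture to camera movement. The CDN path, the pinch threshold, and the moveCameraForward helper are illustrative assumptions, not part of the paper's pipeline.

```typescript
// Hedged sketch of gesture-based walkthrough navigation in the browser with
// MediaPipe Hands. This is NOT the authors' Unreal Engine implementation --
// just an analogous illustration of hand poses driving camera movement.
import { Hands, Results } from '@mediapipe/hands';

// Assumes a <video id="webcam"> element already streaming the user's camera.
const video = document.querySelector<HTMLVideoElement>('#webcam')!;

const hands = new Hands({
  locateFile: (file) => `https://cdn.jsdelivr.net/npm/@mediapipe/hands/${file}`,
});
hands.setOptions({ maxNumHands: 1, minDetectionConfidence: 0.7, minTrackingConfidence: 0.5 });

hands.onResults((results: Results) => {
  const landmarks = results.multiHandLandmarks?.[0];
  if (!landmarks) return;
  // Landmarks 4 and 8 are the thumb and index fingertips; a small distance
  // between them ("pinch") is mapped here to "move the camera forward".
  const [thumb, index] = [landmarks[4], landmarks[8]];
  const pinch = Math.hypot(thumb.x - index.x, thumb.y - index.y) < 0.05;
  if (pinch) moveCameraForward(0.1); // hypothetical walkthrough control
});

// Feed webcam frames to the model on each animation tick.
async function tick() {
  await hands.send({ image: video });
  requestAnimationFrame(tick);
}

/** Placeholder: a real application would translate the walkthrough camera here. */
function moveCameraForward(step: number) { /* ... */ }
```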

20 pages, 5345 KiB  
Article
Environment-Aware Rendering and Interaction in Web-Based Augmented Reality
by José Ferrão, Paulo Dias, Beatriz Sousa Santos and Miguel Oliveira
J. Imaging 2023, 9(3), 63; https://doi.org/10.3390/jimaging9030063 - 8 Mar 2023
Cited by 3 | Viewed by 2910
Abstract
This work presents a novel framework for web-based environment-aware rendering and interaction in augmented reality based on WebXR and three.js. It aims at accelerating the development of device-agnostic Augmented Reality (AR) applications. The solution allows for a realistic rendering of 3D elements, handles geometry occlusion, casts shadows of virtual objects onto real surfaces, and provides physics interaction with real-world objects. Unlike most existing state-of-the-art systems that are built to run on a specific hardware configuration, the proposed solution targets the web environment and is designed to work on a vast range of devices and configurations. Our solution can use monocular camera setups with depth data estimated by deep neural networks or, when available, use higher-quality depth sensors (e.g., LiDAR, structured light) that provide a more accurate perception of the environment. To ensure consistency in the rendering of the virtual scene, a physically based rendering pipeline is used in which physically correct attributes are associated with each 3D object; combined with lighting information captured by the device, this enables the rendering of AR content matching the environment illumination. All these concepts are integrated and optimized into a pipeline capable of providing a fluid user experience even on mid-range devices. The solution is distributed as an open-source library that can be integrated into existing and new web-based AR projects. The proposed framework was evaluated and compared in terms of performance and visual features with two state-of-the-art alternatives. Full article
(This article belongs to the Special Issue The Roles of the Collaborative eXtended Reality in the New Social Era)
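As a hedged illustration of the environment-aware web AR concepts above (hit testing against real surfaces, with depth sensing and light estimation as optional features), the following TypeScript fragment uses WebXR and three.js. It is a minimal sketch, not code from the authors' open-source library; the feature names follow the WebXR specification, while the scene contents are placeholders.

```typescript
// Minimal WebXR + three.js AR sketch: request an AR session with hit testing
// and snap a virtual cube onto detected real-world surfaces each frame.
import * as THREE from 'three';

const renderer = new THREE.WebGLRenderer({ antialias: true, alpha: true });
renderer.xr.enabled = true;

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera();
const cube = new THREE.Mesh(
  new THREE.BoxGeometry(0.1, 0.1, 0.1),
  new THREE.MeshStandardMaterial({ color: 0x7777ff }) // physically based material
);
scene.add(cube);

// Must be called from a user gesture (e.g., a button click) to start AR.
async function startAR() {
  const session = await navigator.xr!.requestSession('immersive-ar', {
    requiredFeatures: ['hit-test'],
    // Optional features for occlusion and illumination matching, when supported.
    optionalFeatures: ['depth-sensing', 'light-estimation'],
    depthSensing: { usagePreference: ['cpu-optimized'], dataFormatPreference: ['luminance-alpha'] },
  });
  await renderer.xr.setSession(session);

  const viewerSpace = await session.requestReferenceSpace('viewer');
  const hitTestSource = await session.requestHitTestSource!({ space: viewerSpace });
  const refSpace = renderer.xr.getReferenceSpace()!;

  renderer.setAnimationLoop((_time, frame?: XRFrame) => {
    if (frame) {
      const hits = frame.getHitTestResults(hitTestSource);
      const pose = hits[0]?.getPose(refSpace);
      if (pose) {
        // Place the virtual cube on the detected real-world surface.
        const { x, y, z } = pose.transform.position;
        cube.position.set(x, y, z);
      }
    }
    renderer.render(scene, camera);
  });
}
```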
