Article

Digital Interaction with Physical Museum Artifacts

by Andreas Pattakos 1, Emmanouil Zidianakis 1, Michalis Sifakis 1, Michalis Roulios 1, Nikolaos Partarakis 1,* and Constantine Stephanidis 1,2

1 Institute of Computer Science, Foundation for Research and Technology Hellas (ICS-FORTH), N. Plastira 100, Vassilika Vouton, 70013 Heraklion, Greece
2 Computer Science Department, University of Crete, Voutes Campus, 70013 Heraklion, Greece
* Author to whom correspondence should be addressed.
Technologies 2023, 11(3), 65; https://doi.org/10.3390/technologies11030065
Submission received: 16 March 2023 / Revised: 4 April 2023 / Accepted: 15 April 2023 / Published: 25 April 2023
(This article belongs to the Special Issue Immersive Technologies and Applications on Arts, Culture and Tourism)

Abstract

In the digital information world, visualizing information in public spaces has been implemented in various formats and application contexts, such as advertisement, the provision of useful information, and the provision of critical information in cases of accidents, natural disasters, etc. Among the different types of information displays, this research work focuses on those that extend the experience of people visiting cultural heritage institutions. To this end, the design and implementation of an interactive display case that aims to overcome the “non-touch policy” of museums are presented. This novel display allows visitors to engage with artifacts and information through touch-based interaction, with the ambition of extending the target audience and impact of museum content. The conducted study demonstrates that the interactive display case is an effective solution for providing relevant information to visitors, enhancing their engagement with exhibits, and improving their overall experience. The proposed solution is user-friendly, engaging, and informative, making it ideal for museums and other public exhibit spaces.

1. Introduction

Providing sufficient information in the context of a museum visit is particularly important for museums whose objects are not self-explanatory or that host artifacts connected to the social and historic context of a social group [1]. At the same time, museums can be considered places where the diversity of the target visitors is extremely high in terms of background, cultural knowledge, attitude towards the visit, age, language, etc. [2,3,4]. This has resulted in the proposal of several forms of information personalization to fine-tune the services of the museum to the individual visitor’s profile [5,6,7,8]. Lately, such possibilities have also been explored in more open museum experiences, such as open-air museums and ecomuseums, where the combination of mobile devices, localization, and recommendation systems may provide novel forms of personalization [9,10].
An alternative to profile-based content personalization is to motivate museum visitors to interact with information and digital content through various forms of immersive experiences (e.g., [11,12,13]). These experiences can augment existing museum content and artifacts or provide alternative content [14]. At the same time, a new trend is to provide novel experiences enhanced with storytelling and narratives [15], inviting the visitor to engage in the museum metaverse [16].
In this work, we provide an overview of the design and implementation of a new form of display for museum artifacts. The main objective of the display is to connect user interaction with the artifacts with storytelling content that reveals the social and historic context of each artifact. To generalize this objective, the proposed display comes with an authoring environment and a multimedia content rendering engine that support its easy adaptation to any form of museum artifact.
In our use case, biographical artifacts relating to the personalities of the Greek Revolution were used in the context of an installation for the great anniversary exhibition for the Revolution of 1821 entitled REVOLUTION ’21 REFRAMED, organized by the National Historical Museum of Greece. The rest of this article is structured as follows. In Section 2, we present background and related work on information displays and interactive museum artifacts, highlighting the contributions of this research work. The design of the physical enclosure for the ICT equipment is presented in Section 3, followed by the implementation of the software in Section 4. The validation and evaluation of the system before its official installation are presented in Section 5. The article concludes with a discussion of the results of this work.

2. Background and Related Work

2.1. Information Displays

Digital signage brings together various market players, all of which have different objectives and expected benefits. For this reason, it has been a domain of research in economic science. In this context, researchers are proposing frameworks for digital signage that would allow the development of various business strategies and associated business values [17]. At the same time, making such information displays interactive has been studied, since most of these displays merely provide information rather than support interaction with it [18].
The study of user motivation, content consumption, and engagement is directly relevant to the usage of information displays for advertisement, product display, and the many forms of information provision found in airports, train stations, etc. In this context, past research has shown that audience expectations towards what is presented on public displays can correlate with their attention towards these displays. Similar to the effect of banner blindness on the Web, displays for which users expect uninteresting content are often ignored [19,20,21]. Other researchers have highlighted the fact that digital signage messages that contain aesthetically pleasing sensory-affective cues are more effective than those that present more functional content [22].
At the same time, the types and form factors of such displays seem to have an impact on their utility. For example, chained displays have been proposed as a combination of several screens to create different form factors for interactive public displays [23]. More provocative types are also emerging, such as free-standing displays [24,25,26,27], personal displays [28], mobile devices [29,30,31], interactive shopping displays [32,33,34], and even battery-free patched wearable displays for opportunistic interactions [35]. Recently, more exotic forms of information displays have come under research to support the full-color holographic presentation of visual information [36,37].

2.2. Interactive Information Displays

Recently, a significant amount of research work has focused on the domain of information visualization, mainly on alternative means and strategies for visualizing big data [38,39,40,41,42,43,44]. At the same time, the type of visualization and the visualization medium have proven to be extremely important, and several attempts have been made to utilize different presentation mediums, display formats, and interaction types [45,46,47,48,49,50]. Inevitably, interaction with such data is required for the user to submit data queries and to view and interact with results in a more attractive way [51,52,53,54,55]. At the same time, in the case of public information displays, privacy is also important [56].

2.3. Interactive Museum Artifacts

In museums today, digital technologies are increasingly integrated into diverse practices of collection and collections management, information management, curating, exhibiting, and educating [57]. This need was made more urgent by the COVID-19 pandemic [58,59]. At the same time, especially in the domains of exhibiting and educating, modern methods based on interaction design, interactive storytelling, and artificial intelligence have been employed, and paradigms for museum experience design have been proposed [60,61,62]. A survey on virtual museums has identified the following technology types of the virtual museum [63,64]: (a) enhanced imaging [65], (b) virtual reality exhibitions [66,67,68,69,70], (c) augmented reality (AR) and web-based AR exhibitions (e.g., [71,72,73,74]), (d) Web3D exhibitions (e.g., [75,76,77,78,79]), (e) mixed reality (MR) exhibitions [80,81,82,83,84,85], (f) haptics (e.g., [86,87,88,89,90]), (g) mobile devices in museums (e.g., [91,92,93,94,95,96]), and (h) accessible virtual museums (e.g., [97,98,99,100]). Among these, interactive museum artifacts blend bits and pieces from the aforementioned technologies to provide unique interaction and storytelling experiences (e.g., [101,102,103,104,105,106]). At the same time, museum content is being expanded to support aspects of intangible cultural heritage, including oral tradition, festive events, recipes, social events, and craft practices (e.g., [107,108,109,110,111,112,113,114]).

2.4. Contribution of This Research Work

In this paper, we present the design and implementation of an interactive display case for museum artifacts that breaks the barrier of the “non-touch policy” of museums. Museum artifacts are placed in a glass display, and users can touch the glass right on top of each artifact to select the object about which they wish to get more information. This display is complemented by a large screen where users experience multimedia content relevant to their selected artifact. In this way, users are invited to engage with artifacts and digital information in an interactive learning dialogue. Furthermore, through targeted multimedia content production, we can bind the artifacts with storytelling techniques: in the use case, each artifact is connected with a video production, thus suspending disbelief and providing a transition to the past that reveals the story of each artifact.
The major benefit of the proposed approach compared with other relevant research outcomes is that it is complementary to the museum experience. Most of the relevant works have focused on extending the museum experience by creating another form of digital encounter using various technologies such as AR, MR, VR, etc. The main distinction of this research work, which makes its exploitation easier, is that the museum structure remains principally the same while the physical items of the museum are enhanced with digital dimensions. As such, it can be conceived as a more interactive way of accessing the information found on exhibit labels. Furthermore, instead of proposing a new virtual world or virtual experience, it provides interaction that leads to forms of content which are more easily accepted by museums, such as information videos and photographic documentation.

3. Design of Information Display

Museum displays often provide limited information about the objects on display, making it challenging for visitors to understand the context and significance of exhibits without additional guidance. To overcome these issues, we designed an interactive display case that provides contextual information using a transparent touch glass. The system allows visitors to interact with exhibits through touch and instantly learn more about them. The project was inspired by the need to create an interactive display case for a glass case of antique weapons from the Greek Revolution of 1821.
Regarding the design of the display case, the main requirements that were considered were the robustness of the construction, the integration of technology within the shell of the display, and the self-contained nature of the entire exhibit, so that it can be easily installed and maintained. To this end, the industrial design of the display aimed at creating a common container for both the technology and the display case where the artifacts are placed. In this way, the technology is integrated within the container of the display case and thus does not affect the design aesthetics. To do so, the design of the display is oriented towards generating the negative space needed to host the equipment, as shown in Figure 1a,c.
The design of the display was conducted in CAD software and integrates sizing, materials, analysis into parts, and cutting dimensions, which are straightforward to implement with standard construction methods used in small and medium-sized wood and metal construction workshops (see Figure 1b). The next step was to create a prototype of the display for the functional validation of the concept, to act as a testbed for the software development iterations.

4. Software Implementation

The software of the interactive display case can be considered as two applications: one for the association of content with specific parts of the touch glass case, and another for the visualization of the associated information when a user touches the corresponding part of the exhibit. This approach allows the display to be easily adapted to different exhibits and contents, offering at the same time a seamless interaction experience to visitors.
As shown in the system’s architecture (see Figure 2), a state manager is used to define the state of the system and thus control which application currently has the user’s focus. Based on the focused application, the input controller, a high-level wrapper of the Windows touch device built on top of the device driver of the touch foil, translates touch inputs into touch points or touch areas. Touch areas are used when authoring content areas, while touch points are translated to selections of areas in content presentation mode. With this simple and intuitive architecture, we manage to combine the functionality of the two applications into a single piece of software that can be easily installed, maintained, and used by end users.
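To illustrate this routing, the following minimal C# sketch (assuming a standard Unity project; the class and method names are illustrative, not the authors’ actual implementation) shows how such a state manager could forward touch-foil input either to an authoring component as part of a touch area or to a presentation component as a touch point.

```csharp
using UnityEngine;

public enum DisplayState { Authoring, Presentation }

// Hypothetical stubs for the two applications merged into one Unity project.
public class AuthoringController : MonoBehaviour
{
    public void OnTouch(Vector2 position, TouchPhase phase) { /* accumulate points into a touch area */ }
}

public class PresentationController : MonoBehaviour
{
    public void OnTouchPoint(Vector2 position) { /* resolve the point to an exhibit area */ }
}

// The state manager decides which application currently owns the touch input.
public class StateManager : MonoBehaviour
{
    public DisplayState State = DisplayState.Presentation;
    public AuthoringController authoring;
    public PresentationController presentation;

    void Update()
    {
        if (Input.touchCount == 0) return;
        Touch touch = Input.GetTouch(0); // the touch foil surfaces as a standard Windows touch device

        if (State == DisplayState.Authoring)
            authoring.OnTouch(touch.position, touch.phase);   // authoring consumes touch areas
        else if (touch.phase == TouchPhase.Began)
            presentation.OnTouchPoint(touch.position);        // presentation consumes touch points
    }
}
```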
As analyzed above, having two separate applications for content association and information visualization in the interactive display case can increase overall complexity and installation challenges. To this end, for the implementation we used the Unity 3D game engine [115], which provides a versatile environment that can handle both low- and high-level interaction concepts. Furthermore, by leveraging Unity’s versatile environment, we can merge these two applications into a simpler and more manageable setup. With a general manager implemented in Unity, we can easily monitor and switch between the applications in real time.
In addition to its versatile environment, Unity also includes a device communication layer that enables seamless communication between hardware devices and the software running on the display case. This layer can accurately translate touch controller inputs into x/y coordinates, enabling the display to respond immediately and accurately to user interactions without any added latency. This precise touch input tracking provides visitors with a seamless and intuitive interaction experience, making the exhibit engaging and memorable.
By integrating Unity into the display case, we not only simplify the installation process and improve the user experience but also take advantage of Unity’s vast library of tools and assets to enrich the interaction between the visitor and the exhibit, and gain the flexibility to continue enhancing and evolving the exhibit to meet the ever-changing needs and expectations of our visitors.

4.1. Content Creation

The content creator tool supports the assignment of multimedia content to specific bounding areas of each exhibit within the display case. Creating a new exhibit is as simple as drawing the bounding area on the touch glass case to define the exact position of the exhibit. As the user draws the area, a line renderer follows the touch position to display the drawing. After the area is drawn, the system fills in any gaps and creates a mesh collider of the same size as the area for the content viewer to identify later. The system then prompts the user to provide all the necessary content information, such as file paths for Greek and English content, timers for static images, and any sequential content desired after the initial content is finished. By default, the sequential content is set to the introduction content. The entire workflow is presented in Figure 3.
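A rough sketch of this workflow in Unity C# is given below; all class, field, and method names are illustrative assumptions rather than the authors’ actual code. The finger is traced with a LineRenderer and, when the drawing ends, the area is covered by a collider that carries the content metadata. The real system closes the outline and builds a mesh collider matching the drawn shape, whereas the sketch simplifies this to a box over the drawn bounds.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Content metadata the curator is prompted for after drawing an area.
[System.Serializable]
public class ExhibitContent
{
    public string greekVideoPath;
    public string englishVideoPath;
    public float staticImageSeconds = 10f;        // timer for static images
    public string sequentialContentId = "intro";  // played after the main content finishes
}

// Attached to the per-exhibit area object so the content viewer can find it later.
public class ExhibitAreaData : MonoBehaviour
{
    public ExhibitContent content;
}

public class ExhibitAreaAuthoring : MonoBehaviour
{
    public LineRenderer line;                     // traces the drawing in real time
    private readonly List<Vector3> points = new List<Vector3>();

    public void OnTouch(Vector2 screenPos, TouchPhase phase)
    {
        Vector3 world = Camera.main.ScreenToWorldPoint(new Vector3(screenPos.x, screenPos.y, 1f));

        if (phase == TouchPhase.Began) points.Clear();

        if (phase == TouchPhase.Began || phase == TouchPhase.Moved)
        {
            points.Add(world);
            line.positionCount = points.Count;
            line.SetPosition(points.Count - 1, world);
        }
        else if (phase == TouchPhase.Ended)
        {
            CloseAreaAndCreateCollider();
        }
    }

    void CloseAreaAndCreateCollider()
    {
        if (points.Count == 0) return;

        // Simplification: cover the drawn bounds with a thin box-shaped collider.
        var bounds = new Bounds(points[0], Vector3.zero);
        foreach (var p in points) bounds.Encapsulate(p);

        var area = new GameObject("ExhibitArea");
        area.transform.position = bounds.center;
        var box = area.AddComponent<BoxCollider>();
        box.size = bounds.size + new Vector3(0f, 0f, 0.01f); // give the flat area a small depth

        area.AddComponent<ExhibitAreaData>().content = new ExhibitContent(); // filled in by the prompt UI
    }
}
```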
The system not only allows for the creation and management of exhibit content, but also provides additional useful functions such as language switching. To set up a language switch, the museum curator defines the bounding area on the touch glass case in a similar manner to creating a new exhibit. Instead of content, however, the system assigns an event function that toggles the language between Greek and English. This feature enhances the accessibility and engagement of the display, allowing users to easily switch between languages while exploring the exhibit. With the flexibility and ease of use provided by these system functions, museums can create immersive and interactive exhibits that cater to diverse audiences.
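A language-switch area could then be as simple as the following sketch: instead of content metadata, the drawn area carries an event function that toggles the active language (again, the names are illustrative assumptions).

```csharp
using UnityEngine;

public enum DisplayLanguage { Greek, English }

// Attached to the language-switch area defined by the curator; the content viewer
// invokes Activate() when its raycast hits this area's collider.
public class LanguageSwitchArea : MonoBehaviour
{
    public static DisplayLanguage Current = DisplayLanguage.Greek;

    public void Activate()
    {
        Current = (Current == DisplayLanguage.Greek) ? DisplayLanguage.English : DisplayLanguage.Greek;
        Debug.Log($"Display language switched to {Current}");
    }
}
```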
Overall, the content creator streamlines the process of assigning and managing exhibit content, enabling coordinators to easily create interactive displays with rich multimedia content.

4.2. Content Viewer

The content viewer serves as the primary interface for end users to interact with the exhibits displayed in the case. Its primary functions are to detect user interactions and display the relevant content based on those interactions. When a user touches the touch glass, the content viewer receives a signal containing the precise coordinates of the touch event relative to the glass surface. Using raycasting, the system searches for any exhibit colliders that intersect with the touch point. If an exhibit collider is found, the system retrieves and displays the content associated with that specific area on the secondary display. Once the content has finished playing, the system checks for any sequential content to play. If none is found, the system will display the introduction content instead.
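The selection logic can be sketched as follows, reusing the illustrative types from the sketches above (a simplified outline under those assumptions, not the authors’ actual code): the touch point is converted into a ray, the first exhibit collider it hits determines the content to play in the active language, and the sequential content is queued afterwards.

```csharp
using UnityEngine;

public class ContentViewer : MonoBehaviour
{
    public Camera touchCamera;   // camera aligned with the glass top of the case

    // Called by the state manager with the coordinates of the touch point.
    public void OnTouchPoint(Vector2 screenPos)
    {
        Ray ray = touchCamera.ScreenPointToRay(screenPos);
        if (Physics.Raycast(ray, out RaycastHit hit, 10f))
        {
            // A language-switch area toggles the language instead of playing content.
            var languageSwitch = hit.collider.GetComponent<LanguageSwitchArea>();
            if (languageSwitch != null) { languageSwitch.Activate(); return; }

            var area = hit.collider.GetComponent<ExhibitAreaData>();
            if (area != null) Play(area.content);
        }
    }

    void Play(ExhibitContent content)
    {
        string path = LanguageSwitchArea.Current == DisplayLanguage.Greek
            ? content.greekVideoPath
            : content.englishVideoPath;

        // Placeholder: play the selected clip on the secondary display, then fall back
        // to the sequential content (the introduction loop by default).
        Debug.Log($"Playing {path}; queueing '{content.sequentialContentId}' next");
    }
}
```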
An example of the in-lab set-up of the working prototype is presented in Figure 4.

5. Validation–Evaluation

Creating interactive applications that will be available for use by hundreds of visitors requires that several aspects be considered before the deployment of the solution. With this in mind, the validation of this work was performed in a laboratory environment where an actual prototype of the system was created. The design of the prototype is discussed in Section 3, and for the validation we implemented an actual physical prototype of the installation. This physical prototype acted as the test bench for application developers, and it was critical for testing the functionality before production. In this test bench, all the ICT components were integrated, and several touch-based interaction techniques were evaluated before settling on the touch-film solution, since all computer-vision-based solutions had severe problems due to external lighting, illumination, reflections from the glass, etc. In this context, the test bench was used both for the selection of the appropriate technologies and for the validation of the software. In terms of software validation, it was important to validate the efficiency of the administrative and authoring interface and the quality of recognition in terms of success rates. A critical factor was to test that the bounding box drawn by the curator is translated accurately to regions of the screen and corresponds with the museum artifacts within the display.
The validation process took several iterations, in which different versions of the hardware and software setup were tested. These tests were performed mainly by technical personnel, and the objectives were to validate that the integrated technology provides the appropriate features and that it operates as expected in various conditions. This validation was conducted before evaluating the authoring and interaction application.
Regarding the evaluation of the prototype, this was carried out iteratively using two user groups. The first group comprised usability and user experience experts who evaluated the interactive application based on the Heuristic Evaluation method [116], an expert-based review method that is highly beneficial for eliminating usability problems before testing with representative end users [117,118], also taking into account evaluation methods targeted at the museum experience [119,120]. A small group of evaluators examined the user interfaces to spot violations of established usability principles, commonly known as heuristics [121]. To find heuristic violations or other usability issues pertinent to the system, the evaluators performed multiple iterations. Each evaluator recorded the identified problems and specified the violated principles for each one. Then, all evaluation reports were combined into a single one, addressing each problem exactly once. To prioritize the problems, the evaluators reviewed the combined list and gave each problem a severity rating. The problems’ final severity score was determined by averaging the results of each evaluator.
Results in each iteration highlighted usability problems that should be rectified, ordered by severity. Given the multiple iterations, numerous problems were identified and corrected until a prototype was deemed usable. These evaluation iterations can be considered part of the development process, since they are interwoven with it. Each iteration results in an updated version of the system, and the process continues iteratively until all comments are addressed.
This section summarizes key issues identified during evaluation/lessons learned regarding the interactive display:
  • Buttons or interaction areas that are relevant to a specific context, such as the areas of interaction on the glass, should be easily associated with the specific exhibit to which they refer, by placing them at an appropriate location and ensuring that the diagonal viewing angle of the visitor does not affect his/her perception of the interactive area.
  • There must be some form of feedback from the system upon the selection of an area of interaction. In the earlier versions of the system, this was done silently, resulting in some cases of frustration regarding which area was activated. Auditory feedback was introduced to cope with this issue.
  • Regarding language changes, it was requested that some form of typography be integrated into the surface of the display so that the visitor can easily locate the interaction areas for language switching.
  • The user must be constantly informed regarding which artifact from the physical display is selected, to maintain a constant association between content and artifact. The evaluators proposed that LED lighting be introduced within the display in the form of a grid, so that only the lights within the selected region of a specific artifact are activated.
The second user group represented the users of such an application. For this group, co-workers with expertise in diverse fields were recruited (a social scientist, an anthropologist, a philologist, an English teacher, and a performing arts director). These users were invited to interact with the system, each one assuming the role of the visitor. This evaluation was mainly content-based; the participants experienced the provided interaction and content in order to judge the user experience. We received mainly positive comments on both aspects. All users were very impressed by the new form of interaction and positively judged the quality of the museum content, which provided a storytelling dimension through the dedicated production of historic videos.

6. First Commercial Setup of the Display Case

The first commercial setup of the display case was conducted for the great anniversary exhibition of the Revolution of 1821 entitled REVOLUTION ’21 REFRAMED, organized by the National Historical Museum of Greece in the Old Parliament House in Athens as part of the celebration of 200 years since the beginning of the Greek Revolution. The central anniversary exhibition highlighted the ideas, causes, persons, events, and results of the Greek War of Independence, as they were formed through conflicts and compositions of different interests and traditions.
The display case presented historical weapons and other relics of the Revolution. The success of the system was such that the National Historical Museum decided to include it in the permanent collection of the museum, and thus, since 2021, it has been accessed and used by tens of thousands of visitors.
The design of the final product was an adaptation of the initial design to address the need for a larger display case and a 65-inch display, as shown in Figure 5.

7. Conclusions

In this work, we designed and implemented an interactive display case to overcome the “non-touch policy” of museums and allow visitors to engage with artifacts and information, thus extending the audience and impact of museum content. We consider this the main contribution of this research work, since it keeps the artifacts in focus while augmenting their existence through interactive features linked to further information.
The main solution that was followed during the construction of the display, after experimenting with several computer-vision-based approaches, was to mount a touch film on the glass top of the display and create dedicated software capable of assigning regions of the glass top to physical objects within the display. Then, by touching a region, the multimedia content assigned to it is reproduced. To ensure the reusability of the display case, we performed the industrial design of a solution that can be considered as a generic interactive display case, complemented by authoring software and a content renderer. Thus, the setup of any such display can be performed easily through the content creator app. The resulting design was implemented as a prototype before its first installation, for development purposes and for the validation of the proposed solution in the lab. After the successful validation, the first commercial installation occurred during the great anniversary exhibition for the Revolution of 1821 entitled REVOLUTION ’21 REFRAMED, organized by the National Historical Museum of Greece. We are particularly pleased that, due to the success of this exhibition, the display case was selected as a permanent exhibit and is now part of the main exhibition of the museum.
The design process followed can be considered close to that of a commercial implementation, considering that we followed all the steps of designing a product and performed its validation and proof-of-concept prototyping. Thus, we were able to transform a research effort towards richer, non-obtrusive interaction with museum artifacts into an actual product experienced by thousands of people. In this process, we kept the innovation in the interaction style and project concept while relying on standard ICT equipment, to ensure that the final result could support trouble-free interaction under the restrictions posed by a public information display.
To do so, we implemented an actual prototype of the installation for development, validation, and evaluation purposes. Validation was conducted in the lab to ensure the appropriate functioning of the equipment according to specifications. The evaluation was performed in two phases. The first phase involved usability experts, to improve the interaction with the system and the perceived ease of use. This resulted in several modifications in terms of hardware, software, and setup. The second phase involved a small set of users with different backgrounds, to evaluate the device function and content in simulated museum usage. After concluding both phases, which resulted in the adaptation of the software and hardware parts of the system, the museum installation was conducted. Supplementary Materials in the form of a usage video are available through Zenodo [122].
In concluding this research work, we are confident that the main concepts underlying the creation of an interactive display for museum artifacts, as presented in this paper, will support the further exploitation of the proposed concept by its creators, but also by the cultural and creative industries in the context of new museum applications and services. Regarding future improvements to the display, we are focusing on the following directions. The first regards the enhancement of visual feedback on the selected artifact within the display. The second regards the implementation of a richer UI application, combined with a multi-touch screen, to enhance the information sources that can be assigned to and made interactive for each artifact on display.

Supplementary Materials

The following supporting information can be downloaded at: https://we.tl/t-z399gGd3Nc (accessed on 15 March 2023). A video presenting the final result, as integrated into the main exhibition of the National Historical Museum in Athens, is available at Zenodo [122].

Author Contributions

Conceptualization, E.Z., M.S. and M.R.; methodology, E.Z.; software, A.P.; validation, E.Z., M.S., N.P. and C.S.; formal analysis, E.Z. and M.S.; investigation, E.Z., M.S., N.P. and C.S.; resources, M.S. and M.R.; data curation, M.S.; writing—original draft preparation, N.P., A.P., E.Z., M.S. and C.S.; writing—review and editing, N.P., E.Z., M.R. and C.S.; visualization, A.P., M.R. and E.Z.; supervision, E.Z., M.S., N.P. and C.S.; project administration, E.Z.; funding acquisition, M.S., E.Z., N.P. and C.S. All authors have read and agreed to the published version of the manuscript.

Funding

The project was funded by the National Historical Museum of Greece in the context of the implementation of the great anniversary exhibition for the Revolution of 1821 entitled REVOLUTION ’21 REFRAMED.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Data is available upon request.

Acknowledgments

In this work, we collaborated with the National Historical Museum of Greece in the context of the implementation of the great anniversary exhibition for the Revolution of 1821 entitled REVOLUTION ’21 REFRAMED. The authors would like to thank the National Historical Museum of Greece for their trust and fruitful collaboration in this project.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Booth, B. Understanding the information needs of visitors to museums. Mus. Manag. Curatorship 1998, 17, 139–157. [Google Scholar] [CrossRef]
  2. Antoniou, A.; Katifori, A.; Roussou, M.; Vayanou, M.; Karvounis, M.; Kyriakidi, M.; Pujol-Tost, L. Capturing the Visitor Profile for a Personalized Mobile Museum Experience: An Indirect Approach; Enlighten: Brisbane, Australia, 2016. [Google Scholar]
  3. Lewalter, D.; Phelan, S.; Geyer, C.; Specht, I.; Grüninger, R.; Schnotz, W. Investigating visitor profiles as a valuable addition to museum research. Int. J. Sci. Educ. Part B 2015, 5, 357–374. [Google Scholar] [CrossRef]
  4. Davidson, L.; Sibley, P. Audiences at the “new” museum: Visitor commitment, diversity and leisure at the Museum of New Zealand Te Papa Tongarewa. Visit. Stud. 2011, 14, 176–194. [Google Scholar] [CrossRef]
  5. Kosmopoulos, D.; Styliaras, G. A survey on developing personalized content services in museums. Pervasive Mob. Comput. 2018, 47, 54–77. [Google Scholar] [CrossRef]
  6. Kiourt, C.; Koutsoudis, A.; Kalles, D. Enhanced virtual reality experience in personalised virtual museums. Int. J. Comput. Methods Herit. Sci. (IJCMHS) 2018, 2, 23–39. [Google Scholar] [CrossRef]
  7. Loboda, O.; Nyhan, J.; Mahony, S.; Romano, D.; Terras, M. Content-Based Recommender Systems for Heritage: Developing a Personalised Museum Tour; DSRS-Turing: London, UK, 2019. [Google Scholar]
  8. Partarakis, N.; Antona, M.; Stephanidis, C. Adaptable, personalizable and multi user museum exhibits. In Curating the Digital: Space for Art and Interaction; Springer: Berlin/Heidelberg, Germany, 2016; pp. 167–179. [Google Scholar]
  9. Ivanov, R.; Velkova, V. Delivering Personalized Content to Open-air Museum Visitors Using Geofencing. Digit. Present. Preserv. Cult. Sci. Herit. 2022, 12, 141–150. [Google Scholar] [CrossRef]
  10. Vrettakis, E.; Katifori, A.; Kyriakidi, M.; Koukouli, M.; Boile, M.; Glenis, A.; Ioannidis, Y. Personalization in Digital Ecomuseums: The Case of Pros-Eleusis. Appl. Sci. 2023, 13, 3903. [Google Scholar] [CrossRef]
  11. Chen, S.X.; Wu, H.C.; Huang, X. Immersive experiences in digital exhibitions: The application and extension of the service theater model. J. Hosp. Tour. Manag. 2023, 54, 128–138. [Google Scholar] [CrossRef]
  12. Kocaturk, T.; Mazza, D.; McKinnon, M.; Kaljevic, S. GDOM: An Immersive Experience of Intangible Heritage through Spatial Storytelling. ACM J. Comput. Cult. Herit. 2023, 15, 1–18. [Google Scholar] [CrossRef]
  13. Khalil, S.; Kallmuenzer, A.; Kraus, S. Visiting museums via augmented reality: An experience fast-tracking the digital transformation of the tourism industry. Eur. J. Innov. Manag. 2023; ahead-of-print. [Google Scholar] [CrossRef]
  14. Barbara, J.; Bellini, M.; Makai, P.K.; Sampatakou, D.; Irshad, S.; Koenitz, H. The Sacra infermeria—A focus group evaluation of an augmented reality cultural heritage experience. New Rev. Hypermedia Multimed. 2023, 1–28. [Google Scholar] [CrossRef]
  15. Basaraba, N.; Cauvin, T. Public history and transmedia storytelling for conflicting narratives. Rethink. Hist. 2023, 1–27. [Google Scholar] [CrossRef]
  16. Hutson, J.; Hutson, P. Museums and the Metaverse: Emerging Technologies to Promote Inclusivity and Engagement; Lindenwood University: St Charles, MO, USA, 2023. [Google Scholar]
  17. Bauer, C.; Dohmen, P.; Strauss, C. Interactive digital signage-an innovative service and its future strategies. In Proceedings of the 2011 International Conference on Emerging Intelligent Data and Web Technologies, Tirana, Albania, 7–9 September 2011; IEEE: Piscataway, NJ, USA, 2011; pp. 137–142. [Google Scholar]
  18. Want, R.; Schilit, B.N. Interactive digital signage. Computer 2012, 45, 21–24. [Google Scholar] [CrossRef]
  19. Müller, J.; Wilmsmann, D.; Exeler, J.; Buzeck, M.; Schmidt, A.; Jay, T.; Krüger, A. Display blindness: The effect of expectations on attention towards digital signage. In Proceedings of the Pervasive Computing: 7th International Conference, Pervasive 2009, Nara, Japan, 11–14 May 2009; Proceedings 7. Springer: Berlin/Heidelberg, Germany, 2009; pp. 1–8. [Google Scholar]
  20. Huang, E.M.; Koster, A.; Borchers, J. Overcoming assumptions and uncovering practices: When does the public really look at public displays? In Proceedings of the Pervasive Computing: 6th International Conference, Pervasive 2008, Sydney, Australia, 19–22 May 2008; Proceedings 6. Springer: Berlin/Heidelberg, Germany, 2008; pp. 228–243. [Google Scholar]
  21. Parker, C.; Tomitsch, M.; Kay, J. Does the public still look at public displays? A field observation of public displays in the wild. In Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, Singapore, 8–12 June 2018; Volume 2, pp. 1–24. [Google Scholar]
  22. Dennis, C.; Brakus, J.J.; Gupta, S.; Alamanos, E. The effect of digital signage on shoppers’ behavior: The role of the evoked experience. J. Bus. Res. 2014, 67, 2250–2257. [Google Scholar] [CrossRef]
  23. Ten Koppel, M.; Bailly, G.; Müller, J.; Walter, R. Chained displays: Configurations of public displays can be used to influence actor-, audience-, and passer-by behavior. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Austin, TX, USA, 5–10 May 2012; pp. 317–326. [Google Scholar]
  24. Schneegass, S.; Alt, F.; Scheible, J.; Schmidt, A.; Su, H. Midair displays: Exploring the concept of free-floating public displays. In CHI’14 Extended Abstracts on Human Factors in Computing Systems; ACM Digital Library: New York, NY, USA, 2014; pp. 2035–2040. [Google Scholar]
  25. Scheible, J.; Hoth, A.; Saal, J.; Su, H. Displaydrone: A flying robot based interactive display. In Proceedings of the 2nd ACM International Symposium on Pervasive Displays, Mountain View, CA, USA, 4–5 June 2013; pp. 49–54. [Google Scholar]
  26. Nozaki, H. Flying display: A movable display pairing projector and screen in the air. In CHI’14 Extended Abstracts on Human Factors in Computing Systems; ACM Digital Library: New York, NY, USA, 2014; pp. 909–914. [Google Scholar]
  27. Schneegass, S.; Alt, F.; Scheible, J.; Schmidt, A. Midair displays: Concept and first experiences with free-floating pervasive displays. In Proceedings of the International Symposium on Pervasive Displays, Copenhagen Denmark, 3–4 June 2014; pp. 27–31. [Google Scholar]
  28. Kleinman, L.; Hirsch, T.; Yurdana, M. Exploring mobile devices as personal public displays. In Proceedings of the 17th International Conference on Human-Computer Interaction with Mobile Devices and Services, Copenhagen, Denmark, 24 August 2015; pp. 233–243. [Google Scholar]
  29. Pearson, J.; Robinson, S.; Jones, M. It’s About Time: Smartwatches as public displays. In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, Seoul, Republic of Korea, 18–23 April 2015; pp. 1257–1266. [Google Scholar]
  30. Greenberg, S.; Boyle, M.; LaBerge, J. PDAs and shared public displays: Making personal. Pers. Ubiquitous Comput. 1999, 3, 54–64. [Google Scholar]
  31. Lucero, A.; Holopainen, J.; Jokela, T. MobiComics: Collaborative use of mobile phones and large displays for public expression. In Proceedings of the 14th International Conference on Human-Computer Interaction with Mobile Devices and Services, San Francisco, CA, USA, 21–24 September 2012; pp. 383–392. [Google Scholar]
  32. Alt, F.; Vehns, J. Opportunistic deployments: Challenges and opportunities of conducting public display research at an airport. In Proceedings of the 5th ACM International Symposium on Pervasive Displays, Oulu, Finland, 20–22 June 2016; pp. 106–117. [Google Scholar]
  33. Longo, S.; Kovacs, E.; Franke, J.; Martin, M. Enriching shopping experiences with pervasive displays and smart things. In Proceedings of the 2013 ACM Conference on Pervasive and Ubiquitous Computing Adjunct Publication, Zurich, Switzerland, 8–12 September 2013; pp. 991–998. [Google Scholar]
  34. Lecointre-Erickson, D.; Daucé, B.; Legoherel, P. The influence of interactive window displays on expected shopping experience. Int. J. Retail. Distrib. Manag. 2018, 46, 802–819. [Google Scholar] [CrossRef]
  35. Dierk, C.; Nicholas, M.J.P.; Paulos, E. AlterWear: Battery-free wearable displays for opportunistic interactions. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, Montreal, QC, Canada, 21–26 April 2018; pp. 1–11. [Google Scholar]
  36. Guo, X.; Zhong, J.; Li, B.; Qi, S.; Li, Y.; Li, P.; Zhao, J. Full-color holographic display and encryption with full-polarization degree of freedom. Adv. Mater. 2022, 34, 2103192. [Google Scholar] [CrossRef]
  37. Chuang, C.H.; Chen, C.Y.; Li, S.T.; Chang, H.T.; Lin, H.Y. Miniaturization and image optimization of a full-color holographic display system using a vibrating light guide. Opt. Express 2022, 30, 42129–42140. [Google Scholar] [CrossRef]
  38. Andrienko, G.; Andrienko, N.; Drucker, S.; Fekete, J.D.; Fisher, D.; Idreos, S.; Sharaf, M. Big data visualization and analytics: Future research challenges and emerging applications. In Proceedings of the BigVis 2020-3rd International Workshop on Big Data Visual Exploration and Analytics, Copenhagen, Denmark, 30 March 2020; Available online: https://dspace.mit.edu/bitstream/handle/1721.1/132286/BigVis2020_big_data_visualization_analytics_challenges_report.pdf?sequence=2&isAllowed=y (accessed on 10 March 2023).
  39. Manogaran, G.; Thota, C.; Lopez, D. Human-computer interaction with big data analytics. In Research Anthology on Big Data Analytics, Architectures, and Applications; IGI Global: Hershey, PA, USA, 2022; pp. 1578–1596. [Google Scholar]
  40. Heer, J.; Card, S.K.; Landay, J.A. Prefuse: A toolkit for interactive information visualization. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Portland, OR, USA, 2–7 April 2005; pp. 421–430. [Google Scholar]
  41. Fekete, J.D.; Plaisant, C. Interactive information visualization of a million items. In Proceedings of the IEEE Symposium on Information Visualization, INFOVIS, Boston, MA, USA, 28–29 October 2002; IEEE: Piscataway, NJ, USA, 2002; pp. 117–124. [Google Scholar]
  42. Belkin, N.J.; Cool, C.; Stein, A.; Thiel, U. Cases, scripts, and information-seeking strategies: On the design of interactive information retrieval systems. Expert Syst. Appl. 1995, 9, 379–395. [Google Scholar] [CrossRef]
  43. Agrawal, R.; Kadadi, A.; Dai, X.; Andres, F. Challenges and opportunities with big data visualization. In Proceedings of the 7th International Conference on Management of Computational and Collective Intelligence in Digital EcoSystems, Caraguatatuba, Brazil, 25–29 October 2015; pp. 169–173. [Google Scholar]
  44. Yang, J.; Hubball, D.; Ward, M.O.; Rundensteiner, E.A.; Ribarsky, W. Value and relation display: Interactive visual exploration of large data sets with hundreds of dimensions. IEEE Trans. Vis. Comput. Graph. 2007, 13, 494–507. [Google Scholar] [CrossRef]
  45. Redström, J.; Skog, T.; Hallnäs, L. Informative art: Using amplified artworks as information displays. In Proceedings of the DARE 2000 on Designing Augmented Reality Environments, Elsinore, Denmark, 1 April 2000; pp. 103–114. [Google Scholar]
  46. Carpendale, M.S.T. Considering visual variables as a basis for information visualisation. 2003. [Google Scholar]
  47. Cordeil, M.; Bach, B.; Li, Y.; Wilson, E.; Dwyer, T. Design space for spatio-data coordination: Tangible interaction devices for immersive information visualisation. In Proceedings of the 2017 IEEE Pacific Visualization Symposium (PacificVis), Seoul, Republic of Korea, 18–21 April 2017; IEEE: Piscataway, NJ, USA, 2017; pp. 46–50. [Google Scholar]
  48. Dostal, J.; Hinrichs, U.; Kristensson, P.O.; Quigley, A. SpiderEyes: Designing attention-and proximity-aware collaborative interfaces for wall-sized displays. In Proceedings of the 19th International Conference on Intelligent User Interfaces, Haifa, Israel, 24–27 February 2014; pp. 143–152. [Google Scholar]
  49. Chen, C. Information Visualisation and Virtual Environments; Springer Science & Business Media: Berlin/Heidelberg, Germany, 2013. [Google Scholar]
  50. Wan, S.; Tang, J.; Wan, C.; Li, Z.; Li, Z. Angular-encrypted quad-fold display of nanoprinting and meta-holography for optical information storage. Adv. Opt. Mater. 2022, 10, 2102820. [Google Scholar] [CrossRef]
  51. Vogel, D.; Balakrishnan, R. Interactive public ambient displays: Transitioning from implicit to explicit, public to personal, interaction with multiple users. In Proceedings of the 17th Annual ACM Symposium on User Interface Software and Technology, Santa Fe, NM, USA, 24–27 October 2004; pp. 137–146. [Google Scholar]
  52. Claes, S.; Moere, A.V. The role of tangible interaction in exploring information on public visualization displays. In Proceedings of the 4th International Symposium on Pervasive Displays, Saarbruecken, Germany, 10–12 June 2015; pp. 201–207. [Google Scholar]
  53. Ojala, T.; Kostakos, V.; Kukka, H.; Heikkinen, T.; Linden, T.; Jurmu, M.; Zanni, D. Multipurpose interactive public displays in the wild: Three years later. Computer 2012, 45, 42–49. [Google Scholar] [CrossRef]
  54. Hinrichs, U.; Carpendale, S.; Valkanova, N.; Kuikkaniemi, K.; Jacucci, G.; Moere, A.V. Interactive public displays. IEEE Comput. Graph. Appl. 2013, 33, 25–27. [Google Scholar] [CrossRef] [PubMed]
  55. Mäkelä, V.; Heimonen, T.; Luhtala, M.; Turunen, M. Information wall: Evaluation of a gesture-controlled public display. In Proceedings of the 13th International Conference on Mobile and Ubiquitous Multimedia, Melbourne, Australia, 25–28 November 2014; pp. 228–231. [Google Scholar]
  56. Liu, C.K.; Chang, S.C.; Juang, Y.S.; Cheng, K.T. Hiding private information in private information protection liquid crystal displays using periodical waveplates and pixel quaternity. Opt. Express 2023, 31, 2445–2455. [Google Scholar] [CrossRef]
  57. Geismar, H. Museum + Digital = ? In Digital Anthropology; Routledge: Abingdon, UK, 2021; pp. 264–287. [Google Scholar]
  58. Giannini, T.; Bowen, J.P. Museums and Digital Culture: From reality to digitality in the age of COVID-19. Heritage 2022, 5, 192–214. [Google Scholar] [CrossRef]
  59. Sifaki, E. Museum Digital Activities During the Pandemic: Art as a Communicative Experience. In The Digital Folklore of Cyberculture and Digital Humanities; IGI Global: Hershey, PA, USA, 2022; pp. 267–295. [Google Scholar]
  60. Dal Falco, F.; Vassos, S. Museum experience design: A modern storytelling methodology. Des. J. 2017, 20 (Suppl. 1), S3975–S3983. [Google Scholar] [CrossRef]
  61. Vermeeren, A.P.; Calvi, L.; Sabiescu, A.; Trocchianesi, R.; Stuedahl, D.; Giaccardi, E.; Radice, S. Future Museum Experience Design: Crowds, Ecosystems and Novel Technologies; Springer International Publishing: Berlin/Heidelberg, Germany, 2018; pp. 1–16. [Google Scholar]
  62. Harada, T.; Hideyoshi, Y.; Gressier-Soudan, E.; Jean, C. Museum experience design based on multi-sensory transformation approach. In DS 92: Proceedings of the DESIGN 2018 15th International Design Conference, Dubrovnik, Croatia, 21–24 May 2018; Design Society: Glasgow, UK, 2018; pp. 2221–2228. [Google Scholar]
  63. Styliani, S.; Fotis, L.; Kostas, K.; Petros, P. Virtual museums, a survey and some issues for consideration. J. Cult. Herit. 2009, 10, 520–528. [Google Scholar] [CrossRef]
  64. Sylaiou, S.; Liarokapis, F.; Sechidis, L.; Patias, P.; Georgoula, O. Virtual museums: First results of a survey on methods and tools. In Proceedings of the CIPA XX Symposium, Torino, Italy, 26–30 September 2005; pp. 1138–1143. [Google Scholar]
  65. Sylaiou, S.; Mania, K.; Karoulis, A.; White, M. Exploring the relationship between presence and enjoyment in a virtual museum. Int. J. Hum. Comput. Stud. 2010, 68, 243–253. [Google Scholar] [CrossRef]
  66. Wojciechowski, R.; Walczak, K.; White, M.; Cellary, W. Building virtual and augmented reality museum exhibitions. In Proceedings of the Ninth International Conference on 3D Web Technology, Monterey, CA, USA, 5–8 May 2004; pp. 135–144. [Google Scholar]
  67. Carrozzino, M.; Bergamasco, M. Beyond virtual museums: Experiencing immersive virtual reality in real museums. J. Cult. Herit. 2010, 11, 452–458. [Google Scholar] [CrossRef]
  68. Giangreco, I.; Sauter, L.; Parian, M.A.; Gasser, R.; Heller, S.; Rossetto, L.; Schuldt, H. Virtue: A virtual reality museum experience. In Proceedings of the 24th International Conference on Intelligent User Interfaces: Companion, Marina del Ray, CA, USA, 17–20 March 2019; pp. 119–120. [Google Scholar]
  69. Roussou, M. Immersive interactive virtual reality in the museum. Proc. TiLE (Trends Leis. Entertain.) 2001. [Google Scholar]
  70. Kersten, T.P.; Tschirschwitz, F.; Deggim, S. Development of a virtual museum including a 4D presentation of building history in virtual reality. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2017, 42, 361–367. [Google Scholar] [CrossRef]
  71. Miyashita, T.; Meier, P.; Tachikawa, T.; Orlic, S.; Eble, T.; Scholz, V.; Lieberknecht, S. An augmented reality museum guide. In Proceedings of the 2008 7th IEEE/ACM International Symposium on Mixed and Augmented Reality, Cambridge, UK, 15–18 September 2008; IEEE: Piscataway, NJ, USA, 2008; pp. 103–106. [Google Scholar]
  72. Rodrigues, J.M.; Pereira, J.A.; Sardo, J.D.; de Freitas, M.A.; Cardoso, P.J.; Gomes, M.; Bica, P. Adaptive card design UI implementation for an augmented reality museum application. In Proceedings of the Universal Access in Human–Computer Interaction. Design and Development Approaches and Methods: 11th International Conference, UAHCI 2017, Held as Part of HCI International 2017, Vancouver, BC, Canada, July 9–14, 2017, Proceedings, Part I 11 2017; Springer International Publishing: Berlin/Heidelberg, Germany, 2017; pp. 433–443. [Google Scholar]
  73. He, Z.; Wu, L.; Li, X.R. When art meets tech: The role of augmented reality in enhancing museum experiences and purchase intentions. Tour. Manag. 2018, 68, 127–139. [Google Scholar] [CrossRef]
  74. Venigalla, A.S.M.; Chimalakonda, S. Towards enhancing user experience through a web-based augmented reality museum. In Proceedings of the 2019 IEEE 19th International Conference on Advanced Learning Technologies (ICALT), Maceio, Brazil, 15–18 July 2019; IEEE: Piscataway, NJ, USA, 2019; Volume 2161, pp. 357–358. [Google Scholar]
  75. Zhao, J. Designing virtual museum using Web3D technology. Phys. Procedia 2012, 33, 1596–1602. [Google Scholar] [CrossRef]
  76. White, M.; Liarokapis, F.; Mourkoussis, N.; Basu, A.; Darcy, J.; Petridis, P.; Lister, P. ARCOLite—An XML based system for building and presenting virtual museum exhibitions using Web3D and augmented reality. In Proceedings of the Theory and Practice of Computer Graphics 2004, Bournemouth, UK, 8–10 June 2004. [Google Scholar]
  77. Liarokapis, F.; Sylaiou, S.; Basu, A.; Mourkoussis, N.; White, M.; Lister, P.F. An Interactive Visualisation Interface for Virtual Museums. In Proceedings of the VAST 2004: The 5th International Symposium on Virtual Reality, Archaeology and Cultural Heritage, Oudenaarde, Belgium, 7–10 December 2004; Chrysanthou, Y., Cain, K., Silberman, N., Niccolucci, F., Eds.; 2004. Available online: https://www.semanticscholar.org/paper/An-Interactive-Visualisation-Interface-for-Virtual-Liarokapis-Sylaiou/2af763ee5034897eee657230d8b782d468b8fd34 (accessed on 15 March 2023). [CrossRef]
  78. Petridis, P.; White, M.; Mourkousis, N.; Liarokapis, F.; Sifniotis, M.; Basu, A.; Gatzidis, C. Exploring and Interacting with Virtual Museums. In The World is in Your Eyes. CAA2005. Computer Applications and Quantitative Methods in Archaeology. Proceedings of the 33rd Conference, Tomar, Portugal, March 2005; Figueiredo, A., Velho, G.L., Eds.; CAA: Tomar, Portugal, 2007; pp. 73–82. [Google Scholar]
  79. Zidianakis, E.; Partarakis, N.; Ntoa, S.; Dimopoulos, A.; Kopidaki, S.; Ntagianta, A.; Stephanidis, C. The invisible museum: A user-centric platform for creating virtual 3D exhibitions with VR support. Electronics 2021, 10, 363. [Google Scholar] [CrossRef]
  80. Carre, A.L.; Dubois, A.; Partarakis, N.; Zabulis, X.; Patsiouras, N.; Mantinaki, E.; Manitsaris, S. Mixed-reality demonstration and training of glassblowing. Heritage 2022, 5, 103–128. [Google Scholar] [CrossRef]
  81. Hall, T.; Ciolfi, L.; Bannon, L.; Fraser, M.; Benford, S.; Bowers, J.; Flintham, M. The visitor as virtual archaeologist: Explorations in mixed reality technology to enhance educational and social interaction in the museum. In Proceedings of the 2001 Conference on Virtual Reality, Archeology, and Cultural Heritage, Glyfada, Greece, 28–30 November 2001; pp. 91–96. [Google Scholar]
  82. Hughes, C.E.; Smith, E.; Stapleton, C.B.; Hughes, D.E. Augmenting museum experiences with mixed reality. In Proceedings of the KSCE 2004, Prague, Czech, 25–27 October 2004; pp. 22–24. [Google Scholar]
  83. Galani, A. Mixed Reality Museum Visits: Using new technologies to support co-visiting for local and remote visitors. Museol. Rev. 2003, 10. [Google Scholar]
  84. Hammady, R.; Ma, M.; Strathern, C.; Mohamad, M. Design and development of a spatial mixed reality touring guide to the Egyptian museum. Multimed. Tools Appl. 2020, 79, 3465–3494. [Google Scholar] [CrossRef]
  85. Brown, B.; MacColl, I.; Chalmers, M.; Galani, A.; Randell, C.; Steed, A. Lessons from the lighthouse: Collaboration in a shared mixed reality system. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Ft. Lauderdale, FL, USA, 5–10 April 2003; pp. 577–584. [Google Scholar]
  86. Butler, M.; Neave, P. Object appreciation through haptic interaction. Hello! Where are you in the landscape of educational technology? Proc. Ascilite Melb. 2008, 2008, 133–141. [Google Scholar]
  87. Dima, M.; Hurcombe, L.; Wright, M. Touching the past: Haptic augmented reality for museum artefacts. In Virtual, Augmented and Mixed Reality. Applications of Virtual and Augmented Reality: 6th International Conference, VAMR 2014, Held as Part of HCI International 2014, Heraklion, Crete, Greece, June 22–27, 2014; Proceedings, Part II 6; Springer International Publishing: Berlin/Heidelberg, Germany, 2014; pp. 3–14. [Google Scholar]
  88. Prytherch, D.; Jefsioutine, M. Touching Ghosts: Haptic technologies in museums. Power Touch 2016, 223–240. [Google Scholar] [CrossRef]
  89. Comes, R. Haptic devices and tactile experiences in museum exhibitions. J. Anc. Hist. Archaeol. 2016, 3. [Google Scholar] [CrossRef]
  90. Brewster, S. The impact of haptic ‘touching’ technology on cultural applications. In Digital Applications for Cultural and Heritage Institutions; Routledge: Abingdon, UK, 2017; pp. 301–312. [Google Scholar]
  91. Rennick-Egglestone, S.; Brundell, P.; Koleva, B.; Benford, S.; Roussou, M.; Chaffardon, C. Families and mobile devices in museums: Designing for integrated experiences. J. Comput. Cult. Herit. (JOCCH) 2016, 9, 1–13. [Google Scholar] [CrossRef]
  92. Raptis, D.; Tselios, N.; Avouris, N. Context-based design of mobile applications for museums: A survey of existing practices. In Proceedings of the 7th International Conference on Human Computer Interaction with Mobile Devices & Services, Salzburg, Austria, 19–22 September 2005; pp. 153–160. [Google Scholar]
  93. Savidis, A.; Zidianakis, M.; Kazepis, N.; Dubulakis, S.; Gramenos, D.; Stephanidis, C. An integrated platform for the management of mobile location-aware information systems. In Proceedings of the Pervasive Computing: 6th International Conference, Pervasive 2008, Sydney, Australia, 19–22 May 2008; Proceedings 6. Springer: Berlin/Heidelberg, Germany, 2008; pp. 128–145. [Google Scholar]
  94. Tesoriero, R.; Gallud, J.A.; Lozano, M.; Penichet, V.M.R. A location-aware system using RFID and mobile devices for art museums. In Proceedings of the Fourth International Conference on Autonomic and Autonomous Systems (ICAS’08), Gosier, France, 16–21 March 2008; IEEE: Piscataway, NJ, USA, 2008; pp. 76–81. [Google Scholar]
  95. Laurillau, Y.; Paternò, F. Supporting museum co-visits using mobile devices. In Proceedings of the Mobile Human-Computer Interaction-MobileHCI 2004: 6th International Symposium, MobileHCI, Glasgow, UK, 13–16 September 2004; 2004. [Google Scholar]
  96. Tesoriero, R.; Gallud, J.A.; Lozano, M.; Penichet, V.M.R. Enhancing visitors’ experience in art museums using mobile technologies. Inf. Syst. Front. 2014, 16, 303–327. [Google Scholar] [CrossRef]
  97. Dulyan, A.; Edmonds, E. AUXie: Initial evaluation of a blind-accessible virtual museum tour. In Proceedings of the 22nd Conference of the Computer-Human Interaction Special Interest Group of Australia on Computer-Human Interaction, Brisbane Australia, 22–26 November 2010; pp. 272–275. [Google Scholar]
  98. Rojas, H.; Renteria, R.; Acosta, E.; Arévalo, H.; Pilares, M. Application of accessibility guidelines in a virtual museum. In Proceedings of the 2020 3rd International Conference of Inclusive Technology and Education (CONTIE), Baja California Sur, Mexico, 28–30 October 2020; IEEE: Piscataway, NJ, USA, 2020; pp. 73–79. [Google Scholar]
  99. Partarakis, N.; Zabulis, X.; Foukarakis, M.; Moutsaki, M.; Zidianakis, E.; Patakos, A.; Tasiopoulou, E. Supporting sign language narrations in the museum. Heritage 2022, 5, 1–20. [Google Scholar] [CrossRef]
  100. Partarakis, N.; Klironomos, I.; Antona, M.; Margetis, G.; Grammenos, D.; Stephanidis, C. Accessibility of cultural heritage exhibits. In Universal Access in Human-Computer Interaction. Interaction Techniques and Environments: 10th International Conference, UAHCI 2016, Held as Part of HCI International 2016, Toronto, ON, Canada, July 17–22, 2016, Proceedings, Part II 10; Springer International Publishing: Berlin/Heidelberg, Germany, 2016; pp. 444–455. [Google Scholar]
  101. Ciolfi, L.; Bannon, L. Designing Interactive Museum Exhibits: Enhancing visitor curiosity through augmented artefacts. In Proceedings of the Eleventh European Conference on Cognitive Ergonomics, Belfast, Northern Ireland, 8–11 September 2002; Volume 7. [Google Scholar]
  102. Zidianakis, E.; Partarakis, N.; Kontaki, E.; Kopidaki, S.; Xhako, A.; Pervolarakis, Z.; Agapakis, A.; Foukarakis, M.; Ntoa, S.; Barbounaki, I.; et al. Web-Based Authoring Tool for Virtual Exhibitions. In Proceedings of the HCI International 2022–Late Breaking Posters: 24th International Conference on Human-Computer Interaction, HCII 2022, Virtual Event, 26 June–1 July 2022; Proceedings, Part I; Springer Nature Switzerland: Cham, Switzerland; pp. 378–385. [Google Scholar]
  103. Benko, H.; Holz, C.; Sinclair, M.; Ofek, E. NormalTouch and TextureTouch: High-Fidelity 3D Haptic Shape Rendering on Handheld Virtual Reality Controllers. In Proceedings of the 29th Annual Symposium on User Interface Software and Technology, Tokyo, Japan, 16–19 October 2016; pp. 717–728. [Google Scholar] [CrossRef]
  104. Partarakis, N.; Grammenos, D.; Margetis, G.; Zidianakis, E.; Drossis, G.; Leonidis, A.; Stephanidis, C. Digital cultural heritage experience in Ambient Intelligence. Mix. Real. Gamification Cult. Herit. 2017, 473–505. [Google Scholar]
  105. Marshall, M.T.; Dulake, N.; Ciolfi, L.; Duranti, D.; Kockelkorn, H.; Petrelli, D. Using tangible smart replicas as controls for an interactive museum exhibition. In Proceedings of the TEI’16: Tenth International Conference on Tangible, Embedded, and Embodied Interaction, Eindhoven, The Netherlands, 14–17 February 2016; pp. 159–167. [Google Scholar]
  106. Ciolfi, L.; Bannon, L.J. Designing hybrid places: Merging interaction design, ubiquitous technologies and geographies of the museum space. CoDesign 2007, 3, 159–180. [Google Scholar] [CrossRef]
  107. Stefanidi, E.; Partarakis, N.; Zabulis, X.; Zikas, P.; Papagiannakis, G.; Magnenat Thalmann, N. TooltY: An approach for the combination of motion capture and 3D reconstruction to present tool usage in 3D environments. In Intelligent Scene Modeling and Human-Computer Interaction; Springer International Publishing: Cham, Switzerland, 2021; pp. 165–180. [Google Scholar]
  108. Stefanidi, E.; Partarakis, N.; Zabulis, X.; Papagiannakis, G. An approach for the visualization of crafts and machine usage in virtual environments. In Proceedings of the 13th International Conference on Advances in Computer-Human Interactions, Valencia, Spain, 21–25 November 2020; pp. 21–25. [Google Scholar]
  109. Hazan, S.; Katz, A.L. The willing suspension of disbelief: The tangible and the intangible of heritage education in e-learning and virtual museums. Mix. Real. Gamification Cult. Herit. 2017, 549–566. [Google Scholar]
  110. Langlais, D. Cybermuseology and intangible heritage. ETopia 2005. [Google Scholar] [CrossRef]
  111. Giaccardi, E. Collective storytelling and social creativity in the virtual museum: A case study. Des. Issues 2006, 22, 29–41. [Google Scholar] [CrossRef]
  112. Bounia, A.; Myrivili, E. Beyond the virtual: Intangible museographies and collaborative museum experiences. In Uncertain Spaces. Virtual Configurations in Contemporary Art and Museums; Barranha, H., Martins, S.S., Eds.; Instituto de História da Arte, FCSH—Universidade Nova de Lisboa: Lisboa, Portugal, 2015; pp. 15–33. [Google Scholar]
  113. Partarakis, N.; Kaplanidi, D.; Doulgeraki, P.; Karuzaki, E.; Petraki, A.; Metilli, D.; Zabulis, X. Representation and presentation of culinary tradition as cultural heritage. Heritage 2021, 4, 612–640. [Google Scholar] [CrossRef]
  114. Kondylakis, G.; Galanakis, G.; Partarakis, N.; Zabulis, X. Semantically Annotated Cooking Procedures for an Intelligent Kitchen Environment. Electronics 2022, 11, 3148. [Google Scholar] [CrossRef]
  115. Unity 3D. Available online: https://unity.com/ (accessed on 2 March 2023).
  116. Nielsen, J. Usability Engineering; Morgan Kaufmann: Burlington, MA, USA, 1994. [Google Scholar]
  117. Ntoa, S.; Margetis, G.; Antona, M.; Stephanidis, C. User experience evaluation in intelligent environments: A comprehensive framework. Technologies 2021, 9, 41. [Google Scholar] [CrossRef]
  118. Martins, A.I.; Queirós, A.; Silva, A.G.; Rocha, N.P. Usability evaluation methods: A systematic review. Hum. Factors Softw. Dev. Des. 2015, 250–273. [Google Scholar] [CrossRef]
  119. Yi, J.H.; Kim, H.S. User experience research, experience design, and evaluation methods for museum mixed reality experience. J. Comput. Cult. Herit. (JOCCH) 2021, 14, 1–28. [Google Scholar] [CrossRef]
  120. Leopardi, A.; Ceccacci, S.; Mengoni, M.; Naspetti, S.; Gambelli, D.; Ozturk, E.; Zanoli, R. X-reality technologies for museums: A comparative evaluation based on presence and visitors experience through user studies. J. Cult. Herit. 2021, 47, 188–198. [Google Scholar] [CrossRef]
  121. Nielsen, J. Enhancing the explanatory power of usability heuristics. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Boston, MA, USA, 24–28 April 1994; pp. 152–158. [Google Scholar]
  122. Smart Display at the National History Museum of Athens. Available online: https://zenodo.org/record/7727957 (accessed on 10 March 2023).
Figure 1. Display design: (a) photorealistic and wireframe renderings; (b) analysis of the negative space of the display case where equipment is integrated; (c) implementation of the prototype.
Figure 2. System architecture.
Figure 3. The user touches the glass display to define an interaction area; Unity translates the touch inputs into a line segment, which is used to create a mesh collider to which the user assigns multimedia content.
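The interaction-area creation pipeline summarized in Figure 3 can be sketched in Unity (C#). The snippet below is a minimal, hypothetical illustration rather than the authors' implementation: the component name TouchAreaSketch, the two-tap segment definition, the fixed extrusion depth, the plane distance of 1.0 m from the camera, and the commented-out ContentTrigger component are assumptions made for clarity; only standard Unity APIs (Input.GetTouch, Camera.ScreenToWorldPoint, Mesh, MeshCollider) are used.

using UnityEngine;

// Minimal sketch: record two touch points on the display surface, extrude the
// resulting line segment into a thin quad, and attach it as a MeshCollider so
// later touches can be raycast against it to trigger multimedia content.
public class TouchAreaSketch : MonoBehaviour
{
    public float depth = 0.05f;          // illustrative extrusion depth in metres
    private Vector3? firstPoint;         // first endpoint of the line segment

    void Update()
    {
        if (Input.touchCount == 0) return;
        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began) return;

        // Project the screen-space touch onto a plane 1.0 m in front of the camera
        // (assumed here to coincide with the glass surface of the display case).
        Vector3 world = Camera.main.ScreenToWorldPoint(
            new Vector3(touch.position.x, touch.position.y, 1.0f));

        if (firstPoint == null)
        {
            firstPoint = world;          // first tap: remember one endpoint
        }
        else
        {
            CreateInteractionArea(firstPoint.Value, world);
            firstPoint = null;
        }
    }

    void CreateInteractionArea(Vector3 a, Vector3 b)
    {
        // Extrude the segment a-b along the view axis into a thin quad mesh.
        Vector3 offset = Camera.main.transform.forward * depth;
        var mesh = new Mesh
        {
            vertices = new[] { a, b, b + offset, a + offset },
            triangles = new[] { 0, 1, 2, 0, 2, 3 }
        };
        mesh.RecalculateNormals();

        var area = new GameObject("InteractionArea");
        var collider = area.AddComponent<MeshCollider>();
        collider.sharedMesh = mesh;      // later touches are raycast against this collider
        // area.AddComponent<ContentTrigger>();  // hypothetical component binding the area to content
    }
}

In this sketch, two successive taps define the endpoints of the segment; binding a content component (hypothetical here) to the resulting collider would let subsequent touches be resolved via Physics.Raycast to present the assigned multimedia.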
Figure 4. Content viewer in-lab prototype presenting information upon the selection of the assigned information areas (on the left side of the screen, information about the exhibit and how to initiate interaction is provided in Greek).
Figure 5. The setup of the display case in the National History Museum of Athens.
