Article

User Experience of a Digital Fashion Show: Exploring the Effectiveness of Interactivity in Virtual Reality

School of Games, Hongik University, Sejong 30016, Republic of Korea
* Authors to whom correspondence should be addressed.
Appl. Sci. 2023, 13(4), 2558; https://doi.org/10.3390/app13042558
Submission received: 30 January 2023 / Revised: 9 February 2023 / Accepted: 14 February 2023 / Published: 16 February 2023
(This article belongs to the Special Issue Virtual Reality Technology and Applications)

Abstract

A virtual reality (VR) environment differs from a typical video-based experience in that it creates an interactive user experience (UX) by allowing users to respond to operations or commands through various input devices. In particular, as conventional analog fashion shows, which are limited in space and time, have evolved into digital fashion shows that include interaction, UX research has come to emphasize the naturalness of interaction and visual expression. However, studies that maximize user immersion through interaction with visual changes in the stage and clothes of a digital fashion show are scarce. This study therefore introduces an interactive VR fashion show to analyze the impact of interaction on the UX in a VR environment. In the design of the interactive VR fashion show, various interaction elements that affect the UX are selected and the scope of UX application is suggested. In addition, the production process necessary to simulate an existing fashion show with a commercial game engine is shown step by step. The user test is examined in depth through a two-step survey, and the results are objectively verified through statistical analysis.

1. Introduction

Virtual reality (VR) provides an interface that allows users to feel as if they are interacting with an actual situation by simulating a specific space with 3D graphics. In the entertainment media industry, where VR is utilized in videos and games, audiovisual realism is increased to maximize user immersion [1]. Users can also experience VR by issuing or responding to commands through various input devices. Therefore, VR differs from the general video-based experience in that it creates a user experience (UX) [2].
Conventional fashion shows have served as a major means of product sales by advertising and promoting fashion brands, retaining existing customers, and attracting new ones through information that stimulates consumers' emotions [3]. This is because fashion shows sit at the center of artistic and cultural history and belong to a genre of expression that can reach the general public and help scale commercial businesses. However, conventional fashion shows suffer from low general popularity owing to their limitations in space and time. To overcome this limitation, fashion shows are evolving beyond merely providing information toward providing entertainment for pleasure, in other words, contents with various event characteristics that can attract the attention of the audience [4].
As existing analog fashion shows with constraints in space and time have evolved into digital fashion shows that include interaction, the VR environment has become helpful in evaluating apparel, promoting fashion shows, and purchasing apparel [5]. Hyun et al. studied interaction by using telepresence in smartphone-based augmented reality (AR) [6], and Nam et al. emphasized the importance of the naturalness of interaction and visual expression in a study of UX in VR games played through a head-mounted display (HMD) [7]. Seminara et al. used indirect illumination light to communicate with visitors in an art museum and analyzed the resulting immersive experience [8]. Jung et al. investigated consumer experiences in VR communications by using the interpretation of visual images and a hermeneutic model of experiential gestalt [9]. To enhance presence and immersion during virtual training, Baek et al. proposed a VR-based job training system using various sensors [10]. Battistoni et al. developed virtual fitting rooms using a set of interaction design patterns and explored the shopping experience in AR [11]. However, studies on maximizing user immersion through interaction with the visual changes of the stage and clothes of a digital fashion show are scarce.
This study proposes an interactive VR (IVR) fashion show that enables evaluation of the UX through an interactive experience, compensating for the aforementioned limitations. An IVR fashion show is a digital fashion show produced in a VR environment and serves as a simulation model that can measure the effects of various interaction elements, treated as independent variables, on the UX. In the analysis of the UX, the user test is conducted through a two-step survey, and the results are objectively verified through statistical analysis. Figure 1 summarizes the range and method of the UX analysis in this study.
The remainder of this paper is structured as follows. In Section 2, related studies are introduced, and in Section 3, the IVR fashion show designed and implemented in this study is presented. In Section 4, the UX is compared through a user test and the analysis results are described. Finally, in Section 5, the conclusion and future study directions are presented.

2. Related Works

2.1. Convergence Contents

Levinson introduced the concept of remediation as an anthropotropic process in which a new media technology improves upon or modifies an existing technology and defined it as a theory whereby new media reform the old media format [12]. It was also suggested that the practice of one medium borrowing the interface, expression, and social awareness of another medium increases the possibility of remediacy through technical convergence. This concept of remediation was further developed into the three attributes of transparent immediacy, hypermediacy, and remediacy by Bolter and Grusin, and was realized as a visual representation style in which the presence of the medium, such as VR, is forgotten during the audience's moment of immersion [13]. In this study, transparent immediacy is taken as the representation style that provides visual space through the HMD screen, and hypermediacy as the recognition of actions directed across multiple heterogeneous VR spaces within the medium. Dobson observed that, with the emergence of new media, old media, such as perspective painting, photography, and theatrical play, are being reformed or absorbed into new media in various forms, such as the Internet, movies, and computer graphics [14]. In the VR space, a representative new medium, the possibility of shifting to a newly added state-of-the-art space while recognizing old media was also predicted. Hwang et al. claimed that digital convergence is not a simple technological change, but rather a shift in technological, industrial, cultural, and social paradigms that changes the way individuals, groups, and society communicate [15]. However, in the age of digital convergence, the knowledge and contents wanted by users must be adequately linked to products and services with complex functions, rather than simply providing multiple functions simultaneously.
As such, this study focuses on the concept of remediacy, studying technical convergence contents by converting an existing fashion show into a new medium, the VR space. Based on the development of such convergence contents, this study analyzes the possibility of future UX evaluation by general evaluators and examines the independent variables of the interaction elements inside VR that affect the usability of digital fashion shows. The potential for expansion into VR-contents-based convergence industries is to be verified. In particular, the possibility of commercial utilization through the convergence of modern fashion shows with IT technology is explored, and the potential of new contents to create cultural and economic value is examined.

2.2. Digital Fashion Show

VR utilizes the sensory information of the user to enable realistic spatial and visual experiences. Bamodu and Ye proposed components for an effective VR system in a VR review [16]. They emphasized the need for empirical components, such as virtual world, immersion, sensory feedback, and interactivity, in addition to system components, such as HMDs and input devices. In particular, an interactive experience is a two-way experience and is a very important factor in evaluating the usability of media [17]. Wu et al. defined the digital fashion show as a new type of fashion show that is merged with digital images and high-tech devices rather than the existing analog method [18]. Gao et al. mentioned that the user immersion could be maximized through the stimulation of interaction, for example, by using HMDs, rather than using a normal display [5]. For the sensory experiences, the system needs to be configured for the visual, auditory, and tactile sensations of the user. Furthermore, the transformation of data, such as virtual stage, lighting, music, effects, and animation, for the digital fashion show needs to be reflected in the interaction module of the UX through the user interface.
As shown in Figure 2, traditional fashion shows have evolved into various digital platforms with the development of technology. Such transitions require evaluation of the UX to verify the possibility of commercialization. However, evaluating the UX requires a wide range of advanced research, as it needs a preparation process spanning from producing digital clothes and visualizing VR through a rendering engine to designing the interaction components. Currently, there is a lack of research on the design and user evaluation of interaction-based VR fashion shows for UX analysis. Therefore, studies that apply interactions comprehensively, based on design from the producer's perspective, are necessary prior to UX analysis.
As such, in this study, an IVR fashion show that can maximize the user immersion under a VR environment is produced step-by-step, and the impact on the UX through a user test is analyzed. In particular, by applying the interaction of games to VR, the impact of the changes in various interaction elements on the UX is measured in the form of independent variables to evaluate user satisfaction.

3. Design and Implementation of IVR Fashion Show

The IVR fashion show introduced in this study intends to simulate the 3D animation of digital clothes in the VR space through an avatar model. The interaction components for an interactive experience are selected through the opinion of fashion experts, and fashion show contents are implemented step-by-step using commercial graphic tools and 3D game engines. Figure 3 shows the production steps of an IVR fashion show.

3.1. Interaction Components

There have been cases where various interaction components for the design of digital fashion shows were used as independent variables [6], but cases that evaluate the UX in the VR space by comprehensively using vision and hearing have been rare. In this study, 10 professional fashion designers were surveyed on the importance of eight components of a fashion show in order to incorporate the components of a traditional fashion show into a VR environment. Based on the results, four interaction components were selected: point of view (POV), clothing, illumination effect, and background sound. Table 1 shows the results of the importance survey. Owing to the nature of a fashion show, the walk of the model follows a standardized pattern, and hence its importance was evaluated to be relatively low.
The users of the IVR fashion show can apply changes in the above interaction components to the VR contents through a pre-built user interface (UI) to evaluate the impact on the UX. A UI is an intermediate medium for maximizing the UX, with the purpose of effectively conveying information through a clear interaction [19,20]. Therefore, in this study, user-driven element changes are designed to operate in real time in the VR fashion show for effective conveyance of interaction information. A change in POV is a function for controlling the direction and position of the eye gaze of the user wearing the HMD, relative to the clothes of the fashion show in the VR environment. When the POV is activated, the user can experience a real-time change in the viewpoint of the avatar model. These changes affect immersion by inducing both interest and fun in the subsequent UX. However, sudden changes in the rotation angle of the POV can cause cybersickness and negatively affect interest, and hence their usage has to be limited. The change in clothes is a function for switching the digital clothes of the avatar model, configured such that various types of clothes can be changed in real time at the time desired by the user. These changes can significantly increase the fun aspect of the experience, but the limited variety of clothes and the naturalness and completeness of the digital clothes at the production stage can impact the UX. A change in illumination effect is a function that creates a change in the brightness and effect of light in the virtual stage space, and the user can change the color of the stage background and the particle effects. Functions that change effects in the virtual space can increase short-term fun, but continued usage could reduce interest and immersion. Finally, a change in background sound is a function for changing the background music played in the VR space, and the user can select from a variety of sound sources. Although a change in sound can also increase short-term fun, repeated usage can reduce the interest and immersion of the user. Figure 4 shows the UI configuration of the interaction components that can be changed by the user in the IVR fashion show.
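To make the dispatch of these UI-driven changes concrete, the following sketch shows one way the four interaction components could be represented and updated in real time. The paper implements the interactions in Unity; the Python below is only an illustrative stand-in, and all class, field, and value names are hypothetical.

from enum import Enum, auto

class Component(Enum):
    POV = auto()
    CLOTHING = auto()
    EFFECT = auto()
    SOUND = auto()

class FashionShowState:
    """Holds the interaction state that the UI can change in real time."""
    def __init__(self):
        self.pov = "front"          # front / left / right / rear, plus zoom level
        self.clothing = "look_01"   # digital clothes currently worn by the avatar
        self.effect_on = False      # background illumination and particle effects
        self.track = "track_01"     # background sound source

    def apply(self, component: Component, value):
        # Each UI selection is applied immediately, so the viewer sees the
        # change without interrupting the runway animation.
        if component is Component.POV:
            self.pov = value
        elif component is Component.CLOTHING:
            self.clothing = value
        elif component is Component.EFFECT:
            self.effect_on = bool(value)
        elif component is Component.SOUND:
            self.track = value

# Example: the viewer switches the point of view and the outfit mid-show.
state = FashionShowState()
state.apply(Component.POV, "left")
state.apply(Component.CLOTHING, "look_02")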

3.2. Digital Clothes

Digital clothes can be simulated naturally by transforming polygons in response to the physical collisions between the virtual avatar model and the digital clothes [18,21]. Digital clothes require setting detailed wrinkle properties, such as thickness, shrinkage, and fit [22,23]. The differences between various fabrics must be expressed by adjusting the material properties according to the given fashion concept.
In this study, digital sewing technology is used to make clothing fabric from numerous polygons through clothing pattern production, and clothing simulation is then conducted by applying the fabric to an avatar model. The fabric properties required for implementing digital clothes are thickness, weight, and color. As for thickness, the thicker the fabric, the stronger its tendency to sag and the greater the curvature of the wrinkles, while lower friction between the wrinkle surfaces reduces the density. As for weight, the heavier the fabric, the higher the density, and hence the fabric is less affected by air resistance and gravity. Therefore, by adjusting the weight, the characteristics of the clothing can be specified. Color has little correlation with thickness and weight, and hence it is possible to focus on the design of the clothing itself by coordinating individual colors depending on the situation. Table 2 shows the properties of the three fabrics used in the IVR fashion show, and Figure 5 shows the results of applying various fabrics (wool, cotton, and silk) to the same digital clothing by adjusting the fabric properties.
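As an illustration of how the property groups in Table 2 might be bundled per fabric, the sketch below defines a preset structure with the stretch/bend/friction, density/air drag/gravity, and color/reflection/opacity elements. The numeric values and color codes are placeholders chosen only to suggest the relative tendencies described above (e.g., a denser wool versus a lighter, more translucent silk); the paper does not publish the actual parameter values.

from dataclasses import dataclass

@dataclass
class FabricPreset:
    """Fabric parameters grouped as in Table 2 (all values illustrative only)."""
    # Thickness-related: how strongly the cloth stretches, bends, and rubs.
    stretch: float
    bend: float
    friction: float
    # Weight-related: density and response to air drag and gravity.
    density: float
    air_drag: float
    gravity_scale: float
    # Color-related: base color plus reflection and opacity.
    primary_color: str
    reflection: float
    opacity: float

# Hypothetical presets: the paper applies wool, cotton, and silk to the same
# garment by adjusting these property groups; the numbers below are placeholders.
WOOL   = FabricPreset(0.3, 0.8, 0.6, 0.9, 0.2, 1.0, "#7a5c3e", 0.1, 1.0)
COTTON = FabricPreset(0.5, 0.4, 0.4, 0.6, 0.4, 1.0, "#f2efe6", 0.2, 1.0)
SILK   = FabricPreset(0.7, 0.1, 0.2, 0.3, 0.7, 1.0, "#d9c9e8", 0.6, 0.8)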
The digital clothes used in the digital fashion show are produced through a five-step process using the fabric characteristics provided in Table 2. The first step is to create a pattern for the virtual fabric based on the concept of the digital clothes set up at the planning stage. The thickness, weight, and color of the virtual clothes are not specified, but only the position and size of the top and bottom are determined by editing the fabric pieces. The second step is to model the final digital clothes by fitting the fabric produced earlier to the three-dimensional avatar model. For a natural fit to the avatar model, the thickness, weight, and characteristics of the fabric for the digital clothes are carefully adjusted. The third step is determining the color change and transparency of the top and bottom fabrics through the shading process. Using the inner and outer layer of the clothes, the wrinkle shape, transparency, and various colors of the fabric properties for each material are expressed. The fourth step is applying detailed 2D patterns on the digital clothes, for which the basic coloring has been completed, through the texturing process. For a detailed design of the texture style pattern on the digital clothes, the image that can be applied to the logo and material of the clothes is added. In addition, the weight of the fabric is optimized to minimize the abrupt change in the image applied to the digital clothes. In the final step, the digital clothes obtained through the previous steps are simulated on the avatar model to verify the looks while the avatar is posing. In particular, visual errors are corrected in advance by testing the bending, refraction, and light reflection of the digital clothes applied in the animation of the avatar model in the VR environment. Figure 6 shows the step-by-step production results of various digital clothes.
This study used Marvelous Designer [24] for the 3D modeling of the digital clothes, and the simulation and rendering were performed by applying them to the avatar model in Unity [25]. Note that the clothing can be expressed realistically by maintaining the natural deformation of its curves according to the moving shape of the avatar and by preserving its shape based on the physical nature of the clothing itself through the clothing simulation.

3.3. User Experience

The UX is generally defined as the various aspects of experience a user obtains by interacting with a product [26]. In addition, the UX related to virtual simulation requires the design of the UI and is also defined as the communication method for expressing the results of user input and response [27]. This study evaluates the impact of the previously selected interaction components, used as independent variables, on four UX components: fun, interest, immersion, and usability (see Figure 1). This method is intended to determine the impact on the viewer experience of the IVR fashion show and is used for measuring user satisfaction with the UX. In the user evaluation, the various aspects of experience required in interactive communication are analyzed to examine whether the experience has a positive or negative impact on the user.

3.4. VR Space

To observe the UX in VR, various environmental factors need to be assigned and the user's feedback responses must be tracked [28]. In this study, to evaluate the UX in a VR environment, factors such as speed, range, and mapping were applied to the VR space of the fashion show. Speed is the interaction feedback time required from the virtual world; user immersion is improved by responding as quickly as possible to the user's input in the VR space. In addition, by optimizing the data upload speed in VR, a positive experience can be provided to the user. Range maximizes the fun and interest of the UX through the real-time visualization of the interaction components, in the form of a UI placed within an appropriate range of the main VR view on the HMD screen. Mapping allows the user to concentrate on the storytelling developed in the contents by limiting the expression of the visual elements through the rotation angle of the HMD screen itself.
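The mapping factor can be illustrated with a simple clamp on the horizontal rotation of the view. The helper below is a hedged sketch, not the paper's Unity implementation: it assumes the limit is expressed as a symmetric angle around the forward direction, with the 180° left-and-right restriction introduced after the first survey in Section 4 used as the default.

def clamp_view_yaw(requested_yaw_deg: float, limit_deg: float = 180.0) -> float:
    """Limit horizontal view rotation to +/- limit_deg around the stage front.

    Assumed interpretation of the 'mapping' factor: visual elements outside
    this range are simply not reachable, keeping attention on the runway.
    """
    return max(-limit_deg, min(limit_deg, requested_yaw_deg))

# Example: a head turn of 200 degrees to the right is held at the 180-degree limit.
print(clamp_view_yaw(200.0))   # -> 180.0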

4. Experimental Results

4.1. IVR Fashion Show

The important elements realized in the IVR fashion show include 3D avatar models, animations, 2D UI, and digital clothes. We utilized several software tools to create them due to their popularity and commercial support. The 3D model and 2D UI, such as the avatar, were produced manually using Autodesk and Adobe graphic tools [29,30], and the animation of the avatar was converted to data by recording the runway walking motion of the fashion show model with the motion capture system [31]. Digital clothes were produced by the design tool Marvelous Designer [24], and the movement of the fabric was reproduced through the physical simulator of Unity game engine [25]. The data for each element were integrated in Unity together with the UI composed of the POV, clothing, effect, and sound for interaction and ported to VIVE HMD [32]. Figure 7 shows the results of the data used in the production process of the IVR fashion show.
The user needs to learn the control of UI beforehand through the tutorial mode of the content, because the change in scene according to the selection on the UI provided in the IVR fashion show will affect the UX evaluation. As for the change in POV, the point of view of looking at the avatar by the user wearing the HMD changes in real time. These changes are comprised of zoom in and out along with the four directions of front, left, right, and rear sides. As for the change in clothing, the clothing worn by the avatar changes in real time, but it does not directly affect the animation of the avatar on the runway stage. As for the change in effect, the falling effect of small particles is produced in real time along with the turning on and off of the background illumination. These changes are repeated in units of seconds, making the visual effect look natural. As for the sound, various background sounds were played, and the sound level was adjusted according to the location of the avatar. Figure 8 shows the change in scene according to the selection on the UI of the IVR fashion show. The accompanying video is shown in Supplementary Materials.
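The distance-dependent sound level can be sketched as a simple attenuation function. The linear falloff and the maximum audible distance below are assumptions for illustration only; the actual behavior would come from the game engine's spatial audio settings rather than this formula.

def runway_sound_level(base_volume: float, distance_m: float,
                       max_distance_m: float = 20.0) -> float:
    """Scale the playback volume down as the avatar moves away from the viewer."""
    if distance_m >= max_distance_m:
        return 0.0
    return base_volume * (1.0 - distance_m / max_distance_m)

# Example: at half the maximum distance the volume drops to half of its base level.
print(runway_sound_level(1.0, 10.0))   # -> 0.5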

4.2. User Test

This study analyzed the impact of the previously suggested interaction components on the UX by conducting a user test with the audiovisual data of the IVR fashion show. To this end, user satisfaction was investigated and compared using a two-stage survey comprising a one-way experience, in which the digital fashion show was only watched in the VR environment, and an interactive experience, in which user control was available. The first survey was administered to 12 members of the general public, and the second survey was administered to 120 members of the general public using the IVR fashion show that was improved based on the negative feedback from the first survey. The step-by-step evaluation process of the survey consisted of a pre-questionnaire for participant background information (name, gender, age, occupation, group), a quantitative evaluation of the UX (a five-point Likert scale: the larger the number, the higher the satisfaction), and the degree and range of cybersickness (a five-point Likert scale: the larger the number, the less cybersickness experienced). Figure 9 compares the VR system configuration between the one-way and interactive experiences in the VR environment. Note that standard VR controllers were adopted as the main input devices to simplify the UI controls during the tests.
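One way the two-stage survey records could be encoded for analysis is sketched below. The record layout and field names are assumptions, not the authors' actual instrument, but they mirror the pre-questionnaire, the per-component Likert ratings, and the cybersickness item described above.

from dataclasses import dataclass, field
from statistics import mean

@dataclass
class SurveyResponse:
    """One participant's ratings on the 5-point Likert scales (higher = better)."""
    participant_id: str
    condition: str                                    # "one-way" or "interactive"
    background: dict = field(default_factory=dict)    # gender, age, occupation, group
    ratings: dict = field(default_factory=dict)       # e.g. {("POV", "interest"): 4}
    cybersickness: int = 3                            # higher score = less cybersickness felt

def mean_rating(responses, component, ux_dimension):
    # Average score for one interaction component and UX dimension, as plotted
    # per condition in Figures 10 and 11.
    return mean(r.ratings[(component, ux_dimension)] for r in responses)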
Figure 10 shows the evaluation results of the first survey, whereby the impact of each interaction component on the UX was compared for the one-way and interactive experience cases. The interactive experience received a higher evaluation overall compared with the one-way experience, and in particular, there was a notable difference in usability.
This is presumed to be mainly because the ability to control elements such as camera position, clothing, and effect was preferred by the users and had a favorable effect on the UX of the IVR fashion show. In particular, the changes in effect and sound showed a relatively lower rate of increase in evaluation compared with POV and clothing, because the young participants, who were frequently exposed to games and digital media, were less sensitive to changes in effect and sound. The change in sound, moreover, received a relatively low evaluation on the UX, which appears to be the result of the distraction caused by the background music used in the first survey. Accordingly, the type and volume of the background music in the IVR fashion show were modified for the second survey. In addition, discomfort owing to cybersickness was noted, and hence the rotation angle of the camera was limited to 180° left and right.
Figure 11 shows the evaluation results of the second survey, in which the impact of each interaction component on the UX was compared for the one-way and interactive experience cases. Overall, the evaluation of the interactive experience of the IVR fashion show improved in the second evaluation compared with the first. It is presumed that the interaction functions of the IVR fashion show that were modified after the first survey had a positive impact on the second evaluation. On the other hand, the rating of the one-way experience decreased overall compared with the first survey, because the IVR fashion show improved after the first survey made the one-way experience compare less favorably in the second survey. In addition, the larger number of participants appears to have lowered the overall UX ratings.
Figure 12 compares the two UXs, including cybersickness, on radar charts. The higher the value for cybersickness, the lower the likelihood of motion sickness; the interactive experience had an average value of 3.73, which was superior to the one-way experience average of 3.12. The restrictions on HMD rotation, together with the interactivity, had a positive impact on minimizing cybersickness. The viewing angle can also be restricted in the one-way experience, but the resulting large reduction in the spatial realism of the UX had a negative impact overall.
A t-test was conducted for an objective comparison of the average user satisfaction values obtained from the UX evaluation. As shown in Table 3, the average value, standard deviation, t-value, and p-value were calculated for each UX component, with each of the interaction components used as an independent variable. Based on the t-test results, all the variables except sound were significant at the 99% confidence level, indicating that the corresponding interaction components (POV, clothing, and effect) had a significant impact on the UX. On the other hand, it was difficult to objectively assess the impact of the change in sound on the immersion aspect of the UX. In summary, the IVR fashion show with interaction components applied can be considered to convey higher UX satisfaction for most interactions, other than the change in sound, compared with the one-way fashion show in which videos are only watched in the VR environment.
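The comparison in Table 3 can be reproduced in outline with an independent-samples t-test, as sketched below. The paper does not state whether a paired or independent test (or which variance assumption) was used, and the raw Likert responses are not published, so the arrays here are placeholder data; only the shape of the computation (means, standard deviations, t-value, and p-value per component) follows the table.

import numpy as np
from scipy import stats

# Hypothetical 5-point Likert responses for one UX component / interaction
# variable pair, one array per viewing condition (placeholder data only).
one_way     = np.array([3, 4, 3, 2, 4, 3, 5, 3, 4, 3])
interactive = np.array([4, 5, 4, 4, 5, 4, 5, 4, 4, 5])

# Independent-samples t-test comparing mean satisfaction between conditions,
# mirroring the statistics reported in Table 3.
t_value, p_value = stats.ttest_ind(interactive, one_way)
print(f"one-way:     {one_way.mean():.2f} ({one_way.std(ddof=1):.2f})")
print(f"interactive: {interactive.mean():.2f} ({interactive.std(ddof=1):.2f})")
print(f"t = {t_value:.2f}, p = {p_value:.4f}")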

5. Conclusions

In this study, the impact of various interaction elements on the UX of a digital fashion show in a VR environment was analyzed. To this end, the design and a prototype of an IVR fashion show were presented, and the user test verified that the interactive experience had a more positive impact on the main components of the UX (interest, fun, immersion, and usability) than the one-way experience. Research on the production of a VR fashion show and on user evaluation for UX analysis requires a design, already at the production stage, that meets the specific purposes of the study. As such, the design and implementation of the production case in this study are expected to serve as a guideline for planners and developers in the planning, production, and testing stages of future VR fashion shows.
For future studies, the scope of UX analysis is planned to be expanded by improving the visual quality and function of the IVR fashion show. It was particularly noted in the user test that technically unstable elements that appeared during interaction in the VR environment, such as screen disconnection, inaccurate focus of view, and the distortion of graphic presentation, had to be improved. These issues are expected to be resolved by improving the real-time response rate of the graphic data shown on the game engine. In addition, there is an issue of the afterimage left by the digital clothes during the rotation of the HMD, and hence, the quality of the clothing animation is planned to be improved by applying optimized polygon data and adopting a marker-free motion capture system [33]. Finally, to minimize cybersickness in the VR environment, the stage of the fashion show and the motion of the avatar will be tracked through a camera angle similar to the point of view of the observer, and the function to adjust the viewing angle of the HMD automatically will be added [34].

Supplementary Materials

The experiment results can be seen on the video located at https://drive.google.com/file/d/1GIr7HgBQYqHetBEgoEAb_aKW7a2zuH8G/view?usp=sharing.

Author Contributions

Conceptualization, D.-K.A., B.-C.B. and Y.K.; methodology, D.-K.A., B.-C.B. and Y.K.; software, D.-K.A.; validation, D.-K.A.; formal analysis, D.-K.A., B.-C.B. and Y.K.; investigation, D.-K.A., B.-C.B. and Y.K.; resources, D.-K.A.; data curation, D.-K.A.; writing—original draft preparation, D.-K.A. and Y.K.; writing—review and editing, D.-K.A., B.-C.B. and Y.K.; visualization, D.-K.A. and Y.K.; supervision, B.-C.B. and Y.K.; project administration, B.-C.B. and Y.K.; funding acquisition, B.-C.B. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by grants from the National Research Foundation of Korea (NRF) funded by the Korea government (MSIT) (No. 2021R1A2C1012377 and 2021R1F1A1046513).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Jeong, W.-J.; Cho, J.-H.; Kim, M.-S. A Study of VR Contents problem associated with VR Market Change. In Proceedings of the Korean Society of Design Science Conference, Daegu, Republic of Korea, 18 November 2017; pp. 200–201.
  2. Chun, H.W.; Han, M.K.; Jang, J.H. Application trends in virtual reality. Electron. Telecommun. Trends 2017, 32, 93–101.
  3. Choi, H.-J.; Shin, Y.-O. The effects of the components of a fashion show on viewing satisfaction. J. Fash. Bus. 2008, 12, 45–62.
  4. Diehl, M.E. How to Produce a Fashion Show; Fairchild Books: New York, NY, USA, 1983; ISBN 978-0870051593.
  5. Gao, X.; Chen, M.; Guo, S.; Sun, W.; Liao, M. Virtual fashion show with HTC VIVE. DEStech Trans. Comput. Sci. Eng. 2018.
  6. Hyun, Y.; Kim, H.C.; Kim, Y.G. A Verification of the structural relationships between system quality, information quality, service quality, perceived usefulness and reuse intention to augmented reality by applying transformed tam model: A focus on the moderating role of telepresence and the mediating role of perceived usefulness. Korean Manag. Rev. 2014, 43, 1465–1492.
  7. Nam, S.; Yu, H.S.; Shin, D. User experience in virtual reality games: The effect of presence on enjoyment. Int. Telecommun. Policy Rev. 2017, 24, 85–125.
  8. Seminara, M.; Meucci, M.; Tarani, F.; Riminesi, C.; Catani, J. Characterization of a VLC system in real museum scenario using diffusive LED lighting of artworks. Photonics Res. 2021, 548–557.
  9. Jung, J.; Yu, J.; Seo, Y.; Ko, E. Consumer experiences of virtual reality: Insights from VR luxury brand fashion shows. J. Bus. Res. 2021, 130, 517–524.
  10. Baek, S.; Gil, Y.-H.; Kim, Y. VR-based job training system using tangible interactions. Sensors 2021, 21, 6794.
  11. Battistoni, P.; Di Gregorio, M.; Romano, M.; Sebillo, M.; Vitiello, G.; Brancaccio, A. Interaction design patterns for augmented reality fitting rooms. Sensors 2022, 22, 982.
  12. Levinson, P. The Soft Edge: A Natural History and Future of the Information Revolution; Routledge: London, UK, 1998; ISBN 978-0415197724.
  13. Bolter, J.D.; Grusin, R. Remediation: Understanding New Media; MIT Press: Cambridge, MA, USA, 2000; ISBN 978-0262522793.
  14. Dobson, S. Remediation. Understanding new media—Revisiting a classic. Int. J. Media Tech. Lifelong Learn. 2009, 5, 1–6.
  15. Hwang, J.-S.; Park, Y.-J.; Lee, D.-H. Digital convergence and change of spatial awareness. Korean Inf. Soc. Develop. Inst. Rep. (KISDI) 2009, 9, 1–128.
  16. Bamodu, Q.; Ye, X. Virtual reality and virtual reality system components. In Proceedings of the 2nd International Conference on Systems Engineering and Modeling (ICSEM 2013), Beijing, China, 21–22 April 2013; pp. 765–767.
  17. Forlizzi, J.; Battarbee, K. Understanding experience in interactive systems. In Proceedings of the 5th Conference on Designing Interactive Systems: Processes, Practices, Methods, and Techniques (DIS 2004), Cambridge, MA, USA, 1–4 August 2004; pp. 261–268.
  18. Wu, S.; Kang, Y.; Ko, Y.; Kim, A.; Kim, N.; Ko, H. A Study on the case analysis and the production of 3D digital fashion show. J. Fash. Bus. 2013, 17, 64–80.
  19. McKay, E.N. UI is Communication; Morgan Kaufmann: Burlington, MA, USA, 2013; ISBN 978-0123969804.
  20. Ahn, D.-K.; Lee, W.; Lee, J.-W. UI design consideration for interactive VR fashion-show. In Proceedings of the 2019 Spring Korea Game Society Conference, Seoul, Republic of Korea, 17–18 May 2019; pp. 259–261.
  21. Volino, P.; Cordier, F.; Magnenat-Thalmann, N. From early virtual garment simulation to interactive fashion design. Comput. Aided Des. 2005, 37, 593–608.
  22. Kim, S.; Ahn, D.-K. Analysis of substantial visual elements in 3D digital fashion show. J. Commun. Des. 2016, 57, 305–320.
  23. Ahn, D.-K.; Chung, J.-H. A study on material analysis with usability for virtual costume hanbok in digital fashion show. J. Digit. Converg. 2017, 15, 351–358.
  24. Marvelous Designer. Available online: https://marvelousdesigner.com (accessed on 1 July 2022).
  25. Unity. Available online: https://unity.com (accessed on 1 July 2022).
  26. Norman, D. The Design of Everyday Things; Basic Books: New York, NY, USA, 1988; ISBN 978-0465067107.
  27. Lee, K.-H. UX/UI Design to Create User Experience; Freelec Books: Bucheon, Republic of Korea, 2018; ISBN 978-8965402169.
  28. Purwar, S. Designing User Experience for Virtual Reality Application. Available online: https://uxplanet.org (accessed on 1 August 2022).
  29. Autodesk. Available online: https://autodesk.com (accessed on 1 July 2022).
  30. Adobe. Available online: https://adobe.com (accessed on 1 July 2022).
  31. OptiTrack. Available online: https://optitrack.com (accessed on 1 July 2022).
  32. VIVE. Available online: https://www.vive.com (accessed on 1 July 2022).
  33. Kim, Y.; Baek, S.; Bae, B.C. Motion capture of the human body using multiple depth sensors. ETRI J. 2017, 39, 181–190.
  34. Cheng, D.; Duan, J.; Chen, H.; Wang, H.; Li, D.; Wang, Q.; Hou, Q.; Yang, T.; Hou, W.; Wang, D.; et al. Freeform OST-HMD system with large exit pupil diameter and vision correction capability. Photonics Res. 2022, 10, 21–32.
Figure 1. Overview of the proposed approach: exploring the effects of interactivity on user experiences in an interactive VR fashion show.
Figure 2. Development of digital fashion shows.
Figure 3. Design and implementation of the IVR fashion show: (a) selecting 2D and 3D graphic tools, (b) creating 3D data (avatar model, digital clothes, and animation), (c) applying to the 3D game engine, and (d) evaluating UX.
Figure 4. UIs for interaction components.
Figure 5. Digital clothes: (a) wool, (b) cotton, and (c) silk.
Figure 6. Digital clothes applied to an avatar model in steps: (a) creating a pattern, (b) fitting to the avatar, (c) shading and coloring the fabric properties, (d) mapping the texture on the clothing, and (e) posing the avatar.
Figure 7. Asset data used for IVR fashion show: (a) 3D models (file format: OBJ), (b) animation (file format: FBX), (c) images for UI and textures (file formats: PNG, TGA, and PSD), and (d) clothing simulation (file format: ABC).
Figure 8. Scene changes from a selection of (a) point of view, (b) digital clothes, and (c) illumination effect.
Figure 9. Comparison of VR fashion shows between (a) one-way and (b) interactive experiences.
Figure 10. Comparison of UX between the one-way and interactive experiences: an average value of the participants in the first survey is used for assessing the level of influence of interaction components, change of POV (P), clothes (C), effect (E), and sound (S), on user experiences, interest (I), immersion (M), fun (F), and usability (U).
Figure 11. Comparison of UX between the one-way and interactive experiences: an average value of the participants in the second survey is used for assessing the level of influence of interaction components, change of POV (P), clothes (C), effect (E), and sound (S), on user experiences, interest (I), immersion (M), fun (F), and usability (U).
Figure 12. Comparison of the radar charts from (a) one-way and (b) interactive experiences: an average value from the second survey is used for assessing the level of influence of interaction components on user experiences (including cybersickness).
Table 1. Interaction components selected for the IVR fashion show.
Interaction Components    Importance (Scale: 1 to 10)
Point of View             9
Audience                  5
Clothes                   10
Walking Model             7
Stage                     6
Walking Animation         4
Illumination Effect       9
Background Sound          9
Table 2. Fabric properties of digital clothes.
Property     Element      Description
Thickness    Stretch      Relaxation of the slack
             Bend         Bending strength for the turnaround
             Friction     Deformation upon impact
Weight       Density      Density of garment
             Air Drag     Resistant strength weight
             Gravity      Strain control on ground
Color        Primary      Adjust primary color
             Reflection   Reflectance value for light
             Opacity      Control the transparency
Table 3. t-test results of the UX.
UX Component   Independent Variable   One-Way Average (S.D.)   Interactive Average (S.D.)   t-Value   p-Value
Interest       POV                    3.63 (1.04)              4.34 (0.74)                  6.09      <0.001
               Cloth                  3.59 (0.91)              4.42 (0.66)                  8.09      <0.001
               Effect                 3.57 (0.91)              4.18 (0.74)                  5.70      <0.001
               Sound                  3.67 (0.92)              3.82 (0.86)                  2.17      <0.05
Fun            POV                    3.43 (1.06)              4.43 (0.71)                  8.59      <0.001
               Cloth                  3.40 (1.03)              4.53 (0.59)                  10.43     <0.001
               Effect                 3.44 (0.99)              4.37 (0.67)                  8.52      <0.001
               Sound                  3.41 (1.03)              4.12 (0.74)                  6.13      <0.001
Immersion      POV                    3.70 (0.98)              4.32 (0.84)                  5.26      <0.001
               Cloth                  3.68 (0.89)              4.12 (0.87)                  3.87      <0.001
               Effect                 3.58 (0.99)              4.00 (0.86)                  3.51      <0.001
               Sound                  3.78 (0.90)              3.84 (0.89)                  0.52      >0.05
Usability      POV                    3.50 (1.12)              4.40 (0.84)                  7.04      <0.001
               Cloth                  3.43 (1.03)              4.31 (0.81)                  7.36      <0.001
               Effect                 3.21 (1.08)              3.86 (0.99)                  4.86      <0.001
               Sound                  3.21 (1.04)              3.73 (1.01)                  3.93      <0.001
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
