Article

Research on Effective Advertising Types in Virtual Environment

1 Department of Culture and Technology Convergence, Changwon National University, Changwon 51140, Republic of Korea
2 Department of Future Technology, Koreatech, Cheonan 31253, Republic of Korea
3 Department of Culture Technology, Changwon National University, Changwon 51140, Republic of Korea
* Author to whom correspondence should be addressed.
Appl. Sci. 2023, 13(12), 7063; https://doi.org/10.3390/app13127063
Submission received: 18 May 2023 / Revised: 6 June 2023 / Accepted: 10 June 2023 / Published: 12 June 2023
(This article belongs to the Special Issue User Experience in Extended Reality)

Abstract

Virtual reality (VR) platforms apply various types of advertisements (ads) to promote brands in collaboration with companies. This study aims to identify effective advertisement types by verifying user responses in a VR environment. First, by analyzing cases of advertisements in immersive content, the advertisement types in VR were defined as avatar costumes, products, and wall posters. User response was measured in two categories: gaze response, measured by the eye-tracking VR advertisement monitoring system (EVAMS), and advertisement effect, analyzed through surveys. The analysis showed that, among the advertisement types, avatar costumes produced the highest visual attention and advertisement effect. In addition, correlation analysis between visual attention and advertisement effect revealed positive relationships between the number of fixations and advertisement attention, and between fixation time and advertisement recall. Thus, it was confirmed that the higher the number of fixations and the longer the fixation time, the greater the advertisement effect. The results of this study can serve as a reference for effective advertisement direction in VR content development and advertisement production.

1. Introduction

According to the Facebook report “AR/VR: Opening a new world,” the development of virtual reality (VR) technology is beginning in the fields of advertisement, art, and entertainment, and ways to apply ads to VR content are under review [1]. In addition, Grand View Research predicted that VR and the metaverse would account for more than 50% of advertisements in the field and would expand further over the next five years [2]. Accordingly, research is being conducted on utilizing advertisements as a revenue model for content and on providing reliable data to advertisers. There is also a growing need for methods to direct ads efficiently without interfering with the user’s virtual world experience.
To precisely analyze the advertisement effect in a virtual space, it is necessary to examine the user’s response to the advertisement. In the field of advertising, various physiological measurement methods, such as eye-tracking, EEG measurement, and heart rate measurement technologies, are used. In particular, eye-tracking technology can immediately and objectively measure a user’s gaze response, enabling accurate analysis of visual stimuli. It is also useful for detailed user gaze-response analysis because it can obtain various data such as gaze frequency, duration, and order [3,4].
Therefore, we aim to identify effective advertisement types by verifying user responses, in terms of visual attention and advertisement effect, to advertisements presented in various forms in a VR environment. First, the types of advertisements in VR content and previous studies on visual attention and eye-tracking technology are reviewed, and the advertisement types and eye-tracking indices used in this study are selected (Section 2). Section 3 presents the eye-tracking VR advertisement monitoring system, which applies eye-tracking metrics developed to measure gaze responses to advertisements in a VR environment. Section 4 describes the advertisement effect measurement experiment, including eye-tracking for visual attention measurement using the system and the advertisement effect survey. Sections 5 and 6 compare the experimental results to derive effective advertisement types and analyze the correlation between gaze response and advertisement effectiveness.

2. Related Work

2.1. Advertisement in 3D Spatial Content

Virtual-space-based content, including VR, has been discussed as reflecting the format of existing 3D games, and attempts are being made to apply in-game advertisement formats to VR content. In a 3D game space, advertisements are applied as sponsorship, branded characters (appearance), real-world analogs (billboards and posters), and product placements (real-world products, clothing, costumes, items, and buildings) [5]. Because these did not significantly disrupt gameplay, users showed a favorable attitude toward them. For example, one study surveyed 1350 male players aged 13–44 years and found that 67% thought in-game advertisements made the game more realistic, and 40% thought this type of advertisement influenced their purchase decisions [6]. Another study showed that repeated exposure to advertisement messages in sports video games had a significant positive effect on brand recall and awareness, brand attitude, and brand purchase intention [7].
Advertisements applied to virtual-space-based content have mostly been studied on metaverse platforms, where the importance of ads grows as commerce services within the content expand. By advertisement method, five types have been derived: use as an item, reproduction of real-world advertisements, temporary game events, the brand’s own games, and exposure to advertisement banners [8]. By level of participation in the metaverse content and purpose of utilization, metaverse advertising is classified as the virtual world type, which uses the brand’s own virtual space; the information-providing type, which provides brand information; the participation-inducing type, which induces game participation; and the creative tool type, in which brand products are produced in the virtual world [9]. Finally, by placement method, advertisements are classified into banner advertisements, virtual outdoor advertisements, reproductions of offline stores, and sales of branded products and services in virtual spaces [10].
Based on these studies, we classify the types of advertisements applied to a virtual environment according to the exposure method, dividing them into direct exposure and indirect exposure through the background. Direct presentation occurs when a character wears a brand costume or item, or when a real product is shown. Indirect presentation corresponds to presenting brand information through spatial elements, such as wall posters, or placing the brand’s building in the scene. Among these, we selected three advertisement types that could be applied within one VR space (an avatar costume, product placement, and a wall poster) and examined their visual attention and advertisement effects.

2.2. Visual Attention and Eye-Tracking

Visual attention operates selectively so that visual information can be processed efficiently and visual stimuli resolved cognitively [11]. As the first step toward encouraging consumers to pay attention to advertisements, it is essential to review their production, placement, and marketing application: if an advertisement does not attract consumers’ visual attention, it cannot be perceived or interpreted [12]. Advertisement effects vary depending on where users focus [13,14]. Therefore, it is necessary to analyze which advertisement elements consumers pay attention to and how long they look at them.
Previously, visual attention was mostly measured using post-questionnaires investigating advertisement recall, consumer attitudes toward brands, and purchase intentions [15,16,17]. These measurements have a long history and are considered reliable. However, they are not conducted simultaneously with advertisement exposure, and subjective interpretation inevitably intervenes before the survey is completed. Therefore, questions have been raised as to whether such results accurately reflect advertising effects. To compensate, quantitative measurement technology such as eye-tracking, which captures the response at the moment a user sees an advertisement, is required.
Eye-tracking is a technique that collects and analyzes information on how humans accept and respond to visual stimuli by observing eye movements [18]. Responses to visual stimuli are collected in real-time, which has the advantage of providing objective data with no time lag between stimuli and responses [19]. Furthermore, with the development of eye-tracking technology, modern eye-tracking equipment can record large amounts of eye movements under natural exposure conditions with high precision and low cost, making it easy to measure visual attention [3,14].
Recently, eye-tracking has been used to analyze user reactions in virtual environments, since applying it to VR can help analyze user gaze responses comparable to those in real environments [20]. For example, Kellogg, a food manufacturing and marketing company, used VR and eye-tracking to analyze consumer reactions to a new product before its launch. By implementing a virtual store, data on the billboards and shelf placements viewed, the routes taken, and the viewing times were collected, providing data for effective product marketing [21].
Many studies have used fixation to analyze gaze responses with eye-tracking technologies, because perceiving a visual object and processing its information requires viewing the object for a certain amount of time. Accordingly, related research defines fixation as a state of fixing the eyes on a visual stimulus for a certain period [4,22]. In general, a fixation is identified when the gaze dwells for more than 0.1–0.5 s, with the threshold adjusted to suit each study [23,24,25]. Fixation metrics have been expanded to include fixation duration, number of fixations, first fixation, and entry time.
An analysis of previous eye-tracking research revealed that the number of fixations and the fixation time are the metrics most frequently used to measure visual attention. Therefore, we adopted them as indicators of the user’s visual attention in the VR environment: the number of fixations indicates how often an area of interest was gazed at, and the fixation time is the total amount of time spent gazing at it. Furthermore, we propose an eye-tracking VR advertisement monitoring system to measure visual attention to advertisements presented in a VR environment.
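As a concrete illustration, the two metrics above can be derived from a stream of raw gaze samples. The following sketch is a hypothetical, simplified model and not the EVAMS implementation: it assumes gaze samples arrive at a fixed interval and that each sample is labeled with the advertisement the gaze ray hits, counting a dwell run as one fixation only if it lasts at least a minimum duration (the 0.3 s validity threshold adopted later in Section 4.1).

```python
def fixation_metrics(samples, dt=0.02, min_dur=0.3):
    """Aggregate raw gaze samples into per-advertisement fixation metrics.

    samples: the advertisement name hit by the gaze ray at each sampling
             tick (None when no ad is hit), one sample every dt seconds.
    Returns {ad: (number_of_fixations, total_fixation_time_s)}, counting
    a dwell run as one fixation only if it lasts at least min_dur seconds.
    """
    metrics = {}
    run_target, run_len = None, 0

    def flush():
        # Close the current dwell run; keep it only if long enough.
        nonlocal run_target, run_len
        if run_target is not None and run_len * dt >= min_dur:
            n, t = metrics.get(run_target, (0, 0.0))
            metrics[run_target] = (n + 1, t + run_len * dt)
        run_target, run_len = None, 0

    for s in samples:
        if s == run_target:
            run_len += 1
        else:
            flush()
            run_target, run_len = s, 1
    flush()
    return metrics
```

For example, a 0.4 s dwell on a poster counts as one fixation of 0.4 s, while a 0.2 s glance is discarded as below the threshold.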

3. Eye-Tracking VR Advertisement Monitoring System

The eye-tracking VR advertisement monitoring system (EVAMS) tracks the eye movements of users experiencing VR content and analyzes how they respond to advertisements in a virtual space. When a user wears a VR head-mounted display (HMD) equipped with an eye-tracking function and enters the VR environment, EVAMS starts automatically. When the user gazes at an advertisement in the VR environment, the system identifies it and records eye-tracking data on the number of fixations and the fixation time for each advertisement. When the user finishes observing the VR environment, the eye-tracking data are automatically saved as a CSV file, and the user’s gaze responses are then analyzed.

3.1. System Development

The proposed system was developed in C# using the Unity game engine (version 2020.3.9f1). The SRanipal SDK, an eye-tracking library provided by HTC Vive, was used to reflect the direction and position of a subject’s gaze in the VR environment. In Unity, the user corresponds to the Main Camera. To extract information about the user’s gaze, the Gaze Ray object in the SRanipal SDK was assigned as a child of the Main Camera. To visualize eye movements and extract collision data in detail, the subject’s gaze was implemented as a straight line (Ray) extending from the Main Camera toward the gaze target (Figure 1c). The Main Camera was assigned a Tracked Pose Driver component (Figure 1a), which receives real-time information about the HMD, with the Pose Source referencing the HMD to extract the center eye. The Tracking Type was set to rotation only so that the user could rotate 360° with their position fixed. The Limit Scene script allowed the administrator to adjust the experiment time and automatically shut down the system when the experiment ended.
The Gaze Ray was assigned a Collider component and a Rigidbody component so that eye-tracking data were created when an advertisement was recognized (Figure 1b). The Collider was set to the same thickness and length as the Ray so that eye movement was accurately reflected. The Rigidbody was configured with its position and rotation fixed so that its position and shape did not change after collisions with other objects. Furthermore, the Collider Manager script and the Write CSV script were applied so that when the Ray touched an advertisement, eye-tracking data were written and stored in a file at the end. The Collider Manager script created real-time data on the number of fixations and the fixation time, including the name of the advertisement, the metric, and the measurement. In this script, eye-tracking data creation proceeds in three steps: start, progress, and end. In the start step, the number of fixations is measured from the moment the Ray touches the advertisement, and one fixation is added each time the user gazes at it. In the progress step, the time spent gazing at each advertisement is recorded in units of 0.02 s; when the subject looks at another object and then gazes at the advertisement again, the fixation time continues to accumulate. In the end step, the fixation times measured in the progress step are summed and output. The Write CSV script saved the eye-tracking data: the data created during the experiment were automatically stored in the Unity project as a CSV file shortly after the experiment was completed. In addition, to measure the advertisement effect, a tag was assigned to distinguish each advertisement, and when the Ray contacted a Collider, the collision data were returned.
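For reference, the start/progress/end logic of the Collider Manager and Write CSV scripts can be sketched outside Unity. The Python sketch below is a hypothetical port (the class and method names are assumptions, not the actual C# scripts): entering an advertisement’s collider increments its fixation count, each 0.02 s tick inside the collider accumulates fixation time, and the end step serializes the totals to CSV.

```python
import csv
import io

class GazeAccumulator:
    """Hypothetical port of the Collider Manager / Write CSV logic."""
    TICK = 0.02  # progress-step resolution used in the paper

    def __init__(self):
        self.fixations = {}  # ad name -> number of fixations
        self.dwell = {}      # ad name -> accumulated fixation time (s)

    def on_ray_enter(self, ad):
        # Start step: one fixation is added each time the Ray touches the ad.
        self.fixations[ad] = self.fixations.get(ad, 0) + 1

    def on_ray_stay(self, ad):
        # Progress step: time accumulates in 0.02 s units; returning to an
        # ad after looking elsewhere keeps accumulating onto the same total.
        self.dwell[ad] = self.dwell.get(ad, 0.0) + self.TICK

    def to_csv(self):
        # End step: synthesize the measurements and output them as CSV.
        buf = io.StringIO()
        writer = csv.writer(buf)
        writer.writerow(["advertisement", "fixations", "fixation_time_s"])
        for ad in sorted(self.fixations):
            writer.writerow([ad, self.fixations[ad],
                             round(self.dwell.get(ad, 0.0), 2)])
        return buf.getvalue()
```

In Unity, `on_ray_enter` and `on_ray_stay` would correspond to collider enter/stay callbacks on the Gaze Ray.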

3.2. Prototyping the VR Content for Experiment

The VR content for the experiment included EVAMS and consisted of Start, Test, and End scenes. The content shown in each scene differs for the subject and the experiment manager. In the Start scene, experimental guidance appears for the subject, while the launch calibration and start buttons are presented to the manager (Figure 2a). The eye calibration system provided by SteamVR is activated when the manager clicks the launch calibration button; after calibration is complete, the screen returns to the previous view, and the manager can start the experiment by clicking the start button. In the Test scene, EVAMS is applied and the gaze response is collected while the subject performs the experiment (Figure 2b). The experimental situation can be checked in real time on a PC screen, and the manager can configure the experimental environment by changing the virtual space, the advertisements, or their materials. When the time set for the Test scene is exceeded, the content automatically transitions to the End scene, which displays ‘End’ (Figure 2c).

4. Method

4.1. Research Model

This study aims to verify user responses in a VR environment and derive an effective advertisement type. The advertisement types applied to VR were avatar costumes, products, and wall posters. An avatar costume is a brand-logo T-shirt applied to a fixed avatar. For the product, a 3D model similar to the actual product was placed, and for the wall poster, a brand logo poster was attached to the wall.
User response was investigated in two stages: visual attention and advertisement effect. Visual attention is the gaze response collected by EVAMS. Because memories cannot be formed when the fixation time is less than 0.3 s [4,26], we judged a fixation as valid only when the subject gazed at an advertisement for more than 0.3 s. While the subjects experienced the VR environment, the number of fixations and the fixation time for each advertisement were measured to analyze which advertisement types the subjects gazed at frequently and for a long time.
The advertisement effect, as measured by the survey, was a cognitive response to the advertisement, and advertisement attention and recall were investigated. Advertisement attention refers to the level of conspicuousness and attention due to visual characteristics that arouse interest in the advertisement of a particular brand or product or the level at which consumers perceive it as important [12]. The form of advertisement works as a visual variable that affects the advertisement effect, and it has been shown that products and brands with high attention are highly likely to be selected [27]. Advertisement recall is an index that measures how accurately people remember an advertisement or how much attention they pay to it. When visual attention is paid to an advertisement, the consumer recognizes and remembers it through cognitive processing [28]. In addition, advertisements are directly related to consumer activities, and recall is higher when they are presented clearly [29]. Therefore, advertisement attention and advertisement recall were selected as measurement indicators to measure ad recognition and memory according to advertisement type in a virtual environment.
In addition, we analyzed the correlation between visual attention and advertisement effect. Existing studies that use eye-tracking employ the number of fixations as an index of the level of visual attention. However, because self-administered surveys involve subjective judgment and there is a time lag between the experiment and the survey, quantitative measurement using eye-tracking technology is necessary; eye-tracking and questionnaires are therefore sometimes conducted in parallel to analyze visual attention in depth. Likewise, we analyzed the relationship between the number of fixations and advertisement attention, and between fixation time and advertisement recall. Subjects frequently gaze at advertisements in which they are interested [30,31], which should result in high advertisement attention; moreover, recognizing an advertisement requires sustained visual effort over a certain period, so the more time paid to an advertisement, the more readily memories such as recall are formed [29].

4.2. Materials and Equipment

A Vive Pro Eye HMD with an integrated Tobii eye-tracking function was used as the VR device. It outputs gaze data at a frequency of 120 Hz, has an accuracy of 0.5–1.1°, and has a trackable field of view of 110°. It also supports eye calibration through SteamVR, enabling accurate eye-tracking for each subject. In addition, the Vive Pro Eye HMD has been confirmed to be an appropriate tool for gaze-response data collection in terms of eye movement latency, velocity, error rate, and pupil response [32,33].
The environment in which the experiment was conducted is illustrated in Figure 3. The subject was seated in the center of the space where the base stations were installed, on a chair that could rotate 360° to observe the virtual space. A base station is a device that maps the range of the virtual space onto real space. In this experiment, the base stations were installed on the ceiling, facing each other at opposite ends of a diagonal across the subject. The experiment manager was positioned close to the subject and equipment so that the experiment could be conducted and its status monitored on a PC.

4.3. Stimulus

One brand was used as the experimental stimulus to exclude variables related to brand preference and awareness. Among the 100 best global brands, Coca-Cola, which offers all three types (clothes, products, and wall posters), was selected as the experimental brand [34]. 3D models from the Unity Asset Store and Adobe Mixamo were used as experimental stimuli. All 3D models were fixed without movement: the clothing was the NPC avatar’s costume, the product was a 3D model of a real Coca-Cola can, and the poster was a Coca-Cola logo image. The texture images of the 3D models were edited using Adobe Photoshop to insert the Coca-Cola logo into each model. The experimental stimuli, produced as avatar costumes, products, and wall posters, are shown in Figure 4.
The virtual environment in which the stimuli were placed was composed of the brand’s own space to exclude variables caused by the gap between the advertisement and the space and was created by referring to the Coca-Cola pop-up store that was operated in reality. In addition, to exclude variables due to the distance and size between the subject and the advertisement, they were placed at the same distance and in a similar size, as shown in Figure 5.

4.4. Subjects and Experiment Process

A small-scale experimental study was conducted at Changwon National University in South Korea. The experiment included 30 participants (14 female, 16 male; mean age 24 years) who volunteered from the university community. As listed in Table 1, the eye-tracking experiment and survey were conducted using EVAMS. Before the experiment, we informed the subjects that it was an eye-tracking experiment and guided them to observe the virtual environment freely for 2 min after eye calibration. The subjects then put on the VR HMD, and to improve the accuracy of the eye-tracking data, eye calibration was performed for each subject. Each subject then observed the VR environment in which the experimental stimuli were placed for 2 min. After the experiment was completed, the participants completed the survey.

4.5. Survey and Analysis Method

To measure the effectiveness of the advertisements, advertisement attention and advertisement recall were investigated using items from related studies, modified to suit this study. The advertisement attention and advertisement recall items were presented identically for the three advertisement types and evaluated on a 5-point Likert scale. Table 2 lists the five items for advertisement attention [35,36]. Advertisement recall was measured with three multiple-choice items and an additional item describing memorable elements [37,38]. Cronbach’s α was 0.683 for the advertisement attention items, 0.637 for advertisement recall, and 0.785 for the total questionnaire. In general, a Cronbach’s α of 0.6–0.7 is considered an acceptable level of reliability [39].
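The Cronbach’s α reported above is computed from the item variances and the variance of each respondent’s total score: α = k/(k−1) · (1 − Σs²ᵢ/s²ₜ). A minimal pure-Python sketch of this formula follows (illustrative only; the study’s reliability analysis was run in SPSS):

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a questionnaire scale.

    items[i][j] is respondent j's score on item i (k items, n respondents).
    """
    k = len(items)
    n = len(items[0])

    def var(xs):
        # Sample variance (ddof = 1), as used by SPSS.
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    sum_item_vars = sum(var(item) for item in items)
    totals = [sum(items[i][j] for i in range(k)) for j in range(n)]
    return k / (k - 1) * (1 - sum_item_vars / var(totals))
```

Perfectly consistent items yield α = 1; values between 0.6 and 0.7 are conventionally treated as acceptable, as noted above.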
Data analysis was performed using SPSS Statistics ver. 27, with significance assessed at the commonly used 95%, 99%, and 99.9% confidence levels. First, the reliability of the survey was analyzed. Then, a one-way analysis of variance (ANOVA) was conducted to examine the mean differences in gaze response and advertisement effect according to advertisement type. The eye-tracking data were taken directly from EVAMS, and advertisement attention was calculated as the average of the items surveyed on the 5-point Likert scale. Finally, correlation analysis was conducted to examine the relationship between the gaze responses derived from EVAMS and the advertisement effect survey, and the results were interpreted according to standard correlation coefficient criteria [40].
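The two analyses above reduce to standard formulas: the one-way ANOVA F statistic is the ratio of the between-group mean square to the within-group mean square, and the Pearson correlation coefficient normalizes the covariance of two paired samples by their standard deviations. A minimal pure-Python sketch of both follows (illustrative only; the actual analysis was run in SPSS):

```python
def one_way_anova_F(*groups):
    """F statistic for a one-way ANOVA:
    between-group mean square / within-group mean square."""
    k = len(groups)
    N = sum(len(g) for g in groups)
    grand_mean = sum(sum(g) for g in groups) / N
    means = [sum(g) / len(g) for g in groups]
    ss_between = sum(len(g) * (m - grand_mean) ** 2
                     for g, m in zip(groups, means))
    ss_within = sum(sum((x - m) ** 2 for x in g)
                    for g, m in zip(groups, means))
    return (ss_between / (k - 1)) / (ss_within / (N - k))


def pearson_r(x, y):
    """Pearson correlation coefficient between two paired samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)
```

The F statistic is then compared against the F distribution with (k−1, N−k) degrees of freedom to obtain the p-values reported in Section 5.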

5. Results

5.1. Visual Attention and Advertisement Effect

As listed in Table 3, the number of fixations differed significantly by advertisement type at the 95% confidence level (F = 3.629, p = 0.031). The averages were highest for the avatar costume (5.70), followed by the wall poster (5.03) and the product (4.33); the subjects gazed at the avatar costume most often.
As listed in Table 4, fixation time also showed a significant difference at the 95% confidence level (F = 4.847, p = 0.01). The average fixation times were, in order, avatar costumes (7.36), wall posters (7.09), and products (7.09); the subjects gazed at the avatar costumes the longest.
As listed in Table 5, advertisement attention showed a significant difference by advertisement type at the 95% confidence level (F = 3.721, p = 0.028). The averages were, in order, avatar costumes (4.25), wall posters (4.02), and products (3.79); the subjects felt that the avatar costumes induced the greatest visual attention.
Advertisement recall was calculated by adding the average of the 5-point Likert items to the converted value of the descriptive item. As listed in Table 6, there was a significant mean difference at the 95% confidence level (F = 3.451, p = 0.036). The averages were highest for the avatar costumes (5.10), followed by the wall posters (4.62) and products (4.47); the subjects rated the avatar costumes as the most memorable.

5.2. Correlation between Visual Attention and Ad Effectiveness

As listed in Table 7, there was a significant correlation between the number of fixations and advertisement attention at the 90% confidence level; the correlation coefficient was positive at 0.206, but the correlation was weak.
Correlation analysis between fixation time and advertisement recall revealed a significant correlation at the 95% confidence level, with a correlation coefficient of 0.239, indicating a positive but low correlation.

6. Discussion

The purpose of this study was to derive effective advertisement types by analyzing visual attention and advertisement effects according to the advertisement types applied in VR. The advertisement types were defined as avatar costumes, products, and wall posters. To verify the user response to the advertisement, visual attention was measured by developing EVAMS, which measures the number of fixations and fixation time. To analyze the participants’ subjective advertisement effects through the survey, advertisement attention and advertisement recall were investigated. The results are as follows.
The analysis of visual attention according to advertisement type through EVAMS showed significant mean differences in both the number of fixations and the fixation time. For both metrics, avatar costumes ranked highest, followed by wall posters and products. The survey of the subjects’ subjective advertisement effects produced the same ordering for advertisement attention and advertisement recall: avatar costumes, wall posters, and products. Thus, the avatar costume evoked the highest visual attention, followed by the wall poster and the product. This supports previous findings that the more salient and conspicuous an advertisement type related to the direct behavior of users or people, the higher the attention to and recall of the ad [41,42]. In the case of the wall poster and the product, however, the subjects appeared to perceive them as background, as in the creative placement of Babin and Carder (1996) [16].
The correlation analysis, conducted to connect the quantitative and qualitative analyses of user responses to advertisement types, showed significant correlations between the number of fixations and advertisement attention and between fixation time and advertisement recall. For the number of fixations and advertisement attention, the correlation coefficient was positive at 0.206 but low, meaning that the more frequently a user fixated on an advertisement, the higher the level of attention to it. For fixation time and advertisement recall, the correlation coefficient was 0.239, indicating a positive correlation: the longer the fixation on an advertisement, the higher its recall. Therefore, the higher the number of fixations and the longer the fixation time, the more positive the effect on the advertisement, which is consistent with previous studies [29,43]. Furthermore, as the EVAMS results and the survey results were positively related, eye-tracking technology is considered capable of collecting the gaze-response data needed to measure advertisement effectiveness [32,33].
This study has several limitations. First, the experiment was designed so that subjects observed the advertisements while rotating 360° in the VR environment, and the same advertisements were presented in the same locations to all users. This excludes the case in which a user moves freely through a virtual space using a control device such as a VR controller, so it does not reflect actual user activity; it also excludes variables related to the location of an advertisement and the order in which a user encounters it. Second, the space was designed for the experiment and differs from the environments actual VR users experience, which are characterized by rich experiences across various places; future work should therefore apply various advertisement types to diverse spaces. Third, stricter criteria are needed for selecting experimental stimuli and brands. Consumer responses to advertisements are likely influenced by existing attitudes toward the brand, so the experimental stimulus brand should be selected with this in mind. Fourth, the sample size was small; a larger number of participants should be recruited to obtain more generalizable results. Finally, a comprehensive correlation between eye-tracking data and advertising effectiveness needs to be identified. We classified and analyzed each metric according to its purpose, but this was insufficient to draw a comprehensive conclusion about measuring advertising effectiveness. Addressing these limitations would yield more meaningful results.

7. Conclusions

In this study, to derive effective advertisement types for VR environments, we combined a quantitative method based on eye-tracking with a qualitative method based on a survey. EVAMS was developed to provide a methodology for measuring advertisements in a virtual environment and for directing VR advertisements. The results can serve as a reference for producing effective advertisements when directing ads for VR and developing VR content, and the system can help VR content producers and advertisers measure advertisement effectiveness and set advertising costs.
In future work, we plan to analyze advertisement effects according to the color, arrangement, and movement of advertisements in the virtual environment, reflecting user activities in VR. We will also analyze the accuracy of the gaze-response analysis performed with the eye-tracking VR monitoring system according to the user's position and the shape of the experimental space.

Author Contributions

Conceptualization, D.K.; methodology, J.K. and S.N.; software, D.K.; validation, J.K.; investigation, D.K.; writing—original draft preparation, D.K.; writing—review and editing, J.K. and S.N.; project administration, S.N.; funding acquisition, S.N. All authors have read and agreed to the published version of the manuscript.

Funding

This research was supported by the Institute of Information and Communications Technology Planning and Evaluation (IITP) grant funded by the Korean government (MSIT) (No. 2021-0-00986, Development of Interaction Technology to Maximize Realization of Virtual Reality Contents using Multimodal Sensory Interface).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

Figure 1. System Setting.
Figure 2. Individual Scenes of EVAMS.
Figure 3. Physical environment for the experiment.
Figure 4. VR experimental content configuration.
Figure 5. VR experimental content configuration.
Table 1. Experimental procedure.

Process                                                        Time (min)
Advance explanation   Experiment guide and wearing VR HMD      1–2
Experiment            Eye-tracking: calibration                1–2
                      Eye-tracking: VR environment             2
Survey                                                         5
Table 2. Survey reliability.

Item                      Question                             Cronbach's α
Advertisement attention   1. I looked at A with interest.      0.683
                          2. I looked at A.
                          3. A caught my eye.
                          4. I looked at A with attention.
                          5. A was an eyesore to me.
Advertisement recall      6. I remember A.                     0.637
                          7. A comes to mind.
                          8. I remember the location of A.
Total                                                          0.785
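For reference, the internal-consistency values reported in Table 2 follow the standard Cronbach's α formula. The sketch below is a minimal Python illustration of that computation; the item-score matrix is hypothetical and not the study's data.

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_respondents, k_items) score matrix."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_var = scores.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = scores.sum(axis=1).var(ddof=1)    # variance of summed scores
    return k / (k - 1) * (1.0 - item_var / total_var)

# hypothetical 5-point Likert responses to the five attention items
attention = [[4, 4, 5, 4, 3],
             [5, 4, 4, 5, 4],
             [3, 3, 4, 3, 3],
             [4, 5, 5, 4, 4],
             [2, 3, 3, 2, 3]]
alpha = cronbach_alpha(attention)
```

Items that co-vary strongly across respondents drive α toward 1; perfectly redundant items yield exactly 1.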
Table 3. One-way ANOVA of number of fixations by ad types.

ANOVA: Number of Fixations
                 Sum of Squares   df   Mean Square   F       Sig.
Between groups   28.022           2    14.011        3.629   0.031 **
Within groups    335.933          87   3.861
Total            363.956          89

Type of Advertisement   Mean   SD     Min   Max
Avatar costume          5.70   2.02   2     11
Product                 4.33   1.47   2     7
Wall poster             5.03   2.31   1     10

** p < 0.05.
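The one-way ANOVAs in Tables 3–6 decompose the total sum of squares into between-group and within-group parts and compare their mean squares. A minimal Python sketch of that decomposition, using hypothetical per-participant fixation counts (not the study's data) and cross-checked against SciPy:

```python
import numpy as np
from scipy.stats import f_oneway

# hypothetical per-participant fixation counts for each advertisement type
avatar  = np.array([5, 7, 4, 8, 6, 9])
product = np.array([4, 3, 5, 4, 5, 4])
poster  = np.array([6, 4, 5, 7, 3, 5])
groups = [avatar, product, poster]

grand = np.concatenate(groups)
ss_between = sum(len(g) * (g.mean() - grand.mean()) ** 2 for g in groups)
ss_within  = sum(((g - g.mean()) ** 2).sum() for g in groups)
df_between = len(groups) - 1            # k - 1
df_within  = grand.size - len(groups)   # N - k
f_manual = (ss_between / df_between) / (ss_within / df_within)

f_scipy, p_value = f_oneway(avatar, product, poster)  # same F statistic
```

The manual F and `scipy.stats.f_oneway` agree; the p-value is then read from the F distribution with (k − 1, N − k) degrees of freedom, matching the df columns in the tables.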
Table 4. One-way ANOVA of fixation time by ad types.

ANOVA: Fixation Time
                 Sum of Squares   df   Mean Square   F       Sig.
Between groups   105.708          2    52.854        4.847   0.010 **
Within groups    948.671          87   10.904
Total            1054.379         89

Type of Advertisement   Mean   SD     Min    Max
Avatar costume          7.36   3.06   1.66   15.54
Product                 4.94   2.59   0.95   10.46
Wall poster             7.09   4.08   0.77   15.53

** p < 0.05.
Table 5. One-way ANOVA of advertisement attention by ad types.

ANOVA: Advertisement Attention
                 Sum of Squares   df   Mean Square   F       Sig.
Between groups   2.888            2    1.444         3.721   0.028 **
Within groups    33.759           87   0.388
Total            36.647           89

Type of Advertisement   Mean   SD     Min    Max
Avatar costume          4.23   0.59   2.67   5.00
Product                 3.79   0.59   2.67   5.00
Wall poster             4.02   0.69   2.17   5.00

** p < 0.05.
Table 6. One-way ANOVA of advertisement recall by ad types.

ANOVA: Advertisement Recall
                 Sum of Squares   df   Mean Square   F       Sig.
Between groups   6.532            2    3.266         3.451   0.036 **
Within groups    82.334           87   0.946
Total            88.866           89

Type of Advertisement   Mean   SD     Min    Max
Avatar costume          5.10   0.82   3.33   6.00
Product                 4.47   1.02   2.00   6.00
Wall poster             4.62   1.06   2.33   6.00

** p < 0.05.
Table 7. Correlation between visual attention and ad effectiveness.

                          Number of Fixations   Fixation Time
Advertisement attention   0.206 *               0.085
Advertisement recall      0.314 ***             0.239 **

* p < 0.10, ** p < 0.05, *** p < 0.01.
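The coefficients in Table 7 are Pearson correlations between the gaze metrics and the survey scores. A minimal sketch of how such values are computed, using hypothetical paired observations rather than the study's data:

```python
import numpy as np
from scipy.stats import pearsonr

# hypothetical paired observations: fixations on an ad vs. recall score
fixations = np.array([2, 4, 5, 7, 8, 6, 3, 9])
recall    = np.array([3.0, 4.5, 4.0, 5.5, 5.0, 4.5, 3.5, 5.5])

r, p = pearsonr(fixations, recall)           # coefficient and two-sided p-value
r_np = np.corrcoef(fixations, recall)[0, 1]  # same coefficient via NumPy
```

A positive r, as in Table 7, indicates that higher fixation counts (or longer fixation times) go together with higher attention or recall scores.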
