Article

Employing Eye Tracking to Study Visual Attention to Live Streaming: A Case Study of Facebook Live

1 Department of Digital Media Design, Chang Jung Christian University, Tainan 711301, Taiwan
2 School of Information and Design, Chang Jung Christian University, Tainan 711301, Taiwan
3 Department of Computer Science and Information Engineering, National Taichung University of Science and Technology, Taichung City 404348, Taiwan
4 Bachelor Degree Program in Smart Life Application, Chang Jung Christian University, Tainan 711301, Taiwan
* Author to whom correspondence should be addressed.
Sustainability 2022, 14(12), 7494; https://doi.org/10.3390/su14127494
Submission received: 30 April 2022 / Revised: 13 June 2022 / Accepted: 14 June 2022 / Published: 20 June 2022
(This article belongs to the Special Issue Sustainable and Human-Centric E-Commerce)

Abstract

In recent years, the COVID-19 pandemic has given rise to a new business model, “live streaming + ecommerce”, a method of commercial sales aligned with the goal of sustainable economic growth (SDG 8). As information technology permeates the digital lives of internet users, the real-time, interactive nature of live streaming has transformed the traditional experience of audio and video content, moving toward a finer division of labor with multiple applications. This study used a portable eye tracker to collect eye movement data from 31 participants with prior experience of live streaming platforms while they watched Facebook Live. Four eye movement indicators, namely latency of first fixation (LFF), duration of first fixation (DFF), total fixation duration (TFD), and number of fixations (NOF), were used to analyze the distribution of visual attention across regions of interest (ROIs) and to address the study questions. The findings were as follows: (1) the fixation order of the ROIs on the live ecommerce platform differed between participants of different sexes; (2) the DFF of the ROIs differed between participants of different sexes; and (3) according to the TFD and NOF indicators, participants of different sexes paid the same attention to the live products. This study explored the visual search behavior of consumers watching live ecommerce and offers the results as a reference for operators and researchers of live streaming platforms.

1. Introduction

The United Nations launched 17 Sustainable Development Goals (SDGs) in 2015, including 169 targets that can be grouped into three pillars of sustainable development: economy, society, and the environment [1,2]. Interdisciplinary cooperation with other parties is required to achieve the SDGs, as the goals relate to all aspects of human endeavors, and in this context, business operations play an important role in supporting the achievement of the goals, as they involve many business activities that contribute to the improvement of living standards [3,4,5,6].
The impacts of the COVID-19 pandemic on governments, industries, and business activities worldwide have seriously challenged the achievement of the SDGs and increased the complexity of their intersections [7,8]. The need for digital technology applications has consequently increased, and the report of The World in 2050 (TWI2050) initiative, Transformations to Achieve the Sustainable Development Goals, recommends that the world adopt the United Nations SDGs as the basis for immediate key transformations [9]. Under the influence of COVID-19, many brick-and-mortar shops and businesses closed down, while online businesses benefited from the growth of the “stay-at-home economy” [10,11,12]. As a result, existing online business models flourished worldwide during the pandemic, giving rise to a new business model: live ecommerce platforms, which share the goal of sustainable economic growth with SDG 8. This reinforces the observation that, during the pandemic, people went out less but did not spend less, and that online and offline commerce can play complementary roles [13,14,15].
The main consumption scenario has shifted from brick-and-mortar shops to business activities that combine the internet with commerce (ecommerce), exemplified by the T2O (TV to online) business model, which links television programs with ecommerce to create an innovative model combining live streaming with ecommerce. This business model has brought consumers a new shopping experience and created the so-called “internet celebrity economy”, which in turn has created opportunities for the development of live ecommerce [16]. Live ecommerce is a new business model driven by real-time interaction: entertaining live content that meets consumers’ needs effectively stimulates them and turns them into fans who follow the broadcast [17,18,19], bringing attention-grabbing, revenue-generating traffic to live shows and providing a new channel of interaction and communication between consumers and merchants [20]. Consequently, ecommerce platforms have been actively adopting live ecommerce in recent years [21]; for example, Taobao, JD.com, Jumei Youpin, and Facebook fan pages have all launched live ecommerce features [22,23].
According to MIC’s survey data on live streaming in Taiwan, 31.3% of Taiwanese netizens follow a specific live streamer (a user who plays the role of a live stream host), and 78.4% of those who watch audio and video online have watched a live broadcast [24]. However, live streaming is not just a one-sided personal showcase for the live streamer; it also includes the live viewers’ interactions with, and reactions to, the content of the stream, which together form part of the live ecommerce platform. Thus, how to make consumers willing to learn about brands and products, share their ideas and needs, and receive immediate responses from live streamers has become the most important issue for live ecommerce in creating a commerce model distinct from the traditionally static display of ecommerce products [25]. Most existing studies of ecommerce live streaming are based on questionnaires or interviews, which are used to infer the participants’ mental activities. However, these methods may be biased by the participants’ memory errors, introspection errors, or subconscious attempts to conform to social expectations, which can make the research data insufficiently detailed or unable to capture the overall cognitive process, causing invalid data and experimental bias [26]. Therefore, in order to explore the cognitive processes of participants while they watched live ecommerce, this study used a more natural, real-time method based on physiological indicators to compensate for the shortcomings of traditional methods.
Due to the ease of use and maturity of eye tracking technology [27,28], eye tracking research has, since its inception in reading behavior analysis, been applied in a wide range of fields, such as cognitive science, psychology, human factors engineering, human–computer interaction, virtual reality, digital learning, artificial intelligence, and machine learning [29,30,31]. Eye tracking technology uses image processing to capture the infrared light reflected from the pupil and records eye movements with a special camera locked onto the eyes. Observing and analyzing eye movements facilitates the study of users’ perceptual recognition and attention distribution, and thus allows their cognitive processes to be inferred [32,33]. Fixations and saccades are the two event types most commonly analyzed in eye tracking studies [34,35,36]. A “fixation” is the state of visual attention on a specific area between one eye movement and the next, and is the primary means of receiving visual information. A “saccade” is a continuous, rapid movement between fixation points, i.e., a rapid eye movement toward a specific visual target [37,38,39]. The sequence of eye movements, comprising saccades and fixations, is known as the scan path. In addition, in order to record and analyze users’ browsing behavior in a physiologically and psychologically unaffected state, researchers must define regions of interest (ROIs) in which eye fixations and visual behavior are observed and collected [40,41,42,43,44]. Based on the interaction between ROIs and visual information, the latency of first fixation (LFF), duration of first fixation (DFF), total fixation duration (TFD), and number of fixations (NOF) are commonly used as evaluation criteria for processing eye movement data [45,46].
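As a concrete illustration of how fixations are separated from saccades in raw gaze data, the following sketch implements the common dispersion-threshold (I-DT) approach; the sample format and threshold values are illustrative assumptions, not details of this study's equipment or software.

```python
# Minimal sketch of dispersion-based fixation detection (I-DT). A window of
# gaze samples is grown while its spatial dispersion stays below a threshold;
# windows that last long enough are reported as fixations, and the fast
# movements between them correspond to saccades. Thresholds are illustrative.

def detect_fixations(samples, max_dispersion=30.0, min_duration_ms=100):
    """samples: list of (t_ms, x, y) gaze samples in time order.
    Returns a list of fixations as (start_ms, end_ms, centroid_x, centroid_y)."""
    fixations = []
    i = 0
    while i < len(samples):
        j = i
        window = [samples[i]]
        # Grow the window while the points stay within the dispersion threshold
        while j + 1 < len(samples):
            candidate = window + [samples[j + 1]]
            xs = [p[1] for p in candidate]
            ys = [p[2] for p in candidate]
            dispersion = (max(xs) - min(xs)) + (max(ys) - min(ys))
            if dispersion > max_dispersion:
                break
            window = candidate
            j += 1
        duration = window[-1][0] - window[0][0]
        if duration >= min_duration_ms:
            cx = sum(p[1] for p in window) / len(window)
            cy = sum(p[2] for p in window) / len(window)
            fixations.append((window[0][0], window[-1][0], cx, cy))
            i = j + 1  # skip past the detected fixation
        else:
            i += 1  # too short to be a fixation; advance one sample
    return fixations
```

A dwell near one location thus yields a single fixation record, while the rapid jump to the next location is implicitly treated as a saccade between consecutive fixations.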
In recent years, the rapid development of the internet has led to the explosive growth of ecommerce, resulting in innovative models for live ecommerce. By recognizing the importance of the behavioral process, some researchers have used eye trackers to record the order in which consumers browse products, as well as the length of time they spend on messages in a particular browsing task or behavior, in order to understand their visual search methods and decision-making processes.
Based on the aforementioned research background and motivation, this study attempted to explore and evaluate the study of the participants’ fixation sequence of their visual attention, duration of their first fixation, and eye movement behavior analysis by using a portable eye tracker in the four ROIs defined by the researcher while watching live ecommerce. Therefore, the questions to be answered in this study are as follows:
In the ecommerce experiment, in terms of the participants:
  • What is the order of fixation on the ROIs?
  • What is the order of the attention span for the ROIs?
  • What is the amount of dedicated fixation given to each ROI?

2. Related Work

2.1. Business Model and Studies of Live Ecommerce

With the popularity of mobile devices, the rapid development of 5G, and the rise of the OTT (over-the-top) audio and video industry, live streaming applications have taken off, and the deep integration of online live streaming and ecommerce has attracted a large amount of traffic and many business opportunities for live ecommerce [47]. In view of this, many ecommerce companies hope to use the appeal, performance, and influence of internet celebrities to promote their products, creating the so-called internet celebrity economy and enlarging the pool of potential consumers [48]. The year 2016 is regarded as the first year of the “live streaming + ecommerce” model, in which internet celebrities used their image, appeal, or influence to deepen consumers’ impressions of products and brands, turning their popularity into online profits and creating a win–win economic model for ecommerce platforms, sellers, and internet celebrities [49]. In recent years, many scholars have used questionnaires or interviews to explore how live ecommerce, a new form of online media with interactive and real-time characteristics, increases consumers’ urges to buy and consume, as well as their desire for experience and entertainment, based on the premise that live ecommerce can attract consumers’ attention and satisfy their shopping needs through the TV-to-online mode and TV shopping [49,50]. Such studies are both effective and instructive; for example, one study showed that three types of motivation, namely “information motivation”, “social interaction motivation”, and “entertainment motivation”, have a positive impact on the frequency of use of live ecommerce. Moreover, users’ extroversion and openness positively predict their motivation for, behavior in, and satisfaction with live ecommerce.
Using a structural equation model, a post-acceptance and continuous adoption model of information systems was applied to explore user behavior on Twitch, a live webcasting platform; the results showed a significant effect of cognitive interactivity on perceived usefulness, of perceived usefulness on satisfaction and intention to continue using, and of satisfaction on intention to continue using. Previous studies have indicated that highly motivated community participants are more likely to develop brand trust, which in turn increases fans’ psychological attachment to brands and leads to long-lasting relationships with fan pages [8]. It has also been found that live presence had a significant positive effect on online interactivity and message credibility; online interactivity and message credibility had a significant positive effect on identity; and identity and affordability had a significant positive effect on consumers’ purchase intention, with affordability mediating the effect of identity on purchase intention. In addition, a social motivation model with eight factors, based on the uses and gratifications model, was constructed to explore viewer engagement across four live streaming platforms, and the findings showed that less-viewed live platforms elicited higher social engagement motivation than more-viewed ones [51,52].

2.2. Eye Tracking Technology and Related Studies

For most human beings, 80% of information processing relies on vision; thus, studying eye movements is considered to be the most effective tool in visual information processing and the most important source of sensory information in cognitive processing [36]. The main reason is that eye movements can be effectively used to capture the complex cognitive processes of human visual information, meaning they can be used as a technique to locate where people are looking [53,54,55]. Unlike previous cognitive processes for individuals, eye movements can also effectively characterize the information processed during the reading process and provide external behavioral indicators [56,57,58]. Therefore, eye tracking is considered to be a measurement technique that reflects the visual information process, helps researchers to record the fixation positions of the participants’ eyes at a given time, and displays the movement sequence and trajectories of the information. Eye tracking technology provides an important tool for the natural and online exploration of cognitive thinking, which reflects the correlation between eye movements and psychological changes in readers when receiving information [37]. This technique has been widely applied to understand the reading process, as well as other related topics and research experiments, such as eye movement characteristics, perceptual span, and information integration. Eye movement data are also used to examine the cognitive processes of different cognitive tasks, and in many studies related to eye tracking, total fixation durations, the number of fixations, and the sequence of fixations are the most frequently investigated variables [56,57]. When the eye is in fixation or saccade, it indicates visual attention. 
Eye movement processes, such as fixation duration, fixation position, and visual trajectory, can be used as bases for assessing whether visual behavior is being attended to [59]; for example, researchers have shown that the fixation durations and the number of fixations are directly related to preference, where shorter fixation durations or fewer fixations indicate a lower preference, while longer fixation durations or more frequent fixations indicate a higher preference [32,60]. An experiment was conducted with 33 participants using an eye-monitoring device to investigate the relationship between eye tracking and landscape preference, attention, image characteristics, and the number of fixations, and the results showed that the total number of fixations was influenced by preference, and that preference factors varied with personal factors, such as professional background or sex [32,34,61].

2.3. Eye Tracking Technology and Visual Attention

Recent advances in computer processing technology have led psychologists to propose that human psychological functioning is similar to that of computers [62,63], i.e., the structure and process of human psychological functioning can be understood through the way computers process information [64,65]. At the same time, the relationship between perceptual information processing theory and cognitive processes has led many researchers to conclude that eye tracking technology is the most direct and effective way to study visual information processing and to observe the reading patterns and visual attention of different learners, and it can even serve as a tool for teachers to diagnose learning disabilities [66]. The eye movements most commonly observed and recorded in eye tracking are fixations and saccades. The scan path refers to a series of fixation points and saccades, i.e., the conscious eye movements associated with attention shifting, higher-level memory, and the comprehension of cognitive processes.
Another study investigated the relationship between students’ reading comprehension and visual attention when reading Chinese sentences with misplaced words by combining eye tracking technology and an electroencephalogram, and the results showed that misplaced words did not affect reading comprehension and that increasing the number of misplaced words in a sentence did not affect the fixation duration. An empirical analysis that used eye tracking technology was conducted to investigate how graphic design in e-books with a high and low correlation affected learners’ visual behavior and learning outcomes when learning single Spanish words [67,68,69,70,71]. Eye tracking technology was used to investigate the differences in the order of seeing and reading, as well as the cognitive span of native Japanese learners, when they viewed words in their first language (L1) and second language (L2). The empirical results of the study showed that for L2 readers, reading was more important than seeing, and there was a difference in parafoveal processing between L1 readers and L2 readers when reading target words in sentences [72,73]. In addition, in order to understand the effectiveness and concentration of learners in game-based learning, a study on the design and development of digital games using eye tracking technology found that visual trajectories were used to analyze learners’ learning processes, and that incorporating game features with rules and objectives in the games was effective in capturing learners’ attention [74,75,76].
Visual behavior, processes, and achievements in game-based learning (GBL) have been used to explore the differences between players with medium–high and low conceptual comprehension in visual behavior and game process in GBL using eye tracking technology [77]. The results suggested that players in the high comprehension group showed effective text reading strategies and better metacognitive control of visual attention during gameplay. Another study explored the impact of using a simulation environment with animated agents on the visual attention, emotion, performance, and perception, in order to assess how animated agents of emotion in simulation-based training affected the performance outcomes and perceptions of individuals interacting with the training application in real-time. The results of the study showed that both experienced and novice participants focused more visual attention on the animated agents than on other defined regions of interest in the simulation environment [78]. While multiple choice (MC) based on visual attention is an important form of testing to assess students’ academic performance in eLearning, MC question evaluation indices (e.g., correctness rate) only consider the correctness of the final choice but ignore the process that led the participants to select the answer. The experimental results showed that including this measure could reflect differences in fixation movements and help teachers infer the true academic level of students [79].
Eye tracking has been used to study students’ visual attention to solve upper-level physics questions, and to identify the differences in understanding and cognitive processing when solving questions in a graphical and abstract mathematical manner. The fixation patterns and associated eye-tracking measures suggested that the two visual strategies had different cognitive processes, and the different strategies led to different fixation patterns and learning outcomes [80]. A comparative analysis of Facebook online advertising attention using eye tracking technology was conducted to explore the eye tracking indices and Facebook advertising attention behavior [81]. VR eye tracking was used to explore how prior knowledge affected the visual attention and learning outcomes for Japanese mimicry and onomatopoeia, and the results of the study showed that learning in VR improved attention to learning [82,83,84,85]. A study investigating the educational effects of anthropology through a GBL approach was carried out to learn about the association between game immersion and visual attention distribution. The eye tracking technology showed that students who played anthropology in an immersive manner were more focused on the played character [86]. In a study of viewers’ visual attention to subtitles in home shopping broadcasts, eye tracking technology was used to obtain objective data on visual attention to home shopping subtitles and to determine the factors that draw the visual attention to subtitles on television home shopping screens. In addition, an eye tracking device was used to propose an effective method of producing and directing home shopping subtitles [87].

3. Research Method

This study was conducted using a portable eye tracker as an experimental tool to collect and record the participants’ eye movement data. The main purpose of this study was to analyze the participants’ eye movement data while watching the content of a live ecommerce platform, to investigate the distribution of their visual attention and eye movement trajectories, based on the defined regions of interest.

3.1. Participants

A total of 32 participants who had used live streaming platforms were recruited for this study. One participant’s data were incomplete due to a data offset problem; therefore, eye movement data from only 31 participants (14 males and 17 females) were available for this study.

3.2. Stimuli

This study adopted Facebook Live, a highly viewed and user-friendly platform with diversified content, as the experimental stimulus. A three-minute Facebook Live video was projected onto a large TV screen so that participants could clearly and accurately view the interface and services of the live ecommerce service while wearing an eye tracker. At the beginning of the experiment, in order for the eyeglass-like portable eye tracker to sit firmly on the participants’ faces and reduce shaking, the researcher helped the participants put on and adjust the eye tracker’s elastic band. A tablet computer was placed on a table to collect eye movement data, and the eyewear was secured to the participants with tape to reduce distractions during the experiment. After each participant completed the experiment, the eye tracking system automatically output a .wmv file used to define the dynamic ROIs. The ROIs planned and defined in this study were the host of the live ecommerce stream, the live product, the name of the live shop, and the content of the viewers’ comments. The researcher could use the bar key to control the playback progress of the video file so as to include the defined ROIs in the dynamic picture frame. The experimental scenario and the content of the Facebook Live platform are shown in Figure 1.

3.3. Experimental Design

This analytical study was designed to investigate the distribution of visual attention to different ROIs by the eye movement trajectories of participants with experience using live streaming platforms during approximately three minutes of viewing live ecommerce. The eye movement indicators in this study included the latency of first fixation (LFF), the duration of first fixation (DFF), the total fixation durations (TFD), and the number of fixations (NOF), and visualization software was used to analyze and compare the visual attention distribution of the participants by sex on the live ecommerce platform.

3.4. Experimental Instruments

This study used a portable eye tracker with an infrared pupil detection lens, which provides 120° wide-angle capture that closely matches the range of the human visual angle. The eye tracking software was installed on a portable, lightweight tablet computer in order to record the positions and durations of eye fixations. The eye tracker and the eye movement analysis equipment used in this study are shown in Figure 2 and Figure 3.

3.5. Procedures

3.5.1. Experimental Process

Before the experiment, the eye tracker underwent a 5-point calibration procedure so that the system could accurately record eye movements. After the researcher helped each participant put on the eye tracker correctly, the participant viewed a calibration reference chart while keeping head movement to a minimum. The calibration procedure for this study was as follows:
  • We selected the calibration eye(s) (left, right, or both) and the camera type to be used, then clicked Start to enter the calibration procedure. On the next screen, we connected to the camera and pressed Start to begin pupil capture. We placed the pupil in the center of the image and pressed the Autoset key to capture it; then, we selected the calibration mode and clicked Calibrate to start the calibration.
  • During the calibration, the participants were instructed to view the five calibration points (upper left, lower left, lower right, upper right, and center) of the calibration reference chart, and the calibration information was recorded. The calibration was complete when the software displayed a high-quality calibration score after the participant had viewed all five points.
  • When the calibration was complete, the participant began the eye tracking experiment.
During a saccade, both eyes move rapidly in the same direction and with the same amplitude so that the fovea of each eye is aligned on the same position and both eyes can focus on the stimulus. However, some misalignment can occur, particularly at the beginning of a fixation; in the absence of binocular coordination, a saccade can cause differences in fixation and affect the visual process. In such cases of vergence disparity, binocular calibration was re-established.
The portable eye tracker calculated the coordinates on the screen corresponding to the eye rotation angle based on the calibration results. After calibration, the experiment was ready to proceed as shown in Figure 4.
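The mapping from pupil position to screen coordinates implied by the calibration step can be illustrated with a simple least-squares affine fit over the five calibration points. This is a hypothetical sketch for intuition only; the paper does not specify the tracker vendor's actual gaze-mapping model, which may be more sophisticated.

```python
import numpy as np

# Illustrative sketch (not the tracker's actual model): fit an affine map
# from raw pupil coordinates to screen coordinates using the five
# calibration points, via least squares.

def fit_affine(pupil_pts, screen_pts):
    """pupil_pts, screen_pts: lists of (x, y) pairs of equal length.
    Returns a 3x2 matrix A such that [x, y, 1] @ A approximates the
    corresponding screen position."""
    P = np.array([[x, y, 1.0] for x, y in pupil_pts])
    S = np.array(screen_pts, dtype=float)
    A, *_ = np.linalg.lstsq(P, S, rcond=None)
    return A

def map_gaze(A, pupil_xy):
    """Map a raw pupil coordinate to an estimated screen coordinate."""
    x, y = pupil_xy
    return tuple(np.array([x, y, 1.0]) @ A)
```

With five well-spread calibration points (the four corners plus the center), the affine fit is over-determined, so the residual of the least-squares solution also gives a rough calibration-quality score of the kind the calibration software reports.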

3.5.2. Eye Tracking Data Output

The experiment was conducted using an eye tracking device, and the researcher logged into the Facebook community platform during the experiment. To ensure that the content on the live ecommerce platform was identical for each participant and that the experimental material was consistent, the researcher used live ecommerce content pre-recorded on the Facebook platform as the main content and played it on a large TV screen. The portable eye tracker recorded each participant’s eye movement data, and at the end of the experiment, the tablet computer connected to the eye tracker automatically output the eye tracking data as a video file (.wmv) and a text file (.txt). The video file was used to define the ROIs, while the text file recorded the participants’ eye movement data. As shown in Figure 5, the red boxes represent the defined ROIs: ROI1 was the live streamer, ROI2 was the live product, ROI3 was the brand name of the product, and ROI4 was the area with the viewers’ responses and comments. Since the information presented on the live platform was dynamic, when defining the time zone of interest for each ROI, the researcher used the bar key to control the playback progress of the video file so as to include each ROI in the dynamic video frame.

3.5.3. Data Collection and Analysis

After the experiments were completed, the eye movement data were summarized and analyzed to understand the participants’ visual behaviors in order to answer the proposed questions. After the eye movement data were collected, this study used analysis software to complete the analysis with two modified tools, the dynamic ROIs and the fixation calculator tool, in order to define the ROIs and perform preliminary data aggregation. The dynamic ROI definition tool provided the researcher with the ability to define the ROIs, while the fixation calculator tool automatically prioritized the ROIs that overlapped to avoid errors during data analysis. The analysis process of the eye tracking software tool is shown in Figure 6. The processes designed to classify the ROIs in this study were as follows:
  • We imported the post-experiment video into the dynamic ROIs tool.
  • We began playback of the video.
  • We pressed the left mouse button, dragged the mouse over the region to be analyzed, and released the button to define the ROI.
  • We repeated Step 3 for each region; once all the ROIs were defined for the entire experiment, the information from all the ROIs, aggregated by the fixation calculator tool, could be used for the eye movement analysis.
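The ROI-assignment logic described above, including the automatic prioritization of overlapping ROIs performed by the fixation calculator tool, can be sketched as follows. The data layout and the priority order are assumptions for illustration, not the tool's documented format.

```python
# Hypothetical sketch of assigning a fixation to a dynamic ROI. Each ROI is a
# list of time-indexed rectangles (left, top, right, bottom); when rectangles
# overlap, a fixed priority order resolves the conflict, mirroring the
# overlap prioritization described for the fixation calculator tool.

def roi_at(rois, t_ms, x, y, priority=("ROI1", "ROI2", "ROI3", "ROI4")):
    """rois: {name: [(start_ms, end_ms, (left, top, right, bottom)), ...]}.
    Returns the highest-priority ROI containing point (x, y) at time t_ms,
    or None if the point falls outside every defined region."""
    for name in priority:
        for start, end, (left, top, right, bottom) in rois.get(name, []):
            if start <= t_ms <= end and left <= x <= right and top <= y <= bottom:
                return name
    return None
```

Running every fixation through such a lookup yields, per participant, a labeled fixation sequence from which the per-ROI indicators can then be aggregated.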

3.5.4. Data Analysis

This study used an eye tracker and eye tracking visualization analysis software to collect fixation information in milliseconds. The following four eye tracking indices were analyzed with independent-samples t-tests in SPSS 22 for Windows to examine differences in the visual attention of male and female participants. The participants’ eye movement behaviors during the experiment were analyzed to answer the questions posed in the introduction.
  • LFF: the time elapsed from stimulus onset until the participant first fixated on the defined ROI; the shorter the latency, the faster the ROI attracted attention.
  • DFF: the duration of the participant’s first fixation on an ROI; the longer this fixation, the more initial attention the ROI received, indicating how attractive the ROI was to the participant.
  • TFD: the total time the participant spent fixating on an ROI during the experiment, including the first fixation and all re-fixations; TFD is often used to gauge the participant’s visual attention to, or interest in, the ROI.
  • NOF: the number of times the participant fixated on an ROI during the experiment; a higher number of fixations indicates that the ROI was more important to, or provided more cues for, the participant.
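The four indicators above can be derived mechanically from a participant's fixation records. The following sketch assumes a simple record layout of (onset_ms, duration_ms, roi_label) with stimulus onset at t = 0; this layout is an illustrative assumption, not the vendor's output format.

```python
# Compute LFF, DFF, TFD, and NOF for one ROI from a participant's fixation
# records. Each record is (onset_ms, duration_ms, roi_label); stimulus onset
# is at t = 0, so LFF is simply the onset of the earliest fixation on the ROI.

def eye_movement_indicators(fixations, roi):
    hits = [f for f in fixations if f[2] == roi]
    if not hits:
        # ROI never fixated: latency/duration of first fixation are undefined
        return {"LFF": None, "DFF": None, "TFD": 0, "NOF": 0}
    first = min(hits, key=lambda f: f[0])
    return {
        "LFF": first[0],                 # latency of first fixation (ms)
        "DFF": first[1],                 # duration of first fixation (ms)
        "TFD": sum(f[1] for f in hits),  # total fixation duration (ms)
        "NOF": len(hits),                # number of fixations
    }
```

Applying this per participant and per ROI produces the per-cell values that the group comparisons in Section 4 operate on.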
Based on the eye movement data collected by the eye tracker, the analysis software was used to investigate the distribution of the participants’ visual attention during the experiment. Drawing on the results of the eye movement indicators, this study proposed the following hypotheses, corresponding to the initial study questions:
1. What is the order of fixation on the ROIs?
H1: The LFF of each participant was lowest for ROI1.
2. What is the order of the attention span for the ROIs?
H2: The DFF of each participant was highest for ROI1.
3. What is the amount of dedicated fixation given to each ROI?
H3: The sum of the TFD and NOF spent in each ROI was highest for ROI1.

4. Results and Discussion

The visual attention of the participants was investigated based on the eye movement data from the experiment, which was analyzed and compared based on the eye movement indicators, LFF, DFF, TFD, and NOF. The findings of this study are presented in Table 1, Table 2 and Table 3.

4.1. Visual Attention by LFF and Sex

The LFF for each ROI of the live ecommerce platform differed between the sexes. For males, the first focus of attention was the product brand (ROI3), and the viewing order was ROI3 > ROI4 > ROI2 > ROI1. For females, the first focus of attention was likewise the product brand (ROI3), but the viewing order was ROI3 > ROI1 > ROI2 > ROI4. Therefore, although both sexes first fixated on the product brand after logging in to the live ecommerce platform, the subsequent viewing order of the ROIs differed by sex. Overall, according to the LFF findings, H1 (that the LFF of each participant was lowest for ROI1) is not supported, as shown in Table 1.

4.2. Visual Attention by DFF and Sex

For males, the live product (ROI2) attracted the longest DFF, and the viewing order was ROI2 > ROI3 > ROI1 > ROI4. Although the product brand (ROI3) was viewed first by males in the LFF analysis, it was not the region that received the longest first fixation; males were clearly more attracted to the live product. This study therefore suggests that males pay more attention to the product itself before making a purchase decision. For females, the product brand (ROI3) attracted the longest DFF, and the viewing order was ROI3 > ROI1 > ROI2 > ROI4. The product brand (ROI3) was also viewed first by females in the LFF analysis, so it was both the first region viewed and the region with the longest first fixation; females were clearly more attracted to the product brand. This suggests that brand is an important factor in females' purchase decisions. Thus, H2 (that the DFF of each participant was highest for ROI1) is not supported, as shown in Table 2.

4.3. Visual Attention by Sex Based on TFD and NOF

The TFD and NOF of the male and female participants were examined in this experiment. For both sexes, the TFD and NOF data showed that the most viewed region was the live product (ROI2): the viewing order was ROI2 > ROI1 > ROI3 > ROI4 for TFD and ROI2 > ROI1 > ROI4 > ROI3 for NOF, suggesting that the viewers remained focused on the live products. The live streamers were thus not the main element attracting the participants' attention; the participants' main focus was on the products offered for sale, as shown in Table 3.
Therefore, H3 (that the sum of the TFD and NOF spent in each ROI was highest for ROI1) is not supported. The fixation heat maps of a female (No. 6) and a male (No. 2) participant likewise show longer fixation on the live product, as shown in Figure 7.
The eye movement data were further analyzed with independent-samples t-tests on the TFD and NOF indicators. The TFD of the live product (ROI2) differed significantly between males (M = 1739.29, SD = 33.33) and females (M = 1181.06, SD = 167.69), t = 12.223, p < 0.001. The NOF on the live streamer (ROI1) also differed significantly between males (M = 1.86, SD = 0.77) and females (M = 5.94, SD = 1.34), t = −10.065, p < 0.001. Thus, the sexes differed significantly in the TFD of ROI2 and the NOF of ROI1.
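The SPSS comparison can be reproduced with open-source tools. The sketch below uses `scipy.stats.ttest_ind` on synthetic per-participant TFD values whose group sizes, means, and SDs mirror those reported for ROI2; the raw samples are simulated and illustrative only.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=42)

# Synthetic per-participant TFD values (ms) for ROI2; group sizes and
# moments follow the reported statistics, but the samples are simulated.
male_tfd = rng.normal(loc=1739.29, scale=33.33, size=14)
female_tfd = rng.normal(loc=1181.06, scale=167.69, size=17)

# Independent-samples t-test. equal_var=False gives Welch's variant,
# which SPSS reports alongside the pooled-variance test.
t_stat, p_value = stats.ttest_ind(male_tfd, female_tfd, equal_var=False)
print(f"t = {t_stat:.3f}, p = {p_value:.3g}")
```

Welch's variant is the safer default here because the two groups have clearly unequal variances and unequal sizes.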

5. Conclusions and Suggestions

5.1. Conclusions

This study employed eye tracking technology to examine how participants of different sexes engaged with an ecommerce platform on Facebook Live. Based on the eye movement data for the ROIs, four indicators, LFF, DFF, TFD, and NOF, were used to analyze the distribution of the participants' visual attention. Although all participants paid attention to the live streaming, the two sexes showed different patterns of visual behavior when viewing the live streaming information. From the empirical analysis and the examination of the proposed hypotheses, this study reached the following conclusions:
Firstly, according to the LFF, the male and female participants viewed the ROIs in different sequences; however, both directed their first visual attention to the product brand, a result consistent with previous studies [88,89].
Secondly, according to the DFF, the male participants' longest first fixation among the ROIs fell on the live product (ROI2), while the female participants' fell on the product brand (ROI3). In other words, male and female participants differed in their first-fixation durations across the ROIs while watching the streaming platform.
Finally, according to the TFD and NOF, both male and female participants paid the most attention to the live product (ROI2) and the live streamer (ROI1), with the highest gaze durations and frequencies. This phenomenon may be explained from an endogenous attention-control perspective: regardless of what else appears on the live streaming platform, viewers direct more attention to the live product and the live streamer. More interestingly, in a previous study, male e-consumers had longer gaze durations on the live product than female e-consumers, while female e-consumers fixated on the live streamer more frequently than male e-consumers [90]. Intuitively, live streaming provides a real-time, interactive mode of communication in which the live streamer's personality, in addition to the live product itself, is one of the major factors holding e-consumers' attention [91].

5.2. Suggestions and Research Limitations

This paper studied four eye movement indicators, LFF, DFF, TFD, and NOF, and analyzed the differences among participants while they watched a Facebook live stream. Live streaming is an intuitive, immediate, and interactive mode that creates a seemingly face-to-face, real-time atmosphere focused on the product or the live streamer's personality. Moreover, the live streaming platform displays viewers' comments as they watch, which differs from the one-way marketing of pre-recorded TV commercials [92,93].
In this study, the screen was divided into four regions of interest: three dynamic regions, the live streamer (ROI1), the product (ROI2), and the real-time comment section (ROI4), plus the brand name (ROI3), which is regarded as a static area. In future studies, additional factors could be added to these four regions, for instance, the live streamer's style or sex, dynamic brand images, comments, and emoji stickers. Based on the findings of this study, new marketing approaches and business models can be developed to study consumers' existing visual search behaviors on live streaming platforms, and the findings can serve as important references for live streaming platform operators.
This study had several experimental limitations. Firstly, the participants were college students of the same age group. Secondly, the participants were recruited based on their shopping experience on the live streaming platform, resulting in a small sample of 14 men and 17 women whose visual attention differences were studied and analyzed. Finally, the visual stimuli used in this study featured women's clothing; whether different product types would produce a different distribution of the participants' eye movement data is worth further examination and exploration.

Author Contributions

Conceptualization, H.-C.C. and C.-C.W.; methodology, H.-C.C. and C.-C.W.; software, C.-Y.H. and J.C.H.; validation, H.-C.C. and C.-C.W.; formal analysis, H.-C.C. and C.-C.W.; investigation, H.-C.C. and C.-Y.H.; resources, H.-C.C. and C.-C.W.; data curation, H.-C.C. and C.-C.W.; writing—original draft preparation, H.-C.C. and C.-C.W.; writing—review and editing, H.-C.C. and C.-C.W.; visualization, H.-C.C. and C.-C.W.; supervision, H.-C.C. and C.-C.W.; project administration, H.-C.C. and C.-C.W.; funding acquisition, C.-C.W. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the Ministry of Science and Technology, Taiwan (No. MOST109-2813-C-309-026-H).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Informed consent was obtained from all participants involved in the study.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Gunawan, J.; Permatasari, P.; Tilt, C. Sustainable development goal disclosures: Do they support responsible consumption and production? J. Clean. Prod. 2020, 246, 118989. [Google Scholar] [CrossRef]
  2. Leal Filho, W.; Azul, A.M.; Brandli, L.; Özuyar, P.G.; Wall, T. Responsible Consumption and Production. Encyclopedia of the UN Sustainable Development Goals; Springer: Cham, Switzerland, 2020. [Google Scholar]
  3. Katila, P.; Colfer, C.J.P.; de Jong, W.; Galloway, G.; Pacheco, P.; Winkel, G. SDG 12: Responsible Consumption and Production—Potential Benefits and Impacts on Forests and Livelihoods. Sustainable Development Goals: Their Impacts on Forests and People; Cambridge University Press: Cambridge, UK, 2019; pp. 386–418. [Google Scholar]
  4. Franco, I.B.; Newey, L. SDG 12 Responsible Consumption and Production. In Actioning the Global Goals for Local Impact. Science for Sustainable Societies; Franco, I., Chatterji, T., Derbyshire, E., Tracey, J., Eds.; Springer: Singapore, 2020. [Google Scholar]
  5. Jubin, J.J.; Clare, D.; Tim, M.; Stephen, S. Synergistic Interactions of SDGs in Food Supply Chains: A Review of Responsible Consumption and Production. Sustainability 2021, 13, 8809. [Google Scholar]
  6. Whitson, J.; French, M. Productive play: The shift from responsible consumption to responsible production. J. Consum. Cult. 2021, 21, 14–33. [Google Scholar] [CrossRef]
  7. Ameli, M.; Esfandabadi, Z.S.; Sadeghi, S.; Ranjbari, M.; Zanetti, M.C. COVID-19 and Sustainable Development Goals (SDGs): Scenario analysis through fuzzy cognitive map modeling. Gondwana Res. 2022, in press. [CrossRef]
  8. Biasutti, M.; Frate, S. A validity and reliability study of the Attitudes toward Sustainable Development scale. Environ. Educ. Res. 2017, 23, 214–230. [Google Scholar] [CrossRef]
  9. Messner, D.; Nakicenovic, N.; Zimm, C.; Clarke, G.; Rockström, J.; Aguiar, A.P.; Boza-Kiss, B.; Campagnolo, L.; Chabay, I.; Collste, D.; et al. The Digital Revolution and Sustainable Development: Opportunities and Challenges-Report Prepared by the World in 2050 Initiative; International Institute for Applied Systems Analysis (IIASA): Laxenburg, Austria, 2019. [Google Scholar]
  10. Luers, B.A. The Missing SDG: Ensure the Digital Age Supports People, Planet, Prosperity & Peace; Inter Press Service: Rome, Italy, 2020; pp. 7–8. [Google Scholar]
  11. Alfonso, V.; Boar, C.; Frost, J.; Gambacorta, L.; Liu, J. E-commerce in the pandemic and beyond. BIS Bull. 2021, 36, 1–5. [Google Scholar]
  12. Liu, K.; Liu, B.; Xu, H.; He, Y.; Cao, Y. Research on E-Commerce Live Broadcasts Helping Poverty Alleviation under the Influence of the COVID-19: Take Xinhua County, Hunan Province as an Example. In Proceedings of the 2nd International Conference on Economic Management and Model Engineering (ICEMME), Chongqing, China, 20–22 November 2020; pp. 858–865. [Google Scholar]
  13. Guo, H.; Liu, Y.; Shi, X.; Chen, K.Z. The role of e-commerce in the urban food system under COVID-19: Lessons from China. China Agric. Econ. Rev. 2020, 13, 436–455. [Google Scholar] [CrossRef]
  14. Salem, M.A.; Nor, K.M. The effect of COVID-19 on consumer behaviour in Saudi Arabia: Switching from brick and mortar stores to E-Commerce. Int. J. Sci. Technol. Res. 2020, 9, 15–28. [Google Scholar]
  15. Addo, P.C.; Jiaming, F.; Kulbo, N.B.; Liangqiang, L. COVID-19: Fear appeal favoring purchase behavior towards personal protective equipment. Serv. Ind. J. 2020, 40, 471–490. [Google Scholar] [CrossRef] [Green Version]
  16. Chen, C.; Hu, Y.; Lu, Y.; Hong, Y. Everyone Can Be a Star: Quantifying Grassroots Online Sellers’ Live Streaming Effects on Product Sales. In Proceedings of the 52nd Hawaii International Conference on System Sciences, Grand Wailea, HI, USA, 8–11 January 2019; pp. 2548–2557. [Google Scholar]
  17. Cai, J.; Wohn, D.Y.; Mittal, A.; Sureshbabu, D. Utilitarian and Hedonic Motivations for Live Streaming Shopping. In Proceedings of the 2018 ACM International Conference on Interactive Experiences for TV and Online Video, Newark, NJ, USA, 26–28 June 2018. [Google Scholar]
  18. Chen, A.; Lu, Y.; Wang, B. Customers’ purchase decision-making process in social commerce: A social learning perspective. Int. J. Inf. Manag. 2017, 37, 627–638. [Google Scholar] [CrossRef]
  19. Chen, C.C.; Lin, Y.C. What drives live-stream usage intention? The perspectives of flow, entertainment, social interaction, and endorsement. Telemat. Inform. 2018, 35, 293–303. [Google Scholar] [CrossRef]
  20. Hilvert-Bruce, Z.; Neill, J.T.; Sjöblom, M.; Hamari, J. Social motivations of live-streaming viewer engagement on Twitch. Comput. Hum. Behav. 2018, 84, 58–67. [Google Scholar] [CrossRef] [Green Version]
  21. Hong, Z.; Yi, L. Research on the influence of perceived risk in consumer on-line purchasing decision. Phys. Procedia 2012, 24, 1304–1310. [Google Scholar] [CrossRef] [Green Version]
  22. Gajewski, A.S. A Qualitative Study of How Facebook Storefront Retailers Convert Fans to Buyers. Ph.D. Dissertation, Walden University, Minneapolis, MN, USA, 2013. [Google Scholar]
  23. Leeraphong, A.; Sukrat, S. How Facebook Live Urge SNS Users to Buy Impulsively on C2C Social Commerce? In Proceedings of the 2nd International Conference on E-Society, E-Education and E-Technology, Taipei, Taiwan, 13–15 August 2018; pp. 68–72. [Google Scholar]
  24. Huang, Z.; Benyoucef, M. The effects of social commerce design on consumer purchase decision-making: An empirical study. Electron. Commer. Res. Appl. 2017, 25, 40–58. [Google Scholar] [CrossRef]
  25. Scheibe, K.; Fietkiewicz, K.J.; Stock, W.G. Information Behavior on Social Live Streaming Services. J. Inf. Sci. Theory Prat. 2016, 4, 6–20. [Google Scholar] [CrossRef] [Green Version]
  26. Skjuve, M.; Brandtzaeg, P.B. Facebook live: A mixed-methods approach to explore individual live streaming practices and motivations on Facebook. Interact. Comput. 2019, 31, 589–602. [Google Scholar] [CrossRef]
  27. Just, M.A.; Carpenter, P.A. Eye fixations and cognitive processes. Cogn. Psychol. 1976, 8, 441–480. [Google Scholar] [CrossRef]
  28. Just, M.A.; Carpenter, P.A. A theory of reading: From eye fixations to comprehension. Psychol. Rev. 1980, 87, 329–354. [Google Scholar] [CrossRef]
  29. Hopkins, E. Machine Learning Tools, Algorithms, and Techniques. J. Self-Gov. Manag. Econ. 2022, 10, 43–55. [Google Scholar]
  30. Kliestik, T.; Kovalova, E.; Lăzăroiu, G. Cognitive decision-making algorithms in data-driven retail intelligence: Consumer sentiments, choices, and shopping behaviors. J. Self-Gov. Manag. Econ. 2022, 10, 30–42. [Google Scholar]
  31. Nica, E.; Sabie, O.M.; Mascu, S.; Luţan, A.G. Artificial Intelligence Decision-Making in Shopping Patterns: Consumer Values, Cognition, and Attitudes. Econ. Manag. Financ. Mark. 2022, 17, 31–43. [Google Scholar]
  32. Lai, M.L.; Tsai, M.J.; Yang, F.Y.; Hsu, C.Y.; Liu, T.C.; Lee, S.W.Y.; Li, M.H.; Chiou, G.L.; Liang, G.C.; Tsai, C.C. A review of using eye-tracking technology in exploring learning from 2000 to 2012. Educ. Res. Rev. 2013, 10, 90–115. [Google Scholar] [CrossRef]
  33. Rayner, K. Eye movements in reading and information processing: A 20-year study. Psychol. Bull. 1998, 124, 372. [Google Scholar] [CrossRef]
  34. Rayner, K.; Chace, K.H.; Slattery, T.J.; Ashby, J. Eye movements as reflections of comprehension process in reading. Sci. Stud. Read. 2006, 10, 241–255. [Google Scholar] [CrossRef]
  35. Rayner, K. Eye movements and attention in reading, scene perception, and visual search. Q. J. Exp. Psychol. 2009, 62, 1457–1506. [Google Scholar] [CrossRef]
  36. Vernet, M.; Kapoula, Z. Binocular motor coordination during saccades and fixations while reading: A magnitude and time analysis. J. Vis. 2009, 9, 2. [Google Scholar] [CrossRef] [Green Version]
  37. Nyström, M.; Holmqvist, K. An adaptive algorithm for fixation, saccade, and glissade detection in eye tracking data. Behav. Res. Methods 2010, 42, 188–204. [Google Scholar] [CrossRef] [Green Version]
  38. Agnieszka, A.T. Basic terminology of eye-tracking research. Appl. Linguist. Pap. 2018, 25, 123–132. [Google Scholar]
  39. Hessels, R.S.; Niehorster, D.C.; Nyström, M.; Andersson, R.; Hooge, I.T. Is the eye-movement field confused about fixations and saccades? A survey among 124 researchers. R. Soc. Open Sci. 2018, 5, 180502. [Google Scholar] [CrossRef] [Green Version]
  40. Heller, D.; Müller, H. On the Relationship between Saccade Size and Fixation Duration in Reading. In Eye Movements and Psychological Functions; Routledge: London, UK, 2021; pp. 287–302. [Google Scholar]
  41. Stuart, S.; Hickey, A.; Vitorio, R.; Welman, K.; Foo, S.; Keen, D.; Godfrey, A. Eye-tracker algorithms to detect saccades during static and dynamic tasks: A structured review. Physiol. Meas. 2019, 40, 02TR01. [Google Scholar] [CrossRef]
  42. Tanke, N.; Barsingerhorn, A.D.; Boonstra, F.N.; Goossens, J. Visual fixations rather than saccades dominate the developmental eye movement test. Sci. Rep. 2021, 11, 1162. [Google Scholar] [CrossRef]
  43. Hooge, I.T.; Niehorster, D.C.; Nyström, M.; Andersson, R.; Hessels, R.S. Fixation classification: How to merge and select fixation candidates. Behav. Res. Methods 2022, 1–12. [Google Scholar] [CrossRef]
  44. Kang, K.; Lu, J.; Guo, L.; Li, W. The dynamic effect of interactivity on customer engagement behavior through tie strength: Evidence from live streaming commerce platforms. Int. J. Inf. Manag. 2021, 56, 102251. [Google Scholar] [CrossRef]
  45. Ho, H.F. The effects of controlling visual attention to handbags for women in online shops: Evidence from eye movements. Comput. Hum. Behav. 2014, 30, 146–152. [Google Scholar] [CrossRef]
  46. Ho, H.F.; Chen, G.A.; Vicente, C.T. Impact of Misplaced Words in Reading Comprehension of Chinese Sentences: Evidences from Eye Movement and Electroencephalography. In Proceedings of the 23rd International Conference on Computers in Education(ICCE 2015), Hangzhou, China, 30 November–4 December 2015; pp. 573–579. [Google Scholar]
  47. Pang, J. E-Commerce Business Model Innovation Under the Background of Internet Celebrity Economy. In Proceedings of the 6th International Conference on Financial Innovation and Economic Development, Sanya, China, 29–31 January 2021; pp. 513–519. [Google Scholar]
  48. Geng, R.; Wang, S.; Chen, X.; Song, D.; Yu, J. Content marketing in e-commerce platforms in the internet celebrity economy. Ind. Manag. Data Syst. 2020, 120, 464–485. [Google Scholar] [CrossRef]
  49. Djafarova, E.; Rushworth, C. Exploring the credibility of internet celebrities’ Instagram profiles in influencing the purchase decisions of young female users. Comput. Hum. Behav. 2017, 68, 1–7. [Google Scholar] [CrossRef]
  50. Kang, J.; Tang, L.; Fiore, A.M. Enhancing consumer-brand relationships on restaurant Facebook fan pages: Maximizing consumer benefits and increasing active participation. Int. J. Hosp. 2014, 36, 145–155. [Google Scholar] [CrossRef]
  51. Hilvert-Bruce, Z.; Neill, J.T. Social motivations for viewer engagement on Twitch live streams. Comput. Hum. Behav. 2018, 84, 58–67. [Google Scholar]
  52. Sanders, M.S.; McCormick, E.J. Human Factors in Engineering and Design; McGraw-Hill: New York, NY, USA, 1987. [Google Scholar]
  53. Viviani, P. Eye movements in visual search: Cognitive, perceptual, and motor control aspects. Rev. Oculomot. Res. 1990, 4, 353–393. [Google Scholar]
  54. Cornsweet, T.N. New technique for measuring small eye movements. J. Opt. Soc. Am. 1958, 48, 808–811. [Google Scholar] [CrossRef]
  55. Valliappan, N.; Dai, N.; Steinberg, E.; He, J.; Rogers, K.; Ramachandran, V.; Navalpakkam, V. Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nat. Commun. 2020, 11, 4553. [Google Scholar] [CrossRef] [PubMed]
  56. Duchowski, A.T. A breadth-first survey of eye-tracking applications. Behav. Res. Methods Instrum. Comput. 2002, 34, 455–470. [Google Scholar] [CrossRef] [PubMed]
  57. Henderson, J.M.; Hollingworth, A. Advanced scene awareness. Annu. Rev. Psychol. 1999, 50, 243–271. [Google Scholar] [CrossRef] [Green Version]
  58. Klaib, A.F.; Alsrehin, N.O.; Melhem, W.Y.; Bashtawi, H.O.; Magableh, A.A. Eye tracking algorithms, techniques, tools, and applications with an emphasis on machine learning and Internet of Things technologies. Expert Syst. Appl. 2021, 166, 114037. [Google Scholar] [CrossRef]
  59. Carter, B.T.; Luke, S.G. Best practices in eye tracking research. Int. J. Psychophysiol. 2020, 155, 49–62. [Google Scholar] [CrossRef] [PubMed]
  60. Dixson, B.; Grimshaw, G.; Ormsby, D.; Dixson, A. Eye tracking women’s preferences or men’s body type. Evol. Hum. Behav. 2014, 35, 73–79. [Google Scholar] [CrossRef]
  61. Wu, S.C. Using Eye-Tracking Technology to Examine the Relationship between Landscape Preference, Attention Recovery, Image Features and Number of Gazes. Ph.D. Dissertation, Feng Chia University Civil and Hydraulic Engineering, Taichung, Taiwan, 2015. [Google Scholar]
  62. Galvan, A. Neural plasticity of development and learning. Hum. Brain Mapp. 2010, 31, 879–890. [Google Scholar] [CrossRef]
  63. Phillips, D.C.; Soltis, J.F. Perspectives on Learning; Teachers College Press: New York, NY, USA, 2009. [Google Scholar]
  64. Mayer, R.E. Multimedia Learning; Cambridge University Press: Cambridge, UK, 2001. [Google Scholar]
  65. Yang, F.Y.; Chang, C.Y.; Chien, W.R.; Chien, Y.T.; Tseng, Y.H. Tracking learners’ visual attention during a multimedia presentation in a real classroom. Comput. Educ. 2013, 62, 208–220. [Google Scholar] [CrossRef]
  66. Chen, H.C.; Lai, H.D.; Chiu, F.C. Eye tracking technology for learning and education. Sci. Res. Educ. 2010, 4, 39–68. [Google Scholar]
  67. Goldberg, H.J.; Kotval, X.P. Computer interface evaluation using eye movements: Methods and constructs. Int. J. Ind. Ergon. 1999, 24, 631–645. [Google Scholar] [CrossRef]
  68. Van Gog, T.; Scheiter, K. Eye tracking as a tool to study and enhance multimedia learning. Learn. Instr. 2010, 20, 95–99. [Google Scholar] [CrossRef]
  69. Yildirim, B.; Sahin-Topalcengiz, E.; Arikan, G.; Timur, S. Using virtual reality in the classroom: Reflections of STEM teachers on the use of teaching and learning tools. J. Educ. Sci. Environ. Health 2020, 6, 231–245. [Google Scholar] [CrossRef]
  70. Zhang, X.B.; Yuan, S.M.; Chen, M.D.; Liu, X.L. A complete system for analysis of video lecture based on eye tracking. IEEE Access 2018, 6, 49056–49066. [Google Scholar] [CrossRef]
  71. Pan, T.W.; Tsai, M.J. Eye-Tracking Analyses of Text-and-Graphic Design Effects on E-Book Reading Process and Performance:“SPANISH Color Vocabulary” as an Example. In Proceedings of the 22nd International Conference on Computers in Education(ICCE2014), Nara, Japan, 30 November–4 December 2014; pp. 494–498. [Google Scholar]
  72. Leung, C.Y. Can Japanese EFL Learners “See” before They “Read”? In 2014 Studies in Japan Association for Language Education and Technology, Kansai Chapter; Methodology Special Interest Groups (SIG): Kobe, Japan, 2014; Volume 5, pp. 16–27. [Google Scholar]
  73. Was, C.; Sansosti, F.; Morris, B. Eye-Tracking Technology Applications in Educational Research; IGI Global: Hershey, PA, USA, 2016. [Google Scholar]
  74. Hamilton, W.A.; Garretson, O.; Kerne, A. Twitch Streaming: Fostering a Participatory Gaming Community in Live Mixed Media. In Proceedings of the 32nd ACM Annual Conference on Human Factors in Computing Systems, Toronto, ON, Canada, 26 April–1 May 2014; pp. 1315–1324. [Google Scholar]
  75. Liu, H.C.; Chuang, H.H. An examination of cognitive processing of multimedia information based on reviewers’ eye movements. Interact. Learn. Environ. 2011, 19, 503–517. [Google Scholar] [CrossRef]
  76. Liu, H.C.; Lai, M.L.; Chuang, H.H. Using eye-tracking technology to investigate the redundant effect of multimedia web pages on viewers’ cognitive processes. Comput. Hum. Behav. 2011, 27, 2410–2417. [Google Scholar] [CrossRef]
  77. Tsai, M.J.; Huang, L.J.; Hou, H.T.; Hsu, C.Y.; Chiou, G.L. Visual behavior, flow and achievement in game-based learning. Comput. Educ. 2016, 98, 115–129. [Google Scholar] [CrossRef]
  78. Romero-Hall, E.; Watson, G.S.; Adcock, A.; Bliss, J.; Adams Tufts, K. Simulated environments with animated agents: Effects on visual attention, emotion, performance, and perception. J. Comput. Assist. Learn. 2016, 32, 360–373. [Google Scholar] [CrossRef]
  79. Liu, W.; Yu, M.; Fan, Z.; Xu, J.; Tian, Y. Visual Attention Based Evaluation for Multiple-Choice Tests in E-Learning Applications. In Proceedings of the 2017 IEEE Frontiers in Education Conference (FIE), Indianapolis, IN, USA, 18–21 October 2017; pp. 1–6. [Google Scholar]
  80. Klein, P.; Dengel, A.; Kuhn, J. Students’ Visual Attention While Solving Multiple Representation Problems in Upper-Division Physics. In Positive Learning in the Age of Information; Zlatkin-Troitschanskaia, O., Wittum, G., Dengel, A., Eds.; Springer VS: Wiesbaden, Germany, 2018; pp. 67–87. [Google Scholar]
  81. Wang, C.C.; Hung, J.C. Comparative analysis of advertising attention to Facebook social network: Evidence from eye-movement data. Comput. Hum. Behav. 2019, 100, 192–208. [Google Scholar] [CrossRef]
  82. Wang, C.C.; Hung, J.C.; Chen, H.C. How Prior Knowledge Affects Visual Attention of Japanese Mimicry and Onomatopoeia and Learning Outcomes: Evidence from Virtual Reality Eye Tracking. Sustainability 2021, 13, 11058. [Google Scholar] [CrossRef]
  83. Kaakinen, K.J.; Hyönä, J.; Keenan, M.J. How prior knowledge, working memory capacity, and information relevance affect fixation in expository texts. J. Exp. Psychol. Learn. Mem. Cogn. 2003, 29, 447–457. [Google Scholar] [CrossRef]
  84. Clay, V.; König, P.; Koenig, S. Eye tracking in virtual reality. J. Eye Mov. Res. 2019, 12. [Google Scholar] [CrossRef] [PubMed]
  85. Zhao, D.; Lucas, J. Virtual reality simulation for construction safety promotion. Int. J. Inj. Control Saf. Promot. 2015, 22, 57–67. [Google Scholar] [CrossRef] [PubMed]
  86. Teng, Y.Y.; Chou, W.C.; Cheng, M.T. Learning immunology in a game: Learning outcomes, the use of player characters, immersion experiences and visual attention distributions. J. Comput. Assist. Learn. 2021, 37, 475–486. [Google Scholar] [CrossRef]
  87. Son, M.; Kim, J.; Kim, H. Viewers’ Visual Attention on Subtitles in Home Shopping Broadcasts: The NS Home Shopping Channel. Arch. Des. Res. 2022, 35, 217–235. [Google Scholar]
  88. Richardson, P.S.; Jain, A.K.; Dick, A. Household store brand proneness: A framework. J. Retail. 1996, 72, 159–185. [Google Scholar] [CrossRef]
  89. Chovanová, H.H.; Korshunov, A.I.; Babčanová, D. Impact of brand on consumer behavior. Procedia Econ. Financ. 2015, 34, 615–621. [Google Scholar] [CrossRef] [Green Version]
  90. Lv, X.; Zhang, R.; Su, Y.; Yang, Y. Exploring how live streaming affects immediate buying behavior and continuous watching intention: A multigroup analysis. J. Travel Tour. Mark. 2022, 39, 109–135. [Google Scholar] [CrossRef]
  91. Guo, Y.; Zhang, K.; Wang, C. Way to success: Understanding top streamer’s popularity and influence from the perspective of source characteristics. J. Retail. Consum. Serv. 2022, 64, 102786. [Google Scholar] [CrossRef]
  92. Wongkitrungrueng, A.; Assarut, N. The role of live streaming in building consumer trust and engagement with social commerce sellers. J. Bus. Res. 2020, 117, 543–556. [Google Scholar] [CrossRef]
  93. Hou, F.; Guan, Z.; Li, B.; Chong, A.Y.L. Factors influencing people’s continuous watching intention and consumption intention in live streaming. Internet Res. 2019, 30, 141–163. [Google Scholar] [CrossRef]
Figure 1. Experimental scenario and screen of the Facebook live ecommerce platform.
Figure 2. A person wearing the eye tracker.
Figure 3. The equipment used for software analysis.
Figure 4. Eye calibration chart and calibration screen.
Figure 5. The definitions of the regions of interest (ROIs) on the Facebook live ecommerce platform.
Figure 6. Eye movement software analysis process.
Figure 7. Heat map of female (No. 6) (a) and male (No. 2) (b) participants.
Table 1. List of the LFF for each participant (time unit: ms).

| Eye Movement Indicator | Participants | ROI1 | ROI2 | ROI3 | ROI4 |
|---|---|---|---|---|---|
| LFF (Latency of First Fixation) | Male (N = 14) | 43,061 | 28,526 | 2910 | 13,372 |
| | order | 4 | 3 | 1 | 2 |
| | Female (N = 17) | 32,863 | 40,238 | 22,597 | 64,558 |
| | order | 2 | 3 | 1 | 4 |
Table 2. List of the DFFs of the ROIs of participants.

| Eye Movement Indicator | Participants | ROI1 | ROI2 | ROI3 | ROI4 |
|---|---|---|---|---|---|
| DFF (Duration of First Fixation) | Male (N = 14) | 1383 | 4394 | 1764 | 1268 |
| | order | 3 | 1 | 2 | 4 |
| | Female (N = 17) | 3226 | 3044 | 5360 | 1003 |
| | order | 2 | 3 | 1 | 4 |
Table 3. List of the TFD and NOF of the ROIs on a live ecommerce platform by sex (TFD unit: ms).

| Eye Movement Indicator | Participants | ROI1 | ROI2 | ROI3 | ROI4 |
|---|---|---|---|---|---|
| TFD (Total Fixation Durations) | Male (N = 14) | 9072 | 24,350 | 3305 | 4083 |
| | Female (N = 17) | 19,475 | 21,157 | 5352 | 2729 |
| | Total | 27,812 | 44,428 | 8657 | 6732 |
| | Mean | 920.87 | 1467.97 | 279.26 | 219.74 |
| NOF (Number of Fixations) | Male (N = 14) | 26 | 105 | 10 | 18 |
| | Female (N = 17) | 110 | 103 | 24 | 20 |
| | Total | 127 | 196 | 34 | 37 |
| | Mean | 4.39 | 6.71 | 1.10 | 1.23 |
Share and Cite

Chen, H.-C.; Wang, C.-C.; Hung, J.C.; Hsueh, C.-Y. Employing Eye Tracking to Study Visual Attention to Live Streaming: A Case Study of Facebook Live. Sustainability 2022, 14, 7494. https://doi.org/10.3390/su14127494