Article

Modeling Students’ Perceptions of Chatbots in Learning: Integrating Technology Acceptance with the Value-Based Adoption Model

by
Ahlam Mohammed Al-Abdullatif
Department of Curriculum and Instruction, King Faisal University, Al-Hasa 31982, Saudi Arabia
Educ. Sci. 2023, 13(11), 1151; https://doi.org/10.3390/educsci13111151
Submission received: 27 September 2023 / Revised: 7 November 2023 / Accepted: 16 November 2023 / Published: 17 November 2023

Abstract

As technology continues to advance, chatbots are likely to become an increasingly vital tool in education. This study investigates how students perceive and accept chatbots for use in learning activities. It examines the integrated relationships between the constructs of the technology acceptance model (TAM) and those of the value-based adoption model (VAM), namely perceived enjoyment, perceived risk, and perceived value, to predict students’ attitudes and, consequently, their acceptance of chatbots for learning in higher education. A total of 432 respondents participated in an online survey, and the proposed hypotheses were evaluated through partial least squares structural equation modeling (PLS-SEM). The study offers useful insights into chatbot adoption in Saudi higher education, as the results highlight important drivers of chatbot acceptance among students, including perceived usefulness, perceived ease of use, attitude, perceived enjoyment, and perceived value. Perceived risk was not a significant predictor of students’ attitudes or of their acceptance of chatbot use in learning. The results are expected to foster the adoption of chatbot technology in supporting distance learning in Saudi Arabian higher education.

1. Introduction

The educational sector is one of many being revolutionized by the advent of artificial intelligence (AI), and research has increasingly considered the potential of using AI-powered tools in the classroom. Chatbot technology is at the forefront of the tools that have recently been adopted in teaching and learning practices. Chatbots are computer programs that imitate human conversation to provide a new approach to exploring, building, and sharing knowledge [1]. As defined by Pérez et al. [2], a chatbot is a tool based on machine learning algorithms and natural language processing (NLP) that comprehends and responds to user inquiries through text, voice, or avatars. Chatbot technology uses AI to replicate human-to-human dialogue and is usually embedded in chat platforms or other applications, which gives it a user-friendly interface. Since Joseph Weizenbaum created ELIZA, the first chatbot, in 1966, chatbot use has grown at an increasing pace, with many industries adopting the technology to improve customer service and engagement [3]. A recent report by Research and Markets [4] predicts that the global chatbot market will reach a value of USD 3.99 billion by 2030, growing at a rate of 25.7% over 2022–2030. According to the report, this growth is driven by the rising demand for automated services and advancements in NLP. The education chatbot market is projected to grow at a rate of 30.8% between 2020 and 2027 due to the increasing popularity of messaging platforms and the growing trend of creating more personalized learning experiences.
Using chatbot technology, students receive a more customized and motivating learning experience [5,6]. In terms of availability and flexibility, chatbots provide distinctive opportunities as communication and informational tools for digital learning [7]: they deliver immediate personalized feedback to learners, assist with complex problem-solving, and offer support outside the classroom, enabling learners to access resources and assistance at any time [8,9]. Additionally, chatbots can be designed to interact with learners in a fun, engaging way, contributing to increased motivation and interest in learning [10,11]. Beyond supporting learners, chatbots can also support and enhance teaching practices [12,13]. In educational settings with a large number of students per instructor, typically exceeding 100, chatbots play a crucial role in enabling teachers to deliver personalized assistance that addresses the needs and preferences of individual learners [14].
Although chatbot technology holds promising potential for enhancing and facilitating learning, uncertainties persist regarding its acceptance and adoption among learners. Hence, additional research is necessary to gain a thorough understanding of the factors that influence learners’ acceptance and utilization of this innovative technology. Chatbots are still in the initial phases of their utilization [15,16], and they present many challenges that may affect students’ acceptance of their use in learning [17]. The challenges mentioned in related studies include accessibility and usability, including technical issues and user-friendliness [2,18,19]; ethical concerns, including privacy and security risks [20,21]; and students’ attitudes toward chatbot use in learning [22].
Accordingly, gaining insights into the drivers of students’ acceptance of utilizing and adopting chatbots in an educational context is crucial. Most research studies in the field are of an empirical nature, measuring chatbots’ effects on several aspects of learning, such as motivation [23,24,25], engagement [26,27], interaction [28], academic performance [1,24,29], learning strategies [30], and learning self-efficacy [1], in specific domains such as language learning [31,32,33] and learning programming [34]. Additionally, many studies focus on teachers’ acceptance of chatbot utilization in teaching practices [35,36,37,38,39,40]. The teacher’s role is vital, but it is not sufficient for the successful adoption of chatbots, as students’ acceptance plays the most important role in their adoption and effectiveness. In the context of Saudi higher education, the implementation of chatbots is relatively new, and students’ perspectives on accepting chatbots in learning have scarcely been examined.
Therefore, this study investigated students’ technological acceptance of chatbot use in learning, examining the critical factors driving their acceptance. This study integrated the technology acceptance model (TAM) [41] and the value-based adoption model (VAM) [42] as a theoretical foundation for its investigation. The primary aim was to examine how the TAM factors (perceived ease of use, perceived usefulness, and attitude) and the VAM factors (perceived enjoyment, perceived risks, and perceived value) interact to predict the acceptance of chatbots among students at Saudi universities. This investigation aims to contribute to the literature by identifying the determining factors of students’ acceptance of chatbots in learning, which has not received adequate attention in Saudi Arabia. This study’s findings provide valuable insights that help chatbot developers and tertiary institutions in Saudi Arabia understand the drivers of students’ acceptance when providing a chatbot-based learning environment.

2. Theoretical Foundations

2.1. Chatbots in Education

Chatbots are increasingly popular in educational contexts due to their capacity to mimic human conversation, automate educational services, and minimize teachers’ efforts [22]. This growing popularity may be attributed to several reasons. First, the COVID-19 pandemic accelerated the acceptance of chatbots in education. The transition to remote learning and online education has made chatbots an invaluable resource for helping learners and supporting them beyond the traditional classroom. Chatbots can help learners register for a course, deliver customized feedback on assignments, and provide round-the-clock technical and learning support [24]. The main functions of chatbots are providing personalized interaction to users and responding to their inquiries and concerns [43]. Furthermore, the proportion of students to teachers has consistently risen, particularly due to the expansion of distance education and the popularity of massive open online courses (MOOCs), which have attracted larger numbers of participants [14,25]. In this context, teachers struggle to provide support and individual follow-up, affecting students’ learning, causing dissatisfaction, and thus increasing drop-out rates [43]. Moreover, the growing population of mobile device users and the widespread availability of messaging applications have led to a heightened dependence on mobile technology in education [33,44], making mobile learning the most favored learning mode among students in higher education [6]. Sandu and Gide [13] predict that chatbots will emerge as the favored technological solution for addressing students’ educational issues, driven by their increased availability, accessibility, and user-friendly nature.
The use of chatbot technology is becoming a noteworthy resource for educational purposes. Chatbots can engage with students as advisors, tutors, classmates, or gamers and have the potential to promote their motivation, cognitive skills, and overall learning performance [22,23,45,46]. The chatbot-based learning environment allows students to take charge of their own education, empowering them to set their own learning priorities. This is enabled through the division of learning components into segments and the arrangement of learning assignments, offering learners a range of tasks along with ongoing assistance and feedback [30]. As a result, learners can adeptly acquire the necessary knowledge and skills with efficiency and efficacy [2,11,33]. Chatbots encourage collaborative learning and enable the sharing of educational resources among users, irrespective of their geographical location or time zone [22]. This promotes a more personalized educational experience, as they can provide learning modules tailored to each student’s unique learning style. Learners, via chatbots, can assess their behavior and monitor their progress, which fosters their metacognitive learning skills [30].
Chatbots facilitate mobile learning, allowing students to access learning materials anytime and anywhere, making chatbots a useful application to support ubiquitous learning [47]. According to Troussas et al. [6] and Wollny et al. [7], chatbots are able to take creative approaches to give exams, evaluations, and feedback that accord with the physical properties of mobile devices, enabling learners to interact with the learning content rapidly and receive quick feedback. Furthermore, chatbots are able to stimulate students’ abilities to perform higher-order thinking, cultivate their self-efficacy in learning, encourage effective self-management, and elevate self-regulation in learning [2,45,48,49,50]. Overall, the utilization of educational chatbots is a game changer. With the potential to revolutionize learning and teaching, these cutting-edge tools are helping educational institutions adapt to the ever-changing landscape of modern education.

2.2. Related Work on Chatbot Acceptance in Learning

Many studies have examined students’ perspectives on chatbot technology in learning in higher education, finding a high willingness to use chatbots among university students and a great demand for their use [13,31,51]. Table 1 summarizes past research on chatbot acceptance in learning among higher education students. The research studies have been conducted in various learning settings and contexts, mostly in language learning [11,31,52,53] and online learning [23,51]. Chatbots have been used in these studies as teaching agents to support student learning, and students’ acceptance of chatbot use in learning practices was assessed using various theoretical models of users’ technological acceptance, such as the TAM, its extended forms (extended TAM), and both the Unified Theory of Acceptance and Use of Technology (UTAUT) and its updated version (UTAUT2). These studies’ results reveal university students’ high level of technological acceptance regarding the adoption and use of chatbot technology in learning. Their acceptance is influenced by several factors, including accessibility and availability [16,46], personalized learning experience [13,54], interaction and prompt feedback [54,55], user friendliness [56,57], utility in learning [31,57], attitude [16,22,58], self-efficacy [1,14], enjoyment [59], trust, and perceived risk [21,28,55,59]. Overall, these studies highlight that students perceive chatbots as intelligent tools capable of improving their learning performance.

2.3. Technology Acceptance Model

Davis [41] developed the TAM as a theory of user acceptance, and it is one of the more cited models for understanding individuals’ acceptance behavior toward technology [31,62]. Based on the TAM, perceived usefulness and perceived ease of use are the two primary components influencing rejection or acceptance of any technology [63]. Perceived usefulness means individuals see a particular technology as useful in supporting their job performance, whereas perceived ease of use is their belief that a particular technology is easy and requires no great effort to perform a task [64]. As proposed by Davis, these two components have a direct positive effect on influencing an individual’s attitude, the third component of the TAM. Attitude has been proven to be a significant mediating factor in predicting individuals’ acceptance and adoption behavior toward technology [56]. Attitude is defined as individuals’ positive or negative opinions regarding technology use [64]; the more people perceive technology as useful and easy to use, the more likely they are to have a positive attitude about it, and thus the greater their acceptance of adopting it in the future. According to the TAM, the three factors of perceived usefulness, perceived ease of use, and attitude significantly predict 40–50% of individuals’ willingness to use technology [65].

2.4. Value-Based Adoption Model

TAM is a robust model that is frequently used to judge individuals’ acceptance of new technology [62]. In a practical context, however, the TAM factors are limited in predicting individuals’ decision-making processes when accepting or refusing a new technology [42,66]. To overcome this limitation, Kim et al. [42] proposed the VAM, which explains technology adoption based on the TAM [64] and incorporates the concept of perceived value, as defined by Zeithaml [67]. The VAM is based on the principle of understanding the underlying motivations (intrinsic and extrinsic) that influence users’ intentions to accept and use a particular technology [68]. It highlights the importance of perceived value, which is a powerful predictor of usage intentions and acceptance. Kim et al. [69] describe the VAM as “a cost–benefit paradigm that reflects the decision-making process where the decision to use is made by comparing the cost of uncertainty in choosing a new technology or product” (p. 1151). When examining technology use intention, according to Kim et al. [42], perceived value is predicted by two primary determinants: the benefits (usefulness and enjoyment) that individuals obtain and the relative sacrifices (perceived risk) that they make. In this study, perceived value represents students’ evaluation of the balance between perceived advantages and potential risks associated with the utilization of chatbots. If students perceive chatbots as enhancing their learning experience (valuable), they are more inclined to accept and adopt them.

2.5. The Integrated Model of TAM and VAM in Accepting Chatbots in Learning

Kim et al. [69] promoted the integration of the TAM and VAM to comprehensively reflect the decision-making process wherein users weigh benefits and sacrifices before accepting and using new technology. In the context of AI-based products, Sohn and Kwon [70] compared several technology acceptability models, including the TAM, UTAUT, VAM, and theory of planned behavior (TPB). Among those models, they discovered that the VAM was the most effective in predicting users’ acceptance and adoption of AI-based products. Based on individuals’ value perspectives, several studies have combined the VAM with other models. For example, Hsiao and Chen [71] extended a VAM-based research model with the factors of environmental concern and habit to evaluate university students’ adoption of e-book subscription services. Kim et al. [69] integrated the TAM and VAM to examine users’ acceptance and adoption intentions toward Internet of Things (IoT) smart home services. Kim et al. [72] integrated the VAM with the expectation and confirmation model to investigate customers’ continuous intention to use online application services. Liao et al. [62] adopted an integrated model of the TAM and VAM to assess consumers’ adoption of e-learning technologies, and Liang et al. [73] proposed integrating the VAM and transaction cost theories to predict consumers’ adoption of sharing platforms.
In the context of educational chatbots, several recent studies have investigated students’ acceptance of chatbot technology in learning, as summarized in Table 1. Most of the studies examined students’ acceptance from the perspective of the UTAUT and UTAUT2 models [28,57,58,60,61]. Few studies relied on the original TAM [31,51], whereas most extended the TAM with other external factors to measure students’ acceptance of chatbot use in learning [16,52,54,55,69]. Within the scholarly literature on chatbot acceptance in the educational context, the VAM has received little attention. The VAM emphasizes the importance of delivering educational content that is customized to the specific needs of each learner. It includes important elements commensurate with the nature of chatbots, namely perceived benefits, perceived enjoyment, perceived risks, and overall perceived value, so it is important to integrate those elements with the TAM and investigate their impact on students’ acceptance of chatbots in learning. This study extends the extant literature by integrating the TAM with VAM elements to assess chatbot acceptance, an approach that has not previously been investigated in this area. Figure 1 illustrates the integrated model proposed for this study.

2.6. Relationships in the Proposed Model

2.6.1. Relationships in the TAM

In the context of chatbot use in education, the TAM is by far the most used model for investigating both students’ and teachers’ perceptions of chatbot use in learning [31,62]. Perceived usefulness describes the degree to which students believe that chatbot technology benefits them by improving their learning performance. This includes improving the interaction process, learning activities, feedback, assessment, and learning outcomes. The chatbot’s perceived ease of use describes the extent to which students expect that dealing with it is easy, uncomplicated, and requires little effort or time. Attitude is defined as students’ opinions of the potential and utility of integrating chatbot technology in learning [16]. Many previous studies have proven that these three components of the TAM significantly predict students’ and teachers’ acceptance of chatbot technology. For example, Chen et al. [31] confirm in their study that the perceived usefulness (expected benefit) of chatbots positively affects students’ attitudes toward chatbot acceptance and use behavior in learning. Other studies have yielded similar results, such as [28,35,36,60,61,74]. In addition, Kumar and Silva [55] found that perceived ease of use (expected effort) positively affects students’ attitudes toward accepting the use of chatbots in learning, as did many other studies [16,35,52,60,61,74]. Regarding the factor of attitude, several studies indicate that students’ attitudes toward chatbots are a main predictor of their acceptance and use behavior [22,35,52,56,59,75]. Drawing upon the TAM, the present study aimed to determine what leads Saudi university students to accept or reject the adoption of chatbot technology in their learning. The intent was to revalidate the TAM’s inferences in the context of students’ acceptance of chatbot technology in Saudi Arabian higher education by exploring the following hypotheses:
Hypothesis 1 (H1).
Perceived ease of use positively predicts students’ attitudes toward using chatbots in learning.
Hypothesis 2 (H2).
Perceived usefulness positively predicts students’ attitudes toward using chatbots in learning.
Hypothesis 3 (H3).
Attitudes positively predict students’ acceptance of using chatbots in learning.

2.6.2. Relationships in the VAM

The VAM accounts for perceived benefits, a significant predictor of users’ acceptance of technology [42]. Perceived benefits include two factors: perceived enjoyment and perceived usefulness. In the chatbot context, the extent to which students believe that interacting with chatbots would improve their learning performance was defined as perceived usefulness. This could be in the form of a better understanding of concepts, improved communication with teachers, or increased engagement with learning materials, potentially convincing students that the advantages of chatbots in their learning performance outweigh the costs. Yu et al. [76] indicate that perceived usefulness mediated by perceived value was the most significant element influencing the adoption of media tablets, and Liao et al. [62] conclude that perceived usefulness significantly predicted the perceived value of adopting e-learning systems. Similar results are claimed in other studies [72,73,77]. Thus, we proposed the following hypothesis:
Hypothesis 4 (H4).
Perceived usefulness positively affects students’ perceived value of using chatbots in learning.
While perceived usefulness is important for its functional benefits (utilitarian value), perceived enjoyment is important for its emotional benefits (hedonic value) [78]. According to the VAM, perceived enjoyment strongly influences perceived value to predict technology adoption [42]. Students obtain benefits from chatbots that are exciting and fun in addition to improving learning. Therefore, perceived enjoyment in the present research denotes the extent to which students perceive that using chatbots offers interesting and delightful learning experiences. By providing an enjoyable and satisfying learning experience, chatbots can encourage students to spend more time interacting with them, which can lead to better learning outcomes. Many researchers have shown that the relationship between perceived enjoyment and perceived value is strongly significant [31,62,70,77,79].
Hypothesis 5 (H5).
Perceived enjoyment positively affects students’ perceived value of using chatbots in learning.
The VAM considers perceived sacrifice as the second significant factor influencing adoption decisions. This refers to the perceived risks that students may experience when using chatbots, including both monetary and nonmonetary aspects [69]. The monetary aspect includes the actual financial risk associated with the purchase or use of chatbots (which is not applicable in this study context, as Saudi higher education students are not responsible for any part of the cost of using chatbots). The nonmonetary aspect refers to the intangible risk associated with chatbots’ efficiency—that is, concerns about time, effort, security, and privacy [62]—which negatively impacts students’ perceived value of chatbots [80]. Chatbots are AI applications based on internet technology, and Chatterjee and Bhattacharjee [56] note that the “unfriendly nature of internet functions is instrumental for behavioral insecurity” (p. 3446). When using chatbots, people must consider information leakage, virus transmission, and other security and privacy concerns [81]. In the present study, students might hesitate to adopt chatbots if they perceive the risks (security and privacy) as outweighing the potential benefits. Many recent studies confirm the negative relationship between perceived value and perceived risks in predicting technology adoption [62,69,72,76], inspiring the following hypothesis:
Hypothesis 6 (H6).
Perceived risks negatively affect students’ perceived value of using chatbots in learning.
According to research on consumer behavior, perceived value significantly shapes consumers’ intentions toward a product [82]. Zeithaml [67] (p. 14) defines perceived value as “the consumer’s overall assessment of the utility of a product based on perceptions of what is received and what is given”. In the VAM, perceived value is the mediating variable to predict users’ adoption of new technology [42]. In the present study, chatbots’ perceived value is likely to increase when students’ learning experiences are enhanced through additional benefits and fewer risks. In information system research, perceived value is recognized as a significant predictor of technology acceptance and adoption [71,73,76,77,79,83,84]. In reference to the previous discussions and recognizing the impact of value perceptions on students’ acceptance of chatbots, we proposed the following hypothesis:
Hypothesis 7 (H7).
Perceived value positively affects students’ acceptance of using chatbots in learning.

2.6.3. The Integrated Relationships of the TAM and VAM

According to Kim et al. [42], the concept of maximum value is the basic assumption in consumers’ decisions, and value represents both costs and benefits. The perceived benefit of the VAM includes perceived usefulness [69], which, according to Davis [63], is significantly influenced by perceived ease of use. In the present study, perceived ease of use describes the simplicity of a chatbot’s operation. If students find a chatbot easy to use, they are more likely to consider it a valuable tool. This involves a smooth, intuitive user interface, clear, concise instructions, and easy navigation. Research has found a strong relationship between perceived ease of use and perceived value in people’s acceptance behavior [68,70,72]. In this case, an easy-to-use chatbot saves students time and effort, making it more convenient for them to use it (the benefit is greater than the loss). Overall, chatbots that are easy to use are more likely to be successful in attracting and retaining students. Therefore, we proposed the following hypothesis:
Hypothesis 8 (H8).
Perceived ease of use positively affects students’ perceived value of chatbots in learning.
In the field of technology adoption, the relationship between perceived enjoyment and user attitude toward accepting new technology represents a significant research focus [70]. In the VAM, perceived enjoyment reflects individuals’ emotional benefits from using a given technology [69]. In this study, perceived enjoyment indicates the extent to which students feel that interacting with chatbots while learning is fun, delightful, and enjoyable. When students perceive a chatbot to be enjoyable and pleasurable to interact with and learn from, they are more inclined to hold a favorable view of it and be willing to adopt it. Perceived enjoyment enhances the overall user experience, making the technology more appealing and engaging [62,81]. Chatbot technology may provide an engaging learning environment that satisfies students’ demand for sociability as well as their curiosity about new technology [80]. Research has shown that perceived enjoyment is positively related to user attitudes toward accepting new technology [62,68,72,77,85], in particular chatbot technology [59,81,86]. Accordingly, we formulated the following hypothesis:
Hypothesis 9 (H9).
Perceived enjoyment positively affects students’ attitudes toward using chatbots in learning.
Perceived risk is regarded as a critical factor influencing users’ attitudes toward adopting and accepting new technologies [21,59,68,80]. Zhang et al. [87] suggest that the risks involved in adopting AI-based tools should be considered in the education context. In this study, perceived risk refers to the degree to which students believe that using chatbots may have negative consequences with regard to privacy and security. Research has demonstrated a strong relationship between perceived risk (used interchangeably by researchers with perceived trust) [74,86] and user attitude toward accepting new technology. Individuals may hold negative opinions about a new technology, making them reluctant to embrace its adoption if they perceive a high level of risk (trust loss) associated with its use. A study by Chatterjee and Bhattacharjee [56] found that perceived risk had a significant negative impact on stakeholders’ attitudes toward the adoption of AI in higher education in India. In the context of chatbot adoption, several studies have concluded that increased perceived risks negatively affect users’ attitudes toward technology use [28,59,80,81,88]. Therefore, this study presumed the following:
Hypothesis 10 (H10).
Perceived risks negatively affect students’ attitudes toward using chatbots in learning.
Research on information systems and technology adoption has established the positive influence of perceived value on users’ attitudes toward technology adoption [27,80,83]. According to Turel et al. [89], the more individuals perceive a technology as valuable, the greater their positive attitude or intention toward using the technology. According to research, perceived values are positively associated with attitude in a variety of settings. In studying IoT adoption in smart homes, Kim et al. [69] found that users with higher perceived value had a positive attitude toward using the service. Similar findings are described by Ashfaq et al. [90] on the use of smart speaker technology and by Hsiao and Chen [71] on e-book subscription services. Therefore, we predicted that perceived value positively impacts students’ attitudes toward using chatbots in learning under the following hypothesis:
Hypothesis 11 (H11).
Perceived value positively affects students’ attitudes toward using chatbots in learning.
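To summarize the model in Figure 1, the eleven hypothesized paths can be written as a compact specification of the structural (inner) model. The sketch below is illustrative only: the construct abbreviations (PEOU, PU, PE, PR, PV, ATT, ACC) and the data structure are shorthand introduced here, not output or code from the study itself.

```python
# Hypothesized structural (inner) model: each endogenous construct is listed
# with its predictors, the expected direction of the path, and the hypothesis
# label (H1-H11). Abbreviations (illustrative): PEOU = perceived ease of use,
# PU = perceived usefulness, PE = perceived enjoyment, PR = perceived risk,
# PV = perceived value, ATT = attitude, ACC = chatbot acceptance.
HYPOTHESIZED_PATHS = {
    "ATT": [("PEOU", "+", "H1"), ("PU", "+", "H2"), ("PE", "+", "H9"),
            ("PR", "-", "H10"), ("PV", "+", "H11")],
    "PV":  [("PU", "+", "H4"), ("PE", "+", "H5"), ("PR", "-", "H6"),
            ("PEOU", "+", "H8")],
    "ACC": [("ATT", "+", "H3"), ("PV", "+", "H7")],
}

if __name__ == "__main__":
    for outcome, predictors in HYPOTHESIZED_PATHS.items():
        for source, sign, label in predictors:
            print(f"{label}: {source} -({sign})-> {outcome}")
```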

3. Methods

3.1. Data Collection and Participants

The purpose of this study was to examine university students’ perceptions of chatbot technology in general; students may, for instance, use chatbots developed by their instructors for particular courses or generative chatbots such as ChatGPT and Bard. Data were therefore gathered from students at three universities in Saudi Arabia during February and March of the 2023 academic year. An electronic link to the survey questionnaire was sent via university e-mails and social networking platforms (e.g., WhatsApp and Telegram) to the potential participants, all of whom were enrolled undergraduate or postgraduate students. Participants were provided with informed consent forms guaranteeing the confidentiality of their participation. They were given a period of six weeks to voluntarily complete and submit the online survey. A total of 432 complete responses were received, which is considered a sufficient sample size according to Weisberg and Bowen’s sample size criteria for the social sciences [91]. Table 2 provides the sample profile. Of the respondents, 72.2% were female, most were undergraduates (89.4%), and most were aged 19–22 years (86.1%). The respondents came from various colleges across diverse academic domains, including health sciences (11.6%), humanities (33.3%), social sciences (23.8%), pure sciences (21.3%), and computer sciences and information technology (10%).

3.2. Data Analysis

Initially, the data were imported and organized using the Statistical Package for the Social Sciences (SPSS) version 26. Subsequently, they were analyzed through partial least squares structural equation modeling (PLS-SEM) using SmartPLS 4.0 software. Hair et al. [92] note that there are two primary phases to a PLS-SEM analysis: first, evaluating the outer model, called the measurement model, by calculating metrics including factor loadings, internal consistency reliability, convergent validity, and discriminant validity; second, evaluating the inner model, called the structural model, which involves hypothesis testing among the model constructs. The PLS-SEM analysis findings in this study follow Hair et al.’s [92,93] guidelines.

3.3. Measurement

This study utilized a preexisting survey questionnaire to evaluate how participants perceived the seven constructs introduced in the research model proposed in Figure 1. The TAM constructs were adopted from Davis [63] and comprised perceived ease of use (4 items), perceived usefulness (4 items), attitude toward using (4 items), and chatbot acceptance (5 items). The VAM constructs adopted from Liao et al. [62] included perceived enjoyment (3 items), perceived risks (3 items), and perceived value (4 items). Three educational technology professors were invited to revise all the items to ensure clear, appropriate wording, which resulted in a slight modification to the wording of a few items. In the initial part of the survey, demographic details of the participants were gathered, while the subsequent part involved gauging the participants’ perceptions on the seven constructs using a 5-point Likert scale ranging from 1 (strongly disagree) to 5 (strongly agree) and comprising 27 items. The full survey questionnaire is shown in Appendix A.
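To illustrate how the 27 Likert-scale items map onto the seven constructs, the sketch below organizes hypothetical survey responses (rated 1–5) and computes simple composite (mean) scores per construct. The item column names are invented for illustration, and simple averaging is only a rough stand-in for the indicator weighting that PLS-SEM actually performs.

```python
import pandas as pd

# Hypothetical item-to-construct mapping mirroring the questionnaire in
# Appendix A (4 + 4 + 3 + 3 + 4 + 4 + 5 = 27 items, each rated 1-5).
CONSTRUCT_ITEMS = {
    "perceived_ease_of_use": ["peou1", "peou2", "peou3", "peou4"],
    "perceived_usefulness":  ["pu1", "pu2", "pu3", "pu4"],
    "perceived_enjoyment":   ["pe1", "pe2", "pe3"],
    "perceived_risk":        ["pr1", "pr2", "pr3"],
    "attitude":              ["att1", "att2", "att3", "att4"],
    "perceived_value":       ["pv1", "pv2", "pv3", "pv4"],
    "chatbot_acceptance":    ["acc1", "acc2", "acc3", "acc4", "acc5"],
}

def composite_scores(responses: pd.DataFrame) -> pd.DataFrame:
    """Average each respondent's 1-5 ratings within a construct."""
    return pd.DataFrame({construct: responses[items].mean(axis=1)
                         for construct, items in CONSTRUCT_ITEMS.items()})
```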

4. Results

4.1. Measurement Model Analysis

The first stage in evaluating the measurement model involved determining construct validity, which refers, according to Hair et al. [92,93], to how well the items measure the intended concept. All the items’ calculated indicator loadings are displayed in Table 3. Hair et al. [92,93] advise accepting loadings greater than 0.7, and all the indicators returned loading values between 0.78 and 0.95, indicating a good to high level of loading [92,93]. Next, we evaluated internal consistency reliability by computing the Cronbach’s alpha coefficient (α) and composite reliability (CR) of each construct. All the constructs had a CR greater than 0.7, most had a CR greater than 0.90, and all the α values ranged from 0.84 to 0.90, signifying a reliability level that is considered good to high (>0.7) [92,93]. Afterwards, convergent validity was assessed by computing the average variance extracted (AVE) for all the constructs. The values for the seven constructs varied between 0.68 and 0.83, surpassing the 0.5 threshold suggested by Hair et al. [92,93]. This means that each construct explains at least 50% of the variance in its items.
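The reliability and convergent-validity statistics reported above follow standard formulas, which the sketch below re-implements for reference. The loadings argument stands for the standardized outer loadings of one construct’s indicators; this is an illustrative re-implementation under Hair et al.’s thresholds (0.7 for α and CR, 0.5 for AVE), not the SmartPLS code used in the study.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents x k_items) matrix of one construct."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

def composite_reliability(loadings) -> float:
    """CR = (sum(l))^2 / ((sum(l))^2 + sum(1 - l^2)) from standardized loadings."""
    loadings = np.asarray(loadings, dtype=float)
    num = loadings.sum() ** 2
    return num / (num + (1 - loadings ** 2).sum())

def average_variance_extracted(loadings) -> float:
    """AVE = mean of squared standardized loadings; should exceed 0.5."""
    loadings = np.asarray(loadings, dtype=float)
    return float((loadings ** 2).mean())

# Example with loadings in the range reported for this study (0.78-0.95):
print(composite_reliability([0.82, 0.85, 0.79, 0.91]))        # ~0.91
print(average_variance_extracted([0.82, 0.85, 0.79, 0.91]))   # ~0.71
```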
The last step in evaluating a measurement model is assessing discriminant validity, a measure of how well a construct in a structural model is distinguished from the other constructs that measure different concepts [92,93]. Two criteria were applied. Under the Fornell–Larcker criterion, the square root of the AVE of each construct should surpass the correlations between that construct and all other constructs in the structural model, as outlined by Hair et al. [92,93]. Under the heterotrait–monotrait (HTMT) criterion, the HTMT ratios should be less than 0.85 [92,93]. As illustrated in Table 4, the square roots of all AVE values surpass the construct correlations, and the HTMT ratios fall below 0.85. Based on these results, discriminant validity is confirmed for this study’s model, indicating that the measurement model is reliable and valid and that the results of analyses conducted using the model can be trusted.
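For reference, the HTMT ratio for a pair of constructs can be computed from the absolute item correlations as the mean heterotrait (between-construct) correlation divided by the geometric mean of the two average monotrait (within-construct) correlations. The sketch below is an illustrative implementation of that definition, assuming a data frame of item responses with hypothetical column names; it is not the SmartPLS routine used in the study.

```python
import numpy as np
import pandas as pd

def htmt(data: pd.DataFrame, items_a: list, items_b: list) -> float:
    """HTMT ratio for two constructs given their item columns.

    Mean heterotrait correlation divided by the geometric mean of the
    average monotrait correlations; values below 0.85 suggest the two
    constructs are empirically distinct.
    """
    corr = data[items_a + items_b].corr().abs()
    hetero = corr.loc[items_a, items_b].to_numpy().mean()

    def mean_monotrait(items):
        sub = corr.loc[items, items].to_numpy()
        return sub[np.triu_indices_from(sub, k=1)].mean()

    return hetero / np.sqrt(mean_monotrait(items_a) * mean_monotrait(items_b))
```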

4.2. Structural Model Analysis

Once the suitability of the measurement model was confirmed, we proceeded to assess the structural model. This involved analyzing the standardized path coefficients (β), standard errors (SE), t-values (t), and the respective significance levels (p-values) for each hypothesis, following the guidelines provided by Hair et al. [92,93]. Table 5 and Figure 2 show the results of testing the 11 hypotheses. It was found that perceived ease of use (β = 0.14, SE = 0.08, t = 1.73, p > 0.05), perceived enjoyment (β = 0.13, SE = 0.08, t = 1.67, p > 0.05), and perceived risks (β = −0.01, SE = 0.06, t = 0.19, p > 0.05) had no significant effect on students’ attitudes toward using chatbots in learning, meaning that hypotheses 1, 9, and 10 are rejected. However, perceived usefulness (β = 0.30, SE = 0.07, t = 4.51, p < 0.001) and perceived value (β = 0.35, SE = 0.10, t = 3.50, p < 0.001) showed a significant positive effect on students’ attitudes toward chatbot use in learning. Thus, hypotheses 2 and 11 are accepted. Furthermore, the results indicate that students’ perceived value of using chatbots in learning is significantly and positively influenced by their perceived enjoyment (β = 0.45, SE = 0.06, t = 7.28, p < 0.001) and perceived ease of use (β = 0.34, SE = 0.08, t = 4.20, p < 0.001). Accordingly, hypotheses 5 and 8 are supported. However, the results reveal that students’ perceived value of using chatbots in learning was not significantly affected by either perceived usefulness (β = 0.04, SE = 0.08, t = 0.56, p > 0.05) or perceived risks (β = −0.03, SE = 0.06, t = 0.59, p > 0.05), causing the rejection of hypotheses 4 and 6. With regard to students’ acceptance of using chatbots in learning, the results indicate that both students’ attitudes (β = 0.42, SE = 0.10, t = 4.62, p < 0.001) and their perceived value (β = 0.41, SE = 0.08, t = 4.96, p < 0.001) had a similarly strong significant positive effect on students’ acceptance of chatbot use, which supports hypotheses 3 and 7.
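The standard errors and t-values in Table 5 come from SmartPLS bootstrapping. As a simplified illustration of the underlying logic, the sketch below bootstraps an ordinary least squares path coefficient computed on composite scores; it is a stand-in for, not a reproduction of, the full PLS-SEM bootstrapping procedure, and all variable names are hypothetical.

```python
import numpy as np

def bootstrap_path(X: np.ndarray, y: np.ndarray, n_boot: int = 5000,
                   seed: int = 0) -> tuple:
    """Bootstrap an OLS path coefficient (first predictor): returns (beta, SE, t)."""
    rng = np.random.default_rng(seed)
    n = len(y)

    def beta_hat(Xs, ys):
        design = np.column_stack([np.ones(len(ys)), Xs])  # intercept + predictors
        coefs, *_ = np.linalg.lstsq(design, ys, rcond=None)
        return coefs[1]  # coefficient of the first predictor

    beta = beta_hat(X, y)
    draws = np.array([beta_hat(X[idx], y[idx])
                      for idx in (rng.integers(0, n, n) for _ in range(n_boot))])
    se = draws.std(ddof=1)
    return beta, se, beta / se
```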
Since this study aimed to evaluate students’ acceptance of using chatbots in learning, the predictive power of the research model was measured using the R2 value, which represents the variance in a dependent construct that is explained by the independent constructs, as recommended by Henseler et al. [94]. Henseler et al. [94] rate R2 values of 0.67, 0.33, and 0.19 for a dependent construct as excellent, moderate, and low, respectively. In this study, the R2 values for attitude (0.58), perceived value (0.52), and chatbot acceptance (0.56) indicate moderate to high predictive ability for all three dependent constructs (shown in Table 3).
The Q2 value indicates the out-of-sample predictive relevance of the model [94]; a higher Q2 value indicates that the model can predict the dependent construct accurately even when applied to data that were not used to estimate the model. Q2 values of 0.35 and above are deemed substantial, according to Hair et al. [93]. The Q2 values for the dependent constructs (attitude = 0.49, perceived value = 0.51, and chatbot acceptance = 0.44, shown in Table 3) establish the substantial predictive relevance of the proposed model, suggesting that it can predict students’ acceptance of using chatbots in learning with a high degree of accuracy.
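R2 and Q2 were obtained from SmartPLS. As a rough illustration of what these indices capture, the sketch below computes an in-sample R2 and a k-fold cross-validated analogue of predictive relevance on composite scores; this simplification differs from the blindfolding procedure that produces Q2 in PLS-SEM and is shown only to convey the idea of out-of-sample prediction.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_predict

def r_squared_and_q_squared(X: np.ndarray, y: np.ndarray, folds: int = 10):
    """In-sample R^2 and a k-fold cross-validated analogue of Q^2."""
    r2 = LinearRegression().fit(X, y).score(X, y)          # variance explained in-sample
    y_cv = cross_val_predict(LinearRegression(), X, y, cv=folds)
    q2 = 1 - np.sum((y - y_cv) ** 2) / np.sum((y - y.mean()) ** 2)
    return r2, q2
```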

5. Discussion and Implications

As educational institutions increasingly incorporate chatbot technology into their teaching methodologies, understanding students’ acceptance is paramount. The VAM, with its emphasis on perceived value, provides a robust and comprehensive framework for achieving this understanding, so this study examined the viability of combining the TAM and VAM to predict students’ acceptance of chatbots in learning. This study differs from work in the extant literature by integrating VAM-related factors (perceived enjoyment, perceived risk, and perceived value) with the three main factors of the TAM (perceived ease of use, perceived usefulness, and attitude). In addition, to our knowledge, this is the first study examining the drivers of chatbot acceptance among university students in Saudi Arabia. The study contributes to the existing body of knowledge regarding chatbots and provides a set of recommendations applicable to the educational sector (policymakers, instructors, and chatbot designers) for effectively employing chatbots in Saudi Arabia. This section addresses the study’s findings and their implications. The findings are divided and discussed in accordance with the relationships among the variables proposed in the research model.

5.1. Perceived Ease of Use, Attitude, and Perceived Value

According to TAM theory, perceived ease of use is one of the important factors associated with improving users’ attitudes toward accepting and adopting new technologies [63]. In the context of chatbots, many studies emphasize the role of perceived ease of use in predicting students’ attitudes [16,36,51,52,55,60,74]. This study’s results show that the relationship between perceived ease of use and attitude (H1) was not significant, contradicting most previous research on chatbots. It is, however, consistent with a previous study by Chen et al. [31], which found that perceived ease of use was not a predictor of students’ acceptance of chatbots for learning Chinese vocabulary. By contrast, our study indicates that perceived ease of use significantly contributed to predicting students’ perceived value of chatbots (H8), meaning that students highly appreciate chatbots as valuable learning tools when they are easy to operate and manage in learning practices. This result implies that instructors and designers should focus on developing chatbots that are intuitive, user-friendly, and provide valuable support to students, enabling them to facilitate their learning with the least time and effort. Achieving ease of use in a chatbot may include designing a simple interface with clear, easy-to-understand buttons, menus, and icons; providing multiple interactive options to improve the user experience and make it easier; offering quick, customized answers to students’ inquiries and questions; making the chatbot compatible with various devices (smartphones, tablets, and desktop computers); and providing direct support services (live chat, email, and telephone) to answer students’ questions and provide the necessary technical assistance.

5.2. Perceived Usefulness, Attitude, and Perceived Value

Perceived usefulness is a prominent factor positively influencing users’ attitudes toward accepting and adopting technologies [63]. According to the VAM, it is also an important determinant that positively influences users’ perceived value of technology adoption [42]. The findings show that students’ attitudes were positively and significantly driven by the perceived usefulness of chatbots (H2), revealing a considerable concern among the students about chatbot utility and how it would affect their learning performance and achievement. This finding is in line with most previous studies [31,35,60,61,74,95]. Chatbot technology is characterized by its potential to personalize learning, such as by answering learners’ questions, providing feedback on their progress, and suggesting additional resources or activities [30]. By providing this support, chatbots can assist students in controlling and managing their learning, thus saving time and effort and improving learning efficiency [1]. However, this study’s results show that perceived usefulness had no significant effect on perceived value (H4), contrary to the relationship proposed by the VAM and confirmed by many studies in various technological contexts [62,69,73,76]. Instead, students’ evaluation of the usefulness and benefit of chatbots for their educational performance contributed mainly to their attitudes toward accepting this technology. Therefore, instructors need to focus on the potential benefits of chatbots in the design process. They should establish clear, specific learning outcomes and design the chatbot to support achieving those outcomes; customize the chatbot according to the students’ needs, such as setting the level of difficulty and focusing on specific topics; provide personalized recommendations based on the learner’s progress and preferences; provide real-time feedback on their learning activities; include analytics features to identify areas where learners need assistance and provide tailored directions to students; add interactive options, such as graphs, charts, and illustrations, so that students can easily understand the concepts; and provide options for testing new knowledge and skills that have been learned.

5.3. Perceived Enjoyment, Attitude, and Perceived Value

Although perceived enjoyment had no significant effect on students’ attitudes toward chatbots (H9), the findings show that it has a strong effect on how students perceive the value of chatbots (H5). In the context of AI-based technology, Sohn and Kwon [70] found that users’ acceptance was more strongly influenced by perceived enjoyment than by perceived utility, suggesting that students will believe in the value of chatbots as an important tool to support learning when they are provided with an interesting, delightful learning environment. Several other studies have yielded similar results [31,62,77,79]. By offering a pleasurable, rewarding learning experience, chatbots can motivate students to engage more in learning activities, resulting in better learning outcomes. Making learning with chatbots more enjoyable for students can be achieved by designing customized content that matches their learning styles and preferences; designing an attractive user interface that includes colors, images, icons, and animated graphics; providing gamified learning activities (ranks, badges, and certificates) so that students feel challenged and motivated while learning; and integrating other technologies, such as augmented reality and virtual reality, to improve the learning experience and make it more enjoyable.

5.4. Perceived Risk, Attitude, and Perceived Value

According to the VAM framework, users tend to value and adopt technology that is associated with a low level of risk [62,69]. The current study’s findings show that perceived risk had no significant impact on students’ attitudes (H10) or their perceived value of a chatbot (H6). Chatbot technology is an AI-based tool and raises concerns about data confidentiality, virus transmission, and other security matters [81,87], so the factor of perceived risk, as confirmed by many studies, strongly predicts users’ attitudes and perceived value of using AI-based tools [28,56,59,80]. The insignificance of this relationship in the results of the present study may reflect the students’ lack of experience in dealing with educational chatbots and inadequate awareness of how chatbot technology works. A recent study by Othman [96] in the Saudi context found that students are enthusiastic about using chatbot technology and believe in its usefulness in learning but lack the necessary knowledge to utilize it effectively, which may support our explanation of this result. In Saudi higher education, the integration of chatbot technology is still at a nascent stage [30], so students may be unaware of how their data are stored and managed and unaware of the potential privacy and security risks associated with chatbots. This result confirms the importance of raising Saudi students’ levels of awareness and knowledge regarding privacy and confidentiality issues and how to manage personal data and learning data provided to the chatbot. Accordingly, it is important that university policymakers and instructors ensure that students understand how a chatbot works and how their personal data are processed. Students’ awareness can be raised by offering training courses and workshops on safe, accurate methods of use; providing information on privacy protection and how to secure personal information; and encouraging participation in discussions about the risks of using chatbot technology in learning. This can build trust among learners and increase their acceptance of the technology. In addition, chatbot developers need to understand users’ perceived risks and concerns and address them in the design process [21,59,74].

5.5. Attitude, Perceived Value, and Chatbot Acceptance

The present study’s findings reveal that students’ acceptance of chatbot use in learning is positively and strongly influenced by students’ attitudes (H3) and their perceived value of chatbots (H7). Perceived value was also a strong predictor of students’ attitudes towards chatbot acceptance (H11). This result complements earlier research on chatbot acceptance [22,35,52,56,75]. Furthermore, this study found perceived value to be a significant mediating variable in predicting both students’ attitudes and their acceptance of chatbot use; the more students perceived chatbots as valuable tools (with benefits outweighing risks), the more willing they were to accept and adopt them in learning. Similar results were found in prior studies of various technological applications, showing that perceived value is a powerful predictor of users’ acceptance and adoption [71,73,77,83,84]. This yields a theoretical implication, as it highlights perceived value as a powerful determinant in the context of chatbot technology acceptance. It is important that, in future investigations in the context of education, researchers consider the perceived value component as a mediating variable for adopting AI-based tools such as ChatGPT and learning analytics.
In conclusion, this study confirms the viability of combining the TAM and VAM in AI adoption research, specifically in the context of chatbots, as the study of Kim et al. [69] demonstrated in the context of IoT-based smart services and Liao et al. [62] verified in that of e-learning.

6. Conclusions, Limitations, and Future Work

Utilizing an expanded TAM, this research investigated the potential association between the foundational components of the original TAM and additional constructs associated with VAM, including perceived enjoyment, perceived risks, and perceived value. The aim was to determine their collective influence on fostering a favorable attitude and, in turn, encouraging greater acceptance of chatbot usage in learning contexts. This is one of the first studies to use an integrated TAM and VAM model to determine chatbot acceptance in higher education. By integrating the TAM and VAM, educators and chatbot developers can better tailor their offerings to meet students’ internal and external needs, thereby enhancing the effectiveness and reach of chatbot technology adoption. This study concludes that attitude and perceived value are equivalent in their strong influence on students’ technological acceptance of chatbot technology. Perceived enjoyment and perceived ease of use are the two factors strongly affecting students’ perceived value. The students’ attitudes toward chatbot use were significantly influenced by perceived usefulness and perceived value. The results of this study support the integration of the TAM and VAM models, as confirmed by Kim et al. [69] and Liao et al. [62], to determine students’ acceptance of chatbot technology. Practically, university instructors in Saudi Arabia may foster chatbot acceptance among students by reflecting on these results in their future design and application of chatbots.
A number of limitations were detected in this study. First, in terms of sample selection, this study used a convenience sampling approach, with all the participants recruited from three universities in the Eastern Province (the author’s region). Therefore, the results may not generalize to all university students across all provinces in Saudi Arabia. Hence, future studies should recruit a more diverse student population, including more universities across various provinces in Saudi Arabia. Second, the data were gathered via an electronic questionnaire that was sent to all participants via university e-mails and social media platforms. Thus, a percentage of respondents may not have had an adequate understanding of chatbot technology or may not have used it before, which could potentially affect the interpretation of this study’s findings. Future research may consider establishing an inclusion criterion to intentionally select a sample of students experienced in using chatbots in learning. Third, regarding the theoretical framework, this study relied on integrating the constructs of the original TAM (perceived ease of use, perceived usefulness, and attitude) with those of the VAM (perceived benefits and sacrifices). The inclusion of other specific factors may contribute to the acceptance of chatbots among university students, so future work could integrate the AI-literacy factor [97,98] into the research model and examine its mediating role in predicting chatbot acceptance. In addition, the research model included only functional factors (perceived ease of use and usefulness) and emotional factors (perceived enjoyment and perceived risk) in predicting the value perspective and attitude toward the technological acceptability of chatbots. Future studies are advised to enhance this model by incorporating environmental factors, such as the level of institutional support or training in the use of chatbots. Additionally, subsequent research may examine the moderating role of gender, age, and college status, as students of different genders, ages, and even academic majors have different perspectives on new technology acceptance and adoption [99]. Despite these limitations, this study offers useful and important implications supporting the theoretical and practical applications of chatbot technology in higher education.

Funding

This research is financially supported by the Deanship of Scientific Research, Vice Presidency for Graduate Studies and Scientific Research, King Faisal University, Saudi Arabia (GRANT 3,955).

Institutional Review Board Statement

The study was conducted according to the guidelines of the Research Ethics Committee (REC) of King Faisal University (approval code KFU-REC-2022-JAN-ETHICS462).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data presented in this study are available on request from the corresponding author. The data are not publicly available due to privacy or ethical restrictions.

Conflicts of Interest

The author declares no conflict of interest.

Appendix A. Survey Questionnaire

Perceived Ease of Use
1. Learning how to use a chatbot is easy for me.
2. My interaction with the chatbot is clear and simple.
3. I find chatbots easy to use for my learning.
4. It is easy for me to become skilled in using chatbots.

Perceived Usefulness
1. I find chatbots useful for performing my learning tasks.
2. Using a chatbot increases my chances of achieving high performance.
3. Using a chatbot helps me accomplish my learning tasks effortlessly.
4. Using chatbots increases my productivity.

Perceived Enjoyment
1. I have fun interacting with chatbots.
2. Using chatbots provides me with a lot of enjoyment.
3. I enjoy using chatbots for learning.

Perceived Risk
1. I feel unsafe when using a chatbot.
2. I am worried that personal information would be leaked when using a chatbot.
3. I am worried about personal information suffering from unauthorized use when using a chatbot.

Attitude
1. Using chatbots makes learning more interesting.
2. Using chatbots has a positive influence on my learning.
3. I think learning with a chatbot is valuable.
4. I think it is a trend to use chatbots in learning.

Perceived Value
1. I believe that using a chatbot is a valuable idea.
2. Chatbot is advantageous to me due to the general amount of effort I need to put in.
3. Chatbot is worthwhile for me based on the amount of time I need to spend.
4. Chatbots provide me with good value in general.

Chatbot Acceptance
1. I look forward to using chatbots in my learning.
2. I intend to use chatbots in my future learning.
3. I plan to use chatbots in my future learning.
4. I think using chatbots will increase my future learning.
5. I support the adoption of chatbots in higher education.

References

1. Chang, C.Y.; Hwang, G.J.; Gau, M.L. Promoting Students’ Learning Achievement and Self-Efficacy: A Mobile Chatbot Approach for Nursing Training. Br. J. Educ. Technol. 2022, 53, 171–188.
2. Pérez, J.Q.; Daradoumis, T.; Puig, J.M.M. Rediscovering the Use of Chatbots in Education: A Systematic Literature Review. Comput. Appl. Eng. Educ. 2020, 28, 1549–1565.
3. Fryer, L.K.; Nakao, K.; Thompson, A. Chatbot Learning Partners: Connecting Learning Experiences, Interest and Competence. Comput. Hum. Behav. 2019, 93, 279–289.
4. Research and Markets. Global Chatbot Market Size, Share & Trends Analysis Report by End Use (Large Enterprises, Medium Enterprises), by Application, by Type, by Product Landscape, by Vertical, by Region, and Segment Forecasts, 2022–2030. Available online: https://www.researchandmarkets.com/reports/4396458/global-chatbot-market-size-share-and-trends#sp-pos-1 (accessed on 17 July 2023).
5. Cunningham-Nelson, S.; Boles, W.; Trouton, L.; Margerison, E. A Review of Chatbots in Education: Practical Steps Forward. In Proceedings of the 30th Annual Conference for the Australasian Association for Engineering Education (AAEE 2019): Educators Becoming Agents of Change: Innovate, Integrate, Motivate, Brisbane, Australia, 8–11 December 2019.
6. Troussas, C.; Krouska, A.; Alepis, E.; Virvou, M. Intelligent and Adaptive Tutoring Through a Social Network for Higher Education. New Rev. Hypermedia Multimed. 2020, 26, 138–167.
7. Wollny, S.; Schneider, J.; Di Mitri, D.; Weidlich, J.; Rittberger, M.; Drachsler, H. Are We There Yet?—A Systematic Literature Review on Chatbots in Education. Front. Artif. Intell. 2021, 4, 654924.
8. Clarizia, F.; Colace, F.; Lombardi, M.; Pascale, F.; Santaniello, D. Chatbot: An Education Support System for Student. In International Symposium on Cyberspace Safety and Security; Springer: Berlin/Heidelberg, Germany, 2018; pp. 291–302.
9. Colace, F.; De Santo, M.; Lombardi, M.; Pascale, F.; Pietrosanto, A.; Lemma, S. Chatbot for E-Learning: A Case of Study. Int. J. Mech. Eng. Robot. Res. 2018, 7, 528–533.
10. Bezverhny, E.; Dadteev, K.; Barykin, L.; Nemeshaev, S.; Klimov, V. Use of Chat Bots in Learning Management Systems. Procedia Comput. Sci. 2020, 169, 652–655.
11. Haristiani, N.; Rifai, M.M. Chatbot-Based Application Development and Implementation as an Autonomous Language Learning Medium. Indones. J. Sci. Technol. 2021, 6, 561–576.
12. Aleedy, M.; Atwell, E.; Meshoul, S. Using AI Chatbots in Education: Recent Advances Challenges and Use Case. In Artificial Intelligence and Sustainable Computing. Algorithms for Intelligent Systems; Pandit, M., Gaur, M.K., Rana, P.S., Tiwari, A., Eds.; Springer: Singapore, 2022.
13. Sandu, N.; Gide, E. Adoption of AI-Chatbots to Enhance Student Learning Experience in Higher Education in India. In Proceedings of the 2019 18th International Conference on Information Technology Based Higher Education and Training (ITHET), Magdeburg, Germany, 26–27 September 2019; pp. 1–5.
14. Winkler, R.; Söllner, M. Unleashing the Potential of Chatbots in Education: A State-of-the-Art Analysis. Acad. Manag. Annu. Meet. 2018.
15. Hwang, G.J.; Chang, C.Y. A Review of Opportunities and Challenges of Chatbots in Education. Interact. Learn. Environ. 2021, 31, 4099–4112.
16. Malik, R.; Shrama, A.; Trivedi, S.; Mishra, R. Adoption of Chatbots for Learning among University Students: Role of Perceived Convenience and Enhanced Performance. Int. J. Emerg. Technol. Learn. (IJET) 2021, 16, 200–211.
17. Chen, Y.; Jensen, S.; Albert, L.J.; Gupta, S.; Lee, T. Artificial Intelligence (AI) Student Assistants in the Classroom: Designing Chatbots to Support Student Success. Inf. Syst. Front. 2023, 25, 161–182.
  18. Hammad, R.; Bahja, M. Opportunities and Challenges in Educational Chatbots. In Trends, Applications, and Challenges of Chatbot Technology; IGI Global: Hershey, PA, USA, 2023; pp. 119–136. [Google Scholar] [CrossRef]
  19. Yang, S.; Evans, C. Opportunities and Challenges in Using AI Chatbots in Higher Education. In Proceedings of the 2019 3rd International Conference on Education and E-Learning, Barcelona, Spain, 5–7 November 2019; pp. 79–83. [Google Scholar]
  20. Hasal, M.; Nowaková, J.; Ahmed Saghair, K.; Abdulla, H.; Snášel, V.; Ogiela, L. Chatbots: Security, Privacy, Data Protection, and Social Aspects. Concurr. Comput. Pract. Exp. 2021, 33, e6426. [Google Scholar] [CrossRef]
  21. Wu, W.; Zhang, B.; Li, S.; Liu, H. Exploring Factors of the Willingness to Accept AI-Assisted Learning Environments: An Empirical Investigation Based on the Utaut Model and Perceived Risk Theory. Front. Psychol. 2022, 13, 870777. [Google Scholar] [CrossRef]
  22. Okonkwo, C.W.; Ade-Ibijola, A. Chatbots Applications in Education: A Systematic Review. Comput. Educ. Artif. Intell. 2021, 2, 100033. [Google Scholar] [CrossRef]
  23. Fidan, M.; Gencel, N. Supporting the Instructional Videos with Chatbot and Peer Feedback Mechanisms in Online Learning: The Effects on Learning Performance and Intrinsic Motivation. J. Educ. Comput. Res. 2022, 60, 1716–1741. [Google Scholar] [CrossRef]
  24. Kuhail, M.A.; Alturki, N.; Alramlawi, S.; Alhejori, K. Interacting with Educational Chatbots: A Systematic Review. Educ. Inf. Technol. 2023, 28, 973–1018. [Google Scholar] [CrossRef]
  25. Pereira, J.; Fernández-Raga, M.; Osuna-Acedo, S.; Roura-Redondo, M.; Almazán-López, O.; Buldón-Olalla, A. Promoting Learners’ Voice Productions Using Chatbots as a Tool for Improving the Learning Process in a MOOC. Technol. Knowl. Learn. 2019, 24, 545–565. [Google Scholar] [CrossRef]
  26. Kazoun, N.; Kokkinaki, A.; Chedrawi, C. Factors That Affect the Use of AI Agents in Adaptive Learning: A Sociomaterial and McDonaldization Approach in the Higher Education Sector. In Information Systems, Proceedings of the 18th European, Mediterranean, and Middle Eastern Conference, EMCIS 2021, Virtual Event, 8–9 December 2021; Springer International Publishing: Berlin/Heidelberg, Germany, 2022; pp. 414–426. [Google Scholar]
  27. Yin, J.; Goh, T.T.; Yang, B.; Xiaobin, Y. Conversation Technology with Micro-Learning: The Impact of Chatbot-Based Learning on Students’ Learning Motivation and Performance. J. Educ. Comput. Res. 2021, 59, 154–177. [Google Scholar] [CrossRef]
  28. Mohd Rahim, N.I.; Iahad, N.A.; Yusof, A.F.; Al-Sharafi, M.A. AI-Based Chatbots Adoption Model for Higher-Education Institutions: A Hybrid PLS-SEM-Neural Network Modelling Approach. Sustainability 2022, 14, 12726. [Google Scholar] [CrossRef]
  29. Essel, H.B.; Vlachopoulos, D.; Tachie-Menson, A.; Johnson, E.E.; Baah, P.K. The Impact of a Virtual Teaching Assistant (Chatbot) on Students’ Learning in Ghanaian Higher Education. Int. J. Educ. Technol. High. Educ. 2022, 19, 57. [Google Scholar] [CrossRef]
  30. Al-Abdullatif, A.M.; Al-Dokhny, A.A.; Drwish, A.M. Implementing the Bashayer Chatbot in Saudi Higher Education: Measuring the Influence on Students’ Motivation and Learning Strategies. Front. Psychol. 2023, 14, 1129070. [Google Scholar] [CrossRef]
  31. Chen, H.L.; Vicki Widarso, G.; Sutrisno, H. A Chatbot for Learning Chinese: Learning Achievement and Technology Acceptance. J. Educ. Comput. Res. 2020, 58, 1161–1189. [Google Scholar] [CrossRef]
  32. Troussas, C.; Krouska, A.; Virvou, M. Integrating an Adjusted Conversational Agent into a Mobile-Assisted Language Learning Application. In Proceedings of the 2017 IEEE 29th International Conference on Tools with Artificial Intelligence (ICTAI), Boston, MA, USA, 6–8 November 2017; pp. 1153–1157. [Google Scholar] [CrossRef]
  33. Troussas, C.; Krouska, A.; Virvou, M. MACE: Mobile Artificial Conversational Entity for Adapting Domain Knowledge and Generating Personalized Advice. Int. J. Artif. Intell. Tools 2019, 28, 1–16. [Google Scholar] [CrossRef]
  34. Lee, L.K.; Fung, Y.C.; Pun, Y.W.; Wong, K.K.; Yu, M.T.Y.; Wu, N.I. Using a Multiplatform Chatbot as an Online Tutor in a University Course. In Proceedings of the 2020 International Symposium on Educational Technology (ISET), Bangkok, Thailand, 24–27 August 2020; pp. 53–56. [Google Scholar]
  35. Al Darayseh, A. Acceptance of Artificial Intelligence in Teaching Science: Science Teachers’ Perspective. Comput. Educ. Artif. Intell. 2023, 4, 100132. [Google Scholar] [CrossRef]
  36. Chocarro, R.; Cortiñas, M.; Marcos-Matás, G. Teachers’ Attitudes Towards Chatbots in Education: A Technology Acceptance Model Approach Considering the Effect of Social Language, Bot Proactiveness, and Users’ Characteristics. Educ. Stud. 2021, 49, 295–313. [Google Scholar] [CrossRef]
  37. Chuah, K.M.; Kabilan, M. Teachers’ Views on the Use of Chatbots to Support English Language Teaching in a Mobile Environment. Int. J. Emerg. Technol. Learn. (IJET) 2021, 16, 223–237. [Google Scholar] [CrossRef]
  38. Merelo, J.J.; Castillo, P.A.; Mora, A.M.; Barranco, F.; Abbas, N.; Guillén, A.; Tsivitanidou, O. Chatbots and Messaging Platforms in the Classroom: An Analysis from the Teacher’s Perspective. Educ. Inf. Technol. 2023, 1–36. [Google Scholar] [CrossRef]
  39. Nikou, S.A.; Chang, M. Learning by Building Chatbot: A System Usability Study and Teachers’ Views About the Educational Uses of Chatbots. In International Conference on Intelligent Tutoring Systems, Proceedings of the 19th International Conference, ITS 2023, Corfu, Greece, 2–5 June 2023; Springer Nature Switzerland: Cham, Switzerland, 2023; pp. 342–351. [Google Scholar]
  40. Yang, T.C.; Chen, J.H. Pre-Service Teachers’ Perceptions and Intentions Regarding the Use of Chatbots Through Statistical and Lag Sequential Analysis. Comput. Educ. Artif. Intell. 2023, 4, 100119. [Google Scholar] [CrossRef]
  41. Davis, F. A Technology Acceptance Model for Empirically Testing New End-User Information Systems. Ph.D. Thesis, Massachusetts Institute of Technology, Cambridge, MA, USA, 1985. [Google Scholar]
  42. Kim, H.W.; Chan, H.C.; Gupta, S. Value-Based Adoption of Mobile Internet: An Empirical Investigation. Decis. Support Syst. 2007, 43, 111–126. [Google Scholar] [CrossRef]
  43. Rejón-Guardia, F.; Vich-I-Martorell, G.A. Design and Acceptance of Chatbots for Information Automation in University Classrooms. In EDULEARN20 Proceedings, Proceedings of the 12th International Conference on Education and New Learning Technologies, Online Conference, 6–7 July 2020; IATED: Valencia, Spain, 2020; pp. 2452–2462. [Google Scholar]
  44. Bahja, M.; Hammad, R.; Hassouna, M. Talk2Learn: A Framework for Chatbot Learning. In Transforming Learning with Meaningful Technologies; Scheffel, M., Broisin, J., Pammer-Schindler, V., Ioannou, A., Schneider, J., Eds.; Springer: Cham, Switzerland, 2019. [Google Scholar] [CrossRef]
  45. Pérez-Marín, D. A Review of the Practical Applications of Pedagogic Conversational Agents to be Used in School and University Classrooms. Digital 2021, 1, 18–33. [Google Scholar] [CrossRef]
  46. Sriwisathiyakun, K.; Dhamanitayakul, C. Enhancing Digital Literacy with an Intelligent Conversational Agent for Senior Citizens in Thailand. Educ. Inf. Technol. 2022, 27, 6251–6271. [Google Scholar] [CrossRef] [PubMed]
  47. Sjöström, J.; Dahlin, M. Tutorbot: A Chatbot for Higher Education Practice. In Designing for Digital Transformation. Co-Creating Services with Citizens and Industry, Proceedings of the 15th International Conference on Design Science Research in Information Systems and Technology, DESRIST 2020, Kristiansand, Norway, 2–4 December 2020; Springer International Publishing: Berlin/Heidelberg, Germany, 2020; pp. 93–98. [Google Scholar]
  48. Cabrera, N.; Fernández-Ferrer, M.; Maina, M.; Guàrdia, L. Peer Assessment in Online Learning: Promoting Self-Regulation Strategies Through the Use of Chatbots in Higher education. Envisioning Rep. 2022, 49, 49–51. [Google Scholar]
  49. Calle, M.; Narváez, E.; Maldonado-Mahauad, J. Proposal for the Design and Implementation of Miranda: A Chatbot-Type Recommender for Supporting Self-Regulated Learning in Online Environments. LALA 2021, 21, 19–21. [Google Scholar]
  50. Park, S.; Choi, J.; Lee, S.; Oh, C.; Kim, C.; La, S.; Lee, J.; Suh, B. Designing a Chatbot for a Brief Motivational Interview on Stress Management: Qualitative Case Study. J. Med. Internet Res. 2019, 21, e12231. [Google Scholar] [CrossRef]
  51. Mai, N.E.O. The Merlin Project: Malaysian Students’ Acceptance of an AI Chatbot in Their Learning Process. Turk. Online J. Distance Educ. 2022, 23, 31–48. [Google Scholar] [CrossRef]
  52. Belda-Medina, J.; Calvo-Ferrer, J.R. Using Chatbots as AI Conversational Partners in Language Learning. Appl. Sci. 2022, 12, 8427. [Google Scholar] [CrossRef]
  53. Huang, W.; Hew, K.F.; Fryer, L.K. Chatbots for Language Learning—Are They Really Useful? A Systematic Review of Chatbot-Supported Language Learning. J. Comput. Assist. Learn. 2021, 38, 237–257. [Google Scholar] [CrossRef]
  54. Pillai, R.; Sivathanu, B.; Metri, B.; Kaushik, N. Students’ Adoption of AI-Based Teacher-Bots (T-Bots) for Learning in Higher Education. Inf. Technol. People 2023, 1–25. [Google Scholar] [CrossRef]
  55. Kumar; Silva, P.A. Work-in-Progress: A Preliminary Study on Students’ Acceptance of Chatbots for Studio-Based Learning. In Proceedings of the 2020 IEEE Global Engineering Education Conference (EDUCON), Porto, Portugal, 27–30 April 2020; pp. 1627–1631. [Google Scholar] [CrossRef]
  56. Chatterjee, S.; Bhattacharjee, K.K. Adoption of Artificial Intelligence in Higher Education: A Quantitative Analysis Using Structural Equation Modelling. Educ. Inf. Technol. 2020, 25, 3443–3463. [Google Scholar] [CrossRef]
  57. Slepankova, M. Possibilities of Artificial Intelligence in Education: An Assessment of the Role of AI Chatbots as a Communication Medium in Higher Education. Master’s Thesis, Linnaeus University, Växjö, Sweden, 2021. Available online: https://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-108427 (accessed on 17 September 2023).
  58. Mokmin, N.A.M.; Ibrahim, N.A. The Evaluation of Chatbot as a Tool for Health Literacy Education among Undergraduate Students. Educ. Inf. Technol. 2021, 2, 6033–6049. [Google Scholar] [CrossRef] [PubMed]
  59. Keong, W.E.Y. Factors Influencing Adoption Intention Towards Chatbots as a Learning Tool. In The International Conference in Education (ICE), Proceedings; Faculty of Social Sciences and Humanities, UTM: Johor Bahru, Malaysia, 2022; pp. 96–99. [Google Scholar]
  60. Almahri, F.A.J.; Bell, D.; Merhi, M. Understanding Student Acceptance and Use of Chatbots in the United Kingdom Universities: A Structural Equation Modelling Approach. In Proceedings of the 2020 6th International Conference on Information Management (ICIM), London, UK, 27–29 March 2020; pp. 284–288. [Google Scholar] [CrossRef]
  61. Ragheb, M.A.; Tantawi, P.; Farouk, N.; Hatata, A. Investigating the Acceptance of Applying Chat-Bot (Artificial Intelligence) Technology among Higher Education Students in Egypt. Int. J. High. Educ. Manag. 2022, 8, 1–13. [Google Scholar] [CrossRef]
  62. Liao, Y.-K.; Wu, W.-Y.; Le, T.Q.; Phung, T.T.T. The Integration of the Technology Acceptance Model and Value-Based Adoption Model to Study the Adoption of E-Learning: The Moderating Role of e-WOM. Sustainability 2022, 14, 815. [Google Scholar] [CrossRef]
  63. Davis, F.D. Perceived Usefulness, Perceived Ease of Use, and User Acceptance of Information Technology. MIS Q. 1989, 13, 319–340. [Google Scholar] [CrossRef]
  64. Davis, F.D.; Bagozzi, R.P.; Warshaw, P.R. User Acceptance of Computer Technology: A Comparison of Two Theoretical Models. Manag. Sci. 1989, 35, 982–1003. [Google Scholar] [CrossRef]
  65. Park, S.Y. An Analysis of the Technology Acceptance Model in Understanding University Students’ Behavioral Intention to Use E-Learning. J. Educ. Technol. Soc. 2009, 12, 150–162. [Google Scholar]
  66. King, W.R.; He, J. A Meta-Analysis of the Technology Acceptance Model. Inf. Manag. 2006, 43, 740–755. [Google Scholar] [CrossRef]
  67. Zeithaml, V.A. Consumer Perceptions of Price, Quality, and Value: A Means-End Model and Synthesis of Evidence. J. Mark. 1988, 52, 2–22. [Google Scholar] [CrossRef]
  68. Kim, J.; Kim, J. An Integrated Analysis of Value-Based Adoption Model and Information Systems Success Model for Prop Tech Service Platform. Sustainability 2021, 13, 12974. [Google Scholar] [CrossRef]
  69. Kim, Y.; Park, Y.; Choi, J. A Study on the Adoption of IoT Smart Home Service: Using Value-Based Adoption Model. Total Qual. Manag. Bus. Excell. 2017, 28, 1149–1165. [Google Scholar] [CrossRef]
  70. Sohn, K.; Kwon, O. Technology Acceptance Theories and Factors Influencing Artificial Intelligence–Based Intelligent Products. Telemat. Inform. 2020, 47, 101324. [Google Scholar] [CrossRef]
  71. Hsiao, K.L.; Chen, C.C. Value-Based Adoption of E-Book Subscription Services: The Roles of Environmental Concerns and Reading Habits. Telemat. Inform. 2017, 34, 434–448. [Google Scholar] [CrossRef]
  72. Kim, S.H.; Bae, J.H.; Jeon, H.M. Continuous Intention on Accommodation Apps: Integrated Value-Based Adoption and Expectation-Confirmation Model Analysis. Sustainability 2019, 11, 1578. [Google Scholar] [CrossRef]
  73. Liang, T.P.; Lin, Y.L.; Hou, H.C. What Drives Consumers to Adopt a Sharing Platform: An Integrated Model of Value-Based and Transaction Cost Theories. Inf. Manag. 2021, 58, 103471. [Google Scholar] [CrossRef]
  74. Aslam, W.; Ahmed Siddiqui, D.; Arif, I.; Farhat, K. Chatbots in the Frontline: Drivers of Acceptance. Kybernetes 2022. ahead-of-print. [Google Scholar] [CrossRef]
  75. Kelly, S.; Kaye, S.A.; Oviedo-Trespalacios, O. What Factors Contribute to Acceptance of Artificial Intelligence? A Systematic Review. Telemat. Inform. 2022, 77, 101925. [Google Scholar] [CrossRef]
  76. Yu, J.; Lee, H.; Ha, I.; Zo, H. User Acceptance of Media Tablets: An Empirical Examination of Perceived Value. Telemat. Inform. 2017, 34, 206–223. [Google Scholar] [CrossRef]
  77. Lau, C.K.H.; Chui, C.F.R.; Au, N. Examination of the Adoption of Augmented Reality: A VAM Approach. Asia Pac. J. Tour. Res. 2019, 24, 1005–1020. [Google Scholar] [CrossRef]
  78. Teo, T. Factors Influencing Teachers’ Intention to Use Technology: Model Development and Test. Comput. Educ. 2011, 57, 2432–2440. [Google Scholar] [CrossRef]
  79. Yang, H.; Yu, J.; Zo, H.; Choi, M. User Acceptance of Wearable Devices: An Extended Perspective of Perceived Value. Telemat. Inform. 2016, 33, 256–269. [Google Scholar] [CrossRef]
  80. Rapp, A.; Curti, L.; Boldi, A. The Human Side of Human-Chatbot Interaction: A Systematic Literature Review of Ten Years of Research on Text-Based Chatbots. Int. J. Hum.-Comput. Stud. 2021, 151, 102630. [Google Scholar] [CrossRef]
  81. Marjerison, R.K.; Zhang, Y.; Zheng, H. AI in E-Commerce: Application of the Use and Gratification Model to the Acceptance of Chatbots. Sustainability 2022, 14, 14270. [Google Scholar] [CrossRef]
  82. Fatima, T.; Kashif, S.; Kamran, M.; Awan, T.M. Examining Factors Influencing Adoption of M-Payment: Extending UTAUT2 with Perceived Value. Int. J. Innov. Creat. Chang. 2021, 15, 276–299. [Google Scholar]
  83. Huang, W.; Hew, K.F.; Gonda, D.E. Designing and Evaluating Three Chatbot-Enhanced Activities for a Flipped Graduate Course. Int. J. Mech. Eng. Robot. Res. 2019, 8, 813–818. [Google Scholar] [CrossRef]
  84. Sacchetti, F.D.; Dohan, M.; Wu, S. Factors Influencing the Clinician’s Intention to Use AI Systems in Healthcare: A Value-Based Approach. In AMCIS 2022 Proceedings; Americas Conference on Information Systems (AMCIS): Minneapolis, MN, USA, 2022; p. 17. Available online: https://aisel.aisnet.org/amcis2022/sig_health/sig_health/17 (accessed on 26 August 2023).
  85. Zarouali, B.; Van den Broeck, E.; Walrave, M.; Poels, K. Predicting Consumer Responses to a Chatbot on Facebook. Cyberpsychol. Behav. Soc. Netw. 2018, 21, 491–497. [Google Scholar] [CrossRef] [PubMed]
  86. De Cicco, R.; Iacobucci, S.; Aquino, A.; Romana Alparone, F.; Palumbo, R. Understanding Users’ Acceptance of Chatbots: An Extended TAM Approach. In Chatbot Research and Design, Proceedings of the 5th International Workshop, CONVERSATIONS 2021, Virtual Event, 23–24 November 2021, Revised Selected Papers; Springer International Publishing: Berlin/Heidelberg, Germany, 2022; pp. 3–22. [Google Scholar]
  87. Zhang, R.; Zhao, W.; Wang, Y. Big data analytics for intelligent online education. J. Intell. Fuzzy Syst. 2021, 40, 2815–2825. [Google Scholar] [CrossRef]
  88. Völkel, S.T.; Haeuslschmid, R.; Werner, A.; Hussmann, H.; Butz, A. How to Trick AI: Users’ Strategies for Protecting Themselves from Automatic Personality Assessment. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, Honolulu, HI, USA, 25–30 April 2020; pp. 1–15. [Google Scholar]
  89. Turel, O.; Serenko, A.; Bontis, N. User Acceptance of Hedonic Digital Artifacts: A Theory of Consumption Values Perspective. Inf. Manag. 2010, 47, 53–59. [Google Scholar] [CrossRef]
  90. Ashfaq, M.; Yun, J.; Yu, S. My Smart Speaker is Cool! Perceived Coolness, Perceived Values, and Users’ Attitude toward Smart Speakers. Int. J. Hum.-Comput. Interact. 2021, 37, 560–573. [Google Scholar] [CrossRef]
  91. Hill, R. What Sample Size is “Enough” in Internet Survey Research? Interpers. Comput. Technol. Electron. J. 21st Century 1998, 6, 1–12. [Google Scholar]
  92. Hair, J.F.; Risher, J.J.; Sarstedt, M.; Ringle, C.M. When to Use and How to Report the Results of PLS-SEM. Eur. Bus. Rev. 2019, 31, 2–24. [Google Scholar] [CrossRef]
  93. Hair, J.F., Jr.; Sarstedt, M.; Ringle, C.M.; Gudergan, S.P. Advanced Issues in Partial Least Squares Structural Equation Modeling (PLS-SEM), 2nd ed.; Sage Publications: New York, NY, USA, 2017. [Google Scholar]
  94. Henseler, J.; Ringle, C.M.; Sarstedt, M. A New Criterion for Assessing Discriminant Validity in Variance-Based Structural Equation Modeling. J. Acad. Mark. Sci. 2015, 43, 115–135. [Google Scholar] [CrossRef]
  95. Liu, Q.; Huang, J.; Wu, L.; Zhu, K.; Ba, S. CBET: Design and Evaluation of a Domain-Specific Chatbot for Mobile Learning. Univers. Access Inf. Soc. 2020, 19, 655–673. [Google Scholar] [CrossRef]
  96. Othman, K. Towards Implementing AI Mobile Application Chatbots for EFL Learners at Primary Schools in Saudi Arabia. J. Namib. Stud. Hist. Politics Cult. 2023, 33, 271–287. [Google Scholar] [CrossRef]
  97. Laupichler, M.C.; Aster, A.; Schirch, J.; Raupach, T. Artificial Intelligence Literacy in Higher and Adult Education: A Scoping Literature Review. Comput. Educ. Artif. Intell. 2022, 3, 100101. [Google Scholar] [CrossRef]
  98. Wang, B.; Rau, P.L.P.; Yuan, T. Measuring User Competence in Using Artificial Intelligence: Validity and Reliability of Artificial Intelligence Literacy Scale. Behav. Inf. Technol. 2022, 42, 1324–1337. [Google Scholar] [CrossRef]
  99. Venkatesh, V.; Thong, J.Y.; Xu, X. Consumer Acceptance and Use of Information Technology: Extending the Unified Theory of Acceptance and Use of Technology. MIS Q. 2012, 36, 157–178. [Google Scholar] [CrossRef]
Figure 1. The present study’s research model.
Figure 2. Standardized path coefficient results (* significant at p-value < 0.001).
Table 1. Previous research on students’ acceptance of chatbots in learning.
Source | Aim | Theoretical Model | Highlights
1. [13] | Exploring the factors affecting higher education students’ adoption of chatbots in India. | Not clearly stated | Personalized learning experiences and timely assistance resulted in students’ increased willingness to adopt chatbots.
2. [55] | Investigating students’ acceptance of chatbot use in studio-based learning in a Malaysian university. | Extended TAM | Accessibility, perceived ease of use, prompt feedback, human-like interaction, and privacy positively influence intention to use.
3. [31] | Examining chatbots’ impact on learning Chinese vocabulary and measuring students’ acceptance of chatbot technology. | TAM | Positive learning outcomes; perceived usefulness emerged as a powerful indicator of use intention, whereas perceived ease of use did not.
4. [60] | Exploring the factors that influence chatbot acceptance among university students. | UTAUT2 | Effort expectancy, performance expectancy, and habit positively impact students’ intentions to adopt chatbots.
5. [28] | Identifying the factors that affect chatbot adoption in higher education. | UTAUT2 | Habit, perceived trust, and performance expectancy influence the use intentions of chatbots. Interactivity, design, and ethics influence students’ perceived trust.
6. [16] | Examining the drivers of students’ adoption of chatbots in higher education in India. | Extended TAM | Students’ adoption of chatbot technology is positively impacted by ease of use, usefulness, attitude, perceived convenience, and enhanced performance.
7. [58] | Investigating undergraduates’ technological acceptance of chatbots as well as their impact on students’ health literacy. | UTAUT2 | Positive impact on students’ health literacy. Seventy percent of the participants responded positively in terms of self-efficacy, effort expectancy, attitude, performance expectancy, and behavioral intention.
8. [57] | Examining the acceptability of chatbot use among university students in Europe. | UTAUT2 | Effort expectancy, nonjudgmental expectancy, and performance expectancy significantly predict intention to use.
9. [51] | Assessing university students’ acceptance of adopting chatbot technology in online courses in Malaysia. | TAM | Students demonstrated a high level of readiness to accept and use chatbot technology in their online courses.
10. [61] | Measuring the determinants that impact chatbot acceptance among students in Egypt. | UTAUT | Social influence, effort expectancy, and performance expectancy positively affect students’ acceptance of adopting chatbots in learning.
11. [43] | Evaluating students’ acceptance and satisfaction with using chatbot technology. | UTAUT2 | Chatbots effectively improved students’ learning and overall performance.
12. [52] | Measuring the technological acceptance of chatbot integration in language learning. | Extended TAM | Students positively rated their chatbot experience, particularly regarding perceived usefulness, attitude, perceived ease of use, and self-efficacy.
13. [59] | Investigating what makes students more likely to adopt chatbots as e-learning tools. | Extended TAM | Students’ acceptance of chatbots was significantly influenced by perceived usefulness, perceived trust, perceived risk, perceived enjoyment, and attitude.
14. [54] | Investigating students’ actual use and intention of chatbots in learning. | Extended TAM | Personalization, perceived intelligence, anthropomorphism, perceived trust, perceived ease of use, interactivity, and perceived usefulness determine the intention to adopt.
Table 2. Sample profile (N = 432).
Characteristic | Category | n | %
Gender | Male | 120 | 27.8
Gender | Female | 312 | 72.2
Age | ≤18 | 14 | 3.2
Age | 19–20 | 194 | 44.9
Age | 21–22 | 178 | 41.2
Age | 23–24 | 28 | 6.5
Age | ≥25 | 18 | 4.2
Educational level | Undergraduate | 386 | 89.4
Educational level | Graduate | 46 | 10.6
Academic major | Health sciences | 50 | 11.6
Academic major | Humanities | 144 | 33.3
Academic major | Social sciences | 103 | 23.8
Academic major | Pure sciences | 92 | 21.3
Academic major | Computer science and information technology | 43 | 10.0
Table 3. The analysis of the measurement model.
Construct | Indicator Loadings | α | CR | AVE | R² | R² Adjusted | Q²
Perceived Ease of Use | In 1 = 0.85, In 2 = 0.90, In 3 = 0.88, In 4 = 0.80 | 0.88 | 0.92 | 0.73 | – | – | –
Perceived Usefulness | In 1 = 0.85, In 2 = 0.87, In 3 = 0.79, In 4 = 0.84 | 0.86 | 0.90 | 0.70 | – | – | –
Perceived Enjoyment | In 1 = 0.89, In 2 = 0.94, In 3 = 0.89 | 0.89 | 0.93 | 0.83 | – | – | –
Perceived Risk | In 1 = 0.95, In 2 = 0.84, In 3 = 0.86 | 0.88 | 0.92 | 0.79 | – | – | –
Attitude | In 1 = 0.81, In 2 = 0.84, In 3 = 0.86, In 4 = 0.78 | 0.84 | 0.89 | 0.68 | 0.58 | 0.57 | 0.49
Perceived Value | In 1 = 0.83, In 2 = 0.90, In 3 = 0.89, In 4 = 0.84 | 0.89 | 0.92 | 0.75 | 0.53 | 0.52 | 0.51
Chatbot Acceptance | In 1 = 0.78, In 2 = 0.88, In 3 = 0.87, In 4 = 0.84, In 5 = 0.84 | 0.90 | 0.92 | 0.71 | 0.57 | 0.56 | 0.44
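For readers who want to verify the convergent validity figures in Table 3, composite reliability (CR) and average variance extracted (AVE) can be recomputed directly from the standardized indicator loadings with the standard formulas CR = (Σλ)² / ((Σλ)² + Σ(1 − λ²)) and AVE = Σλ² / k, where k is the number of indicators. The Python sketch below is illustrative only and was not part of the original PLS-SEM analysis; Cronbach’s α cannot be recovered this way because it also requires the raw item covariances.

```python
def composite_reliability(loadings):
    """CR = (sum of loadings)^2 / ((sum of loadings)^2 + sum of indicator error variances)."""
    total = sum(loadings)
    error = sum(1 - l ** 2 for l in loadings)
    return total ** 2 / (total ** 2 + error)

def average_variance_extracted(loadings):
    """AVE = mean of the squared standardized loadings."""
    return sum(l ** 2 for l in loadings) / len(loadings)

# Standardized loadings for Perceived Enjoyment, taken from Table 3.
pe_loadings = [0.89, 0.94, 0.89]
print(round(composite_reliability(pe_loadings), 2))       # ~0.93, matching the table
print(round(average_variance_extracted(pe_loadings), 2))  # ~0.82, close to the reported 0.83
```

The same check can be applied to the other constructs; small discrepancies reflect the rounding of the printed loadings.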
Table 4. Discriminant validity analysis and correlation matrix.
Constructs | 1 | 2 | 3 | 4 | 5 | 6 | 7
1. Perceived Ease of Use | 0.86 | | | | | |
2. Perceived Usefulness | 0.61 (0.69) | 0.84 | | | | |
3. Perceived Enjoyment | 0.53 (0.59) | 0.55 (0.62) | 0.91 | | | |
4. Perceived Risk | −0.10 (0.11) | −0.07 (0.07) | −0.03 (0.07) | 0.89 | | |
5. Attitude | 0.60 (0.69) | 0.63 (0.73) | 0.59 (0.68) | −0.08 (0.08) | 0.82 | |
6. Perceived Value | 0.61 (0.79) | 0.51 (0.57) | 0.65 (0.73) | −0.09 (0.09) | 0.67 (0.77) | 0.89 |
7. Chatbot Acceptance | 0.56 (0.63) | 0.50 (0.56) | 0.62 (0.68) | −0.12 (0.09) | 0.65 (0.79) | 0.63 (0.76) | 0.84
The diagonal values (shown in bold in the original table) represent the square roots of the AVE, while the values enclosed in parentheses indicate the HTMT ratios.
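Table 4 combines two discriminant validity checks: the Fornell–Larcker criterion, under which each diagonal value (the square root of a construct’s AVE) should exceed that construct’s correlations with all other constructs, and the HTMT ratio, which is commonly expected to stay below the 0.85–0.90 threshold [94]. The following sketch shows how the Fornell–Larcker condition could be verified from the reported figures; it is illustrative only, uses the rounded values printed above, and the construct abbreviations are introduced here for brevity.

```python
import math

# AVE values from Table 3 and lower-triangle correlations from Table 4
# (HTMT ratios omitted). Abbreviations: PEOU = perceived ease of use,
# PU = usefulness, PE = enjoyment, PR = risk, ATT = attitude,
# PV = perceived value, CA = chatbot acceptance.
ave = {"PEOU": 0.73, "PU": 0.70, "PE": 0.83, "PR": 0.79,
       "ATT": 0.68, "PV": 0.75, "CA": 0.71}

corr = {
    ("PU", "PEOU"): 0.61,
    ("PE", "PEOU"): 0.53, ("PE", "PU"): 0.55,
    ("PR", "PEOU"): -0.10, ("PR", "PU"): -0.07, ("PR", "PE"): -0.03,
    ("ATT", "PEOU"): 0.60, ("ATT", "PU"): 0.63, ("ATT", "PE"): 0.59, ("ATT", "PR"): -0.08,
    ("PV", "PEOU"): 0.61, ("PV", "PU"): 0.51, ("PV", "PE"): 0.65, ("PV", "PR"): -0.09,
    ("PV", "ATT"): 0.67,
    ("CA", "PEOU"): 0.56, ("CA", "PU"): 0.50, ("CA", "PE"): 0.62, ("CA", "PR"): -0.12,
    ("CA", "ATT"): 0.65, ("CA", "PV"): 0.63,
}

def fornell_larcker(ave, corr):
    """sqrt(AVE) of each construct must exceed its largest absolute inter-construct correlation."""
    return {c: math.sqrt(a) > max(abs(r) for pair, r in corr.items() if c in pair)
            for c, a in ave.items()}

print(fornell_larcker(ave, corr))  # every construct should come out True
```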
Table 5. Results of the hypothesis testing.
H | Path | Path Coefficient (β) | Standard Error (SE) | t-Value | p-Value
H1 | Perceived Ease of Use → Attitude | 0.14 | 0.08 | 1.73 | 0.08
H2 | Perceived Usefulness → Attitude | 0.30 | 0.06 | 4.51 | 0.00 *
H3 | Attitude → Chatbot Acceptance | 0.42 | 0.09 | 4.62 | 0.00 *
H4 | Perceived Usefulness → Perceived Value | 0.04 | 0.07 | 0.56 | 0.57
H5 | Perceived Enjoyment → Perceived Value | 0.45 | 0.06 | 7.28 | 0.00 *
H6 | Perceived Risk → Perceived Value | −0.03 | 0.06 | 0.59 | 0.55
H7 | Perceived Value → Chatbot Acceptance | 0.41 | 0.08 | 4.97 | 0.00 *
H8 | Perceived Ease of Use → Perceived Value | 0.34 | 0.08 | 4.19 | 0.00 *
H9 | Perceived Enjoyment → Attitude | 0.13 | 0.08 | 1.68 | 0.09
H10 | Perceived Risk → Attitude | −0.01 | 0.06 | 0.19 | 0.84
H11 | Perceived Value → Attitude | 0.35 | 0.10 | 3.50 | 0.00 *
* Significant at p-value < 0.001.
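In PLS-SEM, the standard errors in Table 5 come from bootstrapping: each path coefficient is re-estimated over many resamples, the standard deviation of those estimates serves as the standard error, the t-value is the coefficient divided by that standard error, and the p-value follows from a two-tailed test. The arithmetic can be sketched as below; this uses a normal approximation and the rounded values printed in the table, so it will not reproduce the published t-values exactly (the original values were produced by the bootstrapping routine of the PLS-SEM software).

```python
from math import erf, sqrt

def two_tailed_p(beta, se):
    """Two-tailed p-value for t = beta / se under a normal approximation."""
    t = abs(beta / se)
    return 2 * (1 - 0.5 * (1 + erf(t / sqrt(2))))

# H2 (Perceived Usefulness -> Attitude) from Table 5: beta = 0.30, SE = 0.06.
# The table reports t = 4.51; dividing the rounded figures gives ~5.0 instead.
print(round(0.30 / 0.06, 2))              # t-value from the rounded numbers
print(f"{two_tailed_p(0.30, 0.06):.1e}")  # far below the 0.001 significance level
```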
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
