Article

Development and Validation of an E-Learning Education Model in the COVID-19 Pandemic: A Case Study in Secondary Education

by Mónica Martínez-Gómez 1, Eliseo Bustamante 1,* and César Berna-Escriche 1,2
1 Departamento de Estadística, Investigación Operativa Aplicadas y Calidad, Universitat Politècnica de València, 46022 Valencia, Spain
2 Instituto de Ingeniería Energética, Universitat Politècnica de València, 46022 Valencia, Spain
* Author to whom correspondence should be addressed.
Sustainability 2022, 14(20), 13261; https://doi.org/10.3390/su142013261
Submission received: 16 September 2022 / Revised: 10 October 2022 / Accepted: 11 October 2022 / Published: 15 October 2022
(This article belongs to the Special Issue Digitalization of Education: Technology Enhanced Learning)

Abstract

E-learning was crucial during the global lockdown. This article therefore aims to propose and validate a holistic framework covering all the E-learning services needed to ensure effective implementation and use. To this end, an original 3S-T model, measuring E-learning success on the basis of student self-assessment, was developed. This innovative model, which reinforces the existing theoretical framework of models, identifies a wide array of success predictors and relates them to various measures of success, including learning and academic achievements. The 3S-T model was validated using the partial least squares structural equation modeling (PLS-SEM) technique. In this analysis, four major constructs were identified as determinants of E-learning service performance, namely, the surrounding conditions, the system characteristics, the tutor's development, and the student's own performance. Although each of them comprises several subcategories, 15 indicators estimating the fulfillment of these factors were ultimately defined and evaluated. The present study is strongly connected to the fourth goal of the 2030 Agenda established by the United Nations, which seeks Quality Education to ensure the sustainable development of countries.

1. Introduction

The unstoppable advance of new information and communication technologies (ICTs) has led to important changes in many disciplines [1,2]. These technologies have taken advantage of new paradigms such as the internet, social networks, cloud computing, blockchain, and big data to drive their rapid expansion. These innovations have led to the emergence of new markets, products, processes, and services. The education field has been one of the disciplines where these technologies have had an important effect; many new paradigms were brought into learning, such as E-learning and mobile learning, meaning that traditional face-to-face master classes are no longer the only learning option. In fact, the E-learning paradigm is not new; rather, it is considered an extension of the distance education mode initiated in the 1980s [3]. This trend towards distance education has been accelerated by COVID-19, since E-learning proved to be the only resource that allowed learning to continue during the global lockdown [4,5]. As a result, institutions worldwide have been investing extensively in E-learning, so that most of the courses provided in traditional face-to-face mode have been converted to E-learning mode. This acceleration has only shortened a process that was already inevitable; in the near future, with the return to normality, the road traveled will not be undone and nothing will return to the way it was in the past; thus, E-learning is here to stay. Additionally, according to sustainability criteria, E-learning has a much lower average energy consumption and CO2 emissions per student than conventional face-to-face learning [6].
In the above-described scenario, to favor E-learning, an online learning environment (OLE) has to be created; this means that Learning Management Systems (LMSs) have to be implemented. Within them, a key factor in achieving success is the deployment of a high Quality of Interaction (QoI) between the different actors of the online learning system. A widely used LMS tool is the Moodle platform [7]; in fact, this is the tool used by the students in the current research. All these platforms must offer fast access and a user-friendly environment, with large data management capacity and a variety of web-based tools. In the case of the Valencian Community (Spain), Aules [8] is the official website for E-learning, developed by and for teachers of the corresponding institutional Department of Education (Conselleria d'Educació, Generalitat Valenciana). Any teacher in the public non-university educational system of this region can create and manage a virtual class with their students. Aules is based on a Moodle platform; it is very intuitive and attractive, taking into account that the students' ages range from primary school (4 years old) to the final course before university (2nd baccalaureate, 18 years old).
E-learning systems should be seen as a breakthrough, as they can even compensate for the weaknesses of traditional learning methods, as well as offer the possibility of extending knowledge to a much larger number of students, who may even be on the other side of the world. Therefore, if advantage is taken of all the possibilities of the new technologies, E-learning offers an excellent opportunity for young people and knowledge seekers. However, in order to know whether the implementation of E-learning is indeed an improvement over traditional in-person learning, it is essential to use success-measurement tools, which are fundamental to understanding the added value and the effect of management operations on investments [9,10]. Student Self-Assessment (SSA) is a valuable way to evaluate E-learning success; this term can be understood as a person's perceived quality of their own work and educational abilities [11]. When aiming to evaluate the SSA, multidimensional factors must be analyzed in order to assess the different aspects involved in learning [12].
The measurement/estimation of the effectiveness of E-learning initiatives has been vastly investigated [2,3,13]. A quick review of the publications in this field reveals that different studies use different conceptual approaches, such as the Technology Acceptance Model (TAM) [14,15], Information Systems Success (ISS) [9], SERVQUAL [16], the Decomposed Theory of Planned Behavior (DTPB) [17], the UTAUT and UTAUT2 models [18,19], and the 5Q model [20]. In addition, many E-learning success and quality evaluation models have been proposed, such as the E-Learning System Success (ELSS), the Evaluating E-learning Systems Success (EESS), the E-Learning Quality (ELQ), the E-learner Satisfaction (ELS), and the User Satisfaction Model (USM) [21,22,23,24,25,26,27]. Likewise, across these studies, many different dimensions, factors, and constructs have been considered to evaluate E-learning performance in each particular application. Thus, several factors have been found to be critical to E-learning success. This paper presents a robust model for the measurement of E-learning systems' performance based on students' academic achievements and on students' learning attainment. We are aware that the transition has also been difficult for educators, but many papers reveal a positive relationship between teachers' perceptions and technology acceptance (E-learning) [28,29,30,31]. To this end, the fundamental purpose of this study is the development and validation of a tool that provides data to help better understand the factors that influence students' E-learning, and, beyond this, to estimate the importance of each of them. In this model, not only was the measurement model for each construct validated, but the relationships between the measurement model and the structural model were also determined.

2. General Overview of the Non-University Educational System of Spain

It is crucial for the public and potential readers of this article to understand the structure and situation of the non-university educational system, because general knowledge of it is very low due to two fundamental aspects:
Each political option tries to impose new legislation repealing the previous one, with sudden significant changes. In this vein, each government enacts a new educational law. In addition, all the communities of Spain have the possibility to modify from 40% to 50% of the contents (depending on whether the community has its own language or not).
The structure and curriculum of non-university education are generally unknown to the university and business worlds. Non-university education seems rather isolated, when there should be strong relationships and mutual knowledge. Obviously, non-university students go on to the world of work or to a university.
The absence of a consensus among political parties to create a solid general law that would be valid for all political options and would remain in force for a long time is highly criticized. In addition, the laws are increasingly lax in terms of the levels of demand placed on students. Furthermore, as mentioned above, local governments can differ from the central Spanish government, and the assigned percentage can vary in either direction, resulting in very different curricula. In Spain, the next course will begin to apply the new law, LOMLOE [32], promoting the universal design of learning, the use of ICTs, and multilingualism, among other educational aspects. During the COVID-19 pandemic and to date, students have been subject to LOMCE [33], which has recently been repealed. Both laws divided students into two large groups: compulsory (ESO: 1st, 2nd, 3rd, and 4th) and non-compulsory (high school or baccalaureate: 1st and 2nd). Worldwide, the age of leaving the non-university system is around 18 years, although there are exceptions, such as Italy, where students remain one more year in high school. Spain is one of the countries with the shortest baccalaureate (2 courses). Until the 1990s, under the General Law of Education (LGE, BOE 6-8-1970) [34], the Spanish baccalaureate lasted four years, and the last course was especially oriented towards university (COU, University Orientation Course). According to some teachers, a baccalaureate of only two years is too short to acquire the knowledge, strategies, and skills needed to begin university.
Currently in Spain, the non-university system is made up of the following:
- Child education (0–6 years), first cycle (0–3 years), and second cycle (4–6 years).
- Primary education (1st, 2nd, 3rd, 4th, 5th, and 6th) (6–11 years old).
- Compulsory secondary school (1st, 2nd, 3rd, and 4th ESO) (11–16 years old).
- Non-compulsory secondary school (1st and 2nd) (16–18 years old).
- Vocational training cycles (intermediate and higher) (more than 16 years old).
- Other special regime teachings: languages, arts, dance, sports.
This article focuses on secondary students, both compulsory and non-compulsory. This age (adolescence) is very complicated, and the effects of E-learning during the COVID-19 pandemic need to be analyzed in depth. Students of this age are very sensitive, and they are the future strength of a country. However, there is a considerable lack of articles focused on the secondary education period, even though this period is crucial both for secondary education itself and for first-year university students.

3. Theoretical Framework

E-learning is the most widely used educational method for accessing remote resources with the help of computers, laptops, cloud systems, internal networks, tablets, and smartphones. The utilization of the latest technologies provides an added advantage in education, within the teaching–learning framework. E-learning has many advantages over traditional forms of learning; among others, it provides greater accessibility to teaching material, fast and fluid communication, and the possibility of academic collaboration between students and the teacher. Continuous technological innovation and vast advances have contributed to the difficulty of finding a single definition of E-learning. For example, E-learning could be defined as the use of technology during the learning process [22,35], or as an information system that can incorporate a diversity of didactic material through e-mail, discussions, assignments, tests, and real-time online chat sessions [36,37]. Similarly, there are different methods to evaluate the success of an E-learning system; for instance, the Information Systems Success Model (ISSM) [9], the Technology Acceptance Model (TAM) [38,39], the User Satisfaction Model (USM) [27,40], and the E-Learning Quality (ELQ) models [22,24], among many others.
In order to provide a global definition applicable to the different methodologies for measuring E-learning success, various theories and acceptance models have been consulted. Thus, in the elaboration of our model, four of the most widely used approaches to evaluate E-learning and information systems have been considered, in such a way that the model brings together the different contributions of each of them.

3.1. Approach 1: Information System Success Model (ISSM/D&M Model)

The ISSM model was first introduced by [9] in the 1990s and is one of the best-known models for evaluating information systems' success. It proposes the use of six interrelated and interdependent variables to evaluate the success of non-face-to-face learning systems: system quality, information quality, usage, user satisfaction, individual impact, and organizational impact. The quality of the system is given by its desirable characteristics (reliability, speed of access, etc.). Information quality refers to the timeliness, accuracy, completeness, clarity, etc., of the information contained in the platform. Usage refers to the users' perception of the system's adequacy for performing the different tasks to be carried out in it. User satisfaction can be defined as the degree of satisfaction that the user experiences while using the system. Individual impact can be defined as the gains perceived by individual users while using the system, mainly in relation to their educational skills. Organizational impact focuses on users' perceived level of organizational system success.
The authors modified their original model [9], since relevant criticisms were received over several years. The enhanced model adds the variables of intention to use and service quality and replaces the individual and organizational impact variables with the net benefit variable. For the authors, quality of service refers to the support quality received by the users from the system service provider; in turn, intention to use is defined as users’ predisposition to keep using the system, whereas net benefits are defined as the final degree to which information systems contribute to the successful outcomes of individuals, groups, and organizations (allowing researchers to apply the ISSM model to the desired level of analysis that is most appropriate to the research context).
A review of the existing literature on online education indicates that there is a consensus on the model’s validity, at least partially, for evaluating the performance of E-learning systems. Nevertheless, there are contradictions in the results when comparing different studies. For instance, whereas some research reported a strong effect of the general quality issues (service quality and system information) on current system utilization, other authors pointed out that this relationship was negligible.
The constructs taken from this model are the following:
  • System Quality (SQ).
  • Service Quality (SEQ).
  • Information Quality (IQ).
  • Student Satisfaction (SS).
  • Student Academic Performance (SAP) (Benefits).

3.2. Approach 2: Technology Acceptance Model (TAM)

Davis’ Technology Acceptance Model (TAM) [41] is another early model for evaluating the acceptance of information systems. This theory has been the most extensively used for measuring the success of a new technology in terms of its acceptance and use. The approach was derived from the Theory of Reasoned Action (TRA) and is classified within the theories of Social Psychology. The model is based on the fact that when users are presented with a new technology, several factors influence their decision on how and when they will use it. According to this model, external, social, cultural, and political factors are determinants in estimating the usefulness and ease of use perceived by the user. Additionally, user-perceived usefulness and ease of use are the main predictors of the attitude towards the use of the technology and the intention to use it, while the intention to use is the most important indicator of current system use. In this work, ease of use, following [22], is not considered as a separate construct due to its relationship with technical system quality (SQ).
A large number of research works based on the TAM model, as well as some of its multiple extensions, have been conducted in recent decades. For instance, an important extension, TAM2, was introduced by [41,42]. The authors extended the initial model by adding processes of social influence (subjective norm, voluntariness, experience, and image). Instrumental cognitive processes were also considered (relevance of work, quality of results, and demonstrability of results). Years later, [18] constructed the Unified Theory of Acceptance and Use of Technology (UTAUT), which significantly improved the explanatory power of variance in intention to use. Successive extensions of TAM have evolved over time; in particular, Venkatesh published the new models TAM3 [43] and UTAUT2 [19]. The TAM model and its different variants have been used assiduously in the context of E-learning systems to forecast the usefulness, intention to use, and usage of E-learning systems. In [41], the authors ascertained the importance of the role of Perceived Enjoyment (PE) in predicting computer acceptance and usage, and they found that PE could influence the Intention to Use (EUS).
The constructs adapted from this model are as follows:
6. Perceived Usefulness (PEU).
7. Perceived Enjoyment (PE).
8. Intention to Use (EUS).
9. Subjective Norm (SN).
10. Social Networking (NE).
11. Student Learning Achievements (SSA).

3.3. Approach 3: E-Learning Self-Acceptance Measure (ElAM)

In aiming to evaluate the E-learning systems’ success, a widespread possibility is to use the perceptions of the users. The ElAM evaluates users’ perceptions of tutor quality, perceived usefulness, and facilitation conditions with regard to the utilization of the E-learning systems [44].
The ElAM model considers 21 items, a number reached on the basis of expert and student opinions, as well as of the different existing methods, basically the two families of models described above. The 21 defined items were arranged into four categories: quality of the tutor (eight items), perceived usefulness (four items), perceived ease of use (five items), and facilitation conditions (four items). The part of TAM called “attitude towards using” is considered in the current model as a part of ElAM, related to the quality of the learner and instructor.
The constructs adopted from this approach are as follows:
12. Tutor Quality (TQ).
13. Strategy (S).
14. Engagement (E).

3.4. Approach 4: Online Learning Self-Efficacy (OLSE)

Self-efficacy can be defined as one’s belief in one’s own ability to carry out a specific task. In this sense, applied to E-learning, it can be understood as the self-confidence in one’s capability to perform certain learning tasks using a given E-learning system. Five dimensions are considered in the OLSE model [45,46]: the user’s self-efficacy to complete the online course; to interact socially with classmates; to manage course tools; to interact with instructors; and to interact with classmates for academic purposes. In addition, according to the OLSE model, demographic variables (such as gender, the number of online courses completed, and educational status) are aspects to consider in online learning self-efficacy [45].
The construct adopted from this approach is as follows:
15. One-self Efficacy (OSE).

4. Development of the Conceptual Model (Research Model and Hypotheses)

In an attempt to give a global definition of E-learning success metrics, the four approaches most widely used over the last decades to evaluate E-learning have been taken into account in the elaboration of our new model.
In general, as different researchers have repeatedly reported, student satisfaction is a very reliable indicator for measuring the success of the implementation of E-learning-based initiatives, with a strong relationship between students’ perception of their academic performance and their degree of satisfaction in E-learning environments [37,47,48,49,50]. Similarly, Student Learning Achievements (SSA) have been widely used as an evaluation mechanism in the educational field [51,52,53,54]. The SSA is a powerful tool for evaluating the performance of E-learning strategies in higher education, but this evaluation is even more important in primary and secondary school, given that students’ preparation is slower, since they are at an earlier period of their training. Therefore, it is even more important to use tools that make it possible to see the degree of achievement of the objectives sought, as well as to determine the causes that help or hinder the achievement of those goals.

4.1. Research Model

As discussed earlier, there are a huge number of factors affecting E-learning, with a multitude of complex interactions among them. It is relevant to be aware that E-learning is an efficient means for the teaching–learning process in the current educational environment, and even more so considering the pandemic situation. However, it is even more important to know, in depth, the different factors that motivate users to accept and take full advantage of the capabilities of E-learning.
In the current study, the major factors or dimensions identified as determinants of E-learning achievements were those related to social aspects, student factors, system factors, and tutor capabilities; hence, we named our tool the 3S-T model. In this model, the student factors are divided into three sub-factors: individual factors (the user’s beliefs), technology acceptance, and the student’s own performance. These dimensions cover the main elements of the existing approaches and are the major components of our new 3S-T model, although some of them can be subdivided into several subcategories. Ultimately, there are a total of 15 constructs that contribute to the SSA.
Figure 1 represents the survey model used in the current research, relying on the four aforementioned approaches.
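As an illustrative aid only, the grouping of the 15 constructs under the four 3S-T dimensions can be sketched as a simple mapping. Note that the assignment of constructs to dimensions below is a reading of the text in Sections 3 and 4.1, not a reproduction of Figure 1; the dictionary keys paraphrase the dimension names.

```python
# Hypothetical sketch of the 3S-T construct taxonomy; the grouping is
# inferred from the running text, not taken verbatim from Figure 1.
MODEL_3ST = {
    "surrounding_conditions": ["SN", "NE"],              # social factors
    "system_characteristics": ["SQ", "SEQ", "IQ"],       # system factors
    "tutor_development": ["TQ"],                         # tutor quality
    "student_performance": ["OSE", "S", "E", "PEU",
                            "PE", "EUS", "SS", "SAP",
                            "SSA"],                      # student factors
}

def count_constructs(model):
    """Total number of constructs across all dimensions."""
    return sum(len(items) for items in model.values())

print(count_constructs(MODEL_3ST))  # → 15, matching the paper's count
```

A structure of this kind is convenient when preparing the survey data, since each construct label can then be matched to its block of questionnaire indicators before running the PLS-SEM analysis.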

4.2. Research Hypothesis

This section presents the hypothesized linkages in the currently proposed model, with the accompanying discussions. Every connection between the constructs in the model is justified on the basis of empirically proven hypotheses found in the literature on the effectiveness of E-learning and information systems.

4.2.1. Facilitating Conditions: Social Factors and Social Networking

Facilitating conditions have been defined as an external factor that accompanies the main TAM-based constructs; they refer to the extent to which students perceive that organizational and technical factors exist to support the use of E-learning during the pandemic [55].
It is generally admitted that social factors (social influence) affect users’ behavior very significantly [56,57]. Consequently, social factors have been tested over the years as subjective norms concerning the user’s intention towards the situation faced [57,58,59]. Subjective norm is defined as “the person’s perception that salient social referents think he/she should or should not perform the behaviour in question” [58]. Therefore, applying this aspect to the E-learning context, when individuals perceive that their salient referents think that they should use the E-learning system, they will incorporate these referents’ beliefs into their own [57,59] and consequently will intend to use it [57]. Additionally, [60] showed that these social influences have a direct impact on the attitude and intention to adopt information technologies (IT), and further stated that users may feel compelled to participate because they may want to be part of a community. However, the enjoyment perceived by individuals usually occurs in an autonomous context; therefore, it is very likely that they are not influenced by their relevant people, such as family, friends, and classmates [61]. Hence, it is possible that social factors may not affect perceived usefulness (PEU). Consequently, we put forward the following research hypotheses, aiming to estimate the degree of correlation, if any:
H1a-1: 
SN will positively affect the PEU of the E-learning system.
H1a-2: 
SN will positively affect using an E-learning system for sustainability.
Social Networking (NE) is another of the possible facilitating conditions; it refers to the growth of the perceived value of a product with an increase in the number of users [62,63], since the user’s perceived product utility is strongly affected by the number of users using it [64]. The researchers in [60] analyzed the consequences of NE on IT adoption from the point of view of the existence of a “perceived critical mass”, noting that when users perceive that a product has reached this critical mass, it dominates their attitudes. In this regard, in the E-learning environment, [57] indicated that when students perceived that a large and growing number of peers were using the E-learning system, it was inevitable that they would try the system. Furthermore, [57] demonstrated that NE exerted a significant direct effect on perceived enjoyment (PE), on PEU, and on the behavioral intention to use E-learning for sustainability. On the other hand, individuals may have an autonomous perception of their own enjoyment of the activity itself; thus, they are usually not biased by others [61]. Consequently, it is possible that NE does not affect PEU; hence, we put forward the following hypotheses to corroborate a possible correlation:
H1b-1: 
NE will positively affect the PEU of the E-learning system.
H1b-2: 
NE will positively affect attitude towards using an E-learning system (EUS).
H1b-3: 
NE will positively affect the perceived enjoyment (PE) of the E-learning system.

4.2.2. Individual Factors

Individual factors are likely to be an important aspect to consider in E-learning. Many previous research studies have reported that users’ own individual factors have a meaningful effect on how they perceive an E-learning system, consequently greatly affecting their willingness to accept it. In relation to individual factors, self-efficacy reflects a person’s own beliefs about his or her ability to perform certain tasks successfully [65]. Several previous studies have shown that in an E-learning context, self-efficacy influences the PEU [54,57,66,67,68], and some have shown that it can be a particularly important factor in this respect [54,67]. Therefore, this influence is hypothesized in this study, aiming to check this relationship:
H2a-1: 
One-self Efficacy (OSE) will positively affect the EUS of the E-learning system.
H2a-2: 
One-self Efficacy (OSE) will positively affect the PE of the E-learning system.
H2a-3: 
One-self Efficacy (OSE) will positively affect the PEU of the E-learning system.
On the other hand, many studies indicate that individual factors have a significant influence on how users perceive the E-learning system, and, subsequently, their willingness to embrace it. However, given that today’s students are different from the students of the past, they want to create, use the tools of their time, share control, and make decisions. They also want to share their opinions not only in class but globally, and additionally, they seek an education that is relevant and connected to the reality around them. All this makes them more predisposed to adapt to online learning. Consequently, the conclusions of previous research studies are even more applicable in today’s E-learning environment.
In this way, all constructs related to the student’s interests, motivation, perceptions, etc., are key aspects to reach adequate E-learning effectiveness [2,67]. In particular, the strategy and engagement of the students play key roles in the student’s performance, usually when there are high perceived enjoyment and usefulness by the students [10,69], resulting in positive user performance. Therefore, this influence is hypothesized in this study, aiming to check this relationship:
H2b-1: 
Strategy (S) positively influences engagement (E).
H2c-1: 
Engagement (E) positively influences students’ learning achievements (SSA).

4.2.3. System Factors

Since [41] proposed the TAM, it has been postulated that system factors strongly affect users’ beliefs. Subsequent studies have proven the importance of the role of system factors in predicting users’ beliefs and acceptance in the E-learning context [57,68,70,71]. The characteristics of the platform used determine the information available to students, the way in which they can access it, the possibilities of sharing it, and the possibilities of contacting and collaborating with peers or teachers; therefore, the platform is of vital importance in achieving the E-learning objectives [72,73]. Specifically, faculty, as well as peers, are extremely important resources for students to learn from. However, due to the usual complexity of E-learning platforms, as well as the barriers created by non-face-to-face teaching, it becomes more difficult for students to socialize with their faculty and peers/friends, which requires the use of different approaches to establish these relationships. It is argued, though, that students with prior experience of online socialization are able to approach their peers and faculty more effectively on the platform due to their familiarity with its norms and approaches [2]. Therefore, a key aspect is estimating the platform’s capability to improve the teaching–learning process. Consequently, indicators of system performance are of vital importance in evaluating E-learning achievements.
In [68], the authors analyzed actual user usage of the E-learning system for a distance educational system using the TAM model; they concluded that system factors could affect the PEU very positively. In turn, [71] classified the system factors affecting user acceptance of E-learning into system, information, and service qualities. Consequently, the following hypotheses are put forward in this study:
H3a-1: 
SQ will positively affect the EUS of the E-learning system.
H3a-2: 
SQ will positively affect the PE of the E-learning system.
H3b-1: 
SEQ will positively affect the EUS of the E-learning system.
H3b-2: 
SEQ will positively affect the PE of the E-learning system.
H3b-3: 
SEQ will positively affect the PEU of the E-learning system.
H3c-1: 
IQ will positively affect the EUS of the E-learning system.
H3c-2: 
IQ will positively affect the PE of the E-learning system.
H3c-3: 
IQ will positively affect the PEU of the E-learning system.

4.2.4. User Beliefs and Technology Acceptance

Taking the extended TAM [73,74,75,76,77] as a starting point, the proposed relationships between users’ beliefs regarding the E-learning system and their subsequent acceptance and use of the system are explained hereunder. It is generally accepted that EUS directly affects the learner’s attitude toward the use of the E-learning system [78,79,80,81]; in the same vein, PEU directly affects the user’s attitude toward the use of the E-learning system [78,80,81]. Likewise, PE is directly affected by the attitude toward the use of the E-learning system [79]. In addition, EUS is considered to mediate the influence of PEU on the user’s attitude toward E-learning system use [78,81], and it is also generally accepted that PE mediates the influence of PEU on the attitude toward system use [79]. PEU directly determines the intention to use the E-learning system [54,67,82,83,84,85,86]; PE directly predicts the intention to use the E-learning system [78,84,85,86]; and the attitude toward using the E-learning system directly predicts the intention to use it [78,79,80]. Finally, the intention to use the E-learning system directly impacts the SS of the system [78]. In summary, the following hypotheses are put forward:
H4a-1: 
EUS will positively affect the PEU of the E-learning system.
H4a-2: 
EUS will positively affect the PE.
H4b-1: 
PEU will positively affect student satisfaction (SS).
H4c-1: 
PE will positively affect student satisfaction (SS).

4.2.5. Tutor’s Development

In relation to the tutor’s contribution to the success of E-learning, previous research has shown that the delivery of the course, the attributes of the tutor, and the facilitating conditions are very important, if not the main, determinants of the usefulness perceived by students [87,88]. The role of tutors in E-learning is even more important than in traditional education, as the e-instructor must be more skilled, especially in the application of classroom technology [89]. The authors of [90] reported that with the implementation of E-learning, the role of instructors has shifted from being subject matter experts to facilitators. To succeed in an online education system, a positive attitude of tutors is crucial. In [91], the authors identified and classified the competencies of e-instructors into these categories: knowledge of the online system, technical competence, communication skills, content mastery, and personal characteristics. Particularly important has been the change brought about by the closing of universities and schools during the COVID-19 pandemic; this change produced various psychological changes in both students and teachers [92], greatly affecting their performance. The authors of [93] analyzed the performance of the university mentoring system during the COVID-19 pandemic. The tutor–student relationship rests on communication and collaboration; preserving them requires the rapid adoption of measures that support them in the new situation, such as the use of multiple communication technologies. The authors’ investigations concentrated on four different forms of mentoring, namely, by email, in person, through virtual tutoring (Hangout/Google Meet), and using WhatsApp.
These researchers noted that synchronous and frequent daily communication are key to an efficient and successful mentoring system, and that the use of WhatsApp, complemented by synchronous communication through messages and video calls, is the best way to achieve student satisfaction. Thus, we put forward the following hypotheses:
H5a-1: 
Tutor quality (TQ) positively influences intention of use (USE).
H5a-2: 
Tutor quality (TQ) positively influences strategy (S).

4.2.6. Student’s Own Performance (Student Satisfaction)

Assiduity of use affects student satisfaction and performance, ultimately leading to the achievement of learning objectives. Likewise, many other correlations have been found among the constructs of the student’s own performance and between them and the remaining constructs. Satisfaction has repeatedly demonstrated its effectiveness and reliability as an essential success measure of both information systems and E-learning systems [22]. In the current model, we assume that student satisfaction is a determinant of the benefits construct, that is, student learning achievements (SSA). Therefore, the following hypotheses were put forward:
H6a-1: 
Student satisfaction (SS) toward the E-learning system positively influences students’ academic performance (SAP).
H6a-2: 
Student satisfaction (SS) toward the E-learning system positively influences students’ learning achievement (SSA).
H6b-1: 
Students’ academic performance (SAP) positively influences students’ learning achievement (SSA).

5. Research Method

Quantitative methodologies have been used to verify the theoretical 3S-T model and its hypotheses; therefore, in order to “measure the success” of E-learning through learning achievement or student academic performance (SAP) and student learning achievements (SSA) during this pandemic, a quantitative survey was adopted in this research (see Appendix A Table A1). With regard to ethical considerations, the study was approved by the local ethics committee of UPV (protocol number P03_24032022).

5.1. Aim and Participants

The purpose of this study was to test and improve the 62 items of the 3S-T model. The items were rated on a 7-point Likert scale, in which 1 means totally against and 7 means totally in favor. Prior to completing the questionnaire, the participants were informed about the objectives of the research [60].
A total of 217 students participated in this investigation. The participants are students of compulsory secondary education (Educación Secundaria Obligatoria, ESO) and baccalaureate (Bachillerato, non-compulsory), consisting of four and two grades, respectively, in the age range of 11 to 18 years. The study was carried out in the city of Valencia. The students’ answers are provided in the Supplementary Material.

5.2. Evaluation Model of E-Learning Performance

A confirmatory factor analysis (CFA) with structural equation modelling (SEM) was used to examine the factor structure of the 62-item scale. Two SEM approaches exist: covariance-based SEM (CB-SEM) and composite-based, or partial least squares, SEM (PLS-SEM). This study employs the latter, the PLS-SEM approach, because of the model’s 15 constructs and 62 indicators. PLS-SEM is a useful approach for forecasting behaviors in behavioral research. For complex models comprising formative (causal) and reflective (consequent) constructs, the PLS approach is particularly appropriate, with its major strength residing in the modelling of such structures [94,95]. This technique was chosen for its capability to simultaneously examine a series of dependence relationships, particularly when latent variables of the first and second order are under study within the model [96].
Figure 2 illustrates the two stages of the methodology. Stage 1 addresses the evaluation of the reflective and formative measurement models, both of which examine the measurement theory. Stage 2 addresses the evaluation of the structural model, which covers the structural theory, involving testing the proposed hypotheses and examining the relationships among the latent variables [97].
The model measurement and evaluation were carried out through data computation in SmartPLS 3.6. The measurement theory indicates how to measure latent variables. There are two types of measurement models [99,100]: formative and reflective measurement models.
Measurement models refer to the relationship between the indicators that reflect each construct and imply testing the measures’ reliability and validity. The measurement model was evaluated using the following criteria [101]:
- Indicator reliability: the outer loading of each indicator should be ≥0.70 [101].
- Internal consistency reliability: assessed through two tests, composite reliability (CR) and Cronbach’s alpha (α); the cut-off value is ≥0.70 for both [102].
- Validity:
  - Convergent validity: the average variance extracted (AVE) should be ≥0.50 [103].
  - Discriminant validity: verified through three tests:
    • the Fornell–Larcker criterion [103];
    • cross-loadings [102];
    • the Heterotrait–Monotrait ratio (HTMT) [104].
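The first two criteria above follow directly from the standardized outer loadings. The following Python sketch (our own illustration, not the SmartPLS computation used in the study) shows how CR and AVE are obtained for a single reflective construct:

```python
import numpy as np

def reliability_metrics(loadings):
    """Composite reliability (CR) and average variance extracted (AVE) of one
    reflective construct, computed from its standardized outer loadings."""
    lam = np.asarray(loadings, dtype=float)
    error_var = 1.0 - lam ** 2                      # indicator error variances
    cr = lam.sum() ** 2 / (lam.sum() ** 2 + error_var.sum())
    ave = np.mean(lam ** 2)                         # average squared loading
    return cr, ave

# Loadings of the Perceived Enjoyment (PE) indicators reported in Table A2:
cr, ave = reliability_metrics([0.90, 0.87, 0.91])
print(round(cr, 3), round(ave, 3))  # → 0.922 0.798
```

Both values clear the ≥0.70 (CR) and ≥0.50 (AVE) thresholds, consistent with the construct being retained.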
Structural models refer to the relationships among the constructs. The structural model was assessed using the criteria proposed by [105]:
- assess the structural model for collinearity issues (VIF < 5);
- evaluate the significance and relevance of the structural model relationships (p < 0.05);
- analyze the level of R2 (cut-off levels: 0.190, weak; 0.333, moderate; 0.670, substantial);
- assess the level of Q2 (cut-off point larger than zero);
- assess the model’s fit (SRMR ≤ 0.08; RMStheta ≤ 0.12).
Determining the ratio of sample size to variables for SEM analysis is also important; interesting literature can be found in this field [106,107,108].
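The collinearity criterion (VIF < 5) can be checked directly on the latent variable scores. The sketch below is an illustration under our own function name and data layout, not the study’s SmartPLS output; it computes each predictor’s VIF as 1/(1 − R²) from regressing it on the remaining predictors:

```python
import numpy as np

def vif(X):
    """Variance inflation factors for the columns of X (latent variable scores).
    VIF_j = 1 / (1 - R_j^2), where R_j^2 comes from regressing column j on all
    remaining columns; VIF >= 5 signals a potential collinearity problem."""
    X = np.asarray(X, dtype=float)
    X = (X - X.mean(axis=0)) / X.std(axis=0)        # standardize the scores
    vifs = []
    for j in range(X.shape[1]):
        y = X[:, j]
        Z = np.delete(X, j, axis=1)
        beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
        r2 = 1.0 - ((y - Z @ beta) ** 2).sum() / ((y ** 2).sum())
        vifs.append(1.0 / (1.0 - r2))
    return vifs
```

Nearly independent predictors yield VIF values close to 1; strongly correlated predictors push the ratio toward the rejection threshold of 5.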

6. Results

The average age of the students was 13.84 (SD = 1.72), and there were 94 (43.3%) females in the sample. There were 38 students (17.5%) without any computer skills. The sample characterization is shown in Table 1.

6.1. Measurement Model

6.1.1. Outer Loading, Internal Consistency, and Reliability

All factors were reflective, so the outer loadings were analyzed based on the suggestions of [101]:
- if the outer loading is less than 0.4, delete the indicator;
- if the outer loading is higher than 0.7, maintain the indicator;
- if the outer loading is between 0.4 and 0.7, analyze the impact of removing the indicator on the average variance extracted (AVE) and composite reliability (CR).
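The retention rule of [101] can be expressed as a small decision function (a sketch; the middle band still requires the AVE/CR re-check described above rather than an automatic decision):

```python
def loading_decision(outer_loading):
    """Decision rule of [101] for a reflective indicator's outer loading.
    The 0.4-0.7 band requires checking how removal would change AVE and CR."""
    if outer_loading < 0.4:
        return "delete"
    if outer_loading > 0.7:
        return "keep"
    return "check AVE/CR impact of removal"

# SQ3 loads 0.56 in Table A2, so it falls in the "analyze" band:
print(loading_decision(0.56))  # → check AVE/CR impact of removal
```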
The outer loadings are shown in Appendix A Table A2. Some indicators fail to meet the minimum criterion, although the internal consistency values, AVE and CR, are higher than 0.5 and 0.7, respectively, for all constructs. We analyzed the effect of removing SQ3 and USE2, after which the values of AVE and CR declined; therefore, we maintained all indicators. Table 2 shows the measures of internal consistency, reliability, and validity.

6.1.2. Discriminant Validity

Discriminant validity is the degree to which a construct differs from the rest [96]. The correlation matrix for the Fornell–Larcker method is shown in Table 3; the diagonal values are higher than the other values in the same column, which indicates that the AVE of every construct exceeds its shared variance with the other constructs.
The second method for assessing discriminant validity is the cross-loadings (Appendix A Table A2); each indicator loads highest on its own construct.
Finally, we checked the Heterotrait–Monotrait ratio (HTMT) criterion proposed by [104] to assess discriminant validity, as many authors have established that the criteria mentioned above are insufficiently sensitive to detect discriminant validity problems. Heterotrait correlations are correlations of indicators across constructs measuring different characteristics, while monotrait correlations are correlations of indicators measuring the same construct. As stated in [109], threshold values ≤ 0.9 are acceptable. The results show that all values significantly differ from 1. Table 4 shows the Heterotrait–Monotrait ratio (HTMT) correlation matrix.
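The HTMT definition above translates directly into code. The following sketch (our own illustration, assuming each construct has at least two indicators) divides the mean heterotrait correlation by the geometric mean of the two monotrait correlations:

```python
import numpy as np

def htmt(X_a, X_b):
    """Heterotrait-Monotrait ratio for two reflective constructs, given their
    indicator score matrices (rows = respondents, >= 2 indicators each)."""
    Xa = np.asarray(X_a, dtype=float)
    Xb = np.asarray(X_b, dtype=float)
    pa, pb = Xa.shape[1], Xb.shape[1]
    R = np.corrcoef(np.hstack([Xa, Xb]).T)          # full indicator correlations
    hetero = R[:pa, pa:].mean()                     # across the two constructs
    mono_a = R[:pa, :pa][np.triu_indices(pa, 1)].mean()   # within construct A
    mono_b = R[pa:, pa:][np.triu_indices(pb, 1)].mean()   # within construct B
    return hetero / np.sqrt(mono_a * mono_b)        # <= 0.9 supports validity
```

Ratios well below 0.9 indicate that the two constructs are empirically distinct; values approaching 1 suggest they measure the same trait.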

6.1.3. Significance of the Outer Loading

The significance of the outer loadings was assessed using the bootstrapping algorithm in PLS. We used 50,000 bootstrap samples to estimate the t- and p-values of the outer loadings at a 5% error probability; at this significance level, an outer loading is significant when its p-value is <0.05 and its t-value is >1.65. The bootstrapping results are displayed in Appendix A Table A2 and show that all outer loadings are significant, with p-values lower than 0.05.
A results summary for the measurement model assessment and the significance of outer loadings is shown in Appendix A Table A3.
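The bootstrap t-statistic reported in Table A3 (t = |O|/STDEV) can be sketched as follows. This is an approximation for illustration, with the outer loading replaced by the indicator–construct-score correlation, not the SmartPLS estimator itself:

```python
import numpy as np

def bootstrap_loading(indicator, score, n_boot=5000, seed=0):
    """Bootstrap significance test for an outer loading, approximated here by
    the correlation between an indicator and its construct score. The study
    used 50,000 SmartPLS resamples; t = |O| / STDEV of the bootstrap estimates,
    significant at 5% (one-tailed) when t > 1.65."""
    rng = np.random.default_rng(seed)
    x = np.asarray(indicator, dtype=float)
    s = np.asarray(score, dtype=float)
    original = np.corrcoef(x, s)[0, 1]              # estimate on the full sample
    n = len(x)
    estimates = [np.corrcoef(x[idx], s[idx])[0, 1]  # resample with replacement
                 for idx in (rng.integers(0, n, n) for _ in range(n_boot))]
    t_value = abs(original) / np.std(estimates)
    return original, t_value
```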

6.2. Structural Model

As stated previously, the assessment of the structural model includes five steps [101]. First, collinearity was assessed through the variance inflation factor (VIF). According to [105], VIF values ≥ 5 indicate a potential collinearity problem. In our case, all retrieved VIF values are below 5; thus, our data do not present collinearity problems. Figure 3 shows the path coefficients (β values) of the relationships between the constructs and indicators.
Table 5 shows the p-values obtained to assess the path coefficient between the endogenous and exogenous constructs, and by applying the same criterion of a 5% level of significance, most hypotheses were supported, while others were rejected.
The third step consists of evaluating the coefficient of determination (R2) of the dependent variables. This measure represents the proportion of variance in the endogenous variables that can be explained by the exogenous variables; i.e., it can be interpreted as the predictive accuracy of the proposed model. It ranges from 0 to 1, and [101] stated that a value of 0.75 is substantial, 0.5 moderate, and 0.25 weak. As shown in Figure 3, PEU, USE, and PE moderately explained student satisfaction, student academic performance (SAP), and engagement (E), and substantially explained (52.7%) student learning achievements (SSA). These R2 results show a sufficient level of this measure.
Three constructs, PEU, USE, and PE, were the main determinants of student satisfaction, together explaining 67.7% of the variance.
Fourth, we assessed the predictive relevance, denoted Q2, using the blindfolding procedure in SmartPLS. If the model has predictive relevance (values of Q2 higher than 0), it accurately predicts the items’ data points [94]. The authors established that a Q2 value of 0.02 denotes small predictive relevance, 0.15 medium relevance, and 0.35 large predictive relevance. Table 6 shows the Q2 of the endogenous variables: five show strong prediction power (PE, PEU, SS, SSA, and SAP) and three show moderate prediction power (USE, S, and E).
Results suggest that the model has considerable predictive power due to the value of Q2 for student academic performance (SAP) and student academic achievement (SSA).
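The Stone–Geisser Q² behind the blindfolding procedure compares the model’s prediction errors against a mean-only benchmark. The sketch below shows the formula with illustrative values, not the study’s data:

```python
import numpy as np

def q_squared(observed, predicted):
    """Stone-Geisser Q^2 = 1 - SSE/SSO for an endogenous construct: SSE is the
    squared error of the blindfolded predictions, SSO the squared error of a
    mean-only benchmark; Q^2 > 0 indicates predictive relevance."""
    y = np.asarray(observed, dtype=float)
    y_hat = np.asarray(predicted, dtype=float)
    sse = ((y - y_hat) ** 2).sum()
    sso = ((y - y.mean()) ** 2).sum()
    return 1.0 - sse / sso

# Illustrative values only (not the study's data):
print(round(q_squared([1, 2, 3, 4], [1.1, 1.9, 3.2, 3.8]), 3))  # → 0.98
```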
Finally, the last step was to assess the model fit, as proposed by [110]; that is, how well the specified model represents the underlying theory [111]. Ref. [112] proposed a set of fit measures, but stated that they were introduced to provide a comparison to CB-SEM results rather than to represent an appropriate PLS-SEM index:
  • Standardized Root Mean Square Residual (SRMR), which is an absolute measure of model fit proposed to prevent misspecification of the model [104]. A value less than 0.10 or 0.08 (a more conservative version, see [113]) is considered a good fit. The SRMR for this study is 0.07, which is below the lower cut-off value suggested in the literature.
  • Root Mean Square Residual (RMStheta) verifies “the degree to which the outer model residuals correlate” [104]. Closeness to 0 indicates a good model fit (≤0.12 to indicate a good model fit) [101,104]. Using SmartPLS, the value of RMStheta is 0.116 which indicates a good model fit.
  • Normed Fit Index (NFI), which provides an incremental fit measure. Therefore, one of the main disadvantages is that it does not penalize for model complexity; i.e., the more parameters in the model, then the larger (i.e., better) the NFI result. Closeness of the NFI to 1 indicates a better fit. NFI values above 0.9 usually represent acceptable fit [110]. In our case, the value of NFI is 0.613.
  • Finally, the model’s goodness of fit (GoF) is defined as “how well the specified model reproduces the observed covariance matrix among the indicator items” [105], and this is our last criterion to assess the overall model fit. The purpose of GoF is to account for the model at both levels, i.e., the measurement and the structural models, with a focus on the overall performance [114]. There is no measure of global fit in PLS. However, investigators have suggested a global GoF, which is defined as the geometric mean of the average communality and average R2 of the endogenous constructs [115]. The GoF cut-off values used in this study were proposed by [116]:
    - GoF less than 0.1 means no fit;
    - GoF between 0.1 and 0.25 means small fit;
    - GoF between 0.25 and 0.36 means medium fit;
    - GoF greater than 0.36 means large fit.
The model’s goodness of fit for this research is 0.599, which means a large overall performance; in fact, it is significantly above the threshold value that constitutes a large fit.
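Under the geometric-mean definition of [115], the GoF computation is straightforward; the values below are hypothetical illustrations, not those of this study:

```python
import math

def gof(communalities, r_squared):
    """Global goodness of fit [115]: geometric mean of the average communality
    (AVE of each construct) and the average R^2 of the endogenous constructs."""
    avg_communality = sum(communalities) / len(communalities)
    avg_r2 = sum(r_squared) / len(r_squared)
    return math.sqrt(avg_communality * avg_r2)

# Hypothetical AVE and R^2 values for illustration:
print(round(gof([0.6, 0.8], [0.4, 0.6]), 3))  # → 0.592
```

Any result above the 0.36 cut-off of [116] would be classed as a large fit, as is the case for the 0.599 reported here.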

7. Discussion

This study aimed to analyze how different factors can predict students’ accuracy in self-assessing their accomplishments within the secondary student population.
To explore the factors predicting student academic achievement (SSA) during COVID-19, a new model (3S-T model) was developed, based on the extended TAM, ISSM, EIAM, and OSE models. This 3S-T model was used to explain secondary school students’ perceptions of the adoption of E-learning during the lockdown. The complicated age of these students and the lack of studies focused on this crucial educational level highlight the need for the present analysis. In this vein, our approach is strongly linked to Goal 4 of the Agenda adopted by the United Nations [117], which pursues Quality Education in order to achieve sustainable development among countries.
Hypotheses H1a-1, H2a-1, H2b-1, H2c-1, H3b-1, H3c-1, and H2a gained empirical support, covering aspects related to the subjective norm, self-efficacy, strategy, engagement, and information and system quality. The interest and reliability of the information available at the E-learning system are important contributors to the general satisfaction and perceived usefulness of the system. Furthermore, several aspects of the students’ perception of system quality, in particular those concerning the site, such as easy-to-understand navigation, ease of finding information, and a good website structure, are also vital to EUS. These results support that information quality and system quality are determinants of perceived satisfaction and perceived usefulness.
In this study, it was shown that engagement profiles and study strategies are important predictors of SSA. This finding is consistent with other research relating engagement to achievement, such as the studies of [51,118].
Contrary to our prediction, H3a-1, H3a-2, and H3a-3 were rejected. We interpret this as follows: (i) the current students were born using the internet and platforms/systems of all types and complexity; (ii) in the survey, 17.5% affirmed that they have no computer skills, which can be considered a “false-negative” answer, because all students in the Spanish educational system study informatics, and in the more experienced centers all students use digital books to learn; (iii) developers of E-learning platforms/systems (most of whom learned to handle platforms when they were older) underestimate students’ skills in navigating, searching for information, and understanding a webpage structure; and (iv) authors who assume that these hypotheses will have a positive effect base this on the beginning of the internet era, when students and tutors had limited skills and strategies.
Statistical analysis established that there are positive relationships between EUS and PEU (H4a-1), EUS and PE (H4a-2), and EUS and SS (H4a-3). These results suggest that as students’ EUS increases, their perceived usefulness and perceived enjoyment may also increase. Students therefore considered that E-learning during the lockdown was valuable and created a suitable atmosphere to learn, positively affecting E-learning performance and effectiveness as well as their sense of control and utility. Moreover, this E-learning atmosphere made E-learning more enjoyable, pleasant, and fun, improving general student satisfaction and students’ willingness to use it again if necessary.
H5a-1 and H5a-2 were supported, since the instructor/tutor is key in an E-learning environment [51], especially with underage students, who are more dependent in every way.
H6a-1 and H6b-1 were also supported: more satisfied students gain more benefits, which impacts their learning achievements. These results are consistent with previous research [10,35,51,119,120]. Naturally, students who feel satisfied show enhanced performance.
H6a-2 was also supported. Students were satisfied with E-learning and willing to use it again if necessary; they also found what they needed, which allowed them to achieve educational and personal goals, improving their creativity, knowledge and information, and experiences and performance.
Finally, H4b-1 and H4c-1 were strongly supported, meaning that perceived usefulness and perceived enjoyment are determinants of student satisfaction. Therefore, students who perceived E-learning as useful and enjoyable were successful.
The results show that the benefits of the 3S-T model are achieved; thus, the use of E-learning increases learning performance and learning achievements. The results obtained using the 3S-T model are in line with other studies [35,51,102,119].

8. Conclusions

In the complicated period of the COVID-19 pandemic, and during the global lockdown, E-learning was the only resource capable of replacing traditional in-person learning. Surprisingly, although the age of secondary students is the most complicated in all terms (educational, behavioral, and social), there is a great lack of studies analyzing the E-learning process during the global lockdown at this crucial educational level.
In the current research work, we developed an original 3S-T model based on the current theoretical framework in order to identify a range of success predictors and to measure the success of E-learning through a questionnaire answered by more than 200 students.
The measurement and success of the different factors that influence E-learning were evaluated using the 3S-T model. In this way, the initial objective was accomplished: we found that factors related to the subjective norm, self-efficacy, strategy, engagement, information and system quality, and the interest and reliability of the available information, together with factors related to the students’ perception of system quality, are vital, in line with the results of other authors.
This article contributes to the emerging literature on the analysis of E-learning systems success, providing a comprehensive multidimensional model that takes into account the main dimensions and subdimensions of four approaches: ISSM, TAM, EIAM, and OLSE.
The methodological procedure employed the regression technique of PLS-SEM using the SmartPLS 3.6 software. According to the results, the use of E-learning increased the learning performance and the learning achievements of secondary students during the global lockdown. The adoption of the present 3S-T model to analyze and measure the success of E-learning during the COVID-19 pandemic is key. In this vein, the 3S-T model should be extended to other educational levels, such as primary education or the university level, in future articles.

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/su142013261/s1 (answers from the students).

Author Contributions

M.M.-G. and E.B. conceived and designed the experiments; E.B. performed the experiments and M.M.-G. and C.B.-E. analyzed the data; M.M.-G. and E.B. contributed analysis tools; C.B.-E., M.M.-G. and E.B. wrote the paper. E.B. and C.B.-E. revised the paper. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki, and approved by Universitat Politècnica de València Research Ethics Committee (protocol project P03-24032022, Delegación de Protección de Datos, 23 March 2022).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The authors confirm that the data supporting the findings of this study are available within the article and its Supplementary Materials.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

3S-T — Newly proposed model (social aspects, student factors, system factors, and tutor capabilities)
5Qs — Model of the 5 Qualities (object, process, infrastructure, interaction, and communication atmosphere)
ASS — Student Learning Achievements
AVE — Average Variance Extracted
CB-SEM — Covariance-Based Structural Equation Modelling
CFA — Confirmatory Factor Analysis
COU — University Orientation Course (Curso de Orientación Universitaria)
COVID-19 — Coronavirus Disease 2019
CR — Composite Reliability
DTPB — Decomposed Theory of Planned Behavior
E — Engagement
EIAM — E-learning Self-Acceptance Measure
ELQ — E-Learning Quality
ELS — E-Learner Satisfaction
ELSS — E-Learning System Success
EESS — Evaluating E-learning Systems Success
ESO — Compulsory Secondary School (Educación Secundaria Obligatoria)
EUS — Intention to Use
GoF — Goodness of Fit
HTMT — Heterotrait–Monotrait ratio
ICT — Information and Communication Technologies
IQ — Information Quality
ISS — Information Systems Success
ISSM — Information Systems Success Model
IT — Information Technologies
LMS — Learning Management Systems
NE — Social Networking
OLE — Online Learning Environment
OLSE — Online Learning Self-Efficacy
OSE — One-Self Efficacy
PE — Perceived Enjoyment
PEU — Perceived Usefulness
PLS — Partial Least Squares
PLS-SEM — Partial Least Squares Structural Equation Modelling
Q2 — Goodness of PLS prediction
QoI — Quality of Interaction
R2 — Determination Coefficient
RMS — Root Mean Square
S — Strategy
SAP — Student Academic Performance
SD — Standard Deviation
SEM — Structural Equation Model
SmartPLS — Commercial Computational Code
SN — Subjective Norm
SERVQUAL — Multiple-item scale for measuring consumer perceptions of Service Quality
SEQ — Service Quality
SQ — System Quality
SRMR — Standardized Root Mean Square Residual
SS — Student Satisfaction
SSA — Student Self-Assessment
TAM — Technology Acceptance Model
TAM2 and TAM3 — Extensions of the TAM model
TRA — Theory of Reasoned Action
TT — Tutor Quality
USM — User Satisfaction Model
UTAUT — Unified Theory of Acceptance and Use of Technology
UTAUT2 — Modification of the UTAUT model
VIF — Variance Inflation Factor

Appendix A

Table A1. Questionnaire.
Construct [Related Studies] — Items (Code)

Information Quality [42,72]
- IQ1: The information available at the E-learning system is helpful
- IQ2: The information available is interesting
- IQ3: The information available is reliable
Service Quality [10,20,24,38,42]
- SEQ1: The E-learning has a mechanism for overcoming the problems that I am facing quickly
- SEQ2: The system on the E-learning site is up to date
- SEQ3: You feel safe with the E-learning system in terms of security and privacy
System Quality [16,20,22,42,68]
- SQ1: The E-learning site has easy-to-understand navigation
- SQ2: The E-learning site allows me to find the information I need easily
- SQ3: The E-learning site has a good website structure
Intention to Use [38,39,72,82]
- USE1: I use the E-learning site to find information
- USE2: I use E-learning to assess my skills
- USE3: I use E-learning to increase the chances of achieving better results
Perceived Usefulness [38,46,70,83,89,90]
- PEU1: Using the E-learning system improves my learning performance
- PEU2: Using the E-learning system enhances my learning effectiveness
- PEU3: Using the E-learning system gives me greater control over learning
- PEU4: I find the E-learning system to be useful in my learning
Perceived Enjoyment [10,18,42,43]
- PE1: I find using the E-learning system to be enjoyable
- PE2: The actual process of using the E-learning system is pleasant
- PE3: I have fun using the E-learning system
Student Satisfaction [9,21,22,24,27,79]
- SS1: If there is any chance to use online learning again, I will gladly do it
- SS2: I am satisfied with the E-learning process
- SS3: I feel online learning gives me what I need
Use as Sustainability [22,43,62,66,75]
- EUS1: I spend a lot of time exploring within the E-learning system
- EUS2: I believe that the use of the system is valuable
- EUS3: E-learning provides a suitable learning environment
- EUS4: I think that using E-learning is well suited to the way I learn
Social Networking [19,38,63]
- EN1: I enjoy my time when using social networking tools
- EN2: Social networking tools increase students’ creativity and interactivity
- EN3: Social networking tools facilitate knowledge sharing
Student Learning Achievements [22,70,73]
- ASS1: Achieving educational goals
- ASS2: Achieving personal goals
- ASS3: I feel the E-learning system helps me improve my creativity
- ASS4: I feel the E-learning system helps me improve my knowledge and information
- ASS5: I feel the E-learning system helps me improve my experiences and performance
Subjective Norm [59,60,61,87,88]
- SN1: My teacher is very supportive of online learning system use for my learning
- SN2: The management of my university supports the E-learning activities
Tutor Quality [46,49,85,88,92]
- TQ1: My tutor could explain the concepts clearly
- TQ2: My tutor was knowledgeable in ICT
- TQ3: My tutor was focused on helping me to learn
- TQ4: The tutorial activities were well managed
- TQ5: My tutor was accessible when I needed to consult them
- TQ6: My tutor was patient when they interacted with me
- TQ7: The group sessions were well facilitated
Self-Efficacy [38,59,67,68,69,71]
- OSE1: I am willing to accept the challenge
- OSE2: I am sure that I can complete all the stages that exist on the E-learning site well
Online Self-Efficacy [11,46,47,48,49,54,91]
- OS1: I am sure that I can use synchronous technology to communicate with others (such as Skype)
- OS2: I am sure that I can manage time effectively and complete all assignments on time
- OS3: I am sure that I can learn without being in the same room as the instructor and other students
Engagement [2,10,51,56,72]
- E1: I feel strong and vigorous when I am studying or going to E-learning classes
- E2: When the day starts, I feel like going to class or studying
- E3: I am enthused about my studies
- E4: My studies inspire me to do new things
- E5: I am proud of doing this career
- E6: I am happy when I am doing tasks related to my studies
- E7: I am involved in my studies
Strategy [36,49,68,80]
- S1: I tend to plan the time I am going to spend studying
- S2: I start studying from the beginning of the course
- S3: I take notes of the teachers’ explanations
- S4: I expand the information with complementary bibliography
- S5: I have difficulties in following the teacher’s explanations in E-learning class
- S6: I make outlines of the material I am going to study
- S7: When I study for an exam, I think of questions that can be included in the exam
Table A2. Table of the outer loadings.
Indicator  Outer loading (on its assigned construct)
E1 0.71
E2 0.74
E3 0.80
E4 0.82
E5 0.65
E6 0.76
E7 0.56
EUS10.72
EUS20.73
EUS30.78
EUS40.80
USE10.64
USE20.60
USE30.65
IQ1 0.88
IQ2 0.87
IQ3 0.60
OS1 0.72
OS2 0.74
OS3 0.68
PE1 0.90
PE2 0.87
PE3 0.91
PEU1 0.88
PEU2 0.93
PEU3 0.90
S1 0.83
S2 0.82
S3 0.69
S4 0.75
S6 0.57
SAP1 0.91
SAP2 0.92
SAP3 0.85
SAP4 0.88
SAP5 0.87
SE1 0.68
SE2 0.82
SN1 0.88
SN2 0.86
SN3 0.77
SQ1 0.82
SQ2 0.79
SQ3 0.56
SS1 0.79
SS2 0.87
SS3 0.85
SubN1 0.88
SubN2 0.89
SystQ1 0.84
SystQ2 0.88
SystQ3 0.87
TQ1 0.78
TQ2 0.79
TQ3 0.82
TQ4 0.85
TQ5 0.75
TQ6 0.74
TQ7 0.72
Table A3. Table of the significance of the outer loadings.
Path | Original Sample (O) | Sample Mean (M) | Standard Deviation (STDEV) | T Statistics (|O/STDEV|) | p Value
E1 ← E | 0.712 | 0.713 | 0.041 | 17.559 | 0.000
E2 ← E | 0.740 | 0.734 | 0.039 | 18.899 | 0.000
E3 ← E | 0.798 | 0.795 | 0.028 | 28.108 | 0.000
E4 ← E | 0.817 | 0.815 | 0.025 | 33.138 | 0.000
E5 ← E | 0.650 | 0.649 | 0.045 | 14.366 | 0.000
E6 ← E | 0.762 | 0.760 | 0.040 | 19.000 | 0.000
E7 ← E | 0.564 | 0.563 | 0.057 | 9.944 | 0.000
EUS1 ← EUS | 0.716 | 0.716 | 0.037 | 19.377 | 0.000
EUS2 ← EUS | 0.730 | 0.727 | 0.037 | 19.817 | 0.000
EUS3 ← EUS | 0.784 | 0.785 | 0.029 | 27.255 | 0.000
EUS4 ← EUS | 0.803 | 0.802 | 0.026 | 30.353 | 0.000
IQ1 ← IQ | 0.878 | 0.876 | 0.020 | 42.881 | 0.000
IQ2 ← IQ | 0.874 | 0.874 | 0.016 | 53.332 | 0.000
IQ3 ← IQ | 0.597 | 0.590 | 0.078 | 7.603 | 0.000
OS1 ← OSE | 0.717 | 0.709 | 0.053 | 13.563 | 0.000
OS2 ← OSE | 0.736 | 0.735 | 0.043 | 16.966 | 0.000
OS3 ← OSE | 0.682 | 0.679 | 0.040 | 17.223 | 0.000
PE1 ← PE | 0.898 | 0.897 | 0.015 | 61.367 | 0.000
PE2 ← PE | 0.876 | 0.875 | 0.024 | 36.351 | 0.000
PE3 ← PE | 0.907 | 0.906 | 0.015 | 62.369 | 0.000
PEU1 ← PEU | 0.882 | 0.882 | 0.018 | 48.959 | 0.000
PEU2 ← PEU | 0.928 | 0.928 | 0.016 | 59.860 | 0.000
PEU3 ← PEU | 0.901 | 0.900 | 0.015 | 61.628 | 0.000
S1 ← S | 0.831 | 0.831 | 0.020 | 41.547 | 0.000
S2 ← S | 0.823 | 0.821 | 0.024 | 34.328 | 0.000
S3 ← S | 0.692 | 0.690 | 0.047 | 14.674 | 0.000
S4 ← S | 0.752 | 0.751 | 0.036 | 20.767 | 0.000
S6 ← S | 0.574 | 0.571 | 0.062 | 9.324 | 0.000
SAP1 ← SSA | 0.906 | 0.905 | 0.018 | 49.304 | 0.000
SAP2 ← SSA | 0.925 | 0.924 | 0.012 | 77.429 | 0.000
SAP3 ← SAP | 0.847 | 0.847 | 0.022 | 38.066 | 0.000
SAP4 ← SAP | 0.878 | 0.876 | 0.023 | 38.610 | 0.000
SAP5 ← SAP | 0.869 | 0.868 | 0.020 | 43.472 | 0.000
SE1 ← OSE | 0.679 | 0.677 | 0.052 | 13.044 | 0.000
SE2 ← OSE | 0.815 | 0.812 | 0.022 | 37.012 | 0.000
SN1 ← SN | 0.876 | 0.875 | 0.023 | 38.352 | 0.000
SN2 ← SN | 0.856 | 0.854 | 0.031 | 27.748 | 0.000
SN3 ← SN | 0.767 | 0.761 | 0.055 | 14.064 | 0.000
SQ1 ← SQ | 0.819 | 0.817 | 0.037 | 22.408 | 0.000
SQ2 ← SQ | 0.794 | 0.789 | 0.040 | 20.005 | 0.000
SQ3 ← SQ | 0.561 | 0.553 | 0.082 | 6.878 | 0.000
SS1 ← SS | 0.781 | 0.780 | 0.037 | 21.303 | 0.000
SS2 ← SS | 0.872 | 0.872 | 0.020 | 43.249 | 0.000
SS3 ← SS | 0.854 | 0.853 | 0.017 | 50.325 | 0.000
SubN1 ← SubNorm | 0.876 | 0.875 | 0.022 | 40.381 | 0.000
SubN2 ← SubNorm | 0.891 | 0.891 | 0.017 | 52.230 | 0.000
SystQ1 ← SystQ | 0.840 | 0.838 | 0.029 | 29.456 | 0.000
SystQ2 ← SystQ | 0.879 | 0.880 | 0.018 | 48.935 | 0.000
SystQ3 ← SystQ | 0.867 | 0.867 | 0.024 | 36.746 | 0.000
TQ1 ← TQ | 0.781 | 0.779 | 0.031 | 25.127 | 0.000
TQ2 ← TQ | 0.785 | 0.783 | 0.031 | 25.467 | 0.000
TQ3 ← TQ | 0.822 | 0.818 | 0.029 | 28.402 | 0.000
TQ4 ← TQ | 0.848 | 0.846 | 0.022 | 38.970 | 0.000
TQ5 ← TQ | 0.753 | 0.751 | 0.045 | 16.612 | 0.000
TQ6 ← TQ | 0.745 | 0.739 | 0.042 | 17.635 | 0.000
TQ7 ← TQ | 0.718 | 0.714 | 0.041 | 17.584 | 0.000
USE1 ← EUS | 0.642 | 0.636 | 0.050 | 12.707 | 0.000
USE2 ← EUS | 0.601 | 0.596 | 0.056 | 10.705 | 0.000
USE3 ← EUS | 0.654 | 0.650 | 0.050 | 13.067 | 0.000
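The t-statistics in Table A3 take the standard PLS bootstrap form |O|/STDEV, with a two-tailed p-value. The study's own software output is not published; the sketch below (Python, with a function name of our choosing) illustrates the calculation from a single table row, using a standard-normal approximation for the p-value.

```python
import math

def loading_significance(original, stdev):
    """Bootstrap t-statistic (|O|/STDEV, as in Table A3) and a two-tailed
    p-value under a standard-normal approximation."""
    t = abs(original) / stdev
    p = 2.0 * (1.0 - 0.5 * (1.0 + math.erf(t / math.sqrt(2.0))))
    return t, p

# Row "E1 <- E" of Table A3: O = 0.712, STDEV = 0.041.
t, p = loading_significance(0.712, 0.041)
# t is about 17.4 from the rounded table values (the paper reports 17.559,
# computed from unrounded bootstrap output); p is effectively zero either way.
```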

References

1. Zhao, L.; Cao, C.; Li, Y.; Li, Y. Determinants of the digital outcome divide in E-learning between rural and urban students: Empirical evidence from the COVID-19 pandemic based on capital theory. Comput. Hum. Behav. 2022, 130, 107177.
2. Amin, I.; Yousaf, A.; Walia, S.; Bashir, M. What Shapes E-Learning Effectiveness among Tourism Education Students: An Empirical Assessment during COVID-19. J. Hosp. Leis. Sports Tour. Educ. 2022, 30, 100337.
3. DeTure, M. Cognitive Style and self-efficacy: Predicting student success in online distance education. Am. J. Distance Educ. 2004, 18, 31–38.
4. Yekefallah, L.; Namdar, P.; Panahi, R.; Dehghankar, L. Factors related to students' satisfaction with holding e-learning during the COVID-19 pandemic based on the dimensions of e-learning. Heliyon 2021, 7, e07628.
5. Baber, H. Modelling the acceptance of e-learning during the pandemic of COVID-19—A study of South Korea. Int. J. Manag. Educ. 2021, 19, 100503.
6. Roy, R.; Potter, S.; Yarrow, K. Designing low carbon higher education systems: Environmental impacts of campus and distance learning systems. Int. J. Sustain. High. Educ. 2008, 9, 116–130.
7. Moodle—Open-Source Learning Platform. Moodle. Available online: www.moodle.org (accessed on 26 July 2022).
8. Aules Webpage. Conselleria de Educación, Cultura y Deporte; Generalitat Valenciana. Available online: https://portal.edu.gva.es/aules/es/inicio/ (accessed on 26 July 2022). (In Spanish)
9. DeLone, W.H.; McLean, E.R. The DeLone and McLean model of information systems success: A ten-year update. J. Manag. Inf. Syst. 2003, 19, 9–30.
10. Hassanzadeh, A.; Kanaani, F.; Elahi, S. A model for measuring e-learning systems success in universities. Expert Syst. Appl. 2012, 39, 10959–10966.
11. Fletcher, A. Australia's National Assessment Programme rubrics: An impetus for self-assessment? Educ. Res. 2021, 63, 43–64.
12. Andrade, H.L. A Critical Review of Research on Student Self-Assessment. Front. Educ. 2019, 4, 87.
13. Ahmad, N.; Quadri, N.N.; Qureshi, M.R.N.; Alam, M.M. Relationship Modeling of Critical Success Factors for Enhancing Sustainability and Performance in E-Learning. Sustainability 2018, 10, 4776.
14. Davis, F.D.; Venkatesh, V. A critical assessment of potential measurement biases in the technology acceptance model: Three experiments. Int. J. Hum. Comput. Stud. 1996, 45, 19–45.
15. Sarbaini, S. Managing e-learning in public universities by investigating the role of culture. Pol. J. Manag. Stud. 2019, 20, 394–404.
16. Parasuraman, A.; Zeithaml, V.; Berry, L.L. SERVQUAL: A multiple-item scale for measuring consumer perceptions of service quality. J. Retail. 1988, 64, 12–40.
17. Sadaf, A.; Newby, T.J.; Ertmer, P.A. Exploring Factors that Predict Preservice Teachers' Intentions to Use Web 2.0 Technologies Using Decomposed Theory of Planned Behavior. J. Res. Technol. Educ. 2012, 45, 171–196.
18. Venkatesh, V.; Morris, M.G.; Davis, G.B.; Davis, F.D. User acceptance of information technology: Toward a unified view. MIS Q. 2003, 27, 425–478.
19. Venkatesh, V.; Thong, J.Y.L.; Xu, X. Consumer Acceptance and Use of Information Technology: Extending the Unified Theory of Acceptance and Use of Technology. MIS Q. 2012, 36, 157–178.
20. Cardona, M.M.; Bravo, J.J. Service quality perceptions in higher education institutions: The case of a Colombian university. Estud. Gerenc. 2012, 28, 23–29.
21. Alsabawy, A.Y.; Cater-Steel, A.; Soar, J. IT infrastructure services as a requirement for e-learning system success. Comput. Educ. 2013, 69, 431–451.
22. Al-Fraihat, D.; Joy, M.; Masa'Deh, R.; Sinclair, J. Evaluating E-learning systems success: An empirical study. Comput. Hum. Behav. 2020, 102, 67–86.
23. Waheed, M.; Kaur, K.; Qazi, A. Students' perspective on knowledge quality in eLearning context: A qualitative assessment. Internet Res. 2016, 26, 120–145.
24. Vasconcelos, P.; Furtado, E.S.; Pinheiro, P.; Furtado, L. Multidisciplinary criteria for the quality of e-learning services design. Comput. Hum. Behav. 2020, 107, 105979.
25. Asoodar, M.; Vaezi, S.; Izanloo, B. Framework to improve e-learner satisfaction and further strengthen e-learning implementation. Comput. Hum. Behav. 2016, 63, 704–716.
26. Yilmaz, R. Exploring the role of e-learning readiness on student satisfaction and motivation in flipped classroom. Comput. Hum. Behav. 2017, 70, 251–260.
27. Kanwal, F.; Rehman, M. Measuring Information, System and Service Qualities for the Evaluation of E-Learning Systems in Pakistan. Pak. J. Sci. 2016, 68, 302–307. Available online: https://www.thefreelibrary.com/MEASURING+INFORMATION%2c+SYSTEM+AND+SERVICE+QUALITIES+FOR+THE...-a0467164634 (accessed on 26 July 2022).
28. Alhumaid, K.; Ali, S.; Waheed, A.; Zahid, E.; Habes, M. COVID-19 E-learning: Perceptions and attitudes of teachers towards E-learning acceptance in the developing countries. Multic. Educ. 2020, 6, 100–115.
29. Almanthari, A.; Maulina, S.; Bruce, S. Secondary School Mathematics Teachers' Views on E-Learning Implementation Barriers during the COVID-19 pandemic: The case of Indonesia. Eurasia J. Math. Sci. Technol. 2020, 16, en1860.
30. Cheok, M.L.; Wong, S.L.; Ayub, A.F.; Mahmud, R. Teachers' Perceptions of E-learning in Malaysian Secondary Schools. Malays. Online J. Educ. Technol. 2017, 5, 20–33.
31. Mahdizadeh, H.; Biemans, H.; Mulder, M. Determining factors of the use of E-learning environments by university teachers. Comput. Educ. 2008, 51, 142–154.
32. LOMLOE. Ley Orgánica de Modificación de la LOE. BOE (Boletín Oficial del Estado), 30 December 2020. Available online: https://www.boe.es/boe/dias/2020/12/30/pdfs/BOE-A-2020-17264.pdf (accessed on 26 July 2022). (In Spanish)
33. LOMCE. Ley Orgánica Para la Mejora de la Calidad Educativa. BOE (Boletín Oficial del Estado), 10 December 2013. Available online: https://www.boe.es/boe/dias/2013/12/10/pdfs/BOE-A-2013-12886.pdf (accessed on 26 July 2022). (In Spanish)
34. LGE (General Law of Education). Ley General de Educación. BOE (Boletín Oficial del Estado), 6 August 1970. Available online: https://www.boe.es/boe/dias/1970/08/06/pdfs/A12525-12546.pdf (accessed on 26 July 2022). (In Spanish)
35. Cidral, W.A.; Oliveira, T.; Di Felice, M.; Aparicio, M. E-learning success determinants: Brazilian empirical study. Comput. Educ. 2018, 122, 273–290.
36. Klobas, J.E.; McGill, T.J. The role of involvement in learning management system success. J. Comput. High. Educ. 2010, 22, 114–134.
37. Eom, S.B.; Wen, H.J.; Ashill, N. The determinants of students' perceived learning outcomes and satisfaction in university online education: An empirical investigation. Decis. Sci. J. Innov. Educ. 2006, 4, 215–235.
38. Davis, F.D. Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Q. 1989, 13, 319–340.
39. Abdullah, F.; Ward, R.; Ahmed, E. Investigating the influence of the most commonly used external variables of TAM on students' Perceived Ease of Use (PEOU) and Perceived Usefulness (PU) of e-portfolios. Comput. Hum. Behav. 2016, 63, 75–90.
40. Cyert, R.M.; March, J.G. A Behavioral Theory of the Firm; M.E. Sharpe: Englewood Cliffs, NJ, USA, 1963; Volume 2.
41. Davis, F.D.; Bagozzi, R.P.; Warshaw, P.R. User acceptance of computer technology: A comparison of two theoretical models. Manag. Sci. 1989, 35, 982–1003.
42. Venkatesh, V.; Davis, F.D. A theoretical extension of the technology acceptance model: Four longitudinal field studies. Manag. Sci. 2000, 46, 186–204.
43. Venkatesh, V.; Bala, H. Technology acceptance model 3 and a research agenda on interventions. Decis. Sci. 2008, 39, 273–315.
44. Teo, T. Development and validation of the E-learning Acceptance Measure (ElAM). Internet High. Educ. 2010, 13, 148–152.
45. Shen, D.; Cho, M.-H.; Tsai, C.-L.; Marra, R. Unpacking online learning experiences: Online learning self-efficacy and learning satisfaction. Internet High. Educ. 2013, 19, 10–17.
46. Ithriah, S.A.; Ridwandono, D.; Suryanto, T.L.M. Online Learning Self-Efficacy: The Role in E-Learning Success. J. Phys. Conf. Ser. 2019, 1569, 022053.
47. Kerzic, D. Academic student satisfaction and perceived performance in the e-learning environment during the COVID-19 pandemic: Evidence across ten countries. PLoS ONE 2021, 16, e0258807.
48. Sun, P.C.; Tsai, R.J.; Finger, G.; Chen, Y.Y.; Yeh, D. What drives a successful e-Learning: An empirical investigation of the critical factors influencing learner satisfaction. Comput. Educ. 2008, 50, 1183–1202.
49. Gray, J.A.; DiLoreto, M. The effects of student engagement, student satisfaction, and perceived learning in online learning environments. Int. J. Educ. Leadersh. Prep. 2016, 11, n1. Available online: https://files.eric.ed.gov/fulltext/EJ1103654.pdf (accessed on 26 July 2022).
50. Marks, R.B.; Sibley, S.D.; Arbaugh, J.B. A structural equation model of predictors for effective online learning. J. Manag. Educ. 2005, 29, 531–563.
51. León, S.P.; Augusto-Landa, J.M.; García-Martínez, I. Moderating Factors in University Students' Self-Evaluation for Sustainability. Sustainability 2021, 13, 4199.
52. Ganji Arjenaki, B. Surveying the quality of electronic tests in the student satisfaction. Educ. Strateg. Med. Sci. 2017, 10, 180–188.
53. Arlien, K.M. Community College Faculty Members' Perceptions of Creating Digital Content to Enhance Online Instructor Social Presence. Ph.D. Dissertation, University of North Dakota, UND Scholarly Commons, Grand Forks, ND, USA, 2016. Available online: https://commons.und.edu/theses/1862 (accessed on 26 July 2022).
54. Ong, C.-S.; Lai, J.-Y.; Wang, Y.-S. Factors affecting engineers' acceptance of asynchronous e-learning systems in high-tech companies. Inf. Manag. 2004, 41, 795–804.
55. Sukendro, S.; Habibi, A.; Khaeruddin, K.; Indrayana, B.; Syahruddin, S.; Makadada, F.A.; Hakim, H. Using an extended Technology Acceptance Model to understand students' use of e-learning during COVID-19: Indonesian sport science education context. Heliyon 2020, 6, e05410.
56. Triandis, H.C. Values, attitudes, and interpersonal behavior. In Nebraska Symposium on Motivation; Howe, H.E., Page, M., Eds.; University of Nebraska Press: Lincoln, NE, USA, 1979; pp. 195–259.
57. Lee, Y.-C. An empirical investigation into factors influencing the adoption of an e-learning system. Online Inf. Rev. 2006, 30, 517–541.
58. Fishbein, M.; Ajzen, I. Belief, Attitude, Intentions and Behavior: An Introduction to Theory and Research; Addison-Wesley Pub. Co.: Reading, MA, USA, 1975.
59. Van Raaij, E.M.; Schepers, J.J.L. The acceptance and use of a virtual learning environment in China. Comput. Educ. 2008, 50, 838–852.
60. Hsu, C.-L.; Lu, H.-P. Why do people play on-line games? An extended TAM with social influences and flow experience. Inf. Manag. 2004, 41, 853–868.
61. Deci, E.L.; Ryan, R.M. The 'what' and the 'why' of goal pursuits: Human needs and the self-determination of behavior. Psychol. Inq. 2000, 11, 227–268.
62. Farrell, J.; Saloner, G. Standardization, compatibility, and innovation. RAND J. Econ. 1985, 16, 70–83.
63. Katz, M.L.; Shapiro, C. Network externalities, competition, and compatibility. Am. Econ. Rev. 1985, 75, 424–440.
64. Van den Ende, J.; Wijnberg, N.; Vogels, R.; Kerstens, M. Organizing innovative projects to interact with market dynamics: A coevolutionary approach. Eur. Manag. J. 2003, 21, 273–284.
65. Bandura, A. Self-efficacy: Toward a unifying theory of behavioral change. Psychol. Rev. 1977, 84, 191–215.
66. Kim, B.G.; Park, S.C.; Lee, K.J. A structural equation modeling of the Internet acceptance in Korea. Electron. Commer. Res. Appl. 2007, 6, 425–432.
67. Ong, C.S.; Lai, J.Y. Gender differences in perceptions and relationships among dominants of e-learning acceptance. Comput. Hum. Behav. 2006, 22, 816–829.
68. Pituch, K.A.; Lee, Y.-K. The influence of system characteristics on e-learning use. Comput. Educ. 2006, 47, 222–244.
69. Barbeite, F.G.; Weiss, E.M. Computer self-efficacy and anxiety scales for an Internet sample: Testing measurement equivalence of existing measures and development of new scales. Comput. Hum. Behav. 2004, 20, 1–15.
70. Wang, W.T.; Wang, C.C. An empirical study of instructor adoption of web-based learning systems. Comput. Educ. 2009, 53, 761–774.
71. Roca, J.C.; Chiu, C.-M.; Martínez, F.J. Understanding e-learning continuance intention: An extension of the technology acceptance model. Int. J. Hum. Comput. 2006, 64, 683–696.
72. Petter, S.; McLean, E.R. A meta-analytic assessment of the DeLone and McLean IS success model: An examination of IS success at the individual level. Inf. Manag. 2009, 46, 159–166.
73. Cakır, R.; Solak, E. Attitude of Turkish EFL learners towards e-learning through TAM Model. Procedia Soc. Behav. Sci. 2015, 176, 596–601.
74. Mohammadi, H. Investigating users' perspectives on e-learning: An integration of TAM and IS success model. Comput. Hum. Behav. 2015, 45, 359–374.
75. Ramírez-Correa, P.E.; Arenas-Gaitán, J.; Rondán-Cataluña, F.J. Gender and acceptance of e-learning: A multi-group analysis based on a structural equation model among college students in Chile and Spain. PLoS ONE 2015, 10, e014046.
76. Saade, R.; Nebebe, F.; Tan, W. Viability of the "technology acceptance model" in multimedia learning environments: A comparative study. Interdiscip. J. E-Ski. Lifelong Learn. 2007, 3, 175–184.
77. Zhang, S.; Zhao, J.; Tan, W. Extending TAM for online learning systems: An intrinsic motivation perspective. Tsinghua Sci. Technol. 2008, 13, 312–317.
78. Stoel, L.; Lee, K.H. Modeling the effect of experience on student acceptance of Web-based courseware. Internet Res. 2003, 13, 364–374.
79. Lee, M.K.O.; Cheung, C.M.K.; Chen, Z. Acceptance of Internet-based learning medium: The role of extrinsic and intrinsic motivation. Inf. Manag. 2005, 42, 1095–1104.
80. Ngai, E.W.T.; Poon, J.K.L.; Chan, Y.H.C. Empirical examination of the adoption of WebCT using TAM. Comput. Educ. 2007, 48, 250–267.
81. Liu, S.-H.; Liao, H.-L.; Pratt, J.A. Impact of media richness and flow on e-learning technology acceptance. Comput. Educ. 2009, 52, 599–607.
82. Ndubisi, N.O. Factors of online learning adoption: A comparative juxtaposition of the theory of planned behaviour and the technology acceptance model. Int. J. E-Learn. 2006, 5, 571–591.
83. Lee, Y.-C. The role of perceived resources in online learning adoption. Comput. Educ. 2008, 50, 1423–1438.
84. Roca, J.C.; Gagné, M. Understanding e-learning continuance intention in the workplace: A self-determination theory perspective. Comput. Hum. Behav. 2008, 24, 1585–1604.
85. Chatzoglou, P.D.; Sarigiannidis, L.; Vraimaki, E.; Diamantidis, A. Investigating Greek employees' intention to use web-based training. Comput. Educ. 2009, 53, 877–889.
86. Lee, B.-C.; Yoon, J.-O.; Lee, I. Learners' acceptance of e-learning in South Korea: Theories and results. Comput. Educ. 2009, 53, 1320–1329.
87. Teo, T. Examining the influence of subjective norm and facilitating conditions on the intention to use technology among pre-service teachers: A structural equation modeling of an extended technology acceptance model. Asia Pac. Educ. Rev. 2010, 11, 253–262.
88. Li, S.; Zhang, J.; Yu, C.; Chen, L. Rethinking distance tutoring in e-learning environments: A study of the priority of roles and competencies of open university tutors in China. Int. Rev. Res. Open Distance Learn. 2017, 18, 189–212.
89. Davis, H.C.; Fill, K. Embedding blended learning in a university's teaching culture: Experiences and reflections. Br. J. Educ. Technol. 2007, 38, 817–828.
90. Hiltz, S.R.; Coppola, N.; Rotter, N.; Turoff, M.; Benbunan-Fich, R. Measuring the importance of collaborative learning for the effectiveness of ALN: A multimeasure, multi-method approach. J. Asynchronous Learn. Netw. 2000, 4, 103–125.
91. Salmon, G. E-Moderating: The Key to Online Teaching and Learning, 2nd ed.; Routledge: London, UK, 2004.
92. Nambiar, D. The impact of online learning during COVID-19: Students' and teachers' perspective. Int. J. Indian Psychol. 2020, 8, 783–793. Available online: https://ijip.in/articles/the-impact-of-online-learning-during-covid-19-students-and-teachers-perspective/ (accessed on 26 July 2022).
93. Pérez-Jorge, D.; Rodríguez-Jiménez, M.C.; Ariño-Mateo, E.; Barragán-Medero, F. The effect of COVID-19 in university tutoring models. Sustainability 2020, 12, 8631.
94. Hair, J.F.; Risher, J.J.; Sarstedt, M.; Ringle, C.M. When to use and how to report the results of PLS-SEM. Eur. Bus. Rev. 2019, 31, 2–24.
95. Lowry, P.B.; Gaskin, J. Partial least squares (PLS) structural equation modeling (SEM) for building and testing behavioral causal theory: When to choose it and how to use it. IEEE Trans. Prof. Commun. 2014, 57, 123–146.
96. Hair, J.F., Jr.; Anderson, R.E.; Tatham, R.L.; Black, W.C. Multivariate Data Analysis, 5th ed.; Prentice-Hall: Upper Saddle River, NJ, USA, 1998.
97. Hair, J.F., Jr.; Hult, G.T.M.; Ringle, C.; Sarstedt, M. A Primer on Partial Least Squares Structural Equation Modeling (PLS-SEM), 2nd ed.; Sage Publications: London, UK, 2017.
98. Sarstedt, M.; Ringle, C.M.; Hair, J. Partial Least Squares Structural Equation Modeling. In Handbook of Market Research; Springer: Berlin/Heidelberg, Germany, 2017.
99. Diamantopoulos, A.; Winklhofer, H. Index Construction with Formative Indicators: An Alternative to Scale Development. J. Mark. Res. 2001, 38, 269–277.
100. Coltman, T.; Devinney, T.M.; Midgley, D.F.; Venaik, S. Formative versus reflective measurement models: Two applications of formative measurement. J. Bus. Res. 2008, 61, 1250–1262.
101. Hair, J.F.; Black, W.C.; Babin, B.J.; Anderson, R.E. Multivariate Data Analysis: A Global Perspective, 7th ed.; Pearson Education, Prentice-Hall: Upper Saddle River, NJ, USA, 2010.
102. Urbach, N.; Ahlemann, F. Structural equation modeling in information systems research using Partial Least Squares. J. Inf. Technol. Theory Appl. 2010, 11, 5–40. Available online: https://aisel.aisnet.org/jitta/vol11/iss2/2 (accessed on 26 July 2022).
103. Fornell, C.; Larcker, D.F. Evaluating structural equation models with unobservable variables and measurement error. J. Mark. Res. 1981, 18, 39–50.
104. Henseler, J.; Ringle, C.; Sarstedt, M. A New Criterion for Assessing Discriminant Validity in Variance-based Structural Equation Modeling. J. Acad. Mark. Sci. 2015, 43, 115–135.
105. Hair, J.; Ringle, C.; Sarstedt, M. PLS-SEM: Indeed a Silver Bullet. J. Mark. Theory Pract. 2011, 19, 139–151.
106. Wolf, E.J.; Harrington, K.L.; Clark, S.L.; Miller, M.W. Sample size requirements for Structural Equation Models: An Evaluation of Power, Bias and Solution Propriety. Educ. Psychol. Meas. 2013, 73, 913–934.
107. MacCallum, R.C.; Widaman, K.F.; Zhang, S.; Hong, S. Sample size in factor analysis. Psychol. Methods 1999, 4, 84–99.
108. Gagné, P.; Hancock, G.R. Measurement model quality, sample size, and solution propriety in confirmatory factor models. Multivar. Behav. Res. 2006, 41, 65–83.
109. Teo, T.; Luan, W.S.; Sing, C.C. A cross-cultural examination of the intention to use technology between Singaporean and Malaysian pre-service teachers: An application of the Technology Acceptance Model (TAM). J. Educ. Technol. Soc. 2008, 11, 265–280.
110. Hair, J.F.; Sarstedt, M.; Ringle, C.M.; Gudergan, S.P. Advanced Issues in Partial Least Squares Structural Equation Modeling (PLS-SEM); Sage: Thousand Oaks, CA, USA, 2018.
111. Hooper, D.; Coughlan, J.; Mullen, M.R. Structural equation modelling: Guidelines for determining model fit. Electron. J. Bus. Res. Methods 2008, 6, 53–60.
112. Lohmöller, J.B. Predictive vs. structural modeling: Pls vs. ml. In Latent Variable Path Modeling with Partial Least Squares; Physica-Verlag: Heidelberg, Germany, 1989; pp. 199–226.
113. Hu, L.-T.; Bentler, P.M. Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Struct. Equ. Modeling 1999, 6, 1–55.
114. Henseler, J.; Sarstedt, M. Goodness-of-Fit Indices for Partial Least Squares Path Modeling. Comput. Stat. 2013, 28, 565–580.
115. Tenenhaus, M.; Vinzi, V.E.; Chatelin, Y.M.; Lauro, C. PLS Path Modeling. Comput. Stat. Data Anal. 2005, 48, 159–205.
116. Wetzels, M.; Odekerken-Schroder, G.; Van Oppen, C. Using PLS path modeling for assessing hierarchical construct models: Guidelines and empirical illustration. MIS Q. 2009, 33, 177–195.
117. United Nations. Sustainable Development Goals; UN: New York, NY, USA, 2022. Available online: https://sdgs.un.org/es/goals (accessed on 26 July 2022).
118. Al-Gahtani, S.S. Empirical Investigation of e-Learning Acceptance and Assimilation: A Structural Equation Model. Appl. Comput. Inform. 2016, 12, 27–50.
119. Aparicio, M.; Bacao, F.; Oliveira, T. An e-Learning Theoretical Framework. Educ. Technol. Soc. 2016, 19, 292–307.
120. Seta, H.B.; Wati, T.; Muliawati, A.; Hidayanto, A.N. E-learning success model: An extension of DeLone & McLean IS' success model. Indones. J. Electr. Eng. Inform. 2018, 6, 281–291.
Figure 1. The research model (3S-T model).
Figure 2. PLS-SEM model evaluation (adapted from [98]).
Figure 3. Structural model path coefficients.
Table 1. Respondents’ profile.
Characteristic | Category | Number | Percentage (%)
Gender | Male | 123 | 56.7
Gender | Female | 94 | 43.3
Age | 11–14 | 101 | 47
Age | 14–16 | 75 | 36
Age | 16–18 | 41 | 19
Education Level | 1 ESO | 71 | 33
Education Level | 2 ESO | 34 | 16
Education Level | 3 ESO | 35 | 16
Education Level | 4 ESO | 40 | 18
Education Level | 1 Bachelor | 20 | 9
Education Level | 2 Bachelor | 17 | 8
Computer Skills | Yes | 145 | 66.8
Computer Skills | No | 38 | 17.5
Computer Skills | No answer | 34 | 15.7
Table 2. Measures of internal consistency, reliability, and validity.
Construct | Cronbach's Alpha | rho_A | Composite Reliability | Average Variance Extracted (AVE)
OSE | 0.779 | 0.790 | 0.849 | 0.530
PE | 0.874 | 0.874 | 0.922 | 0.799
PEU | 0.888 | 0.894 | 0.930 | 0.817
SEQ | 0.583 | 0.636 | 0.773 | 0.539
NE | 0.783 | 0.808 | 0.872 | 0.696
S | 0.791 | 0.821 | 0.856 | 0.548
SSA | 0.831 | 0.832 | 0.899 | 0.748
Student Academic Performance (SAP) | 0.807 | 0.814 | 0.912 | 0.838
Student Satisfaction (SS) | 0.787 | 0.801 | 0.875 | 0.701
Subjective Norm (SubNorm) | 0.719 | 0.721 | 0.877 | 0.781
SQ | 0.828 | 0.832 | 0.897 | 0.743
TQ | 0.893 | 0.901 | 0.915 | 0.608
EUS | 0.832 | 0.842 | 0.874 | 0.501
E | 0.846 | 0.853 | 0.885 | 0.526
IQ | 0.704 | 0.775 | 0.833 | 0.630
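The last two columns of Table 2 are closed-form functions of the standardized outer loadings reported in Table A3. A minimal sketch (function names are ours, not the software's) that recovers the PE row from its three loadings:

```python
def ave(loadings):
    """Average Variance Extracted: mean of the squared standardized loadings."""
    return sum(l * l for l in loadings) / len(loadings)

def composite_reliability(loadings):
    """Composite reliability: (sum of loadings)^2 divided by itself plus the
    summed indicator error variances (1 - loading^2)."""
    s = sum(loadings)
    err = sum(1.0 - l * l for l in loadings)
    return s * s / (s * s + err)

pe_loadings = [0.898, 0.876, 0.907]  # PE1-PE3 from Table A3
# ave(pe_loadings) is about 0.799 and composite_reliability(pe_loadings)
# about 0.922, matching the PE row of Table 2 to rounding.
```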
Table 3. Fornell–Larcker discriminant validity correlation matrix.
Construct | E | EUS | IQ | OSE | PE | PEU | S | SAP | SN | SQ | SS | SSA | SubNorm | SystQ | TQ
E | 0.71
EUS | 0.70 | 0.73
IQ | 0.66 | 0.60 | 0.79
OSE | 0.61 | 0.54 | 0.51 | 0.73
PE | 0.72 | 0.57 | 0.58 | 0.47 | 0.89
PEU | 0.79 | 0.62 | 0.63 | 0.49 | 0.65 | 0.90
S | 0.58 | 0.46 | 0.54 | 0.51 | 0.48 | 0.51 | 0.73
SAP | 0.35 | 0.33 | 0.33 | 0.34 | 0.30 | 0.24 | 0.27 | 0.83
SN | 0.53 | 0.60 | 0.49 | 0.49 | 0.43 | 0.45 | 0.45 | 0.22 | 0.74
SQ | 0.72 | 0.70 | 0.57 | 0.47 | 0.60 | 0.70 | 0.46 | 0.36 | 0.47 | 0.86
SS | 0.62 | 0.60 | 0.46 | 0.44 | 0.49 | 0.62 | 0.36 | 0.28 | 0.46 | 0.68 | 0.92
SSA | 0.76 | 0.62 | 0.61 | 0.48 | 0.72 | 0.74 | 0.54 | 0.34 | 0.46 | 0.69 | 0.66 | 0.84
SubNorm | 0.63 | 0.61 | 0.56 | 0.47 | 0.52 | 0.58 | 0.54 | 0.36 | 0.49 | 0.60 | 0.48 | 0.58 | 0.88
SystQ | 0.58 | 0.47 | 0.55 | 0.51 | 0.44 | 0.51 | 0.58 | 0.30 | 0.40 | 0.50 | 0.38 | 0.53 | 0.48 | 0.86
TQ | 0.59 | 0.63 | 0.51 | 0.50 | 0.50 | 0.53 | 0.52 | 0.32 | 0.55 | 0.55 | 0.48 | 0.52 | 0.61 | 0.47 | 0.78
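In a Fornell–Larcker matrix the diagonal holds the square root of each construct's AVE, and discriminant validity requires that this value exceed the construct's correlations with every other construct. A sketch of the check (function name is ours), using PE as the example: sqrt(0.799) ≈ 0.89, the PE diagonal entry in Table 3.

```python
import math

def fornell_larcker_ok(ave_value, other_correlations):
    """Fornell-Larcker criterion [103]: sqrt(AVE) must exceed the absolute
    correlations with all other constructs."""
    return math.sqrt(ave_value) > max(abs(r) for r in other_correlations)

# PE: AVE = 0.799 (Table 2); correlations with the other 14 constructs,
# transcribed from the PE row and column of Table 3.
pe_correlations = [0.72, 0.57, 0.58, 0.47, 0.65, 0.48, 0.30, 0.43,
                   0.60, 0.49, 0.72, 0.52, 0.44, 0.50]
# fornell_larcker_ok(0.799, pe_correlations) -> True
```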
Table 4. Heterotrait–Monotrait ratio (HTMT) correlation matrix.
Construct | E | EUS | IQ | OSE | PE | PEU | S | SAP | SN | SQ | SS | SSA | SubNorm | SystQ
E
EUS | 0.83
IQ | 0.84 | 0.76
OSE | 0.73 | 0.64 | 0.68
PE | 0.84 | 0.66 | 0.71 | 0.55
PEU | 0.91 | 0.71 | 0.77 | 0.57 | 0.73
S | 0.78 | 0.64 | 0.85 | 0.73 | 0.64 | 0.68
SAP | 0.41 | 0.39 | 0.45 | 0.44 | 0.35 | 0.29 | 0.38
SN | 0.64 | 0.72 | 0.64 | 0.61 | 0.50 | 0.51 | 0.64 | 0.29
SQ | 0.86 | 0.82 | 0.73 | 0.56 | 0.70 | 0.82 | 0.64 | 0.45 | 0.57
SS | 0.75 | 0.72 | 0.58 | 0.53 | 0.58 | 0.72 | 0.50 | 0.36 | 0.55 | 0.82
SSA | 0.92 | 0.74 | 0.79 | 0.57 | 0.87 | 0.87 | 0.74 | 0.43 | 0.55 | 0.85 | 0.82
SubNorm | 0.81 | 0.78 | 0.79 | 0.60 | 0.66 | 0.73 | 0.83 | 0.49 | 0.64 | 0.77 | 0.64 | 0.75
SystQ | 0.70 | 0.55 | 0.73 | 0.62 | 0.52 | 0.59 | 0.86 | 0.38 | 0.50 | 0.60 | 0.46 | 0.66 | 0.63
TQ | 0.66 | 0.71 | 0.64 | 0.59 | 0.55 | 0.56 | 0.72 | 0.38 | 0.63 | 0.62 | 0.54 | 0.59 | 0.75 | 0.53
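HTMT [104] compares the average heterotrait-heteromethod (between-construct) item correlation with the geometric mean of the average monotrait-heteromethod (within-construct) item correlations. The item-level correlation matrix is not reported in the paper, so the numbers below are purely illustrative; only the formula is standard.

```python
def htmt(hetero, mono_x, mono_y):
    """Heterotrait-Monotrait ratio: mean between-construct item correlation
    over the geometric mean of the mean within-construct item correlations."""
    mean = lambda xs: sum(xs) / len(xs)
    return mean(hetero) / (mean(mono_x) * mean(mono_y)) ** 0.5

# Hypothetical item correlations for two 2-item constructs: four cross
# correlations of 0.50 and one within-construct correlation of 0.70 each.
ratio = htmt([0.50, 0.50, 0.50, 0.50], [0.70], [0.70])  # about 0.714
```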
Table 5. Results of the hypothesis testing and path analysis.
H | Path | β-Coefficient (O) | Sample Mean (M) | Standard Deviation | T Statistics | p-Value | Supported (p < 0.05)
H1a-1 | SN → EUS | 0.106 | 0.105 | 0.063 | 1.685 | 0.093 | No
H1a-2 | SN → PEU | 0.201 | 0.199 | 0.059 | 3.420 | 0.001 | Yes
H1a-3 | SN → PE | 0.058 | 0.051 | 0.073 | 0.794 | 0.428 | No
H1b-1 | NE → PEU | −0.073 | −0.072 | 0.045 | 1.640 | 0.102 | No
H1b-2 | NE → EUS | 0.018 | 0.022 | 0.049 | 0.374 | 0.709 | No
H1b-3 | NE → PE | 0.031 | 0.034 | 0.056 | 0.549 | 0.583 | No
H2a-1 | OSE → EUS | 0.211 | 0.207 | 0.059 | 3.607 | 0.000 | Yes
H2a-2 | OSE → PE | 0.008 | 0.006 | 0.071 | 0.107 | 0.915 | No
H2a-3 | OSE → PEU | −0.020 | −0.027 | 0.063 | 0.317 | 0.751 | No
H2b-1 | S → E | 0.604 | 0.608 | 0.044 | 13.840 | 0.000 | Yes
H2c-1 | E → SAP | 0.349 | 0.349 | 0.057 | 6.099 | 0.000 | Yes
H3a-1 | SQ → EUS | 0.122 | 0.125 | 0.059 | 2.059 | 0.040 | Yes
H3a-2 | SQ → PE | −0.046 | −0.044 | 0.067 | 0.696 | 0.487 | No
H3b-1 | SEQ → EUS | 0.091 | 0.092 | 0.063 | 1.448 | 0.148 | No
H3b-2 | SEQ → PE | 0.056 | 0.055 | 0.064 | 0.877 | 0.381 | No
H3b-3 | SEQ → PEU | 0.020 | 0.020 | 0.052 | 0.379 | 0.705 | No
H3c-1 | IQ → EUS | 0.255 | 0.257 | 0.052 | 4.919 | 0.000 | Yes
H3c-2 | IQ → PE | 0.161 | 0.163 | 0.067 | 2.392 | 0.017 | Yes
H3c-3 | IQ → PEU | 0.180 | 0.180 | 0.066 | 2.752 | 0.006 | Yes
H4a-1 | EUS → PEU | 0.631 | 0.635 | 0.072 | 8.778 | 0.000 | Yes
H4a-2 | EUS → PE | 0.558 | 0.559 | 0.074 | 7.586 | 0.000 | Yes
H4a-3 | EUS → SS | 0.304 | 0.309 | 0.084 | 3.639 | 0.000 | Yes
H4b-1 | PE → SS | 0.310 | 0.309 | 0.067 | 4.666 | 0.000 | Yes
H4c-1 | PEU → SS | 0.298 | 0.297 | 0.069 | 4.309 | 0.000 | Yes
H5a-1 | TQ → EUS | 0.120 | 0.123 | 0.065 | 1.855 | 0.064 | No
H5a-2 | TQ → S | 0.547 | 0.553 | 0.050 | 10.849 | 0.000 | Yes
H6a-1 | SS → SAP | 0.662 | 0.662 | 0.040 | 16.391 | 0.000 | Yes
H6a-2 | SS → SSA | 0.309 | 0.309 | 0.059 | 5.248 | 0.000 | Yes
H6b-1 | SAP → SSA | 0.260 | 0.262 | 0.069 | 3.772 | 0.000 | Yes
Table 6. Q2 results, showing the predictive relevance.
Construct | Q² | Predictive Relevance
E | 0.185 | Moderate
EUS | 0.301 | Moderate
PE | 0.420 | Large
PEU | 0.520 | Large
S | 0.157 | Moderate
SAP | 0.359 | Large
SS | 0.463 | Large
SSA | 0.467 | Large
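Q² is obtained by blindfolding: data points are systematically omitted, predicted back from the model, and Q² = 1 − SSE/SSO. The study's blindfolding output itself is not published; the sketch below (function names are ours) shows the formula and the 0.02/0.15/0.35 rule-of-thumb bands with which the labels in Table 6 are consistent.

```python
def q_squared(sse, sso):
    """Stone-Geisser Q2 from blindfolding: 1 - SSE/SSO, where SSE is the sum
    of squared prediction errors for the omitted points and SSO the sum of
    squared omitted (mean-centered) values."""
    return 1.0 - sse / sso

def relevance_label(q2):
    """Rule-of-thumb bands (0.02 / 0.15 / 0.35), consistent with Table 6."""
    if q2 < 0.02:
        return "None"
    if q2 < 0.15:
        return "Small"
    if q2 < 0.35:
        return "Moderate"
    return "Large"

# relevance_label(0.520) -> "Large" (PEU); relevance_label(0.185) -> "Moderate" (E)
```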
