Article

The Factors Affecting Acceptance of E-Learning: A Machine Learning Algorithm Approach

Academy of Journalism and Communication, 36 Xuan Thuy Street, Cau Giay District, Hanoi 123105, Vietnam
* Authors to whom correspondence should be addressed.
Educ. Sci. 2020, 10(10), 270; https://doi.org/10.3390/educsci10100270
Submission received: 27 August 2020 / Revised: 20 September 2020 / Accepted: 22 September 2020 / Published: 30 September 2020
(This article belongs to the Section Technology Enhanced Education)

Abstract

The Covid-19 pandemic is affecting all areas of life, including the training activities of universities around the world. Online learning has therefore become an effective method at the present time and is used by many universities. However, not all training institutions have sufficient conditions, resources, and experience to carry out online learning, especially in under-resourced developing countries. Building traditional (face-to-face), e-learning, or blended learning courses under such limited conditions while still meeting the needs of students is therefore a problem faced by many universities today. To solve this problem, we propose a method of evaluating the influence of factors on the e-learning system, clarifying their importance and prioritizing investment in each factor based on the K-means clustering algorithm, using data from students who have participated in the system. At the same time, we propose a model to support students in choosing one of the learning methods, traditional, e-learning, or blended learning, that is suitable for their skills and abilities. Data classification with the multilayer perceptron (MP), random forest (RF), K-nearest neighbor (KNN), support vector machine (SVM), and naïve Bayes (NB) algorithms is applied to find the best-fitting model. The experiment was conducted on 679 data samples collected from 303 students studying at the Academy of Journalism and Communication (AJC), Vietnam. With the proposed method, the experiments show the different effects of infrastructure, teachers, and courses, as well as of the features within these factors. At the same time, the accuracy of the prediction results, which help students choose an appropriate learning method, reaches 81.52%.

1. Introduction

The Covid-19 pandemic is spreading rapidly around the world and shows no signs of stopping. With its different modes of infection, there were nearly 20 million infections and more than 0.73 million deaths at the time of writing [1]. Moreover, the pandemic has affected all fields of economics and politics, and the social life of people around the world [2]. The social distancing and isolation required to avoid the spread of the disease have formed new habits and ways of living through the internet. Education is one of the fields most strongly influenced, as it is essential for many people in every country. Students are limited in their ability to go to school and to participate in collective activities, so learning, research, and exchange activities are mainly carried out through e-learning or blended learning [3,4]. This situation has pushed universities toward an effective form of training that combines e-learning with traditional teaching [4], depending on the content and resources that each university can provide for e-learning courses at different levels. Activities well suited to education online include interactive activities, data storage, and assessment, which can be conducted on an internet platform so that students and faculty members can complete their tasks without being face to face [3,5].
However, e-learning has limitations: teachers and students must have certain technological and pedagogical skills, knowledge, and experience to complete their course, and the infrastructure of the e-learning system needs to be synchronous, efficient, and secure in order to support teacher-student interaction, store data, and evaluate the effectiveness of the course [6]. The question of how to build a good e-learning system therefore arises. One interesting approach is to evaluate systems based on their influencing factors [7,8].
Katerina Kabassi et al. [9] evaluated a learning management system for blended learning in Greek higher education using six influencing factors: student, teacher, design, courses, environment, and technology. They then used statistical methods to evaluate and analyze their system. Cheol-Rim Choi et al. [18] assessed the quality of the multimedia content of an e-learning system using four groups of factors: system quality, information quality, service quality, and attractiveness, evaluating the system by calculating the weights of the attributes with the analytic network process (ANP) technique. However, the weights initialized for the attributes need to be calculated carefully because they affect the evaluation results. Said A. Salloum et al. [5] used four groups of factors affecting the efficiency of e-learning systems: technology innovativeness, knowledge sharing, quality, and trust. Data samples were collected from 251 students and statistical methods were used to evaluate the system.
The method of sampling data from students of e-learning systems using affecting factors is therefore scientifically grounded. In this research, we propose a method of evaluating e-learning systems based on student feedback using an explicit factor model (EFM) combined with a clustering algorithm. We suggest separating the factors into four groups, namely student, teacher, infrastructure, and course, comprising sixteen features. The data clustering method used is the K-means machine learning algorithm, which divides the data into clusters. Thereby, it can help managers evaluate and compare the factors affecting the system in order to make appropriate decisions and build a better system.
E-learning and blended learning systems are used by universities to provide students with both online and traditional learning methods. Courses may be offered in online, traditional, or blended form depending on the institution. Students often need help to determine which form is suitable for their skills and abilities. This is usually handled by assistants or professors; however, it requires considerable resources and does not always work out well. One interesting approach is to use data mining methods [11,12], which apply an analytic process to available data in order to provide information about a future problem.
With the development of machine learning techniques and artificial intelligence, we propose a method to help students choose an appropriate learning method based on classification algorithms, called the studying method selection (SMS) model. In this method, training data is labelled and collected from previous students. Using the analytical SMS model, learners receive results that they can use to select the best form of study. The most important aspect of this approach is how to define a set of features for the system. This problem has also concerned other researchers, as in the following cases:
Alan Y.K. Chan et al. [8] evaluated online courses based on four main factors: online courses, learning effectiveness, evaluation methods, and evaluation results. The students' interaction information, organized by these factors, is stored in a database and used for system analysis and evaluation. The evaluation is based on statistical methods aimed at recognizing interactions between system factors. Said A. Salloum et al. [5] surveyed more than 280 students with a behavior intention model to predict students' intention to use e-learning, with five related hypotheses. Sujit Kumar Basak et al. [13] identified the main elements of e-learning to provide a framework for assessing its uneven and difficult implementation in initial and continuing educational institutions, thereby providing information to both learners and administrators. Their research is based on eight factors: institutional, technical, resource, training, competency, infrastructural, attitudinal, and social integration. F. Martin et al. [8] built a framework to evaluate the quality of an online course with seven factors: institutional support, technology infrastructure, course design, learner and instructor support, course assessment and evaluation, learning effectiveness, and faculty and student satisfaction.
A training system is also built for effective learners. Therefore, students' feedback and assessment of the training system and of the institution's capacity are useful: they help in planning policy and improving the quality of the system. Following the above research, the influencing factors are identified based on the purpose of the system, and they also affect the results, methodology, and type of evaluation. For this reason, we suggest four factors, comprising sixteen features, that affect e-learning systems. Based on these features, a classification model is built to forecast the learning method, helping students choose a method suitable for their abilities and conditions.
In summary, we suggest four affecting factors comprising sixteen features. Based on them, data samples collected from students who have taken the courses are used to solve two different problems: first, to evaluate the influence of the factors and features based on clustering with the K-means algorithm; second, to build a prediction model using classification techniques to support students in choosing an appropriate learning method. The experiment was carried out on data samples collected from 303 students studying at the Academy of Journalism and Communication (AJC), Vietnam, with the prediction of the studying method selection model (SMSM) reaching 81.52% accuracy. This is meaningful for AJC in enhancing training quality, especially in providing training in journalism, publication, and political courses at any location in the nation.

2. Literature Review

2.1. Affecting Factors

The results of evaluating e-learning systems are used by students and management to develop the system effectively. Methods based on the constituent factors of a system have been applied in previous research and have shown their feasibility.
According to Table 1, the factors usually used to assess e-learning and blended learning systems are: the student and their activity on the system [7,8,9,14]; the teacher and their influence [8,9]; the course, its content, and the influence of its features [8,9,15,16]; and the infrastructure, with its components, structure, influence, and capabilities [7,9,13,15,17,18]. Therefore, we also suggest a set of affecting factors aimed at evaluating the e-learning system. These features are separated into four groups, namely students, teacher, infrastructure, and course, as shown in Figure 1 below:
With the features illustrated in Table 2, we implemented the EFM model, illustrated in Figure 2, to evaluate the influence of each factor. Based on the measurement results, managers can improve the e-learning system. The features are also used by the SMSM, which aims to provide information for students through the model in Figure 3.

2.2. Artificial Intelligence Methodologies in Education

Nowadays, artificial intelligence methodologies applied in education are more and more popular [4,19,20,21,22]. Ammar Almasri et al. [21] used educational data mining (EDM) techniques to build an understandable model that helps decision makers enhance the performance of students. The novelty of their approach is the combination of clustering and classification on data from 1062 graduates over 13 academic years. The MLP, NB, J48, and ETM classification algorithms were used for the experiment, and accuracy, precision, recall, and F1-score were used to evaluate the model, with the highest classification result reaching 96.96%. Irene Pasina et al. [22] used hierarchical clustering algorithms to group students based on their learning style; their experiment was limited to engineering education in Saudi Arabia.
This shows that data mining approaches are increasingly applied in education research. Popular algorithms found in the literature include the K-means clustering algorithm [23], while classification algorithms such as decision tree (DT), RF, and extreme gradient boosting (EGB) are applied to build understandable models [15,21,24,25].

3. Evaluating the E-Learning System Using a Clustering Method

Given the factors that affect the e-learning system, we propose a method to calculate the influence of each factor based on a clustering method, as shown in Figure 2 below:
In Figure 2, we use the model to extract the influence of the factors on the e-learning system, thereby providing a basis for investing in and upgrading the system. The raw data obtained from the students' assessment of the system is transformed into a dataset $D$ consisting of rows in vector form $V(f_1, f_2, \ldots, f_N)$, where $N$ is the number of features. The dataset $D$ is first used to find the number $K$ of clusters into which $D$ can be divided; $D$ is then partitioned into $K$ clusters by the K-means algorithm. We thus obtain clusters $C_1, C_2, \ldots, C_K$, where each cluster $C_i$ consists of vectors $V^i(f^i_1, f^i_2, \ldots, f^i_N)$ with $N$ the number of features. Then, for each factor, we compute a vector from its features using Equation (1). This vector shows the influence of the factor and helps management make strategic decisions about which factors or features to invest in. Our suggested procedure is given in Algorithm 1 below:
Algorithm 1. Calculating the vector of influence factors.
Initialization: $D$, a dataset with $N$ features.
1. Get $K$, the number of clusters [26].
2. Cluster $D$ into $K$ clusters $(D_1, D_2, \ldots, D_K)$ by K-means.
3. Calculate the vector $\mathit{Factor}_p(f_{p1}, f_{p2}, \ldots, f_{pK})$ by Equation (1).
Return $\mathit{Factor}_p(f_{p1}, f_{p2}, \ldots, f_{pK})$.

Equation (1) calculates the factor values:

$$\mathit{Factor}_p(f_{p1}, f_{p2}, \ldots, f_{pK}) = \left( \frac{\sum_{j=1}^{M} f^p_{j1}}{M},\; \frac{\sum_{j=1}^{M} f^p_{j2}}{M},\; \ldots,\; \frac{\sum_{j=1}^{M} f^p_{jK}}{M} \right), \quad (1)$$

where $f^p_{jk} \in C_k$ and $M$ is the number of features of factor $p$.
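As a concrete illustration of Algorithm 1 and Equation (1), the following minimal Python sketch uses scikit-learn's KMeans in place of the WEKA implementation used in our experiments; the factor-to-feature grouping follows Table 2 and K = 5 follows Section 5.2, while the column names, function names, and pandas pipeline are illustrative assumptions rather than the exact implementation.

```python
# Minimal sketch of Algorithm 1 / Equation (1) with scikit-learn's KMeans.
# The paper's experiments were run in WEKA; the factor-to-feature grouping
# follows Table 2, but column names and K = 5 are illustrative assumptions.
import pandas as pd
from sklearn.cluster import KMeans

# Hypothetical grouping of the sixteen features into the four factors.
FACTORS = {
    "student":        ["SF1", "SF2", "SF3", "SF4", "SF5", "SF6"],
    "teacher":        ["TF7"],
    "infrastructure": ["IF8", "IF9", "IF10", "IF11"],
    "course":         ["CF12", "CF13", "CF14", "CF15"],
}

def factor_vectors(data: pd.DataFrame, k: int = 5, seed: int = 0):
    """Cluster the transformed responses into k clusters and return, for each
    factor p, the vector Factor_p(f_p1, ..., f_pK) of Equation (1): the mean
    of that factor's feature values inside each cluster C_1, ..., C_K."""
    features = [col for cols in FACTORS.values() for col in cols]
    labels = KMeans(n_clusters=k, n_init=10,
                    random_state=seed).fit_predict(data[features])
    vectors = {}
    for factor, cols in FACTORS.items():
        # Average the M features of factor p over the members of each cluster.
        vectors[factor] = [data.loc[labels == c, cols].to_numpy().mean()
                           for c in range(k)]
    return vectors
```

Each returned vector contains one averaged value per cluster for the corresponding factor; comparing these vectors is the kind of factor-level analysis discussed in Section 5.3.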

4. Studying Method Selection Model (SMSM)

4.1. Classifying Model

The SMSM is divided into two phases. First, in the training phase, labelled training data is used to train the model and obtain the classifier (CF). The labelled data is collected from students who have finished their course and evaluated the system according to the suggested features. After data transformation, we obtain a dataset (DT) whose labels are the learning methods used for training. Ten-fold cross-validation (10CV) is applied to the RF, NB, SVM, MP, and KNN algorithms in order to find the best one. With the selected algorithm, we build a classifier that predicts the learning method for a student from their input information.
Second, in the monitoring phase, after a student's data is transformed, the model uses the classifier (CF) to predict a label. The prediction result is a learning-method label: e-learning, traditional, or blended learning. It helps the student choose an appropriate studying method for a course via the SMSM. This result also helps institutions better support students, diversify their training forms, and improve their training quality.
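A minimal sketch of the two phases follows, using scikit-learn as a stand-in for the WEKA setup described in Section 5.2; the feature columns and the label column "CF16" follow Table 3, while the function names and data handling are illustrative assumptions.

```python
# Minimal sketch of the two SMSM phases with scikit-learn (our runs used WEKA
# with default parameters). Feature columns and the label column "CF16" follow
# Table 3; function names and data handling are assumptions for illustration.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def training_phase(dt: pd.DataFrame):
    """Phase 1: compare the five candidate algorithms with 10-fold
    cross-validation and return the best-performing classifier (CF)."""
    X, y = dt.drop(columns=["CF16"]), dt["CF16"]   # labels: E, T, B
    candidates = {
        "RF":  RandomForestClassifier(),
        "NB":  GaussianNB(),
        "KNN": KNeighborsClassifier(),
        "SVM": SVC(),
        "MP":  MLPClassifier(max_iter=1000),
    }
    scores = {name: cross_val_score(clf, X, y, cv=10).mean()
              for name, clf in candidates.items()}
    best = max(scores, key=scores.get)             # RF on our data (Table 5)
    return candidates[best].fit(X, y), scores

def monitoring_phase(cf, student_record: pd.DataFrame):
    """Phase 2: predict the learning-method label (e-learning, traditional,
    or blended) for a new, already transformed student record."""
    return cf.predict(student_record)
```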

4.2. Algorithms and Evaluation Measurements

4.2.1. Algorithms

For the first phase, some commonly used classification algorithms are DT, RF, EGB, and MP [15,21,24,25]. We therefore experiment with five algorithms, RF, KNN, SVM, NB, and MP, with the aim of choosing the appropriate algorithm for the second phase.

4.2.2. Evaluation Measurements

For classification models, measurements commonly used to evaluate the classification results include accuracy, precision, recall, F1-score, and ROC, as in [15,21] and our previous research [24,25]. We therefore also use accuracy, precision, recall, F1-score, and ROC to evaluate the classification model results.
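For illustration, these measurements can be computed as in the hedged sketch below; scikit-learn is assumed, and macro averaging over the three labels is our assumption, since the averaging scheme is not specified here.

```python
# Hedged sketch of the evaluation measurements (accuracy, precision, recall,
# F1-score, ROC AUC) with scikit-learn; macro averaging over the three labels
# is an assumption, and roc_auc_score needs per-class probability estimates.
from sklearn.metrics import (accuracy_score, precision_score, recall_score,
                             f1_score, roc_auc_score)

def evaluate(y_true, y_pred, y_proba=None):
    metrics = {
        "accuracy":  accuracy_score(y_true, y_pred),
        "precision": precision_score(y_true, y_pred, average="macro"),
        "recall":    recall_score(y_true, y_pred, average="macro"),
        "f1":        f1_score(y_true, y_pred, average="macro"),
    }
    if y_proba is not None:
        # Multi-class ROC AUC over per-class probability estimates.
        metrics["roc_auc"] = roc_auc_score(y_true, y_proba,
                                           multi_class="ovr", average="macro")
    return metrics
```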

5. Experiment and Results

5.1. Data Collection

We used the survey sampling method to collect data from students who have attended the online and offline training system at AJC. The student information, organized by features and their values, is illustrated in Table 3 below:
We collected data samples from 303 first- to fourth-year students studying at AJC. The number of collected data samples is 679, based on three different training levels: knowledge, basic, and field. AJC provides students with forms of learning such as e-learning, traditional, and blended learning; this forms the data label used for the classification model.
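A small sketch of the data transformation implied by Table 3 is given below; the pandas column names, the pipeline itself, and the handling of the grade scale (Table 3 lists five grades A-E against four codes) are assumptions for illustration only.

```python
# Sketch of the data transformation implied by Table 3: replacing categorical
# survey answers with the numeric codes used for clustering and classification.
# Column names, the pandas pipeline, and the D/E grade merge are assumptions.
import pandas as pd

MAPPINGS = {
    "SF2":  {"Male": 1, "Female": 0},
    "SF3":  {"City": 3, "Province": 2, "Other": 1},
    "SF5":  {"Journalistic and publication": 3, "Political": 2, "Other": 1},
    "SF6":  {"A": 4, "B": 3, "C": 2, "D": 1, "E": 1},  # Table 3 lists 4 codes for 5 grades
    "CF14": {"Knowledge": 1, "Basic": 2, "Field": 3},
    "CF16": {"e-learning": "E", "traditional": "T", "blended": "B"},
}

def transform(raw: pd.DataFrame) -> pd.DataFrame:
    """Map categorical answers to the Table 3 codes; Likert items (TF7,
    IF8-IF11, CF12, CF13, CF15) and SF1/SF4 are already numeric and kept."""
    out = raw.copy()
    for col, mapping in MAPPINGS.items():
        out[col] = out[col].map(mapping)
    return out
```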

5.2. Environment

We used Google Forms to collect data samples from AJC students. The Waikato Environment for Knowledge Analysis (WEKA) tool, version 3.7.8, was used for clustering and classification [27]. The default parameters of the machine learning algorithms in this WEKA version were applied during the experiments. The number of clusters K = 5 was found and applied for the EFM.
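The number of clusters in our experiments was selected with the data-depth method of [26]; as a rough, non-authoritative stand-in, the sketch below sweeps K with the common silhouette criterion instead.

```python
# Rough stand-in for the cluster-number selection step: our experiments use
# the data-depth method of [26], while this sketch sweeps K with the common
# silhouette score instead (an assumption, not the procedure of [26]).
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

def choose_k(X, k_min=2, k_max=10, seed=0):
    scores = {}
    for k in range(k_min, k_max + 1):
        labels = KMeans(n_clusters=k, n_init=10,
                        random_state=seed).fit_predict(X)
        scores[k] = silhouette_score(X, labels)
    return max(scores, key=scores.get), scores
```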

5.3. Experiment on EFM Result

After collecting and transforming the data, we used the method of selecting the number of clusters from [26] to obtain the appropriate number of clusters, K = 5. The EFM experiment results are illustrated in Table 4 with the five clusters C1, C2, C3, C4, and C5 below:
From the clustering results in Table 4, we can see that the blended learning method was chosen by the majority of students at all levels of training. This reflects the effectiveness of blended learning: combining traditional and online methods for the whole course or for each part of the course helps learners complete their course more effectively. The results in Table 4 also show that journalism and publication students with better capacities choose e-learning for their courses. From the factors affecting the system, the manager can also see the influence level of the three factors, teacher, infrastructure, and course, measured on the same scale and calculated by Equation (1), as shown in Figure 4 below:
Figure 4 shows that the infrastructure factor affects the e-learning system the most. This is also consistent with practice, because if the infrastructure is not good it will adversely affect the teaching process, learning psychology, and the effectiveness of course evaluation. In addition, the manager can see how each feature influences the different factors. For the infrastructure factor, the specific level of influence of each feature is shown in Figure 5 below:
Figure 5 also shows that the quality of the internet connection, hardware devices, and student learning equipment (IF9) affects the e-learning system the most, followed by the LMS and information storage services (IF10), then the preparation of lectures and material (IF8), and finally the policy environment set by the universities (IF11). This result provides information for managers to make policies and strategies to build better system infrastructure or additional features.
Performing the same experiment on the features of the course factor, we obtained the results in Figure 6 below:
As shown in Figure 6, the CF12 feature has a higher value than the rest. This shows that the suitableness of the content and nature of the course for the system is evaluated highest. It also shows that, depending on the nature and content requirements of the subject, the course can be built according to the appropriate method at each training level (knowledge, basic, field).

5.4. Experiment on SMSM Result

In this section, we experimented on classification with five algorithms: NB, KNN, SVM, MP, and RF. The 10CV method was also applied in the experimental process. The model results are evaluated by accuracy, precision, recall, F1-score, and ROC, as shown in Table 5 below:
As the results in Table 5 show, the RF algorithm gives the highest results: 81.52% accuracy, 0.812 F1-score, and 0.911 ROC. Given the per-algorithm accuracy shown in Figure 7, we used RF to predict the learning-method label from the monitoring data input in the second phase. The classification performance differs when different algorithms are used, on different datasets, or with different numbers of features of the same dataset. For example, in the research in [15], the highest accuracy with the RF algorithm, obtained on the dataset that that study built, is 75.5%.

5.5. Discussion

Each learning service system is oriented toward learners; therefore, information received from learners is most important in building and managing the system. Depending on their purpose, studies use appropriate methods based on the datasets they collect. The data clustering method in this study is based on the proposed set of features and is used to make explicit and evaluate the effects of the system components. The experimental results show that the infrastructure factor had the greatest impact on the system. In addition, each feature's influence is clearly shown by its value. The order of precedence of the features differs: hardware and internet quality is considered the most important, followed by the LMS and storage service and the preparation of lectures and material, with the regulation environment last. The next most important factor is the teacher; this is an indispensable factor because the reputation, orientation, skills, and knowledge of the teacher help students get the best results in the learning and research process. Accordingly, the experimental results show that the teacher has a greater influence than the course factor. This information is really useful, especially for universities in the transition phase of their training model.
In the classification problem, the characteristics of the data always affect the classification results. Depending on the purpose of the problem, different factors affecting different systems are proposed; therefore, the number and values of the attributes are among the factors determining the model results. The question of how best to complete a course is always posed to each student. It depends on many influencing factors, but students do not always have this information. Therefore, the SMSM model, which predicts the suitable method for a subject that a student is preparing to choose with a prediction accuracy of 81.52%, has practical meaning. It gives students confidence in their choice based on the data of previous learners. Moreover, the model also helps training institutions reduce the burden of consulting and answering students' questions about related learning issues.
Using the same RF classification algorithm but on a different dataset, the research in [15] aims to understand the factors that influence the university learning process. The number of times students accessed the resources made available on VLE platforms was identified as a key factor affecting student performance.
The prediction accuracy of our model is higher than that of the research in [15]; the comparison is shown in Table 6. The SMSM model gave 3.32% higher accuracy than the best result reported in [15].

6. Conclusions and Future Work

Building a strategy for, operating, and managing online learning systems is one of the most important problems for each university. Choosing and properly assessing the influence of factors will help managers develop appropriate strategies. In our research, we have proposed four influencing factors comprising sixteen features and used data clustering algorithms to analyze the factors and features, helping managers enhance training quality and systems effectively. At the same time, we used data classification techniques to build a model that supports students in choosing the learning method, e-learning, traditional, or blended learning, suitable for their course content and learning capacity, using the datasets of previous learners, with an accuracy of up to 81.52%.
The research is still limited because the experimental data came only from AJC, and the classification algorithms used default parameters. Therefore, in future studies, we will add more features, use feature-selection evaluation methods, apply advanced techniques such as deep learning, and optimize the parameters and algorithms for better results.
Despite these limitations, the research has proposed a set of features for enhancing effective e-learning in order to build and improve the quality of the education system using currently popular techniques. At the same time, it provides a basis for improving the performance of training institutions through the successful application of artificial intelligence.

Author Contributions

Conceptualization, methodology, D.-N.L., H.-Q.L. and T.-H.V.; data collection and analysis: D.-N.L., H.-Q.L. and T.-H.V.; performed the experiments: D.-N.L.; writing—review and editing, D.-N.L., H.-Q.L. and T.-H.V. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. World Health Organization. Coronavirus Disease Situation Report. Available online: https://www.who.int/docs/default-source/coronaviruse/situation-reports/20200811-covid-19-sitrep-204.pdf?sfvrsn=1f4383dd_2 (accessed on 11 August 2020).
  2. Chakraborty, I.; Maity, P. Science of the Total Environment COVID-19 outbreak: Migration, effects on society, global environment and prevention. Sci. Total Environ. 2020, 728, 138882. [Google Scholar] [CrossRef] [PubMed]
  3. Rapanta, C.; Botturi, L.; Goodyear, P.; Guàrdia, L. Online University Teaching During and after the Covid-19 Crisis: Refocusing Teacher Presence and Learning Activity. Postdigit. Sci. Educ. 2020, 1–23. [Google Scholar] [CrossRef]
  4. Leahy, S.M.; Holland, C.; Ward, F. The digital frontier: Envisioning future technologies impact on the classroom. Futures 2019, 113, 102422. [Google Scholar] [CrossRef]
  5. Salloum, S.A. Factors Affecting Students. In Acceptance of E-Learning System in Higher Education Using UTAUT and Structural Equation Modeling Approaches; Springer International Publishing: Berlin/Heidelberg, Germany, 2019; Volume 2. [Google Scholar]
  6. Saini, D.K.; Al-mamri, M.R.S. Investigation of Technological Tools used in Education System in Oman. Soc. Sci. Humanit. Open 2019, 1, 100003. [Google Scholar]
  7. Keskin, S.; Yurdugül, H. Factors Affecting Students’ Preferences for Online and Blended Learning: Motivational vs. Cognitive. Eur. J. Open Distance E-Learn. 2019, 22, 73–86. [Google Scholar] [CrossRef] [Green Version]
  8. Martin, F.; Kumar, S. Frameworks for Assessing and Evaluating e-Learning Courses and Programs; Springer: Cham, Switzerland, 2018; pp. 271–280. [Google Scholar]
  9. Kabassi, K.; Dragonas, I.; Ntouzevits, A.; Pomonis, T.; Papastathopoulos, G. Evaluating a learning management system for blended learning in Greek higher education. Springerplus 2016, 5, 101. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  10. Ghosh, S. An Approach to Building a Learning Management System that Emphasizes on Incorporating Individualized Dissemination with Intelligent Tutoring. J. Inst. Eng. Ser. B 2016, 98, 1–8. [Google Scholar] [CrossRef]
  11. Hasan, H.M.R. Machine Learning Algorithm for Student’s Performance Prediction. In Proceedings of the 2019 10th International Conference on Computing, Communication and Networking Technologies (ICCCNT), Kanpur, India, 6–8 July 2019; pp. 1–7. [Google Scholar]
  12. Fachantidis, A.; Taylor, M.; Vlahavas, I. Learning to Teach Reinforcement Learning Agents. Mach. Learn. Knowl. Extr. 2017, 1, 21–42. [Google Scholar] [CrossRef] [Green Version]
  13. Basak, S.K.; Wotto, M.; Bélanger, P. Factors affecting to e-learning in continuing education in Africa: A review of literature. Int. J. Eng. Sci. Manag. Res. 2017, 4, 86–97. [Google Scholar]
  14. Monem, A.A.; Shaalan, K. Exploring Students. In Acceptance of E-Learning Through the Development of a Comprehensive Technology Acceptance Model; IEEE Access: Piscataway, NJ, USA, 2019; Volume 7, pp. 128445–128462. [Google Scholar] [CrossRef]
  15. Rivas, A.; González-Briones, A.; Hernández, G.; Prieto, J.; Chamoso, P. Artificial neural network analysis of the academic performance of students in virtual learning environments. Neurocomputing 2020. [Google Scholar] [CrossRef]
  16. Gamage, D.; Fernando, S. Factors affecting to effective eLearning: Learners Perspective. Sci. Res. J. 2014, 2, 42–48. [Google Scholar]
  17. Azizi, S.M.; Khatony, A. Investigating factors affecting on medical sciences students’ intention to adopt mobile learning. BMC Med Educ. 2019, 19, 381. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  18. Choi, C.; Jeong, H. Quality Evaluation for Multimedia Contents of E-Learning Systems Using the ANP Approach on High Speed Network. Multimed. Tools Appl. 2019, 78, 28853–28875. [Google Scholar] [CrossRef]
  19. Kose, U.; Koc, D. Artificial Intelligence Applications in Distance Education; IGI Global: Herseys, PA, USA, 2014. [Google Scholar]
  20. Vitolina, I. E-inclusion Modeling for Blended e-learning Course. Procedia Procedia Comput. Sci. 2015, 65, 744–753. [Google Scholar] [CrossRef] [Green Version]
  21. Almasri, A.; Alkhawaldeh, R.S. Clustering-Based EMT Model for Predicting Student Performance. Arab. J. Sci. Eng. 2020. [Google Scholar] [CrossRef]
  22. Pasina, I.; Bayram, G.; Labib, W.; Abdelhadi, A.; Nurunnabi, M. Clustering students into groups according to their learning style. MethodsX 2019, 6, 2189–2197. [Google Scholar] [CrossRef] [PubMed]
  23. Jin, X.; Han, J. K-Medoids Clustering. Encyclopedia of Machine Learning and Data Mining; Springer: Boston, MA, USA, 2017; pp. 563–565. [Google Scholar] [CrossRef]
  24. Lu, D.; Nguyen, D.; Nguyen, T. Vehicle mode and driving activity detection based on analyzing sensor data of smartphones. Sensors 2018, 18, 1036. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  25. Nguyen, T.; Lu, D.; Nguyen, D.; Nguyen, H. Abnormal Driving Pattern Detection Using Smartphone Sensors. Electronics 2020, 9, 217. [Google Scholar] [CrossRef] [Green Version]
  26. Patil, C.; Baidari, I. Estimating the Optimal Number of Clusters k in a Dataset Using Data Depth. Data Sci. Eng. 2019, 4, 132–140. [Google Scholar] [CrossRef] [Green Version]
  27. Bouckaert, R.R.; Frank, E.; Kirkby, R.; Reutemann, P.; Seewald, A.; Scuse, D. WEKA Manual for Version 3-7-8; 21 January 2013; Available online: https://statweb.stanford.edu/~lpekelis/13_datafest_cart/WekaManual-3-7-8.pdf (accessed on 11 August 2020).
Figure 1. Influence factors.
Figure 2. The Explicit Factor Model by Clustering Algorithm (EFM).
Figure 3. The Studying Method Selection Model (SMSM).
Figure 4. Factors Affecting the System.
Figure 5. The Influence Level of Infrastructure Factors’ Features.
Figure 6. The Influence of Course Factors’ Features.
Figure 7. The Accuracy of Algorithms.
Table 1. Evaluation of Blended Learning Based on Affecting Factors.

Research | Factors/Attributes/Features | Method/Measurement/Tools
Alberto Rivas et al. [15] (2020) | Course viewed, Course module viewed, Discussion viewed, Course module instance list viewed, The status of the submission has been viewed, A submission has been submitted, Summary of the questionnaire attempt, Attempted visualized questionnaire, The attempt has begun, Attempt sent, Grade user report viewed, Course user report seen | Decision tree/Random forest/Extreme gradient boosting/Multilayer perceptron
Said A. Salloum et al. [14] (2019) | Content quality, information quality, and system quality, Self-efficacy, Subjective norm, enjoyment, accessibility, playfulness, ease of use, usefulness, Attitude towards use, behavioral intention | Statistics/Convergent validity, Discriminant validity measurement/SmartPLS 3.2.7 tool
Seyyed Mohsen Azizi et al. [17] (2019) | Attitude, Subject Norm, Behavioral Control, attitudinal beliefs, normative beliefs, and control beliefs, Ease of Use, Usefulness | Statistics/Convergent validity, Discriminant validity measurement/SmartPLS 3.2.7 tool
Sinan Keskin et al. [7] (2019) | Learning Environment Preferences, Self-directed learning, Learner control, Motivation towards e-learning, Rehearsal, Organization, Elaboration, Test anxiety, Task value, Self-efficacy | Non-linear correlation between the variables/OVERALS analysis tool in SPSS 21
Cheol-Rim Choi et al. [18] (2019) | System quality, Information quality, Service quality, Attractiveness | Weights of the quality attributes/ANP
F. Martin et al. [8] (2018) | Institutional Support, Technology Infrastructure, Course Design, Learner and Instructor Support, Course Assessment and Evaluation, Learning effectiveness, Faculty and Student Satisfaction | Overview
Sujit Kumar Basak et al. [13] (2017) | Institutional, technical, resource, training, competency, infrastructural, attitudinal and social integration | Technology Acceptance Model/Theory of Planned Behavior
Katerina Kabassi et al. [9] (2016) | Student, Teacher, Design, Courses, Environment, Technology | Statistical
Dilrukshi Gamage et al. [16] (2014) | Technology, Pedagogy, Motivation, Usability, Content/Material, Support for Learners, Assessment, Future Directions, Collaboration, Interactivity | Statistical/SPSS
Alan Y.K. Chan et al. [8] (2003) | Online courses, learning effectiveness, evaluation methods and evaluation results | Web Mining techniques
Table 2. The Features Affecting the E-Learning System.

Category | Feature | Illustration
Student | SF1 | Year of birth
Student | SF2 | Gender
Student | SF3 | Place of residence
Student | SF4 | Year of grade
Student | SF5 | Field of studying
Student | SF6 | Average of outcome
Teacher | TF7 | Influence of teacher
Infrastructure | IF8 | Preparing lecture and material
Infrastructure | IF9 | Hardware, internet quality
Infrastructure | IF10 | Influence of learning management system (LMS) and storage service
Infrastructure | IF11 | Influence of regulation environment
Courses | CF12 | Suitableness of course in e-learning
Courses | CF13 | Effect of e-learning method
Courses | CF14 | Level of course
Courses | CF15 | Influence of assessment
Courses | CF16 | Learning methods
Table 3. Collected Data by Features of Factors.

Category | Feature | Clustering | Classifying
Student | SF1 | Integer | Integer
Student | SF2 | Male; Female | 1, 0
Student | SF3 | City, Province, Other | 3, 2, 1
Student | SF4 | Grade: 1, 2, 3, 4 | 1, 2, 3, 4
Student | SF5 | Journalistic and publication, Political, Other | 3, 2, 1
Student | SF6 | A, B, C, D, E | 4, 3, 2, 1
Teacher | TF7 | Five-point Likert scale | 1, …, 5
Infrastructure | IF8 | Five-point Likert scale | 1, …, 5
Infrastructure | IF9 | Five-point Likert scale | 1, …, 5
Infrastructure | IF10 | Five-point Likert scale | 1, …, 5
Infrastructure | IF11 | Five-point Likert scale | 1, …, 5
Courses | CF12 | Five-point Likert scale | 1, …, 5
Courses | CF13 | Five-point Likert scale | 1, …, 5
Courses | CF14 | Knowledge, Basic, Field | 1, 2, 3
Courses | CF15 | Five-point Likert scale | 1, …, 5
Courses | CF16 (label) | e-learning, traditional, blended | E, T, B
Table 4. The Experiment on EFM Results.

Feature | General | C1 (17%) | C2 (18%) | C3 (13%) | C4 (36%) | C5 (16%)
SF1 | 19.7387 | 20 | 19.1282 | 20.85 | 20.1207 | 19.087
SF2 | Female | Female | Female | Female | Female | Female
SF3 | City | Other | City | City | City | City
SF4 | Grade 2 | Grade 2 | Grade 1 | Grade 3 | Grade 2 | Grade 1
SF5 | Other | Political | Other | Political | Journal | Other
SF6 | B | C | C | B | B | B
TF7 | 3.9899 | 3.8056 | 4.1026 | 3.5 | 4.1207 | 4.087
IF8 | 4.2663 | 3.9167 | 4.3333 | 4.2 | 4.3448 | 4.413
IF9 | 4.608 | 4.4444 | 4.7949 | 4.5 | 4.6034 | 4.6304
IF10 | 4.2513 | 4.0833 | 4.4872 | 4.2 | 4.2069 | 4.2609
IF11 | 3.7273 | 3.3889 | 4.0256 | 3.45 | 3.7539 | 3.8261
CF12 | 4.1608 | 3.8056 | 4.4615 | 4.15 | 4.0517 | 4.3261
CF13 | 3.598 | 3.4444 | 3.6667 | 3.5 | 3.8103 | 3.4348
CF14 | Basic | Basic | Knowledge | Field | Basic | Knowledge
CF15 | 3.9045 | 3.5833 | 4.1026 | 3.8 | 3.9483 | 3.9783
Method | Blended | Blended | Blended | Blended | E-Learning | Blended
Table 5. Experiment on SMSM Results.

Algorithm | Accuracy | Precision | Recall | F1 | ROC
NB | 53.25% | 0.491 | 0.532 | 0.497 | 0.607
KNN | 77.45% | 0.772 | 0.774 | 0.773 | 0.796
SVM | 58.64% | 0.759 | 0.586 | 0.44 | 0.508
MP | 63.92% | 0.63 | 0.639 | 0.629 | 0.719
RF | 81.52% | 0.813 | 0.815 | 0.812 | 0.911
Table 6. The Comparison Results.

Model/System | Features | Algorithm | Accuracy
System in [15] | 5 types of events/39 features | DT | 70.5%
System in [15] | 5 types of events/39 features | RF | 78.1%
System in [15] | 5 types of events/39 features | EGB | 76.5%
System in [15] | 5 types of events/39 features | MP | 78.2%
SMSM | 4 factors/16 features | RF | 81.52%

