Article

The Effectiveness of an Online Language Course during the COVID-19 Pandemic: Students’ Perceptions and Hard Evidence

by Irina O. Shcherbakova 1,*, Svetlana N. Kucherenko 2 and Natalia B. Smolskaia 3
1 Department of English for Navigation and Radio Communication, “Maritime Academy” Institute, Admiral Makarov State University of Maritime and Inland Shipping, 198035 Saint Petersburg, Russia
2 Foreign Languages Department, HSE University, 190121 Saint Petersburg, Russia
3 Graduate School of Applied Linguistics, Translation and Interpreting, Peter the Great St. Petersburg Polytechnic University, 195251 Saint Petersburg, Russia
* Author to whom correspondence should be addressed.
Educ. Sci. 2023, 13(2), 124; https://doi.org/10.3390/educsci13020124
Submission received: 8 November 2022 / Revised: 9 January 2023 / Accepted: 18 January 2023 / Published: 25 January 2023

Abstract
The aim of this study is to analyse the delivery of an online English language course taught at the Admiral Makarov State University of Maritime and Inland Shipping during the COVID-19 pandemic, with the ultimate purpose of identifying those features which could be applied to a prospective blended English as a foreign language (EFL) course. The rationale behind this research is the necessity to redesign the existing language education programme at the University, which is outdated and far from digital. By applying the Mashaw model of evaluating the effectiveness of an online course, the authors analyse the results of a survey carried out among a focus group of 22 students from the radio engineering department of the Admiral Makarov State University of Maritime and Inland Shipping. In the end, the authors identify points for further consideration in the process of developing a prospective blended EFL course.

1. Introduction

At the beginning of 2020, the COVID-19 pandemic gradually engulfed human society and all its institutions. For education, the lockdown resulted in the closure of educational systems [1]. At the time, anxiety, extreme stress, lack of motivation, lack of psychological and physical resources, and an all-encompassing existential crisis affected educational stakeholders more than any other professional factor. Over time, the COVID-19-induced existential crisis gave way to strategic behaviour. Most universities across the world moved their classes and their pedagogy online virtually overnight [2].
The Admiral Makarov State University of Maritime and Inland Shipping was one of many Russian universities which started delivering most of its courses online. The English language classes were no exception. The EFL disciplines include General English, Specialised Marine English, Business English, and English for Marine Engineers. The total number of EFL courses on the University’s distance learning platform, FARVATER, now exceeds 20, including special courses preparing students for the final English language exam.
All the content of the EFL courses was transferred to a digital format, including learning materials and reading, listening, writing, and speaking assignments. The courses were designed and organised in such a way that cadets could perform all the necessary tasks online to develop their communicative skills. Several separate courses were designed for cadets majoring in navigation to enable them to have English classes during their on-water placement aboard sea vessels. The cadets were able not only to perform all the tasks which they usually do in class but also to communicate with their English instructors in the same way as they usually do in class, and if necessary even more intensively, through audio and video conferencing, forums, and other elements of online training. The administration and the Department of English for Navigation and Communication have made tremendous efforts to ensure that the e-courses have all the necessary materials and features for a full-fledged language learning programme.
Now that humanity has begun to find its way in the COVID-19 world, post-pandemic educational priorities are different. Universities across the world are looking back at their year-long online education programmes with the purpose of assessing their impact on the quality of education. The most common finding emerging from such studies is that a move to blended learning, based on the combination of traditional face-to-face activities with online elements, is viewed as most effective by students, teachers, and university administration alike [3,4,5].
Pre-pandemic, there was already acknowledgement at the Admiral Makarov State University of Maritime and Inland Shipping that the traditional English language education model at the university needed a major redesign, as it was still based on the principles of the Soviet language education system, with its focus on the translation method and no space for digital materials and online activities. The management of the university was aware of the benefits that blended learning could provide for the English language programme and had already taken some steps towards a blended English learning programme before the pandemic broke out [6]. COVID-19-induced online education turned out to be a critical turning point, after which the university’s management had no doubts about redesigning the English language programme. Therefore, the management set a clear research task for the University’s English unit: to carry out a detailed analysis of the English language teaching and learning practices during COVID-19-induced online education. Several research projects were launched to ensure that the best online and face-to-face educational practices are taken on board to produce an effective blended learning language course afterwards.
The paper below reports on the findings of one of the projects whose purpose is to reflect on the recent experience of online language education at the university. The key objective of the ongoing research project is to identify the English language online education practices that are worth capitalising on after the COVID-19 pandemic. The current study aims to address the following research questions: (a) How do the students assess their English language online training programme? (b) Which aspects of the programme are perceived by the students as the most effective and the least effective? (c) Do the students’ subjective perceptions correspond to the objective evidence from the final English exam? In particular, we aim to collect a database of students’ perceptions of the online language courses, interpret them, single out the most effective practices through our students’ perceptions, and compare their perceptions with some evidence of their performance in the final language exam.
For the purposes of our paper, we discriminate between the terms “online education” and “blended education”. Online education refers to a learning environment that solely relies on the Internet to organise the teaching and learning process [7], whereas blended education combines practices of face-to-face learning with those of Internet-based learning [8].
The biggest challenge confronting our research is how to tell the most effective components of the language programme from the least effective ones; in other words, how to evaluate the efficiency of online classes. It is obvious that models commonly used to evaluate the effectiveness of a face-to-face class will not work here as online classes lack social interfacing and interaction. Moreover, bracketing out social interaction has a particular impact on students’ behaviour in online education according to the constructivist theory of learning which posits that students’ learning practices are impacted by their personality and the environment for learning [9]. More importantly, traditional methods of evaluating face-to-face effectiveness cannot evaluate such factors central to online classes as an instructor’s ability to engage students or feedback provision [10].

2. Materials and Methods

We carry out our research by applying the model of measuring the effectiveness of an online course developed by Bijan Mashaw [10]. A major advantage of this model is that it has been specifically tailored to online courses. The Mashaw model is based on two presumptions: (a) effectiveness is a multi-dimensional concept and (b) students are at the centre of the teaching and learning process, in line with the student-centred approach [10]. Effectiveness is understood as “the quantity that measures the achievement of a desired objective to produce a goal” [10] (p. 193). Mashaw derives three of the dimensions of effectiveness from the constructivist theory of learning: learning experiences, efforts, and resources. Several other components were added after Mashaw analysed a large body of literature on measuring effectiveness in online and face-to-face education. In the end, Mashaw’s model of measuring the effectiveness of an online course includes six factors: the learning experience, the facilitator, interaction or participation, the mentor’s inspiration, the technology, and hindrances.
Each of the factors is broken down into smaller measurements which are assigned values in different ways: exam results, direct observation, or self-reporting. The application of all three methods gives the most detailed picture, whereas self-reporting alone is the most economical way of measuring effectiveness. As we need quick results for the first stage of the project, we have chosen self-reporting to identify the most and the least effective elements of the online course. The exam results from the final English language examination are used to improve the reliability of the findings.
Recent research into the reliability of student surveys has shown that there are few doubts as to their validity [11], and they are used as one of the tools to assess the effectiveness of the education process [12,13]. Thus, for our purposes, we can also assume that our students’ perceptions are informative enough for evaluation at this stage of our research. We consider students as carriers of values of variables as an estimate of the factors affecting the efficiency of the online language course taught for a year at the university.
Below is a list of smaller factors that we use in our research to measure the six abovementioned core factors of the online course. To measure the effectiveness of learning, Mashaw suggests a list of indicators of learning performance: (a) the degree of understanding of the subject, measured through tests; (b) the degree of change in attitudes; (c) the degree of appreciation for the value of learning; (d) the level of efficiency in applying the material learned; and (e) the level of confidence in handling real-world cases. To measure the effectiveness of the facilitating environment, Mashaw suggests such factors as clear learning objectives and goals, facilitating learning through modular presentation with clear interconnection, and offering students an environment which encourages them to continue learning and explore the subject on their own. To measure the effectiveness of the technology, Mashaw considers such factors as design and usability, content and the organisation of the material, interactivity, and flexibility. To measure the effectiveness of interaction, Mashaw focuses on such factors as the ease of communication with the instructor and/or the group, reasonable response time to questions and discussion, the encouragement of participation, and motivation for being involved. To measure the effectiveness of the instructor’s behaviour, Mashaw evaluates explanations and presentation methods, the design of assignments, their challenge and appropriateness, motivation techniques for critical thinking, timely feedback about the students’ progress, and assistance at the individual level.
To measure hindrance, Mashaw investigates technical frustration/stress, boring or static content or presentations (lectures), dissatisfaction as a result of unexpected inflexibility, discontent as a result of online learning disappointment, unreasonable assignments, tasks which are perceived as impossible, or a lack of guidance or learning assistance.
To measure each of the smaller factors, a list of questions encouraging a participant to self-evaluate a particular factor is drawn up. Evaluation is performed by means of a Likert scale where participants give a score from 1 to 5 to a particular statement or question.
We also base our research on a student-centred approach whose main principles are the focus on students’ needs and interests, students’ control over the content of learning, as well as the choice and evaluation of methods and forms of learning [14]. In this model, the learner is at the centre of the educational process, and the emphasis is placed not only on the effectiveness of learning but also on the learner’s satisfaction with certain forms and methods of learning. This approach to learning as well as the evaluation of the education process organisation is currently actively applied by many progressive Russian universities [15].
The methodology for identifying the most and the least effective components of the online language programme includes several steps: (a) the distribution of the questionnaire (adapted from the Mashaw model) to a focus group of 22 fifth-year students (between 22 and 24 years old, with 3 female and 19 male cadets) from the radio engineering department of our university; (b) the students’ evaluation of each of the questions using a Likert scale; (c) the accumulation of the measurements given by the students and the calculation of the average score; (d) the comparison of the students’ perceptions with evidence from the summative assessment tests administered at the end of the course. The assessment results can also be seen on the course page in FARVATER https://farvater.gumrf.ru/grade/report/grader/index.php?id=1037, accessed on 4 January 2023.
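Steps (b) and (c) above amount to a simple aggregation of Likert scores: each questionnaire item is averaged across the respondents, and the item means are then averaged per factor. A minimal sketch of this aggregation follows; the factor names and all scores are illustrative placeholders, not the study’s raw data.

```python
from statistics import mean

# Hypothetical Likert responses (1-5), grouped by the factor each
# questionnaire item measures. Each inner list holds one item's scores,
# one score per respondent. Illustrative numbers only.
responses = {
    "Course organisation": [
        [4, 3, 5, 3, 4],          # item 1 scores
    ],
    "Design": [
        [5, 4, 4, 5, 4],          # item 1 scores
        [4, 5, 4, 4, 5],          # item 2 scores
    ],
}

def factor_averages(responses):
    """Average each item across respondents, then average item means per factor."""
    result = {}
    for factor, items in responses.items():
        item_means = [mean(scores) for scores in items]
        result[factor] = round(mean(item_means), 2)
    return result

print(factor_averages(responses))  # {'Course organisation': 3.8, 'Design': 4.4}
```

Averaging per item first, then per factor, keeps a factor with many items from being dominated by any single question.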

3. Results

Having adopted the factor system proposed by Mashaw, we grouped the questions from the questionnaire into categories, highlighting the important constituent factors of the developed EL online course. We first calculated the average rating for each of the items (Table 1), and then we obtained an average rating for the categories (Figure 1). Separately, we compared students’ perceptions of learning outcomes in four language skills: listening, reading, writing, and speaking, as well as evidence from the summative assessment tests administered at the end of the course (Figure 2).
Therefore, the factors we assessed are:
  • Course organisation;
  • Content;
  • Motivation;
  • Design;
  • Technical support;
  • Teacher’s availability, interaction and feedback;
  • Students’ expectations of their results.
The results are given in Figure 1.
We deliberately singled out language skills in order to be able to compare students’ assessment of their ability to perform tasks and the real picture obtained from the results of the current, intermediate, and final tests. The data received are shown in Figure 2.

4. Discussion

This section provides our deliberations on the findings as well as further discussion with regard to our ultimate objective of identifying elements of the English language online course worth capitalizing on in the prospective blending learning language programme.
Our findings show that the course is generally well perceived by the students, as the values they have assigned to the efficiency factors vary between 3.11 and 4.3. The factor with the highest average value is the course design (4.3), whereas the factors with the lowest average values are technical support and content (3.11 for each). Surprisingly enough, the teacher’s feedback and interaction factor receives a value of 4.06.
The course design factor concerns the interface of the platform and its usability. The fact that this factor has received the highest value means that the students of the target group enjoy the user-friendly interface of the course. The credit for this aspect goes to the Software Developing Unit of the university, as it has been their job to develop and launch a website for online education. Admittedly, the website FARVATER has been under development for some time and could not be launched in a matter of days, but making it so user-friendly that the students value it above all the other factors is definitely worth noting.
The other factor with a high value of 4.06 is the factor of teachers’ behaviour and interaction. Before carrying out our survey, we had particular negative expectations about this factor as students are generally dissatisfied with the lack of communication and feedback they receive during online or blended classes. Contrary to our expectations, the value for this factor is really high. Apparently, it is the result of the amount of emotional, psychological, personal, and professional investment of the language teachers at our university. Timely feedback, instant replies to students’ queries, and cooperative interaction were achieved by a great degree of personal involvement demonstrated by the teaching staff.
The other factor which has received a low value is the content factor. If the students assign a low value to this factor, it is reasonable to assume that the language instructors have not chosen the content for the course properly. While that might be so, a more reasonable assumption is different. The content of a traditional face-to-face course was moved online. The content of the course is documented in the course syllabus, which is, in its turn, part of the language curriculum. The course syllabus is usually approved by the administration of a university for a long period of time and is not subject to changes unless these changes are initiated by the administration. This means the language instructors have no right to change the content of the course even when it is moved online. The fact that the students have shown only partial satisfaction with the content of the course means that the content of a face-to-face course and the content of an online or blended course should be different, or rather, the content of a face-to-face course should be adapted to the online environment when the course is moved online or becomes blended. The signal to the Language Teaching Unit at the university is quite clear in this case: they should revise the content of the course.
The other factor which has received a similarly low value from the students is technical support. The students had problems accessing the webpage or performing some actions on it. Apparently, the problems experienced by the students depended on the speed of the Internet connection in the place from which they accessed the system or on their computer literacy. Although it may seem surprising in the Internet era, not all students have enough skills to work with web-based educational programmes. It is true that they are digital natives as far as browsing the Internet, using social networks, or communicating online is concerned. There are still a great number of students who use the Internet solely for entertainment and practical purposes (buying tickets or goods, looking up public transport schedules, paying bills, and others). However, the interface of a web-based educational platform might be something that they have never seen. Being forced into circumstances where they have to perform their traditional learning actions—asking questions, providing reactions, accessing learning materials, and doing homework—online might be a little bit of a problem for some of them. Unfortunately, the Technical Support Unit at the University cannot provide prompt support to every student who has technical problems with the educational platforms. Over the course of time, as blended learning covers more university courses and more and more students use online courses as a normal part of their studies, technical problems should fade away.
Now we turn our attention to the factors which have received a relatively low value from the students, i.e., the course organisation, motivation, and learning expectations. It is worth analysing them separately as they belong to different realms. The organisation of the course is the factor largely depending on the syllabus of the course and closely connected with its content. Organisation is about the flexibility of assignments, the opportunity for students to control when to access the course, and the usability of activities. If the students perceive the content as a little dull and heavy, then they will regard the organisation similarly. The message they send by giving a low value to the factor of organisation is quite clear—they want to be autonomous learners. The takeaway point to consider is to organise the course in a different way, probably by introducing more features enabling students to control their studies on their own to a greater extent than now.
Human motivation has a very complex nature [16], thus making it difficult to pinpoint the reasons for assigning a low value to this factor. On the one hand, there are studies with evidence of increased learners’ motivation [17]; on the other hand, there are those which clearly argue that the online environment decreases motivation and learners’ involvement [18]. Apparently, the point to consider is that the motivation factor needs further analysis, and more attention should be given to this aspect when fully implementing the EFL course in the future.
The expectations factor shows what the students expected from the course in terms of their results in speaking, listening, writing, and reading. The exam scores show that the students performed better than expected in the listening tasks but overestimated their abilities in writing and speaking. The students had the lowest results in speaking (with an average score of 3.1), although they considered the speaking assignments quite simple. It is interesting to note that several studies also identify a learning loss in students of different ages [19]. The discrepancy between expected and real results can be explained by the fact that formative assessment was mostly carried out online, whereas summative assessment was performed in a face-to-face format. Term quizzes, presentations, and group discussions took place in circumstances in which a language instructor has few instruments to monitor students’ behaviour. In other words, we should allow for the dishonest behaviour factor in formative assessment, which is mostly eliminated in summative assessment, when the student and their language instructor find themselves in a face-to-face situation with no possibility of reading off a screen or using external sources of ideas and information. Another possible explanation is that the online environment created a false feeling of simplicity in the learning process, as the infinite source of information—the Internet—is always at hand for students [20]. There could be several ways to act on this finding: (a) to introduce more formative assessment points to obtain a more objective picture of students’ academic performance; (b) to install functions allowing for visual observation when students take quizzes or undertake discussions online; (c) to make the content of the course more challenging in general by introducing more creative tasks with no possibility for students to Google the answers, eliminating a false feeling of simplicity.
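The expectation-versus-result comparison above reduces to a per-skill gap between self-reported expectations and exam scores. A minimal sketch, using illustrative values on the same 1–5 scale (only the speaking exam average of 3.1 is taken from the text; the other exam scores are placeholder assumptions):

```python
# Hypothetical per-skill comparison of self-reported expectations
# (Likert-derived) with exam scores. Illustrative values only.
expected = {"listening": 3.7, "reading": 3.9, "writing": 4.4, "speaking": 4.0}
actual   = {"listening": 4.0, "reading": 3.8, "writing": 3.9, "speaking": 3.1}

def discrepancy(expected, actual):
    """Per-skill gap; positive values mean students overestimated themselves."""
    return {skill: round(expected[skill] - actual[skill], 2)
            for skill in expected}

gaps = discrepancy(expected, actual)

# Skills where expectations exceeded results, largest gap first.
overestimated = [skill for skill, gap
                 in sorted(gaps.items(), key=lambda kv: -kv[1])
                 if gap > 0]
print(overestimated)  # ['speaking', 'writing', 'reading']
```

Sorting the positive gaps in descending order surfaces the skill most in need of a methodological rethink first, which with these placeholder numbers is speaking, mirroring the paper’s finding.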
Although the current project does identify the elements of the online course that are worth capitalising on, as stated in the research questions and required by the management of the University, several limitations of our study have to be acknowledged. Firstly, the sample size was small. As this is the first project within a series of research studies initiated by the management of the university, we were allowed to run the project at a particular department, which included 22 cadets. Although this fact may cause some doubts about the generalisability of our findings, the size of our sample does not invalidate them. We aim to scale up our research to a bigger sample; thus, our findings will become more generalisable. Secondly, the analytical tools need further validation, which will be performed at the subsequent stages of the project. The limitations mentioned above still do not write off the value of the study for the purposes of our university. Further research will substantiate the new English language programme at the university to make the graduates more competitive on the world market as competent users of English.

5. Conclusions

Consistent with our research questions, we collected students’ perceptions, assigned a value to each of them, and assessed the efficiency of the online EFL course taught at the Admiral Makarov State University of Maritime and Inland Shipping in 2020/2021. The assessment was performed in order to single out the most beneficial aspects of the course, though the ultimate purpose of our research is to map out practical strategies to transform the existing EFL course into a blended EFL course in the near future. We identified several takeaway points for future consideration. Firstly, the content of the course should be adapted to the new format of teaching, with a greater number of challenging and creative assignments. Increasing the difficulty of a prospective blended course should also reduce the opportunities for academic misconduct. Secondly, the organisation of the course in terms of the types, timing, difficulty, and variety of assignments needs further consideration and apparently a complete redesign to answer students’ needs for autonomous learning. Thirdly, the speaking component of the course obviously needs a different methodological approach, as students performed worse in the summative assessment than they expected. A possible solution is to teach and assess speaking only in a face-to-face context or to increase the number of speaking assignments when the course becomes blended.
Our study gave an insight into the world of the students’ perceptions of the course. The application of the Mashaw model enabled us to translate subjective perceptions into objective values and reflect on the past crisis. A roadmap can now be built at the teaching and policy levels to help the students and teaching staff at the Admiral Makarov State University of Maritime and Inland Shipping cope with any further challenges enforced upon education by the pandemic.

Author Contributions

Conceptualisation, S.N.K.; methodology, S.N.K., N.B.S. and I.O.S.; validation, I.O.S.; investigation, I.O.S.; data curation, N.B.S.; writing—original draft preparation, S.N.K.; writing—review and editing, S.N.K. and N.B.S.; supervision, I.O.S.; project administration, N.B.S. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

The study was conducted according to the guidelines of the Declaration of Helsinki, and approved by the Ethics Commission founded in the Institute of Humanities, Peter the Great St. Petersburg Polytechnic University (protocol code: 5 and date of approval: 26 February).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Misirli, O.; Ergulec, F. Emergency remote teaching during the COVID-19 pandemic: Parents experiences and perspectives. Educ. Inf. Technol. 2021, 26, 6699–6718. [Google Scholar] [CrossRef] [PubMed]
  2. Oyedotun, T.D. Sudden change of pedagogy in education driven by COVID-19: Perspectives and evaluation from a developing country. Res. Glob. 2020, 2, 100029. [Google Scholar] [CrossRef]
  3. Rapanta, C.; Botturi, L.; Goodyear, P.; Guàrdia, L.; Koole, M. Balancing Technology, Pedagogy and the New Normal: Post-pandemic Challenges for Higher Education. Postdigital Sci. Educ. 2021, 3, 715–742. [Google Scholar] [CrossRef]
  4. Munir, H. Reshaping Sustainable University Education in Post-Pandemic World: Lessons Learned from an Empirical Study. Educ. Sci. 2022, 12, 524. [Google Scholar] [CrossRef]
  5. Robson, L.; Gardner, B.; Dommett, E.J. The Post-Pandemic Lecture: Views from Academic Staff across the UK. Educ. Sci. 2022, 12, 123. [Google Scholar] [CrossRef]
  6. Kucherenko, S.; Skripnik, I.; Shcherbakova, I. Blended Learning in Russian Higher Education: A Comparative Analysis of Three Cases. In Proceedings of the 2nd International Scientific and Practical Conference on Digital Economy (ISCDE 2020), Yekaterinburg, Russia, 5–6 November 2020; pp. 146–153. [Google Scholar] [CrossRef]
  7. Horvitz, B.S. N. Dabbagh and B. Bannan-Ritland, Online Learning: Concepts, Strategies, and Application. Educ. Technol. Res. Dev. 2007, 55, 667–669. [Google Scholar] [CrossRef]
  8. Garrison, D.; Kanuka, H. Blended learning: Uncovering its transformative potential in higher education. Internet High. Educ. 2004, 7, 95–105. [Google Scholar] [CrossRef]
  9. Kocadere, S.A.; Ozgen, D. Assessment of Basic Design Course in Terms of Constructivist Learning Theory. Procedia—Soc. Behav. Sci. 2012, 51, 115–119. [Google Scholar] [CrossRef] [Green Version]
  10. Mashaw, B. A Model for Measuring Effectiveness of an Online Course. Decis. Sci. J. Innov. Educ. 2012, 10, 189–221. [Google Scholar] [CrossRef]
  11. van der Scheer, E.A.; Bijlsma, H.J.E.; Glas, C.A.W. Validity and reliability of student perceptions of teaching quality in primary education. Sch. Eff. Sch. Improv. 2018, 30, 30–50. [Google Scholar] [CrossRef] [Green Version]
  12. Ferguson, R.F. Can Student Surveys Measure Teaching Quality? Phi Delta Kappan 2012, 94, 24–28. [Google Scholar] [CrossRef]
  13. Maslakov, S.I.; Tutaeva, V.I.; Kolesnikova, A.V. Research on the Effectiveness of Distance Learning Based on a Survey of Students. 2020. Available online: https://cyberleninka.ru/article/n/issledovanie-effektivnosti-distantsionnogo-obucheniya-na-osnove-anketirovaniya-studentov (accessed on 2 January 2023).
  14. Bruce, W.T. The Student-Centered Curriculum: A Concept in Curriculum Innovation. 1969. Available online: https://files.eric.ed.gov/fulltext/ED032616.pdf (accessed on 2 January 2023).
  15. Vladimirovna, K.O.; Ivanovna, D.A.; Vladimirovna, B.A. Advantages of using a student-centered approach in higher education. Balt. Humanit. J. 2020, 9, 97–100. [Google Scholar]
  16. Kirsch, C.; de Abreu, P.M.E.; Neumann, S.; Wealer, C. Practices and experiences of distant education during the COVID-19 pandemic: The perspectives of six- to sixteen-year-olds from three high-income countries. Int. J. Educ. Res. Open 2021, 2, 100049. [Google Scholar] [CrossRef] [PubMed]
  17. Guay, F.; Chanal, J.; Ratelle, C.F.; Marsh, H.; LaRose, S.; Boivin, M. Intrinsic, identified, and controlled types of motivation for school subjects in young elementary school children. Br. J. Educ. Psychol. 2010, 80, 711–735. [Google Scholar] [CrossRef] [PubMed]
  18. Philipp, N. Impact of COVID-19 Emergency Transition to On-Line Learning on International Students’ Perceptions of Educational Process at Russian University. J. Soc. Stud. Educ. Res. 2020, 11, 270–302. Available online: https://jsser.org/index.php/jsser/article/view/2602 (accessed on 2 January 2023).
  19. Sabates, R.; Carter, E.; Stern, J.M. Using educational transitions to estimate learning loss due to COVID-19 school closures: The case of Complementary Basic Education in Ghana. Int. J. Educ. Dev. 2021, 82, 102377. [Google Scholar] [CrossRef] [PubMed]
  20. Markova, T. Educators’ and students’ perceptions of online distance education before and amid COVID-19: Key concerns and challenges. SHS Web Conf. 2021, 99, 01018. [Google Scholar] [CrossRef]
Figure 1. Effectiveness of EL online course.
Figure 2. Language skills.
Table 1. Average rating of factors to assess the course effectiveness.
| Question Asked | Factor Assessed | Average Rating |
| --- | --- | --- |
| The online format of the course clearly helped me to comprehend and understand the content or topics of the course (1 = strongly disagree, 5 = strongly agree) | Course organisation | 3.54 |
| Listening activities were easy for me in the online format (1 = strongly disagree, 5 = strongly agree) | Students’ expectations of their results | 3.72 |
| Reading activities were easy for me in the online format (1 = strongly disagree, 5 = strongly agree) | Students’ expectations of their results | 3.90 |
| Writing activities were easy for me in the online format (1 = strongly disagree, 5 = strongly agree) | Students’ expectations of their results | 4.45 |
| Speaking activities were easy for me in the online format (1 = strongly disagree, 5 = strongly agree) | Students’ expectations of their results | 4.00 |
| The course objectives were communicated clearly so that I could understand the learning goals (1 = strongly disagree, 5 = strongly agree) | Course organisation | 4.40 |
| Each topic was distinct, yet related to the overall objective of the course (1 = strongly disagree, 5 = strongly agree) | Course organisation | 4.36 |
| The course content and topics inspired me to continue (1 = strongly disagree, 5 = strongly agree) | Content | 3.95 |
| The course challenged and inspired me to find my own answer to problems (1 = strongly disagree, 5 = strongly agree) | Motivation | 3.68 |
| The webpage/interface was designed to be attractive, pleasing and easy to use (1 = strongly disagree, 5 = strongly agree) | Design | 4.30 |
| The variety and novelty of presentation tools positively impacted my learning experience (1 = strongly disagree, 5 = strongly agree) | Content | 4.13 |
| I liked the web-management software used; it made the learning experience easy (1 = strongly disagree, 5 = strongly agree) | Technical support | 4.36 |
| During the course, I could control the learning progress at my own pace and time (1 = strongly disagree, 5 = strongly agree) | Course organisation | 4.09 |
| It was easy to communicate with the instructor and the group (1 = strongly disagree, 5 = strongly agree) | Teacher’s availability, interaction and feedback | 4.68 |
| Questions/comments were answered promptly (1 = strongly disagree, 5 = strongly agree) | Teacher’s availability, interaction and feedback | 4.59 |
| I was encouraged to get involved in group discussions and participate more than average (1 = strongly disagree, 5 = strongly agree) | Motivation | 3.45 |
| The explanations and presentation by the instructor were impressive, inspiring and effective (1 = strongly disagree, 5 = strongly agree) | Teacher’s availability, interaction and feedback | 4.13 |
| The assignments were related to the course, well designed, reasonable, yet challenging (1 = strongly disagree, 5 = strongly agree) | Content | 4.40 |
| Throughout the course, I was motivated to find my own answer to challenges (1 = strongly disagree, 5 = strongly agree) | Motivation | 3.86 |
| Throughout the course, feedback was provided so that I was aware of my learning progress and grade (1 = strongly disagree, 5 = strongly agree) | Teacher’s availability, interaction and feedback | 4.59 |
| The instructor was available and provided appropriate assistance when asked to do so (1 = strongly disagree, 5 = strongly agree) | Teacher’s availability, interaction and feedback | 4.68 |
| I often had problems with the technical side of the webpage and the course (1 = strongly disagree, 5 = strongly agree) | Technical support and usability | 1.86 |
| The presentations/lectures or explanations were static, boring and discouraging (1 = strongly disagree, 5 = strongly agree) | Content and teaching methods | 1.81 |
| The course was not designed to be flexible as expected (1 = strongly disagree, 5 = strongly agree) | Course organisation | 1.77 |
| Assignments were unreasonable, very time-consuming and almost impossible to finish (1 = strongly disagree, 5 = strongly agree) | Content | 1.27 |
| The course was designed like a self-learning course, without much guidance or explanation (1 = strongly disagree, 5 = strongly agree) | Teacher’s availability, interaction and feedback | 1.72 |
| When considering the learning experiences, my overall rating for this course is: (1 = lowest, 5 = highest) | Students’ expectations and course evaluation | 4.22 |
| When considering the skills learned, my rating is: (1 = nothing, 5 = learned more than average) | Students’ expectations of their results | 3.81 |
| My rating for the learning effectiveness, compared to the physical classroom (1 = not effective, 5 = more than a classroom) | Students’ expectations of their results | 3.27 |