Article

Challenges and Experiences of Online Evaluation in Courses of Civil Engineering during the Lockdown Learning Due to the COVID-19 Pandemic

by Marcos García-Alberti 1,*, Fernando Suárez 2, Isabel Chiyón 3 and Juan Carlos Mosquera Feijoo 4,*

1 Departamento de Ingeniería Civil: Construcción, Universidad Politécnica de Madrid, 28040 Madrid, Spain
2 Departamento de Mecánica de Medios Continuos y Teoría de Estructuras, Universidad de Jaén, 23700 Linares (Jaén), Spain
3 Facultad de Ingeniería, Universidad de Piura, Piura 20009, Peru
4 Departamento de Mecánica de Medios Continuos y Teoría de Estructuras, Universidad Politécnica de Madrid, 28040 Madrid, Spain
* Authors to whom correspondence should be addressed.
Educ. Sci. 2021, 11(2), 59; https://doi.org/10.3390/educsci11020059
Submission received: 31 December 2020 / Revised: 18 January 2021 / Accepted: 29 January 2021 / Published: 3 February 2021
(This article belongs to the Special Issue Online and Distance Learning during Lockdown Times: COVID-19 Stories)

Abstract

As a consequence of the global health emergency in early 2020, universities had to tackle a sudden shift in their teaching–learning strategies so that the preset competences could still be fulfilled. This study presents the learning outcomes of the implemented tasks, student experiences and feedback, and some reflections from the instructors, taking a holistic perspective of the courses in light of the adopted measures and adaptations. Six courses taught in civil engineering degrees at three universities, two from Spain and one from Peru, were analyzed. The teaching and evaluation strategies are described, and some reflections are made by comparing the students' performance with that of the previous course. Although the shift to online learning had to be made from one day to the next, with no time for preparation, the experience has shown that online learning can be beneficial in some respects and has probably come to stay, although other aspects of face-to-face learning, especially students' engagement and motivation, are difficult to replace. The significance of this study lies in its description of the challenges that arose from the global public health emergency and its assessment of the results of the strategies implemented for both teaching and evaluation in civil engineering modules. After the acquired experience, new questions have arisen: What type of content is (and is not) suitable for online exams? Which features have come to stay? Has higher education taken a step forward to tomorrow's education?

1. Introduction

In the first months of 2020, due to the health emergency triggered by the COVID-19 pandemic at a global level, most educational institutions around the world were forced to modify their teaching methodologies and turn them into new strategies compatible with online learning. Most universities scrambled to adjust and apply digital systems needed for remote learning. However, some recent studies seemed to agree that the teaching institutions were not prepared for such a sudden shift to emergency remote teaching (ERT) [1]. There has been an increasing number of experiences shared around the world studying how this situation has affected teachers and students, especially focused on primary and secondary levels of education [1,2,3,4] but not so much at the higher education level [5,6].
In these circumstances, each university adopted different solutions, usually including specific online tools and platforms for distance learning, such as video call applications, and issuing general guidelines and instructions so that the lecturers knew how to adapt their teaching activities. Nevertheless, the lecturers were ultimately the ones who had to decide how to implement online teaching in practice, changing their traditional teaching strategies by incorporating new tools, such as video calls and screencast videos, and establishing new forms of interaction with the students through virtual forums or online group tutorials. In this context, the digital tablet proved to be a valuable tool for teaching and interacting with learners [7,8,9]. These decisions were shaped by many factors, such as the lecturers' motivation, digital skills, and family or personal circumstances, all of which could make work–life balance harder.
Transforming traditional face-to-face teaching into distance teaching is not trivial, either for lecturers or for students. Some elements must be adapted, such as the teaching materials, the tools used to produce them, and the mechanisms of interaction with the students. All this implies that both students and lecturers must adapt their daily work, since they must learn how to use new tools and new ways of interacting with each other.
As Singh et al. stated, the use of the World Wide Web facilitates both types of online learning strategies: it makes asynchronous teaching easier, thus allowing for anytime and anywhere learning, and it makes synchronous teaching easier by means of video call tools [10]. Many experiences have shown that online learning is possible using both synchronous and asynchronous methodologies [11,12]. In fact, higher education would be hard to imagine without it, since traditional methods, such as lectures and practical classes, are often combined with online resources. Nevertheless, evaluation remains an open issue because some disciplines, as is the case for many courses in engineering degrees, rely on problems solved within a set time period as the most appropriate evaluation method, and this is not easy to adapt to an online environment.
During the lockdown period, continuous evaluation techniques were highlighted and recommended. This has been a clear trend in recent years, since ongoing evaluation methods are considered highly important to ensure that corrective actions helping students during the learning process are not implemented too late [13], and it became particularly emphasized during the period of online teaching due to the COVID-19 crisis. In this regard, some learning methods, such as those involving teamwork, should not be forgotten, because these competencies are among the most demanded in recruitment processes [14,15,16]. On the one hand, online learning can make this easier, since there are now many collaborative tools that help students work together; on the other hand, due to the inherent nature of traditional face-to-face teaching, students are not used to these tools and could therefore be stressed by the lack of personal interaction on campus, which may make involvement and commitment more difficult. In this field, it is interesting to distinguish between formative and summative assessment [17,18,19]. Formative assessment is more informal, and its priority is to promote students' learning [20], complementing and supporting the more traditional and formal summative assessment based on essays, tests, and exams. Despite its informal nature, or maybe because of it, formative assessment has proven to increase students' achievement [21,22], so it should be boosted, not disregarded, in online environments to promote the engagement and involvement of students.
In this study, the evaluation systems adopted during the lockdown period in six courses taught in three schools of civil engineering were analyzed. These courses were taught by different lecturers using their own strategies and under unequal circumstances. In some cases, the online evaluation was adopted earlier, but in other cases, it was adopted almost at the end of the course, which clearly affected the evaluation methods that were eventually adopted. The evaluation strategies used to suddenly switch from face-to-face to online teaching are presented and discussed in each case.
First, the study is presented and each of the analyzed courses is described. Then, the evaluation strategies adopted in each case are detailed and some results are discussed. Finally, in view of these results, some final remarks and recommendations are given.

2. Description of the Study

This study was built on the shift to remote teaching–learning and assessment in a set of modules taught at three universities under an internationalization scope driven by the Spanish Ministry of Education [23]. It was part of a collaborative research work on the application of information and communication technologies (ICT) to innovation in higher education. This study shows the impacts of online learning on civil engineering modules at three universities, two from Spain (Universidad Politécnica de Madrid (UPM) and Universidad de Jaén (UJA)) and one from Peru (Universidad de Piura (UDEP)). This allowed for a comparison of results from universities in two countries that, due to heritage and historical reasons, have similar educational systems. Moreover, the Peruvian higher education system has made considerable efforts in recent years to join a more modern and international group of universities.
The shift to remote training started with the closure of classrooms, which deprived both students and instructors of several benefits of on-campus education. On the one hand, it had a negative impact on the inclusion process in higher education, understood as the ongoing and transformative process of improving education systems to meet all learners' needs, especially those of low-achieving students or students from low-income families [24,25,26]. On the other hand, the change to remote teaching implied a step forward toward so-called ubiquitous learning [27,28]. Another side effect is the advantage that the university system has drawn from such rapid digital adaptation, including the use of pervasive components, e-resources, and online communication technologies, amidst the well-known physical constraints, to deliver satisfactory and profitable teaching–learning experiences to educational agents. However, this ubiquitous learning model is open for debate and demands further research in terms of both the evaluation of knowledge and the measurement of behavior change. In this regard, the authors considered that there is room for improvement, since a well-tailored integrated teaching–learning environment must comprise online activities, digital materials, and face-to-face interactions to yield satisfactory outcomes.
Table 1 shows the main features of the modules analyzed in this study. All of them formed part of the curricula in civil engineering schools. The subjects hereby mentioned are:
  • Universidad Politécnica de Madrid (UPM, Spain): strength of materials, construction management, and dynamic and seismic analysis of structures.
  • Universidad de Jaén (UJA, Spain): theory of structures.
  • Universidad de Piura (UDEP, Peru): research operations I and II, which are modules completed in consecutive semesters with continuously assessed assignments and exams.
All the courses were adapted to online teaching by means of either asynchronous or synchronous methods. Most classes were taught by means of online video calls. However, some classes for the first two weeks of the lockdown at the UPM and the whole UJA course consisted of screencast videos that could be watched by the students at their own pace.

3. Methodology

The closure of classrooms entailed sudden adjustments to teaching and examinations so that the ongoing courses could end properly. Lecturers had to adapt their strategies to the new context, relying on the available resources in order to fulfil the expected learning competencies. Such a big shift involved decisions at several levels, ranging from the rectorate to the lecturers, most of whom lacked digital competences and underwent those changes with widely varying levels of individual readiness and capability. There is no doubt that these changes have led to a step forward in both the digital transformation of universities and the teaching of the future.
The teaching methodologies adopted by the teaching units are presented in this section. The initial strategies planned before the lockdown and the adjustments made to adapt them to the online learning environment are described.
Likewise, this study gathered the criteria used to assess the impact of this sudden shift on the learning outcomes, as well as on both the instructors’ and students’ perceptions, which have evolved since then.

3.1. Evaluation Methodology of Courses at UPM

The closure of classrooms occurred some eight weeks after the start of the semester. It took between two and three weeks to reconsider and readjust the evaluation strategy, since it was mandatory to incorporate such changes into the academic guide.
An important issue was the assessment design, whose impact on the students' learning process is significant [21,22,29]. Both formative and summative assessments were to be kept: the former as an essential part of the scaffolding structure, because students benefit from discussion with and feedback from the teacher [30,31], and the latter because it boosts quality assurance [31].
Two main lines of action were considered upon readjusting the evaluation strategy’s tools and resources: (1) the follow-up of students through the continuous evaluation and (2) the preparation of exams.
Firstly, it was essential to keep the instructor’s role as both a facilitator and activator of meaningful learning and to help students to take ownership of their progress through ongoing assessment and reflection [32]. Thus, the teaching units approved an increase in the relative weight of the ongoing assessment in the final outcomes.
Class sessions were recorded so that students could access them afterwards. Some supplementary material and e-resources were made available to students for autonomous learning. To ensure suitable use of the recordings, several short questions were inserted in the pre-recorded videos (Edpuzzle) so that students could only continue watching after answering them. Indeed, this feature was highly valued by students.
In the fundamental degree subjects, students were prompted to solve weekly exercises or problems at home. Additionally, they were made to take short online quizzes (Kahoot, Socrative, and Mentimeter) at least once a week during class time.
As with most fourth-year engineering subjects, the construction management module focuses on the practical application of engineering knowledge through relevant assessments and self-directed, inquiry-led learning, which includes visits to work sites. However, during the lockdown, students were prompted to watch specific documentaries and to analyze the processes and workflows involved. Edpuzzle proved to be a valuable tool for inserting short questions at certain stages in order to follow up on students' accomplishments.
Regarding the technologically oriented degree and master's subjects, the instructors set several teamwork-based assignments focused on competence-based learning (CBL). In this sense, students were expected to demonstrate specific learning achievements after each stage and before moving on to the next one [33]. Each group worked on an ongoing set of assignments throughout the semester, with online presentations every two weeks. Such assignments were tailored according to Vygotsky's principle of the Zone of Proximal Development [31,34,35,36], so some questions arose in this regard: what could students do individually on their own? What could they do with help as they continued to learn by interacting with others around them? The design criteria for such an ongoing, teamwork-based plan built on these features were as follows [37]:
  • To tailor the activity with a trade-off between engagement and personal work.
  • To build on problem statements that pose relevant challenges.
  • To realize that the activities are themselves learning strategies.
  • To highlight that the activities are focused on learning rather than on the work product.
  • To promote tasks that require thinking and reasoning.
  • To focus on the process through appropriate guidelines and instructions.
  • To provide students with regular feedback on their progress.
  • To assess their learning achievements rather than their work products.
  • To empathize with learners when they encounter setbacks along their work.
  • To promote a favorable environment that fosters their effort rather than a single task or target.
ICT-based teamwork allows students to develop documentation, reporting, and other transversal skills. Conversely, its implementation requires teachers and students to use a variety of digital tools, highlighting the importance of digital literacy skills [37,38,39].
Tutorials were another relevant means of following up with and accompanying students during the remote learning stage. They were increasingly given through online platforms (Blackboard Collaborate, Microsoft Teams, and Zoom, among others), with noticeably good results, and online tutorials have remained in use since then. In this regard, the tablet computer emerged as a key tool for delivering the instructor's explanations and responses to students' queries.
Secondly, the civil engineering school ruled that final degree projects had to be presented online and that exams had to be held online as well, which entailed a challenge. This raised a set of obstacles and uncertainties for both instructors and learners: among other issues, some students showed weak motivation for distance learning; some professors were reluctant to adapt to distance learning or not convinced of its usefulness; the community lacked preparation to deal with distance learning; and there was a lack of clarity regarding the methods of remote evaluation.
The instructors involved in this study carried out several online Likert-type surveys among the students to gather their perceptions of the deliveries, the evaluation process, the extent of success achieved from the sudden shift in teaching and evaluation, and their learning achievements [40]. The university also conducted end-of-semester surveys to gain insights into students' perceptions of teaching strategies, performance, and the usage of innovative tools in teaching.
Surveys were also intended to capture the degree of satisfaction with the teaching strategies implemented during the pandemic. The responses were classified according to a Likert scale, ranging from 5 (completely agree) to 1 (completely disagree).
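To illustrate how responses on such a scale can be aggregated, the following minimal sketch computes the mean score and the share of favourable answers (4 or 5) per item; the item names and response values are hypothetical and do not come from the actual surveys.

# Minimal sketch for summarizing Likert-type responses (1 = completely disagree,
# 5 = completely agree). Item names and response values are hypothetical.
from collections import Counter

def summarize_item(responses):
    """Return the mean score and the share of favourable answers (4 or 5)."""
    n = len(responses)
    mean = sum(responses) / n
    favourable = sum(1 for r in responses if r >= 4) / n
    return mean, favourable

survey = {
    "Satisfaction with online classes": [5, 4, 4, 3, 5, 2, 4, 4],
    "Satisfaction with the delivered e-resources": [4, 5, 5, 4, 3, 4, 5, 4],
}

for item, responses in survey.items():
    mean, favourable = summarize_item(responses)
    print(f"{item}: mean = {mean:.2f}, favourable = {favourable:.0%}, "
          f"distribution = {dict(sorted(Counter(responses).items()))}")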

3.2. Evaluation Methodology of Course at UJA

In order to understand why the teaching and evaluation methodology was converted to online teaching, as described later, it must be clarified that the theory of structures course comprises two main parts. One is an introduction to elasticity, mainly describing strains, stresses, their tensorial expressions, and their relationship through Lamé's and Hooke's equations. The other is an introduction to strength of materials, presenting axial, shear, and bending stresses, torque, and how to solve stress diagrams in isostatic beams. Therefore, this is a course where solving problems is of paramount importance as a teaching tool.
Online teaching due to the lockdown started seven weeks after the start of the semester. From the very beginning, a big effort was made to proceed with the lectures in a way that was as similar as possible to the classroom lectures. Because lectures could not be given synchronously due to family and work–life balance issues, an asynchronous methodology was followed. Until then, classroom lectures had mainly been given using the blackboard since, given the great importance of problems in this course, this is considered the most suitable method. Because of this, great effort was made to adapt these lectures to online teaching; thus, lectures were given by means of videos recorded using screencast tools, supported by solved handwritten problems and slide presentations. Problems were solved by hand by the lecturer and then scanned and used to prepare a slide presentation. The progressive explanation that would take place on a blackboard in the classroom was simulated by recording the voice of the lecturer over the slide presentation using the open-source software Kazam. Before the lecture, the students had access to the video and the scanned solution of the exercise as a PDF. This material was always available before the official time of the lectures, and clear instructions were given to the students so that they could follow the course and use the material properly. Every two weeks, a group video tutorial session was arranged via Google Meet in order to answer questions and clarify doubts about the lectures, which, together with the doubts resolved by email, proved to be an efficient way of addressing questions about the course.
Regarding the practical sessions, no big changes had to be made, since all of them consisted of solving given problems by using specific software (MATLAB for elasticity problems and Robot Structural Analysis for strength of materials problems) available to the students through academic licenses. Therefore, video tutorials were prepared to instruct the students in the usage of the software, and short videos explained the problems to be solved.
Regarding the evaluation of the course, following instructions received from the university, an alternative methodology had to be designed. Table 2 shows how the evaluation methodology was modified for an online final exam, comparing the evaluation items and their weights in the final mark.
The original evaluation system mainly consisted of two items: lab practices, assessed based on the students' participation and on reports prepared by the students in groups of two, with a weight of 10% in the final mark; and a final exam testing theoretical questions and practical problems, with a weight of 90% in the final mark. Students had to pass both items independently.
The Universidad de Jaén (UJA) decided to switch from a traditional on-site final exam to an online final exam just three weeks before the end of the course and encouraged lecturers to implement ongoing assessment methods to reduce the weight of the final exam. Since this decision was made during the very last part of the course, drastically modifying the evaluation methodology was considered unfair to the students, who had been preparing for the course based on the original evaluation methodology. In addition, including new items at the very last part of the course could have increased their workload excessively. Finally, the assessment methods were adapted following these recommendations while trying to maintain the same general criteria as the original methodology.
It is important to highlight that this course is evaluated mainly through problems that must be solved in a final exam. It is a course in which theory supports the practical part, but the student must fundamentally learn how to solve problems in a given time. Thus, the evaluation methodology was modified so that this premise remained predominant. A new evaluation item, "study of cases and exercises," was included, which consisted of problems that were solved in class during the last week of the course. The weight of the lab practices was increased to 30% to reduce the weight of the final exam, which had a weight of 50%, following the recommendations made by the university. The exam maintained the same structure as in a regular year, with a theory part weighted at 20% and problems weighted at 80%, but the theory was transformed into an online test and the problems were defined using parameters that had different values for each student (Figure 1 shows an example of one of the problems designed for this exam). Therefore, each student obtained different results, and copying was not easy. In addition, each problem had to be delivered before the next problem was presented, which reduced the chances of sharing results and consulting among peers. To guarantee that the students themselves were the authors of the submitted exercises, prior to the exam, the students were required to send a video of themselves writing a specific text by hand, to serve as a sample for comparison.
Problems were published on the website of the course at a given time, and they had to be solved on paper by hand, scanned with a smartphone, and delivered online before the deadline. Since this procedure was new, the new item, "study of cases and exercises," was designed as a set of preparatory exercises for the final exam so that the students could get accustomed to it. In these exercises, the delivery process was more flexible, since the students were not yet used to the scanning and uploading procedure, and passing them was not mandatory for passing the course.
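A minimal sketch of how per-student problem parameters of this kind could be generated and recorded is shown below. The parameter names, value ranges, and the idea of seeding the random generator with the student identifier are illustrative assumptions and do not reproduce the actual procedure used in the course.

# Minimal sketch: generate individualized numeric parameters for an exam problem so
# that every student receives different data (and hence different results).
# Parameter names, ranges, and the seeding scheme are illustrative assumptions.
import csv
import random

def problem_parameters(student_id: str) -> dict:
    rng = random.Random(student_id)  # reproducible: the same ID always yields the same data
    return {
        "student_id": student_id,
        "span_m": rng.choice([4.0, 5.0, 6.0, 7.0]),           # hypothetical beam span
        "load_kN_per_m": round(rng.uniform(10.0, 30.0), 1),   # hypothetical distributed load
        "E_GPa": rng.choice([200, 210]),                       # hypothetical Young's modulus
    }

students = ["STU-001", "STU-002", "STU-003"]
rows = [problem_parameters(s) for s in students]

# Keep a record of the data handed to each student for later grading.
with open("exam_parameters.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=list(rows[0].keys()))
    writer.writeheader()
    writer.writerows(rows)

for row in rows:
    print(row)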

3.3. Evaluation Methodology of Courses at UDEP

The two cohorts comprised 136 and 152 registered students, respectively. Lessons were taught remotely and synchronously, and they were simultaneously recorded. Extensive use was made of UDEP Virtual, the digital learning management system (LMS). This platform held a variety of e-resources, often known as e-textbooks, which go beyond electronic versions of printed material since they are intended to support both self-paced and tutor-paced student learning [41,42]; these included video conference classes, pre-recorded videos, individual and teamwork assignments, class notes and presentations, podcasts, and tutorials.
Such a variety of digital resources was conceived for remote teaching, autonomous learning, and assessment. The coupling of e-textbooks and digital media formed a promising paradigm that could spread higher education to a variety of settings, so that students can be involved in learning contexts with immersive experiences that help them to attain meaningful learning [43]. In this regard, many publishers have made their e-resources free of charge during the confinement period.
Practical lessons consisted of two virtual laboratory sessions and four team workshops, drawing on collaborative work, using the Excel Solver tool and focusing on competence-based learning [33]. Workgroups were accompanied and supervised by the instructor on a weekly basis. In addition, students took four individual practical exercises, as well as an end-of-semester exam, for summative assessment purposes.
For students who failed to take these exams due to technical, personal, or health reasons, UDEP set up an extraordinary exam schedule.
This university also conducted end-of-semester Likert-type surveys to grasp students’ perceptions on certain features of the course development regarding the impact of innovative tools in teaching and assessment.

4. Results

In order to measure the impact of the experiences described here, a set of indicators for both process and results was applied, focusing on three areas of interest: (1) the impact of e-resources and e-textbooks on learning outcomes, (2) the benefits and drawbacks of online evaluation when compared with on-site sessions, and (3) meaningful learning achievements.
The impact of the whole evaluation process raised several reflections from both the instructors’ and students’ standpoints.
Most students expressed diverse concerns about the new constraints:
  • Weak motivation for distance learning; the home environment was not suitable.
  • A shortfall in their comprehension of some applied subjects in the absence of classroom interaction.
  • Difficulties when performing remotely oriented work.
  • Uncertainty due to the lack of clarity of the remote evaluation methods.
Regarding the professors, the following reflections can be summarized:
  • The need to overcome an initial resistance to adapt to remote education.
  • Online teaching requires a big effort in preparing new material, although it can be used again in future courses.
  • Lack of digital competences in professors.
  • Lack of preparation of the university's educational agents to deal with distance evaluation.
  • Lack of training in the use of technology and the absence of uniform controls among all exam takers.
  • Some instructors are not yet convinced of the usefulness of distance learning and assessing.
Several difficulties and uncertainties conditioned the preparation of the exams:
  • To maintain the preset learning competences and outcomes.
  • To ensure honesty, probity, confidentiality, authorship and equal opportunities of the exam takers.
  • The possibility of designing exams while keeping the same structure as in the on-site ones.
  • The online examination tool could not be a source of uncertainty or conflict for students.
  • It was mandatory to prevent the use of third-party tools or resources by the exam takers.
Hence, the exams were set with very tight response times: questions and problems were precise and objective so that the responses required reasoning, relating concepts, and demonstrating, arguing, or deriving arguments and expressions. Thus, the design of exams became a trade-off between keeping as much of the original classical structure as possible and ensuring ethical behavior and authorship. However, our ex-post analysis showed that the instructors primarily focused on avoiding cheating. As a consequence, low-achieving students were especially affected by such measures. Nevertheless, the figures for both passing students and dropouts were similar to those of the previous year. Therefore, it cannot be concluded that the sudden shift to remote learning had an impact on the outcomes. Indeed, students' feedback confirmed this conclusion.
Table 3 shows a comparison of both passing and dropout rates between 2019 and 2020 for the selected modules. As the differences were not significant, we cannot conclude that the change to remote teaching had any impact on those outcomes.
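As an illustration of how such a year-on-year comparison can be checked for significance, the sketch below applies a chi-square test to a 2x2 contingency table of passing and non-passing students; the counts are made up for the example and are not the figures reported in Table 3.

# Minimal sketch: test whether pass rates differ between two academic years with a
# chi-square test on a 2x2 contingency table. Counts are hypothetical examples only.
from scipy.stats import chi2_contingency

#                passed  not passed
cohort_2019 = [58, 42]   # hypothetical 2018-2019 cohort
cohort_2020 = [61, 39]   # hypothetical 2019-2020 cohort

chi2, p_value, dof, expected = chi2_contingency([cohort_2019, cohort_2020])

print(f"chi2 = {chi2:.3f}, p = {p_value:.3f}")
if p_value > 0.05:
    print("No significant difference in pass rates at the 5% level.")
else:
    print("Pass rates differ significantly at the 5% level.")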
In general, the response and attitude of students to online exams were notably positive and proactive. Most of them acted responsibly, were eager to participate, and reached their learning outcomes. However, around 10% of exam takers lacked maturity, as they tried to cheat and exchange information during the exams. Given that most students were highly proficient in digital technologies, instructors struggled to monitor the exam sessions, even with online video surveillance. Additionally, students at home were prompted to write the responses to the questions by hand, scan their manuscripts, and upload the resulting PDF files to the examination platform.
The figures of grades, dropouts, and their ratios compared with the previous year’s numbers, among other figures, are valuable objective indicators of learning outcomes. Some of the available data are issued by the universities since they form part of the information submitted to the respective national certification agencies [15,16]. This study only compared learning outcomes from these last two years since these teaching units applied the same methodology with the same instructors, syllabi, and university policies.

4.1. Results of Courses at UPM

The previously described process and behavior patterns applied to this case. Regarding the concern about the security and confidentiality of data and information during exams, one noticeable proof of cheating is shown below. Three exam takers wrote the response to a given exercise by hand and uploaded the scanned versions to the platform. All three used the same alternative approach to the solution. However, such a method neither belonged to the module syllabus nor was taught by the instructors. Additionally, all three students depicted the same charts and schemes with the same mistakes, indeed at the same steps. Figure 2 shows the excerpts from the three individual responses.
Student experiences and feedback revealed rather good acceptance and goal achievement, as shown below.

4.1.1. Strength of Materials and Construction Management

The items of interest were the following:
  • Degree of satisfaction with online classes.
  • How do you value your learning of the subject when compared with face-to-face classes?
  • Have you studied the subject autonomously more than during the in-person period?
  • Degree of satisfaction with individual time management and learning.
  • How could you study in online groups during the pandemic as compared with the on-site regime?
  • How do you value your learning achieved through studying in groups during the pandemic?
  • Degree of satisfaction with the e-resources delivered by the instructors of the subject during the lockdown period.
  • Certainty on having mastered the two key concepts taught in the subject.
  • Would you recommend applying the teaching method used in this subject to other modules?
  • Have you achieved the learning expectancies during this period?
  • Your degree of readiness to follow online classes at the beginning of the lockdown period.
  • Your current readiness to follow online classes at the end of the lockdown period.
  • Degree of mind shift with respect to online classes after this experience.
  • Open questions, suggestions, complaints, etc.
Students expressed a fair acceptance of the digital resources involved during the distance learning stage, as well as a reasonably good achievement in their goals. Their suggestions helped to design future actions for the next course, regardless of whether it is online or on-site.

4.1.2. Dynamic and Seismic Analysis of Structures

The remotely oriented teamwork was weighted as one third of the final grade. It was conceived for competence-based learning focused on problem solving. Thus, the survey included three main topics: the fulfilment of learning achievements, perception of teamwork effectiveness, and perception of team leadership. Several items were about the individual learning achievements within the group work method. The main questions were:
  • I have mastered the core concepts application to the seismic design of a given simple structure.
  • Satisfaction level with individual learning from teamwork
  • Satisfaction level with autonomous learning and individual contribution to teamwork
  • Would you recommend applying competency-based learning through teamwork to other modules?
  • Have you achieved your learning expectancies during this period?
  • Your readiness to do online teamwork in the beginning of the lockdown period.
  • Your readiness to do online teamwork at the end of the lockdown period.
  • Level of satisfaction with your own contribution to teamwork
  • Level of satisfaction with teammates’ contribution to teamwork
  • Extent of mind shift with respect to teamwork benefits after this experience.
  • Own leadership skills for doing teamwork.
  • Own skills for overcoming setbacks collaboratively.
  • Team leader’s skills for overcoming setbacks collaboratively.

4.2. Results of Course at UJA

Figure 3 shows the correlation between the final mark of each student and the average time he or she took to view the PDF files prepared for every lecture once they were made available online. This graph is intended to show that there was a connection between the two values, because a motivated student was expected to access the available material earlier than a non-motivated one, since the former would usually prepare the course at the same pace as it is taught, whereas the latter would procrastinate and only study during the last few weeks before the final exam. Each point represents a student, and the dashed line shows the linear trend of this correlation, which decreases as expected. The data are broadly spread in the first part of the graph, which groups the students who viewed the files earlier; this is logical because not all students have the same capacities or need the same time to comprehend the concepts. Nevertheless, it is clear that a significant delay in accessing the material was related to a lower final mark.
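As a sketch of how such a relationship can be quantified, the fragment below computes the Pearson correlation and the linear trend between access delay and final mark; the data are invented for the example and are not the course records plotted in Figure 3.

# Minimal sketch: correlate the students' delay in accessing lecture material with
# their final mark and fit the linear trend shown as the dashed line in Figure 3.
# The data below are invented for illustration.
import numpy as np
from scipy.stats import pearsonr

# average delay (days) between publication and first access, and final mark (0-10)
delay_days = np.array([1, 2, 2, 3, 5, 7, 10, 14, 20, 30], dtype=float)
final_mark = np.array([8.5, 7.9, 9.1, 6.8, 7.0, 5.5, 5.0, 4.2, 3.8, 2.5])

r, p = pearsonr(delay_days, final_mark)
slope, intercept = np.polyfit(delay_days, final_mark, 1)

print(f"Pearson r = {r:.2f} (p = {p:.3f})")
print(f"Linear trend: mark = {slope:.2f} * delay + {intercept:.2f}")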
In addition to this conclusion, it was interesting to analyze some aspects observed during the online teaching period. Since the students had access to the videos and the scanned solutions of the problems at the same time, they could focus solely on understanding the video and taking notes on those issues that were of particular interest to them. Since the problems were solved using a slide presentation, the explanation time was reduced (a traditional lecture of 50 min was reduced, on average, to 35 min), because no time for writing on a blackboard was required. In a traditional classroom context, this could be seen as a drawback, because a faster pace of teaching may become elusive for some students, but in this online context, it proved to be beneficial. Since lectures were recorded in videos that were accessible to the students at any time and as many times as they wanted, those students who needed to could easily rewatch the whole video or only certain parts, while those who did not need to had more time available for other subjects. In this regard, the students expressed their satisfaction with this teaching methodology during the group tutorial sessions, remarking on the convenience of watching the lectures more than once if needed. This supports, as already stated by Shahabadi and Uplane [44], that anywhere–anytime learning has clear benefits for students because they have control over their learning pace and can manage their time better.
The practical sessions seemed to be efficient, and, compared with previous years, no major problems were encountered. By contrast, answering questions from the students became a much more time-demanding task because, due to the extraordinary situation caused by the lockdown, students were allowed to ask questions via email or request video calls with the lecturer. This led to a situation where the lecturer's availability was not limited to specific time periods during the week but instead extended to the whole week. This proved to be effective for resolving questions but implied a high additional workload for the lecturer.
Figure 4 shows the students’ performance compared with the previous year. It is interesting to observe that, although an online exam could imply higher rates of cheating leading to better marks, this was not the case. In general, almost no cheating was detected, and the design of the online exam—with different parameters set for each student and a sequential solving of the problems—seemed to be a successful alternative to the original classroom exam, with similar problems and difficulty.
It must be noted that the lower performance of the students with respect to the previous year cannot be attributed to the adopted online methodology, since the same methodology was used in another course taught in the mechanical engineering degree at UJA, and the performance there was higher (from 32.6% in 2018–2019 to 59.4% in 2019–2020). For some reason, the students of theory of structures were less motivated during the semester. Some of them mentioned that the workload of deliverable reports in other courses had remarkably increased because shifting to online evaluation had encouraged other lecturers to reduce the weight of the final exam and increase the number of ongoing evaluation tasks.

4.3. Results of Course at UDEP

In short, the features covered by the questionnaire were:
  • Degree of satisfaction with the implemented remote teaching model.
  • Degree of mind shift with respect to online classes after this experience.
  • Degree of accomplishment of the module syllabus.
  • Assessment of the implemented evaluation methodology.
  • Usefulness of the virtual lab and workshops.
  • Instructor availability.
  • Usage of innovative resources and e-textbooks in remote teaching.
  • Usage of digital resources and e-textbooks in assignments and exams.
  • Teacher–student interaction and availability to deal with unforeseen events.
  • Adequacy of elapsed time for grade publishing.

4.3.1. Research Operations I

The pass rate was 97% of the 136 registered students. Their level of satisfaction with the implemented online teaching and assessment approaches suggested a line of action for next year's courses.
Figure 5 and Figure 6 show the average levels of learners' satisfaction with the two modules, whereas the red lines show those of the engineering faculty.

4.3.2. Research Operations II

The enrolled group comprised 152 attendees. The feedback from the survey on their perceptions and degree of fulfilment of expectancies is summarized below.
Learners individually took four practical virtual lab exercises, as well as an end-of-semester exam, through Zoom. Instructors faced analogous difficulties and issues regarding probity, individuality, and authorship. Around 6% of students failed to take the ordinary evaluation items on the scheduled dates, mainly due to personal, health, or technical reasons. Thus, the university arranged an extraordinary call for the "COVID exam".

5. Discussion

University students in both countries lead technologically focused lives, as they are, arguably, digital natives. Indeed, they master the use of digital technologies far better than the average instructor. Because of cultural and heritage reasons and their specific engineering training, there are certain similarities between Spanish and Peruvian university students. The acquisition of digital competences by lecturers remains a pending task, with similar features in both countries. This has meant that exam settings have focused in many cases on avoiding cheating. There are currently diverse applications available to share computer desktops or use third-party digital devices, so the authors think that there is broad room for improvement in order to ensure honesty, probity, confidentiality, authorship, and equal opportunities for all exam takers [8,24,26].
This recent global health crisis has shown that the current university learning system is remarkably digital and has just taken a step toward the design of the future higher education system. This path involves the use of active learning models and the development of digital competences for educational agents. Other features can be envisaged along this route, including synchronous teaching, ubiquitous learning, and active learning strategies such as synthesis capability, problem-based learning, project-based learning, service-based learning, competence-based learning, and experiential learning [38].
Regarding both the virtual training and ICT-mediated assessment processes, there is much room for improvement, especially when focusing on formative assessment. These improvements must start with revising their meaning in the future digital context, analyzing their limits and possibilities, determining which types of knowledge are suitable for being evaluated, and identifying the drawbacks and capabilities of virtual tools [45,46]. Indeed, the recent experience revealed that ICT-based evaluation showed a trend toward summative and quantitative assessment, even more so when embedded within an LMS. Additionally, it has become essential to ensure the effectiveness of the technical and digital layouts of remote evaluation, so these layouts should be open for debate. The absence of uniform controls and the pervasive use of digital tools may lead to a loss of quality assurance and, hence, of the purpose of evaluation.
Likewise, this recent experience revealed two other challenges: to reduce the digital divide and the lack of inclusion in higher education. This may include facilitating a digital equipment loan service and access to wireless technology to low-income families.
Furthermore, future higher education will be digital, and mobile devices will play a paramount role as they have jumped into the spotlight and become an inseparable tool for university students, who lead technologically-oriented lives. This issue demands a mind shift in educational agents. Likewise, this recent experience also revealed diverse pending tasks about supporting distance-learning students to overcome their lack of motivation and difficulty in understanding some applied courses and remotely oriented work. Only then will they be able to succeed in their study, which could help to decrease the dropout rate. In this regard, digital transformation strategies must also concern research and social service missions.

5.1. Courses at UPM and UJA

Some of the core issues in designing the online evaluation process were as follows. (1) How to promote meaningful learning? (2) Are we actually promoting competition or improvement? (3) Could we cause a negative impact on low-achieving students? (4) How to keep the same face-to-face exam structure when setting online exams?
The incident depicted in Figure 2 is an example of the ease with which students can form groups online to exchange information during exams. All three students developed an alternative approach that did not belong to the syllabus. On the one hand, this may mean that they did not follow the lectures regularly. On the other hand, they applied such a method incorrectly and committed the same mistakes. No other exam taker used this approach, which was not taught in the course.
After the experience, it appeared to be mandatory that universities take actions to train instructors in the usage of educational technology [39,45]. In this regard, designing an effective and safe online examination strategy is a priority task. Likewise, learning outcomes could be improved by setting up a convenient flow of integrated digital content and online sessions [41,46].
The strategy followed at UJA in this extraordinary situation proved to be remarkably efficient, since the results obtained by the students did not greatly differ from those of previous courses. Regarding evaluation, no cheating issues were detected. This suggests that, since the group of students was not large, this problem could be kept under control according to this experience. Nevertheless, this was the first time the students had faced this type of exam procedure, and different behavior may be expected in the future, once they become accustomed to it. In the case of UPM, some cases of cheating were detected, as mentioned above, since the number of students was about five times larger than at UJA. From these results, it was concluded that group size is critical in this respect.
On another note, one of the main lessons learnt during this experience was related to students’ motivation. During the online teaching period, the students seemed, in general, to be less engaged with the course, compared with previous years. This could have been due to a higher workload in other courses, which modified their evaluation method and increased their deliverable tasks, reducing the available time for studying and preparing these courses. Moreover, face-to-face teaching implies meeting other peers, having informal talks with lecturers, and (this is probably a key point) increasing the feeling of belonging to a community.
Following this experience, it was concluded that a combination of face-to-face and online learning can lead to a better learning experience, since online resources provide tools and promote dynamics that are not possible in the classroom, but this combination can also lead to a less motivated group of students [26,37].

5.2. Courses at UDEP

The assessment was mainly built on the teamwork method complemented with individual evaluations and a final exam. The LMS played a core role in this process, and students pointed out its reliability.
The numbers of passes showed that the virtual teaching and evaluation system behaved in a similar way to the previous year, which was a face-to-face one. This was due to greater teacher–student interaction and flexibility in dealing with unforeseen technical difficulties. Even so, the small differences could be attributed to the lack of access to technology at home by a few students, i.e., affordability problems. In this sense, this recent experience revealed two other challenges: to reduce the digital divide and the lack of inclusion in higher education. This may include facilitating a digital equipment loan service and access to wireless technology to low-income families [25,26].

5.3. Comparison between Spanish and Peruvian Results

On the one hand, the subjects whose assessment was mainly based on teamwork and CBL yielded similar outcomes. Nevertheless, instructors showed concern about factors that hinder teamwork effectiveness, such as compensation for team achievements and fostering personal mind shifts among team members [37].
On the other hand, the rest of the subjects yielded similar outcomes when compared with the face-to-face teaching from the previous year. Both outcome figures (Table 3) and students’ perceptions (Table 4, Table 5, Table 6 and Table 7) indicated that the measures taken to redress the situation were satisfactory.
It also seems necessary to reflect on what constitutes an adequate setup and structure for an online exam. University student associations are currently demanding online exams rather than on-site ones. They cite health risks as the reason, even though they prefer face-to-face lectures. This gives cause for reflection.
Lastly, there is general concern about the two aforementioned challenges, i.e., the lack of inclusion and the digital divide, as priorities for improvement.

6. Final Remarks

The closure of university classrooms caused by the recent global health emergency has prompted numerous efforts and adaptations to the remote teaching–learning system. Some measures, practices, and changes might be here to stay, including the use of digital tablets in remote teaching, pre-recorded videos with inserted questions to ensure follow-up, preset questionnaires and quizzes for online use, and the capability to meet online with students and colleagues.
Adaptability to the constraints imposed by remote teaching emerged as a key feature: high-achieving students during the face-to-face stage of the semester performed well during the distance-learning phase, whereas low-achieving students were more affected. The dropout rate in fundamental subjects reached 22%, which was notably higher than that in technological modules, where it was below 10%.
Regarding the digital divide and the lack of inclusion as shortcomings, deep reflection is required about setting policies to support and counsel students in order to facilitate their integration and adaptability so that they can better meet their learning outcomes.
Lastly, the impact of this health crisis on higher education has shown the potential of distance teaching, either synchronous or asynchronous. Conversely, the remote evaluation process still raises technical, functional, and ontological controversies that need to be addressed and improved.

Author Contributions

Conceptualization, F.S. and J.C.M.F.; methodology, F.S., I.C., M.G.-A. and J.C.M.F.; validation, F.S., I.C., M.G.-A. and J.C.M.F.; formal analysis, F.S., I.C., M.G.-A. and J.C.M.F.; investigation, F.S., I.C., M.G.-A. and J.C.M.F.; resources, M.G.-A. and I.C.; data curation, F.S. and I.C.; writing—original draft preparation, F.S. and J.C.M.F.; writing—review and editing, F.S., I.C., M.G.-A. and J.C.M.F.; visualization, F.S., I.C., M.G.-A. and J.C.M.F.; supervision, M.G.-A.; project administration, M.G.-A. and F.S.; funding acquisition, M.G.-A. and J.C.M.F. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by Universidad Politécnica de Madrid through the Educational Innovative Programme 2019–2020, Project code IE1920-0405.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data presented in this study are available from the authors on request.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Schuck, R.K.; Lambert, R. “Am I Doing Enough?” Special Educators’ Experiences with Emergency Remote Teaching in Spring 2020. Educ. Sci. 2020, 10, 320. [Google Scholar] [CrossRef]
  2. Kaden, U. COVID-19 school closure-related changes to the professional life of a K–12 teacher. Educ. Sci. 2020, 10, 165. [Google Scholar] [CrossRef]
  3. Lassoued, Z.; Alhendawi, M.; Bashitialshaaer, R. An Exploratory Study of the Obstacles for Achieving Quality in Distance Learning during the COVID-19 Pandemic. Educ. Sci. 2020, 10, 232. [Google Scholar] [CrossRef]
  4. Ferraro, F.V.; Ambra, F.I.; Aruta, L.; Iavarone, M.L. Distance Learning in the COVID-19 Era: Perceptions in Southern Italy. Educ. Sci. 2020, 10, 355. [Google Scholar] [CrossRef]
  5. Akour, A.; Al-Tammemi, A.B.; Barakat, M.; Kanj, R.; Fakhouri, H.N.; Malkawi, A.; Musleh, G. The Impact of the COVID-19 Pandemic and Emergency Distance Teaching on the Psychological Status of University Teachers: A Cross-Sectional Study in Jordan. Am. J. Trop. Med. Hyg. 2020, 103, 2391–2399. [Google Scholar] [CrossRef]
  6. Rapanta, C.; Botturi, L.; Goodyear, P.; Guàrdia, L.; Koole, M. Online university teaching during and after the Covid-19 crisis: Refocusing teacher presence and learning activity. Postdigital Sci. Educ. 2020, 2, 923–945. [Google Scholar] [CrossRef]
  7. Godsk, M. Tablets for Learning in Higher Education: The Top 10 Affordances. In E-Learn 2013, Proceedings of the World Conference on E-Learning in Corporate, Government, Healthcare, and Higher Education, Las Vegas, NV, USA, 9 February 2013; Bastiaens, T., Marks, G., Eds.; Association for the Advancement of Computing in Education (AACE): Norfolk, VA, USA, 2013; pp. 1892–1897. [Google Scholar]
  8. Sevillano-Garcia, M.L.; Vazquez-Cano, E. The Impact of Digital Mobile Devices in Higher Education. Educ. Technol. Soc. 2015, 18, 106–118. [Google Scholar]
  9. Espinel, B.I.; García, M.L.S.; Castro, I.J.M.; Moscoso, C.P. El auge del aprendizaje universitario ubicuo. Uso de las tabletas en la apropiación del conocimiento. Educ. Siglo XXI 2019, 37, 183–204. [Google Scholar] [CrossRef] [Green Version]
  10. Singh, V.; Khasawneh, M.T.; Bowling, S.R.; Kaewkuekool, S.; Jiang, X.; Gramopadhye, A. The evaluation of alternate learning systems in an industrial engineering course: Asynchronous, synchronous and classroom. Int. J. Ind. Ergon. 2004, 33, 495–505. [Google Scholar] [CrossRef]
  11. Boaz, C.; Nath, R. Asynchronous learning environments: An empirical study. In Proceedings of the 1997 Annual Meeting of the Decision Sciences Institute, San Diego, CA, USA, 22–25 November 1997; Part 1. pp. 726–728. [Google Scholar]
  12. Manseur, R.; Manseur, Z. A synchronous distance learning program implementation in Engineering and Mathematics. In Proceedings of the 39th ASEE/IEEE Frontiers in Education Conference W1C-1, San Antonio, TX, USA, 18–21 October 2009. [Google Scholar]
  13. Brambila, D.A.C.; Obando, A.L. Sistema de evaluaciones en línea como herramienta para los niveles de educación media superior. RIDE Rev. Iberoam. Para Investig. Desarro. Educ. 2015, 6, 67. [Google Scholar]
  14. Bennett, N.; Dunne, E.; Carré, C. Patterns of core and generic skill provision in higher education. High. Educ. 1999, 37, 71–93. [Google Scholar] [CrossRef]
  15. ENAEE: EUR-ACE® Framework Standards and Guidelines. Standards and Guidelines for Accreditation of Engineering Programmes. Available online: https://www.enaee.eu/eur-ace-system/standards-and-guidelines/#standards-and-guidelines-for-accreditation-of-engineering-programmes (accessed on 21 December 2020).
  16. ABET: Criteria for Accrediting Engineering Programs. Available online: https://www.abet.org/wp-content/uploads/2015/05/E001-15-16-EAC-Criteria-03-10-15.pdf (accessed on 21 December 2020).
  17. McGuinness, C. Becoming Confident Teachers: A Guide for Academic Librarians; Elsevier: Amsterdam, The Netherlands, 2011. [Google Scholar]
  18. Shavelson, R.J. On the integration of formative assessment in teaching and learning: Implications for new pathways in teacher education. In Competence Oriented Teacher Training; Brill Sense: Leiden, Netherlands, 2006; pp. 61–78. [Google Scholar]
  19. Mosquera, J.C.; Suárez, F.; Chiyón, I.; García-Alberti, M. Una exploración sobre técnicas de enseñanza mixta para el aprendizaje basado en competencias en materias CTIM. In Proceedings of the V Congreso Internacional Sobre Aprendizaje, Innovación y Competitividad (CINAIC 2019), Madrid, Spain, 9–11 October 2019. [Google Scholar]
  20. Mosquera, J.C.; Baeza, F.; Santillán, D.; Garcia-Alberti, M. Exploring some problem-based learning approaches with the classroom response systems for undergraduate engineering students. In Proceedings of the 12th annual International Conference of Education, Research and Innovation (ICERI 2019), Seville, Spain, 11–13 November 2019; pp. 4231–4238. [Google Scholar]
  21. Black, P.; Wiliam, D. Assessment and classroom learning. Assess. Educ. Princ. Policy Pract. 1998, 5, 7–74. [Google Scholar] [CrossRef]
  22. Black, P.; Harrison, C.; Lee, C.; Marshall, B.; Wiliam, D. Working inside the black box: Assessment for learning in the classroom. Phi Delta Kappan 2004, 86, 8–21. [Google Scholar] [CrossRef]
  23. Spanish Ministry of Education, Culture and Sport. Strategy for the Internationalisation of Spanish Universities 2015–2020; Spanish Ministry of Education, Culture and Sport: Madrid, Spain, 2014.
  24. Bulman, G.; Fairlie, R.W. Technology and education: Computers, software, and the Internet. In Handbook of the Economics of Education; National Bureau of Economic Research: Cambridge, MA, USA, 2016; Volume 5, pp. 239–280. [Google Scholar]
  25. Martín-Gutiérrez, J.; Mora, C.E.; Añorbe-Díaz, B.; González-Marrero, A. Virtual technologies trends in education. EURASIA J. Math. Sci. Technol. Educ. 2017, 13, 469–486. [Google Scholar]
  26. Vidal, I.M.G. Influencia de las TIC en el rendimiento escolar de estudiantes vulnerables. RIED. Rev. Iberoam. Educ. Distancia 2020, 24, 351–365. [Google Scholar]
  27. Cárdenas-Robledo, L.A.; Peña-Ayala, A. Ubiquitous learning: A systematic review. Telemat. Inform. 2018, 35, 1097–1132. [Google Scholar] [CrossRef]
  28. Mota, F.P.d.; Tôledo, F.P.; Kwecko, V.; Devincenzi, S.; Núñez, P.; Botelho, S.S.d.C. Ubiquitous Learning: A Systematic Review. In Proceedings of the IEEE Frontiers in Education Conference (FIE), Covington, KY, USA, 16–19 October 2019; pp. 1–9. [Google Scholar]
  29. Bloxham, S.; Boyd, P. Developing Effective Assessment in Higher Education: A Practical Guide: A Practical Guide; McGraw-Hill Education: London, UK, 2007. [Google Scholar]
  30. Sadler, D.R. Formative Assessment: Revisiting the territory. Assess. Educ. Princ. Policy Pract. 1998, 5, 77–84. [Google Scholar] [CrossRef]
  31. Chen, W.; Umang, V.S.; Brechtelsbauer, C. A framework for hands-on learning in chemical engineering education—Training students with the end goal in mind. Educ. Chem. Eng. 2019, 28, 25–29. [Google Scholar] [CrossRef]
  32. Caena, F.; Redecker, C. Aligning teacher competence frameworks to 21st century challenges: The case for the European Digital Competence Framework for Educators (Digcompedu). Eur. J. Educ. 2019, 54, 356–369. [Google Scholar] [CrossRef] [Green Version]
  33. Torres, A.S.; Brett, J.; Cox, J. Competency-Based Learning: Definitions, Policies, and Implementation; Regional Educational Laboratory Northeast & Islands: Waltham, MA, USA, 2015. [Google Scholar]
  34. Vygotsky, L. Interaction between learning and development. Read. Dev. Child. 1978, 23, 34–41. [Google Scholar]
  35. Doolittle, P.E. Vygotsky’s Zone of Proximal Development as a Theoretical Foundation for Cooperative Learning. J. Excell. Coll. Teach. 1997, 8, 83–103. [Google Scholar]
  36. Shabani, K.; Khatib, M.; Ebadi, S. Vygotsky’s Zone of Proximal Development: Instructional Implications and Teachers’ Professional Development. Engl. Lang. Teach. 2010, 3, 237–248. [Google Scholar] [CrossRef] [Green Version]
  37. Tapia, J.A. Motivar Para el Aprendizaje; Edebé: Barcelona, Spain, 1998. [Google Scholar]
  38. Traxler, J. Distance Learning—Predictions and Possibilities. Educ. Sci. 2018, 8, 35. [Google Scholar] [CrossRef] [Green Version]
  39. Chen, S.Y.; Tseng, Y.F. The impacts of scaffolding e-assessment English learning: A cognitive style perspective. Comput. Assist. Lang. Learn. 2019, 1–23. [Google Scholar] [CrossRef]
  40. Clason, D.L.; Dormody, T.J. Analyzing data measured by individual Likert-type items. J. Agric. Educ. 1994, 35, 4. [Google Scholar] [CrossRef]
  41. Mynbayeva, A.; Sadvakassova, Z.; Akshalova, B. Pedagogy of the Twenty-First Century: Innovative Teaching Methods. In New Pedagogical Challenges in the 21st Century: Contributions of Research in Education; Cavero, O.B., Llevot-Calvet, N., Eds.; IntechOpen: London, UK, 2018; pp. 3–20. [Google Scholar]
  42. Dutkiewicz, A.; Kołodziejczak, B.; Leszczyński, P.; Mokwa-Tarnowska, I.; Topol, P.; Kupczyk, B.; Siatkowski, I. Online Interactivity–A Shift towards E-textbook-based Medical Education. Stud. Log. Gramm. Rhetor. 2018, 56, 177–192. [Google Scholar] [CrossRef] [Green Version]
  43. Mosquera, J.C.; Baeza, F.J.; Galao, O.; Alberti, M.G. On student perceptions about e-textbooks and digital resources for online teaching: Lessons learned from confinement. In Proceedings of the 13th annual International Conference of Education, Research and Innovation (ICERI 2020), online conference, 9–11 November 2020; pp. 3642–3647. [Google Scholar]
  44. Shahabadi, M.M.; Uplane, M. Synchronous and asynchronous e-learning styles and academic performance of e-learners. Procedia Soc. Behav. Sci. 2015, 176, 129–138. [Google Scholar] [CrossRef] [Green Version]
  45. Torres-Madroñero, E.M.; Torres-Madroñero, M.C.; Ruiz Botero, L.D. Challenges and Possibilities of ICT-Mediated Assessment in Virtual Teaching and Learning Processes. Future Internet 2020, 12, 232. [Google Scholar] [CrossRef]
  46. Moodley, K. Improvement of learning and assessment of the practical component of a Process Dynamics and Control course for fourth year chemical engineering students. Educ. Chem. Eng. 2020, 31, 1–10. [Google Scholar] [CrossRef]
Figure 1. Example of one of the problems designed for the online exam. Each student had a different set of parameters.
Figure 2. The same excerpts from three students’ manuscripts during a remote exam. The approach, notation, and procedure were the same in all three cases and had not been taught in the course. Even the mistakes coincided and appeared in identical places.
Figure 3. Correlation between the students’ final marks and the average time they took to access the PDF files available for each lecture. The dashed line represents the linear trend of this correlation.
Figure 4. Comparison of students’ performance in the 2018–2019 and 2019–2020 courses of theory of structures, taught at UJA.
Figure 5. Average values of UDEP students’ degree of satisfaction with the teaching–learning and assessment processes in operations research I.
Figure 6. Average values of UDEP students’ degree of satisfaction with the teaching–learning and assessment processes in operations research II.
Table 1. Main characteristics of the analyzed modules. UPM: Universidad Politécnica de Madrid; UJA: Universidad de Jaén; UDEP: Universidad de Piura.
Course | University | Degree/Master | Year | Number of Registered Students | Teaching Method | Examination Method
Construction management | UPM | Degree | 4th | 65 | Synchronous | Online
Strength of materials | UPM | Degree | 2nd | 215 | Asynchronous and synchronous | Online
Dynamic and seismic analysis of structures | UPM | Master | 2nd | 15 | Asynchronous and synchronous | Online
Theory of structures | UJA | Degree | 2nd | 42 | Asynchronous | Online
Operations research I | UDEP | Degree | 3rd | 136 | Synchronous | Online
Operations research II | UDEP | Degree | 3rd | 152 | Synchronous | Online
Table 2. Modification of the evaluation methodology of the UJA course. The last column corresponds to the modified methodology to adapt the evaluation to an online final exam. Items marked with an asterisk must be passed independently to pass the course.
Item | Criteria | Tool | Weight (Original Methodology) | Weight (Modified Methodology)
Lab practices and use of ICT tools | Participation and attendance, delivery of well solved reports, report structure and quality of the document. | Lecturer’s observation and notes. Reports of the practical sessions | 10% | 30%
Theory and problems | Mastering of the theory and practical aspects of the course. | Final exam | 90% | 50%
Study of cases and exercises | Works and cases proposed in the practical sessions. | Deliverable problems | 0% | 20%
Table 3. Comparison of students’ performance in each course in 2019 (face-to-face teaching) and 2020 (online teaching).
Course | Registered Students 2020 | Passing Rate 2020 (%) | Dropout Rate 2020 (%) | Passing Rate 2019 (%) | Dropout Rate 2019 (%)
Strength of materials | 215 | 42 | 27 | 60 | 22
Construction management | 65 | 75 | 11 | 73 | 8
Dynamic and seismic analysis of structures | 15 | 100 | 0 | 91 | 9
Theory of structures | 42 | 50 | 19 | 44 | 22
Operations research I | 136 | 97 | 5 | 84 | 16
Operations research II | 152 | 98 | 1 | 84 | 12
Table 4. Results from the survey on student perceptions and degree of fulfillment of expectancies in the courses of strength of materials and construction management taught at UPM.
Item | Strongly Agree | Agree | Neutral | Disagree | Completely Disagree | Mean | Std. Deviation
(1) | 19.4% | 19.4% | 38.7% | 21.0% | 1.6% | 3.34 | 1.06
(2) | 3.2% | 16.1% | 45.2% | 35.5% | 0.0% | 2.87 | 0.79
(3) | 3.2% | 22.6% | 27.4% | 27.4% | 19.4% | 2.63 | 1.12
(4) | 1.6% | 21.0% | 38.7% | 29.0% | 9.7% | 2.76 | 0.95
(5) | 4.8% | 22.6% | 33.9% | 19.4% | 19.4% | 2.74 | 1.15
(6) | 3.3% | 36.1% | 44.3% | 14.8% | 1.6% | 3.25 | 0.80
(7) | 41.9% | 40.3% | 16.1% | 1.6% | 0.0% | 4.23 | 0.77
(8) | 16.1% | 40.3% | 41.9% | 1.6% | 0.0% | 3.71 | 0.75
(9) | 13.3% | 23.3% | 38.3% | 18.3% | 6.7% | 3.18 | 1.09
(10) | 8.2% | 41.0% | 32.8% | 13.1% | 4.9% | 3.34 | 0.97
(11) | 14.8% | 41.0% | 19.7% | 18.0% | 6.6% | 3.39 | 1.13
(12) | 3.3% | 24.6% | 50.8% | 21.3% | 0.0% | 3.10 | 0.76
(13) | 19.7% | 37.7% | 21.3% | 16.4% | 4.9% | 3.51 | 1.13
Table 5. Students’ perceptions of their goal achievement through the remotely oriented work in the course of dynamic and seismic analysis of structures taught at UPM.
Item | Strongly Agree | Agree | Neutral | Disagree | Completely Disagree | Mean | Std. Deviation
(1) | 40.9% | 40.9% | 15.2% | 3.0% | 0% | 4.20 | 0.80
(2) | 63.6% | 27.3% | 9.1% | 0% | 0% | 4.55 | 0.66
(3) | 27.3% | 54.5% | 18.2% | 0% | 0% | 4.09 | 0.67
(4) | 54.5% | 9.1% | 27.3% | 9.1% | 0% | 4.09 | 1.08
(5) | 45.5% | 45.5% | 9.1% | 0% | 0% | 4.36 | 0.64
(6) | 9.1% | 63.6% | 27.3% | 0% | 0% | 3.82 | 0.57
(7) | 27.3% | 54.5% | 9.1% | 0% | 9.1% | 3.91 | 1.08
(8) | 54.5% | 45.5% | 0% | 0% | 0% | 4.55 | 0.50
(9) | 54.5% | 27.3% | 0% | 18.2% | 0% | 4.18 | 1.11
(10) | 0% | 36.4% | 45.5% | 18.2% | 0% | 3.18 | 0.72
(11) | 9.1% | 72.7% | 18.2% | 0% | 0% | 3.91 | 0.51
(12) | 9.1% | 72.7% | 18.2% | 0% | 0% | 3.91 | 0.51
(13) | 45.5% | 36.4% | 18.2% | 0% | 0% | 4.27 | 0.75
Table 6. Feedback from the survey on students’ perceptions and degree of fulfilment of expectancies in the course of operations research I taught at UDEP.
Item | Strongly Agree | Agree | Neutral | Disagree | Completely Disagree | Mean | Std. Deviation
(1) | 13.7% | 42.5% | 39.7% | 1.4% | 2.7% | 3.63 | 0.84
(2) | 18.5% | 54.0% | 26.6% | 0.8% | 0% | 3.90 | 0.69
(3) | 57.5% | 30.1% | 2.7% | 0% | 9.6% | 4.26 | 1.18
(4) | 19.2% | 31.5% | 38.4% | 6.8% | 4.1% | 3.55 | 1.01
Table 7. Feedback from the survey on students’ perceptions and degree of fulfilment of expectancies in the course of operations research II taught at UDEP.
Item | Strongly Agree | Agree | Neutral | Disagree | Completely Disagree | Mean | Std. Deviation
(1) | 17.1% | 53.9% | 25.0% | 3.9% | 0% | 3.84 | 0.74
(2) | 15.7% | 57.9% | 24.8% | 1.7% | 0% | 3.88 | 0.68
(3) | 33.8% | 42.9% | 20.8% | 2.6% | 0% | 4.08 | 0.80
(4) | 62.3% | 31.2% | 3.9% | 2.6% | 0% | 4.53 | 0.69
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
