Article

Rethinking Assessment: The Future of Examinations in Higher Education

by Kelum A. A. Gamage 1,*, Roshan G. G. R. Pradeep 2 and Erandika K. de Silva 3

1 Centre for Educational Development and Innovation, James Watt School of Engineering, University of Glasgow, Glasgow G12 8QQ, UK
2 Department of Education, University of Peradeniya, Peradeniya 20400, Sri Lanka
3 Department of Linguistics and English, University of Jaffna, Jaffna 40000, Sri Lanka
* Author to whom correspondence should be addressed.
Sustainability 2022, 14(6), 3552; https://doi.org/10.3390/su14063552
Submission received: 11 February 2022 / Revised: 15 March 2022 / Accepted: 16 March 2022 / Published: 17 March 2022

Abstract

The global higher education landscape has been significantly impacted by the COVID-19 pandemic, and the majority of universities now follow an online or hybrid mode of delivery. This presents substantial challenges for universities, particularly in conducting examinations, as traditionally most exams were held physically on campus. During the first wave of the pandemic, many universities had no option but to move online in a very short period of time, and consequently conducted exams online without transforming pedagogy or the structure of closed-book exams. Inevitably, non-proctored and unregulated examinations allowed room for students to collaborate and share material during online exams without being noticed by an invigilator, as they would be in physical exams. Online exams also leave room for students to find information online, which made preventing plagiarism a significant challenge. This paper investigates the practices used in both closed-book and open-book exams and identifies the challenges associated with the transition to online exams. It also identifies potential ways forward for future online exams that minimize opportunities for students to collaborate, plagiarize, and use online material. The findings of this study reveal that online examinations affect teachers and students differently: while teachers have mixed feelings about online exams, students are anxious about their grades and the technical hassle they experience in online exams. While the viva has emerged as a popular form of alternative assessment, students still feel the need to return to physical exams. None of the teachers who participated in this study discussed a psychosocial approach to education and exams during the pandemic. We conclude that there is a need for collaboration among social scientists, psychologists, psychosocial specialists, educationists, and humanities scholars for better educational policy and pedagogical practices during the pandemic.

1. Introduction

Many universities follow a conventional education system that requires the physical presence of students and teachers for academic activities. In the face of the COVID-19 pandemic, most universities were not fully prepared for an exigency of this scale. They were forced to comply with travel restrictions and close their campuses to prevent students from gathering, and then to consider alternative modes of delivery. This led to the cancellation of most academic activities, including exams. In the first wave of the COVID-19 pandemic, the transition from physical to online delivery was rapid; while some universities managed to transfer most of their activities online, others could not keep pace for various reasons, such as a lack of access to IT facilities and resources, teachers' more limited technical knowledge and experience compared to students, issues of trust and accountability, and the demands of managing daily life and human interactions amidst the pandemic [1]. It was a rude awakening that made most universities realize that both IT infrastructure and technical know-how are essential to managing teaching and learning during the COVID-19 pandemic. In a recent survey on crisis management, most universities highlighted the importance of the following factors [2]: online learning; international coordination and collaboration; strong university leadership; proactive, preventative measures; flexibility for assessment deadlines and exams; stricter sanitation initiatives; and clear communication from university leadership and administrators. With the lessons learned, most universities have now developed modules for instructors on how to teach online effectively and innovatively. However, only a few of these have been popularized and adopted beyond the instructor level into common use, since there is no proper strategy for disseminating and supporting innovative teaching [3].
From travel restrictions to social distancing, isolation measures, and the closure of universities and borders, students worldwide have been affected by the spread of COVID-19. Since the onset of the pandemic, most universities have tried to mitigate the loss of the semester by switching from face-to-face to online teaching to prevent students from gathering [4]. In a 2020 survey, most students preferred to defer their studies until 2021 [2]. During the first wave of the pandemic, conducting exams was a challenge for universities, especially where the online mode was not well practiced. Some universities adopted a "no-detriment" policy, under which students were assured that their final grade would not be lower than their average performance. In some instances, previous performance was considered in awarding grades, while other measures included altering the assessment format [5,6].
Two major types of physical exams conducted in universities are open-book exams and closed-book exams. Both are timed tests in which students answer different types of questions (MCQs, essays, short-answer, etc.). In open-book exams, students are allowed to refer to their notes or textbooks (or a self-prepared "cheat sheet") while taking the test. Closed-book tests require writing answers from memory, particularly retrieval from long-term memory [7]. Therefore, students with a well-organized memory may gain high marks on closed-book exams [8]. Here the test does not allow reading (or even bringing) any other material (books, notes, other printed or digital sources) into the examination hall. Alternative methods of assessment include in-class observations, take-home assignments, practical exams, projects, and portfolios [9].
Previous studies [10,11] have shown that students taking online exams report lower test anxiety than those taking in-class exams. This is possibly due to the high flexibility of online exams, where students can take the test when they feel more prepared, in a familiar environment with fewer anxiety-provoking stimuli. Students can also schedule the exam to better match their sleep/wake cycles, resulting in improved exam performance [12].
Despite the advantages of online examinations, there are inherent drawbacks to this method too. For example, there can be issues such as a lack of sufficient resources, hardware/software problems, an unreliable power supply, and the added cost of these facilities [13]. Some tests also have subject-related problems. For example, in mathematical calculations, students might need to transfer their paperwork to the computer in an online setup, which requires more time than simply deriving the answer on paper [14]. The same authors [14] note that teachers also have to master the necessary IT skills for creative online delivery, so their preparation time is likely to be longer; this may be equally true of the (online) marking of online examinations. Another concern for teachers is the increased temptation to cheat in an unsupervised setting [13,15]. A study by King et al. [16] shows that it is easier for students to cheat in online examinations.
This study aims to examine how assessment strategy needs to be rethought in the COVID-19 pandemic and to identify potential ways forward for future online exams that minimize opportunities for students to commit acts of academic misconduct. Section 2 explores the background to learning and assessment, including the three domains of learning that Benjamin Bloom introduced for developing curricula and educational goals, and Section 2.3 traces the practices used in physical exams. Section 3 examines the application of learning taxonomies in exams. Section 4 examines the challenges of transitioning to online exams, with a special focus on preventing academic misconduct during online exams. Section 5 explains the methodology used in this study, and Section 6 presents the findings, organized under two perspectives: the voice of teachers and the voice of students. The paper ends with Section 7, which offers a brief discussion of the research findings and the conclusion.

2. Background to the Study

2.1. Aligning Learning and Assessments

With the introduction of the Bologna Process in 1999, the pedagogical model in higher education shifted from mere knowledge accumulation to skills acquisition [17]. This change made the assessment of student learning outcomes the standard for describing qualifications and qualification structures. It is worth noting that the assessments for a particular course can be considered "valid" if they properly communicate what, and to what extent, students have learned from that course.
According to Poutasi, taking examples from New Zealand, students use digital technology in nearly every aspect of their education, so they must be assessed in a way that reflects their digitally enabled learning [18]. This means that traditional "pencil and paper" examinations may not give a complete representation of the academic capability of present-day students. The diversity of skills produced by the shift from literacy to multi-literacies therefore has didactic consequences for assessment [19]. Hence, it becomes the responsibility of teachers (as individuals) and universities (as entities) to properly match the type of assessment to the student achievement of learning outcomes to be measured in a course or program. Similarly, learning activities need to be prepared to align with the assessment types and the competencies described for the course.
On a practical note, a taxonomy table can be used to align these pedagogical aspects properly. The table focuses on the cognitive processes and the types of knowledge needed to achieve the standards in a course, so it reflects student learning rather than mere performance. It can also help analyze assessment results in terms of their possible impact on curriculum and instruction, guiding necessary revisions or improvements [20].
Furthermore, as pointed out by Ajjawi et al. [21], the design of assessments also needs to incorporate aspects of industry, in order to establish students' professional identities once they enter the world of work. A workflow model developed by Guerrero-Roldán and Noguera [22] aligns assessment with competencies, learning outcomes, activities, assessment criteria, indicators, and feedback. This model has promising implications for student- and competence-based learning and e-assessment.

2.2. Three Domains of Learning

In developing curricula and educational goals, a gradual progression of information and concepts needs to be considered, matched to the growing capacities of the developing student. The first person to develop such a classification and divide it into stages was Benjamin Bloom, an educational psychologist at the University of Chicago, in 1956. He developed "Bloom's taxonomy," a classification of the levels of study goals that teachers can set for their students. He divided the goals into three domains: cognitive, affective, and psychomotor. The classification follows a structural scheme in which learning a higher skill requires mastering the lower skill in the pyramid.

2.2.1. Cognitive Domain

Knowledge, comprehension, and the development of intellectual attitudes and skills fall into the Cognitive Domain. This domain contains skills mainly related to thinking processes such as information processing, understanding, applying knowledge, solving problems, and research [23]. It has six main levels, with the complexity of the mental operation increasing up the levels. Both Bloom's and Anderson's taxonomies were developed based on this domain.

2.2.2. Affective Domain

The Affective Domain includes the set of skills with which people deal with things emotionally, such as feelings, values, appreciation, enthusiasms, motivations, and attitudes. This domain is categorized into five sub-domains, viz. Receiving, Responding, Valuing, Organization, and Characterization [23]. Here, the student moves from being aware of what they are learning to a stage of having internalized the learning so that it plays a role in guiding their actions [24].

2.2.3. Psychomotor Domain

Although taxonomies for the Cognitive and Affective domains were developed early, the Psychomotor Domain was not fully described until the 1970s. This domain comprises the use and coordination of motor skills. Its categories include Imitation, Manipulation, Precision, Articulation, and Naturalization, indicating the development in learning from initial exposure to final, unconscious mastery [24].

2.3. Practices Used in Physical Exams

Some practices in two common types of examinations are summarized below:
Closed-book exams: In the closed-book type of tests, test takers are not allowed to view or peruse any source of information, and the test involves retrieving information from memory.
Open-book exams: In these tests, students may use and view any source of information (books, notes, reports, etc.) during the exam. Such tests appear to reduce stress, but research shows that students do not necessarily perform significantly better on open-book tests, as these exams seem to reduce students' motivation to study. Open-book tests tend to be inappropriate for introductory courses where facts must be learned, or skills thoroughly mastered, if the student is to progress to complicated concepts in advanced courses [25].
The routine practices in physically conducting exams include allocating space for seating arrangements, invigilation, checking the identity of exam candidates, checking for possession of any unauthorized materials, and distribution and collection of question papers/answer sheets on time. Following this, examiners will read and evaluate the answers for feedback or grading purposes.
During a pandemic such as COVID-19, additional requirements such as spacing, seating arrangements, and sanitation facilities add to the cost of conducting exams. Another challenge arises during student identity verification, as students are required to remove face masks for physical identification, which can in turn pose a health concern at a time when a respiratory disease prevails. This can also be disturbing for students who are pressed for time during the exam. Furthermore, when answer scripts are evaluated by a panel of examiners, there is some threat of fomite transmission as the papers pass from one person to another.

3. Learning Taxonomies—Application in Exams

A curriculum is prepared by incorporating learning objectives, which let students know what is expected of them in a course. Learning and teaching activities and assessments for a course should focus on achieving these learning objectives. Based on the teaching activities, the mode(s) of delivery (lectures, discussions, role play, presentations, activities such as practicals, group activities) can be decided together with the lesson materials (notes, slides, hand-outs, videos, audio, books, further reading, etc.). To realize a good course design, it is important to align these three aspects, viz. learning outcomes, activities, and assessments [19]. Otherwise, even the best teaching will not produce high student performance at the assessment level, raising questions about the validity or "use" of following that course [20].
For this purpose, learning taxonomies have been introduced. Learning taxonomies or classifications are usually used as a way of describing different types of learning behaviors and characteristics expected to be developed by students. Often, they are utilized to identify different steps in the learning development, and hence act as a useful tool in distinguishing the appropriateness of particular learning outcomes for particular courses [24].

3.1. Bloom (1956)

Bloom's taxonomy of educational objectives is a well-accepted account of the different types of learning in the cognitive domain and is widely used to develop learning objectives for teaching and assessment. It subdivides academic skills into six categories, viz. (1) knowledge, (2) comprehension, (3) application, (4) analysis, (5) synthesis, and (6) evaluation [26]. The first two categories do not require critical thinking skills, but the last four involve higher-order thinking that requires critical thought; "open-ended" questions fall into this category. The definitions show a smooth transition from theory and content to practice, whereby teachers can use specific exam designs to test students in each category. However, the correct use of higher-order thinking requires knowledge and comprehension of the basic content; thus, all levels of thinking need to be encouraged [27].

3.2. Bloom’s Taxonomy Revised—Taxonomy of Anderson et al. (2001)

Anderson and Krathwohl [28] revised the original Bloom's taxonomy, incorporating learning theory and practice to produce a two-dimensional scheme: the Cognitive Process Dimension and the Knowledge Dimension. Here, four types of knowledge (factual, conceptual, procedural, and metacognitive) are associated with the cognitive processes so that more learning objectives can be defined. This also shows that higher cognitive processes employ multiple types of knowledge content. The advantage is a clearer basis for formulating learning objectives and designing assessments for a course [20]. Nineteen cognitive processes have been identified, which can be used as verbs when writing learning objectives for a particular course.
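To make the taxonomy table mentioned in Section 2.1 concrete, below is a minimal sketch in Python (our illustration, not part of Anderson and Krathwohl's work). It uses the six top-level process categories of the revised taxonomy rather than all 19 cognitive processes, and the objective texts are hypothetical examples.

# A two-dimensional taxonomy table: knowledge types x cognitive processes,
# with each cell holding the learning objectives filed under it.
KNOWLEDGE_TYPES = ["factual", "conceptual", "procedural", "metacognitive"]
COGNITIVE_PROCESSES = ["remember", "understand", "apply",
                       "analyze", "evaluate", "create"]

taxonomy_table: dict[tuple[str, str], list[str]] = {}

def add_objective(knowledge: str, process: str, objective: str) -> None:
    """File a learning objective under its (knowledge, process) cell."""
    if knowledge not in KNOWLEDGE_TYPES or process not in COGNITIVE_PROCESSES:
        raise ValueError(f"Unknown cell: ({knowledge}, {process})")
    taxonomy_table.setdefault((knowledge, process), []).append(objective)

def uncovered_cells() -> list[tuple[str, str]]:
    """Cells with no objective yet: candidates for curriculum revision."""
    return [(k, p) for k in KNOWLEDGE_TYPES for p in COGNITIVE_PROCESSES
            if (k, p) not in taxonomy_table]

# Hypothetical course objectives placed in their cells:
add_objective("conceptual", "understand", "Explain the three domains of learning.")
add_objective("procedural", "apply", "Use the SOLO taxonomy to grade a sample answer.")
print(f"{len(uncovered_cells())} of 24 cells still lack an objective")

Objectives clustered in the remember/factual corner of such a table would signal that a course's assessments are not reaching higher-order processes, which is precisely the alignment check described in Section 2.1.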

3.3. The SOLO Taxonomy

A commonly used alternative to Bloom's cognitive domain is the Structure of Observed Learning Outcomes (SOLO) taxonomy developed by Biggs and Collis [29]. It comprises five levels for describing student performance in exams, viz. (1) Pre-structural, (2) Uni-structural, (3) Multi-structural, (4) Relational, and (5) Extended Abstract.
According to the SOLO taxonomy, structural complexity increases as student learning progresses. The taxonomy allows student performance to be judged both quantitatively (how detailed the responses to questions are) and qualitatively (how well the responses are structured) [29]. At the early stages of learning, the amount of detail dominates; as learning progresses, the judgment becomes more qualitative, with details combined into a structural complex characteristic of higher-order learning. The SOLO taxonomy can be applied to categorize student answers, so as to determine a student's capacity to coherently connect concepts and relate them to new ideas [29,30]. It can therefore serve as a model to encourage and develop a deeper approach to student learning [31].

3.4. Six Facets of Understanding (Wiggins and McTighe)

Grant Wiggins and Jay McTighe discussed the six facets of understanding in 1998. The framework was developed to help teachers determine whether students have a deep understanding of the concept or idea being taught. It has six levels of understanding, namely (1) Explain, (2) Interpret, (3) Apply, (4) Have perspective, (5) Empathize, and (6) Have self-knowledge. While one's understanding can be expanded by questions that arise from reflection, discussion, and application of ideas, a comprehensive and mature understanding involves the full development of all six facets [32].

3.5. Taxonomy of Significant Learning (Fink)

Fink's Taxonomy of Significant Learning (2003) comprises six key domains in which truly meaningful learning occurs: (1) Learning How to Learn, (2) Foundational Knowledge, (3) Application, (4) Integration, (5) Human Dimension, and (6) Caring. The taxonomy provides educators with a comprehensive structure for designing courses that produce "significant learning", that is, some kind of important, lasting change in the learner's life [33]. Fink's taxonomy is not hierarchical, and it includes both the cognitive and affective domains [24].

4. Challenges of Transition to Online Exams

Initially, the transition from physical to online teaching met considerable resistance from both students and academics, for reasons such as technical barriers, resistance to change, limited know-how in using new technology, and the policies of universities and their governing bodies [34,35,36].
Another issue was that staff were reluctant to move away from paper marking and to develop or convert questions to an electronic format, as this requires a lot of time and effort. The capacity to align classroom assignments by developing assessment rubrics is a meta-level skill not common to all faculty [37]. This too deters teachers from converting paper-based tests into an online format.
However, with some training and experience, and with growing appreciation of the possible advantages, online teaching and assessment have rapidly expanded over the past few semesters. A further increase in hybrid/blended learning programs is expected in the near future [38].
Combined with face-to-face teaching, online learning has definite benefits: a learning management system (LMS) can provide many more activities for students to engage in, and it can help assess student achievement, and how far students have attained the learning outcomes, more closely than exam performance alone. Online tools can therefore be richer ways of measuring learning and achievement.
However, students find different ways of cheating to get better grades in exams, even in online exams. These methods include sharing questions and answers among student groups in online exams that remain open and flexible to be taken at different times, plagiarism, impersonation, and referring to sources for information when the exam is designated "closed-book". Academic dishonesty has been categorized as academic fraud by Becker et al. [39]. According to King et al., cheating is "a transgression against academic integrity which entails taking an unfair advantage that results in a misrepresentation of a student's ability and grasp of knowledge" [16].
In the traditional environment, the usual ways of cheating include peeking, whispering, using technological devices, or concealed methods such as passing small pieces of paper, contract cheating, texting via phones, and using smart glasses and smartwatches [40,41]. However, the opportunity to cheat diminishes when all notes, electronic devices, and other materials are put away and students are watched over by invigilators [42].
One study has shown that random seat assignments, together with an increased number of invigilators, can eliminate the opportunity for cheating during physical examinations [43]. These traditional deterrents are not available in online exams; instead, technology itself presents students with chances to cheat: students can text, "chat" on social media, or email each other to compare answers, and they can copy and paste answers from their notes and the internet [44].
Many previous studies show that students cheat in exams, especially online exams [16,42,45,46,47]. Thus, it is essential to maintain integrity in online exams as well, both to preserve the integrity of education as a whole and to preserve the fairness of exams for all students [44]. Additional efforts must therefore be taken to prevent academic misconduct in online exams.

4.1. How to Prevent Academic Misconduct during Online Exams?

Academic integrity must be maintained at all times, irrespective of global pandemics; how it is ensured depends on how the higher education institution assesses the learning outcomes of a program. It is the responsibility of higher education institutions to remind students of the regulations already in place regarding academic integrity and to highlight that these continue to apply at all times. Several measures can be introduced to mitigate academic misconduct when conducting examinations online. Some key measures include the following:

4.2. Use of Software and Video Proctoring

Software is available that can detect plagiarism to catch a cut-and-paste job, and that can prevent other computer applications from being opened during the exam period. Software such as Blackboard's Respondus LockDown Browser can be used for students to access online exams [42]. Video proctoring software (ProctorU, Examity, etc.) can recreate physical invigilation in order to detect cheating. Students can take their exams at home in front of a webcam, which can be recorded for later review. The same can be achieved via Zoom or other video conferencing platforms. To address time constraints, the exam can be scheduled so that all students take it at the same time. Some issues with video proctoring involve security and privacy; students can also step away for a restroom break, during which they can consult other materials or people. If no video proctoring is done, students can still photograph the exam for future students, in which case the exam questions must be changed the next time.
Another issue in online exams is that students must confirm their identity to prevent others from taking the exam on behalf of the actual candidate (impersonation). This can be addressed by using passwords, thumbprints, or cornea scans to log in to the exam session [48].

4.3. Exam Design

Exam design is a key area to consider in addressing integrity concerns. To protect assessment integrity, the key is to avoid questions that can easily be answered, shared by students, or looked up during the exam, particularly in online exams that are not invigilated. Exam questions that require issue-spotting, problem-solving, and extensive reasoning are effective, as such questions demand higher-order thinking skills. Setting exam questions to test higher-order learning skills is another way to prevent students from cheating. According to Bissell and Lemons, questions that demand both content knowledge and critical thinking can improve student metacognition while being easily modifiable to accommodate unpredicted answers [27]. With a refined scoring rubric, such questions can be marked rapidly and reliably; the only drawback is that preparing the questions and answer schemes takes time. For written assignments, combining straight-answer questions with brief explanations or reasoning can help students develop deep learning skills [49]. To prevent students from collaborating, the exam can be set to have a limited login time, present questions sequentially without back-tracking, randomize exam questions and answers for different students, and change exam questions every term [42]. A study by Suryani shows that individualized questions for accounting students can be successfully prepared and scored in an Excel spreadsheet, based on their student ID numbers [11], as sketched below.
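The individualized-question idea can be illustrated with a minimal Python sketch (our own hedged illustration: Suryani's implementation [11] used an Excel spreadsheet, and the parameter names and formula below are hypothetical). Seeding a random number generator with the student ID and question number gives every student different values while letting the marker regenerate exactly the same parameters later:

import random

def question_parameters(student_id: str, question_no: int) -> dict:
    """Derive stable, per-student parameters for one question."""
    # A deterministic seed means the paper can be regenerated for
    # re-marking, yet the values differ from student to student.
    rng = random.Random(f"{student_id}-{question_no}")
    return {
        "load_kN": rng.randint(50, 150),             # hypothetical values
        "span_m": round(rng.uniform(2.0, 8.0), 1),
    }

def expected_answer(params: dict) -> float:
    """The marker recomputes the answer from the same parameters."""
    return params["load_kN"] * params["span_m"] / 4  # illustrative formula

params = question_parameters("E/17/123", question_no=3)
print(params, "->", expected_answer(params))

The same seeding trick also covers the randomized question order recommended above: shuffling the question list with a per-student seed yields a different but reproducible order for each candidate.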
At present, web tools such as TALOE have been developed to help teachers determine the most appropriate type of e-assessment by analyzing the learning outcomes of the module or course being taught [17]. With these, teachers can modify the exam design, or even the learning outcomes already in place, to better align with the course or program objectives. When using questions from pre-published test banks, it is recommended that teachers paraphrase the questions to prevent students from finding answers directly on the internet [47].

4.4. Additional or Other Assessments

Additional assessments conducted throughout the semester can reduce the weight of the final exam, thereby reducing students' need or opportunity to cheat during final exams. Students can focus on understanding fewer learning outcomes at a time, which reduces their pressure. Further, continuous assessment helps teachers monitor whether students have understood what is taught.

4.5. Using Honour Codes, Warnings, and Penalties

Another critical aspect of maintaining academic integrity is advising students and making them knowledgeable, to inculcate the competencies and values required for ethical learning. It is recommended that faculty and universities have clear policies for identifying and penalizing academic misconduct. Honor codes, guidelines, penalties, etc. must be available to students in print and openly discussed with them, so that they remain aware and responsible at all times [50]. In this way a "culture" of academic honesty can be established within universities, helping students realize its long-term benefits. But honor codes alone may be insufficient. An investigation by Corrigan-Gibbs et al. found that a prior warning about the consequences of cheating deters misconduct by about 50% compared with using honor codes, as warnings tend to "scare off" students [45].

5. Methodology

This study employed a small-scale conventional interview method to investigate staff and student perceptions of online exams. The research sample comprises staff members and students who voluntarily participated in an interview, and the study analyzes the interview responses of all participants. Table 1 shows the details of the staff members and students interviewed.
The participants have experience in either conducting or sitting online exams. In terms of the structure of exam questions, all of them have experience with a broad spectrum of question types, from MCQs to short answers, essays, and practical tests.
The interview questionnaire compared online and physical exams in terms of the time required for preparation and conducting (for teachers), the time required for answering (for students), and assessment security and question format (for both groups). The questionnaire also explored the difficulties faced by the participants and gathered suggestions for improvement.

6. Research Findings

6.1. Voice of Teachers

6.1.1. A Common Approach to Conducting Online Exams

All teachers interviewed in this survey followed a common approach: students register and connect online via an LMS, with a proctoring tool installed before the test. The questions are then loaded on the computer screen, and students type their answers into the portal. Alternatively, students write the answers on paper for the questions they see on screen, take snapshots of the answer scripts, and then upload a PDF version of the snapshots to the portal.

6.1.2. The Preparation of Online Exam Papers Is More Time-Consuming than That of Physical Exams

All lecturers stated that it is more time-consuming for them to prepare online exam papers. All academic staff who participated in the interviews revealed that they individually customize certain questions in each paper and upload the questions one by one to an online portal. With network traffic, preparing and uploading exam papers has become a time-consuming process. Both Participant 1 and Participant 2 revealed that they use student ID numbers to generate questions with different parameters. Participant 1 elaborates:
I spend a lot more time setting up online exam questions because students copy, right? They have Whatsapp groups etc. They simply chat and circulate answers. So, to avoid that, I try my best to customize the questions so that different students will get different parameters in the questions. So, I use student index numbers and change the values in certain questions so that students will get different diagrams and values as answers. which will prevent the students from copying… and I have found that effective. So, it takes a lot of time to prepare exam questions. In physical exams, you prepare only one paper…
Further, the other teachers interviewed also revealed that they prepare open-ended questions for online exams to prevent students from copying directly from the internet.

6.1.3. Conducting Exams and Enabling Security Measures in Online Exams via Various Tools

In terms of exam types, all lecturers mentioned using a learning management system (LMS) to post the questions in online exams, and invigilators to monitor physical exams. Two participants discussed using the WatchGuard software as a proctoring tool in online exams and informing students in advance about the consequences of cheating. One participant shared the proctoring experience thus:
In the proctoring tool, students must sign the agreement form, and then take a picture with their ID and read their details with the camera and microphone on. Once it is opened, they will have to keep the application open till the end of the exam. During that period, we randomly capture the screen and record the sounds. In that case, students have to switch on the camera and unmute the microphone. We make students aware of what will happen if they cheat. Before exams, we explain [it to them], so I think because of the proctoring tools and the awareness, they did not cheat…
Another participant stated that the proctoring tool gives academic staff access to students' cameras, microphones, and the window(s) active on their computers. Furthermore, screenshots and audio clips are taken at certain intervals during the exam if any movements are detected. Another strategy that academic staff employ to prevent plagiarism in online exams is using open-ended questions, whereas in physical exams direct answers are expected. All participants emphasized the importance of questions that require critical and analytical thinking skills, so that students cannot find the answers on Google.
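The random screen capture that participants describe can be illustrated with a short Python sketch (hypothetical only; we do not know how the WatchGuard tool mentioned by the participants is implemented). Pillow's ImageGrab module supports screen capture on Windows and macOS:

import random
import time
from datetime import datetime

from PIL import ImageGrab  # pip install Pillow

def capture_at_random_intervals(duration_s: int,
                                min_gap_s: int = 60,
                                max_gap_s: int = 300) -> None:
    """Save timestamped screenshots at unpredictable moments during an exam."""
    end = time.time() + duration_s
    while time.time() < end:
        # Unpredictable gaps are the point: candidates cannot anticipate
        # when the next capture will occur.
        ImageGrab.grab().save(f"proctor_{datetime.now():%H%M%S}.png")
        time.sleep(random.randint(min_gap_s, max_gap_s))

# For a two-hour exam: capture_at_random_intervals(2 * 60 * 60)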

6.1.4. Student Worries over Technical Issues in Online Exams

The lecturers mentioned no tech-related issues on their own side, but they pointed out power failures and low internet bandwidth that had interrupted students during exams. As a solution, certain teachers gave students extra time to upload their answers via the portal. Another participant recalled including a late-submission link for students with low-bandwidth internet access.

6.1.5. Challenges to Assessment Security and Academic Integrity in Online Exams

All lecturers interviewed are dissatisfied with online exams and believe that online exams pose a threat to maintaining fairness and ensuring academic integrity. One participant emphasized that online exams should be protected against plagiarism, collusion, and other forms of academic misconduct before, during, and after exams. Another academic reported capturing the screen of a student who had paused the camera using some kind of technology:
They try to escape always…I had one incident where we captured a screen of a student pausing (the camera). I don’t know what caused it but they used something to cheat, because that student’s pause is always the same for the entire two hours of the examination…

6.1.6. Viva as a Popular Alternative Assessment

One participant proposed having a short viva session post-exam:
I know it’s difficult and time-consuming, but if we can have follow-up exams with a short viva session, we can tally the answers and check whether it is the same student who has answered the exam questions… It is good because we can test whether the student has knowledge or not… I think, in some foreign countries, they stick to only MCQ questions, and they randomize the questions and answers so that the students will end up receiving a different set of questions and answers.
Participant 5 has an advantage owing to her subject:
In my subject, I can do a lot of practical stuff (presentation, reports, surveys, etc.).. so I can mitigate plagiarism because it involves individual work… so in my case, it’s really difficult to copy or plagiarise…

6.1.7. Back to Physical Exams after the Pandemic

Eighty percent of the respondents wanted to go back to the traditional physical method of conducting exams. In contrast, one academic spoke in favor of online exams and explained why online exams must be the new norm:
The government has a responsibility to expand internet access to all, treating it as an essential commodity. If properly set, online exams are a good way of continuing exams in the future as well. We cannot expect written exams to stay in the future, because you can see all over the world that there are so many other options coming in for examinations and studies. Installing checks and installing methods to maintain integrity (in online exams) would be the way to solve this problem in my opinion…

6.2. Voice of Students

6.2.1. Similar Experiences, Different Institutes

The students who participated in this interview are five undergraduates and three postgraduate students from different higher education institutes across Sri Lanka (Table 1). All participants have experience typing answers into an LMS, or answering questions shown on screen by writing on paper and then uploading a PDF image of the answer script to an online portal. Regarding technical issues, many respondents cited power failures and poor internet speed. In addition, one participant mentioned the possibility of overlooking questions while scrolling down the question page in online exams. In terms of security measures, most respondents were asked to switch on their cameras and microphones and stay connected until the exam ended. However, one respondent had only taken open-book exams, so she was not asked to switch on her camera or microphone during the exam. Overall, Participants 12 and 13 were satisfied enough with the online exams, while Participants 8 and 9 did not have a clear response owing to the nature of the question papers they had answered. One participant, a Literature student, had to cite her references during her exams. Notably, all respondents stated that physical exams were much more secure than online exams, and one participant mentioned that even with the camera on, students may have another device or sticky notes in front of them.

6.2.2. Improving Student Experience in Online Exams

When asked for suggestions for improvement, Participant 7 and Participant 10 mentioned time constraints, while one participant expressed the need for better internet connectivity. Another participant proposed having breakout rooms for online invigilation:
We don’t have a good exam system. Video conference systems are good for lectures, but (during exams) if we can break the students into 10 and put them into breakout rooms and assign invigilators that would be good…

6.2.3. Greatest Challenges Faced in Lockdown

Many students expressed discontent over poor internet connectivity, especially during the first wave of the pandemic. One participant emphasized losing touch with friends and not having the chance to study in groups:
Before (the pandemic) we had friends studying together—there was a study group. This time we had to do everything alone, and that was challenging.
Two participants expressed how challenging it was to find quiet places to study, while one participant was stressed out about exam preparation:
We were scared whether we could finish it on time, and what the questions will be like because definitely, they will not be like exam questions in physical exams…and we did not do even one practical in the laboratory. So, it was difficult to answer theory questions based on practicals, because we did not have a proper idea about how to do the experiments…

6.2.4. Students Prefer Alternate Exams

All the respondents liked the idea of having alternative exams. In Participant 6’s words:
Quizzes and assignments will help polish our knowledge and they will support our exams. So, final marks can be taken [computed] from all these activities, so if there are any issues with online exams the [negative] effects will be reduced with these options.
When asked their opinion on alternate exams, three participants explained that alternate exams are an effective means of improving their knowledge:
Participant 8: Reports and presentations will be more effective than exams.
Participant 11: I would say viva and MCQs are very good, but in elaborate assignments where you do a lot of research, you learn a lot… so there’s some sort of hard work going into that assignment…
Participant 10: Group projects are better than exams…because we did a math lab project and I learned many things I couldn’t have learned by studying for an exam…

6.2.5. Let’s Go Back to “Normal”

When asked how they would like to sit their exams after the pandemic is over, all respondents preferred going back to physical examinations. This mirrors the response of the majority of the teachers.

7. Discussion and Conclusions

The closure of higher education institutes has resulted in students' academic loss, and mitigating this loss seems even more difficult. It is imperative that both teaching and assessment strategies be shaped to reduce academic loss during a time of hardship [51]. What is more disheartening is that certain students experience academic loss two- or three-fold: not all students are affected by the pandemic in the same way, and depending on personal circumstances, certain student groups are doubly or triply marginalized.
The responses from teachers and students show that different tests, open- and closed-book questions, and different exam structures (from MCQs to essays) are used. These test different skills and levels of the three domains and different learning taxonomies [25] while preserving academic integrity and improving student knowledge. This research reveals that the traditional three-hour papers are no longer appealing to students; instead, they prefer to improve their knowledge while earning marks through alternative assessment methods. Hence, teachers are expected to accept new teaching practices and evaluation methods [51]. The academics who participated in this study have recognized the importance of spending more time customizing exam questions. Although the use of proctoring tools is appreciated by students and teachers alike, both groups still wish to revert to paper-based exams, as both parties are not yet confident with technology. This reiterates the need for good IT accessibility, especially to reap the benefits of online education [1].
As the teachers reflected, students are the most affected during the pandemic, and in these interviews even the teachers mention student difficulties ahead of their own. It is clear that the academics who participated in this research are sensitive to student concerns, although challenges such as poor internet connectivity and power issues are beyond their control. Yet it is also noteworthy that the teachers who prepare innovative exam papers for online exams are probably teachers who have never faced online exams themselves. If so, the question arises of how these teachers test the achievability, or un-achievability, of the benchmarks they set in their exams. All the teachers who participated in this study reveal how they secure their exams against academic misconduct and exam offenses by making the exam questions open-ended. Yet none of them discussed the different dynamics involved in marking the innovative and hybrid papers they design. A question worth asking is whether these teachers employ the same old modes of assessing and grading these papers, and whether marking and grading in the pandemic is strict or shows any leniency, considering the many challenges students face in these difficult times.
Furthermore, although all universities understandably face the challenge of maintaining standards, a question that needs to be raised is how violent these "standards" are in light of the different experiences of different students in the pandemic. Are these online tests and exams fair for all students? Are standards and benchmarks fair when individual differences and circumstances are discounted? This study reveals the efforts of teachers to maintain academic integrity in the face of the COVID-19 pandemic. Yet none of these participants discussed a psychosocial approach to education and exams in this pandemic. Against this backdrop, the need for social scientists, psychologists, psychosocial specialists, educationists, and humanities scholars is greatly felt. Education policy and practice in the pandemic would benefit from the effective collaboration of these groups.
At the onset of the pandemic, one point became clear: every institution needs to develop an overall digital learning strategy, based on academic program levels, that pictures how teaching and learning should be designed under the "new normal". The strategy must take into account which plans work and which do not, what works for whom, and what does not. Measures must also be taken to improve diversity and inclusivity in exam design. This is the prime responsibility of the university administration if it is to become resilient and provide continuous education for students with diverse experiences and needs.
This research fills a gap in the existing literature on rethinking assessment strategy during the COVID-19 pandemic. However, the study has its limitations: the research sample consists of only 13 participants, with one participant each from 13 disciplines across faculties. The study may therefore have overlooked many disciplines that used online examinations during the pandemic. Future research in this area could take this as a point of departure and build upon the experiences of students from a diverse range of disciplines.

Author Contributions

All authors contributed equally and substantially to this article. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Anu, V. Online Learning Challenges & Solutions Read All about It. 2021. Available online: https://www.embibe.com/exams/online-learning-challenges-and-solutions/amp/ (accessed on 19 June 2021).
  2. Quacquarelli Symonds. How COVID-19 Is Impacting Prospective International Students. 2020. Available online: https://www.qs.com/portfolio-items/how-covid-19-is-impacting-prospective-international-students-across-the-globe/ (accessed on 15 June 2021).
  3. Southwell, D.; Gannaway, D.; Orrell, J.; Chalmers, D.; Abraham, C. Strategies for effective dissemination of the outcomes of teaching and learning projects. J. High. Educ. Policy Manag. 2010, 32, 55–67.
  4. Gamage, K.A.A.; de Silva, E.K.; Gunawardhana, N. Online delivery and assessment during COVID-19: Safeguarding academic integrity. Educ. Sci. 2020, 10, 301.
  5. QAA. ‘No Detriment’ Policies: An Overview. 2020. Available online: https://www.qaa.ac.uk/docs/qaa/guidance/no-detriment-policies-an-overview.pdf (accessed on 25 June 2021).
  6. QQI. Guiding Principles for Alternative Assessment (Devised in Response to the COVID-19 Emergency Restrictions). 2020. Available online: https://www.qqi.ie/Downloads/Guiding%20Principles%20for%20Alternative%20Assessment%20%28COVID-19%29%2018-11-20.pdf (accessed on 24 July 2021).
  7. Rummer, R.; Schweppe, J.; Schwede, A. Open-book versus closed-book tests in university classes: A field experiment. Front. Psychol. 2019, 10, 463.
  8. Black, P.J. University examinations. Phys. Educ. 1968, 3, 93–99.
  9. Ilgaz, H.; Gülgün, A.A. Providing online exams for online learners: Does it really matter for them? Educ. Inf. Technol. 2020, 25, 1255–1269.
  10. Stowell, J.; Bennett, D. Effects of online testing on student exam performance and test anxiety. J. Educ. Comput. Res. 2010, 42, 161–171.
  11. Suryani, A.W. Individualized Excel-based exams to prevent students from cheating. J. Account. Bus. Educ. 2016, 5, 14–24.
  12. Hartley, J.; Nicholls, L. Time of day, exam performance and new technology. Br. J. Educ. Technol. 2008, 39, 555–558.
  13. Tippins, N.T.; Beaty, J.; Drasgow, F.; Gibson, W.M.; Pearlman, K.; Segall, D.O. Unproctored Internet testing in employment settings. Pers. Psychol. 2006, 59, 189–225.
  14. Laine, K.; Anderson, M. Electronic exam in electronics studies. In Proceedings of the 44th SEFI Conference, Tampere, Finland, 12–15 September 2016; pp. 12–15.
  15. Bloemers, W.; Oud, A.; Dam, K.V. Cheating on unproctored Internet intelligence tests: Strategies and effects. Pers. Assess. Decis. 2016, 2, 21–29.
  16. King, C.G.; Guyette, R.W.; Piotrowski, C. Online exams and cheating: An empirical analysis of business students’ views. J. Educ. Online 2009, 6, 1–11.
  17. Gil-Jaurena, I.; Softic, S.K. Aligning learning outcomes and assessment methods: A web tool for e-learning courses. Int. J. Educ. Technol. High. Educ. 2016, 13, 17.
  18. Poutasi, K. SPANZ 2017. Available online: https://www.nzqa.govt.nz/assets/About-us/Future-State/NZQA-SPANZ-address-2017.pdf (accessed on 25 July 2021).
  19. Cartner, H.; Hallas, J. Aligning assessment, technology, and multi-literacies. E-Learn. Digit. Media 2020, 17, 131–147.
  20. Airasian, P.W.; Miranda, H. The role of assessment in the revised taxonomy. Theory Pract. 2002, 41, 249–254.
  21. Ajjawi, R.; Tai, J.; Nghia, T.L.H.; Boud, D.; Johnson, L.; Patrick, C.J. Aligning assessment with the needs of work-integrated learning: The challenges of authentic assessment in a complex context. Assess. Eval. High. Educ. 2020, 45, 304–316.
  22. Guerrero-Roldán, A.E.; Noguera, I. A model for aligning assessment with competences and learning activities in online courses. Internet High. Educ. 2018, 38, 36–46.
  23. Hoque, E.M. Three domains of learning: Cognitive, affective and psychomotor. J. EFL Educ. Res. 2016, 2, 45–52.
  24. O’Neill, G.; Murphy, F. Guide to Taxonomies of Learning. UCD Teaching and Learning, 2010. Available online: http://www.ucd.ie/t4cms/UCDTLA0034.pdf (accessed on 14 July 2021).
  25. Clay, B. Is This a Trick Question? A Short Guide to Writing Effective Test Questions. Kansas Curriculum Center, USA, 2001. Available online: https://kgi.contentdm.oclc.org/digital/collection/p16884coll42/id/147/ (accessed on 5 July 2021).
  26. Bloom, B.; Engelhart, M.; Furst, E.; Hill, W.; Krathwohl, D. Taxonomy of Educational Objectives: The Classification of Educational Goals. Handbook I; Longmans, Green & Co.: New York, NY, USA; Toronto, ON, Canada, 1956.
  27. Bissell, A.N.; Lemons, P.P. A new method for assessing critical thinking in the classroom. BioScience 2006, 56, 66–72.
  28. Anderson, L.W.; Krathwohl, D.R. A Taxonomy for Learning, Teaching and Assessing; Longman: New York, NY, USA, 2001.
  29. Leung, C.F. Assessment for learning: Using SOLO taxonomy to measure design performance of Design & Technology students. Int. J. Technol. Des. Educ. 2000, 10, 149–161.
  30. Newton, G.; Martin, E. Research and teaching: Blooming, SOLO taxonomy, and phenomenography as assessment strategies in undergraduate science education. J. Coll. Sci. Teach. 2013, 43, 78–90.
  31. Lucander, H.; Bondemark, L.; Brown, G.; Knutsson, K. The structure of observed learning outcome (SOLO) taxonomy: A model to promote dental students’ learning. Eur. J. Dent. Educ. 2010, 14, 145–150.
  32. Wiggins, G.; McTighe, J. Understanding by Design; Association for Supervision and Curriculum Development: Alexandria, VA, USA, 1998; pp. 85–97.
  33. Branzetti, J.; Gisondi, M.A.; Hopson, L.R.; Regan, L. Aiming beyond competent: The application of the taxonomy of significant learning to medical education. Teach. Learn. Med. 2019, 31, 466–478.
  34. Wuthisatian, R. Student exam performance in different proctored environments: Evidence from an online economics course. Int. Rev. Econ. Educ. 2020, 35, 100196.
  35. Milone, A.S.; Cortese, A.M.; Balestrieri, R.L.; Pittenger, A.L. The impact of proctored online exams on the educational experience. Curr. Pharm. Teach. Learn. 2017, 9, 108–114.
  36. Williams, J.B.; Wong, A. The efficacy of final examinations: A comparative study of closed-book, invigilated exams and open-book, open-web exams. Br. J. Educ. Technol. 2009, 40, 227–236.
  37. Jordan-Fleming, M.K. Excellence in assessment: Aligning assignments and improving learning. Assess. Update 2017, 29, 10–12.
  38. Güzer, B.; Caner, H. The past, present and future of blended learning: An in-depth analysis of literature. Procedia Soc. Behav. Sci. 2014, 116, 4596–4603.
  39. Becker, D.; Connolly, J.; Lentz, P.; Morrison, J. Using the business fraud triangle to predict academic dishonesty among business students. Acad. Educ. Lead. J. 2006, 10, 37.
  40. Lancaster, T.; Clarke, R. Rethinking assessment by examination in the age of contract cheating. In Plagiarism across Europe and Beyond; ENAI: Brno, Czech Republic, 2017; pp. 215–228.
  41. Trost, K. Psst, have you ever cheated? A study of academic dishonesty in Sweden. Assess. Eval. High. Educ. 2014, 34, 367–376.
  42. Cluskey, G.R.; Ehlen, C.R.; Raiborn, M.H. Thwarting online exam cheating without proctor supervision. J. Acad. Bus. Ethics 2011, 4, 1–8.
  43. Lin, M.J.; Levitt, S.D. Catching cheating students. Economica 2020, 87, 885–900.
  44. Ryznar, M. Giving an Online Exam (2 September 2020). Indiana University Robert H. McKinney School of Law Research Paper No. 2020-16. Available online: http://doi.org/10.2139/ssrn.3684958 (accessed on 25 July 2021).
  45. Corrigan-Gibbs, H.; Gupta, N.; Northcutt, C.; Cutrell, E.; Thies, W. Deterring cheating in online environments. ACM Trans. Comput.-Hum. Interact. 2015, 22, 1–23.
  46. Chirumamilla, A.; Sindre, G.; Nguyen-Duc, A. Cheating in e-exams and paper exams: The perceptions of engineering students and teachers in Norway. Assess. Eval. High. Educ. 2020, 45, 940–957.
  47. Golden, J.; Kohlbeck, M. Addressing cheating when using test bank questions in online classes. J. Account. Educ. 2020, 52, 100671.
  48. Karim, N.A.; Shukur, Z. Review of user authentication methods in online examination. Asian J. Inf. Technol. 2015, 14, 166–175.
  49. Bearman, M.; Dawson, P.; O’Donnell, M.; Tai, J.; Jorre, T.J.D. Ensuring Academic Integrity and Assessment Security with Redesigned Online Delivery. 2020. Available online: http://dteach.deakin.edu.au/2020/03/23/academic-integrity-online/ (accessed on 25 July 2021).
  50. QAA. Assessing with Integrity in Digital Delivery. 2020. Available online: https://www.qaa.ac.uk/docs/qaa/guidance/assessing-with-integrity-in-digital-delivery.pdf (accessed on 27 June 2021).
  51. Ashri, D.; Sahoo, B.P. Open book examination and higher education during COVID-19: Case of University of Delhi. J. Educ. Technol. Syst. 2021, 50, 73–86.
Table 1. Details of staff and student interviewees.

| Staff | | Students | |
| Name | Discipline | Name | Discipline |
| Participant 1 | Civil Engineering | Participant 6 | Electronic Engineering |
| Participant 2 | Electrical and Electronic Engineering | Participant 7 | Civil Engineering |
| Participant 3 | Information and Communication Engineering | Participant 8 | Agriculture and Plantation Engineering |
| Participant 4 | Chemistry and Nanotechnology | Participant 9 | Literature |
| Participant 5 | Communication skills, English, French | Participant 10 | Automobile Technology |
| | | Participant 11 (MSc. student) | Food Science and Technology |
| | | Participant 12 (MSc. student) | Biotechnology |
| | | Participant 13 (Ph.D. student) | Financial Engineering |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

