Special Issue "Application of New Technologies for Assessment in Higher Education"

A special issue of Education Sciences (ISSN 2227-7102). This special issue belongs to the section "Technology Enhanced Education".

Deadline for manuscript submissions: 30 September 2023

Special Issue Editors

Department of Computer Science, University of Warwick, Coventry CV4 7AL, UK
Interests: educational technology; computer science education; higher education; mobile learning
School of Education, University of Hull, Hull HU6 7RX, UK
Interests: technology-enhanced learning; blended learning; e-assessment; education future scenarios

Special Issue Information

Dear Colleagues,

The landscape in which HE institutions operate has been changing, with increasing numbers of students attending universities with diverse missions. As a result, traditional models of course delivery are being supplemented or replaced by flexible approaches assisted by new technologies, a process accelerated by the recent pandemic. Existing generic tools such as learning management systems and virtual learning environments are complemented by software that fulfils particular educational tasks, including learning analytics software, which supports the assessment process. The development of assessment tools for use within HE institutions gives rise to technical, pedagogic, and administrative challenges for the assessment process.

The goal of this Special Issue is to consider the implications of applying new technological approaches to the assessment process. The issue will relate to much of the recent literature on the educational impacts of the pandemic, to the potential of emerging technologies for assessment, and to the institutional challenges of universities operating in an increasingly global market. We welcome the submission of research papers which focus on technical, pedagogical, or administrative views of that process. Possible topics include but are not limited to the following:

  • Evaluation of the effectiveness of particular approaches;
  • Incorporation of new computer science techniques, such as learning analytics, artificial intelligence, and machine learning;
  • Student perceptions of the assessment process;
  • Implications for pedagogic theory;
  • Incorporation of assessment technologies into existing learning environments.

Prof. Dr. Mike Joy
Dr. Peter Williams
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, you can proceed to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a double-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Education Sciences is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1400 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • assessment
  • HE
  • higher education
  • university
  • technology
  • learning analytics

Published Papers (2 papers)


Research

Article
Microlearning for the Development of Teachers’ Digital Competence Related to Feedback and Decision Making
Educ. Sci. 2023, 13(7), 722; https://doi.org/10.3390/educsci13070722 - 15 Jul 2023
Abstract
The assessment and feedback area of the European Framework for the Digital Competence of Educators (DigCompEdu) establishes a specific competence related to the ability to use digital technologies to provide feedback and make decisions for learning. According to the literature, this particular competence is one of the least developed in the teaching profession. As there are few specialised training strategies in the field of information and communication technology (ICT)-mediated feedback, this study aims to validate a microlearning proposal for university teachers, organised in levels of progression following the DigCompEdu guidelines. To validate the proposal, a literature analysis was carried out and a training proposal was developed and submitted to a peer review process to assess its relevance. This study identifies the elements that should be included in a training strategy in the area of feedback and decision making for university contexts. Finally, it is concluded that this type of training requires a combination of agile and self-managed strategies (characteristics of microlearning), which can be complemented by the presentation of evidence and collaborative work with colleagues.
(This article belongs to the Special Issue Application of New Technologies for Assessment in Higher Education)

Article
Maintaining Academic Integrity in Programming: Locality-Sensitive Hashing and Recommendations
Educ. Sci. 2023, 13(1), 54; https://doi.org/10.3390/educsci13010054 - 03 Jan 2023
Abstract
Not many efficient similarity detectors are employed in practice to maintain academic integrity. Perhaps it is because they lack intuitive reports for investigation, they only have a command line interface, and/or they are not publicly accessible. This paper presents SSTRANGE, an efficient similarity detector with locality-sensitive hashing (MinHash and Super-Bit). The tool features intuitive reports for investigation and a graphical user interface. Further, it is accessible on GitHub. SSTRANGE was evaluated on the SOCO dataset under two performance metrics: f-score and processing time. The evaluation shows that both MinHash and Super-Bit are more efficient than their predecessors (Cosine and Jaccard with 60% less processing time) and a common similarity measurement (running Karp-Rabin greedy string tiling with 99% less processing time). Further, the effectiveness trade-off is still reasonable (no more than 24%). Higher effectiveness can be obtained by tuning the number of clusters and stages. To encourage the use of automated similarity detectors, we provide ten recommendations for instructors interested in employing such detectors for the first time. These include consideration of assessment design, irregular patterns of similarity, multiple similarity measurements, and effectiveness–efficiency trade-off. The recommendations are based on our 2.5-year experience employing similarity detectors (SSTRANGE's predecessors) in 13 course offerings with various assessment designs.
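To illustrate the MinHash idea the abstract refers to, the following is a minimal sketch of MinHash-based similarity estimation over token shingles. This is not SSTRANGE's actual implementation; all function names, the shingle size, and the number of hash functions are illustrative assumptions.

```python
import hashlib
import random

def shingles(tokens, k=3):
    """k-gram shingles over a token stream (e.g. a lexed program)."""
    return {tuple(tokens[i:i + k]) for i in range(len(tokens) - k + 1)}

def minhash_signature(shingle_set, num_hashes=64, seed=42):
    """MinHash signature: for each of num_hashes salted hash functions,
    keep the minimum hash value over all shingles in the set."""
    rng = random.Random(seed)
    salts = [rng.getrandbits(32) for _ in range(num_hashes)]
    signature = []
    for salt in salts:
        signature.append(min(
            int.from_bytes(
                hashlib.blake2b(repr((salt, s)).encode(),
                                digest_size=8).digest(), "big")
            for s in shingle_set))
    return signature

def estimated_similarity(sig_a, sig_b):
    """The fraction of matching signature slots approximates the
    Jaccard similarity of the underlying shingle sets."""
    return sum(a == b for a, b in zip(sig_a, sig_b)) / len(sig_a)

# Two superficially renamed code fragments, tokenised naively.
a = "int sum = 0 ; for ( int i = 0 ; i < n ; i ++ ) sum += a [ i ] ;".split()
b = "int total = 0 ; for ( int j = 0 ; j < n ; j ++ ) total += a [ j ] ;".split()
sim = estimated_similarity(
    minhash_signature(shingles(a)), minhash_signature(shingles(b)))
print(f"estimated Jaccard similarity: {sim:.2f}")
```

Because only fixed-length signatures are compared, pairwise comparison cost is independent of program size, which is the efficiency gain over direct set-based measures such as Cosine or Jaccard that the abstract reports.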
(This article belongs to the Special Issue Application of New Technologies for Assessment in Higher Education)
