Article

University-Wide Digital Skills Training: A Case Study Evaluation

by Nabila A. S. Raji 1, David A. Busson-Crowe 1 and Eleanor J. Dommett 1,2,*
1 Centre for Technology Enhanced Learning, King’s College London, London SE1 9NH, UK
2 Department of Psychology, Institute of Psychiatry, Psychology and Neuroscience, King’s College London, London SE5 8AF, UK
* Author to whom correspondence should be addressed.
Educ. Sci. 2023, 13(4), 333; https://doi.org/10.3390/educsci13040333
Submission received: 10 February 2023 / Revised: 16 March 2023 / Accepted: 21 March 2023 / Published: 23 March 2023

Abstract: Digital competencies and confidence are thought to be critical to success in higher education. However, despite learning frequently taking place online through virtual learning environments and tools such as lecture capture, and despite evidence countering the idea of digital nativity, these critical skills are often not explicitly taught at university. In the present study, we describe the development and evaluation of our Essential Digital Skills programme, a university-wide digital training programme designed and implemented at a large London university, aimed at new students but open to all students at the university. Using Kirkpatrick’s evaluation model, we demonstrate that the programme provided effective training in digital skills for all students but that individual differences exist in the training experience, notably around ethnicity and student status: Black and Minority Ethnic (BAME) students and international students felt that the training made a greater contribution to their skill levels and resulted in greater behaviour change and impact, as well as a stronger intention to undertake further training.

1. Introduction

Teaching and learning in Higher Education are subject to a range of influences. In recent years, one of the most influential factors has been the availability of digital technologies to support learning, forcing the continuous modernisation of the sector [1,2]. Universities now routinely use a virtual learning environment (VLE) with a range of built-in tools and add-ons such as lecture capture [3,4]. These tools can provide enhanced learning opportunities, such as easy access to quality resources and flexible learning [5,6]. Reliance on digital tools for learning increased abruptly in response to the COVID-19 pandemic; however, given the movement in that direction beforehand, this trend towards increasing digital education is likely to continue post-pandemic. Research certainly indicates that a shift to blended learning, which combines online learning with face-to-face experiences, may be beneficial in terms of student retention and engagement [7,8,9] as well as attainment [10,11]. Additionally, employers are increasingly seeking digitally literate graduates [12]. Given this, it is perhaps unsurprising that the ability of university students to effectively utilise digital education opportunities has been suggested to be critical to their success at university and beyond [13].
Irrespective of whether we see a return to pre-COVID practices or a more rapid adoption of blended and online learning, the increased reliance on digital technologies in Higher Education brings with it significant challenges. One key challenge is ensuring that students have sufficient digital capabilities or competencies to learn in a digital environment [14]. Digital competency in university students can be defined as the knowledge, skills and attitudes that they must possess in order to effectively use digital technologies to evaluate, consume and produce learning information, and to communicate and collaborate with others for learning [1,15]. Digital competencies or capabilities do not exist only at an individual level: JISC, the national organisation providing network and IT services and digital resources in support of further and higher education institutions, states that these are the skills and attitudes needed by both individuals and organisations in order to thrive [16].
A common misconception is that these competencies do not need to be explicitly taught because today’s students come to university digitally equipped. This stems from the idea of digital nativity, which holds that generations born after 1980 grew up with access to computers and the internet and are therefore inherently technology-savvy [17,18,19,20]. Digital nativity has been assumed for millennial and Gen Z learners; however, studies show that students use a limited range of mainly established technologies [21]. Furthermore, although students in the millennial and Gen Z age brackets may have been surrounded by technology during their lifetimes, there is little evidence to suggest that they approach learning differently because of this [22]. Additionally, it is suggested that proficiencies, where they exist, are hugely diverse, meaning that what one student is capable of, another may not be, even if both can be considered millennials or Gen Z [23]. Moreover, older students also make up a significant proportion of those attending university [24]; therefore, even if digital natives were appropriately equipped, they are not the only ones studying. In addition, students from widening participation backgrounds, who may not have had so much exposure to digital tools, now make up around 12% of students studying in the UK [24]. It is therefore apparent that there is a need to teach and develop digital competencies in students, rather than assume that they already have them.
When developing skills training for students, there are a range of approaches that can be taken. For example, skills training could form part of induction activities or continuous development throughout degree studies. It could also be embedded into the core curriculum and taught at a departmental or programme level or delivered in a university-wide manner. Evidence suggests that digital competencies take a considerable amount of time to develop [25], meaning that it is unlikely they can be successfully taught and developed in brief induction periods, and should instead be part of a longer programme. It has been argued that an embedded approach is preferable for skills development because the skills are then integrated into the rest of the students’ experiences [26,27]. Conversely, it has also been suggested that where something is fundamental or mandatory, as digital competency training may be considered, it is better delivered at a university-wide level, because this is most cost-effective and the need spans multiple disciplines [28]. A university-wide approach also allows a consistent delivery of content. Irrespective of the exact approach, previous research indicates that students generally show low engagement with skills development [29,30,31,32], meaning that any training will need to be carefully designed to increase engagement, for example, through the use of multimedia [33] and animation [34], low-stakes assessment [35] and formal recognition of engagement [36].
The present study describes the development and evaluation of a digital skills training programme at a large UK university. The programme was primarily designed to be taken by students new to the university; therefore, it did not assume any prior competencies or familiarity with university systems. The programme was evaluated using a survey including measures from an established training evaluation model and bespoke questions to assess student attitudes towards the training and their current skill level. The evaluation aimed to establish: whether students had found the training helpful and engaging; whether any individual factors determined training experience; and attitudes towards digital skills training and intentions to further develop in this area.

2. Materials and Methods

2.1. Design of the Digital Skills Training Programme

The first stage in developing the digital skills training was to devise a Digital Capabilities framework to underpin the training programme. This was created using a previously developed digital information literacy framework [37] and guidance from JISC on digital capabilities [16]. The framework was developed by a team of staff including academic, student services, library, careers, and IT staff, with additional input from a student digital committee. The final framework was approved by the Deans for Education on behalf of the University and is shown in Table S1. This framework included 67 individual learning outcomes to be achieved during the first year of study at university, and these formed the focus of the ‘Essential Digital Skills’ training. This training, as the name suggests, was designed to give students, irrespective of their discipline of study, the essential skills to engage with key technologies during their studies, including the virtual learning environment and Microsoft Office, and to guide students on appropriate digital behaviour.
Following the framework development, the 67 learning outcomes were grouped into five overall topics by the authors: (i) your online learning platform; (ii) digital tools to equip you for success; (iii) being a good digital citizen; (iv) wellbeing in the digital world; and (v) resources for your studies. Within each topic, specific activities were developed to support all the learning outcomes. These were organised into four blocks of study, based on when in the academic year the training is most essential. The topics and activities were reviewed by the interdisciplinary team involved in framework development before being finalised. A summary of the course structure is shown in Table 1.
Training activities were then produced by specialist staff within the university to a specific template so that all activities had at least one interactive element and a quiz at the end. All training was delivered online via the VLE.
Although this programme was originally developed with first year students in mind, it was open to all students from any level of study or discipline because this was the first digital training offered across the university. To access the training, students could self-enrol on the course and work through it at their own pace. The programme was made available from the start of the academic year 2020/21 and was promoted via induction events and through departments. Completion of the programme was associated with Higher Education Achievement Record (HEAR) accreditation. To gain this accreditation, students needed to work through the material and achieve at least 80% on the quizzes associated with each activity (excluding the Practice Submission activity, which did not include a quiz) by June 2021.
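As an illustration only, the short sketch below encodes the accreditation rule just described (at least 80% on each quiz-bearing activity); the data structure and activity scores are hypothetical and this is not the university’s actual system.

```python
# A minimal sketch of the HEAR accreditation rule described above: each quiz-bearing
# activity must be passed with a score of at least 80%. Activity names and the score
# dictionary are hypothetical; activities without a quiz are recorded as None.
QUIZ_PASS_MARK = 80  # percent, as described for HEAR accreditation

def eligible_for_hear(quiz_scores: dict) -> bool:
    """Return True if every activity that has a quiz meets the pass mark."""
    return all(score >= QUIZ_PASS_MARK
               for score in quiz_scores.values()
               if score is not None)  # skip quiz-free activities (e.g., Practice Submission)

# Hypothetical example for one student
scores = {
    "Introducing [Host VLE]": 90,
    "Working with Microsoft Office 365": 85,
    "Referencing and Citing Your Sources": 82,
    "Turnitin Practice submission area": None,  # no quiz for this activity
}
print(eligible_for_hear(scores))  # True
```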

2.2. Evaluation of the Programme

All students who had enrolled on the Essential Digital Skills training were invited to complete the survey. Advertisements were placed on the skills training page of the VLE and in the institutional research recruitment circulars. Students willing to complete the survey were given access to a link where they could provide informed consent for the evaluation (Ethics Ref: MRA-20/21-21928). Once they had provided consent, they were given access to the survey, which took 20 min to complete and was divided into six sections. Example questions and answer options for the sections not reproduced in Supplementary Material Table S2 are summarised in Table 2.
Participant characterisation: This section assessed demographic characteristics including gender, trans status, age, ethnicity, disability status and first language.
Qualification and study: In this section, participants indicated their level and year of study. They also indicated whether they were studying full- or part-time and whether they were registered on a campus-based programme or a distance learning programme. Due to the COVID-19 pandemic, campuses were not fully open for much of the academic year; thus, participants also indicated whether they completed the Essential Digital Skills training on campus, remotely from London, or remotely from elsewhere in the UK or abroad. Participants also detailed their student fee status. Finally, students indicated which of the ten university faculties they were studying within.
Prior skills and training: Here, participants were asked to rate their agreement with four statements about prior skills awareness and ability and to indicate which of the sections they had completed. The final question asked them to rate the specific factors in terms of importance when determining whether to complete specific activities.
Kirkpatrick evaluation model: To evaluate the essential digital skills training, each participant completed measures developed according to the four-area Kirkpatrick evaluation checklist [38]. Although several models of evaluating training have been proposed, this model remains popular and a strength of it is the focus on behavioural outcomes of the learning [39] whilst also taking a holistic approach to learning [40]. It is also deemed suitable for short- to mid-length training rather than longer term training with extensive follow-up periods [41].
To assess their reaction (level 1), participants rated their agreement (1 = strongly disagree, 5 = strongly agree) with eight statements pertaining to their reaction to the training, for example, “The Essential Digital Skills training was an effective use of my time”. A full list of these can be found in Table S2. The eight measures were combined into a single mean reaction score. Cronbach’s alpha was calculated to measure internal consistency and was deemed sufficient, i.e., >0.6 (α = 0.881). Learning (level 2) was assessed through four separate groups of questions. Firstly, participants were asked to rate their agreement with four statements about what they had learnt or gained, for example “I have gained the key digital skills I needed to learn for my studies”. These items were combined (α = 0.838). Secondly, they were asked to rate their expertise (1 = no skill at all, 10 = expert level) across the six areas of digital capabilities, which were averaged to provide a current skill level (α = 0.887). Thirdly, for the same six areas, they were asked to indicate how important the Essential Digital Skills training had been in achieving this current level of skills (1 = not at all important, 5 = extremely important) (α = 0.892). Finally, a free text question asked participants “What is the biggest change you’ve noticed in your studies so far with the new skills you developed or learnt on the Essential Digital Skills training?”. Behaviour (level 3) was assessed through four items, where participants rated agreement, for example “I have been able to help others by teaching them some of what I have learnt on the Essential Digital Skills Training” (α = 0.773). Finally, impact (level 4) was measured using a similar approach with eight statements such as “Since engaging with the Essential Digital Skills training, I feel I am more productive in my studies, making fewer errors and getting work done more quickly and efficiently” (α = 0.881).
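To make the scoring procedure concrete, the sketch below shows one way the eight reaction items could be checked for internal consistency with Cronbach’s alpha and then averaged into a single reaction score. The authors used SPSS; the column names and simulated data here are hypothetical.

```python
# A minimal sketch (hypothetical columns and toy data; the paper's analyses used SPSS)
# of computing Cronbach's alpha for the eight reaction items and, if consistency is
# sufficient, combining them into a single mean reaction score per respondent.
import numpy as np
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a DataFrame with one column per Likert item."""
    k = items.shape[1]
    sum_item_var = items.var(axis=0, ddof=1).sum()   # sum of individual item variances
    total_var = items.sum(axis=1).var(ddof=1)        # variance of the summed scale
    return (k / (k - 1)) * (1 - sum_item_var / total_var)

rng = np.random.default_rng(0)
latent = rng.normal(4.3, 0.5, size=(138, 1))         # shared "reaction" trait per respondent
noise = rng.normal(0.0, 0.4, size=(138, 8))          # item-specific noise
reaction_items = pd.DataFrame(
    np.clip(np.rint(latent + noise), 1, 5),          # 1-5 agreement ratings
    columns=[f"reaction_{i}" for i in range(1, 9)],
)

alpha = cronbach_alpha(reaction_items)
print(round(alpha, 3))
if alpha > 0.6:                                      # consistency threshold used in the paper
    reaction_score = reaction_items.mean(axis=1)     # single mean reaction score
    print(reaction_score.mean(), reaction_score.std())
```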
Digital capabilities: The fifth section of the evaluation used 17 multiple choice questions to assess digital capabilities based on the contents of the Essential Digital Skills training programme. These questions had a single correct answer and the total score across all questions was calculated as a percentage to indicate current level of digital skills. These were considered evidence of level 2 Learning in the Kirkpatrick model.
Beliefs about digital skills: The final short section of the survey investigated participants’ beliefs around digital skills, adopting the framework of the theory of planned behaviour [42]. The items measured attitude, perceived norms (both social norm and subjective norm), perceived power and perceived behavioural control. Finally, given that these would typically predict intention, we asked students to indicate the strength of their intention to continue to develop their digital skills.

2.3. Data Analysis

Data from the survey were used to characterise the sample in terms of demographic (Section 1 of the survey described above) and study-related characteristics (Section 2). Most variables were categorical; frequency counts and percentages were therefore used. For the continuous variable of age, the mean and standard deviation were calculated. Prior skills and training (Section 3) were also considered in terms of frequency counts and percentages for different levels of agreement with the statements. Quantitative data pertaining to Kirkpatrick’s model were described using means and standard deviations, and the free text responses were grouped into categories with indicative quotes provided. To evaluate whether the essential digital skills training was received similarly by all students, we used t-tests (two-group comparisons, e.g., male vs. female) and one-way ANOVAs (comparisons of more than two groups, e.g., different ethnicities) to compare quantitative scores for all levels of Kirkpatrick’s model. For the multiple-group comparisons, post hoc Tukey tests were completed where the ANOVAs revealed a significant group difference. For the analysis of measures about beliefs and intentions around digital skills training, the means and standard deviations were calculated. To assess whether the data aligned with the theory of planned behaviour [42], linear regression was used to determine whether intention could be predicted by attitudes, norms and control, as would be expected. t-tests and one-way ANOVAs were used to identify any individual factors impacting these beliefs, as outlined above for Kirkpatrick’s model. All analyses were completed in SPSS.
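The comparisons described above were run in SPSS; as a rough illustration of the same tests in Python, the sketch below uses hypothetical column names and toy data to show a two-group t-test, a one-way ANOVA, and post hoc Tukey tests gated on a significant ANOVA result.

```python
# A minimal sketch of the group comparisons described above, using hypothetical column
# names and simulated data. scipy/statsmodels stand in for the SPSS procedures: an
# independent-samples t-test for two-group factors, a one-way ANOVA for three-group
# factors, and post hoc Tukey tests where the ANOVA is significant.
import numpy as np
import pandas as pd
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(1)
n_per_group = 20
df = pd.DataFrame({
    "gender": np.tile(["Male", "Female"], 3 * n_per_group // 2),
    "ethnicity": np.repeat(["White British", "White other", "BAME"], n_per_group),
    "impact": rng.normal(np.repeat([3.4, 3.7, 4.0], n_per_group), 0.6),
    "reaction": rng.normal(4.3, 0.6, 3 * n_per_group),
})

# Two-group comparison (e.g., male vs. female) on a Kirkpatrick score
male = df.loc[df["gender"] == "Male", "reaction"]
female = df.loc[df["gender"] == "Female", "reaction"]
print(stats.ttest_ind(male, female))

# Multi-group comparison (e.g., ethnicity) with a one-way ANOVA
groups = [g["impact"].to_numpy() for _, g in df.groupby("ethnicity")]
f_stat, p_value = stats.f_oneway(*groups)
print(f_stat, p_value)

# Post hoc Tukey tests, run only when the ANOVA shows a significant group difference
if p_value < 0.05:
    print(pairwise_tukeyhsd(df["impact"], df["ethnicity"]))
```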

3. Results

3.1. Sample Characteristics

The survey was open from January to May and was accessed by 163 students during this time, of whom 138 (84.7%) completed it. All data from the completed surveys were included, and sample characteristics for categorical variables are provided in Table 3. The mean age of participants was 23.67 years (standard deviation, SD = 6.96).
Study characteristics of the students are shown in Table 4. Of the 138 students who completed the survey, only 9 (6.5%) were studying on campus. Most were studying off-campus in London (N = 80, 58%). A further 19 (13.8%) were studying remotely from elsewhere in the UK and 30 (21.7%) were studying from another country.

3.2. Prior Views of Digital Skills

Prior to completing the digital skills training, most students somewhat agreed (N = 54, 39.1%) or strongly agreed (N = 41, 29.7%) that they recognized the extent of digital skills training that would be required for their studies, in contrast to around one-fifth (N = 27, 19.5%) who did not agree. Although most also somewhat agreed (N = 49, 35.5%) or strongly agreed (N = 23, 16.7%) that they had received suitable training before arrival, almost one third disagreed (N = 39, 28.8%), confirming a mismatch between recognized skill requirements and prior training. Despite this, most students were confident that they had sufficient digital skills to support their learning (N = 87, 63.0%) and any employment applications they needed to make (N = 75, 54.3%).

3.3. Course Completion

Of those completing the survey, the completion rates for all four sections of the training were high (98–99% completion). Students were asked to rate the importance of different factors in whether they completed the training. These are shown in Figure 1.

3.4. Kirkpatrick’s Evaluation Model

Level 1 (Reaction) of the model was assessed through eight items assessing reaction to the training. The mean score across all items was 4.33 (SD = 0.59), indicating a positive reaction to the training. Level 2 (Learning) was assessed through several different measures. Firstly, three items assessed whether students felt that the training had met their learning needs (M = 4.29, SD = 0.64). Secondly, students were asked to rate their current skill levels on a scale of 1–10, where 10 was expert level. Across the six areas of digital capabilities, the students gave an average rating of 8.15 (SD = 1.09), indicating they believed that they had a very good skill level. This is partially supported by their responses to the assessment questions as a measure of performance (M = 70.36, SD = 11.26). The final quantitative measure of level 2 asked students to indicate the extent to which they believed that the digital training had contributed to their current skill level. Student ratings here indicated that it had contributed but not as strongly as it might have (M = 3.89, SD = 0.81). The final measure for this level was a free text question asking students to identify the biggest change they had seen because of the training. Responses to this question were provided by 107 students. Of these, two responses simply said ‘None’ or ‘Not much’ and were not used in further categorisation. The remaining 105 responses were categorised according to the main theme raised in the comment (Table 5). Nine statements could not be categorised, in most cases due to a lack of clarity, and five were placed under multiple themes.
To assess level 3 (Behaviour), students completed four questions assessing daily use of skills, ability to apply them and to help others with these skills, as well as being motivated to develop them further. The mean score across these questions was 3.90 (SD = 0.65), indicating that the training had resulted in moderate behavioural changes. Similar scores were found for the impact of the training (level 4 M = 3.82, SD = 0.63).

3.5. Effects of Individual Factors on Evaluation

Individual factors that were not university-specific and had sufficient numbers were compared for all quantitative elements of Kirkpatrick’s model. The factors considered were gender (male/female), ethnicity (White British, other White, BAME), English first language (yes/no), student status (home, EU, international), level (undergraduate/postgraduate), year (first/second+), mode (campus/distance). There were no significant differences for any evaluation measure according to gender, first language or level of study.
For current year of study, there was a significant difference in level 2 (Learning) in terms of the contribution that students felt the digital skills training had made to their current skill level (t(136) = 3.229, p = 0.002, 95% CI [0.54, 0.17]), with first year students (M = 4.00, SD = 0.78) reporting a greater contribution than later-year students (M = 3.46, SD = 0.77). There were no other differences in evaluation measures. For the mode of study, there was only one significant difference, relating to level 2 (Learning). The quiz assessment score differed (t(100) = −4.303, p < 0.001, 95% CI [2.39, −15.04]) such that those on campus (M = 72.88, SD = 1.08) performed better than those on distance learning courses (M = 65.59, SD = 2.59).
There were more differences according to ethnicity and student status. The data show that evaluation scores for measures of Levels 2–4 were more positive in BAME groups (Table 6) and in international students (Table 7).

3.6. Beliefs about Digital Skills

3.6.1. Overall Measures

To provide a more thorough understanding of attitudes towards digital skills after the training, the survey included measures of attitude, perceived norms, perceived behavioural control and intention, in line with the theory of planned behaviour. The attitude towards developing better digital skills was very positive (M = 6.43, SD = 0.84). Subjective norms were also high (M = 6.09, SD = 1.03); however, social norms scored lower (M = 4.78, SD = 1.78). Perceived power was similar (M = 4.89, SD = 1.47), but perceived behavioural control was higher (M = 6.10, SD = 1.38). The intention to continue the development of digital skills was high (M = 6.00, SD = 1.27). Furthermore, as would be expected, intention could be predicted by the other factors (R2 = 0.315, F(5, 130) = 11.975, p < 0.001). Coefficients revealed that the only significant predictor was perceived behavioural control (B = 0.427, p < 0.001, 95% CI [0.259, 0.595]).
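As a rough illustration of this regression, the sketch below fits intention on the theory-of-planned-behaviour predictors; the variable names and simulated data are hypothetical (the reported analysis was run in SPSS), with intention driven mainly by perceived behavioural control to mirror the reported pattern.

```python
# A minimal sketch (hypothetical variables and simulated data; the paper used SPSS) of
# regressing intention on the theory of planned behaviour predictors: attitude, social
# norm, subjective norm, perceived power and perceived behavioural control (PBC).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 138                                               # matches the reported sample size
survey = pd.DataFrame({
    "attitude": rng.normal(6.4, 0.8, n),
    "social_norm": rng.normal(4.8, 1.8, n),
    "subjective_norm": rng.normal(6.1, 1.0, n),
    "perceived_power": rng.normal(4.9, 1.5, n),
    "pbc": rng.normal(6.1, 1.4, n),
})
# Simulate intention driven mainly by PBC, mirroring the reported finding
survey["intention"] = 3.5 + 0.43 * survey["pbc"] + rng.normal(0, 1.0, n)

model = smf.ols(
    "intention ~ attitude + social_norm + subjective_norm + perceived_power + pbc",
    data=survey,
).fit()
print(model.rsquared, model.f_pvalue)                 # overall model fit
print(model.params)                                   # coefficients for each predictor
print(model.conf_int())                               # 95% confidence intervals
```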

3.6.2. Effects of Individual Factors on Evaluation

There were no differences in any attitude-related measure between genders, between undergraduates and taught postgraduates, or between those studying in different modes (campus vs. distance). There was a significant difference in intention between those with (M = 5.74, SD = 1.27) and those without (M = 6.26, SD = 1.23) English as a first language (t(135) = −2.463, p = 0.015, 95% CI [−0.95, −0.10]). In terms of current study year, first years (M = 6.51, SD = 0.72) had a significantly more positive attitude towards developing better digital skills (t(134) = 2.262, p = 0.025, 95% CI [0.05, 0.76]) than those in later years of study (M = 6.11, SD = 1.19). In terms of ethnicity, there was a significant difference in intention to continue to develop these skills (F(2, 132) = 6.255, p = 0.003). Post hoc Tukey tests indicated that White British respondents (M = 5.29, SD = 1.49) had significantly lower intentions compared to both White non-British (M = 6.30, SD = 0.76) and BAME (M = 6.14, SD = 1.22) students. Similarly, there were significant differences for student status (F(2, 134) = 3.161, p = 0.046), with Tukey tests indicating that home students (M = 5.72, SD = 1.31) had significantly lower intentions compared to international students (M = 6.32, SD = 1.27) but not EU students (M = 6.16, SD = 0.97).

4. Discussion

The results of the Essential Digital Skills evaluation both confirm and add to prior research. Firstly, most students recognised the level of digital skills required for their university study, although there was less agreement that these skills had been developed prior to starting their studies. This confirms the need to ensure that digital training is provided: just because students fall into the age group labelled digital natives does not mean that they arrive with all the necessary skills to support learning [21]. In terms of completion, those responding to the survey had completed most or all activities or sections of the training. The feature rated most important in undertaking training was recognition via the Higher Education Achievement Record (HEAR), in line with previous work showing the value of formal recognition [36]. This was followed by a self-identified need and available time to undertake the training. Interestingly, staff-identified needs and the needs of employers were deemed much less important. The latter is particularly worrying given that students and employers have been found to have very different perceptions of the basic skill level required from graduates [43]. However, it may be related to the study sample, as most students were in the first year of their qualification and could therefore be less focused on career opportunities at this earlier stage.
Findings from the items based on the Kirkpatrick model indicate a congruence across all four levels of the model. Students reported positive reactions to the training (level 1). This is in line with previous studies evaluating digital skills training, although these were conducted at a programme level, and therefore not university-wide [44,45,46]. They also felt that the Essential Digital Skills programme met their learning needs (level 2). Furthermore, self-report and assessment scores indicated a good level of skills and students believed the training had contributed to this skill level. The ability scores were slightly higher in the self-report measures than the actual assessment, indicating students may believe they are more capable than they are. This is in line with previous research showing that people typically overestimate information literacy skills, which were encompassed within this training [47] and that students will overestimate digital skills [48]. When asked to identify the biggest change, most students reported an increase in proficiency using a range of tools with comments pertaining to making better use of functions and saving time because of this. The finding that digital skills training made working more efficient is mirrored in previous more specific ICT training [46]. This previous study, in Malaysia, also found that students reported benefits for employability, which was not the case in the present study. This difference is unlikely to relate to the level of study, as both the current study and the work by Sadikan et al. [46] focused on first year students. However, the difference may arise due to the focus on Excel training specifically for engineers in the previous study, in contrast to our more general training. The previous work was more focused on a specific cohort and their professional needs. In terms of behaviour (level 3), the students’ responses indicated moderate behaviour changes related to the daily use of skills and being able to help others. They also felt the training had had an impact (level 4).
Although the overall evaluation of the training programme was positive, we also examined whether individual factors impacted the training experience. Previous work has identified differences in digital skills according to individual factors including gender [49,50]. As such, we might have expected to find differences in the impact of training according to gender. However, we found no differences according to gender, first language or level of study, but did find differences based on year group, with first year students feeling that the programme had made a greater contribution to their skills than later year groups. This is perhaps unsurprising given that the programme was primarily designed for new students and that these students would be most unfamiliar with the systems and tools included in the training. Interestingly, those on campus performed better on the assessment of digital skills than those studying on distance learning courses. It might be expected that the reverse would be true, because distance learners are only able to study online and may therefore have higher usage and familiarity; however, due to COVID-19, much of the cohort was studying online for part, if not all, of the academic year, even if they were due to study in person. Consequently, the difference in familiarity may have disappeared.
The greatest level of individual differences came from ethnicity and student status (home, EU, international), which impacted levels 2–4 of the evaluation. In the case of training contribution to skills, White non-British students felt that the training made a significantly lower contribution (level 2) than BAME students. For behaviour change (level 3), White British students reported significantly less change than BAME students. Finally, for impact, BAME students reported a significantly higher impact than both other groups (level 4). Previous literature has indicated that ethnicity may be a factor in digital divides [51,52], meaning that BAME students could have poorer digital skills initially, such that the training contributes more to their learning and behaviour and thus has a greater impact. However, as we did not assess skills before and after, we cannot confirm this. Similar patterns were found for student status on training contribution, behaviour change and impact, where international students reported more positive evaluations than home students on all measures and EU students on two measures. It is important to note that the measures of ethnicity and student status may be confounded; however, it was difficult to establish this with the available data. This identification of individual factors influencing the experience and impact of digital skills training adds to the existing literature. At the time of writing, no other study had evaluated university-wide training. However, two studies, in Norway [45] and in the UAE [44], have evaluated training for pre-service teachers, which was carried out over ten weeks and appeared to have a similar duration to the present training, with some overlapping content. In both studies, the year of study, ethnicity and student status were not described for the sample or analysed separately. This was likely due to the low sample size in one study [44]; however, it does suggest that future studies should investigate individual differences where the sample size allows for meaningful analysis.
Our attitudinal measures revealed generally positive attitudes towards developing digital skills and a strong belief that people important to them would approve of the training (subjective norms); however, slightly lower beliefs that most people like them would undertake similar training (social norms) were also observed. Perceived behavioural control (PBC) for undertaking training was high, which might be expected given the optional nature of this training, giving students autonomy over whether to engage and to what extent, and the ease with which they could access the training. The intention to continue developing these skills was also high. This aligns with previous work with students which has indicated that they recognise a need to continuously update digital skills [53]. In line with the theory of planned behaviour [42], intention could be predicted by the other factors, with PBC being critical. The importance of PBC should inform future training programmes; this finding suggests that students should have control over the training and should be able to engage in the programme easily. The latter could be considered in terms of access (e.g., access via the typical VLE) or time (e.g., allowing a sufficient period to engage and giving space in the timetable for this). Individual differences in intentions were also found: those without English as a first language reported a stronger intention, as did first year students. Additionally, and perhaps unsurprisingly given the evaluation results, White non-British students and BAME students held stronger intentions than White British students. International students also held stronger intentions. The exact reasons for these individual differences are unclear and future work should consider qualitative methods to explore them.
Based on the current findings, the Essential Digital Skills training programme can be considered successful in developing student digital skills. The experience of training did vary slightly between individuals and future research should investigate why this was, possibly through pre- and post-test approaches and qualitative measures to understand the meaning and value of training in different students. In terms of future training programmes, students viewed accreditation or recognition as critical, alongside their own identified need for the training and timeliness. This suggests that it is important to allow students opportunities to assess their own skill level and recognise where it is limited, as well as to provide succinct and timely training.
The current study provides evidence for the utility of university-wide digital skills training, adding to a small evidence base for programme-specific training [44,45]. It also provides a detailed framework for digital capabilities, which could be amended by other institutions, a template of topics and activities, which may help in the development of other programmes, and some insights into what students appreciated about the training. However, the study does have limitations. Firstly, this is a case study, and it was therefore conducted within a single institution, which may limit the generalisation of findings. Secondly, although the sample was broadly representative of the wider university population, females were over-represented in this evaluation. Although we did not find any evidence of gender differences, this could again limit generalisation. Finally, we opted to use the Kirkpatrick evaluation model. We chose this model because it offers a holistic evaluation [40] and is particularly well-suited to mid-length training rather than longer term training with extensive follow-up periods [41]; however, it does also have some limitations. For example, the model focuses on the training rather than the individual and their context [54], although we have attempted to pick apart individual factors.

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/educsci13040333/s1, Table S1: Digital Capabilities Framework; Table S2: Reactions to training.

Author Contributions

Conceptualization, E.J.D. and N.A.S.R.; methodology, E.J.D., N.A.S.R. and D.A.B.-C.; formal analysis, E.J.D.; writing—original draft preparation, E.J.D.; writing—review and editing, E.J.D., N.A.S.R. and D.A.B.-C. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki and approved by the Institutional Ethics Committee of King’s College London (Code: MRA-20/21-21928, approved on 22 January 2021).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study. Written informed consent has been obtained from the participants to publish this paper.

Data Availability Statement

The data supporting the findings of this study are available from the corresponding author upon reasonable request.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Wang, X.; Wang, Z.; Wang, Q.; Chen, W.; Pi, Z. Supporting digitally enhanced learning through measurement in higher education: Development and validation of a university students’ digital competence scale. J. Comput. Assist. Learn. 2021, 37, 1063–1076.
2. Kim, H.J.; Hong, A.J.; Song, H.-D. The roles of academic engagement and digital readiness in students’ achievements in university e-learning environments. Int. J. Educ. Technol. High. Educ. 2019, 16, 21.
3. Dommett, E.J.; Gardner, B.; Van Tilburg, W. Staff and student views of lecture capture: A qualitative study. Int. J. Educ. Technol. High. Educ. 2019, 16, 23.
4. Dommett, E.J.; Gardner, B.; Van Tilburg, W. Staff and students perception of lecture capture. Internet High. Educ. 2020, 46, 100732.
5. Wang, X.; Tan, S.C.; Li, L. Measuring university students’ technostress in technology-enhanced learning: Scale development and validation. Australas. J. Educ. Technol. 2020, 36, 96–112.
6. Yen, M.H.; Chen, S.; Wang, C.Y.; Chen, H.L.; Hsu, Y.S.; Liu, T.C. A framework for self-regulated digital learning (SRDL). J. Comput. Assist. Learn. 2018, 34, 580–589.
7. Stockwell, B.R.; Stockwell, M.S.; Cennamo, M.; Jiang, E. Blended Learning Improves Science Education. Cell 2015, 162, 933–936.
8. Holley, D.; Dobson, C. Encouraging student engagement in a blended learning environment: The use of contemporary learning spaces. Learn. Media Technol. 2008, 33, 139–150.
9. Hughes, G. Using blended learning to increase learner support and improve retention. Teach. High. Educ. 2007, 12, 349–363.
10. Boyle, T.; Bradley, C.; Chalk, P.; Jones, R.; Pickard, P. Using blended learning to improve student success rates in learning to program. J. Educ. Media 2003, 28, 165–178.
11. López-Pérez, M.V.; Pérez-López, M.C.; Rodríguez-Ariza, L. Blended learning in higher education: Students’ perceptions and their relation to outcomes. Comput. Educ. 2011, 56, 818–826.
12. Leahy, D.; Wilson, D. Digital skills for employment. In Proceedings of the Key Competencies in ICT and Informatics. Implications and Issues for Educational Professionals and Management. International Conferences, KCICTP and ITEM, Potsdam, Germany, 1–4 July 2014; pp. 178–189.
13. European Commission. The Digital Competence Framework 2.0. Available online: https://ec.europa.eu/jrc/en/digcomp/digital-competenceframework (accessed on 3 June 2021).
14. Bowyer, J.; Chambers, L. Evaluating blended learning: Bringing the elements together. Res. Matters A Camb. Assess. Publ. 2017, 23, 17–26.
15. Janssen, J.; Stoyanov, S.; Ferrari, A.; Punie, Y.; Pannekeet, K.; Sloep, P. Experts’ views on digital competence: Commonalities and differences. Comput. Educ. 2013, 68, 473–481.
16. JISC. What Is Digital Capability? Available online: https://digitalcapability.jisc.ac.uk/what-is-digital-capability/ (accessed on 16 March 2023).
17. Oblinger, D.; Oblinger, J.L.; Lippincott, J.K. Educating the Net Generation; Educause: Boulder, CO, USA, 2005.
18. Palfrey, J.; Gasser, U. Opening Universities in a Digital Era. N. Engl. J. High. Educ. 2008, 23, 22–24.
19. Prensky, M. Digital natives, digital immigrants part 2: Do they really think differently? Horizon 2001, 9, 1–6.
20. Tapscott, D. Grown Up Digital: How the Net Generation Is Changing Your World; McGraw-Hill Companies: San Francisco, CA, USA, 2008.
21. Margaryan, A.; Littlejohn, A.; Vojt, G. Are digital natives a myth or reality? University students’ use of digital technologies. Comput. Educ. 2011, 56, 429–440.
22. Judd, T. The rise and fall (?) of the digital natives. Australas. J. Educ. Technol. 2018, 34.
23. Bennett, S.; Maton, K.; Kervin, L. The ‘digital natives’ debate: A critical review of the evidence. Br. J. Educ. Technol. 2008, 39, 775–786.
24. HESA. Who’s Studying in HE? Available online: https://www.hesa.ac.uk/data-and-analysis/students/whos-in-he#characteristics (accessed on 20 January 2023).
25. Thorne, S.L. Digital literacies. In Framing Languages and Literacies: Socially Situated Views and Perspectives; Routledge: London, UK, 2013; pp. 192–218.
26. Orr, D.; Appleton, M.; Wallin, M. Information literacy and flexible delivery: Creating a conceptual framework and model. J. Acad. Librariansh. 2001, 27, 457–463.
27. Snavely, L.; Cooper, N. The information literacy debate. J. Acad. Librariansh. 1997, 23, 9–14.
28. Benson, L.; Rodier, K.; Enström, R.; Bocatto, E. Developing a university-wide academic integrity E-learning tutorial: A Canadian case. Int. J. Educ. Integr. 2019, 15, 5.
29. Burnett, S.; Collins, S. Ask the audience! Using a personal response system to enhance information literacy and induction sessions at Kingston University. J. Inf. Lit. 2007, 1, 1–3.
30. Thompson, K.; Kardos, R.; Knapp, L. From tourist to treasure hunter: A self-guided orientation programme for first-year students. Health Inf. Libr. J. 2008, 25, 69–73.
31. Verlander, P.; Scutt, C. Teaching information skills to large groups with limited time and resources. J. Inf. Lit. 2009, 3, 31–42.
32. Wingate, U. Doing away with ‘study skills’. Teach. High. Educ. 2006, 11, 457–469.
33. Gañán, D.; Caballé, S.; Conesa, J.; Barolli, L.; Kulla, E.; Spaho, E. A systematic review of multimedia resources to support teaching and learning in virtual environments. In Proceedings of the Eighth International Conference on Complex, Intelligent and Software Intensive Systems, Birmingham, UK, 2–4 July 2014; pp. 249–256.
34. Liu, C.; Elms, P. Animating student engagement: The impacts of cartoon instructional videos on learning experience. Res. Learn. Technol. 2019, 27.
35. Holmes, N. Student perceptions of their learning and engagement in response to the use of a continuous e-assessment in an undergraduate module. Assess. Eval. High. Educ. 2015, 40, 1–14.
36. Resch, K.; Knapp, M.; Schrittesser, I. How do universities recognise student volunteering? A symbolic interactionist perspective on the recognition of student engagement in higher education. Eur. J. High. Educ. 2021, 12, 194–210.
37. The Open University. Digital and Information Literacy Framework. Available online: https://www.open.ac.uk/libraryservices/subsites/dilframework/view_all (accessed on 16 March 2023).
38. Kirkpatrick, D. Four-level training evaluation model. US Train. Dev. J. 1959, 13, 34–47.
39. Mann, S. What should training evaluations evaluate? J. Eur. Ind. Train. 1996, 20, 14–20.
40. Sim, J. Using Kirkpatrick Four Level Evaluation model to assess a 12-week accelerated ultrasound intensive course. Sonography 2017, 4, 110–119.
41. Yardley, S.; Dornan, T. Kirkpatrick’s levels and education ‘evidence’. Med. Educ. 2012, 46, 97–106.
42. Ajzen, I. The theory of planned behavior. Organ. Behav. Hum. Decis. Process. 1991, 50, 179–211.
43. Kavanagh, M.H.; Drennan, L. What skills and attributes does an accounting graduate need? Evidence from student perceptions and employer expectations. Account. Financ. 2008, 48, 279–300.
44. ElSayary, A. The impact of a professional upskilling training programme on developing teachers’ digital competence. J. Comput. Assist. Learn. 2023, 1–13.
45. Engeness, I.; Nohr, M.; Singh, A.B.; Mørch, A. Use of videos in the Information and Communication Technology Massive Open Online Course: Insights for learning and development of transformative digital agency with pre-and in-service teachers in Norway. Policy Futures Educ. 2020, 18, 497–516.
46. Sadikin, A.N.B.; Mustaffa, A.A.B.; Hasbullah, H.B.; Zakaria, Z.Y.B.; Abd Hamid, M.K.B.; Man, S.H.B.C.; Hassim, M.H.B.; Ab Aziz, M.A.B.; Yusof, K.B.M. Qualitative Development of Students’ Digital Skills by Integrating a Spreadsheet Software in First Year Introduction to Engineering and Seminar Course. Int. J. Emerg. Technol. Learn. 2021, 16, 213.
47. Mahmood, K. Do people overestimate their information literacy skills? A systematic review of empirical evidence on the Dunning-Kruger effect. Commun. Inf. Lit. 2016, 10, 199–213.
48. Öncül, G. Defining the need: Digital literacy skills for first-year university students. J. Appl. Res. High. Educ. 2020, 13, 925–943.
49. Grande-de-Prado, M.; Cañón, R.; García-Martín, S.; Cantón, I. Digital competence and gender: Teachers in training. A case study. Future Internet 2020, 12, 204.
50. Helsper, E.J.; Eynon, R. Digital natives: Where is the evidence? BERJ 2010, 36, 503–520.
51. Ritzhaupt, A.D.; Liu, F.; Dawson, K.; Barron, A.E. Differences in student information and communication technology literacy based on socio-economic status, ethnicity, and gender: Evidence of a digital divide in Florida schools. J. Res. Technol. Educ. 2013, 45, 291–307.
52. Enoch, Y.; Soker, Z. Age, gender, ethnicity and the digital divide: University students’ use of web-based instruction. Open Learn. J. Open Distance E-Learn. 2006, 21, 99–110.
53. Gómez-Poyato, M.J.; Eito-Mateo, A.; Mira-Tamayo, D.C.; Matías-Solanilla, A. Digital skills, ICTs and students’ needs: A case study in social work degree, University of Zaragoza (Aragón-Spain). Educ. Sci. 2022, 12, 443.
54. Srivastava, V.; Walia, A.M. An analysis of various training evaluation models. Int. J. Adv. Innov. Res. 2018, 5, 276–282.
Figure 1. Importance of different factors in completing the digital skills training.
Table 1. An overview of the Essential Digital Skills Training programme.
Block | Topic | Activity (Duration of Study)
1 | Wellbeing in the digital world | Ensuring wellbeing in a digital environment—Keeping Secure at [Host University] (2 h 30 min)
1 | Being a good digital citizen | Online behaviour: Do’s and Don’ts (1 h)
1 | Digital tools to equip you for success | Introducing your Digital Tools for Learning (Library pages, key apps, Office 365 overview) (1 h 30 min)
1 | Your online learning platform | Introducing [Host VLE] (1 h)
1 | Resources for your studies | What’s out there? Types of information (45 min)
2 | Your online learning platform | Core and Recommended Tools—how to use accessibility tool, Blackboard Ally, Microsoft TEAMS, Turnitin, Moodle Assignment, Kaltura and LinkedIn Learning (1 h)
2 | Digital tools to equip you for success | Working with Microsoft Office 365—One Drive, Word, PowerPoint, OneNote, Outlook (2 h)
2 | Resources for your studies | Searching for academic information (1 h)
2 | Being a good digital citizen | Do’s and Don’ts of online behaviour—Microsoft Teams and Email Etiquette (30 min)
3 | Being a good digital citizen | Being a good collaborator (45 min)
3 | Resources for your studies | Referencing and Citing Your Sources (1 h 30 min)
3 | Your online learning platform | Turnitin Practice submission area (N/A)
3 | Digital tools to equip you for success | Using Feedback and Reflecting on Practice (45 min)
3 | Wellbeing in the digital world | Ensuring Wellbeing in a Digital Environment (1 h 30 min)
4 | Being a good digital citizen | Digital identity and employability (2 h)
4 | Digital tools to equip you for success | Working with Microsoft Excel (1 h)
4 | Resources for your study | Working with data (1 h)
4 | Being a good digital citizen | Data Protection (GDPR) and Data Storage (30 min)
4 | Being a good digital citizen | Reflecting on your digital practices (1 h 30 min)
Table 2. A summary of the structure of the final survey with example items.
Topic: Example Item (Rating)
Demographic variables:
Gender (Male/Female/Other/Prefer not to say)
Trans status (Yes/No/Prefer not to say)
Age (free text entry)
Ethnicity (15 categories/Other/Prefer not to say)
Disability status (five categories/None/Prefer not to say)
First language (English/Not English)
Qualification and study:
Level of study (undergraduate/taught postgraduate/research postgraduate)
Year of study (first year/later year)
Intensity (Full-time/Part-time)
Mode (Campus-based/Distance learning)
Location when training (on campus/remote local/remote UK/remote abroad)
Fee status (home/EU/international)
Faculty (10 categories)
Prior skills and training:
The following statements were rated with 1 = strongly disagree, 5 = strongly agree:
  • I recognised the extent of digital skills that would be required in studying at [host university].
  • I had received sufficient training in digital skills prior to starting my studies.
  • I was confident that I had the necessary digital skills to support my learning at [host university].
  • I was confident that I had the necessary digital skills to support my applications for employment either alongside my studies or upon their completion.
The following were rated for importance using 1 = not at all important, 5 = extremely important:
  • Higher Education Achievement Record Certification on completion.
  • Self-identification of a need to develop specific digital skills covered.
  • Identification by teaching staff or personal tutor of a need to develop specific digital skills covered.
  • Identification by employer or careers advisor of a need to develop specific digital skills covered.
  • Clear relevance to my studies in the upcoming month.
  • Having the time to complete the relevant activities alongside my studies.
  • Having an easy-to-use interface where I could locate all necessary information.
  • Use of multimedia in the training to help bring the material to life.
  • Use of regular assessment to help me see my progress.
Beliefs about digital skills:
Attitude: “For me to develop better digital skills is…” (1 = extremely bad, 7 = extremely good)
Social norm: “Most people like me complete digital skills training alongside their studies” (1 = extremely unlikely, 7 = extremely likely)
Subjective norm: “Most people who are important to me would approve of me completing digital skills training” (1 = strongly disagree, 7 = strongly agree).
Perceived power: “For me to complete digital skills training alongside my studies is…” (1 = extremely difficult, 7 = extremely easy)
Perceived behavioural control: “My completing digital skills training is up to me.” (1 = strongly disagree, 7 = strongly agree)
Strength of the intention to continue with digital skills development (1 = strongly disagree, 7 = strongly agree).
Table 3. Student characteristics based on responses to demographic items.
Characteristic (Student sample, N = 138) | N | %
Gender
    Male | 34 | 24.6
    Female | 104 | 75.4
Trans
    Yes | 2 | 1.4
    No | 136 | 98.6
Ethnicity
    White British | 24 | 17.4
    White Other | 41 | 29.7
    BAME | 71 | 51.4
    Prefer not to say | 2 | 1.4
English First Language
    Yes | 68 | 49.3
    No | 70 | 50.7
Disability
    Physical disability | 11 | 8
    Sensory disability | 1 | 0.7
    Learning difference | 7 | 5.1
    Mental health condition | 11 | 8
    Long-term condition | 2 | 1.4
    None | 107 | 77.5
    Prefer not to say | 7 | 5.1
Table 4. Study characteristics based on responses to qualification and study parts of the survey.
Characteristic (Student sample, N = 138) | N | %
Student Status
    Home | 65 | 47.1
    EU | 32 | 23.2
    International | 41 | 29.7
Study Level
    Undergraduate | 74 | 53.6
    Postgraduate—taught | 60 | 43.5
    Postgraduate—research | 4 | 2.9
Year of Study
    First | 111 | 80.4
    Second + | 27 | 19.6
Study intensity
    Full-time | 124 | 89.9
    Part-time | 14 | 10.4
Study Mode
    Campus-based | 109 | 79
    Distance | 29 | 21
Faculty
    Arts and Humanities | 20 | 14.5
    Life Sciences and Medicine | 20 | 14.5
    Psychiatry, Psychology and Neuroscience | 16 | 11.6
    Law | 5 | 2.6
    Business | 20 | 14.5
    Natural and Mathematical Sciences | 10 | 7.2
    Nursing and Midwifery | 19 | 13.8
    Foundation Studies | 2 | 1.4
    Social Science and Public Policy | 26 | 18.8
Table 5. An overview of themes raised by students.
Theme (Number of Comments) | Example
Increased confidence (11) | “I feel more confident with using [VLE] as well as referencing my work and also utilising Office365 a lot more than i would have in the past.”
Increased efficiency or proficiency (58) | “The biggest change relates to my proficiency on [VLE]; I now feel that I can take more advantage of the different aspects of the web page.”
Increased awareness of digital options (14) | “I’m more aware that there are a range of options available to me, and that I don’t have to keep using the same platforms/apps that I previously used but can explore further.”
Digital Footprint (11) | “Understood the importance of a positive digital footprint”
Careers (5) | “I am more aware of my professional conduct during meetings on Zoom or MS Teams.”
Table 6. Evaluation differences by ethnicity as tested using a one-way ANOVA with post hoc Tukey tests. Ethnicity was grouped into White British, White non-British and Black and Minority Ethnic (BAME).
Measure (Level) | White British M (SD) | White Non-British M (SD) | BAME M (SD) | Test Statistic F (df) | p-Value
Reaction (1) | 4.16 (0.63) | 4.21 (0.65) | 4.43 (0.53) | 2.978 (2, 133) | 0.054
Learning needs (2) | 4.43 (0.52) | 4.11 (0.84) | 4.35 (0.13) | 2.649 (2, 133) | 0.074
Current skills (2) | 7.90 (1.07) | 8.28 (1.20) | 8.15 (1.05) | 2.264 (2, 97) | 0.109
Current performance (2) | 71.28 (7.17) | 73.33 (11.32) | 68.04 (12.02) | 0.912 (2, 133) | 0.404
Training contribution (2) | 3.94 (0.86) | 3.57 (0.89) | 4.07 (0.68) | 5.181 (2, 133) | 0.007 b
Behaviour (3) | 3.64 (0.14) | 3.83 (0.10) | 4.03 (0.07) | 3.811 (2, 133) | 0.025 a
Impact (4) | 3.44 (0.12) | 3.72 (0.10) | 4.01 (0.07) | 8.931 (2, 132) | 0.000 a,b
a White British < BAME; b White non-British < BAME.
Table 7. Evaluation differences by student status as tested using a one-way ANOVA with post hoc Tukey tests.
Measure (Level) | Home M (SD) | EU M (SD) | International M (SD) | Test Statistic F (df) | p-Value
Reaction (1) | 4.24 (0.52) | 4.31 (0.63) | 4.47 (0.67) | 1.890 (2, 135) | 0.155
Learning needs (2) | 4.31 (0.59) | 4.17 (0.74) | 4.37 (0.64) | 0.913 (2, 135) | 0.404
Current skills (2) | 8.00 (1.02) | 8.34 (1.22) | 8.24 (1.11) | 1.224 (2, 135) | 0.297
Current performance (2) | 71.46 (9.73) | 71.06 (14.70) | 68.04 (10.21) | 0.910 (2, 99) | 0.406
Training contribution (2) | 3.80 (0.85) | 3.60 (0.74) | 4.26 (0.66) | 7.576 (2, 135) | 0.001 b,c
Behaviour (3) | 3.80 (0.60) | 3.83 (0.63) | 4.12 (0.71) | 3.288 (2, 135) | 0.040 b,c
Impact (4) | 3.71 (0.56) | 3.84 (0.56) | 4.02 (0.63) | 3.256 (2, 134) | 0.042 b
b Home < International; c EU < International.
