Systematic Review

Psychological and Educational Factors of Digital Competence Optimization Interventions Pre- and Post-COVID-19 Lockdown: A Systematic Review

by Alberto Díaz-Burgos 1, Jesús-Nicasio García-Sánchez 1,*, M. Lourdes Álvarez-Fernández 1 and Sonia M. de Brito-Costa 2,3,4,5
1 Departamento de Psicología, Sociología y Filosofía, Universidad de León, 24071 León, Spain
2 Applied Research Institute, Polytechnic Institute of Coimbra, Rua da Misericórdia, Lagar dos Cortiços-São Martinho do Bispo, 3045-093 Coimbra, Portugal
3 Human Potential Development Center (CDPH), Polytechnic Institute of Coimbra, Rua da Misericórdia, Lagar dos Cortiços-São Martinho do Bispo, 3045-093 Coimbra, Portugal
4 Coimbra Education School, Research Group in Social and Human Sciences (NICSH), Polytechnic Institute of Coimbra, Rua Dom Joao III-Solum, 3030-329 Coimbra, Portugal
5 Coimbra Education School, INED—Center for Research and Innovation in Education, Polytechnic Institute of Coimbra, Rua Dom Joao III-Solum, 3030-329 Coimbra, Portugal
* Author to whom correspondence should be addressed.
Sustainability 2024, 16(1), 51; https://doi.org/10.3390/su16010051
Submission received: 9 November 2023 / Revised: 13 December 2023 / Accepted: 15 December 2023 / Published: 20 December 2023

Abstract:
The rapid development of the ever-changing information and communication society demands skills from its members that allow them to access and adapt to the various situations they may face. To achieve this, it is essential to acquire a set of key competencies throughout the different stages of life, among which is digital competence. This systematic review aims to analyse, through a series of focal points and indicators, the interventions published internationally in the last ten years aimed at improving digital literacy and the acquisition of this competence by students in early childhood, primary, and higher education, as well as by professionals from various fields. The procedure followed for the selection of the interventions has been documented and graphically represented according to the PRISMA statement, with searches conducted across various databases and journals. In total, 26 studies were selected, covering the periods before, during, and after the COVID-19 health lockdown, and the influence of the lockdown on the development of digital competence was examined. The results show the evolution of the selected interventions in terms of general aspects, instructional and evaluative procedures, fidelity, and encountered limitations. They demonstrate a growing concern for the development of digital competence, amplified by the needs arising during the COVID-19 lockdown and evidenced by an increase in interventions aimed at this goal. The review also shows the relationship between the adequate acquisition of this competence and the nurturing of other psychoeducational variables, such as motivation or satisfaction.

1. Introduction

1.1. Digital Competence

There are multiple reasons to investigate the impact of Information and Communication Technologies (ICTs) on society in general. Since the last decades of the 20th century, these technologies have gradually been introduced into various social areas, providing new opportunities for accessing knowledge and communication [1]. Furthermore, numerous benefits resulting from their acquisition and development have been confirmed, such as the promotion of autonomy, efficiency, responsibility, flexibility, and critical and reflective thinking, among other aspects. These elements favour the development, learning, well-being, and quality of life of individuals, preparing them for their future professional life [2]. In summary, ICTs contribute to the development of society.
Currently, digital competence is considered fundamental in the school curriculum, as reflected in the Common Framework for Digital Teaching Competence. This framework is based on international proposals from organizations such as the International Society for Technology in Education [3] and the United Nations Educational, Scientific and Cultural Organization [4]. Digital competence is among the seven Key Competences that must be developed in a cross-cutting manner according to current regulations. Digital competence involves skills related to searching, obtaining, processing, and communicating information, as well as transforming it into knowledge. It encompasses various skills, from accessing information to transmitting it in different formats, using information and communication technologies as an essential tool for staying informed, learning, and communicating [5].
The inclusion of sustainability in digital competence involves exploring how the ethical and responsible handling of technology can positively impact sustainable development. This entails understanding how digital tools can be used to reduce environmental impact, promote social and economic inclusion, ensure equality in access to information and digital resources, and adopt approaches that benefit future generations. Integrating sustainability into digital competence requires a comprehensive perspective that considers both the technical and ethical and social aspects of technology utilization [6].
In this article, a systematic review will be conducted on the acquisition and development of digital competences through various interventions, as well as their relationship with psychological and educational variables, in students from different educational stages, as well as in education professionals and those from various occupational fields, across the entire lifespan.
The studies presented describe interventions conducted in the last ten years with the aim of analysing the development of digital competence in various contexts. This period has been selected due to the significant increase in interventions aimed at acquiring digital competence and the clear trend towards online instruction and assessment, which was accentuated during and after the COVID-19 lockdown [7]. In this systematic review, an analysis of various indicators will be carried out for each of the selected interventions. They all share several common elements: (i) pre- and post-assessment of digital competence or literacy, as well as, in some cases, the psychological and educational variables associated with them; (ii) a description, to varying degrees, of the chosen sample and the instructional procedure followed, whether online (MOOC) or in-person; and (iii), the presentation and analysis of the obtained results, concluding with a discussion, conclusion, and, in most cases, limitations and future research directions.
Finally, during the procedure, each of the articles was reviewed with the purpose of classifying them according to a set of focal points which are described below. For greater ease, they have been organized into various tables according to the following points: (i) a general analysis of the participants, the construct to be addressed, and the instructional procedure carried out; (ii) the degree of focus on the role of the instructor and the student, the materials, and the grouping chosen for the intervention; (iii) the articles have also been classified according to the evaluation procedure, constructs to be assessed, moments, materials and instruments, satisfaction, and their validation; (iv) the fidelity of the treatment was also analysed, considering controls and indicators and how they were used; and (v), the various limitations that each of the interventions may present have been observed.

1.2. Justification

An analysis of the scientific literature on digital competence development models reveals that research has primarily focused on digital literacy in educational settings, either through direct student instruction or through teacher training, with the aim of transferring this competence to students [8]. Numerous studies have addressed digital competence in relation to the social environment. Systematic reviews have been conducted on various variables from the 2018 PISA Report, such as the use of digital devices at home and social relationships [9], but not enough attention has been given to the development of online or in-person interventions specifically aimed at promoting digital competence.
The COVID-19 lockdown has underscored the growing need to develop digital literacy at all levels: educational, professional, and social. In this regard, there is evidence linking digital competence with psycho-educational variables [10]. Difficulties with variables such as motivation, satisfaction, stress, or academic performance, coupled with distance education and a lack of resources or limited acquisition of digital competence, hinder proper functioning in the technological environment [11,12].
The results of these studies, in addition to highlighting the existing digital divides in different contexts, both in terms of internet access and digital devices [13], underscore the urgent need for effective intervention in the methodologies used to acquire digital competence, both for students and teachers.
In response to the identified needs in the educational, professional, and social realms, interventions for digital literacy among students, teachers, and even beyond the educational field have been proposed in the last decade. Some of these proposals are in the pilot stage [14,15], while other interventions have been implemented and are the subject of our review. The studies analysed include various instructional approaches, from in-person classes to online courses, such as the well-known MOOCs (Massive Open Online Courses) or NOOCs (Nano Open Online Courses), as well as blended learning, which combines online and in-person instruction and assessment. In recent years, the number of MOOCs has increased significantly, leading to systematic reviews that analyse studies where the instructional procedure for digital competence development has been based on an online course [16,17].

1.3. Digital Competence and Sustainability

There is no universally agreed-upon definition of the term “sustainability”, especially given its consideration in various contexts. According to Kuhlman & Farrington [18], it can be defined as the maintenance of well-being over an extended, possibly indefinite, period. Additionally, research suggests that sustainability comprises three components: (i) environmental, (ii) economic, and (iii) social [19].
The examination of digital competence plays a fundamental role in promoting sustainability by enabling a comprehensive assessment of the ability of educational institutions and other organizations to adopt digital technologies effectively and ethically [20].
The development of digital competence in the educational context during the COVID-19 lockdown is closely linked to sustainability in various ways. Firstly, the rapid adoption of online education and the strengthening of digital skills among teachers and students have reduced the need for commuting, thus contributing to a reduction in carbon footprint by decreasing mobility. Additionally, online education can be more resource-efficient by reducing the need for printed materials and providing access to sustainable digital resources [21].
Furthermore, digital competence fosters a mindset more oriented towards sustainability by enabling individuals to access information about environmental and social issues, which can inspire greater commitment and action in preserving the environment and promoting sustainable practices in daily life. Increased motivation for its use has amplified concerns and the search for answers, solutions, and alternatives to current socio-environmental problems.
Collectively, digital competence plays a fundamental role in the transition toward a more sustainable world by driving efficiency, awareness, and action concerning environmental and social challenges [22,23].
This work places the social component at the forefront by addressing the global sustainable development goals, “ensuring inclusive and equitable quality education and promoting lifelong learning opportunities for all”, with a focus on digital transformation [24]. It seems natural that the integration of digital tools into educational methods would lead to an improvement in the quality of education [25,26], making it a suitable step towards achieving the aforementioned goal.

1.4. The Present Study

Therefore, digital competence is considered crucial for autonomously navigating the educational and professional challenges we face in a constantly changing society of communication and information. In the past decade, systematic reviews have been conducted that have analysed areas related to digital competence and literacy, clarifying both concepts and investigating their correct usage [27]. Additionally, there are empirical studies, bibliographical analyses, mappings, and meta-analyses of previous systematic reviews where articles related to digital competence in the educational context over the past decade can be extracted [28,29,30,31]. These analyses are descriptive, comparative, evolutionary, or clinical in nature. This review focuses on articles extracted from these studies, but specifically analyses those that include an intervention.
On one hand, there is evidence of numerous systematic reviews focused on the analysis of studies related to the acquisition of digital teaching competence. These reviews encompass various educational models centred on the development of teacher digital literacy [32] or the theoretical underpinnings of teaching competence [33,34]. They also span different educational stages, such as higher education [35] and primary education [36,37]. Most of these reviews, however, aim to analyse studies focused on assessing the educational practices or methodologies proposed for the development of digital competence [38,39,40,41].
On the other hand, there are systematic reviews focused on the analysis of studies investigating the evaluation of digital competence acquisition procedures in higher education students [42,43,44], in primary education [45], or in early childhood education [46,47]. Finally, due to the demands arising from the COVID-19 lockdown, the systematic review conducted by Armas-Alba et al. [48] stands out, focusing on the use of ICTs and the development of digital teaching competence in response to the educational needs of students with special educational needs during the health crisis. Similarly, the systematic analysis carried out by Scagliusi [49] focuses on the development of digital literacy for youth entrepreneurship post-lockdown.
In both cases, involving students and teachers, these articles assess digital competence and the teaching and learning procedures within educational structures, but without delving into interventions [50]. On the other hand, studies were found that demonstrate the development achieved in digital literacy through various interventions with teachers [51] or with students at various educational stages [52,53,54,55]. There are also reviews focused on the behavioural, emotional, and cognitive engagement that students experience when advancing in the acquisition of digital competence [56].
Regarding the selected interventions for analysis (see Figure 1), it is evident that some of them have been included in previous systematic reviews. For example, the study conducted by Basantes-Andrade et al. [57] was selected in the reviews by Basantes-Andrade et al. [58] and by De la Cruz-Campos et al. [59]. Similarly, the intervention conducted by Benavente-Vera et al. [60] was analysed in the study by Chavarry [61] and by Sanchez & Fernández [62]. In all four cited reviews, the analysis revolves around interventions aimed at developing teaching digital competencies. The same is true for the interventions carried out by Gómez-Trigueros & Moreno-Vera [63], cited in the review conducted by Velandia-Rodriguez et al. [64], and that of Guayara-Cuéllar et al. [65], included in the studies by Hernández et al. [66] and by Viñoles-Consentino et al. [55].
On the other hand, the intervention conducted by Fernández-Montalvo [68] has been analysed in various systematic reviews [45,89,90] that compile studies focused on the evaluation and development of digital literacy in primary education. Similarly, the intervention published by Maureen et al. [69] is included in numerous reviews that focus on the development of literacy and digital competence acquisition in early childhood education or preschool stages [91,92,93,94,95,96,97,98].
Finally, the intervention conducted by Gabarda-Méndez et al. [85] is included in the systematic review published by Marrero-Sánchez & Vergara-Romero [99], which focuses on the analysis of interventions to improve digital competence in university students. All the reviews mentioned so far focus their analysis on digital competencies in one period or stage, highlighting the need to expand the field of study and compare it with other educational phases, an objective proposed in this review.
The article published by Romero-García et al. [71] was analysed in various systematic reviews, focusing on both teacher digital competence [64,100] and digital literacy at the university level [101,102]. It was also included in the review conducted by Reyes-Argüelles et al. [103], where 15 articles detailing problem-based learning during the lockdown were analysed. In contrast to this last review, the present one extends the scope of analysis from the period before the health crisis to the period after it, including the lockdown itself.
Regarding the intervention by Prince et al. [67], it can be found analysed in systematic reviews focused on the study of the development of 21st-century educational competencies and skills, uncovering articles from the last decade focused on digital competence as well as other competencies such as mathematical or literary [104,105]. The main difference with the present review lies in the specificity and depth given to digital literacy and competence, and the influence COVID-19 has had on it.
Finally, numerous researchers include the article published by Nogueira et al. [81] in their reviews. Some are focused on the relationship between digital literacy, the impact of new technologies, and mathematical learning [106,107,108], while others analyse the educational structure and digital development in classrooms [109,110,111]. The present review provides an innovative perspective with a focus on development influenced by the lockdown era and the analysis of various intervention quality indicators (see Table S4).
Although this review has selected the studies from these other reviews, the indicators and focuses have been different. New reviews are needed that focus, for example, on the pre-post situation and during the COVID-19 lockdown (see Figure 2). This is the first need we will address in this analysis. There is a need for a systematic review that combines the acquisition of digital competence at different stages of the life cycle with differences before, during, and after the health crisis caused by the COVID-19 lockdown, and how all of this affects psychoeducational variables such as personal satisfaction [1,112].
This is the added value of this systematic review. In addition to all the studies related to interventions previously analysed by other authors, new ones have been selected, recently published, in order to update the scientific literature, which has been growing exponentially in recent years. Finally, the main contribution of the study is the analysis, first, by focuses and indicators and, later, of the results obtained depending on whether the intervention was carried out before, during, or after the COVID-19 lockdown.
The systematic review addresses the problem through the following research questions: (i) What are the results of empirically validated interventions related to teachers and students, their causal, mediating, and moderating roles in relation to psychological variables in the acquisition of digital competence? (ii) How does the instructional and evaluative procedure affect the development of digital literacy and psychoeducational variables in teachers and students? (iii) What are the strengths and limitations found in the various articles analysed? What can be contributed to future research lines? (iv) How does the COVID-19 lockdown affect the effectiveness of these procedures and the role of variables?
The answers to the various research questions have been obtained through the achievement of general and specific objectives. The general objective was to carry out a systematic review focused on empirically validated interventions in recent years, with national and international data. Specifically, it focused on variables related to the development of digital competence and psychoeducational variables. The specific objectives included: (i) Discuss the focuses analysed in each of the reviewed articles, detailing their instructional and evaluative procedures, the characteristics of the participants, objectives, results, conclusions, and fidelity. (ii) Identify the limitations and future research lines of each of the articles, offering alternatives and practical applications.
To achieve these objectives, the following working hypotheses were proposed: (i) Relevant differences in the contributions of empirical evidence from the studies by focuses would be observed: instructional procedure, evaluative procedure, participant characteristics, objectives, results, conclusions, and fidelity; (ii) differential psychoeducational patterns contributed by the analysed and classified interventions, the results obtained in them, and their relationship with the development of digital competence would be identified; (iii) it is expected that the evidence obtained will demonstrate the relevant mediating role of psychological variables in the causal relationship between digital literacy competencies through content and academic, learning, and adaptive outcomes; and (iv) the differences and similarities extracted after the analysis by focuses, sequenced in the pre-, during-, and post-COVID-19 lockdown stages would be evidenced.

2. Method

To carry out this systematic review, a phased procedure was followed, based on the models by Miller et al. and Scott et al. [113,114], detailed as follows: (i) starting with a diagram of relevant terms and thematic axes (see Figure 3), a literature search was conducted using databases such as Web of Science, Scopus, and ResearchGate, and journals such as those of MDPI or Frontiers; (ii) the inclusion/exclusion criteria were applied, with the addition of other criteria, such as publication in peer-reviewed journals, reference databases, and citation indexes, following the contributions of Cooper et al. [115]; and (iii) once the inclusion and exclusion criteria were established, they were applied following an agreement between the observers for their coding, with the aim of conducting qualitative and quantitative analyses.
On one hand, both the subject and the content and terms were carefully selected, taking into account the educational needs related to the acquisition and development of digital competence, as previously mentioned in the theoretical framework and evidenced by studies [116,117]. On the other hand, once the search terms giving meaning to digital competence were defined, the range of participants was narrowed down. The selection covers the entire life cycle of an individual, including early childhood, primary, secondary, and higher education, without forgetting the teaching staff. All of this is done with the purpose of comparing the development observed in various educational and biological stages [57,69]. Regarding the search criteria, focus areas, and selected indicators for analysis, those considered essential for conducting a comprehensive quality analysis of each intervention were chosen [113,114].

Inclusion and Exclusion Criteria

At this point, it should be noted that the procedure was documented and graphically represented according to the PRISMA statement [118]. A total of 26 intervention studies were identified and collected through a search in databases such as Scopus, Web of Science, and ResearchGate, and in journals such as Frontiers or MDPI. These studies were classified and analysed using tables that cover various aspects, including general aspects, evaluation instruments, treatment fidelity, instructional procedure, and limitations. The selected studies were sequenced based on their publication date into the periods before, during, and after the COVID-19 lockdown. A column structure was used to describe the indicators, and rows were used for each of the selected interventions (see Figure 4).
For the selection of articles to be analysed, a series of inclusion and exclusion criteria were taken into account. Regarding the typology of the studies, only those involving instructional interventions that were actually carried out were considered, and only studies published (not in press or preprint) in the last ten years (January 2013–October 2023) whose theme relates to digital competence and psychological variables were included. Regarding the language, we included those written in English or Spanish, regardless of where they were implemented; studies published in other languages were therefore excluded.
We selected those that refer to an instructional intervention contrasting some digital competence improvement program with psychological variables, excluding single-case and clinical studies. Other types, such as comparative, descriptive, evolutionary, and interpretative studies and proposals, were also excluded, as were intervention studies that do not contain analyses of psychological variables and observational studies that do not provide pre-post intervention measures.
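As an illustrative sketch only (not the authors' actual screening tool), the criteria above can be expressed as a simple programmatic filter; the record fields and example entries below are hypothetical:

```python
# Hypothetical sketch of the inclusion/exclusion screening described above.
# Field names and sample records are illustrative, not taken from the review.
from dataclasses import dataclass

@dataclass
class Record:
    year: int
    language: str          # "en" or "es" are included
    study_type: str        # only "intervention" studies pass
    published: bool        # excludes in-press / preprint items
    has_pre_post: bool     # pre-post intervention measures reported
    psych_variables: bool  # psychological variables analysed

def meets_criteria(r: Record) -> bool:
    """Apply the review's inclusion/exclusion criteria to one record."""
    return (
        2013 <= r.year <= 2023
        and r.language in {"en", "es"}
        and r.study_type == "intervention"
        and r.published
        and r.has_pre_post
        and r.psych_variables
    )

records = [
    Record(2019, "en", "intervention", True, True, True),   # included
    Record(2021, "fr", "intervention", True, True, True),   # excluded: language
    Record(2020, "es", "descriptive", True, False, False),  # excluded: type
]
included = [r for r in records if meets_criteria(r)]
print(len(included))  # 1
```

Encoding the criteria this way makes each exclusion decision explicit and reproducible, which is the same goal the PRISMA flow diagram serves graphically.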
This process yielded the final total of 26 studies. Nevertheless, this number is considered appropriate, considering the systematic reviews focusing on the development of digital competence conducted and published in Sustainability in recent years (see Table S5).
Regarding the assessment of study quality following the PRISMA approach, the analysis focused on treatment fidelity is considered the most crucial aspect in appraising their quality. Consequently, this analysis is also seen as an assessment of quality. A detailed exploration of the fidelity indicators for each intervention has facilitated the creation of a helpful ranking, reflected in a specific table as described in the corresponding results. This analysis, based on the rigorous control of scientifically validated interventions, provides a quality assessment approach from that perspective.
Furthermore, it delves into the limitations of the studies, allowing for a comparative analysis between them regarding their methodology and the context of their published reports. This encompasses evaluating methodological and contextual strengths such as currency, precise problem formulation, rigorous analysis of the measurement of psychological constructs and considered competencies, instrument validation, coherence between propositions and backgrounds, discussion and conclusions, and the appropriate use of data to support claims.
Overall, this approach offers an evaluation of article quality from a broader perspective, encompassing both formal and content-related aspects, beyond the specificity of the intervention, which is primarily assessed based on treatment fidelity.
The PRISMA checklist has been included, as an example, for three interventions, one from each period (pre-, during-, and post-COVID-19) [68,72,77] (see Tables S1–S3).
After defining objectives, hypotheses, and research questions, inclusion and exclusion criteria, as well as keywords, were established. These were structured as search terms to build the search string.
Subsequently, the same procedure was followed in all databases and sources. Keywords were added, combined with “AND” or “OR” between each of them. In the search engine, results were filtered based on inclusion and exclusion criteria such as publication period (2013–2023), open access, language, and document type. An example of a search string is the one used in Web of Science: TS = “digital competence” OR “digital literacy” AND “education” AND “intervention”. Timespan: 2013–2023. Open access.
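A boolean string of this kind can be assembled programmatically, which helps keep queries consistent across databases. The sketch below is illustrative (the helper name is ours, not part of any database API), and it adds explicit parentheses around the OR-joined synonyms so the grouping intended by the text is unambiguous:

```python
# Illustrative helper for building a boolean search string like the
# Web of Science example in the text. "TS" is the WoS topic-search field tag;
# build_search_string is a hypothetical name, not a library function.

def build_search_string(synonyms, required_terms, field="TS"):
    """Join synonym terms with OR (parenthesised) and required terms with AND."""
    synonym_clause = " OR ".join(f'"{t}"' for t in synonyms)
    required_clause = " AND ".join(f'"{t}"' for t in required_terms)
    return f'{field} = ({synonym_clause}) AND {required_clause}'

query = build_search_string(
    synonyms=["digital competence", "digital literacy"],
    required_terms=["education", "intervention"],
)
print(query)
# TS = ("digital competence" OR "digital literacy") AND "education" AND "intervention"
```

Without the parentheses, AND binds more tightly than OR in most search engines, so the ungrouped string would match any record mentioning only "digital competence"; grouping the synonyms avoids that pitfall.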

3. Results

The aspects reviewed ranged from more general indicators, such as the participants, duration, or objectives of each study, to more specific ones, like quality indicators or fidelity, used to check the validity of each intervention. The review concluded with a more specific examination of the instructional procedure proposed in each study and its evaluation, in order to understand and extract the details of each, compare them, and show their similarities and differences. After analysing these indicators, the limitations found in each of the interventions were classified to highlight the difficulties that readers may encounter when drawing conclusions due to the absence of various data (see Figure 5).

3.1. Comparison among the Different Focuses

Before delving into each of the approaches and comparing them in general, significant differences can be observed in various indicators before, during, and after the COVID-19 lockdown period. The first difference concerns the number of studies focused on acquiring digital competencies, showing a clear increase in the quantity of interventions aimed at achieving this goal since the end of the lockdown (see Figure 6). This has entailed a larger participant sample as the health crisis was coming to an end, and a preference for online or blended modalities was observed, which has been predominant up to the present year, with a trend toward in-person learning again (see Figure 7).
Furthermore, a higher number of participants has enabled an improvement in the instructional and evaluative procedure, which has become more controlled, with a growing preference for the division into Control and Experimental Groups. Additionally, there was an increased focus on assessing participant satisfaction, especially since the middle of the health crisis. To achieve positive results in this indicator, continuous information exchange between instructors and participants has been encouraged (see Figure 7).
The studies selected from the period prior to the lockdown primarily focus on the acquisition of digital competence as defined by the European educational framework. There is much less concern for its professional development, as evidenced by the number of studies conducted in these years, particularly those addressing professional development rather than purely curricular development. These studies form the foundation for the design of interventions aimed at this goal and mark the beginning of the use of Massive Open Online Courses (MOOCs).
During the years of the health crisis, there is a growing concern for the educational and professional development of digital competence due to the needs arising from online education and remote work. This is evident in an increased number of interventions aimed at achieving digital literacy in various domains. These interventions provide guidelines for online instruction and assessment. These procedures are complemented by online monitoring, using digital resources such as virtual classrooms, rubrics, and online forms. In addition, psychoeducational aspects have become a concern for researchers, including academic performance and participant satisfaction. To address these concerns, online meetings and feedback mechanisms between instructors and participants have been introduced.
Finally, after the period of restrictions resulting from the lockdown, there is a surge in interventions aimed at promoting digital competence in all areas. It has become one of the primary goals for the proper development of society. The main contributions during this period include significantly larger sample sizes than in previous years, longer-duration instruction, and a trend toward hybrid and in-person modalities. This trend is evident in the last year, marking a shift away from online-only interventions. The assessment of psychoeducational constructs like motivation and participant well-being becomes more established, driven by academic problems encountered during the quarantine and health crisis.

3.1.1. General Overview

When analysing interventions, it is important to consider various general aspects that can influence their effectiveness. These aspects include the participants involved, the specific construct or topic addressed, the research questions, and the stated objectives. It is also relevant to consider the professional context to which the intervention is directed, as the approaches and strategies used can vary depending on the context. Another aspect to consider is the instructional procedure used, which can include specific techniques and strategies to facilitate learning and skill development.

3.1.2. Assessment Instruments

The assessment of an intervention is fundamental to understand its effectiveness and make necessary adjustments. In this regard, it is important to define the appropriate time for evaluation, either during the intervention or upon its completion. Direct observations can provide valuable information about the participants’ performance in real-life situations. Additionally, performance tasks can be used to assess the practical application of acquired knowledge. Questionnaires and self-reports, as well as rating scales, are common tools for collecting data from participants. Other assessment methods include the use of physical or virtual portfolios to collect work samples and the evaluation of intervention effects, participant satisfaction, and result validation through individual or group feedback.

3.1.3. Fidelity and Quality of the Treatment

The fidelity of an intervention refers to the extent to which it is implemented according to the established plan. To assess fidelity, the timing of the comparison between the intervention group and the control group, if applicable, must be considered. It is also important to have a written protocol detailing the steps to follow in the intervention and ensuring consistency in implementation. Comparable training for instructors is crucial to minimize variations in intervention implementation. Detailed records should be maintained to assess whether the intervention is being applied consistently and uniformly. Additionally, it is essential to evaluate the relevance of the intervention and conduct regular meetings to provide feedback and make necessary adjustments.

3.1.4. Instructional Procedure

The instructional intervention encompasses various aspects related to the design and implementation of learning activities. This includes the selection and development of appropriate teaching materials, the instructor’s role during the intervention, the participants’ role in the learning procedure, and the grouping of participants in collaborative activities. The context in which the intervention takes place must also be considered, whether it is in a formal educational setting or a specific professional environment. The duration of the intervention is also a factor to consider as it can influence the depth and quality of learning. Finally, it is essential to assess the results obtained through the intervention and analyse whether the stated objectives were achieved.

3.1.5. Limitations

Despite good intentions and efforts made in interventions, it is important to recognize and address any limitations that may arise. These limitations can manifest in various aspects, such as the available background on the study topic, the participant sample used, the measurement instruments employed, the intervention program design, the results obtained, the discussion and conclusions made, as well as general limitations that can affect the validity and generalizability of findings. Recognizing these limitations and highlighting them in the research is essential as they provide a critical and transparent view of the study. Furthermore, additional comments and reflections can help contextualize and better understand the identified limitations.
All the selected studies show interventions in which digital competence is assessed both before and after instruction. The aim is to evaluate the teaching and learning procedure, determine whether participants’ digital literacy has improved, and establish to what level this competence has been acquired. Below are five tables, each focusing on specific constructs, with a detailed explanation of each one.

3.2. General Overview

Research on digital competence and literacy has been the subject of numerous works in recent years. The analysed findings indicate the importance of understanding how the lockdown has affected research in this field and how digital competence has become even more relevant in today’s society. Some studies are limited in how they report data on the participant sample, whereas others provide detailed information. In terms of intervention design, a high number of them opt for a quasi-experimental, descriptive, and cross-sectional approach, contributing not only to digital competence but also to other educational areas such as linguistics or mathematics, and even to other professional fields such as healthcare or IT. The choice between a purely qualitative, purely quantitative, or mixed analysis adds further variety (see Table 1).
In most cases, data from the participant samples have been presented, though not in studies such as those by Benavente-Vera et al., Chatwattana, and Garcés et al. [60,73,78], where only the total number of participants is shown, without specifying gender or average age. Regarding the COVID lockdown, a lower average number of participants was observed during the health crisis due to the restrictions on gatherings; the average was higher before the crisis and markedly higher after it (see Figure 8). Within the sample, with respect to groupings, half of the studies propose an intervention on a single experimental group, without a separate control group. The division into EG (experimental group) and CG (control group) increases after the lockdown, adding originality and higher quality to post-COVID studies, because splitting the sample into two groups allows for a differential analysis of the constructs studied, thus improving the reliability of the research and the effectiveness of the instruction provided [81,88].
As for the construct being worked on, all articles focus on digital competence or literacy as their central construct. The studies by Yelubay et al. [82], which also focus on motivation, by Maureen et al. [69] on linguistic competence, and by Romero-García et al. [71] on academic performance, stand out as exceptions, adding an extra dimension for analysis. It is important to highlight the significance of psycho-educational variables such as motivation and satisfaction during and after the quarantine period, which became priorities to evaluate in most interventions. Because the restrictions during the health crisis caused various psycho-educational issues in students and teachers, the protection, care, and development of psychological variables related to education, such as motivation, commitment, digital well-being, and academic performance, gained importance in interventions once the state of emergency was lifted [86,87].
Instruction focused on achieving the highest possible level of digital competence has influenced the selection of techniques and strategies for interventions. There is a clear trend toward designing and implementing online courses (MOOC and NANO MOOC), digital applications, websites, and virtual classrooms aimed at practical use to improve digital literacy. Virtual intervention becomes more pronounced during the years when COVID-related health measures are in place, as online education is seen as necessary. This trend continues in studies published after the lockdown, with the use of gamification [76] and social networks [70,77]. After the health crisis, there is a stagnation in the trend toward online intervention, with the number of studies opting for in-person modalities equaling those using online methods [83,86].
Regarding the instruction provided, the mentioned studies that aim to develop psycho-educational variables propose a different instructional procedure. For instance, Wang et al. [87] use gamification to foster engagement and digital well-being. In Pino and in Yelubay et al. [82,86], case studies applied to solving problems relevant to participants’ lives, along with tasks containing playful elements such as video editing, quiz solving, or online forum debates, are proposed to enhance student motivation instead of more traditional instruction. Similarly, Maureen et al. [69], focused on acquiring linguistic competence, and Romero-García et al. [71], focused on academic performance, propose instructional procedures with original contributions, such as classic and virtual storytelling or a customized mathematical game and workshop, respectively.
Finally, the research objectives and questions are referenced. Among the analysed articles, some explicitly include both questions and objectives, while others include one of the two indicators. The study by Benavente-Vera et al. [60] does not specify either. The indicators related to the stated objectives or questions mainly focus on verifying the effectiveness of the intervention and assessing the level of development of the worked construct, which in these studies is the acquisition of digital competence. It is observed that studies in which the development of another psycho-educational construct, such as commitment, motivation, or academic performance [71,86,87], is combined with the development of digital competence, are reflected in the research objectives or questions as a goal to achieve. This becomes more prominent during and after the lockdown when the achievement of these variables’ development is emphasized as an objective to affirm the effectiveness of the intervention.
To facilitate the identification and reading of the displayed results, the table has been condensed. As a result, some indicators are retained in this document, while the remaining ones can be found in the supplementary materials section (see Table S6).
Meaning of abbreviations in the table: NM = Number of men; NW = Number of women; EG = Experimental group; CG = Control group.

3.3. Assessment Instruments

The analysis of the evaluation instruments in a study refers to the procedure of examining and critically evaluating the instruments used to collect data and obtain relevant information within the framework of an intervention. The assessment of digital competence and literacy has likewise been studied, with different instruments employed to measure participants’ progress. We will analyse the evaluation approaches used in these studies, focusing on the application of the instruments at different times and under the influence of the COVID-19 lockdown. The participants are mostly from educational environments, ranging from primary education to university students and teachers. The latter group predominates in interventions following the lockdown, with the aim of training future teachers to use digital tools efficiently (see Table 2).
In addition, we will highlight the moments at which each of the evaluation instruments is applied. All the studies conduct a diagnostic evaluation and a final evaluation after the intervention. The article by Munawaroh et al. [80] stands out, as it includes an evaluation conducted weeks after the final assessment to check whether the results persist over time. Regarding the influence of COVID, the health crisis compelled not only the instruction but also the evaluation procedures to be conducted online. This necessity led successive studies to adapt their assessments, with most articles retaining digital tools to measure the progress made [79,88].
Not only have evaluations been conducted before and after the instruction, but some interventions also include ongoing monitoring using various resources. On the one hand, regarding the portfolio used, questionnaires and surveys stand out in diagnostic, formative, and summative evaluation. On the other hand, rubrics are used for completed activities, and in the study by Prince et al. [67], interviews with participants are utilized. Once again, there is a progression towards digitizing the various evaluative instruments, evolving from paper questionnaires and surveys to online forms such as Google Forms. Reliability testing of these instruments with the studies’ own data is also observed in most cases, with some exceptions [63,65,78].
The observations are also described, with a general focus on performance, difficulties, and limitations. In some studies, such as Wang et al. [87], psychoeducational variables such as engagement, motivation, or interest are observed and evaluated. In the studies by Yelubay et al. and by Pino [82,86], variables such as digital well-being and motivation are assessed through questionnaires and case studies. These concerns arose from the quarantine and the sudden introduction of online education into the educational context, which affected teachers, students, and their families alike.
The last group includes various studies that introduce satisfaction surveys for the participants’ families, adding extra value to the intervention. There is a growing concern about the satisfaction of participants and their families regarding the digital tools introduced during the health crisis [71,73,75]. This type of evaluation, absent before the lockdown, continues after it. Once the instructional and evaluation procedure is completed, a questionnaire is sent to assess satisfaction with the course. The results obtained in these surveys reinforce the case for MOOC-type interventions to develop digital competence, as the satisfaction of participants and their families is high [57,83].
Finally, reference is made to both task performance evaluation and what is assessed with the evaluation instrument, with recurring results in the vast majority of interventions. In the first group, the focus is on competence level, performance, mastery, and the results of the pre- and post-tests. In the second group, the instruments are used to assess the design and effectiveness of the course conducted, as well as digital competence, either in a generalized manner or divided into areas or subfactors [57,76,77]. In conclusion, the last column provides critical comments on the evaluation procedure of each article.
To facilitate the identification and reading of the displayed results, the table has been condensed. As a result, some indicators are retained in this document, while the remaining ones can be found in the supplementary materials section (see Table S7).

3.4. Fidelity and Quality of the Treatment

Regarding the quality control of interventions, different treatment fidelity indicators can provide valuable information to understand the rigor, confidence, and potential for generalization of results to other studies or educational practices, enabling them to be considered empirically validated studies or interventions. To achieve this, they are classified according to a series of indicators outlined below (see Table 3).
Initially, it can be observed that all interventions share a common pattern. They begin with a prior empirical review before designing the course, followed by a needs analysis or diagnostic assessment of the sample population for which the intervention is intended. This is done to design an optimal and efficient intervention capable of adapting information found in previous articles to the selected participants. To enhance the course’s instruction, in addition to the prior reviews and needs assessment, meetings with experts [67,74,81] or instructor training [70,76] are conducted. Subsequently, the instruction and ongoing and final evaluation take place.
Next, the instructional procedure is detailed, with a common factor across all interventions being the execution of online activities during the course to improve digital literacy. This includes the use of storytelling [69], gamification [87], and case study resolution [86], which contribute originality to the procedure. The entire procedure is monitored and observed, mostly online, except in the case of Guayara-Cuéllar et al. [65], where it is not specified. This monitoring began to lean towards online methods, such as virtual classrooms, particularly during the COVID-induced quarantine, and continues even after its conclusion. Among the remaining indicators, it is worth highlighting the predominant use of continuous online portfolios, which include tasks, activities, problem-solving, case studies, forums, debates, and games.
Continuing the analysis, it is indicated whether the program script is detailed in a broken down or schematic manner [60,78,79,81]. Regarding the relevance of the instruction, the majority of interventions are curriculum-based, with horizontal alignment seen in only a few articles [76,83,87]. It is noteworthy that in some studies, both vertical and horizontal approaches are combined, especially during and after the health crisis [73,77,82,86].
The instruction is applied equally to all participants, except in studies comparing an experimental group and a control group, where the experimental group receives the instruction while the control group does not. The use of two groups was observed occasionally before the COVID-19 lockdown [68,69], and it became more common during [71,73] and after the lockdown [76,80,88]. The division into control and experimental groups adds value to the studies by providing another comparison between participants, enhancing result consistency.
Finally, concerning the feedback provided by instructors, more than half of the studies consistently inform participants. A higher concentration of studies considering communication with participants essential is observed in interventions conducted after the lockdown, a period during which constant contact between instructors and students through ICTs became normalized [57,77,81,82,83,85,87]. There is a clear trend toward constant online feedback, both with participants and their families [81]. Regarding communication among stakeholders during instruction, the use of meetings is evident: they appear in half of the interventions analysed from before and during the health crisis and in the majority of those proposed after it.
To facilitate the identification and reading of the displayed results, the table has been condensed. As a result, some indicators are retained in this document, while the remaining ones can be found in the supplementary materials section (see Table S8).

3.5. Instructional Procedure

The analysis of the instructional procedure is a fundamental phase in the design and development of education and training, particularly in the context of teaching and learning. It refers to the systematic and detailed study of all the stages involved in creating an intervention. In this phase, a comprehensive evaluation of the educational context and the students to whom the instruction is directed is carried out. The main aspects addressed during the analysis of the instructional procedure are outlined below (see Table 4).
Initially, the roles of both the instructor and the student are highlighted. It is observed that the instructor performs the roles of a researcher and instructor, while the student actively participates in each activity. Furthermore, an equitable distribution is noted in terms of grouping into large or small groups and the context of application, whether online or in-person, except for those published during the lockdown, where online mode predominates. Before and after the lockdown, it is noted that there is a similar number of interventions that choose one or the other criterion, although some articles opt for a combination of both modes, both in-person and online [60,71,76,85].
The results of the interventions expose the contrast between the initial assessment of the participants conducted before the instruction with those derived from the questionnaires and tests completed once the course is finished. In all of them, significant progress is shown in terms of digital competence acquisition, with more evident results in studies that present a comparison between EG and CG, where the experimental group has achieved greater development in digital literacy. Depending on the article, the results are shown more explicitly, through tables and graphs, categorizing them based on the treatments applied [60], the areas worked on [71], or the different psycho-educational variables [87].
Regarding the effectiveness of online versus in-person instruction and monitoring, and considering the results of each study, which base the intervention’s effectiveness on the comparison of pre- and post-test results and, where applicable, of the CG and EG, we can affirm that both modalities produce positive effects on the participants. In both situations, technology tools are used, as practical tasks are performed on digital devices; the difference lies in whether participants are physically present in the instructor’s classroom or attend through a virtual classroom, and in the method used for evaluation and class delivery. In all cases, the use of ICTs is essential.
The selection of an online modality versus an in-person one is based on the initial digital competence level of the participants or, in studies conducted during the lockdown, the emerging needs of that period. Therefore, it is evident that both online and in-person modalities are equally effective in acquiring and developing digital competence, and their selection has not been made to ensure the intervention’s effectiveness, but based on the needs of the moment, available resources, or the characteristics of the chosen sample.
The analysis of the instructional procedure is completed with the presentation of the duration of the instruction in each study. A wide range is observed, divided into modules or courses, from four sessions [60] up to an entire school year [85]. Articles published during the lockdown show shorter durations than those before and after it, with the latter proposing longer-lasting instruction. Regarding the materials used, since the interventions concern the training of digital competence, there is a clear trend toward the use of ICT, social media, virtual classrooms, Google, and MOOC-NANO MOOC.

3.6. Limitations

The analysis of limitations in studies refers to the procedure of identifying, evaluating, and discussing the constraints or weaknesses that affect the validity, reliability, or generalizability of the results obtained in each intervention. This phase is essential in presenting the findings as it provides a critical and transparent assessment of the scope and implications of the results, enabling a more precise interpretation and a proper understanding of the quality of the research conducted (see Table 5).
In analysing the interventions of the various studies, several limitations have been identified in each of them. These interventions have been classified according to the specific indicators detailed in the corresponding columns, and comments have been added for each selected study. Therefore, this section does not provide an exhaustive analysis of these limitations.
In general terms, a common limitation observed is the failure to explicitly present the objectives, research questions, or hypotheses raised. As a result, the conclusions do not address the questions, nor do they indicate whether the objectives were achieved or if the hypotheses were fulfilled. Another limitation that has been found recurrently is the uniform application of treatment to the entire sample group, without making a distinction between an experimental group and a control group.
Additionally, when selecting the sample, limited information was generally provided about the group’s characteristics, such as gender, age, inclusion and exclusion criteria, as well as the randomness of the sampling procedure.
To facilitate the identification and reading of the displayed results, the table has been condensed. As a result, some indicators are retained in this document, while the remaining ones can be found in the supplementary materials section (see Table S9).

4. Discussion

In this discussion, we will analyse the results obtained in the study based on the stated objectives, provide an evaluation of the response to the research questions, explore practical applications of the work conducted, identify encountered limitations, and present possible solutions for future research.
Compared to other systematic reviews published in recent years that focus on one of the two groups involved in the teaching–learning procedure, namely the teaching staff, we can observe, on the one hand, systematic reviews centred on the theoretical foundations and models applied by educators [32,33,34]. On the other hand, there are evaluations of the practices and educational methodologies proposed for the development of digital competence [38,39,40]. This article provides a novel approach by analysing interventions specifically designed for the acquisition and development of teacher digital competence, using results obtained from the analysis of the models, theoretical foundations, practices, and methodologies selected in the mentioned reviews.
Previous work is divided across educational stages, addressing educators in higher education [35] and primary education [36,37], as well as students in higher education [42,43,44], primary education [45], or early childhood education [46,47]; these articles assess digital competence and teaching–learning procedures within educational structures without delving into interventions [48,49,50]. This review has amalgamated the various educational stages into a single analysis, with the aim of conducting a more comprehensive examination focused on the individual’s life cycle. Consequently, there has been an in-depth exploration of the digital literacy development field, not only concerning the range of stages and years of research but also in terms of adherence to the analysis of interventions aimed at competency development.
This study adds value by not only analysing the instructional and evaluative procedures, limitations, general aspects, and fidelity in interventions directed towards the acquisition of digital competence [53] but also by linking these to psychoeducational constructs [56]. This includes interventions performed on educators [51] or students from various educational stages [52,54,55]. It highlights the evolution observed in the research field during the pre-lockdown, lockdown, and post-lockdown COVID-19 periods. The results obtained have been related to indicators and focuses based on the needs emerging from the health crisis, giving rise to new evidence and trends in the research field. The significance contributed by research and empirical analysis to the acquisition of digital competence in recent years is evident in this systematic review.
The objectives set at the beginning of the research have been evaluated based on the results obtained. Each of the objectives has been systematically addressed, allowing for progress and the achievement of expected results. This has enabled the completion of a systematic review of interventions carried out in the last decade, discussing the analysed focal points, their indicators, and the changes experienced during the health crisis. All of this has revealed limitations and future lines of research, paving the way for practical applications detailed further on. Furthermore, the proposed hypotheses have been largely confirmed, thereby substantiating the validity of the approach and methodology used.
The research questions of the study have been answered satisfactorily. The collected data and the analysis performed have provided strong evidence to address each of the research questions, leading to a deeper understanding. (i) What are the results of empirically validated interventions related to teachers and students, and their causal, mediating, and moderating roles in relation to psychological variables in the acquisition of digital competence? The results analysed in the corresponding section have shed light on the roles of both students and teachers within each of the interventions. They have also highlighted the relationship of these roles with psychological variables such as satisfaction or motivation, which have proven crucial in acquiring digital competence. (ii) How does the instructional and evaluative procedure affect the development of digital literacy and psychoeducational variables in teachers and students? Again, the tables analysing each intervention show the relationship and importance of an instructional and evaluative procedure that considers the motivation of both students and teachers, as well as their well-being or satisfaction. (iii) What are the strengths and limitations found in the various articles analysed? What can be contributed to future research lines? When analysing the different interventions, various limitations were identified in each of them, all categorized in the corresponding table, allowing this question to be answered. In addition, a future line of research is proposed: conducting a meta-analysis. As a practical implication, an improvement in educational methodologies is suggested, favouring not only learning but also the development of key psychoeducational variables in the acquisition of digital competence. (iv) How does the COVID-19 lockdown affect the effectiveness of these procedures and the role of variables?
The influence of the COVID-19 pandemic-induced lockdown has been evidenced not only in the acquisition of digital competence but also in the increasing relevance and importance attributed to various psychoeducational variables related to it.
Furthermore, within the analysis of the results, the importance of the instructional and evaluative procedures in digital literacy development has been emphasized, along with the changes experienced during the lockdown and how these affect the analysed indicators. All of the above has allowed for the identification of both limitations and strengths in each of the selected studies and, consequently, their potential contributions to future lines of research and practical applications.
As an added value of the present work, a triple analysis of the quality of the reviewed studies is presented. Initially, quality is assessed through the inclusion and exclusion criteria, which excluded articles lacking comparative data in their proposals, reflections, or interventions. Subsequently, the “Results” section includes a specific focus on evaluating the quality and fidelity of treatments, featuring a corresponding table and its description. Finally, the analysis culminates with an examination of the limitations, providing this systematic review with both a comprehensive and a specific assessment of the quality of the selected interventions.
Regarding the results obtained, the development of digital competence in education and other institutions during the lockdown contributes to sustainability by reducing the ecological footprint and by promoting greater awareness of, and action on, sustainable issues. Analysing this skill set makes it possible to identify areas in need of improvement, fosters innovation, and eases the transition towards more sustainable psychoeducational models by taking environmental, social, and economic factors into account in the implementation of technological solutions, ultimately contributing to the preservation of resources and to long-term well-being.
Despite the significant results, some limitations of the research were identified. One of the main ones concerns language: only articles published in English or Spanish were included, which may have considerably reduced the number of selected articles. To address this limitation, future studies should broaden the range of included languages, thereby increasing the sample of interventions.
Given the rapid development of technology and digitalization in recent years, the impact of COVID-19 on digital competence, and the difficulty of finding interventions in this field published before 2016, the selected range was one decade. Future research should analyse the numerous interventions expected to be published in this field in the near future.
Another important limitation was the need to classify the analysed interventions into the three periods according to their publication date rather than their implementation date, since some studies lacked detailed information on when exactly the interventions took place. In those cases, the publication date is the only available and reliable information for organizing the articles, providing methodological consistency to the systematic review.
On the other hand, organizing the studies by publication date ensures a more consistent comparison over time, allowing an evaluation of how interventions have evolved as the literature has progressed and facilitating the identification of trends and changes in strategies over the years.
The publication date also reflects the context in which a study was conducted and its relevance at that time, which is crucial for understanding the specific conditions, approaches, and concerns of the pre-, during-, and post-lockdown periods.
The need for future reviews that classify interventions based on their implementation year is acknowledged, as there might be a difference of several years between that date and the article’s publication.
The findings of this study have several practical applications in the relevant field. The results enable educators to identify successful approaches, facilitating the adjustment of educational strategies to diverse contexts. They can also inform educational policy by providing key insights into enhancing the acquisition of digital competence under various circumstances. At the same time, acknowledging the relationship between digital competence and psychoeducational variables allows specific tactics to be developed for a healthy and constructive use of technology.
In addition, only articles presenting interventions were selected for this systematic review, leaving aside the pilot proposals mentioned in the introduction. A systematic review analysing such pilot proposals, with a view to their future implementation, is therefore recommended. Finally, a future line of research could take the form of a meta-analysis, combining results from qualitative and quantitative analyses and thereby significantly enhancing the quality of the study.

5. Conclusions

This systematic review has analysed interventions aimed at the acquisition of digital competence before, during, and after the COVID-19 lockdown. Across five tables, a range of focal points was classified and discussed, encompassing general aspects, instructional procedures, evaluation instruments, fidelity, quality, and limitations. Each focal point was associated with various indicators, which facilitated the analysis of the results. The findings highlight the increasing significance of digital literacy in recent years, as well as of the psychoeducational variables associated with this construct, such as motivation and satisfaction. All of this takes on particular relevance in light of the needs that arose during the health crisis.
The recommendations stemming from this study are aimed at enhancing the sustainability of the development of digital competence in educational and familial contexts, as improper use and implementation of new technologies can pose a threat to sustainability.
The social significance of this work lies in the potential for the proactive development of pedagogical recommendations that are perceived as valuable in improving the sustainability of education in relation to the use of digital tools, taking into account both the educational and family contexts.
In conclusion, this review has met its objectives, answered its research questions, and contributed valuable and original insights to the field of educational research, while also revealing certain limitations. Rather than being viewed as failures, these limitations should be considered the foundation for future lines of research and practical applications, which will further contribute to the growth and sharing of knowledge.

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/su16010051/s1, Table S1: PRISMA checklist Camino et al. (2021); Table S2: PRISMA checklist Fernández Montalvo et al. (2017); Table S3: PRISMA checklist Fuentes-Cancell et al. (2022); Table S4: Distribution of the analysed interventions included in other previous systematic reviews; Table S5: Systematic reviews related to digital competence found on the Sustainability website, including the years covered, number of studies analysed, and the added value our study brings to them; Table S6: A comprehensive analysis of the reviewed empirical teaching studies, addressing elements such as the participant subjects, the groups involved, the methodological design, the examined concepts, and the skills developed. Each of these aspects is detailed in the report’s content; Table S7: Evaluation tools used in the educational implementation of the analysed studies are described in detail in the report, presenting a comprehensive comparison among the various instruments and their application; Table S8: Treatment fidelity refers to the consistency of implementing the educational approach. The report provides a comprehensive description of each control element and indicator, detailing their comparative application across the various studies. Additionally, it includes the measurement factors and adjustment variables used in the pedagogical intervention in the analysed studies; Table S9: Limitations of the educational strategies addressed in the examined empirical studies. Each of these areas is detailed extensively in the main report. Refs. [119,120,121,122,123,124] are cited in the supplementary materials.

Author Contributions

Conceptualization, A.D.-B., J.-N.G.-S., M.L.Á.-F. and S.M.d.B.-C.; methodology, A.D.-B., J.-N.G.-S., M.L.Á.-F. and S.M.d.B.-C.; software, A.D.-B., J.-N.G.-S., M.L.Á.-F. and S.M.d.B.-C.; validation, A.D.-B., J.-N.G.-S., M.L.Á.-F. and S.M.d.B.-C.; formal analysis, A.D.-B., J.-N.G.-S., M.L.Á.-F. and S.M.d.B.-C.; investigation, A.D.-B., J.-N.G.-S., M.L.Á.-F. and S.M.d.B.-C.; resources, A.D.-B., J.-N.G.-S., M.L.Á.-F. and S.M.d.B.-C.; data curation, A.D.-B., J.-N.G.-S., M.L.Á.-F. and S.M.d.B.-C.; writing—original draft preparation, A.D.-B., J.-N.G.-S., M.L.Á.-F. and S.M.d.B.-C.; writing—review and editing, A.D.-B., J.-N.G.-S., M.L.Á.-F. and S.M.d.B.-C.; visualization, A.D.-B., J.-N.G.-S., M.L.Á.-F. and S.M.d.B.-C.; supervision, A.D.-B., J.-N.G.-S., M.L.Á.-F. and S.M.d.B.-C.; project administration, A.D.-B., J.-N.G.-S., M.L.Á.-F. and S.M.d.B.-C.; funding acquisition, A.D.-B., J.-N.G.-S., M.L.Á.-F. and S.M.d.B.-C. All authors have read and agreed to the published version of the manuscript.

Funding

This research was supported by the general operating funds of the Universidad de León (Spain) and of the Instituto Politécnico de Coimbra and NICSH (Portugal).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The raw data supporting the conclusions of this article will be made available by the authors, without undue reservation.

Conflicts of Interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

References

  1. Núñez, M.; de Obesso, M.M.; Pérez, C.A. New challenges in higher education: A study of the digital competence of educators in COVID times. Technol. Forecast. Soc. Chang. 2022, 174, 121270. [Google Scholar] [CrossRef]
  2. INTEF Marco Común de Competencia Digital Docente. Madrid: Ministerio de Educación, Cultura y Deporte. 2017. Available online: https://bit.ly/2jqkssz (accessed on 23 May 2023).
  3. ISTE (Ed.) NETS for Teachers: National Educational Technology Standards for Teachers. 2008. Available online: https://bit.ly/2UaLExK (accessed on 23 May 2023).
  4. UNESCO Normas UNESCO Sobre Competencias en TIC Para Docentes. 2008. Available online: http://goo.gl/pGPDGv (accessed on 23 May 2023).
  5. Hinojo, F.J.; Leiva, J.J. Competencia Digital e Interculturalidad: Hacia una Escuela Inclusiva y en Red. REICE Rev. Iberoam. Sobre Calid. Efic. Cambio Educ. 2022, 20, 5–9. [Google Scholar] [CrossRef]
  6. Mattar, J.; Santos, C.C.; Cuque, L.M. Analysis and Comparison of International Digital Competence Frameworks for Education. Educ. Sci. 2022, 12, 932. [Google Scholar] [CrossRef]
  7. Picón, G.A.; de Caballero, G.K.G.; Sánchez, J.N.P. Desempeño y formación docente en competencias digitales en clases no presenciales durante la pandemia COVID-19. Arandu UTIC 2021, 8, 139–153. [Google Scholar]
  8. Gómez-Trigueros, I.M.; Binimelis, J. Aprender y enseñar con la escala del mapa para el profesorado de la “generación”: La competencia digital docente. Rev. Electrónica Recur. Internet Sobre Geogr. Cienc. Soc. 2021, 238, 1–18. Available online: http://hdl.handle.net/10045/129434 (accessed on 23 May 2023). [CrossRef]
  9. Gutiérrez, N.; Mercader Rubio, I.; Trigueros Ramos, R.; Oropesa Ruiz, N.F.; García-Sánchez, J.N.; García Martín, J. Digital Competence, Use, Actions and Time Dedicated to Digital Devices: Repercussions on the Interpersonal Relationships of Spanish Adolescents. Int. J. Environ. Res. Public Health 2022, 19, 10358. [Google Scholar] [CrossRef]
  10. Barlow, D.H.; Levitt, J.T.; Bufka, L.F. The dissemination of empirically supported treatments: A view to the future. Behav. Res. Ther. 1999, 37, S147–S162. [Google Scholar] [CrossRef]
  11. Camacho, A.; Correia, N.; Zaccoletti, S.; Daniel, J.R. Anxiety and social support as predictors of student academic motivation during the COVID-19. Front. Psychol. 2021, 12, 644338. [Google Scholar] [CrossRef]
  12. Jiang, Z.; Jia, X.; Tao, R.; Dördüncü, H. COVID-19: A Source of Stress and Depression Among University Students and Poor Academic Performance. Front. Public Health 2022, 10, 898556. [Google Scholar] [CrossRef]
  13. Bonacini, L.; Murat, M. Beyond the COVID-19 lockdown: Remote learning and education inequalities. Empirica 2023, 50, 207–236. [Google Scholar] [CrossRef]
  14. Cabero-Almenara, J.; Barragán-Sánchez, R.; Palacios-Rodríguez, A.; Martín-Párraga, L. Design and Validation of t-MOOC for the Development of the Digital Competence of Non-University Teachers. Technologies 2021, 9, 84. [Google Scholar] [CrossRef]
  15. Martínez-Pérez, S.; Cabero-Almenara, J.; Barroso-Osuna, J.; Palacios-Rodríguez, A. T-MOOC for initial teacher training in digital competences: Technology and educational innovation. Front. Educ. 2022, 7, 846998. [Google Scholar] [CrossRef]
  16. Guajardo Leal, B.E.; Navarro-Corona, C.; Valenzuela González, J.R. Systematic mapping study of academic engagement in MOOC. Int. Rev. Res. Open Distrib. Learn. 2019, 20, 1–27. [Google Scholar] [CrossRef]
  17. Romero Córdova, J.F.; Arriazu, R. El aprendizaje de competencias en los MOOC. Una revisión sistemática de la literatura. Rev. Latinoam. De Tecnol. Educ. 2023, 22, 107–122. [Google Scholar] [CrossRef]
  18. Kuhlman, T.; Farrington, J. What is Sustainability? Sustainability 2010, 2, 3436–3448. [Google Scholar] [CrossRef]
  19. Purvis, B.; Mao, Y.; Robinson, D. Three pillars of sustainability: In search of conceptual origins. Sustain. Sci. 2019, 14, 681–695. [Google Scholar] [CrossRef]
  20. Mattos, L.K.D.; Flach, L.; Costa, A.M.; Moré, R.P.O. Effectiveness and Sustainability Indicators in Higher Education Management. Sustainability 2023, 15, 298. [Google Scholar] [CrossRef]
  21. Raisiene, A.G.; Rapuano, V.; Raisys, S.J.; Lucinskaite-Sadovskiene, R. Teleworking Experience of Education Professionals vs. Management Staff: Challenges Following Job Innovation. Mark. Manag. Innov. 2022, 2, 171–183. [Google Scholar] [CrossRef]
  22. Ogunleye, J.K.; Afolabi, C.S.; Ajayi, S.O.; Omotayo, V.A. Virtual Learning as an Impetus for Business Education Programme in the Midst of COVID-19 in Nigeria. Health Econ. Manag. Rev. 2023, 4, 83–89. [Google Scholar] [CrossRef]
  23. Okulich-Kazarin, V.; Bokhonkova, Y.; Ruda, V. Do Humanity Student New Needs Meet the State Decisions of Distance Learning during the COVID-19 Epidemic in Ukraine? FWU J. Soc. Sci. 2022, 16, 107–121. [Google Scholar] [CrossRef]
  24. United Nations. The Sustainable Development Goals Report 2023: Special Edition. Available online: https://unstats.un.org/sdgs/report/2023/?_gl=1*69eq5v*_ga*MjEzMTI2NjUyNC4xNjk3MDM2NTcw*_ga_TK9BQL5X7Z*MTY5NzAzNjU2OS4xLjEuMTY5NzAzNzQwNi4wLjAuMA (accessed on 23 May 2023).
  25. Akpoviroro, K.S.; Adeleke, O.A.O. Moderating Influence of E-Learning on Employee Training and Development. Socioecon. Econ. Chall. 2022, 6, 83–93. [Google Scholar] [CrossRef]
  26. Volk, I.; Artyukhov, A.; Lyeonov, S. Modeling of information system for blended education quality assurance and socio-economic impact. In Proceedings of the 16th International Conference on Advanced Trends in Radioelectronics, Telecommunications and Computer Engineering, Lviv-Slavske, Ukraine, 22–26 February 2022; pp. 590–593. [Google Scholar] [CrossRef]
  27. Spante, M.; Hashemi, S.S.; Lundin, M.; Algers, A. Digital competence and digital literacy in higher education research: Systematic review of concept use. Cogent Educ. 2018, 5, 1519143. [Google Scholar] [CrossRef]
  28. Farias-Gaytan, S.; Aguaded, I.; Ramirez-Montoya, M.S. Transformation and digital literacy: Systematic literature mapping. Educ. Inf. Technol. 2022, 27, 1417–1437. [Google Scholar] [CrossRef]
  29. Li, J.; Ye, H.; Tang, Y.; Zhou, Z.; Hu, X. What Are the Effects of Self-Regulation Phases and Strategies for Chinese Students? A Meta-Analysis of Two Decades Research of the Association Between Self-Regulation and Academic Performance. Front. Psychol. 2018, 9, 2434. [Google Scholar] [CrossRef] [PubMed]
  30. Peters, M.; Ejjaberi, A.E.; Martínez, M.J.; Fàbregues, S. Teacher digital competence development in higher education: Overview of systematic reviews. Australas. J. Educ. Technol. 2022, 38, 122–139. [Google Scholar] [CrossRef]
  31. Reyes, C.E.G.; Avello-Martínez, R. Alfabetización digital en la educación. Revisión sistemática de la producción científica en Scopus. Rev. Educ. A Distancia (Red) 2021, 21, 66. [Google Scholar] [CrossRef]
  32. Hernández, D.J.; Sánchez, P.M.; Giménez, F.S.S. La Competencia Digital Docente, una revisión sistemática de los modelos más utilizados. RIITE Rev. Interuniv. Investig. Tecnol. Educ. 2021, 10, 105–120. [Google Scholar] [CrossRef]
  33. Pinto-Santos, A.R.; Pérez Garcias, A.; Darder Mesquida, A. Development of Teaching Digital Competence in Initial Teacher Training: A Systematic Review. World J. Educ. Technol. Curr. Issues 2022, 14, 1–15. [Google Scholar] [CrossRef]
  34. Romero-Hermoza, R. Competencia digital docente: Una revisión sistemática. Eduser (Lima) 2021, 8, 131–137. [Google Scholar] [CrossRef]
  35. Tobar, M.G.R.; De la Cruz Lozado, J. Desarrollo de las competencias digitales en los docentes universitarios en tiempo pandemia: Una revisión sistemática. Crescendo 2021, 11, 511–527. [Google Scholar] [CrossRef]
  36. Ibda, H.; Syamsi, I.; Rukiyati, R. Professional elementary teachers in the digital era: A systematic. Int. J. Eval. Res. Educ. 2023, 12, 459–467. [Google Scholar] [CrossRef]
  37. Ramírez-Galindo, F.; Bernal-Ballén, A. El Desarrollo Profesional Docente para el fortalecimiento de la competencia digital en prácticas pedagógicas en educación básica: Una revisión sistemática. Rev. Boletín Redipe 2023, 12, 99–113. [Google Scholar] [CrossRef]
  38. Cancell, D.R.F.; Aguaded, I.; Estrada-Molina, O. La información y alfabetización informacional del marco común de competencia digital docente: Una revisión sistemática. Aloma Rev. De Psicol. Ciències De L’educació I De L’esport 2023, 41, 35–49. [Google Scholar] [CrossRef]
  39. García Ruiz, M.R.; Buenestado Fernández, M.; Ramírez Montoya, M.S. Evaluación de la Competencia Digital Docente: Instrumentos, resultados y propuestas. Revisión sistemática de la literatura. Educ. XX1 Rev. De La Fac. De Educ. 2023, 26, 273–301. [Google Scholar] [CrossRef]
  40. Nguyen, L.A.T.; Habók, A. Tools for assessing teacher digital literacy: A review. J. Comput. Educ. 2023, 1–42. [Google Scholar] [CrossRef]
  41. Revuelta-Domínguez, F.I.; Guerra-Antequera, J.; González-Pérez, A.; Pedrera-Rodríguez, M.I.; González-Fernández, A. Digital teaching competence: A systematic review. Sustainability 2022, 14, 6428. [Google Scholar] [CrossRef]
  42. Reis, C.; Pessoa, T.; Gallego-Arrufat, M.J. Alfabetización y competencia digital en Educación Superior: Una revisión sistemática. REDU Rev. De Docencia Univ. 2019, 17, 45–58. [Google Scholar] [CrossRef]
  43. Sillat, L.H.; Tammets, K.; Laanpere, M. Digital competence assessment methods in higher education: A systematic literature review. Educ. Sci. 2021, 11, 402. [Google Scholar] [CrossRef]
  44. Zhao, Y.; Llorente, A.M.P.; Gómez, M.C.S. Digital competence in higher education research: A systematic literature review. Comput. Educ. 2021, 168, 104212. [Google Scholar] [CrossRef]
  45. Tamborg, A.L.; Dreyøe, J.M.; Fougt, S.S. Digital literacy-a qualitative systematic review. Tidsskr. Læring Og Medier (LOM) 2018, 11, 29. [Google Scholar] [CrossRef]
  46. Dardanou, M.; Hatzigianni, M.; Kewalramani, S.; Palaiologou, I. Professional development for digital competencies in early childhood education and care: A systematic review. OECD Educ. Work. Pap. 2023, 295, 16–19. [Google Scholar] [CrossRef]
  47. Su, J.; Yang, W. Digital competence in early childhood education: A systematic review. Educ. Inf. Technol. 2023, 1–49. [Google Scholar] [CrossRef]
  48. Armas-Alba, L.; Alonso-Rodríguez, I. Las TIC y competencia digital en la respuesta a las necesidades educativas especiales durante la pandemia: Una revisión sistemática. Rev. Int. Pedagog. Innov. Educ. 2022, 2, 11–48. [Google Scholar] [CrossRef]
  49. Scagliusi MV, F. Competencias digitales clave en el emprendimiento juvenil: Una revisión sistemática de los últimos 6 años. RIITE Rev. Interuniv. Investig. Tecnol. Educ. 2023, 14, 28–44. [Google Scholar] [CrossRef]
  50. Istiani, I.; Fatimah, E.; Husain, A.; Sulistyawati, A.E.; Sunmud, S. Analysing the development of digital literacy framework in education: A systematic literature review. Kwangsan J. Teknol. Pendidik. 2023, 11, 242–254. [Google Scholar] [CrossRef]
  51. Perdomo, B.; Martinez, O.G.; Barreto, I.B. Competencias digitales en docentes universitarios: Una revisión sistemática de la literatura. Edmetic 2020, 9, 92–115. [Google Scholar] [CrossRef]
  52. Canedo-García, A.; García-Sánchez, J.-N.; Pacheco-Sanz, D.-I. A Systematic Review of the Effectiveness of Intergenerational Programs. Front. Psychol. 2017, 8, 1882. [Google Scholar] [CrossRef]
  53. Gutiérrez-Ángel, N.; Sánchez-García, J.-N.; Mercader-Rubio, I.; García-Martín, J.; Brito-Costa, S. Digital literacy in the university setting: A literature review of empirical studies between 2010 and 2021. Front. Psychol. 2022, 13, 896800. [Google Scholar] [CrossRef] [PubMed]
  54. Rojas-García, P.; Sáez-Delgado, F.; Badilla-Quintana, M.G.; Jiménez-Pérez, L. Análisis de intervenciones educativas con videojuegos en educación secundaria: Una revisión sistemática. Texto Livre 2022, 15, e37810. [Google Scholar] [CrossRef]
  55. Viñoles-Cosentino, V.; Sánchez-Caballé, A.; Esteve-Mon, F.M. Desarrollo de la Competencia Digital Docente en Contextos Universitarios. Una Revisión Sistemática. REICE 2022, 20, 11–27. [Google Scholar] [CrossRef]
  56. Schindler, L.A.; Burkholder, G.J.; Morad, O.A.; Marsh, C. Computer-based technology and student engagement: A critical review of the literature. Int. J. Educ. Technol. High. Educ. 2017, 14, 25. [Google Scholar] [CrossRef]
  57. Basantes-Andrade, A.; Cabezas-González, M.; Casillas-Martín, S.; Naranjo-Toro, M.; Benavides-Piedra, A. NANO-MOOCs to train university professors in digital competences. Heliyon 2022, 8, E09456. [Google Scholar] [CrossRef]
  58. Basantes-Andrade, A.; Casillas-Martín, S.; Cabezas-González, M.; Naranjo-Toro, M.; Guerra-Reyes, F. Standards of teacher digital competence in higher education: A systematic literature review. Sustainability 2022, 14, 13983. [Google Scholar] [CrossRef]
  59. De la Cruz Campos, J.C.; Santos Villalba, M.J.; Alcalá del Olmo Fernández, M.J.; Victoria Maldonado, J.J. Competencias Digitales Docentes en la Educación Superior. Un Análisis Bibliométrico. 2023. Available online: http://hdl.handle.net/10498/28623 (accessed on 23 May 2023).
  60. Benavente-Vera, S.Ú.; Flores Coronado, M.L.; Guizado Oscco, F.; Núñez Lira, L.A. Desarrollo de las competencias digitales de docentes a través de programas de intervención 2020. Propósitos Represent. 2021, 9, e1034. [Google Scholar] [CrossRef]
  61. Chávarry, E.I.H. Competencias digitales de los docentes de Educación Básica Regular. Polo Conoc. 2022, 7, 901–913. [Google Scholar]
  62. Sánchez, J.R.C.; Fernández, C.F.C. Propuestas para el desarrollo de competencias digitales docentes en la Educación Básica. Rev. Cient. Emprend. Cient. Tecnol. 2022, 3, 22. [Google Scholar] [CrossRef]
  63. Gómez-Trigueros, I.M.; Moreno-Vera, J.R. Nuevas didácticas geográficas: El modelo TPACK, los MOOCs y Google EarthTM en el aula. EDMETIC Rev. De Educ. Mediática Y TIC 2018, 7, 146–165. [Google Scholar] [CrossRef]
  64. Velandia Rodriguez, C.A.; Mena-Guacas, A.F.; Tobón, S.; López-Meneses, E. Digital Teacher Competence Frameworks Evolution and Their Use in Ibero-America up to the Year the COVID-19 Lockdown Began: A Systematic Review. Int. J. Environ. Res. Public Health 2022, 19, 16828. [Google Scholar] [CrossRef] [PubMed]
  65. Guayara-Cuéllar, C.T.; Millán-Rojas, E.E.; y Gómez-Cano, C.A. Diseño de un curso virtual de alfabetización digital para docentes de la Universidad de la Amazonia. Rev. Cient. 2019, 34, 34–48. [Google Scholar] [CrossRef]
  66. Hernández, E.S.; Juarros, V.I.M.; Vásquez, A.R.R. Competencia digital docente de profesores universitarios en el contexto iberoamericano. Una revisión. Tesis Psicológica Rev. La Fac. Psicol. 2022, 17, 11. Available online: https://dialnet.unirioja.es/servlet/articulo?codigo=8480503 (accessed on 23 May 2023). [CrossRef]
  67. Prince, M.S.; Figueroa, M.; Martínez, J.A.; Izquierdo, J.M. Curso MOOC para fomentar el desarrollo de competencias digitales en estudiantes universitarios y autodidactas. Rev. Iberoam. Tecnol. Educ. Educ. Tecnol. 2016, 17, 16–29. [Google Scholar]
  68. Fernandez-Montalvo, J.; Penalva, A.; Irazabal, I.; Lopez-Goni, J.J. Effectiveness of a digital literacy programme for primary education students/Efectividad de un programa de alfabetización digital para estudiantes de educación primaria. Cult. Educ. 2017, 29, 1–30. [Google Scholar] [CrossRef]
  69. Maureen, I.Y.; van der Meij, H.; de Jong, T. Supporting literacy and digital literacy development in early childhood education using storytelling activities. Int. J. Early Child. 2018, 50, 371–389. [Google Scholar] [CrossRef]
  70. Aydın, M.; Çelik, T. Impact of the Digital Literacy Courses Taken by the Prospective Social Studies Teachers by Distance Learning on Digital Citizenship Skills. Res. Educ. Media 2020, 12, 42–57. [Google Scholar] [CrossRef]
  71. Romero-García, C.; Buzón-García, O.; Sacristán-San-Cristóbal, M.; Navarro-Asencio, E. Evaluación de un programa para la mejora del aprendizaje y la competencia digital en futuros docentes empleando metodologías activas. Estud. Sobre Educ. 2020, 39, 179–205. [Google Scholar] [CrossRef]
  72. Camino, L.G.; Cruz, S.G.M.; Ana, G.V.M.R. Development of Digital Competence in primary and secondary students in three dimensions: Fluency, learning-knowledge and digital citizenship. RISTI (Revista Iberica de Sistemas e Tecnologias de Informacao) 2021, 44, 5–21. Available online: https://link.gale.com/apps/doc/A695578832/AONE?u=anon~cb10afad&sid=googleScholar&xid=b8e348a7 (accessed on 23 May 2023).
  73. Chatwattana, P. A MOOC system with self-directed learning in a digital university. Glob. J. Eng. Educ. 2021, 23, 134–142. Available online: http://www.wiete.com.au/journals/GJEE/Publish/vol23no2/09-Chatwattana-P.pdf (accessed on 23 May 2023).
  74. Ryhtä, I.; Elonen, I.; Hiekko, M.; Katajisto, J.; Saaranen, T.; Sormunen, M.; Mikkonen, K.; Kääriäinen, M.; Sjögren, T.; Korpi, H.; et al. Enhancing social and health care educators’ competence in digital pedagogy: A pilot study of educational intervention. Finn. J. EHealth EWelfare 2021, 13, 302–314. [Google Scholar] [CrossRef]
  75. Ugur, B.; Arkun Kocadere, S.; Nuhoglu Kibar, P.; Bayrak, F. An open online course to enhance digital competences of teachers. Turk. Online J. Distance Educ. 2021, 22, 24–42. [Google Scholar] [CrossRef]
  76. Choi, E.; Park, N. IT Humanities Education Program to Improve Digital Literacy of the Elderly. J. Curric. Teach. 2022, 11, 138–145. [Google Scholar] [CrossRef]
  77. Fuentes-Cancell, D.R.; Estrada-Molina, O.; Delgado-Yanes, N.; Zambrano-Acosta, J.M. Experiences in the Training of Teaching Digital Competence for Using Digital Social Networks. Int. J. 2022, 11, 7–26. [Google Scholar] [CrossRef]
  78. Garcés, L.A.M.; Morán, F.G.F.; Delgado, R.J.E. Diseño e implementación de un MOOC para el desarrollo de las competencias digitales. Rev. Mapa 2022, 6, 74–90. Available online: https://www.revistamapa.org/index.php/es/article/view/339 (accessed on 23 May 2023).
  79. Javorcik, T. Self-assessment of digital literacy skills in microlearning course. In EDULEARN22 Proceedings; IATED: Palma, Spain, 2022; pp. 5642–5647. [Google Scholar] [CrossRef]
  80. Munawaroh, I.; Ali, M.; Hernawan, A.H. The effectiveness of the digital competency training program in improving the digital competence of elementary school teachers. Cypriot J. Educ. Sci. 2022, 17, 4583–4597. [Google Scholar] [CrossRef]
  81. Nogueira, V.B.; Teixeira, D.G.; de Lima, I.A.C.N.; Moreira, M.V.C.; de Oliveira, B.S.C.; Pedrosa, I.M.B.; Jeronimo, S.M.B. Towards an inclusive digital literacy: An experimental intervention study in a rural area of Brazil. Educ. Inf. Technol. 2022, 27, 2807–2834. [Google Scholar] [CrossRef] [PubMed]
  82. Yelubay, Y.; Dzhussubaliyeva, D.; Moldagali, B.; Suleimenova, A.; Akimbekova, S. Developing future teachers’ digital competence via massive open online courses (MOOCs). J. Soc. Stud. Educ. Res. 2022, 13, 170–195. Available online: https://bulenttarman.com/index.php/jsser/article/view/4197 (accessed on 23 May 2023).
  83. Calvopiña Herrera, L.J. Desarrollo de Competencias Digitales Para El Fortalecimiento del Desempeño Profesional de los Docentes. Master’s Thesis, Pontificia Universidad Católica del Ecuador, Quito, Ecuador, 2023. Available online: https://repositorio.pucesa.edu.ec/handle/123456789/4180 (accessed on 23 May 2023).
  84. Dimitri, P.; Fernandez-Luque, L.; Koledova, E.; Malwade, S.; Syed-Abdul, S. Accelerating digital health literacy for the treatment of growth disorders: The impact of a massive open online course. Front. Public Health 2023, 11, 1043584. [Google Scholar] [CrossRef] [PubMed]
  85. Gabarda Méndez, V.; Marín-Suelves, D.; Vidal-Esteve, M.I.; Ramón-Llin, J. Digital Competence of Training Teachers: Results of a Teaching Innovation Project. Educ. Sci. 2023, 13, 162. [Google Scholar] [CrossRef]
  86. Pino, A.A.D. Impacto de un programa de intervención para promover la competencia digital del futuro docente: Un estudio de caso. Hachetetepé. Rev. Cient. Educ. Comun. 2023, 26, 1205. [Google Scholar] [CrossRef]
  87. Wang, K.; Liu, P.; Zhang, J.; Zhong, J.; Luo, X.; Huang, J.; Zheng, Y. Effects of Digital Game-Based Learning on Students’ Cyber Wellness Literacy, Learning Motivations, and Engagement. Sustainability 2023, 15, 5716. [Google Scholar] [CrossRef]
  88. Zhang, H.; Zhu, C.; Sang, G.; Questier, F. Effects of digital media literacy course on primary school students’ digital media literacy: An experimental study. Int. J. Technol. Des. Educ. 2023, 6, 100320. [Google Scholar] [CrossRef]
  89. Godaert, E.; Aesaert, K.; Voogt, J.; van Braak, J. Assessment of students’ digital competences in primary school: A systematic review. Educ. Inf. Technol. 2022, 27, 9953–10011. [Google Scholar] [CrossRef]
  90. Gunnars, F. A large-scale systematic review relating behaviorism to research of digital technology in primary education. Comput. Educ. Open 2021, 2, 100058. [Google Scholar] [CrossRef]
  91. Du, X. A Systematic Literature Review: The Modalities, Pedagogies, Benefits, and Implications of Storytelling Approaches in Early Childhood Education Classroom. 2021. Available online: https://ir.lib.uwo.ca/etd/8054 (accessed on 23 May 2023).
  92. Parlindungan, F.; Rifai, I.; Nuthihar, R.; Dewayani, S. Fostering Peace and Harmony Through Indonesian Heroes’ Stories: A Systematic Review of Literature. In Proceedings of the 4th International Conference on Progressive Education 2022 (ICOPE 2022), Bandar Lampung, Indonesia, 15–16 October 2022; Atlantis Press: Amsterdam, The Netherlands, 2023; pp. 349–364. [Google Scholar] [CrossRef]
  93. Paul, C.D.; Hansen, S.G.; Marelle, C.; Wright, M. Incorporating Technology into Instruction in Early Childhood Classrooms: A Systematic Review. Adv. Neurodev. Disord. 2023, 7, 380–391. [Google Scholar] [CrossRef]
  94. Purnama, S.; Ulfah, M.; Ramadani, L.; Rahmatullah, B.; Ahmad, I.F. Digital Storytelling Trends in Early Childhood Education in indonesia: A Systematic Literature Review. J. Pendidik. Usia Dini 2022, 16, 17–31. [Google Scholar] [CrossRef]
  95. Ramalingam, K.; Jiar, Y.K. The Recent Trends on The Speaking Skills with Storytelling Approach. Int. J. Spec. Educ. 2022, 37 (Suppl. S3), 1–23. Available online: https://khasturiramalingam.com/?p=97 (accessed on 16 December 2023).
  96. Su, J.; Zhong, Y.; Chen, X. Technology education in early childhood education: A systematic review. Interact. Learn. Environ. 2023, 1–14. [Google Scholar] [CrossRef]
  97. Wolff, L.A.; Skarstein, T.H.; Skarstein, F. The Mission of early childhood education in the Anthropocene. Educ. Sci. 2020, 10, 27. [Google Scholar] [CrossRef]
  98. Xu, M.; Stefaniak, J. Embracing children’s voice to support teachers’ pedagogical reasoning and decision-making for technology enhanced practices in early childhood classrooms. TechTrends 2021, 65, 256–268. [Google Scholar] [CrossRef]
  99. Marrero-Sánchez, O.; Vergara-Romero, A. Digital competence of the university student. A systematic and bibliographic update. Amazon. Investig. 2023, 12, 9–18. [Google Scholar] [CrossRef]
100. Buils, S.; Esteve-Mon, F.M.; Sánchez-Tarazaga, L.; Arroyo-Ainsa, P. Análisis de la perspectiva digital en los marcos de competencias docentes en Educación Superior en España. RIED-Rev. Iberoam. Educ. A Distancia 2022, 25, 133–147. [Google Scholar] [CrossRef]
  101. García Prieto, F.J.; López Aguilar, D.; Delgado García, M. Competencia digital del alumnado universitario y rendimiento académico en tiempos de COVID-19. Pixel-Bit 2022, 64, 165–199. [Google Scholar] [CrossRef]
  102. Tito-Huamani, P.; Aponte, S.; Custodio, F.; Castañeda, T.; Garamendi, K.; Soto, E. Universidad virtual y la transformación educativa en el contexto de la pandemia. Rev. Innova Educ. 2022, 4, 113–131. [Google Scholar] [CrossRef]
  103. Reyes-Argüelles, H.; Alanya-Beltran, J.; Caballero, J.E.A.P. Aprendizaje Basado en Problemas en Tiempos de Pandemia COVID-19: Revisión Sistemática. J. Bus. Entrep. Stud. 2022. Available online: https://journalbusinesses.com/index.php/revista/article/view/277 (accessed on 23 May 2023).
  104. Mendoza, V.R. Habilidades Educativas del Siglo XXI: Un Análisis Sistemático del Periodo 2014–2023. Rev. Boaciencia. Educ. E Ciências Sociais 2023, 3, 77–100. [Google Scholar]
  105. Monroy, N.E.C.; Villamil, Y.P.R. Competencias del siglo XXI en educación: Una revisión sistemática durante el periodo 2014–2023. Cienc. Lat. Rev. Cient. Multidiscip. 2023, 7, 219–249. [Google Scholar] [CrossRef]
  106. Şener, Z.T. The impact of technology-mediated applications on mathematics achievement: A meta-analytical review. Int. J. Educ. Technol. Sci. Res. 2023, 8, 1977–2010. [Google Scholar] [CrossRef]
107. Ye, J.; Lai, X.; Wong, G.K.W. The transfer effects of computational thinking: A systematic review with meta-analysis and qualitative synthesis. J. Comput. Assist. Learn. 2022, 38, 1620–1638. [Google Scholar] [CrossRef]
  108. Ye, H.; Liang, B.; Ng, O.L.; Chai, C.S. Integration of computational thinking in K-12 mathematics education: A systematic review on CT-based mathematics instruction and student learning. Int. J. STEM Educ. 2023, 10, 3. [Google Scholar] [CrossRef]
  109. Mat Ishah, N.S.; Lee, K.L.; Nawanir, G. Educational supply chain sustainability. Asian Educ. Dev. Stud. 2023, 12, 137–149. [Google Scholar] [CrossRef]
  110. Triyana, I.G.N.; Candiasa, I.M.; Sudatha, I.G.W.; Divayana, D.G.H. Revitalizing digital technology literacy in education: A systematic literature review and framework development. Synesis 2023, 15, 254–272. Available online: https://seer.ucp.br/seer/index.php/synesis/article/view/2798 (accessed on 23 May 2023).
  111. Zahrah, F.; Dwiputra, R. Digital Citizens: Efforts to Accelerate Digital Transformation. J. Studi Kebijak. Publik 2023, 2, 1–11. [Google Scholar] [CrossRef]
112. Vishnu, S.; Sathyan, A.R.; Sam, A.S.; Radhakrishnan, A.; Ragavan, S.O.; Kandathil, J.V.; Funk, C. Digital competence of higher education learners in the context of COVID-19 triggered online learning. Soc. Sci. Humanit. Open 2022, 6, 100320. [Google Scholar] [CrossRef]
  113. Miller, D.M.; Scott, C.E.; McTigue, E.M. Writing in the secondary-level disciplines: A systematic review of context, cognition and content. Educ. Psychol. Rev. 2016, 30, 83–120. [Google Scholar] [CrossRef]
  114. Scott, C.E.; McTigue, E.M.; Miller, D.M.; Washburn, E.K. The what, when, and how of preservice teachers and literacy across the disciplines: A systematic literature review of nearly 50 years of research. Teach. Teach. Educ. 2018, 73, 1–13. [Google Scholar] [CrossRef]
  115. Cooper, R.; Junginger, S.; Lockwood, T. Design thinking and design management: A research and practice perspective. Des. Manag. Rev. 2009, 20, 46–55. [Google Scholar] [CrossRef]
116. Alonso-García, S.; Victoria-Maldonado, J.J.; García-Sempere, P.J.; Lara-Lara, F. Student evaluation of teacher digital skills at Granada University. Front. Educ. 2023, 7, 1069245. [Google Scholar] [CrossRef]
  117. León-Nabal, B.; Zhang-Yu, C.; Lalueza, J.L. Uses of digital mediation in the school-families relationship during the COVID-19 lockdown. Front. Psychol. 2021, 12, 687400. [Google Scholar] [CrossRef] [PubMed]
118. Page, M.J.; McKenzie, J.E.; Bossuyt, P.M.; Boutron, I.; Hoffmann, T.C.; Mulrow, C.D.; Shamseer, L.; Tetzlaff, J.M.; Akl, E.A.; Brennan, S.E.; et al. Declaración PRISMA 2020: Una guía actualizada para la publicación de revisiones sistemáticas. Rev. Esp. Cardiol. 2021, 74, 790–799. [Google Scholar] [CrossRef]
  119. Colás-Bravo, P.; Conde-Jiménez, J.; Reyes-de-Cózar, S. Sustainability and Digital Teaching Competence in Higher Education. Sustainability 2021, 13, 12354. [Google Scholar] [CrossRef]
  120. De la Calle, A.M.; Pacheco-Costa, A.; Gómez-Ruiz, M.Á.; Guzmán-Simón, F. Understanding Teacher Digital Competence in the Framework of Social Sustainability: A Systematic Review. Sustainability 2021, 13, 13283. [Google Scholar] [CrossRef]
  121. Satalkina, L.; Steiner, G. Digital Entrepreneurship and its Role in Innovation Systems: A Systematic Literature Review as a Basis for Future Research Avenues for Sustainable Transitions. Sustainability 2020, 12, 2764. [Google Scholar] [CrossRef]
  122. Raković, L.; Marić, S.; Đorđević Milutinović, L.; Sakal, M.; Antić, S. What about the Chief Digital Officer? A Literature Review. Sustainability 2022, 14, 4696. [Google Scholar] [CrossRef]
  123. Treviño-Elizondo, B.L.; García-Reyes, H. An Employee Competency Development Maturity Model for Industry 4.0 Adoption. Sustainability 2023, 15, 11371. [Google Scholar] [CrossRef]
  124. Wang, Y.; Jiang, S.; Wu, C.; Cai, X.; Wang, F. Impact of the Global Megatrends, COVID-19, and Digital Economy on Professional Career Management Transformation in Asian Countries. Sustainability 2022, 14, 10981. [Google Scholar] [CrossRef]
Figure 1. Relationship between citing articles and systematic reviews that include the analysed interventions. Prince et al. (2016) [67]; Fernández-Montalvo et al. (2017) [68]; Gómez-Trigueros et al. (2018) [63]; Maureen et al. (2018) [69]; Guayara Cuéllar et al. (2019) [65]; Aydin et al. (2020) [70]; Benavente-Vera et al. (2020) [60]; Romero García et al. (2020) [71]; Camino et al. (2021) [72]; Chatwattana (2021) [73]; Ryhtä et al. (2021) [74]; Ugur et al. (2021) [75]; Basantes-Andrade et al. (2022) [57]; Choi et al. (2022) [76]; Fuentes-Cancell et al. (2022) [77]; Garcés et al. (2022) [78]; Javorcik (2022) [79]; Munawaroh et al. (2022) [80]; Nogueira et al. (2022) [81]; Yelubay et al. (2022) [82]; Calvopiña Herrera (2023) [83]; Dimitri et al. (2023) [84]; Gabarda Méndez et al. (2023) [85]; Pino (2023) [86]; Wang et al. (2023) [87]; Zhang et al. (2023) [88].
Figure 2. Systematic reviews related to the acquisition of digital competence published in recent years.
Figure 3. Search terms used for the selection and screening of articles to analyse.
Figure 4. Number of interventions published by country among those analysed.
Figure 5. PRISMA flow diagram, detailing the number of studies selected and excluded at each stage, along with the reasons for exclusion.
Figure 6. Number of selected interventions by year: before, during, and after COVID-19.
Figure 7. Development of the number of interventions providing information on the following indicators pre-, during, and post-COVID-19.
Figure 8. Cumulative number of participants in the selected samples each year: pre-, during, and post-COVID-19.
Table 1. A comprehensive analysis of the reviewed empirical teaching studies, addressing the participant samples, the groups involved, the methodological design, the constructs examined, and the competences developed. Each of these aspects is detailed in the report’s content.
Study | Sample | Groups | Design | Sampling | Construct and Competence | Professional Domain
Before COVID-19 lockdown
Prince et al. (2016). Sample: N = 20 (Nstudents = 10; Nteachers = 8). Groups: G = 20. Design: qualitative and descriptive approach with a case study. Sampling: students from in-person courses at universities. Construct and competence: digital competence and ICT appropriation. Professional domain: educational.
Fernández-Montalvo et al. (2017). Sample: N = 364 (NW = 158; NM = 206; mean age = 12). Groups: EG = 190; CG = 174. Design: quasi-experimental with repeated assessment measures; participants in each group were chosen randomly. Sampling: 6th-grade students. Construct and competence: digital literacy; digital identity. Professional domain: educational.
Gómez-Trigueros et al. (2018). Sample: N = 189 (NW = 166; NM = 23; age = 19–21). Groups: G = 189. Design: combined quasi-experimental and mixed-methods approach using descriptive statistical analysis. Sampling: students enrolled in the Master’s Degree program at the Faculty of Education, University of Alicante. Construct and competence: digital literacy. Professional domain: educational.
Maureen et al. (2018). Sample: N = 45 (NW = 25; NM = 20; mean age = 5). Groups: CG; EGstoryteller; EGdigitalstoryteller. Design: quasi-experimental design with three groups. Sampling: 5- and 6-year-old students from three kindergarten classes at a school in Indonesia. Construct and competence: linguistic competence; digital competence; reading and writing skills. Professional domain: educational.
Guayara Cuéllar et al. (2019). Sample: N = 100 (NW = 23; NM = 77; age = 25–55). Groups: G = 100. Design: exploratory and projective design with quantitative and qualitative description, in three phases: diagnosis, theoretical content, and design and implementation of the online course. Sampling: professors from the University of the Amazonia. Construct and competence: digital literacy. Professional domain: professional.
During COVID-19 lockdown
Aydin et al. (2020). Sample: N = 30 (NW = 20; NM = 10). Groups: G = 30. Design: quantitative research methods; non-controlled pre- and post-test model. Sampling: second-year students in the pre-service social science teacher program enrolled in the distance-education digital literacy course. Construct and competence: digital literacy. Professional domain: educational.
Benavente-Vera et al. (2020). Sample: N = 24. Groups: G = 24. Design: experimental design with an experimental group (EG) of basic education teachers receiving four pretest–posttest treatments. Sampling: teachers from an educational institution. Construct and competence: teacher digital competence. Professional domain: educational.
Romero García et al. (2020). Sample: N = 139 (NW = 31; NM = 108; mean age = 33). Groups: EG = 65; CG = 74. Design: quantitative approach with a quasi-experimental, non-equivalent-groups design aimed at evaluating the results of an intervention program. Sampling: students of the subject “Teaching Mathematics” in the Primary Education curriculum at the International University of La Rioja (UNIR). Construct and competence: academic performance; digital competence. Professional domain: educational.
Camino et al. (2021). Sample: N = 205 (mean age = 11.8). Groups: G = 205. Design: quasi-experimental design with longitudinal assessment using a pretest and a posttest after instruction. Sampling: elementary and secondary education students. Construct and competence: digital competence; technological fluency; digital knowledge; digital citizenship. Professional domain: educational.
Chatwattana (2021). Sample: N = 64. Groups: EG = 14; EG2 = 50. Design: ADDIE model, with phases of analysis, design, development, implementation, and evaluation. Sampling: 14 experts with advanced knowledge of digital tools; 5 instructors, 3 staff members, 32 second-year university students from Bangkok, and 10 members of the general public. Construct and competence: digital literacy. Professional domain: both.
Ryhtä et al. (2021). Sample: N = 11 (NW = 11; NM = 0; mean age = 42). Groups: G = 11. Design: quasi-experimental design with pre- and posttests, without a control group. Sampling: educators from universities participating in the TerOpe project. Construct and competence: digital competence. Professional domain: professional.
Ugur et al. (2021). Sample: N = 36 (NW = 34; NM = 2; age = 25–64). Groups: G = 36. Design: design-based research method following the Tech-PACK model, in three phases: (i) analysis of learning and teaching needs; (ii) planning for integration; (iii) analysis and post-instructional revisions. Sampling: primary, secondary, and higher education teachers from Spain, Turkey, Romania, and Italy. Construct and competence: digital competence. Professional domain: professional.
Post COVID-19 lockdown
Basantes-Andrade et al. (2022). Sample: N = 297 (NW = 102; NM = 195). Groups: G = 297. Design: quantitative descriptive-inferential research; comparative quasi-experimental design with pretest and posttest. Sampling: teachers from the faculties of the Technical University of the North, Ibarra, Ecuador. Construct and competence: digital competence. Professional domain: professional.
Choi et al. (2022). Sample: N = 42 (NW = 10; NM = 13; age = 60–79). Groups: EG = 23; CG = 19. Design: decision-tree design criteria based on a general understanding of the supervised learning algorithm. Sampling: adults aged 60 to 79 residing in the city of J in South Korea. Construct and competence: digital literacy. Professional domain: educational.
Fuentes-Cancell et al. (2022). Sample: N = 30 (NW = 16; NM = 14; mean age = 39). Groups: G1 = 15; G2 = 15. Design: experimental research with pretest, posttest, and intact groups. Sampling: teachers specialized in Technological Sciences and Bioinformatics from the National University of Cuba. Construct and competence: digital competence. Professional domain: professional.
Garcés et al. (2022). Sample: N = 30. Groups: G = 30. Design: descriptive research design with a qualitative approach. Sampling: teachers from the Manuel Wolf Herrera Basic Education School. Construct and competence: teacher digital competence. Professional domain: professional.
Javorcik (2022). Sample: N = 203 (NW = 122; NM = 81). Groups: G = 203. Design: pre–post test with online evaluation and self-learning. Sampling: students of the mandatory Digital Technology in Education course at the Pedagogical Faculty of the University of Ostrava, which includes microlearning as an integral component. Construct and competence: digital literacy. Professional domain: educational.
Munawaroh et al. (2022). Sample: N = 800 (NW = 500; NM = 300; mean age = 35). Groups: CG = 400; EG = 400. Design: quasi-experimental method with a pretest and posttest. Sampling: teachers who (1) have low digital competence; (2) teach grades 4 to 6 at the primary school level; (3) are between 25 and 60 years old; (4) hold at least an undergraduate degree; and (5) have experience in the teacher professional training program. Construct and competence: digital competence; digital literacy. Professional domain: professional.
Nogueira et al. (2022). Sample: N = 43 (NW = 28; NM = 15; mean age = 10). Groups: EG = 20; CG = 23. Design: non-randomized experimental study with longitudinal intervention. Sampling: 5th-grade students from two public schools in Pureza, Brazil. Construct and competence: digital literacy. Professional domain: educational.
Yelubay et al. (2022). Sample: N = 147 (NW = 100; NM = 47). Groups: CG = 87; EG = 60. Design: pretest–posttest with a control group. Sampling: third-year students in Psychology, Pedagogy, and Primary Education Methodology at the National University of Kazakhstan. Construct and competence: digital competence; motivation. Professional domain: educational.
Calvopiña Herrera (2023). Sample: N = 46 (Nstudents = 35; Nteachers = 11). Groups: G = 46. Design: quantitative, cross-sectional, correlational, and bibliographical approach. Sampling: students and teachers at the “Unidad Educativa Tarcila Albornoz de Gross”. Construct and competence: digital competence. Professional domain: educational.
Dimitri et al. (2023). Sample: N = 31. Groups: G = 31. Design: experimental research method. Sampling: healthcare professionals working with individuals with Growth Hormone Deficiency (GHD). Construct and competence: digital literacy. Professional domain: professional.
Gabarda Méndez et al. (2023). Sample: N = 102 (NW = 80; NM = 22). Groups: G = 102. Design: quasi-experimental design with six phases, including assessment through pre- and post-tests. Sampling: first-, second-, and third-year students of the Education Degree program at the University of Valencia. Construct and competence: digital literacy. Professional domain: educational.
Pino (2023). Sample: N = 50 (NW = 36; NM = 14). Groups: G = 50. Design: exploratory mixed-methods study with repeated measures. Sampling: 2nd-year students in the Primary Education degree program at a university affiliated with the Complutense University of Madrid. Construct and competence: digital literacy; motivation. Professional domain: educational.
Wang et al. (2023). Sample: N = 154 (NW = 78; NM = 76). Groups: EG = 77; CG = 77. Design: quasi-experimental research method. Sampling: 7th-grade students at Guangzhou Luoxi Xincheng School. Construct and competence: motivation; engagement; digital literacy; digital well-being. Professional domain: educational.
Zhang et al. (2023). Sample: N = 58 (NW = 32; NM = 26; mean age = 11.5). Groups: EG = 30; CG = 28. Design: experimental research method. Sampling: 5th-grade primary education students. Construct and competence: digital literacy. Professional domain: educational.
Note: The data extracted from each analysed study are included within the appropriate section or, if necessary, outside of it.
Table 2. Evaluation tools used in the educational implementation of the analysed studies, described in detail in the report, which presents a comprehensive comparison among the various instruments and their application.
Study | Direct Observations | Task Performance | Questionnaires, Self-Reports, Rating Scales, Semantic Differential | Portfolio (Physical/Virtual) | Participants | Satisfaction | Validation
Before COVID-19 lockdown
Prince et al. (2016). Direct observations: performance; limitations; inadequacy. Task performance: task performance; competence level; results; domain; adequacy. Questionnaires/self-reports: evaluation of the MOOC design; evaluation of digital competencies. Portfolio: records of observations, tasks, questionnaires, and interviews. Participants: teachers and students. Satisfaction: not specified. Validation: indicates reliability, validity, and standards with self-generated data.
Fernández-Montalvo et al. (2017). Direct observations: performance; limitations; difficulties. Task performance: task performance; competence level; results; difficulties; domain. Questionnaires/self-reports: assessment of digital literacy in its conceptual, procedural, and attitudinal aspects. Portfolio: records of tasks, questionnaires, and surveys. Participants: 6th-grade students; a psychologist and a pedagogue. Satisfaction: not specified. Validation: indicates reliability, validity, and standards with self-generated data.
Gómez-Trigueros et al. (2018). Direct observations: performance; adaptation; limitations; difficulties. Task performance: task performance; competence level; results; difficulties; domain; suitability. Questionnaires/self-reports: evaluation of geographical knowledge through tasks and practical work; assessment of the quality and efficiency of the MOOC; evaluation of digital competence. Portfolio: records of observations, tasks, and questionnaires. Participants: university students. Satisfaction: not specified. Validation: does NOT indicate reliability, validity, or standards with self-generated data.
Maureen et al. (2018). Direct observations: performance; difficulties. Task performance: task performance; competence level; results; difficulties; domain; errors. Questionnaires/self-reports: assessment of digital competence and of linguistic competence using 5-item rubrics, both before and after the intervention. Portfolio: records of observations; rubrics. Participants: preschool students. Satisfaction: not specified. Validation: indicates reliability, validity, and standards with self-generated data.
Guayara Cuéllar et al. (2019). Direct observations: performance; limitations; difficulties; interest. Task performance: task performance; competence level; results; difficulties; domain. Questionnaires/self-reports: self-assessment of digital competencies; resolution of two issues related to risks and cybercrimes. Portfolio: records of tasks and surveys. Participants: university professors. Satisfaction: not specified. Validation: does NOT indicate reliability, validity, or standards with self-generated data.
During COVID-19 lockdown
Aydin et al. (2020). Direct observations: performance; difficulties. Task performance: task performance; competence level; results; adequacy. Questionnaires/self-reports: assessment of the quality and efficiency of the MOOC; evaluation of digital competence before and after the course, measured as the sum of 8 factors (communication; rights and duties; critical thinking; participation; security; digital skills; ethics; commerce). Portfolio: records of questionnaires. Participants: university students. Satisfaction: not specified. Validation: indicates reliability, validity, and standards with self-generated data.
Benavente-Vera et al. (2020). Direct observations: performance. Task performance: task performance; competence level; results. Questionnaires/self-reports: assessment of digital competence before and after the course. Portfolio: records of the pre- and post-intervention surveys. Participants: teachers. Satisfaction: not specified. Validation: indicates reliability, validity, and standards with self-generated data.
Romero García et al. (2020). Direct observations: performance; behaviors; challenges; motivation; interest. Task performance: task performance; competence level; results; adequacy. Questionnaires/self-reports: assessment of digital competencies in 5 dimensions (information and information literacy; communication and collaboration; digital content creation; security; problem solving); comparison of academic performance between the two groups. Portfolio: records of tasks, questionnaires, and surveys. Participants: university students. Satisfaction: survey completed at the end by the participating university students. Validation: indicates reliability, validity, and standards with self-generated data.
Camino et al. (2021). Direct observations: performance; adaptation. Task performance: task performance; competence level; results; challenges; mastery. Questionnaires/self-reports: development of digital competencies; digital identity and autonomy; security and privacy. Portfolio: records of observations, tasks, and questionnaires. Participants: university students. Satisfaction: not specified. Validation: indicates reliability, validity, and standards with self-generated data.
Chatwattana (2021). Direct observations: performance; limitations. Task performance: competence level; results; mastery; suitability. Questionnaires/self-reports: quality and efficiency of the MOOC; perception and satisfaction regarding the suitability of the system; assessment of digital competence. Portfolio: records of test results and completed forms. Participants: education experts; instructors; university students. Satisfaction: survey completed at the end by the participating students. Validation: indicates reliability, validity, and standards with self-generated data.
Ryhtä et al. (2021). Direct observations: performance; challenges. Task performance: task performance; competence level; results; challenges. Questionnaires/self-reports: self-assessment of digital competence in 6 areas (professional commitment; digital resources; teaching and learning; assessment; student training; facilitation of student digital competence). Portfolio: records of questionnaires. Participants: university educators. Satisfaction: not specified. Validation: indicates reliability, validity, and standards with self-generated data.
Ugur et al. (2021). Direct observations: performance; self-perception. Task performance: task performance; competence level; results; mastery. Questionnaires/self-reports: needs analysis; self-regulation; digital literacy assessment. Portfolio: records of tasks, questionnaires, self-assessments, and activity rubrics. Participants: primary, secondary, and university teachers. Satisfaction: survey completed at the end by the participating teachers. Validation: indicates reliability, validity, and standards with self-generated data.
Post COVID-19 lockdown
Basantes-Andrade et al. (2022). Direct observations: performance; difficulties. Task performance: task performance; competence level; results; proficiency. Questionnaires/self-reports: assessment of digital competence in 6 areas (problem-solving; information retrieval; communication; security; content creation); satisfaction evaluation; assessment of the quality and efficiency of the NANO MOOC. Portfolio: records of questionnaires and surveys. Participants: university educators. Satisfaction: survey completed at the end of the course by the participating educators. Validation: indicates reliability, validity, and standards with self-generated data.
Choi et al. (2022). Direct observations: performance; behaviors; limitations; difficulties; motivation. Task performance: task performance; competence level; results; difficulties; proficiency. Questionnaires/self-reports: to assess and improve the digital skills of the elderly, digital competence is divided into two areas with subfactors: recognition (value, self-efficacy, and emotion) and behavior (self-regulation, participation, ethics, security, and critical reading); belief in capability (self-efficacy); satisfaction assessment. Portfolio: records of observations, tasks, questionnaires, and surveys. Participants: adults aged 60 to 79. Satisfaction: survey completed by the participants to assess satisfaction. Validation: indicates reliability, validity, and standards with self-generated data.
Fuentes-Cancell et al. (2022). Direct observations: performance; adaptation; limitations; inadequacy. Task performance: task performance; competence level; results; adequacy. Questionnaires/self-reports: assessment of the quality and efficiency of the MOOC; evaluation of progress in acquiring digital competencies across 6 factors (language; technology; interaction procedures; production and dissemination; ideology and values; aesthetics). Portfolio: records of questionnaires. Participants: university educators. Satisfaction: not specified. Validation: indicates reliability, validity, and standards with self-generated data.
Garcés et al. (2022). Direct observations: performance; satisfaction; behavior; adaptation; difficulties. Task performance: task performance; competence level; results; difficulties; domain. Questionnaires/self-reports: self-assessment of digital competences; assessment of independent tasks and collaborative work; summative assessment of progress; participant satisfaction survey. Portfolio: records of observations, tasks, questionnaires, and surveys. Participants: school teachers. Satisfaction: survey administered at the end to the teachers who participated in the intervention. Validation: does NOT indicate reliability, validity, or standards with self-generated data.
Javorcik (2022). Direct observations: performance; adaptation; limitations; challenges. Task performance: task performance; competence level; results; adequacy. Questionnaires/self-reports: self-assessment of digital skills through an 18-item questionnaire administered before and after the microlearning course; evaluation of the effectiveness of the microlearning course. Portfolio: records of observations, assignments, and questionnaires. Participants: university students. Satisfaction: not specified. Validation: indicates reliability, validity, and standards with self-generated data.
Munawaroh et al. (2022). Direct observations: performance; behaviors; difficulties. Task performance: task performance; competence level; outcomes; challenges; mastery. Questionnaires/self-reports: assessment of digital competence in three areas: conceptual, procedural, and attitudinal. Portfolio: records of questionnaires. Participants: elementary school teachers; educational psychologists; digital competence experts. Satisfaction: not specified. Validation: indicates reliability, validity, and standards with self-generated data.
Nogueira et al. (2022). Direct observations: performance; evidence; behaviors; difficulties; interest. Task performance: task performance; competence level; results; challenges; mastery; time. Questionnaires/self-reports: assessment of logical-mathematical knowledge before and after the course through an 8-question questionnaire; observation scales for formative task assessment; perception survey on daily use of digital tools. Portfolio: records of observations, tasks, questionnaires, and surveys. Participants: elementary school students and teachers. Satisfaction: not specified. Validation: indicates reliability, validity, and standards with self-generated data.
Yelubay et al. (2022). Direct observations: performance; adaptation; difficulties; motivation; cognition. Task performance: task performance; competence level; results; challenges; mastery. Questionnaires/self-reports: assessment of the development of digital competencies; motivational, technological, cognitive, and ethical components of competence. Portfolio: records of questionnaires. Participants: university students. Satisfaction: not specified. Validation: indicates reliability, validity, and standards with self-generated data.
Calvopiña Herrera (2023). Direct observations: performance; behaviors; inadequacy; motivation. Task performance: task performance; competence level; results; challenges; mastery. Questionnaires/self-reports: assessment of the development of digital competencies; motivational, technological, cognitive, and ethical components of competence. Portfolio: records of tasks and questionnaires. Participants: high school teachers and students. Satisfaction: survey completed by the participants to assess satisfaction. Validation: indicates reliability, validity, and standards with self-generated data.
Dimitri et al. (2023). Direct observations: performance; limitations; challenges. Task performance: task performance; competence level; results; challenges. Questionnaires/self-reports: initial and final assessment of digital literacy; components related to the application of digital competence in the healthcare field. Portfolio: records of tasks, questionnaires, and forums. Participants: healthcare professionals. Satisfaction: not specified. Validation: indicates reliability, validity, and standards with self-generated data.
Gabarda Méndez et al. (2023). Direct observations: performance; difficulties; belief in capability (self-efficacy). Task performance: task performance; competence level; results; mastery. Questionnaires/self-reports: self-assessment of digital competencies; self-assessment of the areas addressed in the intervention related to digital competence (security, collaboration, communication, content creation, and problem solving). Portfolio: records of tasks and questionnaires. Participants: university students. Satisfaction: not specified. Validation: indicates reliability, validity, and standards with self-generated data.
Pino (2023). Direct observations: performance; adaptation; difficulties; motivation. Task performance: task performance; competence level; challenges; mastery; engagement. Questionnaires/self-reports: joint qualitative and quantitative assessment of responses in the case study; self- and peer assessment of resources and ideas formulated during metacognitive discussions. Portfolio: records of observations and tasks. Participants: university students. Satisfaction: not specified. Validation: indicates reliability, validity, and standards with self-generated data.
Wang et al. (2023). Direct observations: performance; behaviors; inadequacy; motivation; interest; engagement; commitment. Task performance: task performance; competence level; challenges; mastery; engagement. Questionnaires/self-reports: diagnostic evaluation of digital well-being before and after the intervention; post-intervention questionnaires on motivation and commitment. Portfolio: records of observations, tasks, and questionnaires. Participants: 7th-grade students. Satisfaction: not specified. Validation: indicates reliability, validity, and standards with self-generated data.
Zhang et al. (2023). Direct observations: performance; behaviors; inadequacy; interest. Task performance: task performance; competence level; challenges; mastery. Questionnaires/self-reports: diagnostic and post-intervention assessment of digital literacy. Portfolio: records of tasks, questionnaires, and discussions. Participants: 5th-grade students. Satisfaction: not specified. Validation: indicates reliability, validity, and standards with self-generated data.
Note: The data extracted from each analysed study are included within the appropriate section or, if necessary, outside of it.
Table 3. Treatment fidelity refers to the consistency of implementing the educational approach. The report provides a comprehensive description of each control element and indicator, detailing their comparative application across the various studies, and includes the measurement factors and adjustment variables used in the pedagogical interventions of the analysed studies.
Study | Pre-Written Protocol | Comparable Instructor Training | Records | Relevance | Meetings | Feedback
Before COVID-19 lockdown
Prince et al. (2016). Pre-written protocol: detailed program script broken down into sessions and blocks, applied to the digital competence construct. Comparable instructor training: pre-MOOC design based on a needs diagnosis, surveys, meetings with industry experts, and a literature review. Records: continuous online portfolio of tasks, activities, case studies, and interviews. Relevance: curricular. Meetings: pre-MOOC development meetings and periodic interviews among stakeholders. Feedback: continuous feedback provided to students through interviews.
Fernández-Montalvo et al. (2017). Pre-written protocol: detailed program script broken down into sessions and blocks, applied to the digital competence construct. Comparable instructor training: prior study of the characteristics of internet use among young people, their patterns, and risk behaviors. Records: continuous online portfolio of tasks and activities. Relevance: curricular. Meetings: not specified. Feedback: continuous feedback provided to students.
Gómez-Trigueros et al. (2018). Pre-written protocol: the program script is not broken down; it is presented as an outline. Comparable instructor training: a literature review is conducted beforehand. Records: continuous online portfolio of tasks and activities. Relevance: curricular. Meetings: not specified. Feedback: not specified.
Maureen et al. (2018). Pre-written protocol: detailed program script broken down into sessions and blocks, applied to the literacy and digital literacy constructs. Comparable instructor training: a prior systematic review is conducted. Records: continuous online portfolio of tasks and activities. Relevance: curricular. Meetings: periodic meetings among the actors. Feedback: continuous feedback to the student.
Guayara Cuéllar et al. (2019). Pre-written protocol: the instructional program followed is not specified or detailed. Comparable instructor training: prior diagnostic evaluation and research focused on problem identification; workshops, activities, and assessments are carried out. Records: continuous online portfolio of tasks and surveys. Relevance: curricular. Meetings: not specified. Feedback: not specified.
During COVID-19 lockdown
Aydin et al. (2020) | Detailed program script, broken down into sessions and blocks, applied to the digital competence construct. | Prior training for the teachers, tutors, parents, etc. who implement the program. | Continuous online portfolio: tasks and activities. | Curricular and horizontal relevance. | Not specified. | Not specified.
Benavente-Vera et al. (2020) | Schematic program script. | Previous literature review to design the instruction. | Continuous online portfolio: experiments and treatments. | Curricular relevance. | Not specified. | Not specified.
Romero García et al. (2020) | Detailed program script, applied in sessions related to the digital competence construct. | Prior literature review for instructional design. | Continuous online portfolio: tasks and activities. | Curricular relevance. | Not specified. | Continuous feedback provided to students.
Camino et al. (2021) | Instructional program not detailed; focuses on variables, evaluation instruments, and results. | Prior literature review for instructional design. | Continuous online portfolio: tasks and activities. | Curricular relevance. | Not specified. | Not specified; only the instructor's role as a guide to the student is mentioned.
Chatwattana (2021) | Detailed program script, with phases for its creation and development. | Prior systematic review conducted before creating the MOOC to ensure its proper development. | Continuous online portfolio: tasks, activities, and final tests. | Curricular and horizontal relevance. | Not specified, as it is a self-paced online course. | Continuous feedback through the completion of activities, tests, or tasks.
Ryhtä et al. (2021) | Detailed program script, broken down into sessions and blocks, applied to the digital competence construct. | Systematic review and expert meetings for course development. | Continuous online portfolio: tasks and activities. | Curricular and horizontal relevance. | Periodic meetings among stakeholders. | Continuous feedback to the students.
Ugur et al. (2021) | Detailed program script, broken down into sessions and blocks, applied to the digital competence construct. | Prior literature review and expert meetings for course design. | Continuous online portfolio: tasks, activities, and module assessments. | Curricular relevance. | Pre-course meetings among experts for course design. | Continuous online feedback to the students.
Post COVID-19 lockdown
Basantes-Andrade et al. (2022) | Detailed program script, broken down into sessions and blocks, applied to the digital competence construct. | Literature review and evaluation for program design. | Continuous online portfolio: tasks and activities. | Curricular relevance. | Regular meetings among participants. | Continuous feedback to the students.
Choi et al. (2022) | Detailed program script, broken down into sessions and blocks, applied to the digital competence construct. | Prior training of instructors, supported by a literature review. | Continuous online portfolio: tasks, activities, and gamification. | Horizontal relevance. | Regular meetings among participants. | Not specified.
Fuentes-Cancell et al. (2022) | Detailed program script, broken down into sessions and blocks, applied to the digital competence construct. | Previous literature review for instructional design. | Continuous online portfolio: tasks and activities. | Curricular and horizontal relevance. | Periodic meetings among stakeholders. | Continuous feedback to the student.
Garcés et al. (2022) | Program script not broken down; presented in schematic form. | Previous literature review for instructional design. | Continuous online portfolio: tasks and activities. | Curricular relevance. | Not specified. | Not specified.
Javorcik (2022) | Program script not broken down; presented in schematic form. | Previous systematic review. | Continuous online portfolio: tasks and activities. | Curricular relevance. | Not specified. | Not specified.
Munawaroh et al. (2022) | Detailed program script, broken down into sessions and blocks, applied to the digital literacy construct. | Prior systematic review. | Continuous online portfolio: tasks and activities. | Curricular relevance. | Not specified. | Not specified.
Nogueira et al. (2022) | Program script not broken down; shown in a diagram. | Literature review and meetings of the researchers. | Continuous online portfolio: tasks, activities, and questionnaires from participants and families. | Curricular and horizontal relevance. | Periodic meetings among stakeholders. | Continuous feedback to students and families.
Yelubay et al. (2022) | Detailed program script, broken down into sessions and blocks, applied to the digital competence construct. | Prior literature review for instructional design. | Continuous online portfolio: tasks, activities, participation in discussions, and social media use. | Curricular and horizontal relevance. | Regular online meetings among participants. | Continuous feedback provided to students online.
Calvopiña Herrera (2023) | Detailed program script, broken down into sessions and blocks, applied to the digital competence construct. | Prior literature review for instructional design. | Continuous portfolio: tasks, activities, and final tests. | Curricular and horizontal relevance. | Periodic meetings among stakeholders. | Not specified.
Dimitri et al. (2023) | Detailed program script, broken down into sessions and blocks, applied to the digital literacy construct. | Prior literature review for instructional design. | Continuous portfolio: tasks, forums, and final tests. | Horizontal relevance. | Not specified. | Continuous feedback to the student.
Gabarda Méndez et al. (2023) | Detailed program script, broken down into sessions and blocks, applied to the digital competence construct. | Pre-intervention meetings among teachers and a literature review for instructional program design. | Continuous online portfolio: tasks and activities. | Curricular relevance. | Regular meetings among participants. | Continuous feedback to the student.
Pino (2023) | Detailed program script, broken down into sessions and blocks, applied to the digital competence construct. | Prior literature review for instructional design. | Continuous portfolio: tasks, activities, and participation in discussions. | Curricular and horizontal relevance. | Regular meetings among participants. | Not specified.
Wang et al. (2023) | Detailed program script, broken down into sessions and blocks, applied to the digital competence construct. | Systematic review and game design. | Continuous online portfolio: gamification. | Horizontal relevance. | Regular meetings among participants. | Continuous feedback to the students.
Zhang et al. (2023) | Detailed program script, broken down into sessions and blocks, applied to the digital literacy construct. | Pre-course literature review for course design. | Continuous portfolio: assignments, discussions, and final tests. | Curricular relevance. | Regular meetings among participants. | Not specified.
Note: The data extracted from the analysed studies are included within the appropriate section or, where necessary, outside of it.
Table 4. Instructional procedure: parameters and monitoring measures implemented throughout the interventions. Each control element and indicator is described, together with its comparative application across the studies.
Study | Materials | Instructor Role | Student Role | Grouping | Context | Duration | Results
Before COVID-19 lockdown
Prince et al. (2016) | MOOC and OER: CourseSites by Blackboard, lectures, video presentations, links to information and data repositories, activities, and reflective questions on ICT use. | Researcher-instructor. | Executor of each activity. | Small groups. | Online. | 3 weeks, divided into 3 work units. | Technology adoption is influenced by the use of technology in both social and academic contexts, impacting digital competence development; active participation in a MOOC is linked to students' interest and their level of technology adoption.
Fernández-Montalvo et al. (2017) | MOOC; ICT resources available at the centre. | Researcher-instructor. | Executor of each activity. | Large group. | Face-to-face. | 6 months; 3 sessions of two hours each. | Improvements in the experimental group (EG) were greater than in the control group (CG), both in ongoing assessments and in the final evaluation.
Gómez-Trigueros et al. (2018) | Video tutorial on MOOC use; Google Earth; virtual campus. | Researcher-instructor. | Executor of each activity. | Large group. | Both online and face-to-face. | 3 phases with 6 modules. | Participants demonstrated proper acquisition of geographic content, a significant improvement in digital competence, and appropriate use of Google Earth as an educational tool.
Maureen et al. (2018) | Stories and tales ("My Name", "My Birthday", "My Hobby"); songs. | Researcher-instructor. | Executor of each activity. | Small group. | Face-to-face. | 5 weeks with 3 sessions. | Digital storytelling produced a noticeable increase in children's literacy skills compared to the control group; digital storytelling activities had a more positive impact on digital literacy skills than traditional literacy activities.
Guayara Cuéllar et al. (2019) | Moodle, Educaplay; Adobe Captivate; games, problem-solving, videos. | Researcher-instructor. | Executor of each activity. | Large group. | Online. | 3 modules: (i) Cybercrimes; (ii) Internet Risks; (iii) Web 2.0 Tools. | The research improved and strengthened the participating teachers' digital competencies and their use of information and communication technologies.
During COVID-19 lockdown
Aydin et al. (2020) | Social networks; cloud services; open online courses; Web 2.0. | Researcher-instructor. | Executor of each activity. | Small group. | Online. | 8 weeks, 2 h per week. | A significant difference was found in students' digital skills measured before and after the online digital literacy course.
Benavente-Vera et al. (2020) | ICT available at the educational centre. | Researcher-instructor. | Executor of each activity. | Small group. | Both online and face-to-face. | Four treatments. | Treatment 3 produced the most significant changes and improvements, whereas treatment 1 showed fewer positive results.
Romero García et al. (2020) | Kahoot, Socrative, Perusall, Mindmeister; virtual classroom; PowerPoint. | Researcher-instructor. | Executor of each activity. | Small group. | Both online and face-to-face. | 12 sessions. | Significant improvements were found in the EG compared to the CG in all evaluated dimensions except dimension D4: Security.
Camino et al. (2021) | Centre's training technology. | Researcher-instructor. | Executor of each activity. | Large group. | Face-to-face. | 4 months; 4 one-hour sessions for the student group. | The educational project offers substantial advantages in acquiring digital competence and the three previously mentioned constructs.
Chatwattana (2021) | MOOC system with self-directed learning (SDL): lessons, exercises, information search, and online communication. | Researcher-instructor. | Executor of each activity. | Small group. | Online. | 3 courses: (i) Digital Circuit and Logic Design; (ii) Television and Video Control; (iii) Multimedia Technology and Animation. | The results indicate that the use of MOOCs is beneficial for fostering digital skills in students.
Ryhtä et al. (2021) | "Basics of Digital Pedagogics for Health Sciences, Social Services, and Rehabilitation Education" (BDE) course, delivered through Moodle. | Researcher-instructor. | Executor of each activity. | Small group. | Online. | Six weeks, February to April 2019. | Implementation of this course at all educational levels is suggested.
Ugur et al. (2021) | Open online course "Integration of ICT in Education"; explanatory videos in each unit. | Researcher-instructor. | Executor of each activity. | Small group. | Online. | 4 modules over 4 weeks: (i) Integration of ICT; (ii) Planning; (iii) Development for Integration; (iv) Instruction and Reflection. | The study's objectives were achieved, within its inherent limitations; rubric assessments and a satisfaction survey covering six specific aspects yielded highly positive conclusions, indicating an improvement in teachers' competencies in applying ICT.
Post COVID-19 lockdown
Basantes-Andrade et al. (2022) | NANO-MOOC; masterclasses, forums, online surveys; Moodle; videos. | Researcher-instructor. | Executor of each activity. | Large group. | Online. | 180 min per course; 3 phases. | Post-assessment results show that the trained teachers substantially improved their level of digital competence compared to the pre-assessment, highlighting the effectiveness of the NANO-MOOC as a training tool.
Choi et al. (2022) | LiveworkSheet; online exercise sheets and readings; video game. | Researcher-instructor. | Executor of each activity. | Small group. | Both online and face-to-face. | 10 sessions. | The program improved adults' digital skills, contributing to the prevention of mental and social problems.
Fuentes-Cancell et al. (2022) | MOOC; Facebook, Telegram, LinkedIn, and ResearchGate. | Researcher-instructor. | Executor of each activity. | Small group. | Online. | 6-8 months; 9 workshops. | The use of a MOOC and learning through social networks is effective for developing teachers' digital competence.
Garcés et al. (2022) | MOOC; Moodle, Padlet; Canva, Genially; YouTube; blog. | Researcher-instructor. | Executor of each activity. | Small group. | Online. | 17 sessions totalling 35 pedagogical hours; 6 topics. | Grades improved gradually throughout the course, rising from 5.93 to 8.47.
Javorcik (2022) | Moodle; presentations, articles, texts, and books; mobile applications. | Researcher-instructor. | Executor of each activity. | Large group. | Online. | 3 months; 5 chapters. | Microlearning courses proved effective for acquiring knowledge in an engaging way and for increasing participants' confidence in using ICT.
Munawaroh et al. (2022) | Centre's ICT facilities. | Researcher-instructor. | Executor of each activity. | Large group. | Face-to-face. | 6 months, with 4 monthly sessions of two hours each. | The intervention program significantly enhanced the teachers' digital skills.
Nogueira et al. (2022) | School's ICT resources. | Researcher-instructor. | Executor of each activity. | Small group. | Face-to-face. | 1 semester; 16 h; 8 classes of 2 h each. | Digital literacy improved over the semester regardless of the use of digital devices at home; the experimental group progressively improved its digital interaction and confidence in the digital environment, and the logic/mathematics assessment showed significant improvement.
Yelubay et al. (2022) | Moodle, Google, Twitter; MOOC; creation of puzzles, quizzes, surveys, and blogs. | Researcher-instructor. | Executor of each activity. | Small group. | Online. | 6 weeks. | The hypotheses were confirmed: the MOOC clearly improved results in the experimental group compared to the control group on the four types of items proposed.
Calvopiña Herrera (2023) | School's ICT resources; social media; Canva, videos. | Researcher-instructor. | Executor of each activity. | Small group. | Face-to-face. | 8 sessions of 180 min each. | Teacher training strengthened performance in digital competencies; the applied instructional program proved effective, with more positive results in the follow-up questionnaires.
Dimitri et al. (2023) | FutureLearn platform; videos and puzzles; forums. | Researcher-instructor. | Executor of each activity. | Small group. | Online. | 4 weeks, 2 h per week; the course ran twice. | The MOOC enhanced digital health literacy in the management of growth disorders, boosting digital proficiency and self-assurance among healthcare users and preparing them for upcoming technological advances in growth disorders and growth hormone therapy.
Gabarda Méndez et al. (2023) | Virtual classroom, blog; video game; video viewing. | Researcher-instructor. | Executor of each activity. | Large group. | Both online and face-to-face. | 1 academic year (2021-22); 6 phases. | The innovation project produced a substantial improvement in students' acquisition of digital competence.
Pino (2023) | Centre's ICT resources. | Researcher-instructor. | Executor of each activity. | Small group. | Face-to-face. | 12 weeks; 2 sessions per week of 1 h 50 min. | The designed course proved effective, as evidenced by the increased use of digital tools for problem-solving and explanation.
Wang et al. (2023) | Video game. | Researcher-instructor. | Executor of each activity. | Small group. | Face-to-face. | 4 days; 80 min per session, plus 20 min for pre- and post-tests (10 min each). | The experimental group showed notably higher levels of digital well-being literacy and of intrinsic and extrinsic motivation than the control group; no noteworthy differences in engagement were found between the two groups.
Zhang et al. (2023) | Centre's ICT resources; videos and readings; Moodle. | Researcher-instructor. | Executor of each activity. | Small group. | Face-to-face. | 10 sessions, 1 per week. | The DML course had a beneficial effect on students' civic participation but did not significantly affect their technical skills, critical understanding, or creative communication abilities; a positive correlation was found between teacher guidance and students' digital media literacy.
Note: The data extracted from the analysed studies are included within the appropriate section or, where necessary, outside of it.
Table 5. Limitations of the educational strategies addressed in the examined empirical studies. Each area is detailed extensively in the main report.
Study | Background | Participants | Program | Results | General
Before COVID-19 lockdown
Prince et al. (2016) | Outdated sources; lack of hypotheses or predictions. | Missing inclusion and exclusion criteria; non-random purposive sampling; small, non-representative sample. | No grouping; lack of curricular relevance; incomprehensible articulation. | Only post-comparison. | Not an experimental intervention study, only a pre-post group; missing key information for replication.
Fernández-Montalvo et al. (2017) | Outdated sources; lack of a research question; missing hypotheses or predictions. | | | Failure to analyse generalization effects. |
Gómez-Trigueros et al. (2018) | Outdated sources; lack of a research question; missing hypotheses or predictions. | Lack of inclusion and exclusion criteria. | No information about the duration or the number of sessions. | | Missing key information to replicate the intervention.
Maureen et al. (2018) | Outdated sources; lack of hypotheses or predictions. | Missing inclusion and exclusion criteria; small sample. | | Failure to analyse generalization effects. |
Guayara Cuéllar et al. (2019) | Outdated sources; missing theoretical framework; lack of a research question; no hypotheses or predictions. | Lack of inclusion and exclusion criteria; small, unrepresentative sample. | Lack of information regarding the duration and number of sessions; no grouping; lack of curricular relevance. | Failure to analyse each variable. | Not an experimental intervention study but a pre-post group analysis; lack of essential information for replication.
During COVID-19 lockdown
Aydin et al. (2020) | Lack of hypotheses or predictions. | | No grouping. | | Not an experimental intervention study, just a pre-post group.
Benavente-Vera et al. (2020) | Lack of a research question; lack of objectives. | Missing inclusion and exclusion criteria; non-random intentional sampling; small sample. | Lack of strategies; no information on duration, number of sessions, or who implemented the intervention; lack of curricular relevance. | Failure to analyse generalization effects. | Not an experimental intervention study but a pre-post group analysis; crucial information needed to replicate the intervention is not provided.
Romero García et al. (2020) | Outdated sources; lack of a research question; missing hypotheses or predictions. | Lack of inclusion and exclusion criteria; non-random intentional sampling; small, non-representative sample. | Instructional procedure not indicated; no information about duration. | Only post-comparison provided. | Key information needed to replicate the intervention is missing.
Camino et al. (2021) | Outdated sources; lack of hypotheses or predictions. | Missing inclusion and exclusion criteria. | No grouping. | | Not an experimental intervention study, only a pre-post group.
Chatwattana (2021) | Lack of a research question; no hypotheses or predictions. | Missing inclusion and exclusion criteria; non-random purposive sampling; small, unrepresentative sample. | No session count. | Comparison includes only post-test data. | Not an experimental intervention study, just a pre-post group comparison.
Ryhtä et al. (2021) | Lack of a theoretical framework; no hypotheses or predictions. | Missing inclusion and exclusion criteria; small, unrepresentative sample. | No session count; lack of grouping. | | Not an experimental intervention study, just a pre-post group comparison.
Ugur et al. (2021) | Outdated sources; lack of a theoretical framework; no hypotheses or predictions. | Missing inclusion and exclusion criteria; small, unrepresentative sample. | No grouping. | Does not analyse generalization effects. | Not an experimental intervention study, just a pre-post group comparison.
Post COVID-19 lockdown
Basantes-Andrade et al. (2022) | | Missing inclusion and exclusion criteria; non-random purposive sampling. | No information about the number of sessions; no grouping. | Does not analyse generalization effects. | Lacks ethical controls (informed consent to participate, confidentiality).
Choi et al. (2022) | Missing research question; lack of hypotheses or predictions. | Missing inclusion and exclusion criteria. | No duration information. | No analysis of generalization effects. | Missing key information for replication.
Fuentes-Cancell et al. (2022) | | | | |
Garcés et al. (2022) | No research question, objectives, or hypotheses. | Missing inclusion and exclusion criteria; non-random purposive sampling; small, unrepresentative sample. | Instructional procedure not indicated; lack of strategies; no grouping; lack of curricular relevance. | Variables not analysed individually; generalization effects not analysed. | Missing key information to replicate the intervention; no ethical controls (informed consent to participate, confidentiality, etc.).
Javorcik (2022) | Missing hypotheses or predictions. | Missing inclusion and exclusion criteria. | Lack of information on the number of sessions. | | Missing key information to replicate the intervention.
Munawaroh et al. (2022) | Missing hypotheses or predictions. | | | |
Nogueira et al. (2022) | | Lack of inclusion and exclusion criteria; non-random intentional sampling; small sample. | No strategies. | | Missing key information for intervention replication.
Yelubay et al. (2022) | Outdated sources. | Non-random intentional sampling. | No session count. | |
Calvopiña Herrera (2023) | | Lack of inclusion and exclusion criteria; non-random intentional sampling; small sample. | | | Not an experimental intervention study, only a pre-post group.
Dimitri et al. (2023) | Lack of a research question; lack of hypotheses or predictions. | Lack of inclusion and exclusion criteria; non-random intentional sampling; small, unrepresentative sample. | Lack of grouping; lack of curricular relevance. | | Missing key information to replicate the intervention; not an experimental intervention study, only a pre-post group.
Gabarda Méndez et al. (2023) | Lack of a research question; lack of hypotheses or predictions. | Lack of inclusion and exclusion criteria; non-random intentional sampling. | No indication of the number of sessions; lack of grouping. | No analysis of generalization effects. | Not an experimental intervention study, only a pre-post group.
Pino (2023) | Lack of a research question; lack of hypotheses or predictions. | Lack of inclusion and exclusion criteria; non-random intentional sampling; small sample. | Instructional procedure not indicated. | | Lack of essential information for intervention replication; not an experimental intervention study, only a pre-post group.
Wang et al. (2023) | Lack of objectives; lack of hypotheses or predictions. | | | |
Zhang et al. (2023) | Lack of hypotheses or predictions. | Lack of inclusion and exclusion criteria; non-random purposive sampling. | Instructional procedure not indicated. | | Missing key information for replication.
Note: The data extracted from the analysed studies are included within the appropriate section or, where necessary, outside of it.
