
Constructing a Digital Competence Evaluation Framework for In-Service Teachers’ Online Teaching

School of Education Science, Nanjing Normal University, Nanjing 210097, China
Author to whom correspondence should be addressed.
Sustainability 2022, 14(9), 5268;
Submission received: 11 March 2022 / Revised: 23 April 2022 / Accepted: 24 April 2022 / Published: 27 April 2022


The focus on online teaching and teachers’ digital competence (DC) has reached a new level following the emergence of COVID-19 and its dramatic influence on the educational industry, which requires teachers to be equipped with DC. However, there is no consensus on a measurement framework for teachers’ DC. Therefore, this study aimed to construct a reliable self-evaluation framework for in-service teachers’ DC during online teaching. Data from 1342 teachers with online teaching experience were obtained. The methods of data analysis included exploratory factor analysis (EFA), confirmatory factor analysis (CFA), and item analysis. The results demonstrated that the constructed evaluation framework was consistent with the collected data. CFA also confirmed a good model fit for the 10 factors of the teachers’ DC framework. In the resulting evaluation framework, the interacting constructs are technical knowledge (TK, four items), learner knowledge (LK, three items), pedagogical knowledge (PK, three items), ethical knowledge (EK, three items), learner technical knowledge (LTK, four items), learner pedagogical knowledge (LPK, four items), learner ethical knowledge (LEK, four items), technical pedagogical knowledge (TPK, three items), technical ethical knowledge (TEK, four items), and pedagogical ethical knowledge (PEK, three items), for a total of 35 items. The scale can serve as an effective instrument for measuring in-service teachers’ DC in online teaching.

1. Introduction

Education in many countries is currently developing in a digital direction. It is now common for teachers and students to teach and learn in a variety of online environments [1]. Online teaching has been used at almost all levels of education [2]. In addition, the focus on online teaching and teachers’ digital competence (DC) has reached a new level following the emergence of the COVID-19 pandemic and its dramatic influence on the educational industry. Teaching activities around the world have been completely disrupted by the pandemic, and the teaching environment has changed. Digital teaching management systems, digital collaboration platforms, social networks, and even telephones have become important tools for teachers to reorganize their teaching [3]. During online teaching, teachers need to use these information and communication tools effectively to ensure the smooth running of the teaching process, as well as to ensure that the relevance of the students’ knowledge and skills is assessed [4,5]. Therefore, teachers’ digital skills play an important role in responding to the COVID-19 crisis, and they are also an important aspect of online teaching quality and teacher skill evaluation [6]. Teachers’ DC is a prerequisite for good online teaching [7]. Teachers’ DC is strengthened through the effective integration of technology into teaching in different learning environments, such as blended or fully online environments [8,9]. Hjelsvold et al. [10] reported that some teachers regretted their lack of teaching competence when faced with online teaching, again highlighting the importance of teachers’ DC. The application of digital technologies, the digitization of teaching surroundings, and the growing popularity of online teaching all call for teachers to be competent in teaching online [11]. Therefore, it is necessary to construct an appropriate evaluation framework for teachers to effectively validate their DC and allow them to explore various elements of instruction and more easily adjust their online teaching.
However, there is currently no unified conceptual framework or measurement tool for teachers’ DC. There is also little consensus among researchers on what should be measured and how such measurement should be conducted [12]. Several frameworks have gained attention in recent years. One example is the European Framework for the Digital Competence of Educators (DigCompEdu) presented by the European Commission [13], but this tool primarily considers school leadership and schools as organizations, and does not take individual teachers into account [14]. In contrast, the TPACK framework [15] is more focused on the individual teacher and acknowledges the complexity of optimizing the contribution of digital technology in the classroom, but its original form has limited use in building a broader concept of teachers’ DC [16]. Since the publication of TPACK, the educational landscape has changed significantly, mainly due to new digital technological innovations. Taking these changes into account, Falloon [16] argued that teachers’ DC should go beyond simple digital–technical contact to include considerations such as online presence, ethics, and collaborating in and building knowledge in online environments. Further, some researchers believe that teachers’ DC cannot be narrowly focused on context-free, isolated technologies [17,18,19]. Scholars are calling for the research focus on teachers’ DC to be expanded to include individual and sociocultural factors [16,18,20,21,22]. Therefore, this study aimed to construct a more comprehensive teachers’ DC framework that considers these factors, to ensure that teachers understand and are equipped with this knowledge and these competencies, and that these are constantly updated and enhanced through professional learning throughout their careers.
Therefore, in order to make up for the shortcomings of the current research, this study aimed to construct a self-evaluation framework to assess in-service teachers’ DC when integrating digital technologies into practical online teaching. It is hoped that the development of the conceptual framework will contribute to the conceptualization and structuring of theories and will transform teachers’ teaching practices.
The structure of this article is as follows. Section 2 presents the literature review on teachers’ DC; this section not only positions the work of this study in relation to the published literature, but also identifies the gap between existing frameworks and the evaluation of teachers’ DC. Section 3 introduces the methodology adopted in this study: first the research design, including the preliminary design of the framework and the pretest; then the method and process of data collection; and finally the method used to measure the effectiveness of the proposed framework. Section 4 presents the main measurement results. Section 5 and Section 6 discuss the results in detail and summarize the contributions and limitations of this study.

2. Literature Review

2.1. Teachers’ Digital Competence

Digital competence (DC) is considered to be the integration of knowledge, skills, abilities, attitudes, strategies, and awareness that helps people perform tasks using digital media and Information and Communication Technology (ICT) [17]. The education system is seen as an important player in improving DC [23]. In particular, teachers’ DC has received attention. However, there is currently no unified or specific definition of teachers’ DC. Teachers’ DC is a complex concept that encompasses aspects of pedagogy, society, culture, and ethics [24,25]. Teachers’ DC is different from that of other individuals: teachers focus on how to effectively apply digital technology to various educational environments [21,26], such as the currently widespread practice of online teaching. Compared with traditional teaching, online teaching obviously places greater demands on teachers’ DC [27]. Li et al. [28] proposed that teachers’ DC is positively correlated with their online teaching behavior. Following From’s study [11], teachers’ DC in this study was defined as teachers’ ability to appropriately apply digital technologies in digital environments (i.e., online teaching). In particular, digital technologies refer to the variety of digital devices and software that teachers use in online teaching, such as computers, mobile phones, and social media applications.
Additionally, teachers with DC are often assumed merely to have the technical skills for selecting appropriate digital tools for the teaching environment and applying them in particular teaching units [29]. However, this view has been criticized for its narrow focus on skills and its failure to consider the different socio-cultural contexts in which digital technologies are used [17,30]. For example, in the use of digital technology, there tends to be a lack of consideration of ethical factors, students’ characteristics, and other aspects, with digital technology used in teaching only as a simple tool. Teachers’ DC does not mean the same as actual or frequent use of digital technologies while teaching [19]. Other researchers have pointed out the limitations of an overly technical perspective, which ignores broader considerations such as ethics, digital citizenship, and security [18]. Some studies have suggested abandoning the current emphasis on skills-focused DC, and instead call for broader DC frameworks that recognize the greater diversity of knowledge and competencies required by teachers [16,19,20]. Calvani et al. [31] highlighted that DC should be the interplay between three dimensions: the technical, ethical, and cognitive. Yong et al. [32] likewise insisted that to successfully implement technology in the classroom, three areas of knowledge should be included in teachers’ DC: technical proficiency, pedagogical competence, and social awareness. Thus, DC not only emphasizes technical skills, but is also sensitive to socio-cultural issues such as ethics, in order to adapt effectively to the digital environment [21]. This is quite a challenge for teachers, who not only need to apply digital technology more effectively in their teaching, but also have to consider more broadly the use of digital technology and its impact. Furthermore, Janssen et al. [19] argued that the concept of competence needs to be constantly revised to reflect changes in technology systems and usage, and to consider the evolving nature of technology. Therefore, this study aimed to reconstruct a framework for evaluating teachers’ digital competence in response to the rapidly changing educational environment.

2.2. Evaluating Teachers’ Digital Competence

As there is no unified definition of teachers’ DC, frameworks or models that describe competence dimensions have been used to measure teachers’ DC [15,33,34,35]. For example, the DigCompEdu framework proposes six competence areas required of educators [36], namely (1) professional involvement, (2) digital resources, (3) teaching and learning, (4) evaluation, (5) empowering students, and (6) promoting students’ DC. However, this tool does not focus on individual teachers, but primarily considers school leadership and schools as organizations [1]. The most representative framework of teachers’ DC is technological pedagogical content knowledge (TPACK), proposed by Koehler et al. [37]. The TPACK framework stems from Shulman’s PCK [38]. Specifically, TPCK is an extension of PCK, and TP(A)CK is derived from the interaction and intersection of these dimensions [39]. The TPACK framework has been used in some studies as the basis for measuring teachers’ DC [40,41,42,43,44,45]. However, the TPACK framework has been criticized for not taking teachers’ cognitive beliefs and values about teaching and learning into account [46]. Moreover, TPACK ignores other teaching factors, such as the learners’ knowledge of the teaching content [46,47,48]. A survey instrument for measuring TPACK tailored to teachers’ online teaching was developed [49]. However, the findings showed a strong correlation between pedagogical knowledge and content knowledge, which calls into question the distinctiveness of these domains, as they are difficult to distinguish; Archambault and Barnett suggested that this overlap limits the prediction and efficient development of new knowledge [50]. Therefore, based on previous experience, content knowledge was not included in the development of the framework for evaluating teachers’ DC for online teaching in this study. In addition, some improved versions of TPACK still have deficiencies. For example, the ICT-TPACK proposed by Angeli and Valanides [46] does not account for the interconnections between constructs [51], and it is difficult to measure clearly and easily [52]. In addition, several self-assessment instruments on DC have been developed for teachers [53,54,55], but they mainly focus on pre-service teachers.
Therefore, in order to make up for the shortcomings of the above research, this study further investigated the structure of teachers’ DC in online teaching with the aim of reconstructing the evaluation framework of in-service teachers’ DC in online teaching.

2.3. Research Purpose

Teachers’ DC is an important factor in the success of online teaching. According to the research on teachers’ DC measurement, it can be improved in two main ways.
Firstly, the attention to the knowledge of learners and their characteristics is insufficient. The knowledge of learner characteristics is considered to be the basis of teaching [15]. Shulman [56] proposed seven basic aspects of teachers’ competence, one of which was the knowledge of learners and their characteristics, but it was weakened in the process of developing TPCK. Teachers should consider the characteristics and needs of the student population when designing activities using digital technologies [57,58,59], for example, students’ readiness for online learning and perceived challenges [60], and learners’ cognitive load in online learning [61]. Online teaching requires teachers to have the ability and knowledge to support students [45]. Li et al. [28] believed that the perception of the learners’ characteristics can be regarded as an important contextual variable affecting teachers’ DC in online teaching. Importantly, being a teacher with DC means being able to help students develop their DC according to their characteristics [26,36]. However, the learners’ knowledge is not taken into account sufficiently in the teachers’ competence framework.
In addition, the measurement of ethical knowledge has been neglected. It is essential that teachers’ DC includes familiarity with ethical issues. Ethics are mentioned in many definitions of DC [19,21,31]; however, it is only a concept and lacks measurement practice. Asamoah [62] believed that ethical knowledge should be an important supplement to teachers’ TPACK, so that teachers can carry out hybrid or all-online teaching more effectively. The digital technology usage, pedagogical design, and guidance of students should follow ethical practices in online teaching. Furthermore, teachers should know what is ethically correct, and should try to avoid wrong perceptions or behaviors.
Therefore, this study constructed a model for evaluating in-service teachers’ DC for online teaching (see Figure 1). The proposed evaluation framework incorporates the interplay and intersection of four types of basic knowledge: learner knowledge (LK), technical knowledge (TK), pedagogical knowledge (PK), and ethical knowledge (EK). However, it goes beyond these four knowledge bases. The teachers’ DC framework further emphasizes the types of knowledge that are located at six key intersections: learner technical knowledge (LTK), learner pedagogical knowledge (LPK), learner ethical knowledge (LEK), technical pedagogical knowledge (TPK), technical ethical knowledge (TEK), and pedagogical ethical knowledge (PEK). Therefore, this study aimed to construct and test whether the 10-dimensional evaluation framework is an effective instrument for measuring teachers’ DC.

3. Methods

3.1. Research Design

3.1.1. Preliminary Development of the Framework for Evaluating Teachers’ Digital Competence

To address the gap in the existing research, a teachers’ digital competence evaluation framework for online teaching was developed in this study, including the following 10 dimensions: LK (5 items), TK (4 items), PK (7 items), EK (4 items), LTK (4 items), LPK (4 items), LEK (4 items), TPK (4 items), TEK (4 items), and PEK (4 items). As shown in Table 1, a total of 44 items were measured using a 5-point Likert-style questionnaire (5 = strongly agree, 4 = agree, 3 = neutral, 2 = disagree, 1 = strongly disagree).

3.1.2. Pretest and Formal Test

Three educational technology experts and two university professors were invited to examine the effectiveness of the 44 items. To enhance the objectivity and accuracy of the questionnaire, 20 volunteers filled in the questionnaire one by one. Then, the expression and presentation of the questionnaire were improved based on their feedback. Finally, six questions about personal background information (including gender, teaching duration, education background, teaching subjects, class, and school) were added to the scale (see Table 1) to construct a 50-item questionnaire.

3.2. Data Collection

In October 2021, questionnaires were distributed to teachers in 25 schools in the east of China via an online platform called Questionnaire Star. The teachers, who come from local primary and secondary schools, often have to resort to online teaching because of the COVID-19 pandemic. These in-service teachers with online teaching experience were randomly selected to fill out the questionnaire. This survey aimed to explore teachers’ perceptions of their own DC. Respondents were required to answer all 50 items on the questionnaire before they could submit it. A statement about the absence of commercial or other uses of the information was provided in the first part of the questionnaire. Respondents were assured that their information was confidential and that they were allowed to withdraw from the process at any time if they wanted to. Moreover, the purpose, duration, and anonymity of the survey were explained specifically to ensure the validity and authenticity of the respondents’ answers. All participants agreed to participate in the study. A total of 1450 questionnaires were collected. Questionnaires completed in less than 120 s, or giving identical answers to all items, were deemed invalid. Finally, 108 invalid questionnaires were eliminated, and the remaining 1342 valid questionnaires were used in this study.
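The screening rule described above (a minimum completion time plus a straight-lining check) can be sketched in a few lines of Python; the record layout and field names below are hypothetical, not taken from the study:

```python
# Screen raw questionnaire submissions: drop responses completed in under
# 120 seconds or giving the same answer to every Likert item.
# The "duration"/"answers" field names are illustrative assumptions.

def screen_responses(responses, min_seconds=120):
    valid = []
    for r in responses:
        too_fast = r["duration"] < min_seconds
        straight_lined = len(set(r["answers"])) == 1  # identical answer to all items
        if not (too_fast or straight_lined):
            valid.append(r)
    return valid

raw = [
    {"duration": 300, "answers": [5, 4, 3, 4, 5]},   # kept
    {"duration": 90,  "answers": [5, 4, 3, 4, 5]},   # too fast
    {"duration": 400, "answers": [3, 3, 3, 3, 3]},   # straight-lined
]
print(len(screen_responses(raw)))  # 1
```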

3.3. Measurement Tools

To construct an evaluation framework of teachers’ DC and to confirm its validity, exploratory factor analysis (EFA), confirmatory factor analysis (CFA), and item analysis were carried out. Firstly, the 1342 samples were randomly assigned to two groups, with 671 samples in each. EFA was used to analyze the first group of samples. After EFA was performed, as Vogel et al. [64] suggested, the framework was refined by utilizing the method of principal component analysis (PCA). CFA was then used to confirm the results using the second group of 671 samples. Lastly, all samples were analyzed to test the differentiation and suitability of the items. SPSS 22.0 and AMOS 24.0 were applied to analyze the collected data.
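The random half-split used for the EFA/CFA cross-validation might look like the following sketch; the paper does not specify its exact randomization procedure, so the seed and method here are assumptions:

```python
import numpy as np

# Randomly split the 1342 valid responses into two equal halves:
# one subsample for EFA, the other for CFA.
rng = np.random.default_rng(42)  # fixed seed for reproducibility
indices = rng.permutation(1342)
efa_idx, cfa_idx = indices[:671], indices[671:]
print(len(efa_idx), len(cfa_idx))  # 671 671
```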

4. Results

4.1. Exploratory Factor Analysis

SPSS 22.0 was used to conduct the EFA, and the maximum variance (varimax) method was adopted for factor rotation. The KMO value of the questionnaire was 0.97, and strong relevance between variables was demonstrated by Bartlett’s sphericity test (χ² = 94,819.322; p < 0.001), showing the appropriateness of EFA [65].
The PCA method was adopted to examine the validity of the measured framework dimensions by extracting factors, and 10 factors were obtained. The preliminary analysis found correlations between the factors, suggesting that oblique rotation would be more appropriate. However, Kieffer [66] suggested that both rotation strategies should be used in exploratory factor analysis to achieve a replicable analysis. In general, if the results of orthogonal and oblique rotation show no significant difference, the results obtained by the orthogonal rotation method can be applied directly. Thus, both methods were adopted for the EFA: oblique rotation and maximum variance (varimax) orthogonal rotation. The results of the two methods were generally parallel, so the interpretability of the factors was determined by the varimax rotation method. Table 2 presents the component transformation matrix. According to Fabrigar et al. [67], the standardized factor loadings of all the factors (above 0.5) in Table 3 demonstrate that the items have good explanatory power.
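As an illustration of the sphericity check that licenses EFA, Bartlett’s test can be computed directly from the correlation matrix. This is a generic sketch run on simulated data, not the study’s dataset:

```python
import numpy as np
from scipy import stats

def bartlett_sphericity(data):
    """Bartlett's test of sphericity: H0 is that the correlation matrix
    is an identity matrix (items uncorrelated, so EFA would be unsuitable)."""
    n, p = data.shape
    R = np.corrcoef(data, rowvar=False)
    chi2 = -(n - 1 - (2 * p + 5) / 6) * np.log(np.linalg.det(R))
    df = p * (p - 1) / 2
    p_value = stats.chi2.sf(chi2, df)
    return chi2, df, p_value

# Simulate 671 respondents answering four items driven by one latent trait,
# so the items are strongly correlated and sphericity should be rejected.
rng = np.random.default_rng(0)
latent = rng.normal(size=(671, 1))
items = latent + 0.5 * rng.normal(size=(671, 4))
chi2, df, p = bartlett_sphericity(items)
print(p < 0.001)  # True: correlated items, EFA appropriate
```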

4.2. Confirmatory Factor Analysis

The validity, convergence, and distinctiveness of the evaluation model were determined by first-order CFA. CFA was applied to investigate the relationships between each factor, and then we constructed the evaluation framework of teachers’ DC.

4.2.1. Fitting Validity Analysis for the Framework

As shown in Figure 2, first-order CFA was conducted. According to Hair et al. [68], items that do not meet the standard loading (less than 0.5) must be eliminated. Absolute and relative fit indexes were applied to verify the framework fit. The chi-square/df in this research was 3.651, and the RMSEA was 0.044 (<0.08) [69]. In addition, the goodness-of-fit index (GFI) and the adjusted goodness-of-fit index (AGFI) were 0.923 and 0.906, respectively, which met the reference standard proposed by Foster et al. [70]. Moreover, according to Hair et al. [66], the normed fit index (NFI), comparative fit index (CFI), incremental fit index (IFI), and relative fit index (RFI) all exceeded 0.9 (reported values of 0.975, 0.982, and 0.972). In addition, the values of the parsimonious normed fit index (PNFI) and the parsimonious goodness-of-fit index (PGFI) were more than 0.5. Therefore, these results indicated the good fitting validity of the framework (see Table 4).
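For reference, one common formula relates RMSEA to the normed chi-square and sample size (whether the authors used this exact variant is an assumption); with the reported χ²/df of 3.651 and the full sample of 1342, it yields a value near the reported 0.044:

```python
import math

def rmsea(chi2_over_df, n):
    """Steiger's RMSEA from the normed chi-square and sample size:
    RMSEA = sqrt(max((chi2/df - 1) / (N - 1), 0))."""
    return math.sqrt(max((chi2_over_df - 1) / (n - 1), 0.0))

print(round(rmsea(3.651, 1342), 3))  # 0.044, under the 0.08 cutoff
```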
Lastly, LK4, LK5, PK4, PK5, PK6, PK7, EK3, TPK4, and PEK3 were eliminated as they did not meet the criteria. The remaining 35 items were accepted and underwent further analysis, specifically: LK (three items), TK (four items), PK (three items), EK (three items), LTK (four items), LPK (four items), LEK (four items), TPK (three items), TEK (four items), and PEK (three items).

4.2.2. Convergence Validity Analysis for the Framework

The results of the CFA are shown in Table 5. Composite reliability (CR) and average variance extracted (AVE) were used to test the construct validity of the framework. According to Hair et al. [66], the CR value of all constructs should be more than 0.7; thus, the CR of the 35 remaining items is considered good. Moreover, Fornell and Larcker [71] pointed out that if the AVE is higher than 0.5, the framework shows good convergence validity. Therefore, the results in Table 5 show that this evaluation framework has high validity and is reasonable.
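The CR and AVE statistics referenced here are simple functions of the standardized factor loadings; a minimal sketch, using illustrative loadings rather than the paper’s:

```python
def composite_reliability(loadings):
    """CR = (Σλ)² / ((Σλ)² + Σ(1 − λ²)) for standardized loadings λ."""
    s = sum(loadings)
    error_var = sum(1 - l ** 2 for l in loadings)
    return s ** 2 / (s ** 2 + error_var)

def average_variance_extracted(loadings):
    """AVE = Σλ² / n, the mean squared standardized loading."""
    return sum(l ** 2 for l in loadings) / len(loadings)

loadings = [0.92, 0.95, 0.93, 0.94]  # illustrative four-item factor
print(composite_reliability(loadings) > 0.7)       # True
print(average_variance_extracted(loadings) > 0.5)  # True
```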

4.2.3. Reliability Analysis of the Scale

The Cronbach’s alpha value [72] and composite reliability were applied to evaluate the reliability of the scale items, which has been used in similar studies [73,74]. Due to the exclusion of LK4, LK5, PK4, PK5, PK6, PK7, EK3, TPK4, and PEK3 by exploratory factor analysis and confirmatory factor analysis, the scale was adjusted. Therefore, the results showed that the adjusted scale had good reliability (α = 0.974). Specifically, the Cronbach’s alpha values of the 10 factors were 0.973 (LK), 0.956 (TK), 0.961 (PK), 0.953 (EK), 0.976 (LTK), 0.979 (LPK), 0.946 (LEK), 0.961 (TPK), 0.976 (TEK), and 0.971 (PEK). As shown in Table 5, the composite reliabilities of the 10 factors were 0.9735 (LK), 0.9571 (TK), 0.9612 (PK), 0.9580 (EK), 0.9761 (LTK), 0.9794 (LPK), 0.9567 (LEK), 0.8955 (TPK), 0.9763 (TEK), and 0.9195 (PEK), indicating their good reliability [75].
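Cronbach’s alpha can be computed directly from the item-score matrix; the sketch below uses simulated data for illustration, not the study’s responses:

```python
import numpy as np

def cronbach_alpha(items):
    """alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    items = np.asarray(items, dtype=float)  # shape (n_respondents, k_items)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

# Simulate four items driven by one trait, so internal consistency is high.
rng = np.random.default_rng(1)
trait = rng.normal(size=(200, 1))
scale = trait + 0.3 * rng.normal(size=(200, 4))
print(cronbach_alpha(scale) > 0.9)  # True
```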

4.2.4. Discriminant Validity Analysis for the Framework

Discriminant validity of the framework could be ensured by testing the correlation matrix among dimensions [76]. Schumacker and Lomax [77] proposed that in the structural discriminant validity analysis of tools, the AVE square root of all the factors must be more than the absolute value of the Pearson correlation coefficient between two factors in order to be recognized as having discriminant validity. Therefore, as shown in Table 6, the results of the structural discriminant validity analysis indicated that this framework had good discriminant validity.
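The Fornell–Larcker-style criterion described above reduces to a pairwise comparison of √AVE against the inter-factor correlations; a sketch with illustrative values, not the paper’s data:

```python
import math

def fornell_larcker_ok(ave, corr):
    """Check that sqrt(AVE) of each factor exceeds the absolute value of its
    correlation with every other factor (discriminant validity criterion)."""
    k = len(ave)
    for i in range(k):
        for j in range(k):
            if i != j and math.sqrt(ave[i]) <= abs(corr[i][j]):
                return False
    return True

ave = [0.85, 0.80, 0.78]                # illustrative AVEs for three factors
corr = [[1.00, 0.60, 0.55],             # illustrative correlation matrix
        [0.60, 1.00, 0.58],
        [0.55, 0.58, 1.00]]
print(fornell_larcker_ok(ave, corr))    # True
```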
In conclusion, the first-order CFA demonstrated that this evaluation framework shows good convergence validity and discriminant validity; it is thus sound to claim that it has good construct validity. Therefore, this evaluation framework is suitable for data analysis.

4.3. Item Analysis

To verify the suitability and distinctiveness of the constructed items, item analysis was used in this study. Two aspects were examined in the item analysis, namely the decision values (critical ratios) and the correlation coefficients between each item and the total score of its dimension. The independent-samples t test was applied to compare the low-group and high-group scores on each item. Following Aridag and Yüksel’s recommendation [78], the bottom 27% and top 27% of the 1342 samples were identified as the low and high groups, respectively. Further, items whose dimensional Pearson correlation coefficients or standardized factor loadings did not reach the standard values (0.4 and 0.45, respectively) were eliminated [79]. Finally, for the remaining 35 items, the decision values were higher than 0.3, and the item-total correlation coefficients were higher than 0.4. Overall, the item analysis results showed that the remaining 35 items met the standard.
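The extreme-group (27%) comparison can be sketched as follows, using simulated scores rather than the study’s data; the use of Welch’s t test here is an assumption:

```python
import numpy as np
from scipy import stats

def extreme_group_t(item_scores, total_scores, tail=0.27):
    """Critical-ratio item analysis: compare an item's mean in the bottom
    27% of respondents (by total score) against the top 27%."""
    order = np.argsort(total_scores)
    k = int(len(order) * tail)
    low, high = item_scores[order[:k]], item_scores[order[-k:]]
    return stats.ttest_ind(high, low, equal_var=False)  # Welch's t test

# Simulate a discriminating item: correlated with the total score.
rng = np.random.default_rng(2)
total = rng.normal(size=500)
item = total + 0.5 * rng.normal(size=500)
t, p = extreme_group_t(item, total)
print(p < 0.001)  # True: the high group scores significantly higher
```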

5. Discussion

Teachers’ DC is a prerequisite for good online teaching [7]. This study addresses two important questions: what kind of DC in-service teachers need to be equipped with, and how to measure it in the age of online teaching. The construction of the teachers’ DC evaluation framework therefore not only supports the evaluation of teachers’ online teaching quality, but also helps improve teachers’ teaching practices through evaluation. Practical and useful advice was provided to teachers in terms of TK, PK, LK, EK, LTK, LPK, LEK, TPK, TEK, and PEK. For example, learner technical knowledge indicates that teachers should use technology to communicate well with students about their learning and psychology, and to promote students’ learning progress and physical and mental development. Furthermore, according to technical ethical knowledge, teachers should consider students’ emotional values and their subjective feelings about the technology while using it, such as acceptance. Teachers are also reminded to treat technology rationally and not to over-rely on it or abuse it.
The teachers’ DC evaluation framework constructed in this study differs from previous evaluation frameworks such as TPACK. In addition to teachers’ technical and pedagogical knowledge and competences, which are often considered, the framework in this study focuses more on socio-cultural factors to prevent a narrow view of digital technology in teachers’ DC. Specifically, the knowledge of ethics and student development was introduced into the teachers’ DC. Among them, teachers’ ethical knowledge is very important in online teaching practice. Teachers should be equipped to operate ethically in an increasingly digital teaching environment. The importance of teachers’ ethical knowledge has also been highlighted in several previous studies about teachers’ DC [16,20,62]. In addition, this study also emphasizes that learner knowledge should be included in teachers’ DC. Learner knowledge refers to the competence of teachers to provide adaptive learning experiences by adjusting teaching methods according to learners’ different characteristics and needs. Hsu [80] and Nielsen and Kreiner [81] discussed in prior studies the necessity for teachers to be equipped with knowledge of learners and their learning effectiveness. In fact, the use of digital technology makes learning more accessible and promotes personalized educational content tailored to students [36]. Especially in online teaching, which is often a long-distance interaction between instructors and learners, there is more focus on how teachers cross the boundaries of time and space to provide quality teaching for students. Therefore, teachers should be equipped with knowledge about learner development. However, content knowledge, one of the elements of TPACK, was not included in the framework of this study. Rubio et al. [44] pointed out that greater self-knowledge of pedagogical or technological content is more conducive to developing teachers’ DC than knowledge of subject content. According to the research of Archambault and Crippen [49], content knowledge and pedagogical knowledge in online teaching are strongly correlated, resulting in a lack of domain distinctiveness and difficulty distinguishing them, especially between content knowledge and pedagogical content knowledge. Although Shulman [38] made a theoretical distinction between these two concepts, the results of empirical studies are often contradictory [82,83]. Therefore, to ensure that each domain of the framework has good distinctiveness and can better measure knowledge and abilities, content knowledge was not included in the framework of this study. In conclusion, the framework of teachers’ DC evaluation constructed in this study is a clear addition to and modification of the previous knowledge field.

6. Conclusions

This study further investigated the potential factors of teachers’ DC to construct a framework for evaluating in-service teachers’ DC for online teaching, and to test its effectiveness. The EFA, CFA, and item analysis methods were used in this study to construct a 10-factor teachers’ DC evaluation framework. The results indicate that the 10-factor framework constructed in this study has good reliability and validity.

6.1. Implications

Major implications of this study are twofold. On the one hand, a 10-factor evaluation framework was constructed in this study for in-service teachers’ DC in online teaching. Socio-cultural factors are emphasized in this framework; therefore, it provides a broader and richer technological framework. Importantly, this framework can be applied as a useful instrument to support teachers’ DC evaluation, and to offer teachers a reference to further adjust and enhance their online teaching practice. Specifically, these framework indicators clearly show the various types of knowledge and abilities that teachers should master in the era of online teaching. In the teaching process, teachers have the responsibility to review their own teaching practices at any time, and to use appropriate measures to reinforce the knowledge they lack and to enhance their professional abilities. In this way, teachers improve their teaching efficiency and provide a better teaching experience to students. On the other hand, the development of this conceptual framework also contributes to the conceptualization and structuring of theories of teachers’ DC. It not only develops the concept of teachers’ DC, but also adds a new perspective to explore the structural elements of teachers’ DC and provides support for the sustainable development of teacher digital competence.

6.2. Limitations and Future Work

There are two main limitations of this study. First, this research aimed to provide teachers with pertinent information about the kinds of DC they need in order to improve their online teaching effectiveness. However, as the rapid development of digital technology creates more innovative forms of online learning and teaching, teachers should continue to broaden their knowledge repertoire, and future research should explore additional knowledge domains that enable teachers to cope effectively and efficiently with digital teaching environments. Second, this evaluation framework of teachers' DC is still at the theoretical research stage and has not yet been put into practice. Therefore, further research should apply the framework in real teaching settings and improve its applicability and usability according to practical feedback.

Author Contributions

Conceptualization, J.X.; methodology, J.G.; writing—original draft preparation, L.T. and J.X.; writing—review and editing, J.X. All authors have read and agreed to the published version of the manuscript.

Funding

This research was supported by the Philosophy and Social Science Foundation of Jiangsu Province (CN) (No. 21ZXB007).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Not applicable.

Acknowledgments

We would like to acknowledge all the people who have helped us with this study. We are grateful for their contributions.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Viberg, O.; Khalil, M.; Baars, M. Self-Regulated Learning and Learning Analytics in Online Learning Environments. In Proceedings of the Tenth International Conference on Learning Analytics & Knowledge, Frankfurt, Germany, 23–27 March 2020; pp. 524–533. [Google Scholar] [CrossRef] [Green Version]
  2. Martin, F.; Sun, T.; Westine, C.D. A systematic review of research on online teaching and learning from 2009 to 2018. Comput. Educ. 2020, 159, 104009. [Google Scholar] [CrossRef] [PubMed]
  3. Jacques, S.; Ouahabi, A. Chapter 4: Distance Learning in Higher Education in France during the COVID-19 Pandemic; European Liberal Forum: Brussels, Belgium, 2021; pp. 45–58. Available online: (accessed on 19 April 2022).
  4. Tomaževič, N.; Ravšelj, D.; Aristovnik, A. Higher Education Policies for Developing Digital Skills to Respond to the COVID-19 Crisis: European and Global Perspectives; European Liberal Forum: Brussels, Belgium, 2021; Available online: (accessed on 19 April 2022).
  5. Jacques, S.; Ouahabi, A.; Lequeu, T. Synchronous E-Learning in Higher Education during the COVID-19 Pandemic. In Proceedings of the 2021 IEEE Global Engineering Education Conference (EDUCON), Vienna, Austria, 21–23 April 2021; pp. 1102–1109. [Google Scholar] [CrossRef]
  6. Jacques, S.; Ouahabi, A.; Lequeu, T. Remote Knowledge Acquisition and Assessment during the COVID-19 Pandemic. Int. J. Eng. Pedagog. 2020, 10, 120–138. [Google Scholar] [CrossRef]
  7. Damşa, C.; Langford, M.; Uehara, D. Teachers’ agency and online education in times of crisis. Comput. Hum. Behav. 2021, 121, 106793. [Google Scholar] [CrossRef]
  8. Ge, W.S.; Han, S.B. A Standard Framework for Teachers’ Teaching Competence in the Digital Age. Mod. Distance Educ. Res. 2017, 145, 59–67. [Google Scholar] [CrossRef]
  9. König, J.; Jäger-Biela, D.J.; Glutsch, N. Adapting to online teaching during COVID-19 school closure: Teacher education and teacher competence effects among early career teachers in Germany. Eur. J. Teach. Educ. 2020, 43, 608–622. [Google Scholar] [CrossRef]
  10. Hjelsvold, R.; Nykvist, S.S.; Lorås, M.; Bahmani, A.; Krokan, A. Educators’ Experiences Online: How COVID-19 Encouraged Pedagogical Change in CS Education. Norsk IKT-Konferanse for Forskning Og Utdanning. 2020. Available online: (accessed on 25 November 2021).
  11. From, J. Pedagogical digital competence-between values, knowledge and skills. High. Educ. Stud. 2017, 7, 43–50. [Google Scholar] [CrossRef] [Green Version]
  12. Covello, S. A Review of Digital Literacy Assessment Instruments. 2010. Available online: (accessed on 25 November 2021).
  13. European Commission. Digital Competence Framework for Educators (DigCompEdu). 2017. Available online: (accessed on 25 November 2021).
  14. Viberg, O.; Mavroudi, A.; Khalil, M.; Balter, O. Validating an Instrument to Measure Teachers’ Preparedness to Use Digital Technology in their Teaching. Nord. J. Digit. Lit. 2020, 15, 38–54. [Google Scholar] [CrossRef]
  15. Mishra, P.; Koehler, M.J. Technological pedagogical content knowledge: A framework for teacher knowledge. Teach. Coll. Rec. 2006, 108, 1017–1054. [Google Scholar] [CrossRef]
  16. Falloon, G. From digital literacy to digital competence: The teacher digital competency (TDC) framework. Educ. Technol. Res. Dev. 2020, 68, 2449–2472. [Google Scholar] [CrossRef] [Green Version]
  17. Ferrari, A. Digital Competence in Practice: An Analysis of Frameworks; Joint Research Centre of the European Commission: Sint Maartensvlotbrug, The Netherlands, 2012; Volume 91. [Google Scholar] [CrossRef]
  18. Foulger, T.; Graziano, K.; Schmidt-Crawford, D.; Slykhuis, D. Teacher educator digital competencies. J. Technol. Teach. Educ. 2017, 25, 413–448. [Google Scholar]
  19. Janssen, J.; Stoyanov, S.; Ferrari, A.; Punie, Y.; Pannekeet, K.; Sloep, P. Experts’ views on digital competence: Commonalities and differences. Comput. Educ. 2013, 68, 473–481. [Google Scholar] [CrossRef]
  20. Ryhtä, I.; Elonen, I.; Saaranen, T.; Sormunen, M.; Mikkonen, K.; Kääriäinen, M.; Salminen, L. Social and health care educators’ perceptions of competence in digital pedagogy: A qualitative descriptive study. Nurse Educ. Today 2020, 92, 104521. [Google Scholar] [CrossRef] [PubMed]
  21. Røkenes, F.M.; Krumsvik, R.J. Development of student teachers’ digital competence in teacher education. Nord. J. Digit. Lit. 2014, 9, 250–280. [Google Scholar] [CrossRef]
  22. Salas-Pilco, S. Evolution of the framework for 21st century competencies. Knowl. Manag. E-Learn. 2013, 5, 10–24. [Google Scholar] [CrossRef]
  23. Ilomäki, L.; Paavola, S.; Lakkala, M.; Kantosalo, A. Digital competence-an emergent boundary concept for policy and educational research. Educ. Inf. Technol. 2016, 21, 655–679. [Google Scholar] [CrossRef]
  24. Engen, B.K. Understanding social and cultural aspects of teachers’ digital competencies. Comunicar 2019, 27, 9–18. [Google Scholar] [CrossRef]
  25. Lund, L.; Furberg, A.; Bakken, J.; Engelien, K.L. What does professional digital competence mean in teacher education? Nord. J. Digit. Lit. 2014, 4, 280–298. [Google Scholar] [CrossRef]
  26. Krumsvik, R.J. Digital Competence in the Norwegian Teacher Education and Schools. Högre Utbild. 2011, 1, 39–51. Available online: (accessed on 29 November 2021).
  27. Ramirez-Montoya, M.S.; Mena, J.; Rodriguez-Arroyo, J.A. In-service teachers’ self-perceptions of digital competence and oer use as determined by a xmooc training course. Comput. Hum. Behav. 2017, 77, 356–364. [Google Scholar] [CrossRef]
  28. Li, W.; Gao, W.Y.; Fu, W.D.; Chen, Y.Y. A Moderated Mediation Model of the Relationship Between Primary and Secondary School Teacher’ Digital Competence and Online Teaching Behavior. Front. Educ. 2021, 6, 744950. [Google Scholar] [CrossRef]
  29. Admiraal, W.; van Vugt, F.; Kranenburg, F.; Koster, B.; Smit, B.; Weijers, S.; Lockhorst, D. Preparing pre-service teachers to integrate technology into K-12 instruction: Evaluation of a technology-infused approach. Technol. Pedagog. Educ. 2016, 26, 105–120. [Google Scholar] [CrossRef]
  30. Ottestad, G.; Kelentrić, M.; Guðmundsdóttir, G. Professional digital competence in teacher education. Nord. J. Digit. Lit. 2014, 9, 243–249. [Google Scholar] [CrossRef]
  31. Calvani, A.; Fini, A.; Ranieri, M.; Picci, P. Are young generations in secondary school digitally competent? A study on Italian teenagers. Comput. Educ. 2012, 58, 797–807. [Google Scholar] [CrossRef] [Green Version]
  32. Zhao, Y.; Byers, J.; Sheldon, S.; Pugh, K. Conditions for classroom technology innovations. Teach. Coll. Rec. 2002, 104, 482–515. [Google Scholar] [CrossRef]
  33. Kelentrić, M.; Helland, K.; Arstorp, A. Professional Digital Competence Framework for Teachers; The Norwegian Centre for ICT in Education: Oslo, Norway, 2018. [Google Scholar]
  34. Krumsvik, R.J. Teacher educators’ digital competence. Scand. J. Educ. Res. 2014, 58, 269–280. [Google Scholar] [CrossRef]
  35. Tondeur, J.; Aesaert, K.; Pynoo, B.; Van Braak, J.; Fraeyman, N.; Erstad, O. Developing a validated instrument to measure preservice teachers’ ict competencies: Meeting the demands of the 21st century. Br. J. Educ. Technol. 2017, 48, 462–472. [Google Scholar] [CrossRef] [Green Version]
  36. Redecker, C.; Punie, Y. European Framework for the Digital Competence of Educators: DigCompEdu; Publications Office of the European Union: Luxembourg, 2017. [Google Scholar] [CrossRef]
  37. Koehler, M.J.; Mishra, P.; Kereluik, K.; Shin, T.S.; Graham, C.R. The Technological Pedagogical Content Knowledge Framework; Springer: New York, NY, USA, 2014. [Google Scholar] [CrossRef]
  38. Shulman, L.S. Those who understand: Knowledge growth in teaching. Educ. Res. 1986, 15, 4–14. [Google Scholar] [CrossRef]
  39. Voogt, J.; Fisser, P.; Roblin, P.N.; Tondeur, J.; Braak, J.V. Technological pedagogical content knowledge: A review of the literature. J. Comput. Assist. Learn. 2013, 29, 109–121. [Google Scholar] [CrossRef] [Green Version]
  40. Gómez-Trigueros, I.M. Digital teaching competence and space competence with tpack in social sciences. Int. J. Emerg. Technol. Learn. 2020, 15, 37. [Google Scholar] [CrossRef]
  41. Meroo, L.; Calderón, A.; Arias-Estero, J.L. Digital pedagogy and cooperative learning: Effect on the technological pedagogical content knowledge and academic achievement of pre-service teachers. Rev. Psicodidáctica 2020, 26, 53–61. [Google Scholar] [CrossRef]
  42. Miguel-Revilla, D.; Martínez-Ferreira, J.M.; Sánchez-Agustí, M. Assessing the digital competence of educators in social studies: An analysis in initial teacher training using the tpack-21 model. Australas. J. Educ. Technol. 2020, 36, 1–12. [Google Scholar] [CrossRef] [Green Version]
  43. Maderick, J.A.; Zhang, S.; Hartley, K.; Marchand, G. Preservice Teachers and Self-Assessing Digital Competence. J. Educ. Comput. Res. 2016, 54, 326–351. [Google Scholar] [CrossRef]
  44. Rubio, J.C.C.; Serrano, J.S.; Martinez, J.C.B. Digital competence in future teachers of Social Sciences in Primary Education: Analysis of the TPACK framework. Educ. Siglo XXI 2018, 36, 107–128. [Google Scholar] [CrossRef]
  45. Tomte, C.; Enochsson, A.B.; Buskqvist, U.; Karstein, A. Educating online student teachers to master professional digital competence: The tpack-framework goes online. Comput. Educ. 2015, 84, 26–35. [Google Scholar] [CrossRef]
  46. Angeli, C.; Valanides, N. Epistemological and Methodological Issues for the Conceptualization, Development, and Assessment of ICT-TPCK: Advances in Technological Pedagogical Content Knowledge (TPCK). Comput. Educ. 2009, 52, 154–168. [Google Scholar] [CrossRef]
  47. Adam, A. A framework for seeking the connections between technology, pedagogy and culture: A study in the Maldives. J. Open Flex. Distance Learn. 2017, 21, 35–51. Available online: (accessed on 3 December 2021).
  48. Peng, C.A.; Daud, S.M. Relationship between Special Education (Hearing Impairment) Teachers’ Technological Pedagogical Content Knowledge (TPACK) and Their Attitudes toward ICT Integration. In Proceedings of the International Conference on Special Education in Southeast Asia Region, Bangi, Malaysia, 23 January 2016; p. 6. Available online: (accessed on 3 December 2021).
  49. Archambault, L.; Crippen, K. Examining TPACK among K-12 online distance educators in the United States. Contemp. Issues Technol. Teach. Educ. 2009, 9, 71–88. Available online: (accessed on 3 December 2021).
  50. Archambault, L.M.; Barnett, J.H. Revisiting technological pedagogical content knowledge: Exploring the TPACK framework. Comput. Educ. 2010, 55, 1656–1662. [Google Scholar] [CrossRef]
  51. Saad, M.M.; Barbar, A.M.; Abourjeili, S.A.R. Introduction of TPACK-XL: A Transformative View of ICT-TPCK for Building Pre-service Teacher Knowledge Base. Turk. J. Teach. Educ. 2012, 1, 41–60. Available online: (accessed on 3 December 2021).
  52. Albion, P.; Jamieson-Proctor, R.; Finger, G. Auditing the TPACK Competence and Confidence of Australian Teachers: The Teaching with ICT Audit Survey (TWictAS). In Proceedings of the Society for Information Technology & Teacher Education Conference (SITE), San Diego, CA, USA, 29 March 2010; pp. 3772–3779. Available online: (accessed on 3 December 2021).
  53. Caena, F.; Redecker, C. Aligning teacher competence frameworks to 21st century challenges: The case for the european digital competence framework for educators (digcompedu). Eur. J. Educ. 2019, 54, 356–369. [Google Scholar] [CrossRef] [Green Version]
  54. Ghomi, M.; Redecker, C. Digital Competence of Educators (DigCompEdu): Development and Evaluation of a Self-assessment Instrument for Teachers’ Digital Competence. In Proceedings of the 11th International Conference on Computer Supported Education, Heraklion, Greece, 2–4 May 2019; pp. 541–548. [Google Scholar] [CrossRef]
  55. McGarr, O.; McDonagh, A. Digital Competence in Teacher Education; Output 1 of the Erasmus+ Funded Developing Student Teachers’ Digital Competence (DICTE) Project; 2019. [Google Scholar]
  56. Shulman, L. Knowledge and teaching: Foundations of the new reform. Harv. Educ. Rev. 1987, 57, 1–23. [Google Scholar] [CrossRef]
  57. Carrera, F.X.; Coiduras, J.L. Identificación de la competencia digital del profesor universitario: Un estudio exploratorio en el ámbito de las Ciencias Sociales. Rev. Docencia Univ. 2012, 10, 273–298. [Google Scholar] [CrossRef] [Green Version]
  58. Duta, N. Training Teachers University-Some Reflections on the Development of Digital Competence in the Knowledge Society. In Proceedings of the 6th International Conference on Virtual Learning, Cluj-Napoca, Romania, 29 October 2011; pp. 352–357. Available online: (accessed on 15 December 2021).
  59. Pérez, K.V.P.; Fernández, J.T. Competencias digitales en docentes de Educación superior: Niveles de dominio y necesidades formativas. Rev. Digit. Investig. Docencia Univ. 2018, 12, 59–87. [Google Scholar] [CrossRef]
  60. Khan, M.A.; Kamal, T.; Illiyan, A.; Asif, M. School Students’ Perception and Challenges towards Online Classes during COVID-19 Pandemic in India: An Econometric Analysis. Sustainability 2021, 13, 4786. [Google Scholar] [CrossRef]
  61. Choi, Y.; Kim, J. Learning Analytics for Diagnosing Cognitive Load in E-Learning Using Bayesian Network Analysis. Sustainability 2021, 13, 10149. [Google Scholar] [CrossRef]
  62. Asamoah, M.K. TPACKEA Model for Teaching and Students’ Learning. J. Acad. Ethics. 2019, 17, 401–421. [Google Scholar] [CrossRef]
  63. Ekrem, S.; Recep, C. Examining preservice EFL teachers’ TPACK competencies in Turkey. J. Educ. Online 2014, 11, 22. [Google Scholar] [CrossRef]
  64. Vogel, D.L.; Wade, N.G.; Ascheman, P.L. Measuring perceptions of stigmatization by others for seeking psychological help: Reliability and validity of a new stigma scale with college students. J. Couns. Psychol. 2009, 56, 301–308. [Google Scholar] [CrossRef] [Green Version]
  65. Kaiser, H.F. An index of factorial simplicity. Psychometrika 1974, 39, 31–36. [Google Scholar] [CrossRef]
  66. Kieffer, K.M. Orthogonal versus oblique factor rotation: A review of the literature regarding the pros and cons. In Proceedings of the 27th Annual Meeting of the Mid-South Educational Research Association, New Orleans, LA, USA, 4 November 1998; pp. 4–6. Available online: (accessed on 29 December 2021).
  67. Fabrigar, L.R.; Wegener, D.T.; MacCallum, R.C.; Strahan, E.J. Evaluating the use of exploratory factor analysis in psychological research. Psychol. Methods 1999, 4, 272–299. [Google Scholar] [CrossRef]
  68. Hair, J.F.; Black, W.C.; Babin, B.J.; Anderson, R.E. Multivariate Data Analysis, 7th ed.; Pearson Prentice Hall: Upper Saddle River, NJ, USA, 2014. [Google Scholar] [CrossRef]
  69. Liu, H.; Shao, M.; Liu, X.; Zhao, L. Exploring the influential factors on readers’ continuance intentions of e-Book APPs: Personalization, usefulness, playfulness, and satisfaction. Front. Psychol. 2021, 12, 640110. [Google Scholar] [CrossRef] [PubMed]
  70. Foster, J.; Barkus, E.; Yavorsky, C. Understanding and Using Advanced Statistics; SAGE Publications: Thousand Oaks, CA, USA, 2006. [Google Scholar] [CrossRef]
  71. Fornell, C.; Larcker, D.F. Evaluating structural equation models with unobservable variables and measurement error. J. Mark. Res. 1981, 18, 39–50. [Google Scholar] [CrossRef]
  72. Pallant, J.F. SPSS Survival Manual: A Step by Step Guide to Data Analysis Using SPSS, 3rd ed.; Open University Press: Maidenhead, UK, 2007. [Google Scholar]
  73. Hsu, L.; Chen, Y.-J. Examining teachers’ technological pedagogical and content knowledge in the era of cloud pedagogy. S. Afr. J. Educ. 2019, 39, 1–13. [Google Scholar] [CrossRef]
  74. Wang, Y.; Zhao, L.; Shen, S.; Chen, W. Constructing a Teaching Presence Measurement Framework Based on the Community of Inquiry Theory. Front. Psychol. 2021, 12, 694386. [Google Scholar] [CrossRef] [PubMed]
  75. Bagozzi, R.P.; Yi, Y. On the evaluation of structural equation models. J. Acad. Mark. Sci. 1988, 16, 74–94. [Google Scholar] [CrossRef]
  76. Hair, J.F.; Anderson, R.E.; Tatham, R.L.; Black, W.C. Multivariate Data Analysis, 5th ed.; Prentice Hall: Upper Saddle River, NJ, USA, 1998. [Google Scholar]
  77. Schumacker, R.E.; Lomax, R.G. A Beginner’s Guide to Structural Equation Modeling, 4th ed.; Routledge: New York, NY, USA, 2016. [Google Scholar] [CrossRef] [Green Version]
  78. Aridag, N.I.; Yüksel, A. Analysis of the relationship between moral judgment competences and empathic skills of university students. Kuram Uygul. Egit. Bilimleri 2010, 10, 707–724. Available online: (accessed on 6 January 2022).
  79. Kim, K.U. Measurement of quality of life in patients with end-stage cancer. Cancer Nurs. 2014, 37, 44–49. [Google Scholar] [CrossRef]
  80. Hsu, L. The perceptual learning styles of hospitality students in a virtual learning environment: The case of Taiwan. J. Hosp. Leis. Sports Tour. Educ. 2011, 10, 114–127. [Google Scholar] [CrossRef]
  81. Nielsen, T.; Kreiner, S. Course evaluation for the purpose of development: What can learning styles contribute? Stud. Educ. Eval. 2017, 54, 58–70. [Google Scholar] [CrossRef]
  82. Baumert, J.; Kunter, M.; Blum, W.; Brunner, M.; Voss, T.; Jordan, A.; Klusmann, U.; Krauss, S.; Neubrand, M. Teachers’ mathematical knowledge, cognitive activation in the classroom, and student progress. Am. Educ. Res. J. 2010, 47, 133–180. [Google Scholar] [CrossRef] [Green Version]
  83. Kleickmann, T.; Richter, D.; Kunter, M.; Elsner, J.; Besser, M.; Krauss, S.; Baumert, J. Teachers’ content knowledge and pedagogical content knowledge: The role of structural differences in teacher education. J. Teach. Educ. 2013, 64, 90–106. [Google Scholar] [CrossRef]
Figure 1. A model for evaluating teachers’ DC for online teaching.
Figure 2. The first-order CFA model. ** p < 0.01.
Table 1. Dimensions and items of the teachers’ DC evaluation framework.

Learner Knowledge (LK) (Source: self-compiled based on the general law of student development)
1. I know the general trajectory of students’ physical and mental development (LK1)
2. I know the age characteristics of student development (LK2)
3. I know the common problems of students associated with their physical and mental development (LK3)
4. I can effectively solve the common problems of students related to their physical and mental development (LK4)
5. I know the knowledge level of students at different stages (LK5)

Technological Knowledge (TK) (Source: Ekrem and Recep [63])
6. I have the ability to solve hardware-related technical problems (TK1)
7. I can deal with various problems related to software (TK2)
8. I have the ability to help students solve computer technology problems (TK3)
9. I have the ability to handle unexpected situations in multimedia teaching, such as network connection interruption (TK4)

Pedagogical Knowledge (PK) (Source: Ekrem and Recep [63])
10. I have the ability to plan the order of topics to be taught in the course (PK1)
11. I have the ability to determine the scope of knowledge to be taught in the course (PK2)
12. I have the ability to prepare appropriate teaching materials according to the curriculum standards of the subject (PK3)
13. I have the ability to relate knowledge points in this course to other subjects (PK4)
14. I can use various teaching strategies to connect different knowledge points with the actual life of students (PK5)
15. I can adjust teaching methods based on student performance or learning feedback (PK6)
16. I can quickly adjust teaching methods or strategies to maintain class order in case of teaching emergencies (PK7)

Ethical Knowledge (EK) (Source: self-compiled based on teacher professional ethics requirements)
17. I know and abide by the professional code of ethics for teachers (EK1)
18. I know and abide by education laws and regulations (EK2)
19. I love education (EK3)
20. I fight resolutely against all acts endangering the cause of education (EK4)

Learner Technological Knowledge (LTK) (Source: self-compiled based on learner knowledge and teachers’ technological knowledge)
21. I can use appropriate teaching tools according to the age characteristics of students (LTK1)
22. I can use modern intelligent auxiliary tools to understand students’ development status, such as changes in grades (LTK2)
23. I can use technology to communicate well with students about their learning and psychology (LTK3)
24. I can use technology to promote students’ learning progress and physical and mental development (LTK4)

Learner Pedagogical Knowledge (LPK) (Source: self-compiled based on learner knowledge and teachers’ pedagogical knowledge)
25. I can teach based on students’ existing knowledge level (LPK1)
26. I can teach based on students’ individual differences (LPK2)
27. I can teach in accordance with the general traits of students’ physical and mental development at the present stage (LPK3)
28. I can use appropriate teaching methods according to the characteristics of students (LPK4)

Learner Ethical Knowledge (LEK) (Source: self-compiled based on learner knowledge and teachers’ ethical knowledge)
29. I fulfill teachers’ professional responsibilities conscientiously and am responsible to students, parents, and society (LEK1)
30. I care for students (LEK2)
31. I care about students’ physical and mental health changes and development during daily campus life (LEK3)
32. I sometimes use corporal punishment to punish disobedient students (LEK4)

Technological Pedagogical Knowledge (TPK) (Source: Ekrem and Recep [63])
33. I have the ability to encourage students to communicate with each other (TPK1)
34. I can use different teaching methods in the classroom (TPK2)
35. I have the ability to create a learning environment that enables students to quickly master new knowledge and skills (TPK3)
36. I prefer to use intelligent teaching auxiliary tools (TPK4)

Technological Ethical Knowledge (TEK) (Source: self-compiled based on the notion of technology ethics)
37. I protect students’ privacy while using technology to understand students (TEK1)
38. I treat technology rationally and do not over-rely on it or abuse it (TEK2)
39. I consider the emotional value of students while using technology; for example, the technical results show that a student’s performance is average, but their attitude is improving (TEK3)
40. I consider students’ subjective feelings about the technology while using it, such as acceptance (TEK4)

Pedagogical Ethical Knowledge (PEK) (Source: self-compiled based on the relevant content of educational purpose)
41. I consider the physical and mental health of students when using teaching methods (PEK1)
42. I pay attention to students’ personal development in teaching (taking education as the purpose) (PEK2)
43. Improving grades is the most important purpose of my teaching (PEK3)
44. I pay attention to screening out teaching content that is harmful to students’ physical and mental health (PEK4)
Table 2. The factor analysis of the teachers’ DC framework (N = 671).

Item | Primary Factor Loading
LK1 0.806
LK2 0.810
LK3 0.820
LK4 0.798
LK5 0.791
TK1 0.842
TK2 0.887
TK3 0.886
TK4 0.807
EK1 0.803
EK2 0.803
EK3 0.782
EK4 0.742
LTK1 0.777
LTK2 0.799
LTK3 0.815
LTK4 0.817
LPK1 0.777
LPK2 0.778
LPK3 0.780
LPK4 0.779
LEK1 0.819
LEK2 0.826
LEK3 0.819
LEK4 0.764
TPK1 0.796
TPK2 0.814
TPK3 0.805
TPK4 0.715
TEK1 0.789
TEK2 0.813
TEK3 0.810
TEK4 0.798
PEK1 0.746
PEK2 0.753
PEK3 0.691
PEK4 0.749
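A common EFA retention rule, which the loadings in Table 2 all satisfy, is to keep only items whose primary factor loading clears a cutoff (0.5 is a frequent rule of thumb). A minimal sketch with a sample of the table's values; the helper name is ours:

```python
def retained_items(loadings: dict, cutoff: float = 0.5) -> list:
    """Keep items whose primary factor loading meets the cutoff."""
    return [item for item, load in loadings.items() if load >= cutoff]

# A sample of the Table 2 loadings; every item clears the 0.5 rule of thumb.
sample = {"LK1": 0.806, "TK2": 0.887, "PEK3": 0.691}
print(retained_items(sample))  # ['LK1', 'TK2', 'PEK3']
```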
Table 3. The eigenvalues and contribution rates of the 10-dimension framework.

Dimension | Eigenvalue | Percentage of Variance | Cumulative Variance Contribution Rate
Table 4. The fitting index of the evaluation framework.

Type | Fitting Index | Threshold | Values | Results
Absolute fit index | Chi-square/df | <5 | 3.651 | Acceptable
Relative fit index
Incremental fit index
Streamlining fit index
Parsimonious fit index
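The absolute fit criterion in Table 4 can be checked mechanically. A minimal sketch, assuming only what the table states (the reported ratio of 3.651 and the < 5 threshold); the helper name is ours:

```python
def normed_chi_square(chi2: float, df: int) -> float:
    """Absolute fit: the chi-square statistic divided by its degrees of
    freedom. Ratios below 5 (stricter conventions use 3) are usually
    read as an acceptable model fit."""
    return chi2 / df

# Table 4 reports the ratio directly (3.651), so compare it to the cutoff.
acceptable = 3.651 < 5.0
print(acceptable)  # True
```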
Table 5. Results of the confirmatory factor analysis.

Potential Variable | Item | Normalized Factor Loading | CR | AVE
Learner Knowledge (LK)
Technology Knowledge (TK)
Pedagogy Knowledge (PK)
Ethical Knowledge (EK)
Learner Technological Knowledge (LTK) | LTK1 | 0.939 | 0.9761 | 0.9108
Learner Pedagogical Knowledge (LPK) | LPK1 | 0.951 | 0.9794 | 0.9223
Learner Ethical Knowledge (LEK) | LEK1 | 0.986 | 0.9567 | 0.8489
Technological Pedagogical Knowledge (TPK) | TPK1 | 0.956 | 0.8955 | 0.9625
Technological Ethical Knowledge (TEK) | TEK1 | 0.947 | 0.9763 | 0.9116
Pedagogical Ethical Knowledge (PEK) | PEK1 | 0.978 | 0.9195 | 0.9716
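The CR and AVE columns in Table 5 follow the standard formulas computed from standardized loadings (Fornell and Larcker [71]). A minimal sketch, with illustrative loadings rather than the study's full item set; the function names are ours:

```python
def composite_reliability(loadings):
    """CR = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error
    variances), where each error variance is 1 - loading^2 in a
    standardized solution. CR > 0.7 is the usual cutoff."""
    s = sum(loadings)
    error = sum(1 - lam * lam for lam in loadings)
    return s * s / (s * s + error)

def average_variance_extracted(loadings):
    """AVE = mean of the squared standardized loadings; > 0.5 is the
    usual convergent-validity cutoff."""
    return sum(lam * lam for lam in loadings) / len(loadings)

# Illustrative four-item factor: all loadings above 0.9, so both
# conventional cutoffs (CR > 0.7, AVE > 0.5) are comfortably met.
lams = [0.939, 0.951, 0.947, 0.956]
print(round(composite_reliability(lams), 3))
print(round(average_variance_extracted(lams), 3))
```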
Table 6. The results of the interrelated coefficient matrix and square roots of AVE.

     | LK       | TK       | PK       | EK       | LTK      | LPK      | LEK      | TPK      | TEK      | PEK
TK   | 0.479 ** | 0.921
PK   | 0.668 ** | 0.524 ** | 0.945
EK   | 0.643 ** | 0.379 ** | 0.689 ** | 0.940
LTK  | 0.542 ** | 0.543 ** | 0.620 ** | 0.468 ** | 0.954
LPK  | 0.572 ** | 0.480 ** | 0.673 ** | 0.512 ** | 0.686 ** | 0.960
LEK  | 0.478 ** | 0.350 ** | 0.566 ** | 0.595 ** | 0.422 ** | 0.478 ** | 0.921
TPK  | 0.466 ** | 0.336 ** | 0.558 ** | 0.503 ** | 0.510 ** | 0.546 ** | 0.573 ** | 0.981
TEK  | 0.533 ** | 0.423 ** | 0.654 ** | 0.581 ** | 0.547 ** | 0.578 ** | 0.565 ** | 0.559 ** | 0.955
PEK  | 0.523 ** | 0.370 ** | 0.625 ** | 0.595 ** | 0.525 ** | 0.583 ** | 0.598 ** | 0.568 ** | 0.646 ** | 0.986
Note: The square root of AVE is located on the diagonal, and the remaining values are the Pearson correlation coefficients. ** p < 0.01.
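The note under Table 6 encodes the Fornell–Larcker criterion [71]: discriminant validity holds when each construct's square root of AVE (the diagonal) exceeds its correlations with every other construct. A small check using the TK values from the table; the function name is ours:

```python
def discriminant_valid(sqrt_ave: float, correlations) -> bool:
    """Fornell-Larcker criterion: a construct's sqrt(AVE) must exceed
    the absolute value of its correlation with every other construct."""
    return all(sqrt_ave > abs(r) for r in correlations)

# TK in Table 6: sqrt(AVE) = 0.921; its correlations with the other
# nine constructs all stay well below that value.
tk_correlations = [0.479, 0.524, 0.379, 0.543, 0.480,
                   0.350, 0.336, 0.423, 0.370]
print(discriminant_valid(0.921, tk_correlations))  # True
```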
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Share and Cite

MDPI and ACS Style

Tang, L.; Gu, J.; Xu, J. Constructing a Digital Competence Evaluation Framework for In-Service Teachers’ Online Teaching. Sustainability 2022, 14, 5268.
