Article

Evaluation of Key Performance Indicators (KPIs) for Sustainable Postgraduate Medical Training: An Opportunity for Implementing an Innovative Approach to Advance the Quality of Training Programs at the Saudi Commission for Health Specialties (SCFHS)

by
Abdulrahman Housawi
1,*,
Amal Al Amoudi
1,
Basim Alsaywid
1,2,3,*,
Miltiadis Lytras
1,4,*,
Yara H. bin Moreba
1,
Wesam Abuznadah
1 and
Sami A. Alhaidar
1
1
Saudi Commission for Health Specialties, Riyadh 11614, Saudi Arabia
2
College of Medicine, King Saud Bin-Abdul-Aziz University for Health Sciences, Jeddah 14611, Saudi Arabia
3
Urology Section, Department of Surgery, King Abdulaziz Medical City, Ministry of National Guard, Jeddah 14815, Saudi Arabia
4
King Abdulaziz University, Jeddah 21589, Saudi Arabia
*
Authors to whom correspondence should be addressed.
Sustainability 2020, 12(19), 8030; https://doi.org/10.3390/su12198030
Submission received: 4 September 2020 / Revised: 21 September 2020 / Accepted: 23 September 2020 / Published: 29 September 2020

Abstract

The Kingdom of Saudi Arabia is undergoing a major transformation in response to its Vision 2030, in which healthcare reform is one of the top priorities. With the objective of improving healthcare and allied professional performance in the Kingdom to meet international standards, the Saudi Commission for Health Specialties (SCFHS) has recently developed a strategic plan that focuses on expanding training programs’ capacity to align with the increasing demand for the country’s healthcare workforce, providing comprehensive quality assurance and control to ensure training programs uphold high quality standards, and providing advanced training programs benchmarked against international standards. In this research paper, we describe our attempt to develop a general framework for key performance indicators (KPIs) and the related metrics, with the aim of contributing new strategies for better, future-ready medical training. We present the results of a survey conducted in the Kingdom of Saudi Arabia (KSA) for the enhancement of the quality of postgraduate medical training. Recent developments in the field of learning analytics present an opportunity to utilize big data and artificial intelligence in the design and implementation of socio-technical systems with significant potential social impact. We summarize the key aspects of the Training Quality Assurance Initiative and suggest a new approach for designing a data and services ecosystem for personalized health professionals’ training in the KSA. The study also contributes to the theoretical knowledge on the integration of sustainability and medical training and education by proposing a framework that can enhance future initiatives from various health organizations.

1. Introduction

Sustainability in healthcare is a key initiative towards digital transformation. Numerous emerging technologies provide a novel ecosystem for data and services exploitation. Applied data science and sophisticated computational information processing in the healthcare domain define new highways for research and innovation. This research work aims to develop an overall unique quality framework for healthcare sustainability and technological innovation in the Kingdom of Saudi Arabia (KSA). It summarizes leading-edge research conducted by the Saudi Commission for Health Specialties (SCFHS).

1.1. The Saudi Commission for Health Specialties’s Quality Assurance Initiative for Sustainable Personalized Medical Training

The SCFHS has recently developed a strategic plan that focuses on expanding training programs’ capacity to meet the increasing demand for the country’s healthcare workforce, providing comprehensive quality assurance and control to ensure that training programs uphold high quality, and benchmarking advanced training programs against international standards. This research paper summarizes a bold initiative towards the implementation of this strategic plan for more efficient patient-centric healthcare, as well as of an interactive data ecosystem for enhanced decision making in the context of the activities and priorities of the SCFHS.
The SCFHS was established by the approval of Royal Decree in 1993, in order to supervise and evaluate training programs, as well as set regulations and standards for the practice of health and allied professionals. Since its establishment, the number of residency and fellowship training programs has increased.
Across the country and the Gulf region, 1200 programs covering 79 health specialties are conducted [1]. These newly initiated training programs respond to the anticipated increase in demand for health professionals and training needs over the years. Moreover, training programs will likely expand over the coming years, which will require sound development planning to ensure high-quality training for health professionals.
Some countries have taken the initiative to pursue quality measurement in competency-based postgraduate training [2,3,4]. Saudi Arabia, through the SCFHS, is among those countries that have developed a system to measure the quality outcomes of postgraduate training.
The SCFHS has launched the Training Quality Assurance Initiative (TQAI) which aims to develop a system for setting quality standards, benchmarks, processes, and improvement guidelines for the postgraduate training programs.

1.2. Governance of Training Programs in SCFHS

The latest developments in the big data and analytics domain bring forward the significance of defining efficiency and performance metrics as a basis for the design of a data ecosystem for personalized value-adding services. The healthcare domain is one of the most critical with great social impact.
Key performance indicators (KPIs) serve as a way of gently guiding clinicians in the right direction to improve clinical practice and are being used in different fields of medicine [5,6,7]. Santana et al. [8] identified nine criteria that KPIs should meet. Specifically, KPIs should be:
  • Targeting important improvements (there is a direct connection with sustainability),
  • Precisely defined (measurable metrics, impose efficiency and enhanced decision making—they also provide a systematic way for monitoring performance over time with significant managerial implications in the medical sector),
  • Reliable,
  • Valid,
  • Implemented with risk adjustment,
  • Implemented with reasonable cost,
  • Implemented with low data collection effort,
  • Achieving results which can be easily interpreted, and
  • Global (overall evaluation).
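For implementation purposes, such a checklist can be represented as a simple record; a minimal Python sketch, where the class and field names are our own illustrative naming rather than anything defined by Santana et al.:

```python
from dataclasses import dataclass

@dataclass
class KPICriteria:
    """Checklist of the nine criteria a candidate KPI should satisfy."""
    targets_important_improvement: bool
    precisely_defined: bool
    reliable: bool
    valid: bool
    risk_adjusted: bool
    reasonable_cost: bool
    low_collection_effort: bool
    easily_interpreted: bool
    global_scope: bool

    def satisfies_all(self) -> bool:
        # A candidate KPI is accepted only if every criterion is met
        return all(vars(self).values())

candidate = KPICriteria(*([True] * 9))
print(candidate.satisfies_all())  # True
```

A record like this makes the review step auditable: each rejected KPI can be stored together with the criteria it failed.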
In June 2017, the SCFHS’s Secretary General requested the formation of a Quality Indicators Committee (QIC) that served to assess the postgraduate training programs’ outcomes based on the set quality indicators. These indicators are measurable features as a means of providing evidence that reflects quality and, hence, serves as a basis to help raise the quality of training provided to health professionals.
The collaborative efforts of the two main divisions of the SCFHS (i.e., Academic Affairs and Planning and Corporate Excellence Administration) have formed the committee. It consisted of the Chief of Academic Affairs, the Chief of Planning and Corporate Excellence, the Executive Director of Training (and Assistant), the Executive Director of Assessment (and Assistant), the Director of Quality Department, and the Director of Knowledge Management. The objectives of the QIC are to:
  • Improve the quality of postgraduate medical training (PGMT),
  • Provide means for objective assessment of residency programs and training centers,
  • Provide guidance to residency programs and training centers periodically and objectively, and
  • Assist program directors in reviewing the conduct and educational quality of their programs.
Thus, this research paper presents the outcome measures determined on the basis of the quality indicators, to identify which elements need to be modified, which areas need to be improved, and/or which areas might be developed to fulfill the objectives of the SCFHS. Hence, the main aim of this research article is to evaluate the residency training programs conducted throughout Saudi Arabia through the core set of predefined KPIs. The main objectives are to evaluate the quality of training programs across the nation, assess the degree of achievement of the pre-determined KPIs, and provide baseline data that support decisions for the improvement plan.
As an ultimate objective, this research contributes a novel framework for the integration of KPIs in a sophisticated data and services ecosystem in the SCFHS that can be used in similar organizations worldwide.
This research paper is organized as follows. Section 2 integrates multidisciplinary research on analytics KPIs and personalized medicine and medical education. In Section 3, we present our integrated research methodology with an emphasis on demographics, methods deployed, and research objectives. Analysis and key findings of the analysis of training programs in the SCFHS are in Section 4, including statistical analysis. The main findings are presented in Section 5. The key contribution of our research, the SCFHS Decision Making Model for Training Programs, is then summarized in Section 6. Finally, in Section 7, we present the conclusions and future research aims for this research work.

2. Critical Literature Review on KPIs in Healthcare Training Programs for Sustainable Healthcare

The analysis of performance and efficiency in training programs; technology-enhanced learning; and the systematic analysis of teaching, learning, and training behavior are key initiatives today. In fact, in recent years, a new multidisciplinary domain has emerged: the domain of learning and training analytics. It provides critical insights into the methodologies, measurement tools, and trusted key performance indicators that enhance decision making in the training and learning process [9,10,11]. In Table 1, we summarize an overview of some key approaches in the literature that provide key insights into our methodological approach.
What is more, a systematic evolution in the areas of big data analytics and data science [12,13] offers new unforeseen opportunities for the design, implementation, and functioning of data and services ecosystems, aiming to promote enhanced decision making. Within this diverse area, issues related to data governance, standardization of data, availability of data, and advanced data mining permit new insights and analytical capabilities. Furthermore, the social impact of big data research and the direct connection of sophisticated analytics to strategic management and decision making are also evident in various studies.
In a parallel discussion on sustainable higher education [14,15], researchers emphasize the need to measure the impact of investments in training programs and to maximize the return on investment in social capital. A substantial body of literature also argues that socially inclusive economic development must be driven by innovative education and by enhancing the skills and capabilities of citizens.
Within the training domain, there is fast-developing literature related to learning analytics and key performance indicators that provide significant insights into our research. In the next paragraphs, we provide a compact synthesis of complementary approaches [16,17,18,19,20].
Selected research works in the domain of learning analytics and KPIs provide further insights, especially when dealing with the learning, teaching, and training process. Different methodological approaches emphasize diverse pedagogical approaches and interpret them in meaningful sets of KPIs and learning analytics [25,26,27,28]. Other researchers propose KPI interventions that integrate the execution of the learning process with management and decision making at the highest level within organizations [29,30,31,32,33,34]. In an effort to interpret the literature review of learning analytics and to provide the pillars of our research methodology that will be presented in the next section, some critical insights are useful.
In the context of the Saudi Commission for Healthcare Specialties, thousands of medical training programs are offered. Our interest in maintaining a total quality initiative incorporates the development and monitoring of a reliable, trusted, and efficient set of KPIs. In this research study, we emphasize exactly this aspect. We present the first run and execution of a full set of KPIs related to the total quality initiative. This is a constructive knowledge management approach associated with big data analytics research and significant implications for data science and sustainability.
The standardization of KPIs and learning analytics for measuring the attitudes of residents on postgraduate medical training programs in the Kingdom of Saudi Arabia is a significant effort towards enhanced decision making. In this research study, we analyze the impact of using KPIs in order to develop a framework for sustainable medical education. We also introduce, in Section 6, the SCFHS Training Quality Approach—Medical Training Quality Framework. In fact, we contribute to the body of knowledge of the learning analytics and KPI domain by attaching a managerial approach for total quality management in medical education.
Our approach is also related to the latest developments in data science. It also contributes to the sustainability domain and brings forward new challenges and requirements for the digital transformation of healthcare systems. The necessity of having flexible, sustainable postgraduate medical education requires the design and implementation of socio-technical services that interconnect knowledge, knowledge processes, knowledge providers, and stakeholders [35,36,37].
In the next section, we provide the first phase of the quality initiative in the SCFHS. We focus on the design, implementation, and interpretation, for sustainable medical education purposes, of a new, innovative KPI and learning analytics framework. As explained in this critical literature review, the main purpose of this framework is to support the following seven complementary objectives:
  • Measurement tool for enhanced decision making
  • Learning behavior analysis and adjustment
  • Predicting performance and personalizing learning experience
  • Real-time monitoring tool of learning process
  • Customizing learning feedback
  • Standardization of instruction flow
  • Interoperable technology-enhanced learning systems

3. Research Methodology

The main aim of this research paper is to analyze the interconnections of quality assurance in medical training programs in the KSA as a key enabler for a big data ecosystem in the KSA for the provision of personalized medical training. Our research is directly related to the Vision 2030 for digital transformation and is also a bold initiative for analyzing how sustainable data science can promote integrated sophisticated medical services with social impact.
The three integrated learning objectives in our research are summarized as follows.
Research Objective 1: KPI definitions for efficient medical training. The main aim is to define and to operationalize a set of KPIs for efficient medical training.
Research Objective 2: KPIs’ social impact and sustainability in medical education. The main aim is to understand and to interpret how a set of measurable, compact, and reliable KPIs can be used as the basis of a data ecosystem for efficient medical training.
Research Objective 3: Sustainable data science for medical education and digital transformation of healthcare. We analyze the main impact of our key contribution towards a long-term sustainable plan for the introduction of personalized medical training services for individuals and the community in the KSA.

3.1. Study Design

This research uses a prospective, analytical, cross-sectional study design that represents the quality outcome measures of training programs and services provided by the SCFHS in 2018. The key performance indicator list was created by experts in quality and health profession education at the SCFHS. The initial list consisted of 28 KPIs but, after in-depth review and discussion, the committee omitted five, so 23 KPIs were selected for the final list. The theoretical framework of those KPIs was based on the Kirkpatrick model. The model is used to objectively analyze the impact of training, determine how well trainees learned, and improve their learning in the future [11]. This four-level training model, created in 1959, represents a sequence of ways to evaluate training programs. Moving from one level to the next, the process becomes more difficult and time-consuming, but it also provides more valuable information. The KPIs are combined into domains that are described in Table 2.
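For reference, the four levels of the Kirkpatrick model can be sketched as a simple mapping; the level names follow the model, while the one-line glosses are our own paraphrase of its standard descriptions:

```python
# The four levels of the Kirkpatrick training evaluation model; evaluation
# becomes more difficult but more valuable at each successive level.
KIRKPATRICK_LEVELS = {
    1: "Reaction",   # how stakeholders value the training
    2: "Learning",   # gains in knowledge, skills, and attitudes
    3: "Behavior",   # transfer of learning to on-the-job practice
    4: "Results",    # impact on organizational outcomes
}
print(list(KIRKPATRICK_LEVELS.values()))
```

The KPI domains described in Table 2 attach to these levels; Section 4.2 reports the reaction-domain and learning-domain KPIs.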
There were 23 general KPIs created, defined, and agreed upon by the Quality Indicators Committee, which were intended to provide objective evidence for measuring the level of performance and its progress. Establishing a positive and trusting relationship with the postgraduate training stakeholders is at the core of the SCFHS reform strategy and success. To achieve this, all stakeholders were included in the process of reviewing the 23 agreed-upon KPIs in the GMH quality matrix. They were asked specifically whether these KPIs are measurable, usable, reliable, simple to understand, available (in terms of data), robust, and clearly defined. The selected KPIs were distributed to all stakeholders for review, feedback, and approval (20 December 2017).

3.2. Sources of Data

The sources of data collected for the 23 KPIs were numerous, including primary and secondary data collection methods.
Primary sources were self-administered, semi-structured, validated questionnaires, consisting of a series of closed and open-ended questions, distributed to program directors and trainees.
The secondary data were collected from the Academic Affairs department databases at the Saudi Commission for Health Specialties (SCFHS), which include databases from four different sections (admission, accreditation, training, and assessment). Table 3 presents the matrix of the data sources for each KPI.

3.3. Validity and Reliability of the Primary Data Collection Tool (i.e., Survey)

The questionnaire was validated in two stages. Content validity was assessed by content experts; face validity was then assessed with the help of a medical educationist, who confirmed that the survey fulfilled its objectives and that the questions flowed in a logical sequence. For reliability, a pilot study was conducted with 40 participants, and the questionnaire was modified according to the respondents’ feedback. Cronbach’s alpha was calculated as above 0.7, which suffices for reliability.
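For illustration, Cronbach’s alpha can be computed directly from an item-score matrix; a minimal sketch with invented pilot responses (the study’s actual pilot data are not reproduced here):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix.

    alpha = k / (k - 1) * (1 - sum(item variances) / variance of total score)
    """
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)   # variance of respondents' totals
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Invented pilot data: 5 respondents answering 4 Likert items
pilot = np.array([
    [4, 5, 4, 4],
    [3, 3, 3, 4],
    [5, 5, 5, 5],
    [2, 3, 2, 3],
    [4, 4, 5, 4],
])
alpha = cronbach_alpha(pilot)
print(f"alpha = {alpha:.2f}")  # values above 0.7 are conventionally acceptable
```

In practice the full pilot of 40 respondents would be loaded into the matrix, and items dragging the alpha down would be candidates for revision.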

3.4. Program Directors’ Survey

A well-designed interview questionnaire for program directors was developed through key informant interviews and extensive focus group discussions of the Quality Indicators Committee (QIC) panels. The questionnaire consisted of 53 questions structured into four main sections. The first section covered the demographic details of the program directors; it had 14 questions exploring personal information and the base of the participants’ practice. The second section, on supervision and training, contained 15 questions. The third section, on educational content, had five questions. The fourth section, on support and resources, had 19 questions.
Initially, the SCFHS had a list of 1145 program directors in its directory; after extensive checking and verification, some duplicates and incorrect entries were found, leaving a final list of 1095 program directors in the country. During the survey period (December 2018 to February 2019), and following a consecutive sampling technique, all 1097 program directors were invited. An online link for the survey was sent to all program directors by email, but only 464 replied. Another 252 program directors were contacted and interviewed through a computer-assisted telephone interview (CATI) by SCFHS personnel. The total number of program directors who responded was 716 (65.2%).

3.5. Trainees Survey

The survey questionnaires for residents were developed by the PGMT Quality Indicators Committee (QIC) to produce an error-free measure of care quality. Measures should be based on characteristics of best practice such as validity, reliability, and transparency. The questionnaire was created on 25 September 2018, published on 1 October of the same year, and closed on 4 August 2019. It consisted of six sections. Eight questions covered the demographic section; the second section had 14 questions about the trainees’ educational activities. The third section had three questions regarding program satisfaction/the training program. Section four had eight questions regarding perception and personal experience. The fifth and sixth sections had three and five questions on research participation and satisfaction with the SCFHS, respectively.
There were a total of 13,688 residents working in different specialties throughout Saudi Arabia; only 3696 (27%) of the residents agreed to participate in the online survey. The trainers were left out of the survey due to time constraints.
A total of 41 questions, representing indicators of the training programs’ quality, were validated by expert and QIC panels for clarity and content relevance. The KPI working group is responsible for collecting and analyzing data each year and making sure that the indicators remain precise and appropriate—unlike Toussaint et al.’s suggestion that the report should be on a quarterly basis [12].

3.6. Data Analysis

Data were analyzed with the statistical program SPSS, version 24. A test was considered significant if the p-value was < 0.05. All data were analyzed based on each KPI’s parameters and targets.

3.6.1. Data Management

After the data were collected and entered into Microsoft Excel, respondents’ data were rechecked for blank/empty or any typo errors, and then the file was exported to SPSS format for further management and analysis.
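The cleaning step described above can be sketched in Python with pandas; in practice the raw file would be loaded with `pd.read_excel`, but here an inline frame with invented values stands in for the Excel export:

```python
import pandas as pd

# In practice: raw = pd.read_excel("responses.xlsx")  (file name is assumed)
raw = pd.DataFrame({
    "gender": [" male", "FEMALE", None, "Male "],
    "age": [44, 38, None, 51],
})

clean = raw.dropna(how="all")             # drop fully blank rows
clean = clean.dropna(subset=["gender"])   # drop rows missing key fields
# Normalize obvious typos: stray whitespace and inconsistent casing
clean["gender"] = clean["gender"].str.strip().str.capitalize()
# clean.to_csv("responses_clean.csv", index=False)  # hand off for SPSS import
print(clean["gender"].tolist())
```

SPSS imports CSV directly, so exporting the rechecked frame to CSV is one way to realize the Excel-to-SPSS handoff the text describes.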

3.6.2. Descriptive Statistics

For the presentation of descriptive statistics, categorical data (gender, region distribution, qualification, etc.) were calculated as frequencies (n) and percentages (%), while numerical data such as age and years of experience were presented as mean ± standard deviation.
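A minimal sketch of these descriptive summaries with pandas, using invented respondent records (column names and values are illustrative):

```python
import pandas as pd

# Invented respondent records standing in for the survey data
df = pd.DataFrame({
    "gender": ["M", "F", "M", "M", "F"],
    "region": ["Central", "Western", "Central", "Eastern", "Central"],
    "age": [44, 38, 51, 46, 42],
})

# Categorical variables: frequency (n) and percentage (%)
for col in ["gender", "region"]:
    counts = df[col].value_counts()
    pct = (counts / len(df) * 100).round(1)
    print(pd.DataFrame({"n": counts, "%": pct}))

# Numerical variables: mean ± standard deviation
print(f"age: {df['age'].mean():.1f} ± {df['age'].std():.1f}")
```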

3.6.3. Inferential Statistics

A chi-square test was used to assess the association between two categorical variables, such as gender and region distribution.
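A sketch of such a test with `scipy.stats.chi2_contingency`, using an invented 2×2 table of counts (not the study data):

```python
import numpy as np
from scipy.stats import chi2_contingency

# Invented contingency table: rows = gender (male, female),
# columns = nationality (Saudi, non-Saudi)
table = np.array([
    [520, 40],
    [127, 29],
])
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
```

The same call handles larger tables (e.g., gender by region), with the degrees of freedom growing accordingly.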

4. Analysis and Main Findings

In this section, we provide the initial analysis of our data. We provide the statistical analysis for the key aspects of the research tools (surveys) discussed in the previous section. In Section 5, we will summarize the key interpretations and the impact of our research towards personalized medical training and the digital transformation of healthcare in the KSA.

4.1. Demographic Features of Our Respondents

4.1.1. Section One: Program Directors’ Basic Characteristics

There were 716 program directors who completed the questionnaire and were included in the final analysis. The overall response rate of the program directors was 65.3% (716/1097). The respondents were mainly Saudi nationals (647, 92%) and predominantly male (560, 71.7%) (Figure 1); males were more prevalent in the northern and southern regions (Figure 2).
Almost half (320, 44.7%) of the program directors were in the age group between 41 and 50 years (Figure 3 and Figure 4).
The Saudi board was the most common highest qualification among our program directors (58.4%), followed by graduates of different regional programs (15%), such as the Arab, Egyptian, or Syrian boards. Figure 5 describes the distribution of primary qualifications among program directors. The option “other” included boards in Australia, Pakistan, South Africa, etc. In Figure 6, we also provide an overview of the qualification distribution among the different regions.
The largest group of program directors (44%) were recently appointed, with less than two years of experience in the job, while 15% have been in the post for more than six years (see Figure 7). Figure 8 shows the time spent as program director by gender.
Program directors who responded to our survey resided in over 27 cities. Riyadh was the most common city of residence (38.4%), followed by Jeddah (17.9%) (see Figure 9).
The respondents covered around 60 different appointed programs (Figure 10). Program directors from the discipline of medicine and its subspecialties, pediatrics, surgical specialties, dentistry, and OB/GYN accounted for over two-thirds of the respondents (Figure 11).
In Figure 12, Figure 13, Figure 14, Figure 15, Figure 16, Figure 17, Figure 18, Figure 19, Figure 20 and Figure 21, we present additional characteristics and qualitative features of the participants in our quality initiative, including the age of program directors in each discipline (Figure 12) and the health sectors of the included program directors (Figure 13).
Nearly half of the program directors work in the Ministry of Health (49.4%), followed by the Ministry of Defense (16.2%); and only three directors work in the Royal Commission for Jubail and Yanbu.
Nearly half of the program directors hold other administrative jobs at their centers, including head of department or section for 29.9% (see Figure 14 below).
Figure 15 and Figure 16 present further data about our participants. Over 27% of the PDs have other special qualifications, including medical education, hospital administration, public health, clinical epidemiology, and others, but only 8.9% of the program directors hold a degree or certificate in health profession education.

4.1.2. Trainees’ Demographic Data

There were 3696 trainees who participated in this self-administered, structured online survey containing closed and open-ended questions. The response rate of the trainees through the online survey link was 27%. There were 1932 (52.3%) male respondents, and most of the respondents were Saudi nationals (91.9%) (see Figure 17).
The respondents represented all major disciplines; however, most of the trainees were in medical disciplines (90.9%) (see Figure 18, below).
Residents in training represented the most common respondents (94.5%), while fellows were 5.5%. One-third of the residents were working at the R2 level (1270, 34.4%), followed by 23.3% at the R3 level (860) (see Figure 19, below).
Figure 20 shows that more than one-third of the trainees are in the Central Region (40.4%) of Saudi Arabia, followed by approximately one-third in the Western Region (34.5%). A very small number are in the Northern Region (0.4%).
The highest number of trainees work in the city of Riyadh (39.6%), followed by 20.1% in Jeddah; only 0.4% work in the city of Hail (see Figure 21, below).

4.2. Results of Key Performance Indicator (KPI) Domains

Reaction domain of Kirkpatrick
The initial set of KPIs was based on the reaction domain of the Kirkpatrick training evaluation model, which measures the value of the training program among stakeholders, i.e., trainees, trainers, and program directors. There were four KPIs developed based on this domain.

4.2.1. Percentage of Trainees’ Satisfaction (GMH QI 1.1)

This is a score-based KPI, with a target set at 80% satisfaction. It is a cumulative score of eight different domains retrieved from the trainees’ questionnaire: residents’ satisfaction with academic activities in their training centers, with other residents in their programs, with the administrative component of their training, with the program and recommending it to others, with their training center, with their competency, with their specialty, and with SCFHS functions. The calculated score for overall trainees’ satisfaction was 69%; more details are described in Figure 22. The target has therefore not yet been achieved.
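As an illustration, the cumulative score can be computed as the mean of the eight domain-level satisfaction percentages and compared against the target; the domain values below are invented placeholders chosen only so that they average to the reported 69%, not the study’s actual domain scores:

```python
def kpi_score(domain_scores: dict[str, float], target: float = 80.0) -> tuple[float, bool]:
    """Mean of domain-level percentages, plus whether the target was met."""
    score = sum(domain_scores.values()) / len(domain_scores)
    return score, score >= target

# Illustrative placeholders for the eight satisfaction domains (not study data)
domains = {
    "academic_activities": 72.0,
    "fellow_residents": 75.0,
    "administration": 60.0,
    "program_recommendation": 70.0,
    "training_center": 68.0,
    "competency": 74.0,
    "specialty": 71.0,
    "scfhs_functions": 62.0,
}
score, met = kpi_score(domains)
print(f"overall satisfaction = {score:.1f}% (target met: {met})")
```

The same helper applies unchanged to the four-domain PD satisfaction score in GMH QI 1.3.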

4.2.2. Percentage of Trainers’ Satisfaction (GMH QI 1.2)

This KPI is defined as the total number of satisfied trainers divided by the total number of surveyed trainers in that period of time. The target score was set at an 80% satisfaction rate. Unfortunately, these data were unavailable; trainers were not surveyed last year.

4.2.3. Percentage of Program Directors’ (PDs’) Satisfaction (GMH QI 1.3)

This is a score-based KPI, with a target set at 80% satisfaction. This cumulative score contains four main satisfaction domains: satisfaction with training faculty; satisfaction with the evaluation process; satisfaction with educational components; and satisfaction with the training center’s governance, education environment, and resources. The overall score for PDs’ satisfaction was calculated at 76%, slightly below our target value of 80%; this is explained in detail in Figure 23.

4.2.4. Percentage of Trainees’ Burnout (GMH QI 1.4)

This KPI was measured through the trainees’ questionnaire and defined as the number of trainees who feel burnout [38,39,40,41,42,43,44,45,46,47,48,49] divided by the total number of surveyed trainees in that period of time. The target value was set at a level equal to/below 10%. The calculated score was 66.7% and the breakdown score is demonstrated in Figure 24, Figure 25 and Figure 26.
Learning domain of Kirkpatrick
The second set of KPIs focused on the learning domain, which concerns the capability of developing new knowledge, skills, and attitudes; it contains 13 different KPIs.

4.2.5. Percentage of PDs Who Attended PD Training Course Offered by the SCFHS (GMH QI 2.1)

This KPI is defined as the number of program directors who attended the course divided by the total number of program directors in the same period. No target value was set for this KPI. Overall, 38.4% of program directors attended such a course, an improvement over last year’s figure of 15% (see Figure 27 and Figure 28, below). The details are described in the following figures.

4.2.6. Percentage of Trainers in PGMT Programs Who Successfully Completed SCFHS Training (GMH QI 2.2)

This KPI is defined as the total number of trainers certified by the SCFHS in 2018. The data source for this KPI was the Training Department Database. The target score was set at 200. In 2018, there were 276 trainers who completed the SCFHS training course. This KPI achieved its target and is significantly higher than last year, 2017, when the score was 66% of the target.

4.2.7. Percentage of Surveyors Who Have Successfully Completed SCFHS’s Accreditation Training (GMH QI 2.3)

This KPI is defined as the total number of certified surveyors by SCFHS divided by the total number of surveyors. The target was set at a level of 70%. The calculated score for this year was 67%, as per the feedback from the Accreditation Department Database.

4.2.8. Percentage of Trainees’ Compliance with Minimal Procedure, Case Exposure Policies Required Competency Index (GMH QI 2.4)

This KPI’s target value was set at 90%; however, we have not yet identified the data source to be used, and the KPI has not yet been measured. We expect these data to become available through the One45 program by 2020 and will work towards achieving that.

4.2.9. Percentage of Trainees Who Have Received Trainees’ Evaluation by Program in a Specific Period (GMH QI 2.5)

This KPI is defined as the total number of trainees who received their evaluation within two weeks of the end of every rotation (see Figure 29, below). The target was set at 100% compliance. Ideally, this score would be collected from the trainees but, unfortunately, it was not included in their survey; therefore, the only available data were in the program directors’ survey. The mean compliance score was estimated at 74.2%. This compliance rate was nearly identical in each region, with p = 0.9 (ANOVA test).
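The regional comparison reported here (p = 0.9) is a one-way ANOVA across per-region compliance responses. The sketch below inlines the F-statistic arithmetic on made-up data; the real analysis used the program directors’ survey responses, and the p-value would then be read from the F distribution (e.g., via `scipy.stats.f_oneway`):

```python
def one_way_anova_f(*groups):
    """One-way ANOVA F statistic: between-group variance over within-group variance."""
    all_values = [x for g in groups for x in g]
    grand_mean = sum(all_values) / len(all_values)
    # Variation of the group means around the grand mean.
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # Variation of individual responses around their own group mean.
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    df_between = len(groups) - 1
    df_within = len(all_values) - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)

# Hypothetical per-region responses (1 = evaluation received within two weeks, 0 = not);
# the region names and values are invented for illustration.
central = [1, 1, 0, 1, 1, 0, 1, 1]
western = [1, 0, 1, 1, 0, 1, 1, 0]
eastern = [1, 1, 1, 0, 1, 0, 1, 1]
f_stat = one_way_anova_f(central, western, eastern)
# A small F (large p, as in the study's p = 0.9) means no evidence of regional differences.
```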

4.2.10. Percentage of Research Included in Curriculum (GMH QI 2.6)

This KPI is defined as the number of programs that include research-related content. The target value was set at 70%. The source of information for this KPI is the Training Department Database but, unfortunately, these data are not yet available. Research participation was documented in 33.1% of the trainees’ surveys.

4.2.11. Percentage of Programs with Burnout Policy (GMH QI 2.7)

This KPI’s target value was set at a 100% compliance rate. The source of information was the program directors’ survey, which put compliance with such a policy at 69%. In the future, this KPI should also rely on accreditation data.

4.2.12. Percentage of Compliance with Implementing Incorporated e-log System in Each Program (GMH QI 2.8)

This KPI is defined as the number of programs that are implementing an incorporated e-log system divided by the number of all programs. The target value was set at 30%. The source of such information is the Training Department Database. Unfortunately, the score is not available, as the data were not yet received.

4.2.13. Percentage of Trainees Who Fulfilled Their Promotion Criteria (GMH QI 2.9)

This KPI target was set at 70%, and the Training Department Database is the main source of this information. Unfortunately, the score is not available, as the data were not yet received.

4.2.14. Percentage of Trainees Who Passed the Board Exam (GMH QI 2.10)

This KPI target was set at a 70% pass rate in the board exam, and the Training Department Database (and Assessment Department Database) is the main source of this information. The calculated score for this KPI in 2018 was 79%.

4.2.15. Percentage of Programs That Incorporated Simulation in Their Curricula (GMH QI 2.11)

The target value of this KPI was set at 50% or more. At present, the sources of this information were the program directors’ survey (Q 3.1) and the trainees’ survey (Q10). The score in the program directors’ survey was only 3%, while the score in the trainees’ survey was 34.2%. The cumulative mean score was only 18.6%.

4.2.16. Percentage of Programs with Trainees Receiving Annual Master Rotation Plan (GMH QI 2.12)

This KPI is defined as the number of trainees receiving their annual master training plan before 1 October. The target value of this KPI was set above 90%. The source of this information was the program directors’ survey. Around 73% of program directors provide the annual master rotation plan to the residents in their center early in the academic year, as per the program directors’ feedback (see Figure 30 and Figure 31, below).

4.2.17. Percentage of Programs in Compliance with the Annual Master Plan (GMH QI 2.13)

This KPI is defined as the number of trainees who completed their block rotations as planned. The target value of this KPI was set above 90%. The source of this information was the program directors’ survey. The mean score of adherence to the master rotation plan set early in the academic year was 76.9%; this rate was similar across genders, types of training center, and regions (see Figure 32, below).
Training domain of Kirkpatrick
The final set of KPIs focuses on the training governance domain and contains six different KPIs.

4.2.18. Percentage of Programs with Complete Goals and Objectives for Residency Programs (GMH QI3.1)

This is defined as the total number of programs with a clear statement outlining the goals and educational objectives of their residency program. The target value for this KPI was set at 80%. Clear goals and objectives for every rotation were provided to 87.7% of residents (see Figure 33, below). This was similar in both genders and all regions. However, program directors who had received an orientation about the role and responsibilities of the job were significantly better at providing the residents with those objectives in every rotation.

4.2.19. Percentage of Completed Trainer Evaluations by Trainee per Program (GMH QI 3.2)

The target score for this KPI was set at 70%. The trainees had a chance to evaluate the training faculty through a structured process in 48.7% of cases (see Figure 34, Figure 35 and Figure 36, below). This did not differ significantly between genders, regions, or types of program (independent or joint). However, a significant difference was found when the program director spent 2–3 h managing the training activity in comparison with other time frames.

4.2.20. Percentage of Adherence to Accreditation Requirements (GMH QI 3.3)

This is defined as the rate of programs maintaining accreditation standards during the validity period. The target value of this KPI was set at over 90% compliance. The source of information is the Accreditation Department Database.

4.2.21. Percentage of PD Turnover Rate (GMH QI 3.4)

This is the number of program directors who did not complete their full term. The target was set below 10%. Unfortunately, these data are not available; the source of this information should be the Training Department Database.

4.2.22. Percentage of Accreditation Compliance Score (GMH QI 3.5)

This is the number of centers that met accreditation standards. The target value of this KPI is to be determined.

4.2.23. Percentage of Violations with the Matching Regulations (GMH QI 3.6)

This is the total number of matching violations in a given year. The target value was set below 1%. The source of this information is the Admission and Registration Database. The rate of matching violation for 2018 was 4%.
In Table 4, we provide a summary of the Key Performance Indicators measurement.

5. Key Findings Related to the Research Objectives

Our research work contributes at the intersection of the knowledge management, quality assurance, and medical education domains, while providing direct input to applied data science and sustainability. The overall idea that a well-defined, accurate, measurable, and reliable set of key performance indicators can govern a holistic quality assurance initiative for the provision of high-quality medical training is a bold knowledge management case. It also provides an operational, functional sustainability framework for medical education, supporting short- and long-term analysis of performance. Our work is likewise significant for the computer science and data analytics domains: the technological infrastructure for collecting all the data required for measurement, together with the standardization of these KPIs, serves as a transparent managerial method for measuring efficiency over time. The SCFHS is moving forward with a holistic data science strategy for connecting several knowledge-intensive personalized medical services to its users. In this section, we provide the key interpretations of the survey results presented in the previous sections. We elaborate further on these findings in Section 6, where we contribute to the theory of sustainability in the healthcare domain by interconnecting the main findings with our strategic proposition for enhanced decision making. In Table 5, below, we present a high-level abstraction of the key findings related to the three significant research objectives of our study.

5.1. Recommendations of the Program Directors and Trainees in Improving the Quality of Training

Burnout
  • The burnout rate in trainees is multifaceted but most of the residents agree that lack of staff at the training center is the main reason. Training centers need to ensure the proper distribution of workload among residents with appropriate staff coverage.
  • The resident dropout rate should be monitored closely for each program and should be investigated in each training center; the reasons should be clarified and addressed with appropriate actions for each reason.
  • The residents’ annual vacation is their right, and the training center should protect their ability to choose the time that suits them best, not the program.
  • Trainees suggest that one day per month should be offered as an admin leave to every resident, mandatorily (note that month and date should be determined by the resident only).
  • Under no circumstances should the residents cover the service without appropriate supervision, including ER rotation, and they should not be prevented from attending the academic activities.
Key Requirements for a Sustainable Medical Education Framework:
  • A unified ecosystem of smart medical educational content and distance learning facilities can significantly reduce the burnout rate. More flexible procedures and cloud training services can be a bold action for improving the burnout rate. This was one of the most critical findings; the SCFHS takes it seriously into consideration, and various initiatives are planned, closely related to the digital transformation under Vision 2030.
  • The analysis of the dropout rate and critical reasons must be monitored through a transparent web service capable of monitoring responsive actions. In an integrated service, advanced notification and counseling procedures can support residents’ training and commitment to participate and learn.
  • Time restrictions related to annual vacations or obligations must be supported and training must always be a priority as a long-term capability that enhances the full competence of the Saudi healthcare system. Towards this direction, a new hybrid training platform and strategy must be implemented. Human factors are a critical aspect of a Sustainable Medical Education Framework.
  • Rewards for training efficiency and performance must be introduced for residents providing a transparent mechanism for facilitating training without dropouts. This reward system can be part of a holistic data and services ecosystem in the SCFHS.
Working hours
6. Given that residents struggle to find time to read during working hours, centers have to make sure that the educational environment is suitable for residents’ educational needs by enforcing academic half-days on a weekly basis and ensuring the availability of rooms, reading areas, and resources such as libraries with updated textbooks.
7. Residents struggle with long working hours even after their on-call times; therefore, centers have to ensure that post-call time off is strictly implemented, as per SCFHS policy.
8. Professional behavior of seniors, including consultants, is of paramount importance for educational gain.
9. Centers need to conduct continuous medical educational activities on professionalism, especially in dealing with junior staff, to enhance proper communication and professional behavior.
Key Requirements for a Sustainable Medical Education Framework:
  • A blended virtual and physical space for reading, study, and research is also required. Within such a virtual medical educational space, seniors support juniors and also advance knowledge management capabilities, codify know-how, experiences, lessons learned, and similar activities.
  • Time and space reservation for sophisticated research activities is also a critical pillar in a sustainable medical education framework.
  • Advanced professional social networking of medical experts is also required, and thus, SCFHS is planning a sophisticated social human network of medical experts and professionals in the KSA aiming to enhance teamwork and collective intelligence management.
Simulation
10. Many centers lack a simulation and skills lab, which is nowadays an imperative tool in training; therefore, centers should invest in preparing such facilities, with proper support, to enhance educational opportunities, or should formally collaborate with an accredited center.
11. Simulation activity should be incorporated in the curriculum; if the training center does not provide such a facility, a collaborative effort with regional accredited simulation centers should be established, and proof of evidence for such activity should be recorded.
12. Simulation training should be offered to all trainees, covering all essential procedures, which need to be kept in a resident portfolio.
Key Requirements for a Sustainable Medical Education Framework:
  • The new SCFHS data and services ecosystem requires an integrated open simulation lab, partially implemented in the cloud, as well as an enhanced virtual and augmented reality medical lab. Towards digital transformation, a Saudi Agora of Virtual and Augmented Reality Medical Lab and Content Laboratory can be implemented. This requires, of course, significant resources, but it will unlock untapped capabilities of residents.
  • In a parallel effort and initiative, the SCFHS should lead the adoption of virtual and augmented reality content for medical training and labs. This should be a continuous, ongoing process towards excellence.
Training
13. Candidates for a program director post should not be appointed unless they have attended a comprehensive orientation on how to perform the job efficiently before starting their duties.
14. Most trainees are not aware of the availability of courses on training; thus, the SCFHS needs to advertise them better.
15. External rotations should be offered to all residents twice a year; it is up to the residents to choose which rotation they will take, and they should not be pressured to give up their right to choose the elective rotation.
16. The training center has to ensure the availability of diverse specialties to enrich the clinical encounters and make sure that residents are trained by highly qualified physicians.
17. The institutional leadership should support the educational process by all means necessary, promptly listen to the trainees’ complaints about their needs, and address their concerns.
Key Requirements for a Sustainable Medical Education Framework:
  • An integrated Training Excellence approach is required with various constitutional parts including active learning medical practice, mentoring, etc. It is also necessary to support continuous awareness campaigns for the excellent training and research programs of the SCFHS.
  • Institutional leadership is also an integral part of the Sustainable Medical Education Framework, with dynamic channels for direct communication and accurate information for trainees. It also provides all the means for the realization of a fully efficient educational medical environment.
On-call duties
18. The on-call duty system needs modification from the current long-standing practice; on-call duty should be in shifts (the 24 h on-call duty should be terminated).
19. On-call shifts should not exceed eight per month, and post-call time off should be implemented immediately after the handover and monitored by the training center.
Key Requirements for a Sustainable Medical Education Framework:
  • Shifting on-call duties is a key action for the enhancement of the efficiency in medical education and research. Within an integrated data and services ecosystem, on-call duties can be also supported by electronic services.
Research
20. Research support should be made available by the training center, the Saudi commission should incorporate a research curriculum into every training program, and a research portal should be made available to all training centers with weak research capability.
21. Research facilities vary from one center to another, so the Saudi commission should make sure that all trainees have access to minimum research support tools in order to standardize the level of facilities and enhance the quality of outcomes. An e-portfolio of each resident’s research should be maintained by the Saudi commission and updated periodically by the residents.
Key Requirements for a Sustainable Medical Education Framework:
  • Research integration into training must be an integral part of any medical training program.
  • A unified research skills lab and research cloud service must be implemented for providing ubiquitous and effective research training to residents. Sophisticated services and access to medical and other scientific libraries, as well as the cultivation of best practices and standardization of the research process, should be promoted. Finally, collective research towards higher accomplishment must promote team research, increasing the visibility and impact of Saudi medical research.
  • An advanced publication program should allow residents to execute their research skills towards publications with impact. In the long term, we will have high skilled residents with top publication and research records promoting the objectives of Vision 2030.
Counseling
22. Residents should receive regular counseling sessions (at a minimum, quarterly) to address any concerns.
23. Centers should treat the needs of trainees as a high priority without hindering the quality of patient care.
Key Requirements for a Sustainable Medical Education Framework:
  • Counseling should be offered to residents in a constant and systematic way exploiting also the data and services ecosystem of the SCFHS.

5.2. Recommendation of the Training Quality Department

Our research has several implications for the quality assurance of medical training programs and for the relevant training departments. A novel integrated approach is required to enhance the capabilities and efficiency of the training quality department. This will be taken into serious consideration for future initiatives and strategies in the SCFHS.
The main implication relates to the need to design a fully functional data and services ecosystem for enhanced efficiency in medical training programs and decision making. In the previous section, we highlighted some key aspects of improvement and efficiency parameters for residents in medical training programs. In this section, we communicate some key requirements for this new ecosystem in the SCFHS in support of the Vision 2030 digital transformation.
The following items provide the key findings and the main implications of our research study related to the training quality department of the SCFHS.

Novel Data and Services Ecosystem in SCFHS

  • Data quality was a major challenge of this project: availability, completeness, and accuracy were the most worrisome. In fact, some of the KPI information has yet to be retrieved. The collection of data related to the KPIs and quality assurance indicates the need for an integrated data strategy in the SCFHS. The establishment of big data flows for continuous management monitoring and support of the medical education process is a key requirement. In addition, the suggested services proposed in the previous section for residents require a new, sophisticated analytics upper level.
  • Transparency, collaboration, and effective communication are a cornerstone in achieving the objective of the assigned committee and its charges including conducting this project and preparing the report. The provision of a systematic, manageable workflow within the data and services ecosystem of the SCFHS is also another critical development.
  • Some KPIs need further revisiting to re-define them, or even modify or define certain measures. This research study focused on 23 KPIs based on Kirkpatrick’s model, but further enhancement can be developed. Emphasis can be based on active learning, team development, research skills and capability, collective intelligence, social impact, etc. A key interpretation for this requirement is related to a continuous improvement process on KPIs for teaching, training, and research excellence. This is a core component of any Sustainability Framework for Medical Education.
  • The developed questionnaire was an eye opener toward a better-constructed questionnaire with pre-determined comprehensive domains in order to accurately measure what it was made for, covering all aspects of training quality. In a greater context, the establishment of two-way data and communication channels in various training and research initiatives in SCFHS must be a key target. The continuous provision of real-time data can serve as a basis for numerous personalized medical training and research services.
  • A quality rating system should utilize a unified scoring system for the satisfaction level across all surveys and perform reliability analysis to check for internal consistency. A transparent, multi-component quality rating system, like the one we tested in this research through 23 KPIs, is a bold initiative towards the objective measurement of quality. In the coming years, it will be necessary to design and implement a cloud-based rating system capable of delivering real-time variations and updates on the quality and satisfaction perceptions of participants in training and research conducted by the SCFHS.
  • Setting the target value, i.e., benchmark, was somewhat arbitrary. Using an objective measure as much as possible would be ideal, however, any subjective measure selected should be agreed upon by the TQC, with acceptable reasoning for this value. In this direction, relevant research in analytics and KPIs and international benchmarks should be also exploited. For a Sustainability Framework for Medical Education, this quality benchmark can be developed through community collaboration and international associations’ intervention. The SCFHS can be a leader in this area by also exploiting international collaborations.
  • We should utilize a multi-source feedback system to retrieve the required information; this might enhance the validity of the information and reduce bias. Securing accurate, transparent, trusted feedback allows for enhanced decision making and also integrates knowledge creation processes. The integrated data and services ecosystem has to secure this kind of feedback as a long-term commitment and trust agreement between the stakeholders in the SCFHS ecosystem.
  • Data integration between all departments or even within the same department is mandatory. Thus, the new data and services ecosystem of the SCFHS will offer this integration as a value-adding service for enhanced decision making.
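The reliability analysis mentioned above for checking internal consistency is typically computed as Cronbach’s alpha over the survey items. The following self-contained sketch uses synthetic 5-point responses; the item values are invented for illustration, and the actual surveys would supply the real response matrix:

```python
def cronbach_alpha(item_scores):
    """Cronbach's alpha for a list of items, each a list of respondent scores.

    alpha = (k / (k - 1)) * (1 - sum(item variances) / variance of total scores)
    """
    k = len(item_scores)
    n = len(item_scores[0])

    def variance(xs):
        """Sample variance (n - 1 denominator)."""
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    # Each respondent's total across all items.
    totals = [sum(item[r] for item in item_scores) for r in range(n)]
    item_var_sum = sum(variance(item) for item in item_scores)
    return (k / (k - 1)) * (1 - item_var_sum / variance(totals))

# Synthetic 5-point satisfaction responses for three survey items, six respondents.
items = [
    [4, 5, 3, 4, 4, 5],
    [4, 4, 3, 5, 4, 5],
    [5, 4, 2, 4, 4, 5],
]
alpha = cronbach_alpha(items)  # values above ~0.7 are conventionally considered acceptable
```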

6. The SCFHS Sustainable Medical Education Framework

Our research study based on the implementation of key performance indicators to benchmark, maintain, and improve the quality of postgraduate medical training provides interesting insights for the integration of knowledge management, medical education, and big data analytics research for sustainability. As an effort to codify the key findings and to generalize the implications, we present, in Figure 37, a proposed Sustainable Medical Education Framework. This framework can be used as a road map as well as a policy making document for the digital transformation of the SCFHS.
The basic assumption of our proposed Sustainable Medical Education Framework is the integration of four integral pillars of efficiency.
Sophisticated big data ecosystem: At the first layer, a sophisticated big data ecosystem provides a sustainable infrastructure for the aggregation and analysis of diverse data required for medical training and education as well as for advanced decision making in the context of the SCFHS and its stakeholders.
Data quality and governance allow transparency, collaboration, and effective communication for the optimization of medical training. Special services are also facilitating data integration and flow for the provision of value-adding services. Multi-source feedback for all the training programs in all the phases of the relevant workflow also permits flexible communication. A unified big data framework exploits a sophisticated KPI-driven process for measuring and improving performance in parallel with the execution of medical training.
A big data ecosystem requires:
  • Data quality;
  • Transparency, collaboration, and effective communication;
  • Data integration and flow;
  • Multi-source feedback; and
  • A big data unified framework.
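One way to operationalize the unified, KPI-driven framework listed above is an explicit data model for KPI definitions and their multi-source measurements. The sketch below is entirely our own naming assumption; no such types exist in SCFHS systems today:

```python
from dataclasses import dataclass, field

@dataclass
class KPIDefinition:
    """A KPI as defined in this study: a code, a target, and its data sources."""
    code: str                  # e.g., "GMH QI 1.4"
    name: str
    target: float              # target value, as a percentage
    lower_is_better: bool = False
    sources: list = field(default_factory=list)  # surveys, department databases, etc.

@dataclass
class KPIMeasurement:
    """One measured value for a KPI from one feedback channel in one period."""
    kpi_code: str
    period: str                # e.g., "2018"
    source: str                # which feedback channel produced the value
    value: float

def consolidated_score(measurements):
    """Average a KPI's value across feedback sources (multi-source feedback)."""
    values = [m.value for m in measurements]
    return sum(values) / len(values)

# Illustrative instances mirroring the burnout KPI reported earlier.
burnout = KPIDefinition("GMH QI 1.4", "Trainees' burnout", target=10.0,
                        lower_is_better=True, sources=["trainees' survey"])
readings = [KPIMeasurement("GMH QI 1.4", "2018", "trainees' survey", 66.7)]
```

Making the source of each measurement an explicit field is what allows the multi-source feedback and data integration requirements above to be audited rather than assumed.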
Quality and performance management: The second pillar of the Sustainable Medical Education Framework is an integral quality and performance management component. The definition, implementation, execution, and monitoring of diverse, multi-faceted KPIs provide a sophisticated quality assurance and monitoring component. The availability of a quality rating system, together with the exploitation of quality benchmarks, provides a transparent methodological way towards enhanced quality and performance management. The execution of strategies for quality assurance, as presented in this research, operates as an analytical meta-level for performance management. The set of KPIs that was designed and implemented offers an initial testbed for further execution. In future development, further additions can be made beyond Kirkpatrick’s framework; emphasis on Bloom’s taxonomy of educational goals or sophisticated learning analytics for active learning are just some possible directions.
The SCFHS quality and performance management should include
  • KPI management,
  • A quality rating system, and
  • Quality benchmarks.
Enhanced decision-making capability: The orchestration of the previous two layers, namely the big data ecosystem and quality and performance management, interconnects with the next capability in the Sustainable Medical Education Framework. Performance evaluation, monitoring, and control appear as transparent, ubiquitous processes that offer real-time knowledge and critical data for significant decisions in the SCFHS organization and, more broadly, in relevant organizations worldwide. This pillar also cultivates a continuous improvement culture towards innovation and medical education impact. In the context of innovation, significant effort is invested in integrating emerging technologies and modern healthcare management into the medical education process. The provision of new services, e.g., a sophisticated Social Network of Medical Experts for personalized medical education, must be a priority. Furthermore, decision making must be a continuous driver of enhancements and improvements towards measurable educational impact.
Enhanced decision making improves
  • Performance,
  • Monitoring and control,
  • Innovation,
  • Resource utilization, and
  • Education impact.
Sustainability in medical education: The final component in the Sustainable Medical Education Framework is the sustainability layer. All the investments and the initiatives presented in this research paper must be considered as developmental efforts that interconnect various performance capabilities in a long-term, viable, sustainable educational strategy for the healthcare domain.
Within the sustainability strategy, various actions could be critical milestones. Continuous, sustainable investment in the research competence of SCFHS members, together with a parallel integration of the healthcare industry with academia and medical education institutions, can lead to a unique, innovative, sustainable Medical Education and Innovation Startup Ecosystem. Through this Saudi Medical Innovation Startup Ecosystem, the vision of digital transformation of the healthcare sector can be achieved more easily. In future research, we plan to operationalize the requirements for the Medical Innovation Startup Ecosystem and also specify some key projects in the context of the SCFHS. One of the key priorities within this sustainable vision is community development and enhancement through the so-called Saudi Social Network of Medical Experts in the context of the SCFHS.
Sustainability includes
  • Saudi Social Network of the SCFHS,
  • Medical Education and Innovation Startup Ecosystem,
  • Research competence,
  • Social impact,
  • SCFHS Digital Transformation, and
  • Vision 2030
The implications of our study for the training quality department, in the context of the previously presented Sustainable Medical Education Framework, are significant. Specific and general actions towards efficiency improvements were evident in the survey data and are presented in Figure 38 as our proposed Medical Training Quality Framework, which summarizes the constructive interpretation of the key findings of the previous section.
Namely, four collaborative, integrative, value-adding processes enhance the medical training impact as well as the innovation capability and the social impact: training excellence, research excellence, a simulation and virtual reality cloud medical laboratory, and work–life balance. These summarize the socio-technical factors and contribute to greater medical education and training impact. In future research, we plan to investigate the determinants of this model further.
The exploitation of our recommended Medical Training Quality Framework can be performed within the context of the previously presented Sustainable Medical Education Framework. The provision of the data and services ecosystem in the SCFHS orchestrates a holistic management of training, research, and technological resources within a fertile life–work balance education context.
The first pillar is the training excellence strategy: a transparent, organization-wide strategy for the management of medical training and education as a key enabler of Vision 2030 goals in the KSA. The key dimensions of this approach are summarized as follows:
  • Integrated training excellence organization-wide;
  • A unified ecosystem of smart medical educational content and distance learning facilities;
  • A blended virtual and physical space for reading, study, and research;
  • Flexible procedures and cloud training services;
  • Advanced notification and counseling procedures;
  • Awareness campaigns; and
  • Institutional leadership.
The overall idea is that training must be designed, executed, and supported with an integrated strategy for effective educational active learning strategies, learning analytics for quality assurance, and sophisticated socio-technical systems.
The second pillar is related to research excellence. It is a bold requirement for the evolution of the SCFHS towards the achievement of Vision 2030 goals to promote, support, and implement a total quality management of research excellence. The integration of research into medical training programs will require resources, but the impact will be high. Furthermore, the cultivation of a research culture, the systematic development of research skills, and the implementation of a virtual space for research skills development and research execution are additional aspects of the research excellence approach in the SCFHS. The ultimate objective is the enhancement of the visibility and impact of Saudi medical research together with a well-designed publication and awareness program. The research excellence dimension incorporates among others the following services and key capabilities:
  • Research integration into training,
  • A unified research skills lab and research cloud service,
  • Best practices and standardization of the research process,
  • Collective research towards higher accomplishment,
  • Visibility and impact of Saudi medical research, and
  • An advanced publication program.
The third dimension relates to an open cloud-based simulation lab with advanced Virtual Reality/Augmented Reality (VR/AR) capabilities. The implementation of this open educational and training space will significantly increase the exposure of SCFHS members and beneficiaries to research, with a significant impact on research capability. Special attention must also be paid to the development of VR/AR medical content for training and its integration into the open cloud-based simulation lab. At a general level, the key dimensions of the simulation lab are summarized as follows:
  • Integrated open simulation lab;
  • Saudi Agora of Virtual and Augmented Reality Medical Lab and Content Laboratory; and
  • Virtual and augmented reality content for medical training and labs.
The last dimension of the Medical Training Quality Framework relates to the human factor (work–life balance). This is a very sensitive area of our training quality strategy. The design of rewards for training efficiency and performance, together with time and space reserved for sophisticated research activities, is necessary for higher satisfaction rates and training impact. Counseling and a professional social network for Saudi medical experts promote a feeling of belonging to a greater community of achievers and contributors to the social good of healthcare in the KSA. The following is a partial list of considerations for work–life balance (human factors):
  • Rewards for training efficiency and performance,
  • Time and space reservation for sophisticated research activities,
  • Advanced professional social networking of medical experts, and
  • Counseling.
Our research contributes in multiple ways to the theoretical knowledge on the integration of sustainability with medical training and education. The proposed SCFHS Sustainable Medical Education Framework is a novel theoretical abstraction that facilitates management and innovation in medical training and education. From a theoretical point of view, it organizes and integrates multidisciplinary contributions from the epistemology of medical education and information systems research, and it constructs a managerial framework that focuses on the sustainability aspect of the phenomenon.
This is also among the first studies worldwide to connect medical education and training with sustainability. We propose a concrete sustainability layer in our framework with emphasis on the following pillars:
  • Integration of a social network of medical experts, enabling continuous input and feedback from the community of the SCFHS;
  • Establishment of a Medical Education and Innovation Startup Ecosystem capable of bringing to life and launching smart services and applications for domains of special interest for the evolution of medical research and education;
  • The development of a continuous research competence building process integrated into the sustainability framework; and
  • The direct integration of medical education and training excellence to the digital transformation vision. The quality initiative described in this research study is a first proof of concept approach for the capacity of multidisciplinary approaches to promote this vision to functional managerial and training practices.
A further connection of our research study to sustainability relates to the impact of the key findings. It is our strategic decision to enhance all medical training programs based on the findings and related recommendations of this study. This is a bold move towards sustainable medical education. The design of transparent and participatory managerial procedures will enhance the quality of the value proposition of the SCFHS to its members and to the Saudi economy and society.
Our focus on KPIs in this research study is intentional. It establishes a managerial framework for sustainable medical training through a reliable, trusted, and transparent process in which the SCFHS community is active. The interpretation of this approach from sustainability's point of view is also significant. We promote sustainability in medical training by connecting the beneficiaries and stakeholders of the SCFHS through the training and quality assurance process. We emphasize human capital, and we set up an integrated approach for measuring diverse aspects of quality and performance. The proposed framework and the quality initiative are bold responses to the need of Saudi healthcare for a continuous improvement approach towards innovation and training excellence. The preparation of highly qualified health specialists is a key aspect of sustainability in healthcare.
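To make the measurement idea concrete, the sketch below (Python) shows one minimal way a satisfaction-style KPI could be operationalized as the percentage of respondents at or above a cut-off. The 1–5 Likert scale, the cut-off of 4, and the sample responses are illustrative assumptions, not the SCFHS's actual instrument or data.

```python
# Hypothetical sketch: a satisfaction-style KPI expressed as the percentage
# of survey respondents scoring at or above a cut-off on a 1-5 Likert scale.
# The scale and the cut-off (4 = "satisfied") are illustrative assumptions.

def satisfaction_kpi(responses, cutoff=4):
    """Return the percentage (0-100) of responses at or above the cut-off."""
    if not responses:
        raise ValueError("no survey responses provided")
    satisfied = sum(1 for score in responses if score >= cutoff)
    return 100.0 * satisfied / len(responses)

# Illustrative (fabricated) trainee responses, not study data:
trainee_scores = [5, 4, 3, 4, 2, 5, 4, 3]
print(round(satisfaction_kpi(trainee_scores), 1))  # 62.5
```

The same pattern applies to the other percentage-based indicators: only the response source and the cut-off convention change.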

7. Conclusions and Future Research

This research study promotes interdisciplinary research with a significant social impact. We discussed the implementation of key performance indicators to benchmark, maintain, and improve the quality of postgraduate medical training. We analyzed the details, the scientific background, and the implementation of the Training Quality Assurance Initiative by the Saudi Commission for Health Specialties as a joint knowledge management, big data analytics, and sustainability endeavor. The first significant contribution of this research is the introduction of a full set of measurable, trusted, and reliable KPIs for efficient medical training. In summary, 23 KPIs were used to measure the efficiency of medical training across three dimensions, namely, training governance, learning, and reaction. In the near future, we plan to extend this functional set of KPIs with additional indicators related to active learning strategies and community engagement in the context of a newly developed Social Network of Medical Experts. Another direction for future enhancement of this analytics approach is to place additional emphasis on research and social impact.
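The roll-up of individual KPIs into the three dimensions named above can be sketched as follows (Python). The grouping of codes, the sample values, and the simple-mean aggregation rule are illustrative assumptions, not the study's actual computation; the code "GMH QI 3.1" is hypothetical.

```python
# Hypothetical sketch: rolling individual KPI measurements (percentages) up
# into the three dimensions used in the study: training governance, learning,
# and reaction. Sample values and the simple-mean roll-up are assumptions.
from collections import defaultdict
from statistics import mean

kpi_values = {
    ("reaction", "GMH QI 1.1"): 78.0,    # trainees' satisfaction (illustrative)
    ("reaction", "GMH QI 1.4"): 41.0,    # trainees' burnout (illustrative)
    ("learning", "GMH QI 2.1"): 65.0,    # PDs who attended training (illustrative)
    ("governance", "GMH QI 3.1"): 88.0,  # hypothetical governance KPI code
}

def dimension_summary(values):
    """Mean KPI value per dimension, rounded to one decimal place."""
    grouped = defaultdict(list)
    for (dimension, _code), value in values.items():
        grouped[dimension].append(value)
    return {dim: round(mean(vals), 1) for dim, vals in grouped.items()}

print(dimension_summary(kpi_values))
# {'reaction': 59.5, 'learning': 65.0, 'governance': 88.0}
```

A production system would likely weight indicators differently per dimension; the simple mean here is only to show the shape of the aggregation.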
Another significant dimension of our research study was the synthesis of the key findings of this Training Quality Assurance Initiative towards the generalization of findings for future exploitation and the communication of best practices. As a result, we introduced the Sustainable Medical Education Framework, with four integrative components, namely, the big data and analytics ecosystem, SCFHS quality and performance management, enhanced decision-making capability, and sustainability. This framework can serve as a sophisticated decision-making roadmap for the promotion of social impact and sustainability in the context of medical training and education. The research also initiates a multidisciplinary debate on the role of human factors [50,51,52,53], knowledge management, KPI research, and sustainable data science for medical education and the digital transformation of healthcare. We also introduced the SCFHS Training Quality Approach and the related Medical Training Quality Framework as key enablers of personalized medical training services for individuals and the community in the KSA.
In the next developmental phases, we look forward to the integration of social networks for medical expertise sharing and community building [54] as a strategic enabler of leadership and innovation [55]. Our special interest in the resilience and wellbeing of healthcare professions trainees and healthcare professionals during public health crises [56] challenges knowledge creation [57] and the design of new, timely medical training programs capable of promoting the vision of patient-centric healthcare [58].
Our research contributes to the knowledge management, KPI research, sustainable data science, and sustainability literature with empirical evidence from a large-scale survey on postgraduate medical training in the KSA. The overall contribution to quality assurance in medical training will be analyzed further in the near future through the integration of more KPIs for active learning, research enhancement, and social impact.

Author Contributions

The authors contributed equally to this research work and were involved in an integrated way in all stages of this research, including conceptualization, methodology, software, validation, formal analysis, investigation, resources, data curation, writing—original draft preparation, writing—review and editing, visualization, supervision, project administration, and funding acquisition. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Khoja, T.; Rawaf, S.; Qidwai, W.; Rawaf, D.; Nanji, K.; Hamad, A. Health Care in Gulf Cooperation Council Countries: A review of challenges and opportunities. Cureus 2017, 9, e1586.
  2. ten Cate, O.; Scheele, F. Competency-based postgraduate training: Can we bridge the gap between theory and clinical practice? Acad. Med. 2007, 82, 542–547.
  3. Kjaer, N.K.; Kodal, T.; Shaughnessy, A.F.; Qvesel, D. Introducing competency-based postgraduate medical training: Gains and losses. Int. J. Med. Educ. 2011, 2, 110–115.
  4. Scheele, F.; Teunissen, P.; Van Luijk, S.; Heineman, E.; Fluit, L.; Mulder, H.; Meininger, A.; Wijnen-Meijer, M.; Glas, G.; Sluiter, H.; et al. Introducing competency-based postgraduate medical education in the Netherlands. Med. Teach. 2008, 30, 248–253.
  5. Rees, C.J.; Bevan, R.; Zimmermann-Fraedrich, K.; Rutter, M.D.; Rex, D.; Dekker, E.; Ponchon, T.; Bretthauer, M.; Regula, J.; Saunders, B.; et al. Expert opinions and scientific evidence for colonoscopy key performance indicators. Gut 2016, 65, 2045–2060.
  6. Sandhu, S.S.C. Benchmarking, key performance indicators and maintaining professional standards for cataract surgery in Australia. Exp. Ophthalmol. 2015, 43, 505–507.
  7. Raitt, J.; Hudgell, J.; Knott, H.; Masud, S. Key performance indicators for pre-hospital emergency anaesthesia—A suggested approach for implementation. Scand. J. Trauma. Resusc. Emerg. Med. 2019, 27, 42.
  8. Santana, M.J.; Stelfox, H.T.; Trauma Quality Indicator Consensus Panel. Development and evaluation of evidence-informed quality indicators for adult injury care. Ann. Surg. 2014, 259, 186–192.
  9. Lytras, M.D.; Aljohani, N.R.; Hussain, A.; Luo, J.; Zhang, X.Z. Cognitive Computing Track Chairs’ Welcome & Organization. In Proceedings of the Companion of the Web Conference, Lyon, France, 23–27 April 2018.
  10. Lytras, M.D.; Raghavan, V.; Damiani, E. Big data and data analytics research: From metaphors to value space for collective wisdom in human decision making and smart machines. Int. J. Semant. Web Inf. Syst. 2017, 13, 1–10.
  11. Visvizi, A.; Daniela, L.; Chen, C.W. Beyond the ICT- and sustainability hypes: A case for quality education. Comput. Hum. Behav. 2020.
  12. Alkmanash, E.H.; Jussila, J.J.; Lytras, M.D.; Visvizi, A. Annotation of Smart Cities Twitter Microcontents for Enhanced Citizen’s Engagement. IEEE Access 2019, 7, 116267–116276.
  13. Lytras, M.D.; Mathkour, H.I.; Abdalla, H.; Al-Halabi, W.; Yanez-Marquez, C.; Siqueira, S.W.M. An emerging social- and emerging computing-enabled philosophical paradigm for collaborative learning systems: Toward high effective next generation learning systems for the knowledge society. Comput. Hum. Behav. 2015, 5, 557–561.
  14. Visvizi, A.; Lytras, M.D. Editorial: Policy Making for Smart Cities: Innovation and Social Inclusive Economic Growth for Sustainability. J. Sci. Technol. Policy Mak. 2018, 9, 1–10.
  15. Visvizi, A.; Lytras, M.D. Transitioning to Smart Cities: Mapping Political, Economic, and Social Risks and Threats; Elsevier-US: New York, NY, USA, 2019.
  16. Arnaboldi, M. The Missing Variable in Big Data for Social Sciences: The Decision-Maker. Sustainability 2018, 10, 3415.
  17. Olszak, C.M.; Mach-Król, M. A Conceptual Framework for Assessing an Organization’s Readiness to Adopt Big Data. Sustainability 2018, 10, 3734.
  18. Kent, P.; Kulkarni, R.; Sglavo, U. Finding Big Value in Big Data: Unlocking the Power of High Performance Analytics. In Big Data and Business Analytics; Liebowitz, J., Ed.; CRC Press Taylor & Francis Group, LLC: Boca Raton, FL, USA, 2013; pp. 87–102. ISBN 9781466565784.
  19. Kharrazi, A.; Qin, H.; Zhang, Y. Urban big data and sustainable development goals: Challenges and opportunities. Sustainability 2016, 8, 1293.
  20. Wielki, J. The Opportunities and Challenges Connected with Implementation of the Big Data Concept. In Advances in ICT for Business, Industry and Public Sector; Mach-Król, M., Olszak, C.M., Pełech-Pilichowski, T., Eds.; Springer: Cham, Switzerland, 2015; pp. 171–189. ISBN 978-3-319-11327-2.
  21. Lytras, M.D.; Aljohani, N.R.; Visvizi, A.; De Pablos, P.O.; Gasevic, D. Advanced Decision-Making in Higher Education: Learning Analytics Research and Key Performance Indicators. Behav. Inf. Technol. 2018, 37, 937–940.
  22. Zhang, J.; Zhang, X.; Jiang, S.; de Pablos, P.O.; Sun, Y. Mapping the Study of Learning Analytics in Higher Education. Behav. Inf. Technol. 2018, 37, 1142–1155.
  23. Rajkaran, S.; Mammen, K.J. Identifying Key Performance Indicators for Academic Departments in a Comprehensive University through a Consensus-Based Approach: A South African Case Study. J. Sociol. Soc. Anthropol. 2014, 5, 283–294.
  24. Asif, M.; Awan, M.U.; Khan, M.K.; Ahmad, N. A Model for Total Quality Management in Higher Education. Qual. Quant. 2013, 47, 1883–1904.
  25. Varouchas, E.; Sicilia, M.-A.; Sánchez-Alonso, S. Towards an Integrated Learning Analytics Framework for Quality Perceptions in Higher Education: A 3-Tier Content, Process, Engagement Model for Key Performance Indicators. Behav. Inf. Technol. 2018, 37, 1129–1141.
  26. Varouchas, E.; Sicilia, M.-Á.; Sánchez-Alonso, S. Academics’ Perceptions on Quality in Higher Education Shaping Key Performance Indicators. Sustainability 2018, 10, 4752.
  27. Sanyal, B.C.; Martin, M. Quality assurance and the role of accreditation: An overview. In Higher Education in the World 2007: Accreditation for Quality Assurance: What Is at Stake? Palgrave Macmillan: New York, NY, USA, 2007; pp. 3–23.
  28. McDonald, R.; Van Der Horst, H. Curriculum alignment, globalization, and quality assurance in South African higher education. J. Curric. Stud. 2007, 39, 6.
  29. Deming, W. Improvement of quality and productivity through action by management. Natl. Product. Rev. 2000, 1, 12–22.
  30. Dlačić, J.; Arslanagić, M.; Kadić-Maglajlić, S.; Marković, S.; Raspor, S. Exploring perceived service quality, perceived value, and repurchase intention in higher education using structural equation modelling. Total Qual. Manag. Bus. Excell. 2014, 25, 141–157.
  31. Suryadi, K. Key Performance Indicators Measurement Model Based on Analytic Hierarchy Process and Trend-Comparative Dimension in Higher Education Institution. In Proceedings of the 9th International Symposium on the Analytic Hierarchy Process for Multi-criteria Decision Making (ISAHP), Viña del Mar, Chile, 2–6 August 2007.
  32. Chalmers, D. Teaching and Learning Quality Indicators in Australian Universities. In Proceedings of the Institutional Management in Higher Education (IMHE) Conference, Paris, France, 8–10 September 2008.
  33. Varouchas, E.; Lytras, M.; Sicilia, M.A. Understanding Quality Perceptions in Higher Education: A Systematic Review of Quality Variables and Factors for Learner Centric Curricula Design. In EDULEARN16—8th Annual International Conference on Education and New Learning Technologies; IATED: Barcelona, Spain, 2016; pp. 1029–1035.
  34. Yarime, M.; Tanaka, Y. The Issues and Methodologies in Sustainability Assessment Tools for Higher Education Institutions: A Review of Recent Trends and Future Challenges. J. Educ. Sustain. Dev. 2012, 6, 63–77.
  35. Sheng, Y.; Yu, Q.; Chen, L. A Study on the Process Oriented Evaluation System of Undergraduate Training Programs for Innovation and Entrepreneurship. Creative Educ. 2016, 7, 2330–2337.
  36. Toussaint, N.D.; McMahon, L.P.; Dowling, G.; Söding, J.; Safe, M.; Knight, R.; Fair, K.; Linehan, L.; Walker, R.G.; Power, D.A. Implementation of renal key performance indicators: Promoting improved clinical practice. Nephrology (Carlton) 2015, 20, 184–193.
  37. Pencheon, D. The Good Indicators Guide: Understanding How to Use and Choose Indicators; Association of Public Health Observatories & NHS Institute for Innovation and Improvement: London, UK, 2008. Available online: http://fingertips.phe.org.uk/documents/The%20Good%20Indicators%20Guide.pdf (accessed on 10 September 2020).
  38. Lunsford, L.D.; Kassam, A.; Chang, Y.F. Survey of United States neurosurgical residency program directors. Neurosurgery 2004, 54, 239–245, discussion 245–247.
  39. Ahmadi, M.; Khorrami, F.; Dehnad, A.; Golchin, M.H.; Azad, M.; Rahimi, S. A Survey of Managers’ Access to Key Performance Indicators via HIS: The Case of Iranian Teaching Hospitals. Stud. Health Technol. Inform. 2018, 248, 233–238.
  40. Aggarwal, S.; Kusano, A.S.; Carter, J.N.; Gable, L.; Thomas, C.R., Jr.; Chang, D.T. Stress and Burnout among Residency Program Directors in United States Radiation Oncology Programs. Int. J. Radiat. Oncol. Biol. Phys. 2015, 93, 746–753.
  41. Ishak, W.W.; Lederer, S.; Mandili, C.; Nikravesh, R.; Seligman, L.; Vasa, M.; Ogunyemi, D.; Bernstein, C.A. Burnout during residency training: A literature review. J. Grad. Med. Educ. 2009, 1, 236–242.
  42. Krebs, R.; Ewalds, A.L.; van der Heijden, P.T.; Penterman, E.J.M.; Grootens, K.P. Burn-out, commitment, personality and experiences during work and training; survey among psychiatry residents. Tijdschr. Psychiatr. 2017, 59, 87–93.
  43. Gouveia, P.A.D.C.; Ribeiro, M.H.C.; Aschoff, C.A.M.; Gomes, D.P.; Silva, N.A.F.D.; Cavalcanti, H.A.F. Factors associated with burnout syndrome in medical residents of a university hospital. Rev. Assoc. Med. Bras. 2017, 63, 504–511.
  44. Dyrbye, L.N.; West, C.P.; Satele, D.; Boone, S.; Tan, L.J.; Sloan, J.; Shanafelt, T.D. Burnout among U.S. medical students, residents, and early-career physicians relative to the general U.S. population. Acad. Med. 2014, 89, 443–451.
  45. Shoimer, I.; Patten, S.; Mydlarski, P.R. Burnout in dermatology residents: A Canadian perspective. Br. J. Dermatol. 2018, 178, 270–271.
  46. Porter, M.; Hagan, H.; Klassen, R.; Yang, Y.; Seehusen, D.A.; Carek, P.J. Burnout and Resiliency Among Family Medicine Program Directors. Fam. Med. 2018, 50, 106–112.
  47. Chaukos, D.; Chad-Friedman, E.; Mehta, D.H.; Byerly, L.; Celik, A.; McCoy, T.H., Jr.; Denninger, J.W. Risk and Resilience Factors Associated with Resident Burnout. Acad. Psychiatry 2017, 41, 189–194.
  48. Holmes, E.G.; Connolly, A.; Putnam, K.T.; Penaskovic, K.M.; Denniston, C.R.; Clark, L.H.; Rubinow, D.R.; Meltzer-Brody, S. Taking Care of Our Own: Multispecialty Study of Resident and Program Director Perspectives on Contributors to Burnout and Potential Interventions. Acad. Psychiatry 2017, 41, 159–166.
  49. Dyrbye, L.; Shanafelt, T. A narrative review on burnout experienced by medical students and residents. Med. Educ. 2016, 50, 132–149.
  50. Jagsi, R.; Griffith, K.A.; Jones, R.; Perumalswami, C.R.; Ubel, P.; Stewart, A. Sexual harassment and discrimination experiences of academic medical faculty. JAMA 2016, 315, 2120–2121.
  51. Karim, S.; Duchcherer, M. Intimidation and harassment in residency: A review of the literature and results of the 2012 Canadian Association of Interns and Residents National Survey. Can. Med. Educ. J. 2014, 5, e50–e57.
  52. Fnais, N.; Soobiah, C.; Chen, M.H.; Lillie, E.; Perrier, L.; Tashkhandi, M.; Straus, S.E.; Mamdani, M.; Al-Omran, M.; Tricco, A.C. Harassment and discrimination in medical training: A systematic review and meta-analysis. Acad. Med. 2014, 89, 817–827.
  53. Bates, C.K.; Jagsi, R.; Gordon, L.K.; Travis, E.; Chatterjee, A.; Gillis, M.; Means, O.; Chaudron, L.; Ganetzky, R.; Gulati, M.; et al. It Is Time for Zero Tolerance for Sexual Harassment in Academic Medicine. Acad. Med. 2018, 93, 163–165.
  54. Giroux, C.; Moreau, K. Leveraging social media for medical education: Learning from patients in online spaces. Med. Teach. 2020, 42, 970–972.
  55. McKimm, J.; McLean, M. Rethinking health professions’ education leadership: Developing ‘eco-ethical’ leaders for a more sustainable world and future. Med. Teach. 2020, 42, 855–860.
  56. Wald, H. Optimizing resilience and wellbeing for healthcare professions trainees and healthcare professionals during public health crises—Practical tips for an ‘integrative resilience’ approach. Med. Teach. 2020, 42, 744–755.
  57. Naeve, A.; Yli-Luoma, P.; Kravcik, M.; Lytras, M.D. A modelling approach to study learning processes with a focus on knowledge creation. Int. J. Technol. Enhanc. Learn. 2018, 1, 1–34.
  58. Spruit, M.; Lytras, M. Applied Data Science in Patient-centric Healthcare. Telemat. Inform. 2018, 35, 2018.
Figure 1. Gender Distribution among Program Directors.
Figure 2. Gender Distribution of Program Directors in Each Country Region.
Figure 3. Age Distribution among Program Directors.
Figure 4. Gender Distribution in Each Age Group among Program Directors.
Figure 5. The Primary Qualification of the Program Directors.
Figure 6. Qualification Distribution among Different Regions.
Figure 7. Time Spent in Years as Program Director.
Figure 8. Time Spent as Program Director by Gender.
Figure 9. Program Directors’ City of Residence.
Figure 10. Programs of the Appointed Program Directors Surveyed.
Figure 11. Gender Distribution of Program Directors among Disciplines.
Figure 12. Age of Program Directors in Each Discipline.
Figure 13. Health Sectors for the Included Program Directors.
Figure 14. Other Appointed Administrative Positions for the Program Directors.
Figure 15. Percentage of Program Directors with Degree in Health Profession Education among Disciplines.
Figure 16. Percentage of Program Directors with Degree in Health Profession Education in Different Regions.
Figure 17. Gender Distribution among Trainees.
Figure 18. Distribution of Trainees among Disciplines.
Figure 19. Level of Trainees in Our Study Sample.
Figure 20. Distribution of Trainees among Different Regions in Saudi Arabia.
Figure 21. The Distribution of Trainees among Different Cities.
Figure 22. Trainees’ Satisfaction Scores.
Figure 23. Program Directors’ Satisfaction Scores.
Figure 24. The Prevalence of Burnout among Residents.
Figure 25. The Mean Burnout Scores According to the Trainees’ Level of Training.
Figure 26. The Mean Burnout Scores According to the Trainees’ City of Residency.
Figure 27. Percentage of Program Directors Who Received Orientation by Gender.
Figure 28. The Percentage of Orientation Received by Program Directors by Region.
Figure 29. The Satisfaction Rate of the Program Directors with the Compliance of the Training Faculty in Submitting Trainees’ Evaluations in Time.
Figure 30. The Rate of Applying for Annual Rotation Plan for Residents Early in the Academic Year.
Figure 31. The Rate of Applying for Annual Rotation Plan for Residents Early in the Academic Year by Region.
Figure 32. The Compliance Rate (Adherence) to the Master Rotation Plan Set Early in the Academic Year.
Figure 33. The Rate of Providing the Residents with Goals and Objectives of Every Rotation among Program Directors Who Did/or Did Not Receive Orientation on Their Roles and Responsibilities.
Figure 34. The Rate of Enabling the Trainees to Evaluate the Faculty in a Structured Process According to the Gender of Program Director.
Figure 35. The Rate of Enabling the Trainees to Evaluate the Faculty in a Structured Process by Region.
Figure 36. The Rate of Enabling the Trainees to Evaluate the Faculty in a Structured Process According to the Time Spent by Program Directors to Manage the Training Activities.
Figure 37. The SCFHS Sustainable Medical Education Framework.
Figure 38. The SCFHS Training Quality Approach—Medical Training Quality Framework.
Table 1. Overview of literature review for the perceptions and utilization of learning analytics and key performance indicators (KPIs) for learning and training.
Author(s): [21,22,23,24]. Metaphor: learning analytics and KPIs as a measurement tool for enhanced decision making. Key interpretation: there is a critical need for the definition of trusted, measurable, and efficient learning analytics and KPIs to support decision making. Impact on our research model: we have a strong interest in analyzing how KPI research can enhance the quality of training programs in the Saudi Commission for Health Specialties (SCFHS); we are looking for a set of KPIs that allows enhanced decision making and adjustment of training programs.

Author(s): [21,22,23,24,25]. Metaphor: learning analytics as a key approach for learning behavior analysis and adjustment. Key interpretation: the recent literature shows increasing interest in the use of KPIs and learning analytics for understanding, interpreting, and enhancing learning behavior. Impact on our research model: we want to investigate how well-defined KPIs can be used for short-term and long-term analysis of the learning behavior and attitudes of residents in postgraduate medical training programs.

Author(s): [22,23,24,26]. Metaphor: learning analytics as a key approach for predicting performance and personalizing the learning experience. Key interpretation: learning analytics that predict performance in training programs can serve as the basis for adjusting the learning experience for the benefit of trainees. Impact on our research model: across the wide range of SCFHS medical training programs, this is not currently a priority; we are interested, however, in utilizing a newly established big data ecosystem in this direction in the future.

Author(s): [21,22,26,27]. Metaphor: learning analytics as a real-time monitoring tool for the learning process. Key interpretation: KPIs can provide a systematic way of monitoring efficiency in various aspects of the learning and training process. Impact on our research model: across the variety of SCFHS medical training programs, there is a need to define and maintain a set of KPIs for monitoring the learning and training processes.

Author(s): [21,22,23,24,25,26,27]. Metaphor: learning analytics as a key approach for customizing the learning experience. Key interpretation: KPIs and learning analytics can be the basis for flexible training programs based on variations of training approaches. Impact on our research model: a fully functional set of KPIs for the SCFHS’s training programs can support a deep understanding of obstacles in training and the alteration of training approaches.

Author(s): [21,22,23,24,25,26,27]. Metaphor: learning analytics as a key approach for standardizing the flow of instruction. Key interpretation: KPIs can serve as a methodological framework for the standardization of instruction; learning objectives and learning outcomes in training programs can be codified as measurable KPIs. Impact on our research model: the SCFHS has a special interest in a quality initiative that is distributed across all training programs in a transparent way.

Author(s): [28,29,30,31,32,33]. Metaphor: learning analytics as a means for interoperable technology-enhanced learning systems. Key interpretation: a thorough, sophisticated learning analytics and KPI framework can also empower interoperable or distributed technology-enhanced learning systems. Impact on our research model: a reliable set of KPIs for the SCFHS quality initiative can support interoperable learning services with various other stakeholders and their training services in the near future.
Table 2. The List of Generated KPIs and Their Descriptors.
Domain | Code | Descriptor
REACTION | GMH QI 1.1 | Percentage of trainees' satisfaction
 | GMH QI 1.2 | Percentage of trainers' satisfaction
 | GMH QI 1.3 | Percentage of program directors' (PDs') satisfaction
 | GMH QI 1.4 | Percentage of trainees' burnout
LEARNING | GMH QI 2.1 | Percentage of program directors (PDs) who attended a PD training course offered by the SCFHS
 | GMH QI 2.2 | Number of trainers in postgraduate management training (PGMT) programs who successfully completed SCFHS training
 | GMH QI 2.3 | Percentage of surveyors who have successfully completed SCFHS's accreditation certification
 | GMH QI 2.4 | Percentage of trainees' compliance with minimal procedure, case exposure policies required competency index
 | GMH QI 2.5 | Percentage of trainees who have received trainees' evaluation by program in a specific period
 | GMH QI 2.6 | Percentage of research inclusion in curricula
 | GMH QI 2.7 | Percentage of programs with a burnout policy
 | GMH QI 2.8 | Percentage of compliance with implementing an incorporated e-log system in each program
 | GMH QI 2.9 | Percentage of trainees who fulfilled their promotion criteria
 | GMH QI 2.10 | Percentage of trainees who passed the board exam
 | GMH QI 2.11 | Percentage of programs that incorporated simulation in their curricula
 | GMH QI 2.12 | Percentage of programs with trainees receiving an annual master rotation plan
 | GMH QI 2.13 | Percentage of programs in compliance with the annual master plan
GOVERNANCE | GMH QI 3.1 | Percentage of programs with complete goals and objectives for residency programs
 | GMH QI 3.2 | Percentage of completed trainer evaluations by trainee per program
 | GMH QI 3.3 | Percentage of adherence to accreditation requirements
 | GMH QI 3.4 | Percentage of PD turnover rate
 | GMH QI 3.5 | Percentage of accreditation compliance score
 | GMH QI 3.6 | Percentage of violations of the matching regulations
Table 3. The Sources of Data for Each KPI.
Code | Program Directors' Survey | Trainees' Survey | Admission Department | Accreditation Department | Training Department | Assessment Department
GMH QI 1.1
GMH QI 1.2
GMH QI 1.3
GMH QI 1.4
GMH QI 2.1
GMH QI 2.2
GMH QI 2.3
GMH QI 2.4
GMH QI 2.5
GMH QI 2.6
GMH QI 2.7
GMH QI 2.8
GMH QI 2.9
GMH QI 2.10
GMH QI 2.11
GMH QI 2.12
GMH QI 2.13
GMH QI 3.1
GMH QI 3.2
GMH QI 3.3
GMH QI 3.4
GMH QI 3.5
GMH QI 3.6
Legend: Primary Data (Surveys); Secondary Data (SCFHS Databases)
Table 4. Summary of Key Performance Indicator Results.
Domain | KPI ID | Definition | Questions and Calculation | Result
Reaction | GMH QI 1.1 | Trainees' Satisfaction | TS Q10, TS Q11, TS Q12, TS Q13, TS Q16, TS Q17, TS Q25 | 69%
 | GMH QI 1.2 | Trainers' Satisfaction | | Unavailable
 | GMH QI 1.3 | Program Directors' Satisfaction | PDS Q 2.4, PDS Q 2.6, PDS Q 3.3, PDS Q 4.2 | 76%
 | GMH QI 1.4 | Trainees' Burnout | TS Q 14 | 66.7%
Learning | GMH QI 2.1 | Program directors who attended a training course offered by the SCFHS | PDS Q 2.1 | 38.4%
 | GMH QI 2.2 | Trainers in PGMT programs who successfully completed SCFHS training certification | Training Department Database | 276
 | GMH QI 2.3 | Surveyors who have successfully completed SCFHS's accreditation training | Accreditation Department Database | 67%
 | GMH QI 2.4 | Trainees' compliance with minimal procedure, case exposure policies required competency index | | Unavailable
 | GMH QI 2.5 | Trainees who have received trainees' evaluation by program in a specific period | PDS Q 2.6 | 74%
 | GMH QI 2.6 | Research included in curriculum | Training Department Database; TS Q 22 = 33.1% | Unavailable
 | GMH QI 2.7 | Programs with a burnout policy | PDS Q 4.2 | 69%
 | GMH QI 2.8 | Compliance with implementing an incorporated e-log system in each program | Training Department Database | Unavailable
 | GMH QI 2.9 | Trainees who fulfilled their promotion criteria | Training Department Database; Assessment Department Database | Unavailable
 | GMH QI 2.10 | Trainees who passed the board exam | Assessment Department Database | 79%
 | GMH QI 2.11 | Programs that incorporated simulation in their curricula | PDS Q 3.1 = 3%; TS Q 10 = 34.2% | 18.6%
 | GMH QI 2.12 | Programs with trainees receiving an annual master rotation plan | PDS Q 2.7 | 73%
 | GMH QI 2.13 | Programs' compliance with the annual master plan | PDS Q 2.8 | 76.9%
Training Governance | GMH QI 3.1 | Programs with complete goals and objectives for residency programs | PDS Q 3.2 | 87.7%
 | GMH QI 3.2 | Completed trainer evaluations by trainee per program | PDS Q 2.9 | 48.7%
 | GMH QI 3.3 | Adherence to accreditation requirements | Accreditation Department Database | Unavailable
 | GMH QI 3.4 | PD turnover rate | TBD | Unavailable
 | GMH QI 3.5 | Accreditation compliance score | TBD | Unavailable
 | GMH QI 3.6 | Violations of the matching regulations | Admission and Registration Database | 4%
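The "Questions and Calculation" column above aggregates responses to several survey items into a single percentage per KPI. The following is a minimal illustrative sketch of such an aggregation; the question IDs, answer scale, and "satisfied" threshold are assumptions for demonstration only, not the SCFHS's actual scoring rules.

```python
# Hypothetical sketch: combining Likert-style survey responses into a KPI
# percentage, in the spirit of Table 4 (e.g., GMH QI 1.1 pools several
# trainee-survey questions into one satisfaction figure).

def kpi_percentage(responses, questions, satisfied=("agree", "strongly agree")):
    """Share of answered question items whose answer falls in `satisfied`.

    responses: list of dicts, one per respondent, mapping question ID -> answer.
    questions: the question IDs that feed this KPI.
    Returns a percentage rounded to one decimal, or None when no data is
    available (rendered as "Unavailable" in Table 4).
    """
    total = hits = 0
    for r in responses:
        for q in questions:
            answer = r.get(q)
            if answer is None:      # unanswered items are excluded
                continue
            total += 1
            if answer in satisfied:
                hits += 1
    return round(100 * hits / total, 1) if total else None

# Toy data for two trainees over three of the items feeding GMH QI 1.1
sample = [
    {"TS Q10": "agree", "TS Q11": "neutral", "TS Q12": "strongly agree"},
    {"TS Q10": "disagree", "TS Q11": "agree"},  # TS Q12 left unanswered
]
print(kpi_percentage(sample, ["TS Q10", "TS Q11", "TS Q12"]))  # 60.0
```

Treating unanswered items as excluded (rather than as dissatisfaction) is one of several defensible choices; whichever rule is adopted must be fixed in the KPI's definition so that results remain comparable across programs and reporting periods.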
Table 5. Key Findings Related to the Research Objectives.
Overview of Key Findings
No. | Research Objective | Key Findings
1 | KPI definition for efficient medical training |
  • A set of 23 KPIs based on the work of Kirkpatrick provides trusted, measurable, and efficient metrics to support decision making
  • Different aspects of medical training can be enhanced and supported by the continuous monitoring of this KPI set
  • There are critical requirements for the availability of relevant data related to medical training
2 | KPIs' social impact and sustainability in medical education |
  • The continuous measurement of the proposed KPIs is directly linked to various components of sustainability in medical education. In the discussion section, we provide our methodological contribution
  • The synthesis of the key interpretations of the KPIs leads to a theoretical contribution, the SCFHS Framework for Sustainable Medical Education, which is presented in a relevant section of this research study
  • The social impact of medical education is also directly linked to a continuous quality assurance and performance process in medical training programs. This effort is progressive, and KPIs enhance the strategic alignment of training programs with sustainable goals and digital transformation
3 | Sustainable data science for medical education and digital transformation of healthcare |
  • The variety, sophistication, and wide range of medical training programs in the SCFHS require an integrated big data and analytics ecosystem for the management of data and services for personalized training
  • The community of the SCFHS as well as its stakeholders' network can be supported and enhanced through digital infrastructures and smart services
  • A social network that integrates the expertise and the human capital of health specialists can be a significant enabler for the digital transformation of health in Saudi Arabia
  • Future research on the requirements of an integrated medical data ecosystem for training and education is also proposed

Share and Cite

MDPI and ACS Style

Housawi, A.; Al Amoudi, A.; Alsaywid, B.; Lytras, M.; bin Μoreba, Y.H.; Abuznadah, W.; Alhaidar, S.A. Evaluation of Key Performance Indicators (KPIs) for Sustainable Postgraduate Medical Training: An Opportunity for Implementing an Innovative Approach to Advance the Quality of Training Programs at the Saudi Commission for Health Specialties (SCFHS). Sustainability 2020, 12, 8030. https://doi.org/10.3390/su12198030
