
A Novel Strategic Approach to Evaluating Higher Education Quality Standards in University Colleges Using Multi-Criteria Decision-Making

Department of Industrial Engineering, Faculty of Engineering—Rabigh, King Abdulaziz University, Jeddah 21589, Saudi Arabia
Department of Industrial Engineering, Faculty of Engineering, King Abdulaziz University, Jeddah 21589, Saudi Arabia
Department of Electrical and Computer Engineering, Faculty of Engineering, King Abdulaziz University, Jeddah 21589, Saudi Arabia
Author to whom correspondence should be addressed.
Educ. Sci. 2023, 13(6), 577;
Original submission received: 8 May 2023 / Revised: 30 May 2023 / Accepted: 1 June 2023 / Published: 4 June 2023
(This article belongs to the Special Issue Higher Education Quality Assurance)


Universities worldwide strive to achieve excellence in research, learning, teaching, and community services, which are the pillars of their strategic plans. However, satisfying international ranking criteria might not directly result in achieving their strategic objectives. This paper proposes a new approach to ranking university colleges by evaluating their educational quality. Standard sets of criteria from multiple international university ranking systems and the Balanced Scorecard perspectives of a university's strategic plan were cross-mapped for the evaluation. A new multi-criteria decision-making-based framework was applied to six colleges of a non-profit university in the Middle East, revealing their performance rankings and their contributions to the university's educational quality objectives. This paper offers a novel approach for universities to develop strategies that satisfy multiple international ranking systems while achieving their strategic goals concurrently and according to their priorities. Implications include informing university leaders of the most contributing colleges and assisting in pinpointing quality shortcomings and their causes. This helps universities design better performance indicators and allocate resources to achieve educational excellence.

1. Introduction

Sound management of educational performance is critical to achieving effective educational outcomes. Colleges and universities worldwide strive to achieve excellence in research, learning, teaching, and community services [1,2,3]. The quality of the service and education provided is therefore very important, as education institutions seek to bridge intellectual gaps across all sectors of the economy. Accordingly, an educational institution's administration must be effective and efficient in all managerial aspects. To remain competitive, an educational institution must pay attention to and control its internal affairs. One of the most critical factors in sustaining performance excellence in educational institutions is the development of measurable standards, with indicators measured periodically [4].
Performance indicators have become of great importance in educational institutions. However, differences in the scope of education, research, and the composition of study programs across educational institutions make developing quality indicators challenging [5]. Quality measures should ideally have a clear causal explanation. For instance, if the educational quality of student outcomes is measured within an educational institution, the performance indicators should reflect the performance of students with different characteristics across that institution and its programs of study. When measuring the quality of departments within educational institutions, performance indicators that reflect the quality performance of those departments must be considered. Generally, one of the most challenging aspects of quality measurement is obtaining quantitative data; therefore, surveys and questionnaires are generally required to collect opinions from relevant experts [6]. Likewise, one essential tool that many institutions apply to evaluate performance in various aspects of their operations is the Balanced Scorecard (BSC). The concept of a BSC has evolved beyond the simple use of perspectives into a holistic system for managing strategy. A key benefit of such a disciplined framework is that it allows organizations to connect the dots between the various components of strategic planning and management [7].
Another important tool many institutions use to evaluate performance is Multi-Criteria Decision Making (MCDM). The MCDM approach analyzes various alternatives and selects the optimal one [8]. Because of management's role in ensuring quality performance, the MCDM approach remains important across the education sector and other fields [9,10,11,12,13,14]. MCDM is grounded in the idea of solving planning and structuring problems using multiple criteria [15]. The main objective of this paper is to rank university colleges according to their educational quality. Evaluation criteria are determined based on common international university ranking systems as well as criteria derived from the BSC perspectives of the university's strategic plan. Combining these two separate sets of criteria is suggested as a novel strategy for evaluating the educational quality of university colleges.
Education institutions face a wide range of challenges, such as providing high-quality education, achieving top world rankings, reducing costs, and increasing self-funding. Due to the rapid growth of education models, evaluation models are gradually becoming the focus of scholars' attention [16,17]. Because several factors contribute to these challenges, an educational institution must be evaluated at all levels to be successful. It is therefore important to develop quality performance indicators for assessing educational departments, allowing decision-makers to make informed decisions that contribute to the organization's success. To this end, this paper evaluates educational quality using a standard set of criteria from multiple common international university ranking systems together with criteria derived from the BSC perspectives of a university's strategic plan, and employs the MCDM approach to rank the university's colleges based on these criteria. Consequently, the following question will be answered in this paper:
How can the higher education quality of university colleges be evaluated using criteria derived from the BSC’s perspectives of its strategic plan and simultaneously using standard criteria of international ranking systems?
The rest of the paper is structured as follows: Section 2 presents current research studies using BSC and MCDM techniques to evaluate the educational sector. Section 3 describes the novel proposed strategy for universities to achieve their educational quality objectives in a step-by-step fashion. Section 4 presents its application and results. Section 5 presents a discussion of the findings from the novel strategy, while the conclusions are presented in Section 6.

2. Literature Review

Educational institutions may establish programs or assessment processes to discover and encourage practical management approaches. Previous studies have explored various quality evaluation and performance appraisal aspects across multiple service sectors, including education, healthcare, hospitality, tourism, and the public and private sectors [6]. In the education sector, university colleges are evaluated based on their quality performance, which requires management to implement appropriate checks and balances to ensure quality [18,19,20,21].
The BSC is one of the most extensively utilized tools for gauging and improving quality in higher education institutions. One study examined the efficacy of the tool in German and Austrian educational institutions [22]. It analyzed the substantive similarities and differences between the BSCs of four universities in Germany and Austria: Johannes Gutenberg University Mainz, Munster University of Applied Sciences, Cologne University of Applied Sciences, and Montan University Leoben. By comparing the BSCs used by these four establishments, the study suggested that the BSC gives a holistic perspective of the method used by a higher education institution [22]. It guarantees a comprehensive and sophisticated framework for executing and regulating strategy and establishes a foundation for future learning in the strategy formulation of a higher education institution following the "plan-do-check-act" scheme. The BSC has also proven effective in the United States, where similar research established the tool's efficacy in gauging quality within higher education institutions, demonstrating that the BSC may be an effective instrument for assessing the accomplishments of educational institutions, namely universities [23]. Furthermore, a possible implication of utilizing the BSC to improve the quality of instruction in higher education colleges was established by determining how effective the BSC model is in improving the overall performance of prospective private institutions, where performance was the dependent variable [24]. In the BSC, the customer dimension, financial dimension, internal business process dimension, and the perspective of growth and acquiring knowledge are considered independent variables [25,26,27,28,29].
In that study [24], a sample size of one hundred individuals was used, representing more than half of the undergraduates currently enrolled as active students at the University of WR Supratman Surabaya. The findings indicate that while the approach effectively gauges performance, combining it with other approaches is necessary to evaluate the more nuanced performance and quality improvement aspects [24].
Performance and quality improvement approaches have also been adopted following a bottom-up approach. This was applied at the University of Minho to build the university's vision and achieve its sustainability with a comprehensive view, illustrating the participation of the academic community and top management in the process [30].
Most research on assessing quality in higher education has focused on private institutions; only a few studies have taken a different approach by choosing public institutions that are not focused primarily on the profit dimension [31]. The researchers examined current research published in reputable publications that applied the BSC framework to higher education institutions using contextual analysis and highlighted the viewpoints pertinent to such institutions. When implemented, the framework makes it possible to monitor institutional performance and to adapt to new difficulties that arise as a direct consequence of putting essential strategies into action. The conclusion drawn is that private institutions utilized contemporary BSC viewpoints, whereas public establishments used conventional perspectives with minor adjustments to the titles and order of the perspectives; for instance, certain studies employed a stakeholder viewpoint instead of the customer perspective. The available data demonstrate that the BSC has been used in a wide range of settings within higher education institutions, producing observable effects.
MCDM is another crucial approach used in decision-making in higher education institutions. For instance, it was used to evaluate the tool's implications at the Teaching Hospitals of Yazd University of Medical Sciences [32]. Literature research and qualitative techniques were utilized to gather expert opinions on the quality characteristics of hospital and education services. Following that, the views of three hundred patients on the quality of the provided services were acquired via a questionnaire developed for this purpose. The Fuzzy Analytic Hierarchy Process (FAHP) approach was used to assign weights to each quality parameter, and the Technique for Order of Preference by Similarity to the Ideal Solution (TOPSIS) method was utilized to rank hospital wards. According to the results, MCDM approaches effectively prioritize the aspects that influence the quality of education and health services [33,34,35]. As a result, government decision-makers may use them to plan for and enhance the provision of services in academic progress and health.
Departments offering engineering programs at a public university in the Middle East were evaluated using a combination of MCDM methodologies, with the administration of the institution assessed against fifteen criteria. The study suggested that low-performing departments should be encouraged to produce more research articles by offering various forms of incentives to their academic members [36].
An integrated MCDM approach was applied in the industrial engineering department of a Turkish university to evaluate several lectures and determine whether e-learning technology could be assessed there. The study evaluated many factors of e-learning applications using MCDM methods [37,38].
Many BSC studies have successfully applied AHP because of its ability to aid organizations or firms in selecting alternative missions/visions, strategies, and resource allocations to implement organizational strategies and objectives [39,40,41,42,43,44,45,46].
AHP is a technique that considers both quantitative and qualitative factors when assessing a problem. The analytical framework of the BSC model was developed using AHP and the Analytic Network Process (ANP), two multiple-criteria decision-making approaches. AHP is a decision-making framework created by Saaty that considers several factors [47]. However, the AHP approach presumes that the components in the hierarchical structure are independent, which may be unsuitable given the influence of specific internal and external factors; this is where the ANP technique is required [48]. The BSC framework was created in response to the limitations of the conventional financial approach to measuring business success, but the BSC itself is not perfect. The ANP technique compensates for these drawbacks by assigning importance weights to individual indicators. ANP generalizes the hierarchical relationship between criteria and options using a network perspective, since the interplay and dependency between higher- and lower-level components mean that many decision problems defy traditional hierarchical organization [49]. Different performance metrics may be investigated using ANP.
The VIšekriterijumsko KOmpromisno Rangiranje (VIKOR) method may be used to rank each option according to how well it meets each criterion [50,51]. VIKOR is based on the compromise programming of MCDM, namely, comparing the "closeness" metric to the "ideal" alternative. In compromise programming, the Lp-metric is employed as an aggregating function, and from this, the multi-criteria measure for compromise ranking was constructed [52,53]. VIKOR and TOPSIS are well-known MCDM approaches; both use the idea of compromise to resolve conflicts among the assessment criteria and then rank the options [54]. The TOPSIS technique, however, has a blind spot that limits its use for ranking purposes, whereas VIKOR reveals where the criteria may be improved to reach the desired/aspired level [51].
The Decision-Making Trial and Evaluation Laboratory (DEMATEL) technique is used to identify interdependencies and reciprocal influences among viewpoints [55]. One way to look at the work of creating a strategy map is as part of a more extensive, holistic group decision-making process. The DEMATEL technique uses group wisdom to identify and record the unintentional links between several strategic criteria [56]. The cause-and-effect analysis tool DEMATEL was employed to determine which BSC metrics are most telling [57].
A demonstration of how webometrics rankings could be measured using reliable quantitative information was published, applying the TOPSIS and VIKOR methods to university websites [58]. These measures have significantly changed the competitive landscape of many universities worldwide. The VIKOR method is one of the better models that higher education stakeholders and researchers have identified to help provide better webometric data and rankings for university sites. As a result of the approach, academic prestige and quality of education can be improved [59].
The Six-Sigma framework has also been advocated to improve higher education quality. A study was conducted to evaluate the consequences of such an approach in enhancing educational results in higher education institutions [60]. This study aimed to demonstrate the potential applications of Lean Six Sigma (LSS) in higher education services and to offer a conceptual framework for implementing LSS in higher education facilities. The objective was to provide an overview of the significance of the quality perfection criteria in general by using a variety of constructs drawn from the relevant research, such as Total Quality Management (TQM), Lean, Six Sigma, and LSS. In the study, more significant consideration was given to the relevance of LSS in the context of Higher Education Institutions (HEI). Similar to other studies, validating the application of LSS in HEIs was found to be highly important [61,62].

3. The Proposed Strategy

The main goal of this paper is to rank university colleges by assessing their educational quality. The evaluation uses standard criteria from common international university ranking systems together with criteria from the BSC perspectives of a university's strategic plan. A novel strategy is suggested to evaluate the educational quality of university colleges by combining the criteria from these two separate sources. The strategy is broken down into three main phases, as shown in Figure 1. In the first phase, a cross table is created by combining the standards' criteria from the chosen international university ranking systems with the criteria from the BSC perspectives. The second phase offers a method for weighting every criterion considered in the first phase. In the third phase, the university colleges are ranked by applying the Ranking of the Alternatives using the Trace to Median Index (RATMI) technique [17], one of the recent MCDM tools, based on the weighted set of criteria. These three phases are detailed subsequently.

3.1. Phase 1: Identify the Education Quality Standards

Step 1.1: Several systems produce worldwide university rankings based on different standards that maximize the university’s potential through educational achievements, international mobility, professional development, and other standards. Here, in this step, based on recommendations of the university’s top administration, select a set of international university-ranking systems aligned with the university’s vision.
Step 1.2: Each International University Ranking System (IURS) has its own standard criteria for judging a university's excellence. These standards' criteria range from broad areas of interest, such as academic reputation and quality of education, to more focused ones, such as the faculty-to-student ratio and institutional income/academic staff. The university, in turn, has its own standard criteria that help it achieve its strategic objectives and fulfill its responsibility to the local and international community. So, in this step, identify the standards' criteria of the university and of the other chosen international systems. Let $S_u$ be the number of standards' criteria of system $u$, where $u = 1, \dots, U$, and $U$ is the total number of chosen ranking systems, including the university itself as an educational organization.
Step 1.3: The BSC is a strategic planning management system that is frequently used to evaluate the performance of the universities from four important perspectives: finances, learning and growth, internal processes, and customers. As a result, the university creates Key Performance Indicators (KPIs) to deliver the results required from each perspective. In this step, determine the KPIs (i.e., the criteria) associated with each perspective of the university’s BSC. Let F , G , P , and E stand for the total number of criteria on finance, learning and growth, internal processes, and customer perspectives, respectively.

3.2. Phase 2: Determine the Criteria Weights

Step 2.1: Construct a cross table between the standards' criteria used by the $U$ systems and the criteria of the university's BSC. The table has $V$ rows, where $V = F + G + P + E$, and $W$ columns, where $W = \sum_{u=1}^{U} S_u$. Each intersection cell indicates whether or not there is a relationship between the row criterion and the column standard's criterion: put '1' if there is a relationship and '0' otherwise. Let $M_{iuj}$ be the binary value (0 or 1) at the intersection cell indicating whether standards' criterion $j$, $j = 1, \dots, S_u$, of system $u$, $u = 1, \dots, U$, has a relationship with criterion $i$, $i = 1, \dots, V$, of the university's BSC.
Step 2.2: Determine the weights assigned to each standard's criterion by system $u$, $u = 1, \dots, U$. The standards' criteria of each IURS are publicly available on the internet, while the weights of the university's standards' criteria are assigned by a panel of experts from the university's top administration using the AHP technique. The standards' criteria weights of each system sum to 1. Let $T_{uj}$ be the weight of standards' criterion $j$ of system $u$, where $j = 1, \dots, S_u$ and $u = 1, \dots, U$.
Step 2.3: Calculate the weight of the criteria related to each perspective of the university's BSC. The most common methods used to determine criteria weights are AHP and the Best-Worst Method (BWM), both based on pairwise comparison. The number of pairwise comparisons is $n(n-1)/2$ for the AHP technique and $(2n-3)$ for BWM. When there are many criteria (i.e., $n > 7$), the number of pairwise comparisons grows large, and people become too confused to give accurate answers when there are too many inquiries about the same issue [37]. A mathematical method is therefore suggested to determine the weights of the criteria relevant to the perspectives of the university's BSC. According to Equations (1)–(5), each criterion's weight equals its score divided by the total score of all criteria stated in the cross table.
$$TS = \sum_{i=1}^{V} \sum_{u=1}^{U} \sum_{j=1}^{S_u} M_{iuj} \times T_{uj} \tag{1}$$

$$f_i = S_{f_i}/TS = \left( \sum_{u=1}^{U} \sum_{j=1}^{S_u} M_{iuj} \times T_{uj} \right) / TS, \quad i \in [1, 2, \dots, F] \tag{2}$$

$$g_i = S_{g_i}/TS = \left( \sum_{u=1}^{U} \sum_{j=1}^{S_u} M_{iuj} \times T_{uj} \right) / TS, \quad i \in [1, 2, \dots, G] \tag{3}$$

$$p_i = S_{p_i}/TS = \left( \sum_{u=1}^{U} \sum_{j=1}^{S_u} M_{iuj} \times T_{uj} \right) / TS, \quad i \in [1, 2, \dots, P] \tag{4}$$

$$e_i = S_{e_i}/TS = \left( \sum_{u=1}^{U} \sum_{j=1}^{S_u} M_{iuj} \times T_{uj} \right) / TS, \quad i \in [1, 2, \dots, E] \tag{5}$$

where $S_{f_i}$ is the score of each financial criterion $i$, $S_{g_i}$ is the score of each learning and growth criterion $i$, $S_{p_i}$ is the score of each internal processes criterion $i$, $S_{e_i}$ is the score of each customer criterion $i$, and $TS$ is the total score of all criteria in the cross table. Additionally, $f_i$, $g_i$, $p_i$, and $e_i$ represent the weights of the criteria relevant to the financial, learning and growth, internal processes, and customer perspectives, respectively.
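The scoring scheme of Equations (1)–(5) can be sketched numerically as follows. This is a minimal illustration, not the paper's actual data: the cross-table entries and system weights below are hypothetical, with the columns of all ranking systems flattened into one axis.

```python
import numpy as np

# Hypothetical cross table: rows = BSC criteria (V = F + G + P + E),
# columns = the standards' criteria of all systems flattened (W = sum of S_u).
# A 1 marks a relationship between a BSC criterion and a standard's criterion.
M = np.array([
    [1, 0, 1, 0],   # a financial criterion
    [0, 1, 1, 0],   # a learning & growth criterion
    [1, 1, 0, 1],   # an internal processes criterion
    [0, 0, 1, 1],   # a customer criterion
])
# T_{uj} flattened to one weight per column (each system's weights sum to 1).
T = np.array([0.6, 0.4, 0.5, 0.5])

scores = M @ T            # score of each BSC criterion: sum_j M_ij * T_j
TS = scores.sum()         # Eq. (1): total score over the whole cross table
weights = scores / TS     # Eqs. (2)-(5): each weight = score / total score

assert abs(weights.sum() - 1.0) < 1e-12   # weights form a proper distribution
print(weights)
```

By construction the resulting weights sum to one, so they can be used directly as the criteria weights $w_j$ required in Phase 3.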

3.3. Phase 3: Rank the University Colleges (Alternatives)

Step 3.1: Construct the problem data in the form of a decision-making matrix $X = [x_{ij}]$:

$$D = [x_{ij}]_{m \times n} = \begin{array}{c|cccc} A/C & C_1 & C_2 & \cdots & C_n \\ \hline A_1 & x_{11} & x_{12} & \cdots & x_{1n} \\ A_2 & x_{21} & x_{22} & \cdots & x_{2n} \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ A_m & x_{m1} & x_{m2} & \cdots & x_{mn} \end{array} \tag{6}$$

where $A = \{A_1, A_2, \dots, A_m\}$ is a given set of university colleges (alternatives), and $m$ is the total number of alternatives; $C = \{C_1, C_2, \dots, C_n\}$ is a given set of criteria, and $n$ is the total number of criteria (some criteria should be maximized, while others should be minimized); and $x_{ij}$ is the assessment of alternative $A_i$ with respect to criterion $C_j$.
Step 3.2: Normalization of the problem data. Since each criterion is described by its corresponding dimension, the problem data are multidimensional, which makes direct comparison difficult. To avoid this difficulty, the multidimensional decision space must be converted into a nondimensional decision space. In this step, normalize as follows for the maximized criteria:

$$r_{ij} = \frac{x_{ij}}{\max_i x_{ij}}, \quad i \in \{1, 2, \dots, m\}, \; j \in S_{max} \tag{7}$$

while for the minimized criteria:

$$r_{ij} = \frac{\min_i x_{ij}}{x_{ij}}, \quad i \in \{1, 2, \dots, m\}, \; j \in S_{min} \tag{8}$$

where $S_{max}$ is the set of criteria that should be maximized and $S_{min}$ is the set of criteria that should be minimized. As a result, the normalized decision matrix has the following form:

$$R = [r_{ij}]_{m \times n} = \begin{array}{c|cccc} A/C & C_1 & C_2 & \cdots & C_n \\ \hline A_1 & r_{11} & r_{12} & \cdots & r_{1n} \\ A_2 & r_{21} & r_{22} & \cdots & r_{2n} \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ A_m & r_{m1} & r_{m2} & \cdots & r_{mn} \end{array} \tag{9}$$
Step 3.3: Weighted normalization. Compute the weighted normalized assessments from the normalized assessments $r_{ij}$ as follows:

$$u_{ij} = w_j r_{ij}, \quad i \in \{1, 2, \dots, m\}, \; j \in \{1, 2, \dots, n\} \tag{10}$$

where $w_j$ is the weight of criterion $j$, which can be determined either by a group of experts or by using one of the MCDM tools, such as the AHP technique. The weights must sum to one: $\sum_{j=1}^{n} w_j = 1$. The weighted normalized matrix can then be formed as follows:

$$U = [u_{ij}]_{m \times n} = \begin{array}{c|cccc} A/C & C_1 & C_2 & \cdots & C_n \\ \hline A_1 & u_{11} & u_{12} & \cdots & u_{1n} \\ A_2 & u_{21} & u_{22} & \cdots & u_{2n} \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ A_m & u_{m1} & u_{m2} & \cdots & u_{mn} \end{array} \tag{11}$$
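Steps 3.2 and 3.3 can be sketched as follows. The decision matrix, benefit/cost split, and weights here are hypothetical examples, not the colleges' data; the weights would come from Phase 2 in practice.

```python
import numpy as np

# Hypothetical decision matrix: 3 alternatives x 3 criteria.
X = np.array([
    [80.0, 12.0, 5.0],
    [60.0, 18.0, 4.0],
    [90.0,  9.0, 8.0],
])
maximize = np.array([True, True, False])   # benefit criteria; last one is a cost
w = np.array([0.5, 0.3, 0.2])              # criteria weights, sum to 1

# Eq. (7): benefit criteria divided by the column maximum;
# Eq. (8): column minimum divided by the cost criteria.
R = np.empty_like(X)
R[:, maximize] = X[:, maximize] / X[:, maximize].max(axis=0)
R[:, ~maximize] = X[:, ~maximize].min(axis=0) / X[:, ~maximize]

U = R * w   # Eq. (10): weighted normalized matrix of Eq. (11)
print(U)
```

Every entry of $R$ lies in $(0, 1]$ regardless of the original units, which is exactly the nondimensionalization the step requires.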
Step 3.4: Determination of the optimal alternative. Determine each component of the optimal alternative as follows:

$$q_j = \max_{i \in \{1, 2, \dots, m\}} u_{ij}, \quad j = 1, 2, \dots, n \tag{12}$$

The optimal alternative is represented by the following set:

$$Q = \{q_1, q_2, \dots, q_n\} \tag{13}$$
Step 3.5: Decomposition of the optimal alternative. Decompose the optimal alternative into two sets or components:

$$Q = Q^{max} \cup Q^{min} \tag{14}$$

$$Q = \{q_1, q_2, \dots, q_k\} \cup \{q_1, q_2, \dots, q_h\}; \quad k + h = n \tag{15}$$

where $k$ is the total number of criteria that should be maximized and $h$ is the total number of criteria that should be minimized.
Step 3.6: Decomposition of the alternatives. Similar to step 3.5, decompose each alternative:

$$U_i = U_i^{max} \cup U_i^{min}, \quad i \in \{1, 2, \dots, m\} \tag{16}$$

$$U_i = \{u_{i1}, u_{i2}, \dots, u_{ik}\} \cup \{u_{i1}, u_{i2}, \dots, u_{ih}\}; \quad i = 1, 2, \dots, m \tag{17}$$
Step 3.7: Magnitude of the components. For each component of the optimal alternative, calculate the magnitude defined via:

$$Q_k = \sqrt{q_1^2 + q_2^2 + \dots + q_k^2} \tag{18}$$

$$Q_h = \sqrt{q_1^2 + q_2^2 + \dots + q_h^2} \tag{19}$$

The same approach is applied to each alternative:

$$U_{ik} = \sqrt{u_{i1}^2 + u_{i2}^2 + \dots + u_{ik}^2}, \quad i = 1, 2, \dots, m \tag{20}$$

$$U_{ih} = \sqrt{u_{i1}^2 + u_{i2}^2 + \dots + u_{ih}^2}, \quad i = 1, 2, \dots, m \tag{21}$$
From this point, the following two methods were developed to create the rank of alternatives:
Step 3.7a: Ranking by alternatives trace. Create the matrix $F$ composed of the optimal alternative's components:

$$F = \begin{bmatrix} Q_k & 0 \\ 0 & Q_h \end{bmatrix} \tag{22}$$

Create the matrix $G_i$ composed of each alternative's components:

$$G_i = \begin{bmatrix} U_{ik} & 0 \\ 0 & U_{ih} \end{bmatrix}, \quad i = 1, 2, \dots, m \tag{23}$$

Create the matrix $T_i$ as follows:

$$T_i = F \times G_i = \begin{bmatrix} t_{11;i} & 0 \\ 0 & t_{22;i} \end{bmatrix}, \quad i = 1, 2, \dots, m \tag{24}$$

Then, the trace of the matrix $T_i$ is:

$$tr(T_i) = t_{11;i} + t_{22;i}, \quad i = 1, 2, \dots, m \tag{25}$$

Alternatives are ranked in descending order of $tr(T_i)$.
Step 3.7b: Ranking by alternatives median similarity. The median of the optimal alternative is expressed as the median to the hypotenuse of a right triangle whose base and perpendicular sides are the components $Q_k$ and $Q_h$:

$$M = \sqrt{Q_k^2 + Q_h^2}\,/\,2 \tag{26}$$

The median similarity represents the ratio between the median of each alternative and that of the optimal alternative:

$$MS_i = \frac{M_i}{M}, \quad i = 1, 2, \dots, m \tag{27}$$

where $M_i = \sqrt{U_{ik}^2 + U_{ih}^2}/2$ is computed analogously for alternative $i$. Alternatives are ranked in descending order of $MS_i$.
Step 3.8: Ranking the alternatives based on the RATMI technique. Let $v$ be the weight of the Multiple Criteria Ranking by Alternative Trace (MCRAT) strategy and $(1 - v)$ the weight of the Ranking of Alternatives based on Median Similarity (RAMS) strategy. The majority index $E_i$ between the two strategies is:

$$E_i = v \, \frac{tr_i - tr^*}{tr^- - tr^*} + (1 - v) \, \frac{MS_i - MS^*}{MS^- - MS^*} \tag{28}$$

where $tr_i = tr(T_i)$, $i = 1, 2, \dots, m$; $tr^* = \min_i tr_i$; $tr^- = \max_i tr_i$; $MS^* = \min_i MS_i$; and $MS^- = \max_i MS_i$. The weight $v$ takes a value from 0 to 1; here, $v = 0.5$.
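Steps 3.4 through 3.8 can be combined into one short routine. The sketch below follows the equations above under stated assumptions: it is an illustrative reading of the RATMI steps, not the authors' reference implementation, and the weighted normalized matrix and benefit/cost mask are hypothetical.

```python
import numpy as np

def ratmi_rank(U, maximize, v=0.5):
    """Rank alternatives from a weighted normalized matrix U (steps 3.4-3.8).

    U: (m, n) weighted normalized matrix; maximize: boolean mask of the
    criteria that should be maximized. Returns the majority index E and the
    ranking (indices of alternatives, best first).
    """
    q = U.max(axis=0)                                # Eq. (12): optimal alternative
    Qk = np.linalg.norm(q[maximize])                 # Eq. (18)
    Qh = np.linalg.norm(q[~maximize])                # Eq. (19)
    Uik = np.linalg.norm(U[:, maximize], axis=1)     # Eq. (20)
    Uih = np.linalg.norm(U[:, ~maximize], axis=1)    # Eq. (21)

    tr = Qk * Uik + Qh * Uih                         # Eqs. (22)-(25): trace of F x G_i
    M_opt = np.hypot(Qk, Qh) / 2                     # Eq. (26): median of optimal alt.
    MS = (np.hypot(Uik, Uih) / 2) / M_opt            # Eq. (27): median similarity

    # Eq. (28): majority index between the MCRAT and RAMS strategies
    E = (v * (tr - tr.min()) / (tr.max() - tr.min())
         + (1 - v) * (MS - MS.min()) / (MS.max() - MS.min()))
    return E, np.argsort(-E)

# Hypothetical weighted normalized data: 3 alternatives, 3 criteria
# (the last criterion is a cost criterion).
U = np.array([
    [0.50, 0.20, 0.10],
    [0.33, 0.30, 0.20],
    [0.40, 0.15, 0.12],
])
maximize = np.array([True, True, False])
E, order = ratmi_rank(U, maximize)
print(order)   # indices of alternatives, best first
```

By construction $E_i \in [0, 1]$, with the best-performing alternative scoring 1 when it leads under both the trace and median-similarity strategies, which mirrors how the college rankings in Section 4 are produced.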

4. Application and Results

The information used to make the ranking decision, including the evaluation criteria, came from the non-profit XYZ University in the Middle East region. XYZ University has six colleges with more than twenty active programs providing undergraduate and graduate instruction. The Middle East has experienced rapid growth in higher education in recent years; by focusing on this region, we aim to address the need for strategic evaluation approaches that can be tailored to the unique characteristics of the Middle Eastern higher education landscape. The six selected colleges represent a wide range of disciplines, from engineering and natural sciences to humanities and social sciences. This diversity allowed us to explore the applicability of the proposed MCDM approach across different academic fields and to demonstrate its versatility. The colleges primarily deal with the fields of architecture (A1), management (A2), technology (A3), engineering (A4), science (A5), and law (A6). Some of the programs have local accreditation, while others have international accreditation. However, one of the university's strategic objectives is to obtain international accreditation for all its programs, besides achieving a high ranking worldwide. The following is an application of the three phases described earlier to rank the university colleges according to the educational quality standards and weighted criteria.

4.1. Phase 1: Identifying the Education Quality Standards

In accordance with steps 1.1 and 1.2, a panel of specialists from the university’s top administration chose five international systems that rank universities according to certain standards’ criteria and weights. The systems selected and the associated data are shown in Table 1.
The five international systems considered in this research are Times Higher Education (THE), Quacquarelli Symonds (QS), the Academic Ranking of World Universities (ARWU), the Webometrics Ranking of World Universities, and the Universitas Indonesia (UI) GreenMetric. THE compiles university rankings to evaluate academic institutions worldwide and to help the public learn more about the varied goals and achievements of the world's top universities [63]. QS, the world's preeminent provider of services, analytics, and insight into the global higher education sector, aims to help ambitious people everywhere realize their full potential by pursuing higher education, traveling the world, and advancing their professional careers [64]. First released in June 2003 by the Center for World-Class Universities (CWCU) at the Graduate School of Education (previously the Institute of Higher Education) of Shanghai Jiao Tong University, China, the ARWU is updated annually. Shanghai Ranking Consultancy has owned the publication rights to the ARWU since 2009; it is not legally subservient to any institutions or government organizations, making it a truly independent source of higher education intelligence [65]. The Cybermetrics Lab of the Consejo Superior de Investigaciones Científicas (CSIC), the largest public research agency in Spain, is responsible for compiling the Webometrics Ranking of World Universities [66]. In 2010, Universitas Indonesia launched the UI GreenMetric World University Rankings, a ranking focused on green campuses and environmental sustainability, determined based on institutions' environmental commitment and projects using 39 indicators across 6 categories [67].
Furthermore, an additional, sixth set of standards’ criteria (S6) was chosen based on the strategic plan of XYZ University. With respect to step 1.3, Table 2 displays the criteria grouped under the four BSC perspectives: financial, learning and growth, internal processes, and customer. The BSC is a strategic performance management tool with which universities can boost their external results by focusing on and optimizing their internal processes. It considers historical performance metrics and provides universities with helpful guidance for improving their future decision-making [68].

4.2. Phase 2: Determining the Criteria Weights

As indicated in Figure 2, the cross table between the criteria of the four BSC perspectives at XYZ University and the standards’ criteria of the international university ranking systems was produced in accordance with step 2.1. Experts from the university’s top management expressed the relationship between each BSC criterion and each standard’s criterion using binary values: 1 denotes a link, and 0 denotes none. In Figure 2, cells with a value of 0 are left blank for visual clarity. As per step 2.2, the weights of the standards’ criteria used by the university ranking systems are given in Table 1. Equations (1)–(5), illustrated in step 2.3, were then used to determine the weights of the educational quality criteria of the university colleges; the results of this step are shown in Table 3.
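The weight derivation of step 2.3 can be illustrated with a minimal sketch. The exact forms of Equations (1)–(5) are given in the methodology section; here we assume, for illustration only, that each BSC criterion’s weight aggregates the weights of the standards’ criteria it is linked to in the cross table and is then normalized to sum to one. The matrix sizes and all values below are hypothetical, not taken from Figure 2 or Table 1.

```python
import numpy as np

# Hypothetical illustration: 3 BSC criteria x 4 standards' criteria.
# link[i, j] = 1 if BSC criterion i maps to standards' criterion j (step 2.1).
link = np.array([
    [1, 0, 1, 0],
    [0, 1, 1, 1],
    [1, 1, 0, 0],
])

# Weights of the standards' criteria from the ranking systems (step 2.2).
std_weights = np.array([0.30, 0.30, 0.30, 0.10])

# One plausible aggregation (step 2.3): sum the weights of the linked
# standards' criteria, then normalize so the BSC criteria weights sum to 1.
raw = link @ std_weights          # [0.6, 0.7, 0.6]
weights = raw / raw.sum()         # relative weights, summing to 1
```

With this convention, a BSC criterion linked to more (or more heavily weighted) standards’ criteria receives a proportionally larger weight in the subsequent MCDM evaluation.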

4.3. Phase 3: Ranking the University Colleges (Alternatives)

The necessary information from the six colleges (alternatives) was gathered to create the decision matrix, as described in step 3.1. Except for criteria F7, F9, and P3, which are minimized, all criteria are maximized. Appendix A shows the results obtained from steps 3.2 to 3.8. Table A1 and Table A2 show the normalized and weighted normalized input data based on steps 3.2 and 3.3, respectively, using Equations (7)–(11). In step 3.4, Equations (12) and (13) were applied to identify the optimal alternative. Subsequently, steps 3.5 and 3.6 used Equations (14)–(17) to define the decomposition of the optimal alternative and of each of the alternatives; the results are displayed in Table A3 and Table A4. Equations (18)–(21) were then used to calculate the magnitude of the optimal alternative and of the other alternatives; Table A5 displays the values obtained in this step. Steps 3.7a and 3.7b ranked the alternatives using Equations (22)–(27); the rankings produced by the alternative trace and median similarity methods are displayed in Table A6 and Table A7. Finally, step 3.8 completes the RATMI methodology by computing the majority index between the alternative trace and median similarity methods using Equation (28) with v = 0.5. Table 4 presents the findings.
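To make the flow of steps 3.2–3.8 concrete, the sketch below illustrates vector normalization of a decision matrix with mixed maximized/minimized criteria, criterion weighting, and the v = 0.5 majority-index blend of two ranking scores. It is a simplified stand-in, not the full RATMI formulation: the decomposition, magnitude, alternative trace, and median similarity computations (Equations (12)–(27)) are replaced here by placeholder scores Q1 and Q2, and all data are hypothetical.

```python
import numpy as np

# Hypothetical data: 4 alternatives (colleges) x 3 criteria.
X = np.array([
    [7.0, 0.30, 120.0],
    [5.0, 0.45,  90.0],
    [9.0, 0.20, 150.0],
    [6.0, 0.35, 110.0],
])
weights = np.array([0.5, 0.2, 0.3])
benefit = np.array([True, False, True])  # criterion 2 is minimized (like F7, F9, P3)

# Vector normalization (step 3.2); minimized criteria are inverted first
# so that larger is always better.
Xb = X.copy()
Xb[:, ~benefit] = Xb[:, ~benefit].min(axis=0) / Xb[:, ~benefit]
N = Xb / np.sqrt((Xb ** 2).sum(axis=0))

# Weighted normalized matrix (step 3.3), as in Table A2.
V = N * weights

# Placeholder scores standing in for the alternative trace and median
# similarity methods (steps 3.7a/3.7b), blended with the majority index
# and v = 0.5 (step 3.8).
Q1 = V.sum(axis=1) / V.sum(axis=1).max()
Q2 = V.max(axis=1) / V.max(axis=1).max()
v = 0.5
Q = v * Q1 + (1 - v) * Q2

ranking = np.argsort(-Q)  # indices of alternatives, best first
```

The actual RATMI scores in Table 4 come from Equations (12)–(28); the point of the sketch is only the pipeline shape: normalize, weight, score by two methods, then blend with the majority index.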

5. Discussion

The study’s objective was attained by revealing the educational quality performance ranks of the XYZ University colleges based on the combined set of criteria. This combination was achieved by mapping the standards’ criteria of the international university ranking systems to the BSC perspectives of the studied university’s strategic plan and their pertaining criteria. The mapping informed the criteria weights on which the MCDM approach, using the RATMI technique, was conducted to find the performance ranking of the colleges. The findings revealed that, based on the quality criteria and their derived weightings, the colleges rank, from most to least satisfying the criteria, as follows: engineering (A4), management (A2), technology (A3), architecture (A1), science (A5), and law (A6).
RATMI ranks the performance of different alternatives against multiple criteria. Table 4 shows the quantitative scores the technique assigns to the alternatives based on their evaluation. The decision-maker can use these scores to make informed quality improvement decisions by identifying the colleges with lower rankings and the factors contributing to their lower scores. For example, if a college scores low on a certain criterion, the decision-maker can work on improving that aspect to raise the college’s overall ranking. Colleges should therefore pay attention to the criteria that lowered their performance. For instance, they could improve their ranking by (1) conducting a SWOT analysis of their strengths and weaknesses, (2) creating effective marketing and branding campaigns, (3) recruiting the best scholars, (4) ensuring quality standards through hiring committees, (5) offering new incentives to motivate staff to change their behavior, and (6) producing original research and making a positive impact on society.
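Tracing weak criteria from the weighted scores can be sketched as follows; the matrix values and criterion names below are hypothetical, not taken from Table A2:

```python
import numpy as np

# Hypothetical weighted normalized scores (rows: colleges A1-A3,
# columns: criteria C1-C4), in the spirit of Table A2.
V = np.array([
    [0.12, 0.08, 0.15, 0.05],
    [0.10, 0.14, 0.09, 0.11],
    [0.06, 0.09, 0.13, 0.12],
])
criteria = ["C1", "C2", "C3", "C4"]

# For each college, the criterion with the lowest weighted score is a
# candidate target for quality improvement.
weakest = [criteria[j] for j in V.argmin(axis=1)]
print(weakest)  # -> ['C4', 'C3', 'C1']
```

In practice, the decision-maker would inspect several of the lowest-scoring criteria per college rather than only the minimum, since improvement costs differ across criteria.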
By its nature, developing a strategy map necessitates the involvement of a management board or a group of experts who assign values. One reason for conducting this study was therefore to evaluate the viability of such collective decision-making in a real-world setting. The main difference between this study and earlier work is the use of the RATMI technique from the perspective of each BSC dimension and the comparison of the results. The inferences of the research may be applied to short-term and long-term goals to improve the current situation at universities. Applying the proposed approach can help the decision-maker set more specific long- and short-term objectives for each evaluated college, based on its performance ranking and by tracing back the criteria that lowered its overall ranking score, because the study defined the most important indices for each development criterion. According to the RATMI findings, cost control is the most important index from the financial perspective, product quality is the most important index from the customer perspective, and college finances and classroom material/experiences are the most important indices from the internal processes perspective. Additionally, investment is the most important index from the learning and growth perspective.
This study puts forward a novel strategic approach for universities to develop efficient strategies that enable achieving their educational quality objectives by unifying their efforts in satisfying the requirements of multiple international ranking systems while achieving their strategic goals concurrently and as per their priorities. Implications of this study also include informing university leaders and decision-makers on the most contributing colleges in achieving their strategic objectives and international rankings. Furthermore, it assists in pinpointing the quality shortcomings and their causes. This, in turn, will also help universities design more precise quality performance indicators and better allocate their resources to achieve educational excellence ultimately.

6. Conclusions

All higher education institutions strive for excellence in research, learning, teaching, and community engagement. Universities with a recognized reputation and a world ranking aim to develop strategic plans that attain such objectives. The issue stems from the lack of a comprehensive strategic framework that aligns the university’s strategic objectives and the criteria derived from the BSC perspectives of its strategic plan with the standards’ criteria of international ranking systems. Such a framework provides university decision-makers with the right tool for evaluating and realizing educational quality objectives by consolidating their efforts to fulfill the requirements of diverse worldwide ranking systems while achieving their strategic goals. This led to the research question: how can the higher education quality of university colleges be evaluated using criteria derived from the BSC perspectives of the university’s strategic plan and, simultaneously, the standard criteria of international ranking systems? In response, this study proposed a novel strategy for ranking university colleges in terms of educational quality, achieved by cross-mapping standard sets of criteria from multiple international university ranking systems and the BSC perspectives of a university’s strategic plan, using an MCDM technique for the evaluation.
The proposed strategic framework is applied to six colleges of a non-profit university in the Middle East, exposing their performance rankings and contributions to fulfilling the university’s educational quality objectives and the requirements of diverse worldwide ranking systems simultaneously.
The study’s findings demonstrate the effectiveness of the proposed strategic framework and of the MCDM evaluation method in apprising university leaders and decision-makers of the colleges that contribute the most to achieving their strategic objectives and international rankings. Furthermore, the findings show that the proposed methodology accurately identifies quality deficiencies and their causes. This, in turn, will help universities formulate more accurate performance metrics and allocate resources efficiently to attain educational excellence.
There are always limitations to any research study due to internal and external environmental considerations. Applying the proposed framework requires the availability of resources and skills to create and maintain high-quality rankings. Furthermore, the data collection, analysis, verification, and dissemination processes are expensive and time-consuming, and they must be carried out rigorously and transparently. Moreover, university-ranking bodies require substantial funding, which can be affected by global economic changes. Additionally, evaluation alone cannot guarantee education quality, as quality is primarily influenced by the commitment and alignment of all academic and administrative staff toward satisfying quality standards.
New assessment methods could be applied in future research to align the criteria derived from the BSC with the standards’ criteria of international ranking systems. Furthermore, researchers may develop other methods for weighting the standards’ criteria, producing customized weightings that suit a university’s temporal and spatial context in terms of its type (i.e., public/private, profit/non-profit), size, resources, degrees offered, number of students, nature of the scientific disciplines taught, and location, to mention a few. Customized weightings of the standards’ criteria align better with a university’s particular strategic goals and objectives and provide more insight into the design of better-targeted education quality strategies.

Author Contributions

Conceptualization, A.A.M., A.Y.A., R.M.S.A. and A.I.M.; methodology, A.A.M., A.Y.A. and R.M.S.A.; software, R.M.S.A.; validation, A.A.M. and A.Y.A.; formal analysis, A.A.M., A.Y.A. and R.M.S.A.; investigation, A.A.M., A.Y.A., R.M.S.A. and A.I.M.; resources, A.A.M., A.Y.A., R.M.S.A. and A.I.M.; data curation, A.A.M. and A.Y.A.; writing—original draft preparation, A.A.M., A.Y.A., R.M.S.A. and A.I.M.; writing—review and editing, A.A.M., A.Y.A., R.M.S.A. and A.I.M.; visualization, A.A.M., A.Y.A., R.M.S.A. and A.I.M.; supervision, A.A.M., A.Y.A., R.M.S.A. and A.I.M. All authors have read and agreed to the published version of the manuscript.


Funding
This research received no external funding.

Institutional Review Board Statement

The Research Ethics Committee (REC) (National Committee of Bio-Ethics Registration Number: HA-02-J008) at the Unit of Biomedical Ethics, Faculty of Medicine, King Abdulaziz University granted a full ethical approval/exemption (Reference Number: 32-22) for the participants’ consent and the collected information in this study.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data presented in this study are available on request from the corresponding author. The data are not publicly available due to privacy and ethical restrictions.


Acknowledgments
The authors acknowledge and thank all the respondents in this research study.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Appendix A depicts the complete calculation of the RATMI method. The calculation is presented based on the steps outlined in Section 3 (Steps 3.2 to 3.8).
Table A1. Normalized decision-making matrix.
(Rows: the six alternatives (colleges); columns: criteria F1–F9, G1–G21, P1–P23, and E1–E5. Values not reproduced here.)
Table A2. Weighted normalized decision-making matrix.
(Rows: the six alternatives (colleges); columns: criteria F1–F9, G1–G21, P1–P23, and E1–E5. Values not reproduced here.)
Table A3. Decomposition of the optimal alternative.
Table A4. Decomposition of alternatives.
Table A5. The magnitude of optimal alternatives.
Table A6. Alternatives ranked according to the alternative trace method.
Table A7. Alternatives ranked according to the median similarity method.
(Columns: Alternative | Max. (Qk) | Min. (Qh) | Median (M) | Perimeter Similarity (Si = Mi/M) | Rank.)


  1. Syed Hassan, S.A.H.; Tan, S.C.; Yusof, K.M. MCDM for Engineering Education: Literature Review and Research Issues. In Proceedings of the Engineering Education for a Smart Society: World Engineering Education Forum & Global Engineering Deans Council 2016, Cham, Switzerland, 7 July 2017; pp. 204–214. [Google Scholar] [CrossRef]
  2. Jongbloed, B.; Vossensteyn, H. Keeping up performances: An international survey of performance-based funding in higher education. J. High. Educ. Policy Manag. 2001, 23, 127–145. [Google Scholar] [CrossRef]
  3. Buzzigoli, L.; Giusti, A.; Viviani, A. The evaluation of university departments. A case study for Firenze. Int. Adv. Econ. Res. 2010, 16, 24–38. [Google Scholar] [CrossRef]
  4. Nazari-Shirkouhi, S.; Mousakhani, S.; Tavakoli, M.; Dalvand, M.R.; Šaparauskas, J.; Antuchevičienė, J. Importance-performance analysis based balanced scorecard for performance evaluation in higher education institutions: An integrated fuzzy approach. J. Bus. Econ. Manag. 2020, 21, 647–678. [Google Scholar] [CrossRef][Green Version]
  5. Falch, T.; Iversen, J.M.V.; Nyhus, O.H.; Strøm, B. Quality measures in higher education: Norwegian evidence. Econ. Educ. Rev. 2022, 87, 102235. [Google Scholar] [CrossRef]
  6. Mandinach, E.B.; Schildkamp, K. Misconceptions about data-based decision making in education: An exploration of the literature. Stud. Educ. Eval. 2021, 69, 100842. [Google Scholar] [CrossRef]
  7. Du, M. Balanced scorecard in university financial management. In Proceedings of the ICIMTECH 21: The Sixth International Conference on Information Management and Technology, Jakarta, Indonesia, 19–20 August 2021; pp. 1–5. [Google Scholar] [CrossRef]
  8. Ozsahin, D.U.; Denker, A.; Kibarer, A.G.; Kaba, S. Evaluation of stage IV brain cancer treatment techniques. In Applications of Multi-Criteria Decision-Making Theories in Healthcare and Biomedical Engineering; Elsevier: Amsterdam, The Netherlands, 2021; pp. 59–69. [Google Scholar] [CrossRef]
  9. Chen, C.-H. A New Multi-criteria Assessment Model Combining GRA Techniques with Intuitionistic Fuzzy Entropy-Based TOPSIS Method for Sustainable Building Materials Supplier Selection. Sustainability 2019, 11, 2265. [Google Scholar] [CrossRef][Green Version]
  10. Ibrahim, A.; Surya, R. The implementation of simple additive weighting (SAW) method in decision support system for the best school selection in Jambi. In Proceedings of the Journal of Physics: Conference Series, The 2nd International Conference on Applied Sciences Mathematics and Informatics, Bandar Lampung, Indonesia, 9–11 August 2018; IOP Publishing Ltd.: Bristol, UK, 2019; p. 012054. [Google Scholar] [CrossRef][Green Version]
  11. Kraujalienė, L. Comparative Analysis of Multi-Criteria Decision-Making Methods Evaluating the Efficiency of Technology Transfer. Bus. Manag. Educ. 2019, 17, 72–93. Available online: (accessed on 7 May 2023). [CrossRef]
  12. Kabassi, K. Comparing Multi-criteria Decision Making Models for Evaluating Environmental Education Programs. Sustainability 2021, 13, 11220. [Google Scholar] [CrossRef]
  13. Miç, P.; Antmen, Z.F. A Decision-Making Model Based on TOPSIS, WASPAS, and MULTIMOORA Methods for University Location Selection Problem. SAGE Open 2021, 11, 21582440211040115. [Google Scholar] [CrossRef]
  14. Thakkar, J.J. Multi-Criteria Decision Making; Springer: Berlin/Heidelberg, Germany, 2021; Volume 336. [Google Scholar]
  15. Fofan, A.C.; Oliveira, L.A.B.d.; de Melo, F.J.C.; Jerônimo, T.d.B.; de Medeiros, D.D. An integrated methodology using PROMETHEE and Kano’s Model to rank strategic decisions. Eng. Manag. J. 2019, 31, 270–283. [Google Scholar] [CrossRef]
  16. Su, W.; Zhang, L.; Zhang, C.; Zeng, S.; Liu, W. A Heterogeneous Information-Based Multi-Attribute Decision Making Framework for Teaching Model Evaluation in Economic Statistics. Systems 2022, 10, 86. [Google Scholar] [CrossRef]
  17. Abdulaal, R.; Bafail, O.A. Two New Approaches (RAMS-RATMI) in Multi-criteria Decision-Making Tactics. J. Math. 2022, 2022, 6725318. [Google Scholar] [CrossRef]
  18. Furqatovna, O.; Niyozovna, N.; Nutfulloyevna, A. Approaches Aimed at Ensuring a High Quality of Education in the Training of Economists. J. Ethics Divers. Int. Commun. 2022, 2, 78–83. Available online: (accessed on 7 May 2023).
  19. Muller, K.; Scalzo, K.A.; Pickett, A.M.; Dugan, L.; Dubuc, L.; Simiele, D.; McCabe, R.; Pelz, W. Ensuring Online Learning Quality: Perspectives from the State University of New York. Online Learn. 2020, 24, 254–268. Available online: (accessed on 7 May 2023). [CrossRef]
  20. Kibik, O.; Nikolaieva, L.; Khaiminova, I.; Bereza, V. The key factors in ensuring the quality of maritime education in Ukraine. In Proceedings of the 6th International Conference on Strategies, Models and Technologies of Economic Systems Management (SMTESM 2019), Khmelnytskyi, Ukraine, 4–6 October 2019; pp. 114–118. [Google Scholar] [CrossRef][Green Version]
  21. Dewi, M.P.; Rahmatunnisa, M.; Sumaryana, A.; Kristiadi, J. Ensuring service quality in education for Indonesia’s sustainable education. J. Soc. Stud. Educ. Res. 2018, 9, 65–81. Available online: (accessed on 7 May 2023).
  22. Hladchenko, M. Balanced Scorecard–a strategic management system of the higher education institution. Int. J. Educ. Manag. 2015, 29, 167–176. [Google Scholar] [CrossRef]
  23. Fijałkowska, J.; Oliveira, C. Balanced scorecard in universities. J. Intercult. Manag. 2018, 10, 57–83. [Google Scholar] [CrossRef]
  24. Gamal, A.; Soemantri, A.I. The effect of balanced scorecard on the private college performance (Case study at the University of WR Supratman Surabaya). Arch. Bus. Res. 2017, 5, 126–134. [Google Scholar] [CrossRef][Green Version]
  25. Kiriri, P.N. Management of Performance in Higher Education Institutions: The Application of the Balanced Scorecard (BSC). Eur. J. Educ. 2022, 5, 144–158. Available online: (accessed on 7 May 2023). [CrossRef]
  26. Llach, J.; Bagur, L.; Perramon, J.; Marimon, F. Creating value through the balanced scorecard: How does it work? Manag. Decis. 2017, 55, 2181–2199. [Google Scholar] [CrossRef]
  27. Camilleri, M.A. Using the balanced scorecard as a performance management tool in higher education. Manag. Educ. 2021, 35, 10–21. [Google Scholar] [CrossRef]
  28. Wijayanti, N.; Setiawan, W.; Sukamto, R. Performance assessment of IT governance with balanced score card and COBIT 4.1 of Universitas Pendidikan Indonesia. In Proceedings of the Journal of Physics: Conference Series, International Seminar on Mathematics, Science, and Computer Science Education (MSCEIS 2016), Bandung, Indonesia, 15 October 2016; IOP Publishing Ltd.: Bristol, UK, 2017; p. 012072. [Google Scholar] [CrossRef]
  29. Chimtengo, S.; Mkandawire, K.; Hanif, R. An evaluation of performance using the balanced scorecard model for the University of Malawi's Polytechnic. Afr. J. Bus. Manag. 2017, 11, 84–93. [Google Scholar] [CrossRef][Green Version]
  30. Ramísio, P.J.; Pinto, L.M.C.; Gouveia, N.; Costa, H.; Arezes, D. Sustainability Strategy in Higher Education Institutions: Lessons learned from a nine-year case study. J. Clean. Prod. 2019, 222, 300–309. [Google Scholar] [CrossRef]
  31. Al-Hosaini, F.F.; Sofian, S. A review of balanced scorecard framework in higher education institution (HEIs). Int. Rev. Manag. Mark. 2015, 5, 26–35. Available online: (accessed on 7 May 2023).
  32. Shafii, M.; Rafiei, S.; Abooee, F.; Bahrami, M.A.; Nouhi, M.; Lotfi, F.; Khanjankhani, K. Assessment of Service Quality in Teaching Hospitals of Yazd University of Medical Sciences: Using Multi-criteria Decision Making Techniques. Osong Public Health Res. Perspect. 2016, 7, 239–247. [Google Scholar] [CrossRef][Green Version]
  33. Castro-Lopez, A.; Cervero, A.; Galve-González, C.; Puente, J.; Bernardo, A.B. Evaluating critical success factors in the permanence in Higher Education using multi-criteria decision-making. High. Educ. Res. Dev. 2022, 41, 628–646. [Google Scholar] [CrossRef]
  34. Abbasi, B. Identifying and Ranking of University Strategic Human Resources Management Criteria Based on Multi-criteria Decision Making Methods. Public Adm. Perspective 2020, 11, 127–147. Available online: (accessed on 7 May 2023).
  35. Keshavarz-Ghorabaee, M. Assessment of distribution center locations using a multi-expert subjective–objective decision-making approach. Sci. Rep. 2021, 11, 19461. [Google Scholar] [CrossRef] [PubMed]
  36. Bafail, O.A.; Abdulaal, R.M.S.; Kabli, M.R. AHP-RAPS Approach for Evaluating the Productivity of Engineering Departments at a Public University. Systems 2022, 10, 107. [Google Scholar] [CrossRef]
  37. Tuan, N.; Hue, T.; Lien, L.; Thao, T.; Quyet, N.; Van, L.; Anh, L. A new integrated MCDM approach for lecturers’ research productivity evaluation. Decis. Sci. Lett. 2020, 9, 355–364. [Google Scholar] [CrossRef]
  38. Turan, H. Assessment factors affecting e-learning using fuzzy analytic hierarchy process and SWARA. Int. J. Eng. Educ. 2018, 34, 915–923. [Google Scholar]
  39. Huang, H.C. Designing a knowledge-based system for strategic planning: A balanced scorecard perspective. Expert Syst. Appl. 2009, 36, 209–218. [Google Scholar] [CrossRef]
  40. Kim, H.-S.; Kim, Y.-G. A CRM performance measurement framework: Its development process and application. Ind. Mark. Manag. 2009, 38, 477–489. [Google Scholar] [CrossRef]
  41. Varma, S.; Wadhwa, S.; Deshmukh, S. Evaluating petroleum supply chain performance: Application of analytical hierarchy process to balanced scorecard. Asia Pac. J. Mark. Logist. 2008, 20, 343–356. [Google Scholar] [CrossRef]
  42. Chan, Y.C.L. An analytic hierarchy framework for evaluating balanced scorecards of healthcare organizations. Can. J. Adm. Sci./Rev. Can. Sci. L’administration 2006, 23, 85–104. [Google Scholar] [CrossRef]
  43. Leung, L.C.; Lam, K.C.; Cao, D. Implementing the balanced scorecard using the analytic hierarchy process & the analytic network process. J. Oper. Res. Soc. 2006, 57, 682–691. [Google Scholar] [CrossRef]
  44. Fletcher, H.; Smith, D.B. Managing for value: Developing a performance measurement system integrating economic value added and the balanced scorecard in strategic planning. J. Bus. Strateg. 2004, 21, 1–18. [Google Scholar] [CrossRef]
  45. Reisinger, H.; Cravens, K.S.; Tell, N. Prioritizing performance measures within the balanced scorecard framework. MIR Manag. Int. Rev. 2003, 43, 429–437. [Google Scholar]
  46. Stewart, R.A.; Mohamed, S. Utilizing the balanced scorecard for IT/IS performance evaluation in construction. Constr. Innov. 2001, 1, 147–163. [Google Scholar] [CrossRef]
  47. Saaty, R.W. Decision Making in Complex Environment: The Analytic Hierarchy Process (AHP) for Decision Making and the Analytic Network Process (ANP) for Decision Making with Dependence and Feedback; Super Decisions: Pittsburgh, PA, USA, 2003. [Google Scholar]
  48. Lee, M.C.; Wang, H.W.; Wang, H.Y. A method of performance evaluation by using the analytic network process and balanced score card. In Proceedings of the 2007 International Conference on Convergence Information Technology (ICCIT 2007), Gwangju, Republic of Korea, 21–23 November 2007; pp. 235–240. [Google Scholar] [CrossRef]
  49. Saaty, T.L.; Vargas, L.G.; Saaty, T.L.; Vargas, L.G. The Analytic Network Process; Springer: Boston, MA, USA, 2013; Volume 195, pp. 1–40. [Google Scholar] [CrossRef]
  50. Opricovic, S. Multi-Criteria Optimization of Civil Engineering Systems. Ph.D. Thesis, Faculty of Civil Engineering, Belgrade, Serbia, 1998. (In Serbian). [Google Scholar]
  51. Opricovic, S.; Tzeng, G.-H. Extended VIKOR method in comparison with outranking methods. Eur. J. Oper. Res. 2007, 178, 514–529. [Google Scholar] [CrossRef]
  52. Yu, P.L. A class of solutions for group decision problems. Manag. Sci. 1973, 19, 936–946. [Google Scholar] [CrossRef]
  53. Zeleny, M. Multiple criteria decision making: Eight concepts of optimality. Hum. Syst. Manag. 1998, 17, 97–107. [Google Scholar] [CrossRef]
  54. Opricovic, S.; Tzeng, G.-H. Compromise solution by MCDM methods: A comparative analysis of VIKOR and TOPSIS. Eur. J. Oper. Res. 2004, 156, 445–455. [Google Scholar] [CrossRef]
  55. Wu, H.-Y.; Lin, Y.-K.; Chang, C.-H. Performance evaluation of extension education centers in universities based on the balanced scorecard. Eval. Program Plan. 2011, 34, 37–50. [Google Scholar] [CrossRef]
  56. Jassbi, J.; Mohamadnejad, F.; Nasrollahzadeh, H. A Fuzzy DEMATEL framework for modeling cause and effect relationships of strategy map. Expert Syst. Appl. 2011, 38, 5967–5973. [Google Scholar] [CrossRef]
  57. Ghadikolaei, A.S.; Chen, I.-S.; Akbarzadeh, S.H.Z.Z. Using DEMATEL method for cause and effect relations of BSC in universities of Iran. In Proceedings of the BALCOR 2011, Thessaloniki, Greece, 22–24 September 2011; pp. 333–340. [Google Scholar]
  58. Shekhovtsov, A.; Sałabun, W. A comparative case study of the VIKOR and TOPSIS rankings similarity. Procedia Comput. Sci. 2020, 176, 3730–3740. [Google Scholar] [CrossRef]
  59. Perdana, A.; Budiman, A. College Ranking Analysis Using VIKOR Method. J. Comput. Netw. Archit. High Perform. Comput. 2021, 3, 241–248. [Google Scholar] [CrossRef]
  60. Sunder, M.V.; Antony, J. A conceptual Lean Six Sigma framework for quality excellence in higher education institutions. Int. J. Qual. Reliab. Manag. 2018, 35, 857–874. [Google Scholar] [CrossRef]
  61. Shanshan, S.; Wenfei, L.; Lijuan, L. Applying lean six sigma incorporated with big data analysis to curriculum system improvement in higher education institutions. Int. J. Syst. Assur. Eng. Manag. 2022, 13, 641–656. [Google Scholar] [CrossRef]
  62. Laux, C.; Li, N.; Seliger, C.; Springer, J. Impacting big data analytics in higher education through six sigma techniques. Int. J. Product. Perform. Manag. 2017, 66, 662–679. [Google Scholar] [CrossRef]
  63. THE Times Higher Education. Available online: (accessed on 4 May 2023).
  64. QS TOPUNIVERSITIES. Available online: (accessed on 4 May 2023).
  65. ARWU. Academic Ranking of World Universities. Available online: (accessed on 4 May 2023).
  66. WEB. Webometrics Ranking of World Universities. Available online: (accessed on 4 May 2023).
  67. UI GreenMetric World University Rankings. Available online: (accessed on 4 May 2023).
  68. Kaplan, R.S.; Norton, D.P. Balanced Scorecard Success: The Kaplan-Norton Collection (4 Books); Harvard Business Review Press: Boston, MA, USA, 2015. [Google Scholar]
Figure 1. The framework of the proposed strategy.
Figure 2. The cross table between the standards’ criteria of university ranking systems and Balanced Scorecard’s (BSC) criteria of XYZ University.
Table 1. The selected systems with their associated standards’ criteria and weights.

Ranking System | Ranking Standards’ Criteria | Weights
Times Higher Education World University Rankings (THE) | S1-1 Teaching (the learning environment) | 0.30
| S1-2 Research (volume, income, reputation) | 0.30
| S1-3 Citations (research influence) | 0.30
| S1-4 International Outlook (staff, students, research) | 0.075
| S1-5 Industry Income (knowledge transfer) | 0.025
| Total sum | 1
QS World University Rankings (QS) | S2-1 Academic Reputation | 0.40
| S2-2 Employer Reputation | 0.10
| S2-3 Faculty–Student Ratio | 0.20
| S2-4 Citations per Faculty | 0.20
| S2-5 International Faculty Ratio | 0.05
| S2-6 International Student Ratio | 0.05
| Total sum | 1
Academic Ranking of World Universities (ARWU) | S3-1 Quality of Education (alumni who have won Nobel prizes) | 0.10
| S3-2 Quality of Faculty in terms of staff winning Nobel prizes and Highly Cited (HiCi) classified staff | 0.40
| S3-3 Research Output (papers published in N&S and PUB) | 0.40
| S3-4 Per Capita Performance (PCP) | 0.10
| Total sum | 1
Webometrics Ranking of World Universities (WEB) | S4-1 Visibility | 0.50
| S4-2 Transparency (openness) | 0.10
| S4-3 Excellence (scholar) | 0.40
| Total sum | 1
UI GreenMetric World University Rankings (UI) | S5-1 Setting and Infrastructure | 0.15
| S5-2 Energy and Climate Change | 0.21
| S5-3 Waste | 0.18
| S5-4 Water | 0.10
| S5-5 Transportation | 0.18
| S5-6 Education and Research | 0.18
| Total sum | 1
XYZ University | S6-1 Curriculum | 0.23
| S6-2 Academic Staff | 0.17
| S6-3 Infrastructure | 0.16
| S6-4 E-Services | 0.11
| S6-5 Community Services | 0.10
| S6-6 Library Services | 0.13
| S6-7 Administrative Services | 0.10
| Total sum | 1
Table 2. The criteria related to XYZ University’s Balanced Scorecard (BSC) perspectives.
Table 2. The criteria related to XYZ University’s Balanced Scorecard (BSC) perspectives.
Financial Perspective
F1. Research income per academic staffF4. Annual budget allocated by the universityF7. Conventional to smart implementation ratio (in %)
F2. Total income from contracts with industryF5. Percentage implementing recycling programsF8. Water conservation program implementation (Yes/No)
F3. The annual revenue from postgraduate programsF6. Percentage implementing paperless practicesF9. The ratio of surface parking spaces to the building’s overall area
Learning and Growth Perspective
G1. Percentage of operation and maintenance activities of the building during the COVID-19 pandemic
G2. Percentage of satisfaction from special needs facilities
G3. Percentage satisfied with health facilities
G4. Percentage satisfied with security and safety facilities
G5. Percentage availability of up-to-date books and journals
G6. Availability of E-library (Yes/No)
G7. Rate sufficient places to sit and read (from 1 to 5)
G8. Percentage attracting high-caliber teaching staff
G9. Number of teaching staff (Ph.D. holders)
G10. Number of full professors (excluding retired)
G11. Number of Highly Cited (HiCi) academic staff
G12. The number of faculty divided by the number of students
G13. Number of staff with an h-index greater than 20
G14. Percentage of academic staff with foreign citizenship
G15. Number of doctoral staff awarded international prizes
G16. Number of staff who earned a Ph.D. from top 100 universities (THEMS ranking)
G17. Percentage satisfied with academic and administrative services provided on the website
G18. Percentage satisfied with appealing and efficiently arranged website
G19. E-services prompt technical support (Yes/No)
G20. E-services are accessible in different ways (Yes/No)
G21. The website shows the research outcomes by academic staff and students (Yes/No)
Internal Processes Perspective
P1. Percentage of students with foreign citizenship
P2. Number of accredited programs internationally
P3. Number of accredited programs locally
P4. The ratio of sustainability courses to total courses/subjects
P5. Number of Ph.D.s awarded by the college
P6. Percentage of satisfaction with the current academic advising
P7. Number of curriculums or programs aligned with requirements of the labor market
P8. The curriculum enhances student skills and self-capabilities (Yes/No)
P9. The proportion of international postgraduate students
P10. The average number of published papers
P11. Number of ISI (Q1) papers published over the last five years
P12. Number of Scopus (Q1) papers published over the last five years
P13. Average citations per paper annually
P14. Number of citations in last five years divided by the number of staff members
P15. Number of certified labs
P16. Rate the availability of catering services (from 1 to 5)
P17. Rate the availability of sporting facilities (from 1 to 5)
P18. Percentage satisfied with medical services
P19. Student's hostel (Yes/No)
P20. Friendliness of advising system (Yes/No)
P21. Rate the availability of administrative materials for services (from 1 to 5)
P22. Rating of the clarity of administrative guidelines and advice (from 1 to 5)
P23. Number of initiatives during the COVID-19 pandemic
Customer Perspective
E1. Number of community services related to sustainability
E2. Number of scientific societies
E3. Number of international collaborations
E4. Number of bachelor or master's students awarded prizes
E5. Number of prizes awarded to the college
Table 3. The educational quality criteria, scores, and relative weights.
Total Criteria Score = TS = 3550%
Financial perspective criteria/scores/weights (F1–F9)
S_fi: 70%, 42.5%, 70%, 70%, 18%, 18%, 63%, 10%, 18%
f_i: 0.020, 0.012, 0.020, 0.020, 0.005, 0.005, 0.018, 0.003, 0.005
Learning and Growth perspective criteria/scores/weights (G1–G21)
S_gi (G1–G12): 31%, 31%, 31%, 31%, 13%, 24%, 44%, 202.5%, 67%, 242.5%, 227.5%, 50%
g_i (G1–G12): 0.009, 0.009, 0.009, 0.009, 0.004, 0.007, 0.012, 0.057, 0.019, 0.068, 0.064, 0.014
S_gi (G13–G21): 155.5%, 119.5%, 47.5%, 87.5%, 60%, 10%, 21%, 21%, 90%
g_i (G13–G21): 0.044, 0.034, 0.013, 0.025, 0.017, 0.003, 0.006, 0.006, 0.025
Internal processes criteria/scores/weights (P1–P23)
S_pi (P1–P12): 52.5%, 138.5%, 131%, 67%, 88%, 75%, 33%, 63%, 57.5%, 138%, 100%, 100%
p_i (P1–P12): 0.015, 0.039, 0.037, 0.019, 0.025, 0.021, 0.009, 0.018, 0.016, 0.039, 0.028, 0.028
S_pi (P13–P23): 60%, 80%, 31%, 16%, 31%, 16%, 41%, 10%, 10%, 50%, 25%
p_i (P13–P23): 0.017, 0.023, 0.009, 0.005, 0.009, 0.005, 0.012, 0.003, 0.003, 0.014, 0.007
Customers' perspective criteria/scores/weights (E1–E5)
S_ei: 25%, 10%, 47.5%, 10%, 57.5%
e_i: 0.007, 0.003, 0.013, 0.003, 0.016
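Each relative weight in Table 3 appears to be the criterion's score divided by the total criteria score TS = 3550%. The sketch below reproduces the financial-perspective weights under that assumption (values restated from the table; three-decimal rounding matches the published figures):

```python
TS = 3550.0  # total criteria score, in percent (3550%)

# Financial-perspective criterion scores S_fi (in %), from Table 3 (F1-F9).
S_f = [70, 42.5, 70, 70, 18, 18, 63, 10, 18]

# Relative weight f_i = S_fi / TS, rounded to three decimals as in the table.
f = [round(s / TS, 3) for s in S_f]
assert f == [0.020, 0.012, 0.020, 0.020, 0.005, 0.005, 0.018, 0.003, 0.005]
```

The same division reproduces the g_i, p_i, and e_i columns from their respective scores.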
Table 4. Ranked alternatives using the Trace to Median Index (RATMI) method.
Columns: Alternative; Alternative Trace (tr_i); Median Similarity (MS_i); Majority Index (E_i); Rank
Reference values: tr* = 0.02057, MS* = 0.68978; tr = 0.02733, MS = 0.92935
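Table 4 ranks the colleges by a RATMI majority index E_i computed from each college's trace tr_i and median similarity MS_i. The published RATMI aggregation formula is not reproduced here; purely as an illustrative sketch, the function below assumes E_i is a convex combination of a trace ratio and a median-similarity ratio against the reference values above, with a hypothetical trade-off parameter `lam` and hypothetical college data (none of these choices are taken from the paper):

```python
def majority_index(tr_i, ms_i, tr_star=0.02057, ms_star=0.68978, lam=0.5):
    """Illustrative-only aggregation, NOT the published RATMI formula.

    Assumes a smaller trace and a larger median similarity are both
    better, and blends the two normalized measures with weight `lam`.
    """
    return lam * (tr_star / tr_i) + (1 - lam) * (ms_i / ms_star)

# Hypothetical (tr_i, MS_i) pairs for two colleges, for illustration only.
colleges = {"A": (0.0210, 0.70), "B": (0.0250, 0.65)}

# Rank alternatives: a higher majority index gives a better rank.
ranked = sorted(colleges, key=lambda c: majority_index(*colleges[c]), reverse=True)
```

For the published results, the E_i values and ranks reported in Table 4 should be used directly.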
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Makki, A.A.; Alqahtani, A.Y.; Abdulaal, R.M.S.; Madbouly, A.I. A Novel Strategic Approach to Evaluating Higher Education Quality Standards in University Colleges Using Multi-Criteria Decision-Making. Educ. Sci. 2023, 13, 577.

Note that from the first issue of 2016, this journal uses article numbers instead of page numbers.

