Proceeding Paper

DEA Approach to Evaluate Research Efficiency of Departments in University †

Shu-Mei Chao and Mu-Jin Chen
1 Department of Education, National Chengchi University, Taipei 11605, Taiwan
2 Research and Development Office, National Chengchi University, Taipei 11605, Taiwan
3 Center of Teacher Education, Chaoyang University of Technology, Taichung 413310, Taiwan
* Author to whom correspondence should be addressed.
Presented at the 3rd IEEE International Conference on Electronic Communications, Internet of Things and Big Data Conference 2023, Taichung, Taiwan, 14–16 April 2023.
Eng. Proc. 2023, 38(1), 71; https://doi.org/10.3390/engproc2023038071
Published: 29 June 2023

Abstract

The purpose of this study is to evaluate the research efficiency of 40 departments in a university from 2015 to 2017. Data envelopment analysis (DEA), a non-parametric mathematical linear programming approach, is used, and an appropriate model is proposed for evaluating the research efficiency of the university's departments. The analyzed items are selected based on the relevant literature on research efficiency. The results of this study help research policymakers and motivate faculty and researchers to take the initiative for better-quality research and international competitiveness under limited resources.

1. Introduction

1.1. Research Motivation

The research object of this study is a university specializing in the humanities and social sciences (the case university hereinafter). Compared with the natural sciences, the humanities and social sciences are largely restricted by language, and their research typically addresses regional or local issues. Regarding industry–university cooperation and budget allocation, the case university is therefore at a disadvantage. Although many of its disciplines rank within the top 200 in the QS rankings, the university's overall ranking falls between 601 and 650, so its efforts are not easily highlighted in world university rankings, and its performance is even underestimated. To improve research efficiency, the case university established a subsidy and reward system and provided professors and researchers with appropriate assistance and incentives to raise academic research standards and stimulate research energy. Under a limited budget and investment, the case university must therefore rationally evaluate each unit's research efficiency to clarify the problems of inefficient units, identify benchmark units as objects of learning, and use resources effectively.
The case university comprises several colleges, including liberal arts, science, law, commerce, social sciences, foreign languages and literature, communication, international affairs, and education. Initially, the colleges served only as a classification system for grouping similar departments; they had no substantive administrative structure, and resources were managed by individual departments. Over time, departments became segregated and resources became limited. The case university recently proposed the substantiation of its colleges to integrate its resources. Currently, the College of Law, College of Communication, College of International Affairs, and College of Education have been substantiated as physical colleges.
The case university thus has four physical and five non-physical colleges. According to a study by Chao and Chen [1], the research efficiency of the physical colleges of the case university was better than that of the non-physical colleges between 2015 and 2017. Therefore, we analyzed the research efficiency of 40 departments in the non-physical colleges of the case university using data envelopment analysis, a non-parametric mathematical linear programming approach.

1.2. Research Objectives

Based on the above, the research objectives of this study are as follows:
  • To evaluate the research efficiency of the case university's departments using data envelopment analysis (DEA);
  • To identify the issues of research inefficiency in the departments of the case university through this study;
  • To identify benchmarks as a reference for departments to learn from;
  • To make recommendations to the departments and management to solve the management problems associated with research inefficiency.

1.3. Research Questions

Therefore, the questions to be studied in this research are defined as follows:
  • What is the research efficiency of the departments in the case university?
  • What are the issues of inefficiency in some departments of the case university?
  • What are the priorities for improvement in the relatively inefficient departments of the case university?
  • Which departments are regarded as benchmarks for the case university’s departments to follow?
  • How can the problems of inefficient research management in the case university be solved?

2. Literature Review

In this section, the methods for assessing research efficiency are introduced to explain the theory, main models, and characteristics of the research method (data envelopment analysis) adopted in this study.
Universities need to assess the research efficiency of different academic disciplines scientifically, based on standardized and consistent management. However, estimating university research efficiency is a controversial issue, and how to measure this so-called efficiency is still under debate.
In the UK, the quality of research and the allocation of research funds are assessed by the Research Excellence Framework (REF), which focuses on defining and identifying what constitutes "research excellence". The key features of the REF relate to the evaluation of three elements: (1) Output Quality, (2) Research Impact, and (3) Research Environment. The results of the REF evaluation are the product of expert review based on appropriate indicators [2]. Other countries such as the United States, Germany, Australia, New Zealand, and Norway also attach great importance to the evaluation of university research efficiency. University research assessments are conducted to determine how to allocate research grants and to motivate universities to pursue excellent research [3].
Scientific methods to evaluate research efficiency are important for research management. Commonly used methods to evaluate research efficiency include data envelopment analysis, peer review method, Delphi method, bibliometric method, hierarchical analysis, gray correlation analysis, and fuzzy integrated evaluation method [3,4].
DEA is derived from Farrell’s method of evaluating the relative efficiency of multiple inputs and outputs [5]. Charnes, Cooper, and Rhodes later extended it into the CCR model [6], on which DEA is based. DEA uses a mathematical model to obtain the production frontier for measuring efficiency without a preset production function. This non-parametric quantitative technique assesses the relative efficiency of the evaluated units, commonly known as decision-making units (DMUs) [7,8,9,10]. The input and output data of the DMUs are used to construct the production frontier, and the actual data of each DMU are then compared with this frontier. Each DMU’s relative efficiency or inefficiency is thereby measured to achieve the goal of improving relative efficiency [11]. Charnes, Cooper, and Rhodes defined DEA as “a nonlinear (nonconvex) programming model providing a new definition of efficiency for use in evaluating activities of not-for-profit entities participating in public programs” (p. 429) [6].
DEA is formulated as a linear programming problem, as stated by Charnes et al. [6], and the mathematical model used to calculate efficiency is expressed as follows [12]:
$$\min E_0 = \sum_{i=1}^{m} v_i x_{i0}$$

subject to

$$\sum_{i=1}^{m} v_i x_{ij} - \sum_{r=1}^{s} u_r y_{rj} \ge 0 \quad \text{for all } j,$$

$$\sum_{r=1}^{s} u_r y_{r0} = 1,$$

$$v_i \ge 0, \quad u_r \ge 0, \qquad i = 1, \dots, m; \; r = 1, \dots, s,$$
where each DMU uses m inputs to produce s outputs; the j-th DMU uses x_ij units of input i and produces y_rj units of output r; u_r denotes the weight assigned to output r, and v_i the weight assigned to input i; the subscript 0 refers to the DMU under evaluation.
The result is categorized into an efficient group (efficiency value equal to 1) and an inefficient group (efficiency value less than 1). A completely inefficient DMU assumes a value of 0. Data envelopment analysis has been used for years to study the efficiency of university research [12,13,14,15].
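To make the multiplier model above concrete, the following minimal sketch shows how the CCR efficiency of a single DMU could be computed with a general-purpose linear programming solver. This is not the software used in the study; the function name, the toy data, and the use of SciPy's linprog are illustrative assumptions only.

```python
import numpy as np
from scipy.optimize import linprog


def ccr_efficiency(X, Y, k):
    """CCR efficiency of DMU k from the multiplier form shown above.

    X is an (n, m) matrix of inputs, Y an (n, s) matrix of outputs; rows are
    DMUs. The weighted inputs of DMU k are minimised subject to its weighted
    outputs equalling 1 and weighted inputs >= weighted outputs for every DMU;
    the efficiency score is the reciprocal of the optimum and lies in (0, 1].
    """
    n, m = X.shape
    s = Y.shape[1]
    # Decision variables: [v_1 .. v_m, u_1 .. u_s], all non-negative.
    c = np.concatenate([X[k], np.zeros(s)])           # minimise sum_i v_i x_ik
    A_ub = np.hstack([-X, Y])                         # sum_r u_r y_rj - sum_i v_i x_ij <= 0
    b_ub = np.zeros(n)
    A_eq = np.concatenate([np.zeros(m), Y[k]])[None]  # sum_r u_r y_rk = 1
    b_eq = np.array([1.0])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * (m + s), method="highs")
    return 1.0 / res.fun


# Toy example: five DMUs, two inputs, one identical output (illustrative only).
X = np.array([[4, 3], [7, 3], [8, 1], [4, 2], [2, 4]], dtype=float)
Y = np.array([[1], [1], [1], [1], [1]], dtype=float)
print([round(ccr_efficiency(X, Y, k), 3) for k in range(len(X))])
# DMUs on the production frontier score 1.000; the others score below 1.
```

Because the evaluated DMU's own constraint forces the optimum to be at least 1, the reciprocal of the optimum is the efficiency score, so frontier units obtain exactly 1 and inefficient units fall below 1, consistent with the grouping described above.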
According to Hsu [16], the main methods for selecting input and output items for efficiency assessment are (1) literature analysis, (2) evaluation or review indicators, and (3) the Delphi method. Since most studies evaluating the efficiency of educational institutions use literature analysis to select the input and output items, we summarized and analyzed the input and output items by referring to the past literature on evaluating the research efficiency of university teaching units. In addition, the output items make special reference to the research-oriented indicators used in the case university’s departmental evaluations and to the research criteria of the faculty performance assessment, so that the selected items are more objective and applicable to the case university.

3. Research Methodology

3.1. Research Framework

The framework of this study is based on the research motivation, research objectives, and related literature. The research framework in Figure 1 is used to empirically analyze the “research efficiency of departments of the case university”.

3.2. Research Subjects and Decision Making Units

The samples used in this study are the departments of a national university in Taiwan: 40 departments in the five non-physical colleges serve as the decision-making units. The research efficiency of the 40 departments from 2015 to 2017 was analyzed. In the data analysis, each college was de-identified and coded as A, B, C, and so on, and the N-th department of college A was coded as A-N.

3.3. Input and Output Items

In this study, the inputs and outputs selected were summarized and analyzed by considering the attributes of the case university and referring to the literature. There were five inputs (FTE professors, FTE associate professors or higher, academic research awards attained, doctorates, and research funding) and six outputs (approved budget, approved projects, citations in international databases, number of publications under peer review, professionally reviewed monographs (chapters) and patents, and winners of iconic external academic awards).
If the calculation were limited to the number of journal articles, it would not capture overall research performance and would likely be influenced by the characteristics of each discipline, hiding the performance of disciplines such as the humanities and social sciences, whose output is often represented by an important monograph [17]. In addition, the Ministry of Education promulgated the “Regulations Governing Accreditation of Teacher Qualifications at Junior Colleges and Institutions of Higher Education”, which includes professors’ patents among the research results for promotion and examination, thus establishing patents as an important indicator of the research output of university professors. The case university also recognizes patents when evaluating professors’ research performance to encourage them to engage in industry–academia collaboration. In practice, it is common to combine the number of books and patents as one research output item [17,18]. We therefore added the number of books and chapters and included “the number of professionally reviewed monographs (chapters) and patents” as one of the research output items.

3.4. Data Collection and Processing

The data were collected from 2015 to 2017, and the data for this study came from official databases and website information from the following four sources: (1) the Higher Education Database of the Ministry of Education, (2) the Teacher’s Publication Catalogue Database, (3) the case university’s official website, and (4) the Scopus database.
The DEA model for efficiency assessment assumes that outputs do not decrease when inputs increase, a property known as isotonicity. To examine this relationship, we used Pearson product-moment correlation analysis. In this study, the CCR and BCC models of data envelopment analysis were used to calculate the research efficiency values of each department, including the overall efficiency, pure technical efficiency, and scale efficiency values for each year. The DEAP 2.1-XP software developed by the Centre for Efficiency and Productivity Analysis at the University of Queensland, Australia, was used to perform the calculations.
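As a minimal sketch of such an isotonicity screen (the department figures and variable names below are hypothetical and not the study's data), each input can be correlated with each output and flagged when the Pearson correlation is not positive:

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical department data (illustrative values only);
# rows correspond to departments, i.e., the DMUs.
inputs = {
    "fte_professors":   np.array([12, 18, 9, 22, 15]),
    "research_funding": np.array([3.1, 5.6, 2.2, 7.9, 4.4]),
}
outputs = {
    "approved_projects": np.array([8, 14, 5, 19, 10]),
    "indexed_citations": np.array([120, 260, 70, 340, 180]),
}

# Isotonicity screen: every input should be positively correlated with
# every output, i.e., more input should not go with less output overall.
for in_name, x in inputs.items():
    for out_name, y in outputs.items():
        r, p = pearsonr(x, y)
        flag = "ok" if r > 0 else "check"
        print(f"{in_name} vs {out_name}: r = {r:+.2f} (p = {p:.3f}) [{flag}]")
```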
According to Lovell [19], when a DMU is a profit-making organization whose market demand is random and uncontrollable but whose inputs can be freely adjusted, an input-oriented model is appropriate for measuring it. In this study, to measure how well each department achieves its outputs with the same level of inputs, we adopted an output-oriented model that emphasizes the measurement of research efficiency from the output perspective.
Finally, slack variable analysis (SVA) was used to provide improvement targets and magnitudes for relatively inefficient units, so that it could be understood whether they had too much input or too little output, and specific quantitative data could be obtained for achieving relative efficiency and providing management with references and guidelines. In addition, reference set analysis was used as a benchmark analysis to identify benchmark units by examining how often each relatively efficient unit served as a reference object for improvement by the inefficient units. The units referred to most often were listed as learning benchmarks.
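A minimal sketch of this benchmark count is given below; the reference sets listed are hypothetical and only illustrate the counting rule, not the study's actual results:

```python
from collections import Counter

# Hypothetical reference sets from the envelopment-form solution: for each
# relatively inefficient department, the efficient peers with positive
# lambda weights (codes follow the A-N convention used in this study).
reference_sets = {
    "A-2": ["B-3", "B-4"],
    "B-1": ["B-3", "B-5", "F-7"],
    "E-3": ["B-4", "B-3"],
    "F-3": ["B-4", "B-3", "B-5"],
    "F-6": ["B-4", "F-7"],
}

# Count how often each efficient unit appears as a peer; units referenced
# more than three times are treated as learning benchmarks (cf. Section 4.4).
counts = Counter(peer for peers in reference_sets.values() for peer in peers)
benchmarks = [unit for unit, n in counts.items() if n > 3]
print(counts)       # e.g. Counter({'B-3': 4, 'B-4': 4, 'B-5': 2, 'F-7': 2})
print(benchmarks)   # departments listed as learning benchmarks
```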

4. Results and Recommendations

In this study, data were collected from 40 departments from 2015 to 2017 and analyzed using data envelopment analysis to assess the research efficiency of each department. The results are as follows.

4.1. Stable Overall Research Efficiency

In this study, we analyzed the research efficiency of the departments in the non-physical colleges of the case university. The overall efficiency of the departments showed only small differences across the three years, with the worst annual average in 2016 (0.841), the second best in 2015 (0.880), and the best in 2017 (0.888). The overall research efficiency was stable (Table 1).

4.2. Inefficiency Due to Scale Inefficiency and Technical Inefficiency

Among the 40 departments of the case university from 2015 to 2017, a total of 120 units (40 units/year × 3 years) were assessed: 28 units (23%) were inefficient due to scale inefficiency alone; 5 units (4%) were inefficient due to pure technical inefficiency alone; and 22 units (18%) were inefficient due to both pure technical inefficiency and scale inefficiency. In particular, 15 (63%) of the 24 overall inefficient units in 2016 were scale inefficient, indicating that these departments invested considerable resources but did not reach the optimal scale of output (Table 2).
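This classification rests on the standard DEA decomposition of overall (CCR) efficiency into pure technical (BCC) efficiency and scale efficiency, OE = PTE × SE. A minimal sketch of the classification rule, using hypothetical scores rather than values from Tables 1 and 2, might look like this:

```python
def classify(oe, pte, tol=1e-9):
    """Classify a DMU from overall efficiency (CCR) and pure technical
    efficiency (BCC); scale efficiency is their ratio: OE = PTE x SE."""
    se = oe / pte
    if oe >= 1 - tol:
        return "efficient"
    if pte >= 1 - tol:
        return "scale inefficiency only"
    if se >= 1 - tol:
        return "pure technical inefficiency only"
    return "pure technical and scale inefficiency"


# Hypothetical scores (illustrative only):
print(classify(oe=0.82, pte=1.00))   # scale inefficiency only
print(classify(oe=0.75, pte=0.80))   # pure technical and scale inefficiency
```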

4.3. Importance of Research Funding and External Academic Awards and Citations

According to the slack variable analysis, among the inputs of the inefficient departments, research funding needed to be improved in 2015, with an improvement rate of 44%. In 2016, research funding especially needed to be improved, with an improvement rate of 37%. In 2017, research funding and the number of doctorates were the items to be improved, with an improvement rate of 52%. Among the inputs of the inefficient departments from 2015 to 2017, the most important item for improvement was a reduction in research funding, meaning that the inefficient units invested more research resources than their outputs justified (Figure 2).
Among the output items of the overall inefficient departments, the number of external academic awards needed to be improved in 2015, with an improvement rate of 700% (the actual value was 2, and the improvement target was 16), followed by an increase in the number of articles in international citation databases, with an improvement rate of 153%. The number of external academic awards also needed to be improved in 2017, with an improvement rate of 125% (the actual value was 4, and the improvement target was 9), followed by an increase in the number of approved research projects, with an improvement rate of 102%. Overall, the most important item for improvement from 2015 to 2017 was an increase in the number of external academic award recipients, followed by an increase in the number of articles in international citation databases (Figure 3).
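Interpreting the reported figures, the improvement rate appears to be the percentage gap between the projected target and the actual value; under this assumption, the 2015 figure quoted above checks out as follows:

$$\text{improvement rate} = \frac{\text{target} - \text{actual}}{\text{actual}} \times 100\% = \frac{16 - 2}{2} \times 100\% = 700\%.$$

The 2017 figure follows the same pattern: $(9 - 4)/4 \times 100\% = 125\%$.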

4.4. Twenty-Five Benchmark Units from 2015–2017

The departments with a high number of references in the reference set analysis (strongly efficient units, referred to more than three times) had high overall efficiency and were used as learning benchmarks for the other units (Table 3).

5. Conclusions

In this research, we adopted the DEA methodology to evaluate the research efficiency of the case university. We analyzed the research efficiency of the departments of the non-physical colleges from 2015 to 2017 to identify the departments that need improvement and the departments that can serve as benchmarks for improving research efficiency. In summary, the study provides specific results and recommendations for further reference by research policymakers.

Author Contributions

Conceptualization, S.-M.C. and M.-J.C.; methodology, S.-M.C. and M.-J.C.; validation, S.-M.C.; formal analysis, S.-M.C.; investigation, S.-M.C.; data curation, S.-M.C.; writing—original draft preparation, S.-M.C.; writing—review and editing, S.-M.C. and M.-J.C.; supervision, S.-M.C. and M.-J.C. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Chao, S.-M.; Chen, M.-J. An Evaluation of the Research Efficiency of Colleges in the University—Case Study of the C University. In Proceedings of the 2021 IEEE International Conference on Social Sciences and Intelligent Management (SSIM), Taichung, Taiwan, 29–31 August 2021.
  2. Yang, Y. The latest reform of research assessment in UK universities—Planning and implementation of the Research Excellence Framework (REF). Eval. Bimon. 2014, 52, 41–47.
  3. Wang, R.Z. International Evaluation of University Research Performance, 1st ed.; Higher Education Evaluation & Accreditation Council of Taiwan: Taipei, Taiwan, 2008.
  4. Zhang, L. Research and Development of Research Performance Evaluation Methods in Universities. Master’s Thesis, Nanjing University of Science and Technology, Nanjing, China, 2012.
  5. Farrell, M.J. The measurement of productive efficiency. J. R. Stat. Soc. 1957, 120, 253–290.
  6. Charnes, A.; Cooper, W.W.; Rhodes, E. Measuring the efficiency of decision making units. Eur. J. Oper. Res. 1978, 2, 429–444.
  7. Ji, Y.-b.; Lee, C. Data envelopment analysis. Stata J. 2010, 10, 267–280.
  8. Lin, T.T.; Lee, C.-C.; Chiu, T.-F. Application of DEA in analyzing a bank’s operating performance. Expert Syst. Appl. 2009, 36, 8883–8891.
  9. Samoilenko, S.; Osei-Bryson, K.-M. Using data envelopment analysis (DEA) for monitoring efficiency-based performance of productivity-driven organizations: Design and implementation of a decision support system. Omega 2013, 41, 131–142.
  10. Arshadi, H.; Goudarzi, R.; Okhovati, M. A data envelopment analysis (DEA) approach to evaluate the research efficiency of Iranian universities. Int. J. Inf. Sci. Manag. 2022, 20, 95–108.
  11. Sun, S. Data Envelopment Analysis: Theory and Application, 1st ed.; Yangzhi Culture: Taipei, Taiwan, 2004.
  12. Munoz, D.A. Assessing the research efficiency of higher education institutions in Chile: A data envelopment analysis approach. Int. J. Educ. Manag. 2016, 30, 809–825.
  13. Johnes, G.; Johnes, J. Measuring the research performance of UK economics departments: Application of data envelopment analysis. Oxf. Econ. Pap. 1993, 45, 332–347.
  14. Lee, B.L.; Worthington, A.C. A network DEA quantity and quality-orientated production model: An application to Australian university research services. Omega 2016, 60, 26–33.
  15. Rhaiem, M. Measurement and determinants of academic research efficiency: A systematic review of the evidence. Scientometrics 2017, 110, 581–615.
  16. Hsu, F.-Y. Operating Efficiency Evaluation Study of Faculties in Private University—A Case Study of Certain Medical University. Master’s Thesis, National Sun Yat-sen University, Kaohsiung, Taiwan, 2010.
  17. Fu, Y.C. Evaluating the Development Strategy of Our University: An Example of Data Envelopment Analysis. Master’s Thesis, National Chengchi University, Taipei, Taiwan, 2007.
  18. Liao, H.-M. Assessing the Research Performance of Private Universities in Taiwan Using Data Envelopment Analysis and Clustering. Master’s Thesis, Fu Jen Catholic University, Taipei, Taiwan, 2008.
  19. Lovell, C.A.K. Production frontiers and productive efficiency. In The Measurement of Productive Efficiency: Techniques and Applications, 1st ed.; Oxford University Press: New York, NY, USA, 1993; pp. 3–67.
Figure 1. Research framework.
Figure 2. Input improvement for inefficient DMUs—CCR model.
Figure 3. Output improvement for inefficient DMUs—CCR model.
Table 1. Annual department efficiency value—CCR model.
Item | Department Code | CCR Overall Efficiency
 | | 2015 | 2016 | 2017 | Mean | SD
1 | A-1 | 0.818 | 1.000 | 0.943 | 0.920 * | 0.076
2 | A-2 | 0.985 | 0.921 | 0.518 | 0.808 | 0.179
3 | A-3 | 0.493 | 0.945 | 0.859 | 0.766 | 0.170
4 | A-4 | 1.000 | 1.000 | 1.000 | 1.000 *** | 0.000
5 | A-5 | 0.658 | 1.000 | 1.000 | 0.886 ** | 0.161
6 | A-6 | 1.000 | 0.845 | 1.000 | 0.948 ** | 0.073
7 | A-7 | 0.914 | 0.972 | 1.000 | 0.962 * | 0.036
8 | B-1 | 0.742 | 0.438 | 0.735 | 0.638 | 0.123
9 | B-2 | 1.000 | 0.944 | 1.000 | 0.981 ** | 0.026
10 | B-3 | 1.000 | 1.000 | 1.000 | 1.000 *** | 0.000
11 | B-4 | 1.000 | 1.000 | 1.000 | 1.000 *** | 0.000
12 | B-5 | 1.000 | 1.000 | 1.000 | 1.000 *** | 0.000
13 | C-1 | 1.000 | 1.000 | 1.000 | 1.000 *** | 0.000
14 | C-2 | 0.866 | 1.000 | 0.912 | 0.926 * | 0.056
15 | C-3 | 1.000 | 0.996 | 0.769 | 0.922 * | 0.108
16 | C-4 | 1.000 | 0.712 | 1.000 | 0.904 ** | 0.136
17 | C-5 | 1.000 | 0.989 | 0.833 | 0.941 * | 0.076
18 | C-6 | 1.000 | 0.727 | 1.000 | 0.909 ** | 0.129
19 | C-7 | 1.000 | 0.817 | 1.000 | 0.939 ** | 0.086
20 | C-8 | 1.000 | 1.000 | 0.832 | 0.944 ** | 0.079
21 | C-9 | 0.167 | 0.091 | 1.000 | 0.419 * | 0.412
22 | C-10 | 1.000 | 0.342 | 1.000 | 0.781 ** | 0.310
23 | E-1 | 1.000 | 0.735 | 1.000 | 0.912 ** | 0.125
24 | E-2 | 0.948 | 1.000 | 1.000 | 0.983 ** | 0.025
25 | E-3 | 0.793 | 0.929 | 0.628 | 0.783 | 0.107
26 | E-4 | 0.828 | 0.484 | 0.530 | 0.614 | 0.132
27 | E-5 | 0.635 | 1.000 | 0.753 | 0.796 * | 0.152
28 | E-6 | 1.000 | 1.000 | 0.913 | 0.971 ** | 0.041
29 | E-7 | 1.000 | 0.874 | 1.000 | 0.958 ** | 0.059
30 | E-8 | 1.000 | 1.000 | 1.000 | 1.000 *** | 0.000
31 | E-9 | 1.000 | 0.937 | 1.000 | 0.979 ** | 0.030
32 | F-1 | 0.468 | 0.691 | 1.000 | 0.720 * | 0.218
33 | F-2 | 1.000 | 1.000 | 1.000 | 1.000 *** | 0.000
34 | F-3 | 0.428 | 0.755 | 0.395 | 0.526 | 0.141
35 | F-4 | 0.682 | 0.975 | 0.652 | 0.662 | 0.012
36 | F-5 | 0.783 | 0.722 | 1.000 | 0.835 * | 0.119
37 | F-6 | 1.000 | 0.571 | 0.250 | 0.607 * | 0.307
38 | F-7 | 1.000 | 1.000 | 1.000 | 1.000 *** | 0.000
39 | F-8 | 1.000 | 0.556 | 1.000 | 0.852 ** | 0.209
40 | F-9 | 1.000 | 1.000 | 1.000 | 1.000 *** | 0.000
Mean | | 0.880 | 0.841 | 0.888 | 0.870 |
Note: The asterisks next to the mean indicate the number of years in which the unit achieved a relative efficiency of 1.000: * indicates an efficiency value of 1 in one year, ** in two years, and *** in three years.
Table 2. The overall efficiency of the departments in 2015–2017 unit classification—BCC model.
Category/Period (Number of Units) | 2015 (16 Departments) | 2016 (24 Departments) | 2017 (15 Departments)
Scale inefficiency (pure technical efficiency = 1) | A-1, A-2, A-5, E-3, F-4, F-5 (6 departments) | A-2, A-3, A-6, B-2, C-3, C-5, C-7, E-1, E-3, E-4, E-7, F-1, F-3, F-4, F-8 (15 departments) | A-1, A-3, C-2, C-5, E-4, E-5, E-6 (7 departments)
Pure technical inefficiency (scale efficiency = 1) | A-3, C-9 (2 departments) | C-9, C-10 (2 departments) | F-6 (1 department)
Pure technical inefficiency and scale inefficiency | A-7, B-1, C-2, E-2, E-4, E-5, F-1, F-3 (8 departments) | A-7, B-1, C-4, C-6, E-9, F-5, F-6 (7 departments) | A-2, B-1, C-3, C-8, E-3, F-3, F-4 (7 departments)
Table 3. CCR model—reference set analysis table.
Item | Department Code | 2015 | 2016 | 2017
1 | A-1 | - | 0 | -
2 | A-2 | - | - | -
3 | A-3 | - | - | -
4 | A-4 | 0 | 1 | 0
5 | A-5 | - | 1 | 2
6 | A-6 | 2 | - | 5
7 | A-7 | - | - | 0
8 | B-1 | - | - | -
9 | B-2 | 0 | - | 0
10 | B-3 | 9 | 14 | 7
11 | B-4 | 6 | 15 | 9
12 | B-5 | 4 | 9 | 4
13 | C-1 | 0 | 9 | 2
14 | C-2 | - | 1 | -
15 | C-3 | 1 | - | -
16 | C-4 | 5 | - | 1
17 | C-5 | 2 | - | -
18 | C-6 | 0 | - | 0
19 | C-7 | 1 | - | 0
20 | C-8 | 7 | 0 | -
21 | C-9 | - | - | 4
22 | C-10 | 0 | - | 0
23 | E-1 | 0 | - | 1
24 | E-2 | - | 0 | 5
25 | E-3 | - | - | -
26 | E-4 | - | - | -
27 | E-5 | - | 0 | -
28 | E-6 | 0 | 1 | -
29 | E-7 | 2 | - | 4
30 | E-8 | 1 | 2 | 8
31 | E-9 | 3 | - | 3
32 | F-1 | - | - | 0
33 | F-2 | 11 | 1 | 3
34 | F-3 | - | - | -
35 | F-4 | - | - | -
36 | F-5 | - | - | 0
37 | F-6 | 0 | - | -
38 | F-7 | 4 | 4 | 2
39 | F-8 | 5 | - | 7
40 | F-9 | 0 | 1 | 0
Note: Units referenced more than three times are benchmark units; “-” indicates that the unit was not relatively efficient in that year.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
