Article

Fostering Sustainable Quality Assurance Practices in Outcome-Based Education: Lessons Learned from ABET Accreditation Process of Computing Programs

by Abdullah M. Almuhaideb and Saqib Saeed *
Department of Computer Science, College of Computer Science and Information Technology, Imam Abdulrahman Bin Faisal University, P.O. Box 1982, Dammam 31441, Saudi Arabia
* Author to whom correspondence should be addressed.
Sustainability 2020, 12(20), 8380; https://doi.org/10.3390/su12208380
Submission received: 24 August 2020 / Revised: 28 September 2020 / Accepted: 7 October 2020 / Published: 12 October 2020

Abstract

Education is an important enabler for the economic uplift of a society, and academic institutions need to deliver quality education to equip students with the skills required to excel in their professional careers. Due to international initiatives such as the Washington and Seoul accords, outcome-based education has gained significant interest from industry, academia, governments, accreditation bodies, and students. Outcome-based education is a paradigm shift from the conventional education approach, and its successful adoption requires sustainable quality practices by higher education institutions. Fostering quality assurance processes for outcome-based education requires careful planning and active collaboration among stakeholders. However, due to the sparse body of knowledge about quality processes in outcome-based education, many academic institutions rely on ad hoc practices, resulting in a trial-and-error approach. In this paper, we present a set of guidelines which can help academic institutions deploy sustainable practices in their academic programs. We document important guidelines for delivering outcome-based education based on our longitudinal work on the ABET accreditation process of three different computing programs (Computer Science, Computer Information Systems, and Cyber Security and Digital Forensics). The successful application of the proposed guidelines helps to foster sustainable quality practices in academic programs.

1. Introduction

Education is considered an important enabler for the social and economic uplift of communities, as it helps individuals to improve their socio-economic status by gaining employment. However, it is critical that academic programs develop the skills that graduates require to improve their employability. Quality education is set as one of the important sustainable development goals by the United Nations and refers to the provision of a uniform skillset among learners [1]. To deliver quality education, academic institutions need to develop integrated processes with active collaboration among top management, faculty, industry, government, and communities. Quality has remained an important issue in higher education, and academic institutions have developed different internal and external measures to improve quality. As a result, different quality control mechanisms such as peer reviews, student evaluation, employer feedback, accreditation, ranking, etc., have emerged. Although opinion on the effectiveness of these tools is divided, they are still in practice in some form or another. As Hou highlights, due to the extreme mobility of students and faculty, there is a need for comparability of educational programs [2]. Comparing education is a very difficult task, as the physical, financial, human, and intellectual infrastructures are not uniform across educational institutions, so there is a need for a suitable framework which can realistically compare the academic credentials of students graduating from these diverse institutions. Keeping this in view, the notion of outcome-based education emerged, which advocates setting measurable learning objectives for academic programs and then measuring those objectives with appropriate assessments to gauge the effectiveness of the academic programs [3]. This way, it can be ensured that students graduating from any academic institution possess the minimum skill set necessary for professional practice. This process relies on continuous improvement to identify weaknesses in academic programs so that they can be rectified quickly. As a result, academic programs continuously monitor graduating students’ skillsets to meet market needs. Due to the agility of this approach, there is pressure on stakeholders to continuously improve the programs. Different international initiatives such as the Seoul accord [4] and the Washington accord [5], which aim for comparable, standardized academic programs across different countries, also promote outcome-based education so that learners possess consistent attributes irrespective of their geographical location. As signatories of these accords, different accreditation bodies such as ABET (Accreditation Board for Engineering and Technology) started encouraging educational institutions to adopt outcome-based education [6].
The transformation from conventional academic teaching to an effective outcome-based education initiative requires careful planning and implementation. In an effective outcome-based education process, academicians need to strategically align the program outcomes with market needs, deliver relevant skills through educational practices, evaluate the attainment level of graduating students, and define remedial actions to improve attainment in case any deviations in performance are observed in the assessment results. Continuous improvement is the essence of outcome-based education. Academic programs have many stakeholders, such as faculty, alumni, current students [7,8], employers, government representatives, and parents, so quality assurance systems should give all of them a voice in order to realize an academic program which satisfies stakeholders’ needs. However, their priorities may differ, so there is a need to foster appropriate communication processes among stakeholders to devise inclusive quality assurance practices which can help academic programs improve. Designing and implementing quality assurance processes is a complex practice and requires considerable knowledge, skills, effort, and time. There are not many in-depth case studies highlighting the implications of embedding a quality culture in academic institutions. As a result, many academic institutions follow a trial-and-error approach, which wastes resources and time, and sometimes they may adopt ad hoc practices rather than well-defined procedures. The objective of this paper is to document sustainable practices to foster integrated quality assurance processes in academic institutions. Therefore, in this paper, we present a framework based on our longitudinal action research work at a higher education institution in Saudi Arabia. The findings will help other higher education institutions to deploy sustainable practices to impart outcome-based education.
The rest of the paper is structured as follows: Section 2 describes the related work followed by a detailed methodology in Section 3. Section 4 presents the empirical findings for fostering effective outcome-based education which is followed by a discussion in Section 5 and a conclusion in Section 6.

2. Related Work

There are many studies in the education literature focusing on quality aspects of the higher education sector, but due to dynamic changes in academic processes, there is a need for continuous research on emerging paradigms in higher education. Abukari and Corner have highlighted that establishing quality assurance systems in higher education is critical, and they proposed a quality assurance model for higher education institutions in developing countries [9]. Utuka has carried out a comparative analysis of the higher education sector in New Zealand and Ghana and presented a gap analysis of quality management practices [10]. ElAlfy and Abukari have highlighted that facilities and academic and administrative services are the main pillars of service quality in the higher education sector [11]. Teeroovengadum et al. have carried out an empirical study with students in Mauritius and found that technical service quality, perceived image, and value influence students’ satisfaction [12]. Phuc et al. have investigated a master program in economic management to identify factors affecting outcome-based education. They found that once students perceive that an academic program is beneficial for them, they apply their knowledge and skills effectively to excel in the academic program [13]. Ram et al. have recommended using varying levels of Bloom’s taxonomy for writing learning outcomes and assessment items in a geoscience curriculum [14]. Sasipraba et al. have highlighted that it is very challenging to design an assessment of the capstone project to evaluate program outcomes. Based on their experience at Sathyabama Institute of Science and Technology, they presented an assessment methodology and rubrics to evaluate capstone projects for outcome-based education [15]. Rathy et al. have described that outcome-based education is student centric in nature, so they have proposed micro-level knowledge structures for teaching the power electronics engineering curriculum [16]. Lavanya et al. have described that, in successful outcome-based education, learning transformation should be observable, and formative and summative assessments can be used to measure students’ attainment [17]. Iqbal et al. have highlighted that outcome-based education has transformed the pedagogical approach in medical education, and they recommend using the “ADAPTIVE species” model in teaching to foster outcome-based education [18]. Tan et al. have carried out a critical literature review and highlighted that outcome-based education has shown competency enhancements in nursing students [19]. Senaratne and Gunarathne have presented a case study of adopting outcome-based education in an accounting degree at a Sri Lankan university and outlined guidelines for the adoption of outcome-based education [20]. Rajak et al. have discussed their experience of outcome attainment in an academic program in India [21]. Premalatha has highlighted the mechanism for mapping course outcomes and program outcomes to foster outcome-based education [22]. Manzoor et al. have carried out a study and concluded that the transformation from conventional education to outcome-based education has a positive impact on students’ learning experience [23]. Cooper et al. have highlighted that there is pressure from accreditation bodies on academicians to design measurable learning outcomes for academic programs and courses in the computing domain [24].
Kahlon et al. have documented that competency-based education is the way forward for computing faculty to measure the readiness of students for industry and academia [25]. Nguyen et al. have implemented an outcome-based education framework in higher and vocational education in the context of Financing and Promoting Technology (FPT) education. They documented a positive impact on the learning and employability of learners undergoing outcome-based education [26]. Hu et al. have developed a dashboard to analyze student learning progress at a university in Hong Kong; this dashboard-based analytic application provides an effective monitoring support system [27]. Xu et al. have developed an outcome-based computational thinking program for teachers in China, which helped the teachers to apply computational thinking concepts in practical skill development [28]. Lam et al. have studied the role of online collaborative learning in students’ learning outcome attainment at the Education University of Hong Kong. They found that online collaborative learning enhances the attainment of learning outcomes [29].
The Accreditation Board for Engineering and Technology (ABET) is a renowned global accreditation body which advocates for outcome-based education. ABET has four commissions, namely the Applied and Natural Science Accreditation Commission (ANSAC), the Computing Accreditation Commission (CAC), the Engineering Accreditation Commission (EAC), and the Engineering Technology Accreditation Commission (ETAC) [6]. Each commission sets out generic and program-specific criteria for the academic programs in its domain. Weblinks to the detailed ABET criteria are given in Appendix B. There are some studies in the literature that have investigated the ABET accreditation process. Adams et al. have documented their experience of aligning an undergraduate laboratory course in chemical engineering with ABET criterion 3, and they named this approach the inquiry-guided laboratory [30]. Barr, based on his ABET visit experience, has discussed the impact of changes in ABET criteria 3 and 5 [31]. Alhakami et al. have proposed using data mining predictive algorithms on student data in ABET course files to measure the attainment of program outcomes [32]. Meah et al. have presented their experience of adopting a capstone project for multi-disciplinary students, and they have documented the attainment process of ABET program outcomes [33]. Delatte et al. have analyzed the ABET criteria changes and recommended that ABET should have a roadmap for proposed changes to ensure the predictability of changes in ABET criteria [34]. Peridier has advocated making faculty the core of the continuous improvement process and further recommended forming a separate committee for each program outcome. These committees should be responsible for summative data collection, assessment review, and curricula change management [35]. Bachnak et al. have highlighted the need for updating assessment plans for programs aspiring to ABET accreditation, so that the ABET visit can yield positive results [36]. Shafi et al. have presented an assessment methodology based on their experiences of the ABET accreditation process of computer science and computer information systems undergraduate programs [37]. Zambrano has presented a two-tier continuous improvement model for ABET accreditation, where the first tier focuses on curriculum improvement whereas the second tier focuses on improvement in the measurement process of learning outcomes [38]. Hussain et al. have documented their experience of the ABET accreditation process of an electrical engineering degree program. They have highlighted the assessment as well as continuous improvement processes employed for the successful realization of ABET accreditation [39]. Despite these contributions, there is a lack of work which provides insights into the complexities of planning and executing processes for fostering sustainable quality practices and which outlines mechanisms to optimally instantiate an inclusive quality assurance framework. Keeping this in view, we have presented a quality assurance framework to foster outcome-based education based on our accreditation work.

3. Materials and Methods

In our longitudinal empirical work, we mainly followed an action research approach [40] to identify the best practices to foster a quality assurance mechanism at the College of Computer Science and Information Technology (CCSIT) at Imam Abdulrahman Bin Faisal University (IAU) [41]. Adopting the case study approach helped us in understanding the complexity of the collaboration process required to adopt quality assurance processes in practice [42]. The focus of the work was to develop effective quality assurance processes which can result in sustainable quality practices beyond a mere accreditation drive. Our case setting (CCSIT) currently offers four academic programs, namely, Bachelor of Science in Computer Science (CS), Bachelor of Science in Computer Information Systems (CIS), Bachelor of Science in Cyber Security and Digital Forensics (CYS), and Bachelor of Science in Artificial Intelligence (AI). All the programs have adopted outcome-based education since their inception and follow quality assurance practices, which guided these programs to successful ABET accreditation. The CS, CIS, and CYS programs are already accredited by ABET. The first batch of AI program students has not graduated yet; however, they follow the same quality assurance procedures as the other programs at the college, and the program will apply for ABET accreditation after the graduation of the first batch of students. ABET accreditation requires the filing of an expression of interest by aspirant institutions, followed by the submission of a comprehensive self-study report for the aspirant program. The self-study report outlines the strengths of the academic program by providing detailed evidence of adherence to ABET standards. Once the self-study is approved by the ABET headquarters, experts are nominated to conduct a comprehensive onsite visit of the academic institution offering the program applying for accreditation. After the visit, the evaluators document their findings, and a response can be submitted by the academic institution. The ABET board makes a final accreditation decision by analyzing the evaluators’ onsite visit report and the institution’s reply to the evaluators’ findings.
When students join IAU, they undergo a preparatory year before joining CCSIT. At CCSIT, the first two years are common across all programs, whereas the last two years have specialized modules pertaining to each academic program. The CS and CIS program outcomes have recently been updated based on the ABET criteria change. However, in this paper the data are based on the old program outcomes, since the older program outcomes were applicable at the time of assessment. In the case of the CYS program, the revised ABET criteria had already been adopted, so the CYS assessment data are based on the revised ABET criteria. At the conclusion of cycle 1 of the CS and CIS programs, an action plan was designed, and after its implementation the second assessment cycle was executed. Therefore, our recommendations are based on designing and executing five assessment cycles across three different academic programs.

4. Empirical Results

To deliver an effective academic program, higher education institutions need effective quality assurance processes in place. We have developed a simplified framework for fostering a quality assurance model for effective outcome-based education. As shown in Figure 1, we have classified the activities into four categories, which are explained in the following subsections.

4.1. Strategic Planning

Top management of the educational institution needs to play an active leadership role in strategically planning the quality assurance processes. There is a need for setting goals, establishing timelines, designing the organizational structure, defining roles and responsibilities, and defining processes to carry out quality assurance tasks in the academic institution. To develop a shared understanding among all stakeholders, it is very critical that academic programs are consistent with the organizational mission and that documentary evidence of this consistency is shared among stakeholders. Keeping this in view, as a first step we recommend having strong cohesion among the visions and objectives of the organization, the academic department, and the academic programs. In our case, we started by developing a mapping of the college goals with the University goals to show this consistency. Table A1 (Appendix A) highlights that CCSIT has three goals and that they are highly consistent with the goals of the University. Such close mapping provides confidence to stakeholders that the academic program is in line with the strategic priorities of the organization.
Program educational objectives are abstract targets which students are expected to achieve a few years after graduation. As a second step, the program educational objectives of an academic program need to be fully aligned with the mission of the University and the College/department. Developing program educational objectives for an academic program is a complex task, which requires a vision of the future skillset and of the characteristics of the academic program and the department. We recommend establishing a program setup committee of experienced academicians in a department who can brainstorm on program educational objectives and present their recommendations to the faculty board for approval. After approval, the college board/senate/advisory board should also discuss and finally approve them. Such involvement of faculty members and external stakeholders in these forums helps in reducing ambiguities and provides confidence to stakeholders that the academic program is in line with the strategic priorities of the organization. As an example, Table A2 shows the mapping of the CS program educational objectives (PEOs) with the program, college, and university missions. In Table A2, “S” stands for strong mapping, “M” stands for medium mapping, and “L” stands for low mapping.
As a third step, the program outcomes of the academic programs should be consistent with the program educational objectives. Program outcomes are the desired characteristics which graduates need to possess at the time of graduation. The development and approval process of program outcomes should involve stakeholders, like the PEO development process. It is recommended to adopt the program outcomes proposed by accreditation bodies such as ABET and then add some additional specialized outcomes as per the program expertise (if needed). This way, it becomes easy for the program to prove that its program outcomes are in line with those of the concerned accreditation bodies; otherwise, an extra layer of mapping would be required to show the alignment of the program outcomes with the accreditation body’s proposed outcomes. In our case, all academic programs adopted the ABET-proposed student outcomes [6]. As an example, there are four program educational objectives and 11 program outcomes in the CS program. Table A3 shows the mapping of the CS program’s outcomes with its program educational objectives. The mapping shows strong coherence between the CS PEOs and the program outcomes. Such documented mapping of program educational objectives and program outcomes builds confidence among stakeholders that the goals of the programs are aligned with the skillset required by industry.
Another important activity in the planning stage is to develop an organizational structure in the educational institution which is responsible for carrying out the different activities required by the quality assurance framework. Many educational institutions are lacking in this respect: their organizational structure is mainly defined at a high level of abstraction, task responsibilities are not well defined at lower levels, and, much of the time, “heroes” are assigned to complete the tasks, which limits the sustainability of organizational practices. It is very critical that the assignment of responsibilities is properly defined. Such an assignment provides clarity about the roles and responsibilities of individuals. In our framework, we propose establishing dedicated units such as an academic advising unit, an alumni unit, a program quality unit, an academic affairs unit, etc., to improve different processes in the academic setting. At the formation of these units, their tasks, responsibilities, and expected deliverables need to be defined. This task specialization will not only improve different academic processes, but the involvement of faculty members in these units will also help in fostering a quality culture and keep them better informed about institutional policies.

4.2. Educational Practices and Strategies

The curriculum is the backbone of the success of any academic program. At the time of curriculum design it is very important that the program satisfies the requirements set by international and national bodies. In the case of computing programs, the Institute of Electrical and Electronics Engineers (IEEE) [43], the Association for Computing Machinery (ACM) [44], and the Association for Information Systems (AIS) [45] have jointly designed curricula guidelines [46], so each academic program should carry out a gap analysis against these guidelines to ensure that the program meets the international standards. Furthermore, accreditation bodies such as ABET also have some specific curriculum requirements for an academic program which need to be considered. Moreover, for any academic program, the mapping of course modules with program outcomes is very critical to ensure successful delivery of the program outcomes. Each program can follow a bottom-up or top-down approach. In a bottom-up approach, each module’s learning outcomes are analyzed to identify which program outcomes are appropriately covered in that module, whereas in a top-down approach, for each program outcome, appropriate courses are identified and then the course learning outcomes of these courses are specifically designed to deliver the appropriate content. Although the bottom-up approach looks more appropriate, since each course is analyzed to identify which program outcomes it covers, many faculty members are involved, each of whom may have a different perception, which may result in weaker coverage of some key content. Some faculty members also think that, if they are teaching a specific course, it should be mapped to all program outcomes, as mapping to only a few program outcomes may make it seem that the course is not important for the academic degree. Therefore, we recommend having a group of experienced faculty members who use the top-down approach and prepare the mapping for the academic program. It is also recommended that each program outcome is further decomposed into different performance indicators. The mapping should be devised corresponding to each performance indicator, and there should be at least three modules mapped to a performance indicator for better coverage of the associated content. As an example, Figure 2 highlights the program mapping of the CYS program at IAU; it should be noted that the shaded rows are not included in the program mapping, as these courses are either University requirements or electives and may change. Each performance indicator is mapped at three levels: (I) Introduce, (R) Reinforce, and (E) Emphasis. Introduce means relevant concepts are introduced, whereas Reinforce and Emphasis indicate the repetition and advanced coverage of relevant concepts. This mapping should be widely shared across the teaching faculty, as it will help them understand how a respective course contributes to achieving a program outcome. Another important activity at this stage is to adopt appropriate educational pedagogies to successfully impart the desired skills among learners [47,48]. An optimal alignment of pedagogy with curriculum significantly improves the learning of students. There is a need to design a regular training program for faculty, and this training program should focus on technical topics as well as emerging pedagogical approaches.
At IAU, there is a dedicated deanship of academic development, which is responsible for conducting pedagogical training for IAU faculty members. Specialized training was conducted for CCSIT faculty on pedagogical approaches and assessment mechanisms.
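To make the mapping exercise concrete, the sketch below is a minimal Python illustration, assuming hypothetical course codes and performance-indicator labels (it is not the actual CYS mapping shown in Figure 2). It represents a performance-indicator-to-course mapping with the I/R/E coverage levels and flags any indicator that falls short of the recommended three mapped modules.

```python
# Minimal sketch; course codes and performance indicators are hypothetical,
# not the actual CYS mapping of Figure 2.
# Coverage levels: I = Introduce, R = Reinforce, E = Emphasis.
program_mapping = {
    "PI-1.1": {"CYS 101": "I", "CYS 210": "R", "CYS 404": "E"},
    "PI-1.2": {"CYS 101": "I", "CYS 503": "E"},  # deliberately under-covered
    "PI-6.1": {"CYS 433": "R", "CYS 501": "E", "CYS 508": "E"},
}

def under_covered(mapping, min_courses=3):
    """Return performance indicators mapped to fewer than `min_courses` modules."""
    return {pi: sorted(courses) for pi, courses in mapping.items()
            if len(courses) < min_courses}

if __name__ == "__main__":
    for pi, courses in under_covered(program_mapping).items():
        print(f"{pi} is covered by only {len(courses)} course(s): {', '.join(courses)}")
```

Such a check can be rerun whenever the curriculum mapping is revised, so that coverage gaps are caught before an assessment cycle starts.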

4.3. Assessment and Evaluation

Adopting an effective assessment strategy is the key to continuous improvement. A poor assessment strategy will not reflect the weaknesses, and as a result the problems will persist despite all the quality assurance efforts. A robust assessment should be comprehensive as well as lightweight, so that it does not place an extra load on faculty [37]. In order to identify early warning signs, there should be formative assessments in place, rather than only summative assessments towards the end of the academic program. Generally, these formative assessments should take place before the last year so that, in case of problems, the required improvements can be implemented. To minimize the workload on faculty members, formative assessments should be based mainly on the instructor’s subjective assessment, and there is no need to keep evidence of the corresponding assessments. Before the commencement of the semester, each instructor should know whether his/her course assessments will be used for program assessment. It is important to understand that, for assessment, it is not mandatory that the course is specifically designed to deliver the relevant skills which are going to be measured; the specific skills may have already been instilled in a previous course. For example, oral communication skills do not have to be measured in a module dedicated to communication skills; they can be measured in a technical course whose assessments include an oral presentation. There is also a need to make faculty members aware that the program assessment strategy does not measure the performance of faculty; rather, it focuses on program quality. A program quality assurance team should collaborate with teachers to ensure that assessment questions aimed at measuring a specific program outcome/performance indicator are in line with the rigor required in the rubrics.
To measure the attainment of performance indicators, standardized rubrics should be developed. Rubrics can be developed for each performance indicator, and the respective assessments mapped to a performance indicator should use these specific rubrics for evaluation. These rubrics help faculty members minimize variations in the assessment process. In Table A4, we present the sample rubrics which we used to evaluate four performance indicators of SO 6 of our CYS program.
As evidence, all instructors need to maintain a course portfolio of all academic activities, which includes lecture materials, assessment data, and student assessment samples. This enables the academic staff to document the course activities, which can also be used for knowledge sharing with the next set of instructors in future course offerings. A program quality assurance team should collaborate with teachers in filling in the program assessment data based on the assessments conducted during the course. If a course has students from different academic programs, then separate data need to be collected for each set of students belonging to an academic program. Once the assessment data are collected, there is a need to combine the data collected from the different direct and indirect measures. A cohort analysis can give a clear understanding of attainment levels. Since each performance indicator highlights one specific aspect of a program outcome, it is not recommended to mathematically average the attainment of the performance indicators to calculate the attainment level of a program outcome.
In this paper, we present the data based on two assessment cycles of the CS and CIS programs, whereas the CYS data are based on only one cycle. We have used a 360° approach to collect assessment data from a variety of direct and indirect assessment metrics. For direct assessments, we used summative data from course assessments and a comprehensive exit exam conducted at the end of the academic program. Students are evaluated based on their performance and categorized into Poor (0–24%), Developing (25–49%), Developed (50–74%), and Exemplary (75–100%) categories. In order to measure the attainment of students, we take into account only students belonging to the Developed and Exemplary categories. If a course is selected for program assessment by the quality committee, then the respective teacher needs to specify which questions of an assessment were mapped to a performance indicator. In most cases, at least two assessments are used for more accurate results. Figure 3 provides attainment data for both cycles of the CS and CIS programs and one cycle of the CYS program. In all cycles, the target for attainment was 70%, and the assessment data show that the target level was achieved for each SO. The execution of two cycles provides further confidence that attainment was not accidental; rather, the students’ performance consistently meets the targets. The second assessment which we used to assess student attainment of program outcomes was the exit exam. At the time of graduation, students were asked to sit an exit exam in which each question was specifically designed for a performance indicator. A few performance indicators could not be measured in the exit exam, so there were no questions in the exam for these performance indicators. As shown in Figure 4, some program outcomes were below the intended target of 70% in both the CS and CIS programs. After analysis, we found two important reasons for this below-par performance: a lack of student motivation and exit exam quality. After cycle 1, we improved the exit exam preparation process and arranged a student awareness campaign. In cycle 2, we found a slight improvement over cycle 1, but students still lacked motivation. Since the exit exam did not have any academic weightage for students, they preferred to focus more on the completion of degree requirements than on performing in the exit exam. As a result, at the completion of cycle 2, the quality team decided not to use the exit exam as an assessment tool in future assessment cycles.
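As a minimal sketch of how the direct-assessment attainment described above can be computed, the following Python example uses the category boundaries stated in the text (Poor 0–24%, Developing 25–49%, Developed 50–74%, Exemplary 75–100%) and the 70% target; the student scores are hypothetical. Consistent with the earlier recommendation, attainment is reported per performance indicator rather than averaged into a single program-outcome figure.

```python
# Minimal sketch, not the actual assessment tool used at CCSIT; scores are hypothetical.

def categorize(score_pct):
    """Map a student's percentage score to the performance category used in the text."""
    if score_pct < 25:
        return "Poor"
    if score_pct < 50:
        return "Developing"
    if score_pct < 75:
        return "Developed"
    return "Exemplary"

def attainment(scores_pct, target=70.0):
    """Percentage of students in Developed or Exemplary, and whether the target is met."""
    attained = sum(1 for s in scores_pct if categorize(s) in ("Developed", "Exemplary"))
    pct = 100.0 * attained / len(scores_pct)
    return pct, pct >= target

# Hypothetical scores for one performance indicator in one course section.
scores = [82, 64, 45, 91, 77, 58, 33, 88, 70, 95]
pct, met = attainment(scores)
print(f"Attainment: {pct:.1f}% (target met: {met})")
```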
Indirect assessments provide the perceptions of relevant stakeholders regarding the attainment of program outcomes. External advisory boards of an academic program can be constituted with representatives of employers, technical experts, government, faculty, parents, alumni, and students. Such a forum can provide detailed improvement guidelines for any academic program. Additionally, different surveys can be conducted to measure outcome attainment. In our case, we have an annual meeting of the external advisory board for each program. We have collected attainment data from formative assessments and from alumni, exit, and faculty surveys. Formative data were collected from instructors relying on their subjective assessment of students’ attainment of a performance indicator based on their progress in a course. Formative data were collected using the same performance categories as used for the summative data. Formative data provide early indications of student performance, so problems can be avoided early on. Figure 5 highlights the formative data from the CS, CIS, and CYS programs. The data highlight that, for each program outcome, the target of 70% attainment was achieved. In the case of surveys, we asked respondents to rate the attainment of each SO on a Likert scale (strongly agreed, agreed, true sometimes, disagreed, and strongly disagreed). It is also very important to extract program concerns and improvement suggestions from open-ended survey questions rather than just relying on the attainment numbers. Figure 6 highlights the alumni survey responses for the CS program; the combined numbers of strongly agreed and agreed responses improved in cycle 2 as compared to cycle 1. A similar trend is observable in Figure 7 for the CIS program. As Figure 8 highlights, in the case of the CYS program, alumni were also satisfied with program outcome attainment. Figure 9 (CS program), Figure 10 (CIS program) and Figure 11 (CYS program) highlight that faculty members have positive views about program outcome attainment for all programs. Exit survey data are not presented in this paper as they were not collected in the first cycles.
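As an illustration of how the indirect survey responses can be summarized, the short sketch below (with hypothetical response counts, not the actual survey data) combines the “strongly agreed” and “agreed” responses per student outcome, mirroring the way the alumni and faculty survey figures are reported.

```python
# Illustrative sketch with hypothetical survey counts; not the actual alumni/faculty data.
LIKERT = ["strongly agreed", "agreed", "true sometimes", "disagreed", "strongly disagreed"]

# responses[outcome] maps each Likert option to a count of respondents.
responses = {
    "SO-1": {"strongly agreed": 14, "agreed": 9, "true sometimes": 3,
             "disagreed": 1, "strongly disagreed": 0},
    "SO-2": {"strongly agreed": 10, "agreed": 11, "true sometimes": 4,
             "disagreed": 2, "strongly disagreed": 0},
}

def positive_share(counts):
    """Combined share of 'strongly agreed' and 'agreed' responses, in percent."""
    total = sum(counts.get(option, 0) for option in LIKERT)
    positive = counts.get("strongly agreed", 0) + counts.get("agreed", 0)
    return 100.0 * positive / total if total else 0.0

for so, counts in responses.items():
    print(f"{so}: {positive_share(counts):.1f}% positive")
```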

4.4. Continuous Improvement

A weakness of most quality assurance frameworks is that they collect a huge amount of data but do not extract improvement requirements from these data. While developing an improvement action plan, there should be a separate analysis for each specific performance indicator. An attainment target level needs to be set, and deficiencies relative to this level should be identified and traced back to find the source of the discrepancy. An action plan needs to be formulated to address these weaknesses before the next assessment cycle, and there should be regular follow-up on the completion of action items. In our case, we devised an action plan based on cycle 1 of the CS and CIS programs, and the cycle 2 assessment data show better performance compared with the cycle 1 data. In the case of the CYS program, we have developed an action plan and it is nearing completion. Table A5 shows an excerpt from the CYS action plan follow-up document.
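A small sketch of how such follow-up can be kept systematic is given below; the record fields, dates, and entries are hypothetical and only illustrate one possible way to track open action items for indicators that missed their target.

```python
# Hypothetical action-plan follow-up record; fields, dates, and entries are illustrative only.
from dataclasses import dataclass
from datetime import date

@dataclass
class ActionItem:
    performance_indicator: str
    attainment_pct: float   # attainment observed in the last assessment cycle
    target_pct: float       # attainment target, e.g., 70%
    action: str
    due: date
    completed: bool = False

plan = [
    ActionItem("PI-1.1", 62.0, 70.0, "Adopt project-based learning in mapped courses",
               date(2021, 1, 15)),
    ActionItem("PI-1.3", 74.5, 70.0, "No action required; monitor in next cycle",
               date(2021, 1, 15), True),
]

def open_items(items):
    """Action items still open for indicators that missed their target."""
    return [i for i in items if not i.completed and i.attainment_pct < i.target_pct]

for item in open_items(plan):
    status = "overdue" if item.due < date.today() else "open"
    print(f"{item.performance_indicator}: {item.action} ({status}, due {item.due})")
```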

5. Discussion

Quality education is an important enabler for society, and educational institutions have a responsibility to deliver quality education. We have presented a detailed quality assurance framework to foster effective outcome-based learning. This framework chalks out the set of tasks required for better collaboration among stakeholders to deliver quality education. The adoption of this framework will help academic institutions to sustain quality practices and adopt a quality culture. The key practices of the framework are summarized below.

5.1. Management Support

Outcome-based education requires a quality policy which needs to be followed in the institution. It is very important to acquire management endorsement of the quality policy, as such endorsement helps stakeholders realize the importance of these processes. These quality initiatives may result in extra documentation and workload; however, with management support, there will be less resistance. We also recommend having periodic meetings of faculty and other academic staff with the top management of the institution to convey the intent of the quality processes in the institution.

5.2. Organizational Structure

In order to promote a quality culture across the organization, it is recommended to have designated units responsible for the different tasks pertaining to program quality. Examples of such units are a program advisory unit, a program quality unit, and a curriculum unit. This way, the tasks are divided across different units, and each unit can focus on its specialized tasks. Otherwise, all the quality work will depend on a few individuals, which is contrary to the essence of a quality culture.

5.3. Process Focus

In order to foster an effective quality management system, practices need to be consistent, and it is therefore highly recommended to develop a quality culture across the institution. This requires the definition of organizational processes, which should be documented and shared with faculty and other stakeholders. It is also important to keep track of the performance of these processes in order to continually improve them. Typical examples of important processes are the student advising process, graduation requirements completion, the program outcome definition and revision process, the assessment data collection process, the faculty promotion process, etc. Almuhaideb and Saeed have proposed a variety of processes to instantiate quality assurance in academic settings [49].

5.4. Training of Faculty and Support Staff

It is also very important to motivate faculty members so that they can participate effectively in quality assurance practices. Training is an important tool to consistently remind faculty members to follow the quality practices. Posters and other artifacts can be used along with regular workshops to train faculty members. Such sessions are especially helpful for junior faculty, who may not know the educational strategies and theories and may therefore have difficulties in understanding quality processes and terminologies and make mistakes. Furthermore, pedagogical training helps faculty update their repertoire to foster a better learning environment; this is in line with the findings of Bascopé et al. [50] and Dobrowolska et al. [51].

5.5. Consistency in Organizational and Program Mission

It is very critical for academic programs to be consistent with organizational strategic goals. Therefore, it is recommended to have optimal mapping among University, college, department, and program visions and missions. Designing program outcomes, educational objectives, performance indicators, and rubrics is a very critical task, as they represent the essence of the academic program. Program outcomes are the characteristics which students are expected to achieve at the time of graduation, whereas program educational objectives are the intended goals which graduates are expected to achieve some time after graduation. An important starting point could be to adopt the program outcomes of accreditation agencies such as ABET [6] and add additional program outcomes to present the specific flavor of the program. It is very important for the program educational objectives to be broader than the program outcomes and written in the future tense. Program educational objectives and program outcomes should be consistent with each other and should be approved by all the stakeholders. Furthermore, they need to be publicized as well. Since program outcomes are at a very high abstraction level, they should be further decomposed into different performance indicators to cover the important dimensions of each program outcome.

5.6. Alignment of Curriculum with International Standards

The curriculum is the backbone of any academic program, so there should be great emphasis on curriculum design, which should balance theory and hands-on training. It is very important to benchmark the curriculum against the curricula recommended by international bodies such as IEEE and ACM [46]. Furthermore, different accreditation bodies such as ABET have specific curriculum requirements, and it is important to measure the alignment of the curriculum with such requirements to ensure that the program curriculum does not lack any important component.

5.7. Formation of External Advisory Board

Refae et al. have stressed that the role of an advisory body in improving the learning experience of students needs to be explored [52]. We recommend that, to acquire feedback from external constituencies, each academic program should have an external advisory body. The members could be representatives of employers, government, field experts, alumni, current students, and parents of students. Such 360° feedback provides a balanced view and future directions for program improvement.

5.8. Educational Strategies and Program Mapping

There should be an experienced group (quality group) which collaborates with other faculty members to carry out a gap analysis of the curricula based on international recommendations such as those of ACM, ABET, etc. This group should develop the curriculum mapping with program outcomes/performance indicators and develop an assessment strategy. Shafi et al. highlighted that an effective assessment is essential to accurately measure the attainment of student outcomes [37]. Therefore, we recommend that the assessment plan is prepared in advance and that, based on the assessment strategy, assessment data are collected and analyzed. The data collection should not be left to individual faculty members; rather, the quality group should collaborate with them.

5.9. Continuous Improvement

As Zambrano [38] and Hussain et al. [39] have highlighted, continuous improvement in curriculum and assessments is important. We also agree with them that continuous improvement is the ultimate objective of a quality assurance drive and that many academic programs collect important assessment data but fail to make program improvements. Once an improvement action plan is developed based on the assessment data, there should be a timeline to achieve its targets. Normally, there is a need for active follow-up by the top management to ensure that these actions are implemented in due course of time.

5.10. Technology Usage in Program Assessment

Program assessments result in huge amounts of data, so there is a need to handle these data optimally. A web-based system could be used in the data collection process, as was also employed by Rajak et al. in their data collection process [21]. Furthermore, as Alhakami et al. [32] proposed, the use of data analytics could help in identifying weaknesses; we recommend applying different data mining algorithms to predict attainment levels before the summative assessments. Such predictive analysis could be helpful to better understand the weaknesses in the programs.
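A minimal sketch of this kind of predictive analysis is given below, assuming formative indicators (for example, quiz, midterm, and assignment percentages) as features and summative attainment of a performance indicator as the label; the feature names and data are hypothetical, and scikit-learn is just one possible toolkit, not the system used in our accreditation work.

```python
# Hypothetical sketch: predict whether a student will attain a performance indicator
# (reach the Developed/Exemplary bands) from early formative indicators.
# Feature names and data are invented for illustration; scikit-learn is assumed available.
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

# Each row: [quiz average %, midterm %, assignment average %]; label: 1 = attained, 0 = not.
X = [[85, 78, 90], [40, 35, 50], [70, 65, 72], [55, 48, 60],
     [92, 88, 95], [30, 42, 38], [66, 71, 69], [50, 55, 45]]
y = [1, 0, 1, 0, 1, 0, 1, 0]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

model = DecisionTreeClassifier(max_depth=3, random_state=0)
model.fit(X_train, y_train)

print("Held-out accuracy:", accuracy_score(y_test, model.predict(X_test)))

# Flag students predicted not to attain, so remedial actions can start before
# the summative assessment.
at_risk = [i for i, pred in enumerate(model.predict(X)) if pred == 0]
print("At-risk student indices:", at_risk)
```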

6. Conclusions

Quality education requires fostering effective quality assurance processes that can ensure the delivery of the required skillset in graduates to improve their employability. With globalization, the need for comparable credentials has increased, and international agreements such as the Washington accord and the Seoul accord have provided frameworks for comparable qualifications. As a result, educational institutions have started adopting outcome-based education, which is an emerging paradigm to deliver quality education. Deploying effective outcome-based quality assurance programs requires careful planning and deployment efforts from different stakeholders in the education sector. There is very little body of knowledge on best practices and guidelines to adopt effective outcome-based education. In this paper, we have presented detailed guidelines to foster an effective outcome-based program framework based on the ABET accreditation experience of three different academic programs. These guidelines will help educational institutions to deploy sustainable practices which will help in fostering a quality culture in academic institutions. As future work, longitudinal studies need to be carried out across different educational institutions, where the proposed guidelines are evaluated in cross-cultural environments. Such studies will help to understand the impact of cultural factors on the effectiveness of the proposed guidelines.

Author Contributions

Conceptualization, A.M.A. and S.S.; methodology, A.M.A. and S.S.; data curation, A.M.A. and S.S.; writing—original draft preparation, A.M.A. and S.S.; writing—review and editing, A.M.A. and S.S. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Acknowledgments

The authors would like to thank the four anonymous reviewers for their very constructive comments. We would further like to thank Imam Abdulrahman Bin Faisal University’s management, the Deanship of Quality and Academic Accreditation, CCSIT management, all department chairs, organizational units, committees, and current and former faculty members for their continuous support in fostering a quality culture at CCSIT. Without the support of all these stakeholders, it would not have been possible to carry out such an initiative. The authors are also thankful to ABET headquarters, the Computing Accreditation Commission, and the reviewers for their encouraging feedback on our quality initiatives.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Table A1. Mapping of IAU and CCSIT Goals [41].
Goals of Imam Abdulrahman Bin Faisal University:
  1. Create and sustain relevant, high quality instruction that fulfills the Kingdom’s needs and carry out research and services in the professional fields that are the developmental priorities for the Kingdom.
  2. Build a new organizational quality management system that is appropriate for a large and complex institute as IAU.
  3. Instill and develop a culture of quality at IAU, where all professional activities are recognized, where accountability is ensured and where IAU’s stated objectives are achieved at all levels and throughout all Departments.
  4. Expand the opportunities for students to learn and engage at IAU in order to support the realization of their academic and career aspirations.
  5. Provide access and support for all qualified students, while establishing a vibrant and interactive campus environment that leads to an engaging University community that fosters loyalty to IAU.
  6. Create state-of-the-art libraries and related learning resources that utilize all available technology.
  7. Develop, expand, and sustain modern academic facilities, equipment and related infrastructure in order to serve the requirements of quality instruction, learning, research, service programs, campus life and community outreach.
  8. Develop and implement a robust financial planning and management system to serve all academic programs, administrative support, and services units.
  9. Increase human resource capacities to accomplish teaching, research, and service missions more effectively.
  10. Create a culture of intellectual curiosity and research of the highest ethical standards and accomplishment that effectively serves the advancement, health and prosperity of the University and the community at large.
  11. Develop entrepreneurial initiatives and business partnerships with the community in order to expand private sector joint investments for future development.
College Goals (CCSIT):
  • Learning: Enhance the Quality of Graduating Students.
  • Discovery: Improve the Culture of Innovation and Research by Focusing on Areas Strategic to Imam Abdulrahman Bin Faisal University and the Kingdom.
  • Engagement: Engage High Schools, Colleges, Alumni, Community and the Government.
Table A2. Mapping of program educational objectives with Department, College and University mission statements.
CS Program Educational Objectives | CS Program Mission | College Mission | University Mission
The graduates of the Computer Science program will: | To offer a quality education in the various domains of Computer Science and prepare students for both their professional careers and lifelong learning by enhancing their problem solving skills and instilling in them a sense of responsibility towards serving their community, society and the nation in a professional manner. | Provide quality computing education, discovery, and professional services with community engagements | Providing creative knowledge, research, and professional services with effective community partnerships
PEO-1: Apply computing knowledge and skills to design and implement solutions in computer science domain. | S | S | S
PEO-2: Contribute effectively as an individual, team member and leader. | S | S | S
PEO-3: Demonstrate ethical and social values in their professional practices. | S | S | M
PEO-4: Engage in lifelong learning, higher education, career growth, and community service. | S | S | S
Table A3. Mapping of CS program’s outcomes and program educational objectives [41].
Program Outcomes | Program Educational Objectives
CS Program Student Outcomes | PEO-1: Apply computing knowledge and skills to design and implement solutions in computer science domain. | PEO-2: Contribute effectively as an individual, team member and leader. | PEO-3: Demonstrate ethical and social values in their professional practices. | PEO-4: Engage in lifelong learning, higher education, career growth, and community service.
A: An ability to apply knowledge of computing and mathematics appropriate to the program’s Student Outcomes and to the discipline
B: An ability to analyze a problem, and identify and define the computing requirements appropriate to its solution
C: An ability to design, implement, and evaluate a computer-based system, process, component, or program to meet desired needs
D: An ability to function effectively on teams to accomplish a common goal
E: An understanding of professional, ethical, legal, security and social issues and responsibilities
F: An ability to communicate effectively with a range of audiences
G: An ability to analyze the local and global impact of computing on individuals, organizations, and society
H: Recognition of the need for and an ability to engage in continuing professional development
I: An ability to use current techniques, skills, and tools necessary for computing practice
J: An ability to apply mathematical foundations, algorithmic principles, and Computer Science theory in the modeling and design of computer-based systems in a way that demonstrates comprehension of the tradeoffs involved in design choices
K: An ability to apply design and development principles in the construction of software systems of varying complexity
Table A4. Sample rubrics for performance indicator of SO 6 of CYS program.
Performance Indicators | Rubrics
 | Poor or Non-Existent | Developing | Developed | Exemplary
6.1 Students demonstrate the abilities to evaluate a variety of cybersecurity and digital forensics tools/techniques to achieve appropriate solutions | The student does not have any background on possible solutions and wrongly selected the solution | The student has a limited understanding of the alternative techniques in the CYS environment, and the selected solution can be further improved | The student has partially understood the alternative techniques in the CYS environment, but the selected solution was correct | The student has completely understood the alternative techniques in the CYS environment and chose an appropriate solution
6.2 Students demonstrate the abilities to apply the concepts of data, software, component, connection, and system security | The student shows inability to work with a contemporary development tool | The student is able to gain limited working mastery of a contemporary development tool only under instructor guidance | The student is able to gain working mastery of a contemporary development tool only under instructor guidance | The student is able to gain in-depth mastery of a contemporary development tool without instructor guidance
6.3 Students demonstrate the abilities to analyze and manage security risks affecting business continuity | The student has limited knowledge of possible cyber threats and cannot build a plan for responding to a wide range of threats | The student adequately knows the possible cyber threats and has a limited ability to build a plan for responding to a wide range of threats | The student is aware of some cyber threats and builds a working plan for mitigating those threats | The student is able to identify most potential cyber threats and build an applicable plan to mitigate such risks
6.4 Students demonstrate abilities to carry out cybersecurity strategic planning targeting organizational infrastructure security | The student has a weak ability to build future plans for securing the organizational infrastructure | The student has a limited ability to build future plans for securing the organizational infrastructure | The student has an adequate ability to build future plans for securing the organizational infrastructure | The student is able to build effective future plans to secure the organizational infrastructure
Table A5. An Excerpt from CYS Continuous Improvement Follow up Document.
Assessed Course/s | Student Outcome (SO), Performance Indicator (PI) | Attainment Percentage (%) | Recommended Action | Progress
CYS 404
CYS 433
CYS 501
CYS 503
CYS 508
CYS 408
Student Outcome 1: Analyze a complex computing problem and apply principles of computing and other relevant disciplines to identify solutions.
Recommended Action (Action 1): Adoption of project-based learning. Decomposing a problem into appropriate components, soliciting and formulating requirements, and estimating resources require not only theoretical knowledge but also practical skills. Keeping this in view, all course instructors contributing to this SO are advised to use a project-based approach to impart these skills among students. The realization of project deliverables will develop student skills for this SO.
Progress: The CYS curriculum unit has discussed the importance of this student outcome and has selected candidate courses to improve this skill. The following two courses have been selected to include discussions and projects to learn the project-based approach:
  • CYS 404 Information System Audit
  • CYS 503 Secure Software Design and Engineering
The following three courses are based on practical knowledge and are recommended for assessing the project-based approach to achieve the said SO:
  • CYS 433 Practical (Co-Op) Training
  • CYS 501 Project Proposal
  • CYS 508 Project Implementation
The mapping of CYS curriculum and the course syllabus are attached for reference
Performance Indicator 1.1: Students demonstrate the abilities to formulate and decompose a problem into appropriate components.82.20%
Performance Indicator 1.2: Students demonstrate the abilities to solicit and formulate requirements specifications.85.17%
Performance Indicator 1.3: Students demonstrate the abilities to estimate resources required for the proposed solution.88.12%
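As an illustration of how an excerpt like Table A5 can drive the follow-up step, the sketch below averages the PI attainment percentages reported for Student Outcome 1 and flags any indicator falling below an action threshold. The 60% threshold, the simple averaging rule, and all identifiers are assumptions introduced for illustration, not rules taken from the accreditation process described here.

```python
# Hypothetical follow-up record mirroring the Table A5 excerpt. The 60% action
# threshold and the averaging rule are assumptions, not values from the paper.
ACTION_THRESHOLD = 60.0

so1_indicators = {
    "PI 1.1 formulate and decompose a problem": 82.20,
    "PI 1.2 solicit and formulate requirements specifications": 85.17,
    "PI 1.3 estimate resources for the proposed solution": 88.12,
}

def summarize_outcome(name, indicators, threshold=ACTION_THRESHOLD):
    """Average the PI attainments for one student outcome and flag weak PIs."""
    average = sum(indicators.values()) / len(indicators)
    weak = [pi for pi, pct in indicators.items() if pct < threshold]
    status = "recommended action required" if weak else "on track"
    return {"outcome": name, "average_attainment": round(average, 2),
            "weak_indicators": weak, "status": status}

print(summarize_outcome("Student Outcome 1", so1_indicators))
# {'outcome': 'Student Outcome 1', 'average_attainment': 85.16,
#  'weak_indicators': [], 'status': 'on track'}
```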

Appendix B

Accreditation criteria and supporting documents required for ABET accreditation are available at https://www.abet.org/accreditation/accreditation-criteria/.

References

  1. UN. Sustainable Development Goals-Quality Education. Available online: https://www.un.org/sustainabledevelopment/education/ (accessed on 16 September 2020).
  2. Hou, A.Y. Mutual recognition of quality assurance decisions on higher education institutions in three regions: A lesson for Asia. High. Educ. 2012, 64, 911–926. [Google Scholar] [CrossRef]
  3. Spady, W.G. Outcome-Based Education: Critical Issues and Answers; American Association of School Administrators: Arlington, VA, USA, 1994. [Google Scholar]
  4. Seoul Accord. Available online: https://www.seoulaccord.org/ (accessed on 16 August 2020).
  5. Washington Accord. Available online: https://www.ieagreements.org/accords/washington/ (accessed on 16 August 2020).
  6. ABET. Available online: www.abet.org (accessed on 16 August 2020).
  7. Siddiqui, R.K.; Saeed, S.; Wahab, F. Understanding role of student feedback in quality assessment: A case study. Vfast Trans. Educ. Soc. Sci. 2015, 3, 25–35. [Google Scholar]
  8. Mumtaz, H.; Saeed, S.; Wahab, F. Quality of university computing education: Perception of Pakistani students. Res. J. Recent Sci. 2013, 2, 24–30. [Google Scholar]
  9. Abukari, A.; Corner, T. Delivering higher education to meet local needs in a developing context: The quality dilemmas? Qual. Assur. Educ. 2010, 18, 191–208. [Google Scholar] [CrossRef] [Green Version]
  10. Utuka, G. Quality Assurance in Higher Education: Comparative Analysis of Provisions and Practices in Ghana and New Zealand. Ph.D. Thesis, Victoria University of Wellington, Wellington, New Zealand, 2012. [Google Scholar]
  11. El Alfy, S.; Abukari, A. Revisiting perceived service quality in higher education: Uncovering service quality dimensions for postgraduate students. J. Mark. High. Educ. 2019, 1–25. [Google Scholar] [CrossRef]
  12. Teeroovengadum, V.; Nunkoo, R.; Gronroos, C.; Kamalanabhan, T.J.; Seebaluck, A.K. Higher education service quality, student satisfaction and loyalty. Qual. Assur. Educ. 2019, 27, 427–445. [Google Scholar] [CrossRef]
  13. Phuc, P.; Vinh, N.; Do, Q. The implementation of outcome-based education: Evidence from master program in economic management at Hanoi universities. Manag. Sci. Lett. 2020, 10, 3299–3306. [Google Scholar] [CrossRef]
  14. Ram, M.P.; Ajay, K.K.; Nair, A. Geoscience curriculum: Approach through learning taxonomy and outcome based education. High. Educ. Future 2020, 7, 22–44. [Google Scholar] [CrossRef]
  15. Sasipraba, T.; Navas, R.K.B.; Nandhitha, N.M.; Prakash, S.; Jayaprabakar, J.; Pushpakala, S.P.; Subbiah, G.; Kavipriya, P.; Ravi, T.; Arunkumar, G. Assessment tools and rubrics for evaluating the capstone projects in outcome based education. Procedia Comput. Sci. 2020, 172, 296–301. [Google Scholar] [CrossRef]
  16. Rathy, G.A.; Sivasankar, P.; Gnanasambandhan, T.G. Developing a knowledge structure using outcome based education in power electronics engineering. Procedia Comput. Sci. 2020, 172, 1026–1032. [Google Scholar]
  17. Lavanya, C.; Murthy, J.N.; Kosaraju, S. Assessment practices in outcome-based education: Evaluation drives education. In Methodologies and Outcomes of Engineering and Technological Pedagogy; IGI Global: Hershey, PA, USA, 2020; pp. 50–61. [Google Scholar]
  18. Iqbal, S.; Willis, I.; Almigbal, T.H.; Aldahmash, A.; Rastam, S. Outcome-based education: Evaluation, implementation and faculty development. MedEdPublish 2020, 9. [Google Scholar] [CrossRef]
  19. Tan, K.; Chong, M.C.; Subramaniam, P.; Wong, L.P. The effectiveness of outcome based education on the competencies of nursing students: A systematic review. Nurse Educ. Today 2018, 64, 180–189. [Google Scholar] [CrossRef] [PubMed]
  20. Senaratne, S.; Gunarathne, A.N. Outcome-based education (OBE) in accounting in Sri Lanka: Insights for teacher education. In Teaching and Teacher Education; Palgrave Macmillan: London, UK, 2019; pp. 23–47. [Google Scholar]
  21. Rajak, A.; Shrivastava, A.K.; Shrivastava, D.P. Automating outcome based education for the attainment of course and program outcomes. In Proceedings of the 5th HCT Information Technology Trends (ITT), Dubai, UAE, 28–29 November 2018; pp. 373–376. [Google Scholar]
  22. Premalatha, K. Course and program outcomes assessment methods in outcome-based education: A review. J. Educ. 2019, 199, 111–127. [Google Scholar] [CrossRef]
  23. Manzoor, A.; Aziz, H.; Jahanzaib, M.; Wasim, A.; Hussain, S. Transformational model for engineering education from content-based to outcome-based education. Int. J. Contin. Eng. Educ. Life-Long Learn. 2017, 27, 266. [Google Scholar] [CrossRef]
  24. Cooper, S.; Cassel, L.; Moskal, B.; Cunningham, S. Outcomes-based computer science education. SIGCSE Bull. 2005, 37, 260–261. [Google Scholar] [CrossRef]
  25. Kahlon, A.; Kennedy, A.; Smarzik, L. Competency-based education: The future of learning. In Proceedings of the 50th ACM Technical Symposium on Computer Science Education (SIGCSE ’19), Minneapolis, MN, USA, 27 February–2 March 2019; Association for Computing Machinery: New York, NY, USA, 2019; p. 1238. [Google Scholar]
  26. Nguyen, H.T.; Sivapalan, S.; Linh, N.T. Implementing an outcome-based education framework: Case studies of FPT Education. In Proceedings of the 2nd International Conference on Modern Educational Technology, Singapore, 15 May 2020; pp. 15–20. [Google Scholar]
  27. Hu, X.; Hou, X.; Lei, C.; Yang, C.; Ng, J. An outcome-based dashboard for Moodle and Open edX. In Proceedings of the 7th International Learning Analytics & Knowledge Conference (LAK ’17), Vancouver, BC, Canada, 13–17 March 2017; Association for Computing Machinery: New York, NY, USA, 2017; pp. 604–605. [Google Scholar]
  28. Xu, Y.; Liu, P.; Tang, P. Exploration of outcome-based computational thinking education programs for teachers. In Proceedings of the 2nd International Conference on E-Society, E-Education and E-Technology (ICSET 2018), Taipei, Taiwan, 13–15 August 2018; Association for Computing Machinery: New York, NY, USA, 2018; pp. 123–126. [Google Scholar]
  29. Lam, W.W.M.; Xie, H.; Liu, D.Y.W.; Yung, K.W.H. Investigating Online Collaborative Learning on Students’ Learning Outcomes in Higher Education. In Proceedings of the 2019 3rd International Conference on Education and E-Learning (ICEEL 2019), Barcelona, Spain, 5–7 November 2019; pp. 13–19. [Google Scholar]
  30. Adams, B.G.; Arce-Trigatti, A.; Arce, P.E. Developing an inquiry-guided laboratory manual with ABET-centered student learning objectives for chemical engineering transfer science courses. In Proceedings of the American Society for Engineering Education, Auburn, AL, USA, 8–10 March 2020. [Google Scholar]
  31. Barr, R. An ABET preparation perspective under the new proposed criteria 3 and 5. In Proceedings of the 2017 Gulf Southwest Annual Regional Conference, Richardson, TX, USA, 12–14 March 2017. [Google Scholar]
  32. Alhakami, H.H.; Al-Masabi, B.A.; Alsubait, T.M. Data analytics of student learning outcomes using Abet course files. In Proceedings of the SAI 2020, Advances in Intelligent Systems and Computing Conference, London, UK, 16–17 July 2020; pp. 309–325. [Google Scholar]
  33. Meah, K.; Hake, D.; Wilkerson, S.D. A multidisciplinary capstone design project to satisfy ABET student outcomes. Educ. Res. Int. 2020, 2020, 9563782. [Google Scholar] [CrossRef]
  34. Delatte, N.; Ressler, S.J.; Morse, A.N.; Saviz, C.M.; Barry, B.E. Toward continuous improvement of EAC/ABET Criteria 3 and 5. In Proceedings of the ASEE Virtual Annual Conference Experience, College Park, MD, USA, 22–26 June 2020. [Google Scholar]
  35. Peridier, V.A. Faculty-directed continuous improvement regimen with intentional ABET/SO 1–7 Scaffolding. In Proceedings of the ASEE Virtual Annual Conference Experience, College Park, MD, USA, 22–26 June 2020. [Google Scholar]
  36. Bachnak, R.; Marikunte, S.S.; Shafaye, A.B. Fundamentals of ABET accreditation with the newly approved changes. In Proceedings of the ASEE Annual Conference and Exposition, Tampa, FL, USA, 16–19 June 2019. [Google Scholar]
  37. Shafi, A.; Saeed, S.; Bamarouf, Y.A.; Iqbal, S.Z.; Min-Allah, N.; Alqahtani, M.A. Student outcomes assessment methodology for ABET accreditation: A case study of computer science and computer information systems programs. IEEE Access 2019, 7, 13653–13667. [Google Scholar] [CrossRef]
  38. Zambrano, C. Continuous improvement model to systematize curricular processes in the context of ABET accreditation. In Proceedings of the International Conference on Frontiers in Education: Computer Science and Computer Engineering (FECS), Las Vegas, NV, USA, 29 July–1 August 2019; pp. 88–93. [Google Scholar]
  39. Hussain, A.A.; Tayem, N.; Nayfeh, J.; El Nakla, S. Undergraduate Engineering Program Assessment, Evaluation, and Continuous Improvement Process: A Case Study. In Proceedings of the 2020 ASEE Gulf-Southwest Annual Conference, Albuquerque, NM, USA, 23–29 April 2020. [Google Scholar]
  40. Merriam, S.B. Qualitative research and case study applications in education. Revised and expanded. In Case Study Research in Education; Jossey-Bass Publishers: San Francisco, CA, USA, 1998. [Google Scholar]
  41. IAU. Available online: https://www.iau.edu.sa/en (accessed on 16 August 2020).
  42. Feagin, J.R.; Orum, A.M.; Sjoberg, G. (Eds.) A Case for the Case Study; UNC Press Books: Chapel Hill, NC, USA, 1991. [Google Scholar]
  43. IEEE. Available online: https://www.IEEE.org (accessed on 16 August 2020).
  44. ACM. Available online: https://www.acm.org/ (accessed on 16 August 2020).
  45. AIS. Available online: https://aisnet.org/ (accessed on 16 August 2020).
  46. Computing Curriculum Guideline. Available online: https://www.acm.org/education/curricula-recommendations (accessed on 16 August 2020).
  47. Yu, S.; Ally, M.; Tsinakos, A. Emerging Technologies and Pedagogies in the Curriculum; Springer: Singapore, 2020. [Google Scholar]
  48. Hennessy, S.; Wishart, J.; Whitelock, D.; Deaney, R.; Brawn, R.; La Velle, L.; McFarlane, A.; Ruthven, K.; Winterbottom, M. Pedagogical approaches for technology-integrated science teaching. Comput. Educ. 2007, 48, 137–152. [Google Scholar] [CrossRef]
  49. Almuhaideb, A.M.; Saeed, S.A. Process based approach to ABET accreditation: A case study of Cyber Security and Digital Forensics Program Accreditation. J. Inf. Syst. Educ. 2020, in submission. [Google Scholar]
  50. Bascopé, M.; Perasso, P.; Reiss, K. Systematic review of education for sustainable development at an early stage: Cornerstones and pedagogical approaches for teacher professional development. Sustainability 2019, 11, 719. [Google Scholar] [CrossRef] [Green Version]
  51. Dobrowolska, M.; Flakus, M.; Ślazyk-Sobol, M.; Wawoczny, A. Strengthening professional efficacy due to sustainable development of social and individual competences—Empirical research study among Polish and Slovak employees of the aviation sector. Sustainability 2020, 12, 6843. [Google Scholar] [CrossRef]
  52. Refae, G.A.; Askari, M.Y.; Alnaji, L. Does the industry advisory board enhance education quality? Int. J. Econ. Bus. Res. 2016, 12, 32–43. [Google Scholar] [CrossRef]
Figure 1. Quality Assurance Framework.
Figure 2. Program Mapping of Cyber Security and Digital Forensics (CYS) Program.
Figure 3. Attainment of Summative Data.
Figure 4. Attainment of Exit Exam.
Figure 5. Attainment of Formative Data.
Figure 6. Alumni Survey response of Computer Science (CS) Program.
Figure 7. Alumni Survey response of Computer Information Systems (CIS) Program.
Figure 8. Alumni Survey response of CYS Program.
Figure 9. Faculty Survey response of CS Program.
Figure 10. Faculty Survey response of CIS Program.
Figure 11. Faculty Survey response of CYS Program.
