Article

Strategy for the Evaluation and Monitoring of Competencies in Engineering Programs to Improve Students’ Learning and Quality of Education

by Pamela Hermosilla 1,*, Felipe Muñoz-La Rivera 2,3,4, Nicolás Ateaga 5 and Elisa Gallardo 6

1 School of Computer Engineering, Pontificia Universidad Católica de Valparaíso, Av. Brasil 2241, Valparaíso 2340000, Chile
2 School of Civil Engineering, Pontificia Universidad Católica de Valparaíso, Av. Brasil 2147, Valparaíso 2340000, Chile
3 School of Civil Engineering, Universitat Politècnica de Catalunya, Carrer de Jordi Girona 1, 08034 Barcelona, Spain
4 International Centre for Numerical Methods in Engineering (CIMNE), C/Gran Capitán S/N, UPC Campus Nord, Edifici C1, 08034 Barcelona, Spain
5 Faculty of Engineering, Pontificia Universidad Católica de Valparaíso, Av. Brasil 2147, Valparaíso 2340000, Chile
6 Department of Civil Engineering, Universidad de la Frontera, Francisco Salazar 1145, Temuco 4780000, Chile
* Author to whom correspondence should be addressed.
Sustainability 2021, 13(21), 11721; https://doi.org/10.3390/su132111721
Submission received: 8 September 2021 / Revised: 11 October 2021 / Accepted: 12 October 2021 / Published: 23 October 2021

Abstract

The Sustainable Development Goals (SDGs) and the Education for Sustainable Development (ESD) framework propose the concept of competencies as a key cognitive, attitudinal, and procedural aspect aimed at the integral development of students, which poses a challenge for how students are evaluated. The traditional monitoring of students' progress through their grades is therefore not enough, and the monitoring of competencies is becoming more important. This research proposes a system for monitoring competencies in engineering programs. The system identifies the expected learning outcomes (LOs) of each course and cross-references them according to the weighting of each planned evaluation. It then links each LO with the competencies of the course, allowing students to be monitored throughout their degree. The strategy was applied to a civil engineering course: the students' results according to the course competencies were obtained and correctly linked with the LOs and the grades obtained, and the evolution of these competencies was evaluated until the end of the semester in which the students took the course. The analysis of the results shows the differences between monitoring by grades and monitoring by competencies, evidencing cases in which a student passed the course by grades but failed to develop the expected competency.

1. Introduction

Improving the quality of and access to education is one of the aspects present in the Sustainable Development Goals (SDGs) proposed by the United Nations [1]. Accordingly, Education for Sustainable Development (ESD) suggests the need to foster structured processes to develop competencies in students that promote sustainable development [2]. This is based on the principles of continuous improvement and integral education, in which learning is not only based on “to know”, but extends to “to be”, “to do”, “to live together”, and “to transform” [3].
Competency-based education (CBE) seeks a comprehensive training of students, achieving a balance between theory and practice, by developing behavioural skills according to different levels of achievement defined for each specific course [4,5]. Thus, in the university context, competency-based educational programs seek to establish a set of competencies to be developed throughout a career, promoting the integration of knowledge, skills, and attitudes in the contexts of different disciplines [6], as well as student learning opportunities in accordance with the sustainable development goal of quality education [7]. In this sense, the concept of competency in CBE refers to a learning model or framework in a certain area, in this case education. In contrast, the idea of “competence” is associated with a particular ability or skill that can be measured individually in students based on a specific behaviour [8,9].
To evaluate the development and achievement of competencies throughout the training period, it is necessary to connect them with the contents and evaluations of each course [10]. The concept of learning outcomes makes it possible to assess the degree of appropriation of competencies through the more direct measurement of various achievement indicators [11]. Learning outcomes are measurable statements of what a student should know, understand, and apply in a curriculum; that is, they weigh knowledge (to know) with behaviour (know-how and know how to be) in specific contexts of interest [12]. Universities have been adopting competency-based education for years; however, monitoring both competencies as such and the learning outcomes of individual students is not a concept that is widely applied [13]. The monitoring of students is conducted through grades that are direct results of evaluations of different types, but not of learning outcomes or competencies as such [11,14].
Following continuous improvement trends and the evaluation parameters of different international accreditation entities, this paper provides a general description of the aspects of the CBE model, giving it a practical approach through its application to a real-life case study in the field of engineering. It is necessary to investigate concepts such as learning outcomes (LOs), which play a fundamental role in the model used to evaluate competencies [4,15]; their relevance directly affects the appropriation of the competencies that emanate from the declared graduate profile. The integration of these concepts reveals a way to measure the progress of students throughout their training process, identifying the achievement of competencies, which brings together aspects of knowing, knowing how to do, and knowing how to be [16,17].
In this article, the application and analysis of a case study are presented, through which it is possible to identify, evaluate, and measure the competencies of a specific course in the second year of studies, with which the initial basic training of an engineering career ends. The applied model allows a systematic review of the progress of the students and the level achieved by each student in a course, as well as what is expected at that level of study in the program, thus establishing a comprehensive evaluation system for the continuous improvement of engineering training processes.

2. Research Methodology

Our research methodology is organised in two stages: (1) concepts and (2) case study. Figure 1 shows the stages and specifies the research tools and activities, along with the results of each stage.
In the first stage, a literature review was carried out to identify the characteristics and key concepts of student learning outcomes and competencies in engineering programs. In order to identify these characteristics, it was necessary to recognise some keywords, which were selected based on relevance. The search was carried out in the Web of Science and Scopus libraries, with the incorporation of manuals and technical reports from recognised entities in the area and university accreditation organizations. Documents were selected from the year 1998 to the present. The following search topics were considered: (a) Competency-based education, (b) learning outcomes in CBE, and (c) continuous improvement cycle. Later, based on the general bibliographic support and a preliminary proposed method by the research team in [18], an “Assessment System for Learning Outcomes and Competencies in Engineering Programs” was described.
In the second stage, based on the described system, a case study was developed. The system was applied to part of a civil engineering program, specifically to the fourth-semester course Dynamic Mechanics, to exemplify its application. Two analyses were carried out: first, at the course level, the results of three semesters (104 students) were studied, reducing the analysis to a systematic sample (a probabilistic method based on the principle of equiprobability); then, an analysis at the program level was carried out to exemplify the impact of the proposed system on the follow-up of students. The results were visualised and analysed with a focus on identifying the advantages of the assessment system in engineering courses and programs.

3. Background

3.1. Competency-Based Education

Competency-based education has been investigated in the teaching field for several decades. It originated in the United States, where it was initially called performance-based teacher training. From the first approaches to the concept in the 1960s, competency-based education was characterised by detailing the behavioural aspects of a professional task, that is, considering procedural and attitudinal aspects in addition to conceptual notions [8,19,20]. This approach is intended to promote student learning from a more comprehensive perspective that considers the application of concepts in a more practical way, developing behavioural skills and aiming at the fulfilment of the levels of achievement defined for each course [6,15].
The definition of the learning trajectory for the achievement of the relevant competencies must safeguard the adequate integration of situations or learning experiences in which it is possible to demonstrate performance in carrying out activities, as well as ensure meaningful learning that allows an adequate connection between theory and practice [13,21]. These theoretical constructs result in three principles: learning has to be situated in a recognizable and meaningful context; theory and practice must be connected, because this lets students acquire experience and reflect on it; and knowledge, skills, and attitudes should be integrated in learning trajectories [16,18].
These principles establish that the design of competency-based education (CBE) programs is an approach that, by its nature, is increasingly close to the educational field, and especially to the university field, which has chosen to base curricular redesign processes on this approach and to analyse assessment alternatives that best evaluate the competencies defined in the curricular plans of study [13,22].

3.2. Learning Outcomes in CBE

The competency-based education model implicitly considers the concept of the learning outcome (LO), which allows a more direct assessment of the degree of appropriation of competencies, since its definition points to the development of learning experiences composed of elements that allow a more direct identification and evaluation of the desired achievement indicators [23]. Although the definition of learning outcomes and their levels of achievement is well known within the educational environment, various sources are associated with the concept, including normative requirements, recommendations, and the results of good practices [24]. To mention some: learning outcomes are statements of what a student is expected to know, understand, and apply in a given learning experience; learning outcomes allow a short-term measurable representation of the competencies associated with a study program; and the level of achievement of the learning outcomes is a way of verifying the appropriation of competencies by students [25,26].
In this way, in the field of higher education and professional training, the evaluation of learning outcomes is a method that integrates different aspects in its assessment and measurement, allowing the assignment of levels of achievement [27] that are not easily identifiable if the assessment is based only on the contents of a course. The evaluation associates knowledge (knowing), performance (knowing how to do), and behaviour (knowing how to be) in their application to certain situations, allowing a direct appreciation of the integral progress of students' training [23,28].

3.3. Education for Sustainable Development (ESD)

Implementing the SDGs in the educational field requires an adequate integration of different elements that allow students to appropriate the knowledge, skills, and attitudes to solve the global interdisciplinary challenges facing today’s society. In this sense, the framework for Education for Sustainable Development (ESD) for the year 2030, coordinated by UNESCO, recognises Sustainable Development Goal (SDG) 4 on quality education as a key enabling agent of all the other SDGs [2].
In this sense, studies have been carried out that provide guidelines to incorporate these aspects in the design of educational curricula, delivering initiatives to reorient university training to manage sustainability. This has suggested that it is not only necessary to define key competencies for the SDGs, but also that this approach to ESD brings with it, in addition to knowing, knowing how to do, and knowing how to be, learning to live together and learning to transform, aspects that will undoubtedly lead us to rethink the terms of competence in the educational field [3,29].

3.4. Continuous Improvement

The commitment to ensuring and making visible the quality of training in higher education institutions is a priority task for universities [4], but it also becomes a transversal axis in the approach to the Sustainable Development Goals, considering that education is the basis for achieving them [7].
Most accrediting entities for engineering careers follow a common pattern: (1) publicly demonstrating, through concrete mechanisms, that a certain study program educates professionals according to its graduation profile, i.e., how the training objectives are achieved and made visible; (2) publicly demonstrating the processes that allow continuous improvement in the study program, and how improvement actions that benefit and add value to undergraduate training are carried out; and (3) publicly demonstrating coherence and cohesion in the training process through institutional objectives (university level), the objectives of the study program (degree or program level), and learning outcomes (course or subject level), including how contents are structured, how the assessment or evaluation of learning is carried out by teachers, and how feedback is given to students [30,31,32].
Table 1 shows key concepts that can be identified in different accrediting entities for engineering by territory: Continuous Improvement (CI); Program Evaluation (PE); Program Assessment (PA); Learning Outcome/Learning Objective (LO); Student/Graduate Outcome/Competence (S/G/O/C); Educational/Program Objective (E/PO); Evidence Making (EM); Feedback Making (FM).
However, none of these entities propose a specific mechanism for how to achieve continuous improvement or how to measure the level of appropriation of the competencies declared in the graduation profiles, which ABET defines as assessment and evaluation [33]. Achieving continuous improvement in higher education means permanently detecting opportunities for improvement in all teaching–learning processes of a study program or faculty, in order that, both externally and internally, it is possible to make visible that all efforts go in the direction of delivering the best educational service possible, which means that all the actors in an institution are aligned with the same commitments and objectives [42].

4. Design of the Assessment System for Learning Outcomes and Competencies in Engineering Programs

The proposed evaluation system has been structured according to the five competency pillars presented by Education for Sustainable Development: learning to know, learning to be, learning to do, learning to live together, and learning to transform [3]. To meet the challenge of educational programs (curricula and teaching methods), universities must innovate and improve in order to sensitise students to sustainable development [2].
The assessment system is structured for engineering programs that understand or define learning outcomes as objectives in undergraduate courses, and student outcomes, attributes, or competencies as educational objectives. Figure 2 illustrates the components of the assessment system, the actors, and the interrelationship between them.
The system allows assessment, evaluation, and improvement at the course and program levels. The program-level assessment is defined by the average of all the courses involved in the evaluation strategy or, if applicable, by previously defined weightings (for example, by SCT credits). For these purposes, we recommend a unified database in which teachers can enter, save, and modify the parameters they use in their assessment process.
Table 2 summarises the procedure of the proposed assessment system, including formulas and descriptions. The procedure is separated into seven stages: (1) identification and definition of default input variables; (2) definition of system configuration input variable 1; (3) assessment and calculation of achievement factors of learning outcomes; (4) definition of system configuration input variable 2; (5) calculation of in-between parameters; (6) assessment and calculation of achievement factors of competencies (course level); and (7) calculation of achievement factors of competencies (program level).
In the first stage, the variables that the system considers must be defined: learning outcomes, competencies, assessment activities, and the weight of each assessment activity. Generally, these variables are in the course syllabus; otherwise, the teacher must define them. In the second stage, the defined variables are correlated, assigning a weight to the learning outcomes in each activity and a total weight to each learning outcome in the course; the teacher must define these variables. In the third stage, using the qualifications of the assessment activities, the achievement factors of the learning outcomes can be calculated. In the fourth stage, the relationship between each learning outcome and each competency is obtained (variable E); to define it, the teacher must read both definitions and relate them to one another (1 for a relation, 0 for none). In the fifth stage, the weight of each competency in each of the evaluation activities is calculated. In the sixth stage, using the qualifications of the assessment activities, the achievement factors of the competencies are calculated (not to be confused with the accumulated achievement factors of competencies). Finally, if the process is undertaken in a certain number of courses, in the seventh stage it is possible to calculate the accumulated achievement factors of competencies. These factors make it possible to measure competencies.
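To make the procedure concrete, the sketch below implements the course-level stages with NumPy. The activity weights, the B and E matrices, and the student qualifications are invented placeholders rather than the case-study values; the variable names simply mirror the notation of Table 2.

```python
import numpy as np

MQ = 70.0                            # maximum qualification (10-70 scale)
P = np.array([20.0, 45.0, 35.0])     # stage 1: activity weights, sum = 100 (Equation (1))

# Stage 2: B[i, j] = weight of learning outcome j within assessment activity i.
# Each row sums to P[i]; column sums give T[j], and sum(T) = 100 (Equation (2)).
B = np.array([
    [10.0,  5.0,  5.0],    # quizzes
    [15.0, 20.0, 10.0],    # tests
    [ 5.0, 10.0, 20.0],    # laboratory
])
T = B.sum(axis=0)

# Stage 4: E[x, j] = 1 if learning outcome j contributes to competency x.
E = np.array([
    [1, 1, 0],             # hypothetical competency 1 (e.g., DC1)
    [0, 1, 1],             # hypothetical competency 2 (e.g., DC3)
    [1, 0, 1],             # hypothetical competency 3 (e.g., FC5)
])
W = E.sum(axis=1)          # number of LOs linked to each competency

Q = np.array([55.0, 48.0, 62.0])     # one student's qualifications per activity

# Stage 3: achievement factor of each learning outcome (Equation (3)).
AF_LO = 100.0 / (MQ * T) * (Q @ B)

# Stage 5: D[i, x] = weight of competency x in activity i (Equation (5));
# each column of D sums to 100 by construction.
D = (B / T) @ E.T * (100.0 / W)

# Stage 6: achievement factor of each competency at course level (Equation (6)).
AF_C = (Q @ D) / MQ

print("AF(LO) =", AF_LO.round(1))    # one value per learning outcome, 0-100
print("AF(C)  =", AF_C.round(1))     # one value per competency, 0-100
# Stage 7 (program level) would sum AF(C) over the Z_x courses that develop
# competency x and divide by Z_x (Equation (7)).
```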
Figure 3 shows all the components of the proposed evaluation and improvement strategy, which consist of three main stages: input of variables to obtain data (using the assessment system), obtaining data for evaluation, and obtaining and interpreting information to detect opportunities for improvement.
The competencies to be quantified and evaluated must be defined, together with the courses where the development of the competencies is declared. It should be noted that it is not necessary to cover all courses, but only those considered key or capstone. The teachers in charge of the courses must be aligned with the strategy and follow the procedure described in Figure 3 step by step, since their effort will determine the validity of and confidence in the data obtained, which, in turn, will serve as feedback for their students and the structure of their courses in order to make improvements. At the end of the semester, after all the assessment activities have been carried out, it will be possible to calculate the achievement factors of the learning outcomes and the partial achievement factors of the competencies, revealing the level of acquisition of the learning outcomes by students.
As the semesters progress, until only the last course remains to be evaluated, it will be possible to monitor the development of the defined competencies, making explicit the strengths and weaknesses of the students and opening the way for possible improvement actions, such as remedial activities, reinforcements, or adjustments, in order to achieve a level of appropriation adequate to what is declared in the graduation profile. In addition, it will be possible to obtain information on the types of assessment instruments used throughout a training period, and the entire study program will be coherent in terms of content, assessment activities, learning outcomes, courses, and competencies.
Once this has been specified and applied over time, engineering programs that opt for the strategy will be able to publicly demonstrate that they have reached a cycle of continuous improvement.

5. Case Study

5.1. Implementation Context

The application of the competency assessment model presented above was carried out in the Civil Engineering section of the Faculty of Engineering at the Pontifical Catholic University of Valparaiso (PUCV). It first considered the study of a specific course, and then the level of the study program, tracking the advancement of a particular competency until the end of the fourth semester of study, which is considered the first cycle of initial training in engineering careers.
The study plan of the degree has a total duration of 11 semesters, and each course contributes to different competencies, which are classified into fundamental (FC), disciplinary (DC), and professional (PC) competencies.
Table 3 describes and numbers these competencies, and Table 4 details the civil engineering courses up to the fourth semester (the initial cycle of the career). In this study, the course-level analysis is carried out on the fourth-semester subject "Dynamic Mechanics" (No. 17), which belongs to the area of engineering sciences and whose professor is one of the authors of this article. For each course, the PUCV credits are detailed (1 PUCV credit is equivalent to a dedication of 40.5 chronological hours per semester), as are the contributions to career competencies.

5.2. Application of the Proposed Methodology

In a general context, Dynamic Mechanics is a theoretical–practical course that aims at having students develop and assimilate the fundamental principles of the laws of dynamics and achieve their correct application in problems associated with civil engineering. At the end of the course, students are expected to develop the following Learning Outcomes (LOs):
  • LO1: Understands the basic principles of the dynamics of particle systems and rigid bodies for the study of their movement behaviour.
  • LO2: Applies the fundamental concepts of the dynamics of particle systems and rigid bodies to the analysis of their motion and interaction with other bodies from the conception of mathematical models.
  • LO3: Describes and analyses vibrations of systems with one or more degrees of freedom in simple harmonic or damped regimes for the study of the behaviour of simple systems and structures.
In addition, according to Table 3 and Table 4, the course contributes to the following competencies of the civil engineering career:
  • DC1: Uses the knowledge of the basic sciences to understand, pose, and solve mathematical models associated with physical phenomena and processes related to the field of civil engineering.
  • DC3: Masters the conceptual basis and the analysis tools of the engineering sciences area to study and solve civil engineering problems and problems that transcend the field of the specialty.
  • FC5: Demonstrates capacity for analysis, abstraction, synthesis, and critical reflection in order to solve problems, build knowledge, and develop self-learning, both individually and in interdisciplinary teamwork.
For the achievement of the learning outcomes, the course considers three types of assessment activities: (1) quizzes, (2) tests, and (3) laboratory work. Table 5 shows the description of each of these types of activities, along with the number and weighting of each in the course (as indicated by Equation (1) in Table 2).
Figure 4 shows the triad: competencies, learning outcomes, and assessment activities. It is possible to see the allocation of the weights to the learning outcomes for each activity, along with the correspondence between the learning outcomes and the competencies (as indicated by Equation (2) in Table 2).
The application to the course was carried out over three semesters between 2018 and 2019, from which a study group was obtained and a sample was identified to monitor competencies at the program level. Across the periods considered, 104 students took Dynamic Mechanics: 30 in the first semester of 2018, 34 in the second semester of 2018, and 40 in the first semester of 2019. Grades are on a scale from 10 to 70, and passing the course requires a final average greater than or equal to 40.

5.3. Course Level: Results

With the input data obtained from Table 5 and Figure 4, it was possible to calculate the percentage of achievement for both learning outcomes (as indicated by Equation (3) in Table 2) and related competencies (as indicated by Equation (4) in Table 2). Figure 5 shows the distribution of course grades. The quizzes, tests, and laboratory results are shown in green tones, and the final averages in red. Figure 6 shows the distributions of learning outcomes (orange tones) and competencies (blue tones).
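For readers who want to reproduce this kind of view, a minimal plotting sketch is given below. The arrays are randomly generated stand-ins rather than the course data, and the thresholds drawn are the pass mark of 40 and the 60% achievement requirement discussed later.

```python
import matplotlib.pyplot as plt
import numpy as np

# Invented stand-ins for the course data: final averages on the 10-70 grade
# scale and achievement factors of one competency on the 0-100 scale.
rng = np.random.default_rng(0)
final_avg = rng.normal(42, 9, 104).clip(10, 70)
af_dc3 = rng.normal(55, 10, 104).clip(0, 100)

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(9, 3))
ax1.hist(final_avg, bins=12, color="firebrick")
ax1.axvline(40, linestyle="--", color="black")   # passing threshold
ax1.set(title="Final averages", xlabel="grade (10-70)")
ax2.hist(af_dc3, bins=12, color="steelblue")
ax2.axvline(60, linestyle="--", color="black")   # 60% achievement requirement
ax2.set(title="AF of a competency", xlabel="achievement factor (0-100)")
plt.tight_layout()
plt.show()
```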
To perform an analysis of the data, a systematic sample was taken. It uses a probabilistic method based on the principle of equiprobability, meaning that all the individuals in the selected sample have the same probability of being chosen. This method assures us that the extracted sample will be representative. This type of sample has the following characteristics:
  • There is no discretion available to the researcher (subjective intervention of the sampler).
  • Items are selected by mechanical rules.
  • Sample error is considered.
  • The probability of inclusion is known.
To carry out this type of sampling, the sample size must first be determined according to Equation (8):
$n = \frac{N Z^2 p q}{d^2 (N - 1) + Z^2 p q}$
where n is the sample size to be calculated, N is the population size, p the probability of success, q the probability of failure, d the maximum permissible error, and Z the confidence coefficient for a given confidence level. Table 6 shows the values of the selected parameters.
With the values of Table 6, according to Equation (8), the sample size of n = 25 is calculated. Then, the selection factor (k) is calculated by Equation (9):
$k = \frac{N}{n}$
We obtain k = 4.217. Finally, we determine A, a random number between 1 and k; in this case, A was 3. Therefore, the students in the sample are taken from the indexes of the ordered list following the sequence A, A + k, A + 2k, A + 3k, …, with each index rounded to the nearest integer. This yields a sample of 25 individuals with representation proportional to the total number of students, selected by systematic sampling from the members of the course under study (presented in Table 7) and ordered according to the resulting average, where P denotes pass and F denotes fail.
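Equations (8) and (9) can be scripted directly, as in the minimal sketch below; the helper names are ours, and the rounding convention for the index sequence is an assumption, so the exact indexes may differ slightly from those reported in Table 7.

```python
def sample_size(N, Z, p, d):
    """Sample size for a finite population, Equation (8)."""
    q = 1.0 - p
    return (N * Z**2 * p * q) / (d**2 * (N - 1) + Z**2 * p * q)

def systematic_indices(N, n, A):
    """Indexes A, A + k, A + 2k, ... of the ordered list (Equation (9), k = N/n)."""
    k = N / n                                    # selection factor
    return [round(A + i * k) for i in range(n)]  # rounding convention assumed

# Case-study values: 104 students, a sample of 25, and random start A = 3.
indices = systematic_indices(N=104, n=25, A=3)
print(len(indices), indices[:5])   # 25 selected positions in the ordered list
```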
A general review of the data makes it possible to verify one of the premises of this research regarding averages versus the real achievement of learning outcomes and competencies. Students 65 and 69 on the list both have an average of 45 and therefore pass the course, but their individual achievements are not the same: among the most significant differences are 8, 12, and 6 percentage points in LO1, LO2, and DC3, respectively, for the same final course result.
Along with the above, it is also possible to group the data according to grade ranges to facilitate the analysis of the results, as shown in Table 8, with a stratification of five points of variation in the averages. In this way, the following summary data can be identified from the sample under study.
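This stratification amounts to a grouped average, as in the sketch below; the DataFrame rows are invented placeholders standing in for the Table 7 sample, and only two achievement columns are shown.

```python
import pandas as pd

# Group sampled students into 5-point bands of their final average and take
# the mean of each achievement column, as in Table 8.
df = pd.DataFrame({
    "final": [27, 35, 42, 45, 52, 59],   # invented final averages
    "LO1":   [35, 54, 57, 58, 69, 80],   # invented achievement factors
    "DC3":   [35, 57, 56, 64, 70, 81],
})
bands = pd.cut(df["final"], bins=range(25, 65, 5), right=False)
print(df.groupby(bands, observed=True).mean().round(0))
```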

5.4. Program Level: Results

The proposed monitoring of competencies can also be focused at the level of the study program, which allows the progress of each competency to be tracked at different stages of the training process. Understanding that this is a gradual process, Figure 7 shows the development of the DC3 competency up to the fourth semester for the 25 students in the sample analysed in this case study, making it possible to visualise the trend in the formation of that competency. The lines in the red zone correspond to students who, by the indicated semester, have not surpassed 60% of the achievement of the competency that they should have fulfilled in the corresponding courses. If students achieved 100% of competency DC3 in the fourth semester, they would have made 20% progress in this competency with respect to the total study program (11 semesters).
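The red-zone check can be computed directly from the accumulated achievement factors, as in the sketch below; the data matrix and the number of DC3 courses are assumptions made for illustration.

```python
import numpy as np

# af[s, c]: course-level achievement factor AF(C) of DC3 (0-100) for student s
# in the c-th course that declares DC3; the values are invented for illustration.
af = np.array([
    [70.0, 64.0, 58.0, 52.0],
    [82.0, 79.0, 75.0, 71.0],
    [55.0, 61.0, 48.0, 57.0],
])
Z_X = 10   # assumed total number of courses developing DC3 in the program

# Accumulated achievement factor after each course (partial sums of Equation (7)).
aaf = np.cumsum(af, axis=1) / Z_X

# Expected contribution if every completed course were fully achieved; a student
# falls in the "red zone" when below 60% of that expectation.
expected = np.arange(1, af.shape[1] + 1) * (100.0 / Z_X)
in_red_zone = aaf < 0.60 * expected

for s in range(af.shape[0]):
    print(f"student {s}: AAF = {aaf[s].round(1)}, red zone = {in_red_zone[s]}")
```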

6. Analysis and Discussion

6.1. Course Level

Based on the observed results, an average of grades greater than or equal to 40, with which a subject is passed, does not reflect a direct relationship with the degree of fulfilment of the learning outcomes or the appropriation of the competencies associated with the course. This level of demand (an average of at least 40), given the grading scale applied, corresponds to a 60% requirement; it could be adjusted, since such an adjustment would not change the grades themselves but rather what is expected of students in terms of competencies. Thus, according to the stratified summary in Table 8, students falling in range R4 (40–44), the first within the "Passed" band, are not necessarily above what is expected in the development of certain competencies. Competency-based assessment aims to highlight facts such as this, allowing a greater degree of knowledge of students' real learning instead of a single grade.
From a graphic perspective, Figure 8 shows that the central ranges are where somewhat undefined situations may arise in relation to the real achievement of learning outcomes and competencies. Around the minimum required grade, the behaviour is clearer and aligned with the expected achievement; thus, this type of analysis makes more visible the cases in which a student passes, complying with the required grade, but has not necessarily reached the degree of appropriation of the competencies considered important for a given course.
Finally, from the individual purpose of a course and the level of achievement of its competencies, some aspects may have no major relevance in the training process. However, the different courses that make up a complete study program are related, so more background is required to obtain an overview of the true impact of the achievement of a competency, understanding that its real appropriation is a gradual process and that focusing on a specific moment, such as a single course, is not enough for a more comprehensive analysis.

6.2. Program Level

The specific analysis of the course studied can be projected to the rest of the subjects that make up the curriculum. In this way, it is possible to know precisely the students' progress in specific competencies at each stage of the formative path, and to see their progress in the appropriation of the competencies, understanding that this is a gradual process throughout the curriculum.
Because the level of achievement reached by approximately 48% of the students in the course is under the 60% requirement, and delving into the graph of DC3 (Figure 7), different analyses can be discussed. The level of achievement shows that students who do not progress according to expectations may still continue in the career: viewed from a numerical perspective, by calculating their average grades, these students can achieve the minimum passing grade even though the real appropriation of the competencies analysed remains below the expected level. This situation indicates that, in the following courses of the study program where this competency or a related one is developed, the students would face certain weaknesses in the skills, knowledge, and attitudes expected at that level. Since student learning is a progressive and cumulative process, any deficiency evidenced at a given moment may lead to future situations in which a student, not having the competencies necessary to acquire new knowledge, does not respond to the expected results.

7. Conclusions

This work presents a proposal that covers the main aspects of defining a competency assessment system with a simple application approach that is, at the same time, significant in terms of how the results obtained relate to what grades represent upon completion of a course. From this point of view, the proposal provides a mechanism focused on safeguarding the quality of education, with equity as its main axis, mainly in the measurement of students' learning achievements, guaranteeing fairness in the formative process.
In addition to the above, it allows for the investigation of different ways of evaluating a course and, therefore, the competencies it pursues, providing more representative data. Additionally, by allowing a progressive application throughout a career, it is possible to generate reinforcement actions in case they are required, and adequately review new instances of evaluation that manage to incorporate the concept of competencies in the courses.
After applying the method to a course of study, differences between monitoring students by marks and by competencies were observed. Although there were students who passed the course by grades, some competencies were not correctly acquired; that is to say, although these students exceeded the pass mark, some competencies (when looked at in isolation) did not exceed the established achievement percentages. This analysis highlights an essential aspect that teachers and those in charge of continuous improvement in education need to consider: although university education systems promote the application of competency-based education, monitoring systems such as the one presented in this research manage to show the actual achievement and acquisition of competencies.
This problem, which can be analysed at the course level, can be expanded during the progress of a student’s education (semester by semester). When follow-ups throughout a student’s years of study are carried out, the deficiencies in competence acquisition are cumulative.
Finally, considering that education is a continuous improvement process, future work could consider some variations of the proposed model. Incorporating each course's weighting within the curriculum would provide more relevant information about a given course, yielding a more representative and meaningful approach in terms of results for the students themselves and the related teachers. In addition, it is necessary to incorporate aspects of the Education for Sustainable Development (ESD) model, whose framework international organizations have begun to review with the intention of establishing general guidelines for incorporating the competencies that safeguard compliance with the SDGs. In our particular case, this would imply adjusting the proposal so that it focuses on the criteria set out in this framework to facilitate quality education.

Author Contributions

This paper represents the results of teamwork. P.H., N.A. and F.M.-L.R. designed the research methodology. N.A. and E.G. carried out the background research. All of the authors worked on the results, discussion, and conclusions of the manuscript. Finally, P.H. and F.M.-L.R. reviewed and edited the manuscript. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by Proyecto VRIEA-PUCV, grant number 039.441/2021, and was conducted thanks to the contributions of CORFO Project 14ENI2-26905 Ingeniería 2030-PUCV. It was also funded by CONICYT grant CONICYT-PCHA/International Doctorate/2019-72200306, which supported the graduate research of Muñoz-La Rivera.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Acknowledgments

The authors acknowledge the Collaborative Group of Engineering Education (CGEE) from Pontificia Universidad Católica de Valparaíso (Chile).

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Muñoz La Rivera, F.; Hermosilla, P.; Delgadillo, J.; Echeverría, D. The sustainable development goals (SDGs) as a basis for innovation skills for engineers in the industry 4.0 context. Sustainability 2020, 12, 6622.
  2. UNESCO. Framework for the Implementation of Education for Sustainable Development (ESD) Beyond 2019; UNESCO: Paris, France, 2019.
  3. Makrakis, V.; Kostoulas-Makrakis, N.; Kanbar, N. Developing and Validating an ESD Student Competence Framework: A Tempus-Rucas Initiative. Int. J. Excell. Educ. 2013, 5, 1–13.
  4. Zlatkin-Troitschanskaia, O.; Shavelson, R.J. Advantages and challenges of performance assessment of student learning in higher education. Br. J. Educ. Psychol. 2019, 89, 413–415.
  5. Sistermans, I.J. Integrating competency-based education with a case-based or problem-based learning approach in online health sciences. Asia Pac. Educ. Rev. 2020, 21, 683–696.
  6. Pérez, C.G.; Clem, S. Teaching practices at a Chilean university 3 years after conversion to competency-based education. J. Competency-Based Educ. 2017, 2, e01054.
  7. Muñoz La Rivera, F.; Hermosilla, P.; Delgadillo, J.; Echeverría, D. Propuesta de construcción de competencias de innovación en la formación de ingenieros en el contexto de la industria 4.0 y los objetivos de desarrollo sostenible (ODS). Form. Univ. 2021, 14, 75–84.
  8. Paek, S.; Um, T.; Kim, N. Exploring latent topics and international research trends in competency-based education using topic modeling. Educ. Sci. 2021, 11, 303.
  9. Medina, M. Does Competency-Based Education Have a Role in Academic Pharmacy in the United States? Pharmacy 2017, 5, 13.
  10. Moon, Y.L. Education reform and competency-based education. Asia Pac. Educ. Rev. 2007, 8, 337–341.
  11. Millet, I.; Weinstein, S. Grading by Objectives: A Matrix Method for Course Assessment. Qual. Approaches High. Educ. 2015, 6, 3–19.
  12. Moskal, B.M.; Leydens, J.A.; Pavelich, M.J. Validity, reliability and the assessment of engineering education. J. Eng. Educ. 2002, 91, 351–354.
  13. Vargas, H.; Heradio, R.; Chacon, J.; De La Torre, L.; Farias, G.; Galan, D.; Dormido, S. Automated assessment and monitoring support for competency-based courses. IEEE Access 2019, 7, 41043–41051.
  14. Silitonga, P. Competency-based education: A multi-variable study of tourism vocational high school graduates. J. Teach. Travel Tour. 2021, 21, 72–90.
  15. Walden, P.R. Competency-Based Education: Purposes and Promises. In Seminars in Speech and Language; Thieme Medical Publishers: New York, NY, USA, 2020; Volume 41, pp. 289–297.
  16. Dragoo, A.; Barrows, R. Implementing Competency-Based Education: Challenges, Strategies, and a Decision-Making Framework. J. Contin. High. Educ. 2016, 64, 73–83.
  17. Crawford, L.; Cofie, N.; McEwen, L.; Dagnone, D.; Taylor, S.W. Perceptions and barriers to competency-based education in Canadian postgraduate medical education. J. Eval. Clin. Pract. 2020, 26, 1124–1131.
  18. Ateaga, N.; Hermosilla, P.; Muñoz-La Rivera, F. Design of Assessment System for Learning Outcomes and Competences in Engineering Programs. Solid State Technol. 2020, 63, 886–896.
  19. Ramírez, L.M.D.; Lamancusa, J.S.; Zayas-Castro, J.L.; Jorgensen, J.E. Making a Partnership Work: Outcomes Assessment of the Manufacturing Engineering Education Partnership. J. Eng. Educ. 1998, 87, 519–527.
  20. Fullerton, J.T.; Thompson, J.B.; Johnson, P. Competency-based education: The essential basis of pre-service education for the professional midwifery workforce. Midwifery 2013, 29, 1129–1136.
  21. Fan, J.Y.; Wang, Y.H.; Chao, L.F.; Jane, S.W.; Hsu, L.L. Performance evaluation of nursing students following competency-based education. Nurse Educ. Today 2015, 35, 97–103.
  22. Cukierman, U.R.; Morell, L.; Noel, R.; Muñoz, R.; Vendrell, E.; Becerra, C.; Prado, C.G. Triple Helix for Outcomes Based Engineering. In Proceedings of the ASEE 2016 International Forum, New Orleans, LA, USA, 25 June 2016.
  23. Anderson, C.I.; Basson, M.D.; Ali, M.; Davis, A.T.; Osmer, R.L.; McLeod, M.K.; Haan, P.S.; Molnar, R.G.; Peshkepija, A.N.; Hardaway, J.C.; et al. Comprehensive Multicenter Graduate Surgical Education Initiative Incorporating Entrustable Professional Activities, Continuous Quality Improvement Cycles, and a Web-Based Platform to Enhance Teaching and Learning. J. Am. Coll. Surg. 2018, 227, 64–76.
  24. Davis, D.; Gentili, K.; Trevisan, M.; Calkins, D. Engineering Design Assessment Processes and Scoring Scales for Program Improvement and Accountability. J. Eng. Educ. 2002, 91, 211–221.
  25. Trevisan, M.S.; David, D.C.; Crain, R.W.; Calkuns, D.E.; Gentili, K.L. Developing and Assessing Statewide Competencies for Engineering Design. J. Eng. Educ. 1998, 87, 185–193.
  26. Cates, S.V.; Doyle, S.; Gallagher, L.; Shelton, G.; Broman, N.; Escudier, B. Making the case for virtual competency-based education: Building a twenty-first century small business workforce. High. Educ. Ski. Work. Learn. 2021, 11, 282–295.
  27. Damaj, I.; Yousafzai, J. Effective assessment of student outcomes in computer engineering programs using a minimalistic framework. Int. J. Eng. Educ. 2018, 35, 59–75.
  28. McGourty, J.; Sebastian, C.; Swart, W. Developing a Comprehensive Assessment Program for Engineering Education. J. Eng. Educ. 1998, 87, 355–361.
  29. Makrakis, V.; Kostoulas-Makrakis, N. Bridging the qualitative-quantitative divide: Experiences from conducting a mixed methods evaluation in the RUCAS programme. Eval. Program. Plann. 2016, 54, 144–151.
  30. Calderón, H.E.; Vásquez, R.E.; Aponte Roa, D.A.; Del Valle, M. Successful Assessment Strategies for ABET Accreditation of Engineering Programs Offered at Different Campuses. In Proceedings of the 14th LACCEI International Multi-Conference for Engineering, Education, and Technology: "Engineering Innovations for Global Sustainability", San José, Costa Rica, 20–22 July 2016.
  31. Robinson, R.; Hensel, E., Jr. A Process for Assessment of ABET Student Outcomes in a Mechanical Engineering Department. In Proceedings of the 13th LACCEI Annual International Conference: Engineering Education Facing the Grand Challenges, What Are We Doing, Santo Domingo, Dominican Republic, 29–31 July 2015.
  32. Rogers, G.M. EC2000 and measurement: How much precision is enough? J. Eng. Educ. 2000, 89, 161–165.
  33. ABET. Criteria for Accrediting Engineering Programs. 2018. Available online: https://www.abet.org/accreditation/accreditation-criteria/criteria-for-accrediting-engineering-programs-2020-2021/ (accessed on 1 August 2021).
  34. CEAB. Canadian Engineering Accreditation Board. 2020. Available online: https://engineerscanada.ca/accreditation/accreditation-board (accessed on 1 August 2021).
  35. MERCOSUR. Criterios de Calidad Para La Acreditación ARCU-SUR. 2019. Available online: http://arcusur.org/arcusur_v2/index.php/documentos-del-sistema (accessed on 1 August 2021).
  36. ENAEE. Framework Standards and Guidelines (EAFSG). 2015. Available online: https://www.enaee.eu/eur-ace-system/standards-and-guidelines/ (accessed on 1 August 2021).
  37. Chinese Engineering Education Accreditation Association (CEEAA). Criteria for Engineering Education Accreditation. 2017. Available online: http://english.cast.org.cn/col/col1333/index.html (accessed on 1 August 2021).
  38. JABEE (Japan Accreditation Board for Engineering Education). JABEE Criteria Guide for Accreditation of Engineering Education Programs at Bachelor Level. 2019. Available online: https://jabee.org/doc/Criteria_Guide_ENB_2019-.pdf (accessed on 1 August 2021).
  39. National Board of Accreditation (NBA). General Manual for Accreditation; National Board of Accreditation (NBA): New Delhi, India, 2019; pp. 1–39.
  40. Engineers Australia. Accreditation Management System Accreditation Principles. 2018, pp. 1–30. Available online: https://www.engineersaustralia.org.au/sites/default/files/2019-02/AMS-POL-01_Accreditation_Principles_v2.0.pdf (accessed on 1 August 2021).
  41. Engineering Council of South Africa (ECSA). Engineering Qualifications in the Higher Education; Engineering Council of South Africa: Johannesburg, South Africa, 2019.
  42. Endut, A.S.; Majid, F.A.; Ibrahim, A.B.; Ashari, H. Responsive Outcome Evaluation as an Internal Quality Assurance Mechanism Alternative at IHLs in Malaysia. Procedia-Soc. Behav. Sci. 2013, 90, 13–21.
Figure 1. Research methodology.
Figure 2. Assessment system general diagram.
Figure 3. Evaluation and improvement strategy flow diagram.
Figure 4. Triad diagram: competencies, learning outcomes, and assessment activities.
Figure 5. Distribution of grades in the Dynamic Mechanics course.
Figure 6. Distribution of achievement factors in the Dynamic Mechanics course (learning outcomes and competencies—range 0 to 100).
Figure 7. Development of the DC3 competency up to the fourth semester for each student.
Figure 8. Competencies and learning outcomes according to grade ranges.
Table 1. Some accrediting entities for engineering programs by country or region. An "x" marks each concept addressed.

Engineering Accreditation Agency (Country or Region): QTE | CI | PE | PA | LO | S/G/O/C | E/PO | EM | FM
ABET: Accreditation Board of Engineering and Technology (USA) [33]: xxxxxxxx
CEAB: Canadian Engineering Accreditation Board (Canada) [34]: xxxxxxxx
ARCU-SUR: Regional Accreditation System for University Careers (South America) [35]: xxxxxxxx
ENAEE: European Network for Accreditation of Engineering Education (Europe) [36]: xxxxxxxxx
CEEAA: China Engineering Education Accreditation Association (China) [37]: xxxx xxxx
JABEE: Japan Accreditation Board for Engineering Education (Japan) [38]: xxxxxxxx
NBA: National Board of Accreditation (India) [39]: xxxx xxxx
IEAust: Institution of Engineers Australia (Australia) [40]: xxxxxxxxx
ECSA: Engineering Council of South Africa (South Africa) [41]: xxxxxxxxx
Table 2. Assessment system procedure: obtaining data for evaluation and improvement.

Stage 1. Identification and definition of default input variables. For $j \in \{1, 2, \dots, n\}$, $LO_j$ denotes Learning Outcome $j$; for $x \in \{1, 2, \dots, k\}$, $C_x$ denotes Competence $x$; for $i \in \{1, 2, \dots, m\}$, $AA_i$ denotes Assessment Activity $i$ and $P_i$ its weight, with $\sum_{i=1}^{m} P_i = 100$ (Equation (1)). These variables should be defined in the course syllabus; otherwise, the teacher must define them.

Stage 2. Definition of system configuration input variable 1. $B_{i,j}$ is the weight of Learning Outcome $j$ in Assessment Activity $i$, and $T_j$ is the total weight of Learning Outcome $j$ in the course, where $P_i = \sum_{j=1}^{n} B_{i,j}$, $T_j = \sum_{i=1}^{m} B_{i,j}$, and $\sum_{j=1}^{n} T_j = 100$ (Equation (2)). The teacher must define these variables.

Stage 3. Assessment and calculation of achievement factors of learning outcomes. For student $s$, $s \in \{1, 2, \dots, r\}$, let $Q_{s,i}$ be the qualification of student $s$ in Assessment Activity $i$ and $MQ$ the maximum qualification. The achievement factor of Learning Outcome $j$ is $AF(LO)_{s,j} = \frac{100}{MQ \, T_j} \sum_{i=1}^{m} Q_{s,i} B_{i,j}$ (Equation (3)). Using the qualifications of the assessment activities, the achievement factors of the learning outcomes can be calculated.

Stage 4. Definition of system configuration input variable 2. $E_{x,j} \in \{0, 1\}$ is the relation between Learning Outcome $j$ and Competence $x$, and $W_x = \sum_{j=1}^{n} E_{x,j} \neq 0$ is the total number of learning outcomes related to Competence $x$ (Equation (4)). To define the "E" variables, the teacher must read both definitions and link them: 1 for a relation, 0 for no relation.

Stage 5. Calculation of in-between parameters. The weight of Competence $x$ in Assessment Activity $i$ is $D_{i,x} = \frac{100}{W_x} \sum_{j=1}^{n} \frac{E_{x,j} B_{i,j}}{T_j}$ (Equation (5)). The "D" parameters are analogous to the "B" weights; they result from aggregating the "E" variables.

Stage 6. Assessment and calculation of achievement factors of competencies (course level). The achievement factor of Competence $x$ for student $s$ is $AF(C)_{s,x} = \frac{1}{MQ} \sum_{i=1}^{m} Q_{s,i} D_{i,x}$ (Equation (6)). It must not be confused with the accumulated achievement factor of competencies.

Stage 7. Calculation of achievement factors of competencies (program level). With $Z_x$ the total number of courses in which Competence $x$ is developed, the accumulated achievement factor is $AAF(C)_{s,x} = \frac{1}{Z_x} \sum AF(C)_{s,x}$, where the sum runs over those courses (Equation (7)). If the process is conducted in a certain number of courses, these factors make it possible to measure competencies at the program level.
Table 3. Competencies of the civil engineering degree of the PUCV.

Fundamental (FC):
FC1. Recognises the transcendent dimension of human existence, and Christian anthropology as a valuable response to the meaning of life.
FC2. Acts ethically, illuminated by the Christian proposal, in real contexts, with autonomy and respect for others, seeking the common good, the promotion of human rights, and the realization of the human person, in a context of diversity.
FC3. Communicates in a clear and coherent manner through their mother tongue in an academic context.
FC4. Uses information and communication technologies as tools for academic and professional development.
FC5. Demonstrates capacity for analysis, abstraction, synthesis, and critical reflection with the aim of solving problems, building knowledge, and developing self-learning, both individually and in interdisciplinary teams.
FC6. Communicates orally and in writing in the English language, in order to facilitate insertion and participation in multicultural and interdisciplinary contexts.
FC7. Recognises reading, relationships with others, physical activity, healthy living, environmental care, art, and culture as sources of integral personal development.
FC8. Participates in democratic instances, committing their formation in a local, national, and international context.

Disciplinary (DC):
DC1. Uses the knowledge of basic sciences to understand, pose, and solve mathematical models associated with physical phenomena and processes related to the field of civil engineering.
DC2. Demonstrates logical-deductive thinking that allows the student to methodically face multidisciplinary problems that require the analytical capacity of an engineer.
DC3. Masters the conceptual basis and the analysis tools of the engineering sciences area to study and solve problems of civil engineering and those that transcend the scope of the specialty.

Professional (PC):
PC1. Has the tools to understand the social, economic, cultural, and environmental contexts in order to design and develop engineering projects.
PC2. Masters the techniques and procedures relevant to the management and direction of civil engineering projects, in order to optimise the use of resources for their development.
PC3. Works in interdisciplinary teams, generating integrated and efficient solutions related to civil engineering works and systems.
PC4. Identifies infrastructure deficiencies and proposes solutions that are technically feasible, economically viable, and responsible to society and the environment, in the field of application of civil engineering.
PC5. Designs civil works applying principles and methodologies of analysis, design criteria, and current regulations, to respond to the needs of society with an innovative vision.
PC6. Makes informed decisions while protecting the community and the environment in the formulation and management of civil engineering projects.
PC7. Leads, manages, and directs civil engineering works and systems, ensuring the proper use of economic, human, and environmental resources to meet the objectives of a project.
Table 4. Competencies of the Dynamic Mechanics course.

Semester | No. | Course | PUCV Credits | Hours per Semester | Competency marks (FC 1–8, DC 1–3, PC 1–7)
1 | 1 | Challenges in Civil Engineering | 4 | 162 | xx x x xx
1 | 2 | Introduction to Geometry | 3 | 121.5 | xx
1 | 3 | Fundamentals of Mathematics | 5 | 202.5 | xx x
1 | 4 | Introduction to Physics | 5 | 202.5 | x xx
1 | 5 | Oral and Written Communication | 3 | 121.5 | xx x
Accumulated total | | | 20 | 810 | FC: 0 3 4 0 1 0 2 0; DC: 2 3 0; PC: 0 0 0 0 0 1 1
2 | 6 | Applied Programming | 4 | 162 | x x x xxx
2 | 7 | Linear Algebra | 4 | 162 | xx
2 | 8 | Calculus 1 | 4 | 162 | xx
2 | 9 | General Mechanical Physics | 3 | 121.5 | xx
2 | 10 | Modelling and Topography Tools | 4 | 162 | xx
2 | 11 | Christian Anthropology | 2 | 81 | xx
Accumulated total | | | 21 | 850.5 | FC: 1 4 5 1 3 0 3 0; DC: 6 7 1; PC: 0 0 0 0 0 1 1
3 | 12 | Static Mechanics | 5 | 202.5 | x x x
3 | 13 | Engineering Materials | 4 | 162 | x x x xxx
3 | 14 | Calculus 2 | 4 | 162 | xx
3 | 15 | General Physics: Thermodynamics and Waves | 3 | 121.5 | x xx
3 | 16 | Applied Chemistry | 4 | 162 | x xx
Accumulated total | | | 41 | 1660.5 | FC: 1 4 6 1 7 0 4 0; DC: 11 11 3; PC: 0 0 0 0 0 1 1
4 | 17 | Dynamic Mechanics | 4 | 162 | x x x
4 | 18 | Differential Equations | 4 | 162 | xx
4 | 19 | Calculus 3 | 5 | 202.5 | xx
4 | 20 | General Physics: Electromagnetism | 3 | 121.5 | xx
4 | 21 | Construction Processes and Techniques | 4 | 162 | x xx x x x
Accumulated total | | | 61 | 2470.5 | FC: 1 5 6 2 9 0 4 0; DC: 15 14 5; PC: 0 1 0 0 0 2 1
Table 5. Description of assessment activities.

Activity | Type | Description | Quantity | Total Weight | Weight for Each One
A | Quizzes | Specific activity, to evaluate knowledge of direct application. | 4 | 20% | 5%
B | Tests | Global activity, to finalise a series of studied topics, cumulative, and their combined applications. | 3 | 45% | 15%
C | Laboratory | Practical activity, of empirical application of the topics. | 1 | 35% | 35%
Table 6. Parameters and values considered in the study.

Acronym | Parameter | Value
N | Population size | 104 students
Z | Confidence coefficient | Z = 1.96 for a 95% confidence level
p | Probability of success | The maximum is considered: 0.50
q | Probability of failure | 1.00 − 0.50 = 0.50
d | Maximum permissible error | 3%
Table 7. Selected sample data (quizzes Q1–Q4 with average QA; tests T1–T3 with average TA; laboratory LA; final average; learning outcomes; competencies; final status, where P is pass and F is fail).

No. | Q1 | Q2 | Q3 | Q4 | QA | T1 | T2 | T3 | TA | LA | Final Average | LO1 | LO2 | LO3 | FC5 | DC1 | DC3 | Status
2 | 22 | 29 | 55 | 10 | 27 | 29 | 26 | 21 | 25 | 10 | 20 | 35 | 34 | 19 | 30 | 27 | 35 | F
6 | 44 | 54 | 44 | 35 | 34 | 39 | 31 | 27 | 33 | 10 | 25 | 50 | 40 | 25 | 38 | 37 | 45 | F
10 | 49 | 40 | 58 | 50 | 43 | 19 | 28 | 28 | 25 | 19 | 27 | 40 | 44 | 35 | 40 | 38 | 42 | F
15 | 18 | 29 | 41 | 28 | 26 | 33 | 30 | 37 | 33 | 25 | 29 | 42 | 43 | 41 | 42 | 41 | 43 | F
19 | 62 | 66 | 60 | 60 | 50 | 26 | 26 | 40 | 30 | 21 | 31 | 50 | 50 | 43 | 48 | 47 | 50 | F
23 | 65 | 52 | 61 | 31 | 46 | 37 | 25 | 40 | 34 | 22 | 32 | 55 | 47 | 41 | 48 | 48 | 51 | F
27 | 56 | 70 | 52 | 47 | 49 | 24 | 26 | 33 | 28 | 33 | 34 | 49 | 53 | 49 | 51 | 49 | 51 | F
32 | 36 | 49 | 53 | 58 | 41 | 33 | 53 | 36 | 41 | 26 | 35 | 54 | 60 | 45 | 53 | 49 | 57 | F
36 | 22 | 52 | 49 | 52 | 38 | 35 | 48 | 45 | 43 | 31 | 37 | 52 | 61 | 53 | 55 | 53 | 57 | F
40 | 60 | 66 | 57 | 57 | 53 | 20 | 43 | 34 | 32 | 37 | 38 | 52 | 64 | 54 | 57 | 53 | 58 | F
44 | 44 | 70 | 57 | 56 | 53 | 34 | 25 | 39 | 33 | 39 | 39 | 55 | 58 | 58 | 57 | 56 | 56 | F
48 | 64 | 10 | 70 | 38 | 52 | 38 | 27 | 38 | 34 | 44 | 41 | 57 | 54 | 59 | 57 | 58 | 56 | P
53 | 61 | 50 | 53 | 65 | 61 | 33 | 29 | 58 | 40 | 33 | 42 | 57 | 56 | 62 | 59 | 60 | 57 | P
57 | 64 | 60 | 55 | 61 | 59 | 20 | 32 | 55 | 36 | 43 | 43 | 53 | 63 | 69 | 62 | 61 | 58 | P
61 | 28 | 46 | 55 | 52 | 42 | 40 | 54 | 40 | 45 | 45 | 44 | 60 | 70 | 63 | 64 | 62 | 65 | P
65 | 37 | 64 | 70 | 59 | 53 | 34 | 38 | 69 | 47 | 38 | 45 | 58 | 69 | 70 | 66 | 64 | 64 | P
69 | 60 | 10 | 43 | 59 | 57 | 32 | 33 | 43 | 36 | 51 | 45 | 56 | 57 | 71 | 61 | 63 | 56 | P
74 | 67 | 61 | 63 | 61 | 69 | 35 | 22 | 55 | 37 | 45 | 46 | 62 | 61 | 71 | 65 | 66 | 61 | P
78 | 52 | 56 | 57 | 64 | 64 | 31 | 50 | 42 | 41 | 46 | 47 | 60 | 71 | 66 | 66 | 63 | 66 | P
82 | 58 | 60 | 60 | 61 | 67 | 33 | 35 | 55 | 41 | 47 | 48 | 61 | 68 | 72 | 67 | 67 | 65 | P
86 | 63 | 58 | 53 | 61 | 68 | 33 | 41 | 55 | 43 | 48 | 50 | 63 | 69 | 73 | 69 | 68 | 66 | P
91 | 62 | 51 | 60 | 67 | 70 | 43 | 40 | 53 | 45 | 50 | 52 | 69 | 70 | 75 | 71 | 72 | 70 | P
95 | 65 | 62 | 63 | 61 | 69 | 52 | 34 | 57 | 48 | 52 | 53 | 75 | 70 | 78 | 74 | 77 | 73 | P
99 | 65 | 68 | 67 | 67 | 76 | 41 | 39 | 60 | 47 | 53 | 55 | 71 | 76 | 81 | 76 | 76 | 73 | P
103 | 65 | 68 | 67 | 68 | 74 | 52 | 47 | 61 | 53 | 57 | 59 | 80 | 81 | 85 | 82 | 82 | 81 | P
Table 8. Data grouped according to grade ranges.

Grade Range | Average | LO1 | LO2 | LO3 | FC5 | DC1 | DC3
R1 (25–29) | 27 | 44 | 42 | 34 | 40 | 39 | 43
R2 (30–34) | 32 | 51 | 50 | 44 | 49 | 48 | 51
R3 (35–39) | 38 | 53 | 61 | 53 | 55 | 53 | 57
R4 (40–44) | 42 | 57 | 61 | 63 | 60 | 60 | 59
R5 (45–49) | 46 | 60 | 65 | 70 | 65 | 65 | 62
R6 (50–54) | 52 | 69 | 70 | 75 | 71 | 72 | 70
R7 (55–59) | 59 | 80 | 81 | 85 | 82 | 82 | 81