Programmatic Assessment in Education for Health Professions

A special issue of Education Sciences (ISSN 2227-7102).

Deadline for manuscript submissions: closed (15 June 2022)

Special Issue Editors


Dr. Marjan J. B. Govaerts
Guest Editor
Department of Educational Development and Research, Faculty of Health, Medicine and Life Sciences, Maastricht University, 6200 MD Maastricht, The Netherlands
Interests: competency-based education; competence-based assessment; programmatic assessment; assessment programs; workplace-based assessment

Prof. Dr. Cees van der Vleuten
Guest Editor
Department of Educational Development and Research, Faculty of Health, Medicine and Life Sciences, Maastricht University, 6200 MD Maastricht, The Netherlands
Interests: assessment of professional competence

Dr. Suzanne Schut
Guest Editor
Department of Education and Student Affairs, Faculty of Civil Engineering and Geosciences, Delft University of Technology, 2628 CN Delft, The Netherlands
Interests: programmatic assessment; assessment for learning; feedback

Special Issue Information

Dear Colleagues,

High-quality assessment in health professions education fosters the development of professional competence and ensures robust decision making with regard to learners’ fitness for practice. Assessment research, however, consistently shows that integrating these seemingly conflicting goals in assessment is challenging, and it is increasingly acknowledged that a systems approach to assessment—rather than a focus on individual measures—is required to resolve key dilemmas in assessment. Recently, programmatic assessment has been introduced as an approach to assessment that aims to simultaneously optimize formative and summative assessment functions. Programmatic assessment (PA) involves the careful selection and combination of a broad range of assessment activities embedded in students’ learning trajectories, each of which generates meaningful feedback for learning as well as evidence that can be used to support high-stakes decision making (Van der Vleuten et al., 2012). Although the principles underlying PA are firmly grounded in assessment research, studies on the effectiveness of programmatic assessment reveal several problems related to design and implementation (Schut et al., 2020).

The aim of this Special Issue of Education Sciences is to advance our understanding of how best to design and implement programmatic assessment in curricula for health professions education. We specifically seek articles that present models of effective programmatic assessment practice in the different education systems that exist in countries throughout the world. We furthermore welcome papers addressing challenges in the implementation of programmatic assessment in different settings, where adaptation to the legal, political, financial, and/or cultural factors of the local context is required. This themed issue will therefore focus on case studies that present programmatic assessment practices; elaborate on the contextualized design, implementation, and evaluation of PA in education for health professions; and reflect on the lessons learned (please see the template that serves as a guiding format for structuring your manuscript).

References:

Van der Vleuten, C. P., Schuwirth, L. W. T., Driessen, E. W., Dijkstra, J., Tigelaar, D., Baartman, L. K. J., & van Tartwijk, J. (2012). A model for programmatic assessment fit for purpose. Medical Teacher, 34(3), 205-214.

Schut, S., Maggio, L. A., Heeneman, S., van Tartwijk, J., van der Vleuten, C., & Driessen, E. (2020). Where the rubber meets the road—An integrative review of programmatic assessment in health care professions education. Perspectives on Medical Education, 1-8.

Dr. Marjan J. B. Govaerts
Prof. Dr. Cees van der Vleuten
Dr. Suzanne Schut
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to the website. Once registered, you can proceed to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a double-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Education Sciences is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1800 CHF (Swiss Francs). Submitted papers should be well formatted and written in good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • programmatic assessment
  • assessment programs
  • competence assessment

Published Papers (7 papers)


Editorial


6 pages, 202 KiB  
Editorial
Implementation of Programmatic Assessment: Challenges and Lessons Learned
by Marjan Govaerts, Cees Van der Vleuten and Suzanne Schut
Educ. Sci. 2022, 12(10), 717; https://doi.org/10.3390/educsci12100717 - 18 Oct 2022
Abstract
In the past few decades, health professions education programmes around the world have embraced the competency-based paradigm to guide the education and assessment of future healthcare workers [...]

Research


16 pages, 1169 KiB  
Article
Taking the Big Leap: A Case Study on Implementing Programmatic Assessment in an Undergraduate Medical Program
by Raphaël Bonvin, Elke Bayha, Amélie Gremaud, Pierre-Alain Blanc, Sabine Morand, Isabelle Charrière and Marco Mancinetti
Educ. Sci. 2022, 12(7), 425; https://doi.org/10.3390/educsci12070425 - 22 Jun 2022
Abstract
The concept of programmatic assessment (PA) is well described in the literature; however, studies on implementing and operationalizing this systemic assessment approach are lacking. The present case study developed a local instantiation of PA, referred to as Assessment System Fribourg (ASF), which was inspired by an existing program. ASF was utilized for a new competency-based undergraduate Master of Medicine program at the State University of Fribourg. ASF relies on the interplay of four key principles and nine main program elements based on concepts of PA, formative assessment, and evaluative judgment. We started our journey in 2019 with the first cohort of 40 students, who graduated in 2022. This paper describes our journey implementing ASF, including the enabling factors and hindrances that we encountered, and reflects on our experience and the path that is still in front of us. This case illustrates one possibility for implementing PA.

12 pages, 1363 KiB  
Article
Do Resident Archetypes Influence the Functioning of Programs of Assessment?
by Jessica V. Rich, Warren J. Cheung, Lara Cooke, Anna Oswald, Stephen Gauthier and Andrew K. Hall
Educ. Sci. 2022, 12(5), 293; https://doi.org/10.3390/educsci12050293 - 20 Apr 2022
Abstract
While most case studies consider how programs of assessment may influence residents’ achievement, we engaged in a qualitative, multiple case study to model how resident engagement and performance can reciprocally influence the program of assessment. We conducted virtual focus groups with program leaders from four residency training programs from different disciplines (internal medicine, emergency medicine, neurology, and rheumatology) and institutions. We facilitated discussion with live screen-sharing to (1) improve upon a previously derived model of programmatic assessment and (2) explore how different resident archetypes (sample profiles) may influence their program of assessment. Participants agreed that differences in resident engagement and performance can influence their programs of assessment in some (mal)adaptive ways. For residents who are disengaged and weakly performing (of which there are a few), significantly more time is spent making sense of problematic evidence, arriving at a decision, and generating recommendations, whereas for residents who are engaged and performing strongly (the vast majority), significantly less effort is thought to be spent on discussion and formalized recommendations. These findings motivate us to fulfill the potential of programmatic assessment by more intentionally and strategically challenging those who are engaged and performing strongly, and by anticipating ways in which weakly performing residents may strain existing processes.

Other


13 pages, 485 KiB  
Case Report
From Traditional to Programmatic Assessment in Three (Not So) Easy Steps
by Anna Ryan and Terry Judd
Educ. Sci. 2022, 12(7), 487; https://doi.org/10.3390/educsci12070487 - 14 Jul 2022
Abstract
Programmatic assessment (PA) has strong theoretical and pedagogical underpinnings, but its practical implementation brings a number of challenges, particularly in traditional university settings involving large cohort sizes. This paper presents a detailed case report of an in-progress programmatic assessment implementation involving a decade of assessment innovation occurring in three significant and transformative steps. The starting position and the subsequent changes represented in each step are reflected against the framework of established principles and implementation themes of PA. This case report emphasises the importance of ongoing innovation and evaluative research, the advantage of a dedicated team with a cohesive plan, and the fundamental necessity of electronic data collection. It also highlights the challenge of traditional university cultures, the potential advantage of a major pandemic disruption, and the necessity of curriculum renewal to support significant assessment change. Our PA implementation began with a plan to improve the learning potential of individual assessments and, over the subsequent decade, expanded to encompass a cohesive, course-wide assessment program involving meaningful aggregation of assessment data. In our context (large cohort sizes and university-wide assessment policy), regular progress review meetings and progress decisions based on aggregated qualitative and quantitative data (rather than assessment format) remain local challenges.

12 pages, 219 KiB  
Case Report
Embedding a Coaching Culture into Programmatic Assessment
by Svetlana Michelle King, Lambert W. T. Schuwirth and Johanna H. Jordaan
Educ. Sci. 2022, 12(4), 273; https://doi.org/10.3390/educsci12040273 - 12 Apr 2022
Abstract
Educational change in higher education is challenging and complex, requiring engagement with a multitude of perspectives and contextual factors. In this paper, we present a case study based on our experiences of enacting a fundamental educational change in a medical program; namely, the steps taken in the transition to programmatic assessment. Specifically, we reflect on the successes and failures in embedding a coaching culture into programmatic assessment. To do this, we refer to the principles of programmatic assessment as they apply to this case and conclude with some key lessons that we have learnt from engaging in this change process. Fostering a culture of programmatic assessment that supports learners to thrive through coaching has required compromise and adaptability, particularly in light of the changes to teaching and learning necessitated by the global pandemic. We continue to inculcate this culture and enact the principles of programmatic assessment with a focus on continuous quality improvement.
10 pages, 484 KiB  
Case Report
Assessment for Learning: The University of Toronto Temerty Faculty of Medicine M.D. Program Experience
by Glendon R. Tait and Kulamakan Mahan Kulasegaram
Educ. Sci. 2022, 12(4), 249; https://doi.org/10.3390/educsci12040249 - 31 Mar 2022
Abstract
(1) Background: Programmatic assessment optimizes the coaching, learning, and decision-making functions of assessment. It utilizes multiple data points, fit for purpose, which on their own guide learning but taken together form the basis of holistic decision making. While its principles are agreed upon, implementation varies according to context. (2) Context: The University of Toronto MD program implemented programmatic assessment as part of a major curriculum renewal. (3) Design and implementation: This paper, structured around best practices in programmatic assessment, describes the implementation in the University of Toronto MD program, one of Canada’s largest. The case study illustrates the components of the programmatic assessment framework; tracking and making sense of data; how academic decisions are made; and how data guide coaching, tailored support, and learning plans for learners. (4) Lessons learned: Key implementation lessons are discussed, including the role of context, resources, alignment with curriculum renewal, and the role of faculty development and program evaluation. (5) Conclusions: Large-scale programmatic assessment implementation is resource intensive and requires commitment both initially and on a sustained basis, with ongoing improvement and steadfast championing of the cause of optimally leveraging the learning function of assessment.

14 pages, 416 KiB  
Case Report
The Importance of Professional Development in a Programmatic Assessment System: One Medical School’s Experience
by Colleen Y. Colbert and S. Beth Bierer
Educ. Sci. 2022, 12(3), 220; https://doi.org/10.3390/educsci12030220 - 18 Mar 2022
Abstract
The Cleveland Clinic Lerner College of Medicine of Case Western Reserve University (CCLCM) was created in 2004 as a 5-year undergraduate medical education program with a mission to produce future physician-investigators. CCLCM’s assessment system aligns with the principles of programmatic assessment. The curriculum is organized around nine competencies, each with milestones that students use to self-assess their progress and performance. Throughout the program, students receive low-stakes feedback from a myriad of assessors across courses and contexts. With the support of advisors, students construct portfolios to document their progress and performance. A separate promotion committee makes high-stakes promotion decisions after reviewing students’ portfolios. This case study describes a systematic approach to providing the student and faculty professional development essential for programmatic assessment. Facilitators, barriers, lessons learned, and future directions are discussed.
