
Implementing programmatic assessment: Bridging the gap between theory and practice

Publication date: 2024-06-06

Author:

Andreou, Vasiliki

Abstract:

Introduction
Programmatic assessment (PA) is an assessment approach in which assessments for learning (low-stakes assessments) alternate with assessments of learning (high-stakes assessments). Low-stakes assessments are intended to support trainees' further development, while high-stakes assessments are used to make decisions on trainees' progression, based on aggregated information. By combining assessment stakes in a continuum, PA aspires to create a fit-for-purpose approach that integrates both the learning and the decision-making purpose of assessment. Although PA has become prominent in medical education research, early implementation evidence points to barriers. Alternating assessment stakes, together with difficulties in engaging workplace stakeholders, namely trainees and trainers, poses challenges to PA when it is applied in practice. Therefore, designing and implementing a PA framework that aligns with stakeholders' needs is crucial for enhancing assessment practices in postgraduate medical education. Such a stakeholder-centric PA framework can help bridge the gap between theoretical knowledge and practical application, ensuring that the assessment process fulfils its dual purpose.

Method
This PhD project was based on the principles of educational design research (EDR) within the context of General Practitioner (GP) Training. Embracing the complexity of real learning environments, EDR pursues practical and scientific goals for specific contexts through the phases of analysis, design, and evaluation. Accordingly, this PhD project began by thoroughly analysing the existing assessment structures within GP Training (Chapter 1 and Chapter 2). It then proceeded to design customized solutions to address stakeholders' assessment needs (Chapter 3, Chapter 4, and Chapter 5), and concluded with evaluating these solutions (Chapter 6, Chapter 7, Chapter 8, and Chapter 9).

Results
Starting with the first high-stakes assessment and problem analysis, our validity study demonstrated that a multicomponent proficiency-testing exam can provide valid performance evidence, setting trainees' learning agenda. Our focus group study showed that low-stakes workplace assessments need to establish clear learning outcomes, allow frequent assessments, and enhance feedback processes in order to meet workplace stakeholders' needs. During the design phase, amidst the COVID-19 pandemic, our comparative study on proctoring systems suggested that online proctoring could be a viable solution for high-stakes assessments. Furthermore, our Delphi study indicated that the CanMEDS competency framework is context-dependent and challenging to use for low-stakes workplace assessments. Our co-design study aimed to bridge the gap between CanMEDS and practice by designing an Entrustable Professional Activities (EPAs) framework with clinical competence committee members, trainers, and trainees, highlighting the importance of collaborative efforts. This study showed that involving various stakeholders in the design phase of an educational intervention can identify implementation barriers. In the evaluation phase, our longitudinal study assessed the implementation of the EPA intervention, revealing that stakeholders' perceptions changed over time and diverged between trainees and trainers. Additionally, our mixed-method study on feedback demonstrated that EPAs can provide high-quality feedback.
Furthermore, our first qualitative study, based on semi-structured interviews, provided evidence on trainees' agency during EPAs: trainees' agency was multifaceted and was influenced by both internal and external factors. Finally, our second qualitative study, also based on semi-structured interviews, highlighted challenges with e-portfolios in implementing PA. User profile, limited e-portfolio functionality, and time constraints influenced not only the utilization of e-portfolios but also the implementation of PA.

Conclusion
This PhD project adds to the current literature on PA by describing a stepwise design and implementation process. Each chapter of this dissertation delineates a step of that process, not only outlining the necessary actions but also offering insights into curriculum planning and process evaluation, aimed at fostering sustainable and enduring results.