TY - JOUR
T1 - Evaluating a Competency-Based Blended Health Professions Education Program
T2 - A Programmatic Approach
AU - Samuel, Anita
AU - King, Beth
AU - Cervero, Ronald M.
AU - Durning, Steven J.
AU - Melton, John
N1 - Publisher Copyright:
© 2023 Published by Oxford University Press on behalf of the Association of Military Surgeons of the United States. This work is written by (a) US Government employee(s) and is in the public domain in the US.
PY - 2023/5/1
Y1 - 2023/5/1
AB - Introduction: Competency-based education (CBE) programs usually evaluate student learning outcomes at a course level. However, a more comprehensive evaluation of student achievement of competencies requires evaluation at a programmatic level across all courses. There is currently insufficient literature on accomplishing this type of evaluation. In this article, we present an evaluation strategy adopted by the competency-based master's degree program at the Center for Health Professions Education at the Uniformed Services University of the Health Sciences to assess student achievement of competencies. We hypothesized that (1) learners would grow in the competencies through their time in the program and (2) learners would exhibit a behavioristic change as a result of their participation in the program. Materials and Methods: The degree program at the Center for Health Professions Education conducts an annual student self-assessment of competencies using a competency survey. The competency survey data from graduated master's students were collected, providing data from three time points: initial (pre-program survey), middle, and final (end-of-program survey). Open-ended responses from these three surveys were also analyzed. A general linear model for repeated measures was conducted. Significant effects were followed by post hoc tests across time. We also conducted post hoc analysis across domains to better understand the comparative levels of the domains at each time point. The responses to the open-ended prompt were thematically analyzed. Results: Analysis of the quantitative data revealed that (1) learners reported significant growth across time, (2) learners had different perceptions of their competencies in each of the domains, and (3) not all domains experienced similar changes over time. Analysis of the free responses highlighted the impact of coursework on competency attainment and the behavioristic change in learners. Conclusions: This study presents a strategic evaluation tool for course-based CBE programs that follow a traditional credit hour model. Programmatic evaluation of CBE programs should enable the inclusion of the learner's voice and provide evaluation data that go beyond individual course evaluations.
UR - http://www.scopus.com/inward/record.url?scp=85159756270&partnerID=8YFLogxK
U2 - 10.1093/milmed/usac353
DO - 10.1093/milmed/usac353
M3 - Article
C2 - 37201499
AN - SCOPUS:85159756270
SN - 0026-4075
VL - 188
SP - 69
EP - 74
JO - Military Medicine
JF - Military Medicine
ER -