TY - JOUR
T1 - Evaluating Intersite Consistency Across 11 Geographically Distinct Pediatric Clerkship Training Sites
T2 - Providing Assurance That Educational Comparability Is Possible
AU - Judd, Courtney A.
AU - Dong, Ting
AU - Foster, Christopher
AU - Durning, Steven J.
AU - Hickey, Patrick W.
N1 - Publisher Copyright:
© 2023 Published by Oxford University Press on behalf of the Association of Military Surgeons of the United States. This work is written by (a) US Government employee(s) and is in the public domain in the US.
PY - 2023/5/1
Y1 - 2023/5/1
AB - Introduction: We compared core pediatric clerkship student assessments across 11 geographically distinct learning environments following a major curriculum change. We sought to determine whether intersite consistency existed, which can serve as a marker of program evaluation success. Methods: We evaluated students' overall pediatric clerkship performance along with the individual assessments that target our clerkship learning objectives. Using data from the graduating classes of 2015 to 2019 (N = 859), we conducted analysis of covariance and multivariate logistic regression analyses to investigate whether performance varied across training sites. Results: Of the students, 833 (97%) were included in the study. The majority of training sites did not differ significantly from one another. After controlling for the Medical College Admission Test total score and the average pre-clerkship National Board of Medical Examiners final examination score, clerkship site explained only an additional 3% of the variance in the clerkship final grade. Conclusions: Over the 5-year period following a curriculum overhaul to an 18-month, integrated-module pre-clerkship curriculum, student pediatric clerkship performance in clinical knowledge and skills did not differ significantly across 11 varied geographic teaching sites when controlling for students' pre-clerkship achievement. Specialty-specific curriculum resources, faculty development tools, and assessment of learning objectives may provide a framework for maintaining intersite consistency in the face of an expanding network of teaching facilities and faculty.
UR - http://www.scopus.com/inward/record.url?scp=85159768873&partnerID=8YFLogxK
DO - 10.1093/milmed/usad044
M3 - Article
C2 - 37201493
AN - SCOPUS:85159768873
SN - 0026-4075
VL - 188
SP - 81
EP - 86
JO - Military Medicine
JF - Military Medicine
ER -