The feasibility, reliability, and validity of a program director's (supervisor's) evaluation form for medical school graduates

Steven J. Durning*, Louis N. Pangaro, Linda L. Lawrence, Donna Waechter, John McManigle, Jeffrey L. Jackson

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

39 Scopus citations

Abstract

Purpose: To determine the feasibility, reliability, and validity of the supervisor's evaluation form for first-year residents as an outcome measure for programmatic evaluation.

Method: Prospective feedback has been sought from supervisors of Uniformed Services University of the Health Sciences (USUHS) graduates during their internship year. Supervisors are sent yearly evaluation forms, with up to three additional mailings. Using a six-point scale, supervisors rate residents on 18 items. The authors used evaluation data from 1993 to 2002. Feasibility was estimated by the response rate. Internal consistency was assessed by calculating Cronbach's alpha and by analyzing scores on a year-to-year and interrater basis. Validity was assessed by exploratory factor analysis with oblique rotations, by comparing ratings with end-of-medical-school GPA and United States Medical Licensing Examination (USMLE) Step 1 and Step 2 scores (Pearson correlations), and by analyzing the range of scores, including the percentage of scores below an acceptable level.

Results: A total of 1,247 evaluations were collected for the 1,559 USUHS graduates (80%). Cronbach's alpha was .96, with no significant difference in scores by supervisor specialty or year. Factor analysis found that the evaluation form collapsed into two domains accounting for 68% of the variance: professionalism and expertise. End-of-medical-school GPA and USMLE Step 1 and 2 scores correlated with expertise but not with professionalism. Mean scores across items were 3.5 to 4.31, with a median of 4.0 for all items (SD .80-1.21). Four percent of graduates received less-than-satisfactory ratings.

Conclusions: This evaluation form has high feasibility and internal consistency. Factor analysis revealed two complementary domains, supporting its validity. Correlation with end-of-medical-school measurements and analysis of the range of scores further support the form's validity.
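The Method section names Cronbach's alpha as the internal-consistency statistic but, being an abstract, does not show the computation. The sketch below is a minimal illustration only, not the authors' analysis; the simulated 18-item, six-point ratings matrix and the function name cronbach_alpha are assumptions added for demonstration.

```python
import numpy as np

def cronbach_alpha(ratings: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents x n_items) ratings matrix."""
    item_vars = ratings.var(axis=0, ddof=1)      # variance of each item
    total_var = ratings.sum(axis=1).var(ddof=1)  # variance of the summed scale scores
    k = ratings.shape[1]
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical example: 100 supervisors rating residents on 18 six-point items
rng = np.random.default_rng(0)
base = rng.integers(3, 7, size=(100, 1))          # shared level per resident
noise = rng.integers(-1, 2, size=(100, 18))       # item-level variation
ratings = np.clip(base + noise, 1, 6)
print(round(cronbach_alpha(ratings), 2))
```

With correlated items such as these, alpha approaches the high value (.96) reported in the Results; uncorrelated items would drive it toward zero.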

Original language: English
Pages (from-to): 964-968
Number of pages: 5
Journal: Academic Medicine
Volume: 80
Issue number: 10
DOIs
State: Published - Oct 2005
Externally published: Yes
