Background: Using a previously developed postgraduate year (PGY)-1 program director’s evaluation survey, we developed a parallel form to assess more senior residents (PGY-3). The PGY-3 survey, which aligns with the core competencies established by the Accreditation Council for Graduate Medical Education, also includes items that reflect our institution’s military-unique context. Purpose: To collect feasibility, reliability, and validity evidence for the new PGY-3 evaluation. Methods: We collected PGY-3 data from program directors who oversee the education of military residents. The study cohort consisted of Uniformed Services University of the Health Sciences students graduating in 2008, 2009, and 2010. We performed exploratory factor analysis (EFA) to examine the internal structure of the survey and subjected each factor identified in the EFA to an internal consistency reliability analysis. We then performed correlation analyses to examine the relationships between PGY-3 ratings and several outcomes: PGY-1 ratings, cumulative medical school grade point average (GPA), and performance on the United States Medical Licensing Examination (USMLE) Step 1, Step 2 Clinical Knowledge, and Step 3. Results: Of the 510 surveys we distributed, 388 (76%) were returned. Results from the EFA suggested four factors: “Medical Expertise,” “Professionalism,” “Military-unique Practice,” and “Systems-based Practice.” Scores on these four factors showed good internal consistency reliability, as measured by Cronbach’s α (α ranged from 0.92 to 0.98). Further, as expected, “Medical Expertise” and “Professionalism” had small to moderate correlations with cumulative medical school GPA and performance on the USMLE Step examinations. Conclusions: The new program director’s evaluation survey instrument developed in this study appears feasible, and its scores show reasonable evidence of reliability and validity in a sample of third-year residents.