Background: Objective Structured Clinical Examinations (OSCEs) are used at the majority of U.S. medical schools. Given the high resource demands of constructing and administering OSCEs, understanding how OSCEs relate to typical performance measures in medical school could help educators design curricula and evaluations that more effectively optimize student instruction and assessment. Purpose: To investigate the correlation between second-year and third-year OSCE scores, as well as the associations between OSCE scores and several other typical measures of students' medical school performance. Methods: We tracked the performance of a 5-year cohort (classes of 2007-2011). We studied the univariate correlations among OSCE scores, U.S. Medical Licensing Examination (USMLE) scores, and medical school grade point average. We also examined whether OSCE scores explained additional variance in the USMLE Step 2 Clinical Knowledge score beyond that explained by the Step 1 score. Results: The second- and third-year OSCE scores were weakly correlated. Neither the second- nor the third-year OSCE score was strongly correlated with USMLE scores or medical school grade point average. Conclusion: Our findings suggest that OSCEs capture a perspective on performance that differs from typical assessment measures, which largely rely on multiple-choice questions; these results also support tenets of situated cognition theory.