Novel examination for evaluating medical student clinical reasoning: Reliability and association with patients seen

Research output: Contribution to journal › Article › peer-review

Background: Medical students learn clinical reasoning, in part, through patient care. Although the number of patients seen is associated with knowledge examination scores, studies have not demonstrated an association between patient problems and an assessment of clinical reasoning.

Aim: To examine the reliability of a clinical reasoning examination and to investigate whether there was an association between internal medicine core clerkship students' performance on this examination and the number of patients they saw with matching problems during their internal medicine clerkship.

Methods: Students on the core internal medicine clerkship at the Uniformed Services University log 11 core patient problems based on the Clerkship Directors in Internal Medicine curriculum. On a final clerkship examination (Multistep), students watch a scripted video encounter between physician and patient actors that assesses three sequential steps in clinical reasoning: in Step One, students focus on the history and physical examination; in Step Two, students write a problem list after viewing additional clinical findings; in Step Three, students complete a prioritized differential diagnosis and treatment plan. Each Multistep examination has three different cases. For the graduating classes of 2010-2012 (n = 497), we matched the number of patients seen with the problem most represented by the Multistep cases (epigastric pain, generalized edema, monoarticular arthritis, angina, syncope, pleuritic chest pain). We report two-way Pearson correlations between the number of patients students reported with similar problems and each student's percent score on Step One, Step Two, Step Three, and the Overall Test.

Results: Multistep reliability: Step One, 0.6 to 0.8; Step Two, 0.41 to 0.65; Step Three, 0.53 to 0.78; overall examination (3 cases), 0.74 to 0.83. For three problems, the number of patients seen had small to modest correlations with the Multistep Examination of Analytic Ability total score (r = 0.27 for pleuritic pain, p < 0.05, n = 81 patients; r = 0.14 for epigastric pain, p < 0.05, n = 324 patients; r = 0.19 for generalized edema, p < 0.05, n = 118 patients).

Discussion/Conclusion: Although the examination was reliable, student performance on it was only weakly associated with the number of patients seen with similar problems. This may reflect the difficulty of transferring knowledge between clinical and examination settings, the complexity of clinical reasoning, or the limits of the reliability of the patient logs and the Multistep.

Original language: English
Pages (from-to): 79-87
Number of pages: 9
Journal: Military Medicine
Issue number: 4
State: Published - Apr 2015
Externally published: Yes

