Dual processing theory and expertsʼ reasoning: exploring thinking on national multiple-choice questions

Steven J. Durning*, Ting Dong, Anthony R. Artino, Cees van der Vleuten, Eric Holmboe, Lambert Schuwirth

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


Abstract

Background: An ongoing debate exists in the medical education literature regarding the potential benefits of pattern recognition (non-analytic reasoning), actively comparing and contrasting diagnostic options (analytic reasoning), or a combination of the two. Studies have not, however, explicitly explored faculty's thought processes while tackling clinical problems through the lens of dual process theory to inform this debate. Further, these thought processes have not been studied in relation to the difficulty of the task or other potential mediating influences, such as personal factors and fatigue, which could in turn be influenced by factors such as sleep deprivation. We therefore sought to determine which reasoning process(es) were used when answering clinically oriented multiple-choice questions (MCQs) and whether these processes differed with respect to the dual process theory characteristics of accuracy, reading time, and answering time, as well as psychometrically determined item difficulty and sleep deprivation.

Methods: We used a think-aloud procedure to explore faculty's thought processes while they answered these MCQs, coding the think-aloud data by reasoning process (analytic, non-analytic, guessing, or a combination of processes) as well as word count, number of stated concepts, reading time, answering time, and accuracy. We also included questions about the amount of work participants had done in the recent past. We then conducted statistical analyses to examine the associations between these measures, such as correlations between the frequencies of the reasoning processes and item accuracy and difficulty. We also examined the total frequencies of the different reasoning processes for correctly and incorrectly answered items.

Results: Regardless of whether questions were classified as 'hard' or 'easy', non-analytic reasoning led to the correct answer more often than to an incorrect answer. Significant correlations were found between the self-reported number of hours recently worked and both think-aloud word count and the number of concepts used in reasoning, but not item accuracy. When all MCQs were included, 19% of the variance in correctness could be explained by the frequency of expression of the three think-aloud processes (analytic, non-analytic, or combined).

Discussion: We found evidence to support the notion that the difficulty of a test item is not a fixed feature of the item itself but rather the result of an interaction between the item and the candidate. Use of analytic reasoning did not appear to improve accuracy. Our data suggest that individuals do not apply either System 1 or System 2 exclusively but instead fall along a continuum, with some individuals at one end of the spectrum.
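The abstract describes relating coded think-aloud process frequencies to item accuracy via correlations and explained variance. The sketch below is purely illustrative of how such an analysis could be set up; the column names and data are hypothetical and do not come from the study, and the article's actual statistical procedures may differ.

```python
# Illustrative sketch only: relates think-aloud process frequencies to answer
# correctness, in the spirit of the analyses described in the abstract.
# All column names and values are hypothetical, not the study's dataset.
import pandas as pd
import statsmodels.api as sm

# Hypothetical item-level data: per MCQ, how often each reasoning process
# appeared in the think-aloud transcripts, plus the proportion answered correctly.
data = pd.DataFrame({
    "analytic_freq":      [2, 0, 1, 3, 1, 0],
    "nonanalytic_freq":   [1, 3, 2, 0, 2, 3],
    "combined_freq":      [0, 1, 1, 1, 0, 1],
    "proportion_correct": [0.4, 0.9, 0.7, 0.3, 0.6, 0.8],
})

# Simple correlations between process frequencies and item accuracy.
print(data.corr()["proportion_correct"])

# Linear regression of accuracy on the three process frequencies; R-squared
# plays the role of "variance in correctness explained by process frequencies".
X = sm.add_constant(data[["analytic_freq", "nonanalytic_freq", "combined_freq"]])
model = sm.OLS(data["proportion_correct"], X).fit()
print(model.rsquared)
```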

Original language: English
Pages (from-to): 168-175
Number of pages: 8
Journal: Perspectives on Medical Education
Volume: 4
Issue number: 4
DOIs
State: Published - 1 Aug 2015
Externally published: Yes

Keywords

  • assessment
  • clinical reasoning
  • dual-process theory

