TY - JOUR
T1 - Effect of access to an electronic medical resource on performance characteristics of a certification examination: a randomized controlled trial
AU - Lipner, Rebecca S.
AU - Brossman, Bradley G.
AU - Samonte, Kelli M.
AU - Durning, Steven J.
N1 - Funding Information:
Financial Support: Funds for incentives were provided by the ABIM Foundation. The ABIM provided support for the administrative and technical work and statistical analysis by allowing staff time to be used for this purpose.
Funding Information:
Disclosures: The ABIM is pursuing use of external resources on its high-stakes examination as part of routine business for 2018 and beyond. UpToDate (which supplied the external resource for this study) is a potential vendor for that future work. Drs. Lipner, Brossman, and Samonte report grants from the ABIM Foundation during the conduct of the study and are employed by the ABIM. Dr. Durning was a research consultant for the ABIM during the conduct of the study. Disclosures can also be viewed at www.acponline.org/authors/icmje/ConflictOfInterestForms.do?msNum=M16-2843.
PY - 2017/9/5
Y1 - 2017/9/5
AB - Background: Electronic resources are increasingly used in medical practice. Their use during high-stakes certification examinations has been advocated by many experts, but whether doing so would affect the capacity to differentiate between high and low abilities is unknown. Objective: To determine the effect of electronic resources on examination performance characteristics. Design: Randomized controlled trial. Setting: Medical certification program. Participants: 825 physicians initially certified by the American Board of Internal Medicine (ABIM) who passed the Internal Medicine Certification examination or sat for the Internal Medicine Maintenance of Certification (IM-MOC) examination in 2012 to 2015. Intervention: Participants were randomly assigned to 1 of 4 conditions: closed book using typical or additional time, or open book (that is, UpToDate [Wolters Kluwer]) using typical or additional time. All participants took the same modified version of the IM-MOC examination. Measurements: Primary outcomes included item difficulty (how easy or difficult the question was), item discrimination (how well the question differentiated between high and low abilities), and average question response time. Secondary outcomes included examination dimensionality (that is, the number of factors measured) and test-taking strategy. Item response theory was used to calculate question characteristics. Analysis of variance compared differences among conditions. Results: Closed-book conditions took significantly less time than open-book conditions (mean, 79.2 seconds [95% CI, 78.5 to 79.9 seconds] vs. 110.3 seconds [CI, 109.2 to 111.4 seconds] per question). Mean discrimination was statistically significantly higher for open-book conditions (0.34 [CI, 0.32 to 0.35] vs. 0.39 [CI, 0.37 to 0.41] per question). A strong single dimension showed that the examination measured the same factor with or without the resource. Limitation: Only 1 electronic resource was evaluated. Conclusion: Inclusion of an electronic resource with time constraints did not adversely affect test performance and did not change the specific skill or factor targeted by the examination. Further study on the effect of resource inclusion on other examinations is warranted.
UR - http://www.scopus.com/inward/record.url?scp=85028811738&partnerID=8YFLogxK
U2 - 10.7326/M16-2843
DO - 10.7326/M16-2843
M3 - Article
C2 - 28806791
AN - SCOPUS:85028811738
SN - 0003-4819
VL - 167
SP - 302
EP - 310
JO - Annals of Internal Medicine
JF - Annals of Internal Medicine
IS - 5
ER -