Background: Electronic resources are increasingly used in medical practice. Their use during high-stakes certification examinations has been advocated by many experts, but whether it would affect the examination's capacity to differentiate between high and low abilities is unknown.

Objective: To determine the effect of electronic resources on examination performance characteristics.

Design: Randomized controlled trial.

Setting: Medical certification program.

Participants: 825 physicians initially certified by the American Board of Internal Medicine (ABIM) who passed the Internal Medicine Certification examination or sat for the Internal Medicine Maintenance of Certification (IM-MOC) examination between 2012 and 2015.

Intervention: Participants were randomly assigned to 1 of 4 conditions: closed book with typical or additional time, or open book (that is, UpToDate [Wolters Kluwer]) with typical or additional time. All participants took the same modified version of the IM-MOC examination.

Measurements: Primary outcomes were item difficulty (how easy or difficult the question was), item discrimination (how well the question differentiated between high and low abilities), and average question response time. Secondary outcomes were examination dimensionality (that is, the number of factors measured) and test-taking strategy. Item response theory was used to calculate question characteristics; analysis of variance compared differences among conditions.

Results: Closed-book conditions required significantly less time per question than open-book conditions (mean, 79.2 seconds [95% CI, 78.5 to 79.9 seconds] vs. 110.3 seconds [CI, 109.2 to 111.4 seconds]). Mean item discrimination was significantly higher for open-book conditions (closed book, 0.34 [CI, 0.32 to 0.35]; open book, 0.39 [CI, 0.37 to 0.41]). A strong single dimension showed that the examination measured the same factor with or without the resource.

Limitation: Only 1 electronic resource was evaluated.
Conclusion: Inclusion of an electronic resource under time constraints did not adversely affect test performance and did not change the specific skill or factor targeted by the examination. Further study of the effect of resource inclusion on other examinations is warranted.