TY - JOUR
T1 - Comprehensive Multicenter Graduate Surgical Education Initiative Incorporating Entrustable Professional Activities, Continuous Quality Improvement Cycles, and a Web-Based Platform to Enhance Teaching and Learning
AU - Michigan State University Guided Operative Assessment and Learning Consortium
AU - Anderson, Cheryl I.
AU - Basson, Marc D.
AU - Ali, Muhammad
AU - Davis, Alan T.
AU - Osmer, Robert L.
AU - McLeod, Michael K.
AU - Haan, Pam S.
AU - Molnar, Robert G.
AU - Peshkepija, Andi N.
AU - Hardaway, John C.
AU - Chojnacki, Karen A.
AU - Pfeifer, Christopher C.
AU - Gauvin, Jeffrey M.
AU - Jones, Mark W.
AU - Mansour, M. Ashraf
AU - Soleimani, Tahereh
AU - Nebeker, Cody A.
AU - Gupta, Rama N.
AU - Yeo, Charles J.
AU - Pozzessere, Anthony S.
AU - Langdon, Sarah
AU - Loseth, Caitlin
AU - Gelbard, Rondi B.
AU - Delman, Keith A.
AU - Martin, Denny R.
AU - Narkiewicz, Lawrence
AU - Obi, Shawn H.
AU - Smith, Daniel E.
AU - Czajka, Meghan L.
AU - Deppen, Jeffrey G.
AU - Ferguson, Troy M.
AU - Hanses, Suzanne M.
AU - St Hilaire, Nicholas J.
AU - Deere, Matthew J.
AU - Henning, Werner H.
AU - Welle, Nickolas J.
AU - Collins, Jason T.
AU - Toomey, Ariel E.
AU - Henderson, Jessica A.
AU - Chen, Anthony J.
AU - Harris, Justin R.
AU - Jelenek, Leandra A.
AU - Olson, Michelle M.
N1 - Publisher Copyright:
© 2018 American College of Surgeons
PY - 2018/7
Y1 - 2018/7
N2 - Background: It is increasingly important for faculty to teach deliberately and provide timely, detailed, and formative feedback on surgical trainee performance. We initiated a multicenter study to improve resident evaluative processes and enhance teaching and learning behaviors while engaging residents in their education. Study Design: Faculty from 7 US postgraduate training programs rated resident operative performances using the perioperative briefing, intraoperative teaching, debriefing model, and rated patient visits/academic performances using the entrustable professional activities model via a web-based platform. Data were centrally analyzed and iterative changes made based on participant feedback, individual preferences, and database refinements, with trends addressed using the Plan, Do, Check, Act improvement methodology. Results: Participants (92 surgeons, 150 residents) submitted 3,880 assessments during July 2014 through September 2017. Evidence of preoperative briefings improved from 33.9% ± 2.5% to 95.5% ± 1.5% between April and September 2014 compared with April and September 2017 (p < 0.001). Postoperative debriefings improved from 10.6% ± 2.7% to 90.2% ± 2.5% (p < 0.001) for the same period. Meaningful self-reflection by residents improved from 28.6% to 67.4% (p < 0.001). The number of assessments received per resident during a 6-month period increased from 6.4 ± 6.2 to 13.4 ± 10.1 (p < 0.003). Surgeon-entered assessments increased from 364 initially to 685 in the final period, and the number of resident assessments increased from 308 to 445. We showed a 4-fold increase in resident observed activities being rated. Conclusions: By adopting recognized educational models with repeated Plan, Do, Check, Act cycles, we increased the quality of preoperative learning objectives, showed more frequent, detailed, and timely assessments of resident performance, and demonstrated more effective self-reflection by residents. We monitored trends, identified opportunities for improvement, and successfully sustained those improvements over time, applying a team-based approach.
AB - Background: It is increasingly important for faculty to teach deliberately and provide timely, detailed, and formative feedback on surgical trainee performance. We initiated a multicenter study to improve resident evaluative processes and enhance teaching and learning behaviors while engaging residents in their education. Study Design: Faculty from 7 US postgraduate training programs rated resident operative performances using the perioperative briefing, intraoperative teaching, debriefing model, and rated patient visits/academic performances using the entrustable professional activities model via a web-based platform. Data were centrally analyzed and iterative changes made based on participant feedback, individual preferences, and database refinements, with trends addressed using the Plan, Do, Check, Act improvement methodology. Results: Participants (92 surgeons, 150 residents) submitted 3,880 assessments during July 2014 through September 2017. Evidence of preoperative briefings improved from 33.9% ± 2.5% to 95.5% ± 1.5% between April and September 2014 compared with April and September 2017 (p < 0.001). Postoperative debriefings improved from 10.6% ± 2.7% to 90.2% ± 2.5% (p < 0.001) for the same period. Meaningful self-reflection by residents improved from 28.6% to 67.4% (p < 0.001). The number of assessments received per resident during a 6-month period increased from 6.4 ± 6.2 to 13.4 ± 10.1 (p < 0.003). Surgeon-entered assessments increased from 364 initially to 685 in the final period, and the number of resident assessments increased from 308 to 445. We showed a 4-fold increase in resident observed activities being rated. Conclusions: By adopting recognized educational models with repeated Plan, Do, Check, Act cycles, we increased the quality of preoperative learning objectives, showed more frequent, detailed, and timely assessments of resident performance, and demonstrated more effective self-reflection by residents. We monitored trends, identified opportunities for improvement, and successfully sustained those improvements over time, applying a team-based approach.
UR - http://www.scopus.com/inward/record.url?scp=85046677961&partnerID=8YFLogxK
U2 - 10.1016/j.jamcollsurg.2018.02.014
DO - 10.1016/j.jamcollsurg.2018.02.014
M3 - Article
C2 - 29551697
AN - SCOPUS:85046677961
SN - 1072-7515
VL - 227
SP - 64
EP - 76
JO - Journal of the American College of Surgeons
JF - Journal of the American College of Surgeons
IS - 1
ER -