TY - JOUR
T1 - Head-camera video recordings of trauma core competency procedures can evaluate surgical residents' technical performance as well as colocated evaluators
AU - The Retention and Assessment of Surgical Performance (RASP) Group
AU - Mackenzie, Colin F.
AU - Pasley, Jason
AU - Garofalo, Evan
AU - Shackelford, Stacy
AU - Chen, Hegang
AU - Longinaker, Nyaradzo
AU - Granite, Guinevere
AU - Pugh, Kristy
AU - Hagegeorge, George
AU - Tisherman, Samuel A.
N1 - Funding Information:
The authors declare no conflicts of interest. This research and development project was conducted by the University of Maryland School of Medicine and was made possible by a cooperative agreement awarded and administered by the US Army Medical Research & Materiel Command (USAMRMC) and the Telemedicine & Advanced Technology Research Center (TATRC) at Fort Detrick, MD, under Grant Number: W81XWH-13-2-0028.
Publisher Copyright:
Copyright © 2017 Wolters Kluwer Health, Inc. All rights reserved.
PY - 2017
Y1 - 2017
N2 - BACKGROUND: Unbiased evaluation of trauma core competency procedures is necessary to determine if residency and predeployment training courses are useful. We tested whether a previously validated individual procedure score (IPS) for individual procedure vascular exposure and fasciotomy (FAS) performance skills could discriminate training status by comparing IPS of evaluators colocated with surgeons to blind video evaluations. METHODS: Performance of axillary artery (AA), brachial artery (BA), and femoral artery (FA) vascular exposures and lower extremity FAS on fresh cadavers by 40 PGY-2 to PGY-6 residents was video-recorded from head-mounted cameras. Two colocated trained evaluators assessed IPS before and after training. One surgeon in each pretraining tertile of IPS for each procedure was randomly identified for blind video review. The same 12 surgeons were video-recorded repeating the procedures less than 4 weeks after training. Five evaluators independently reviewed all 96 randomly arranged deidentified videos. Inter-rater reliability/consistency, intraclass correlation coefficients, and errors were compared for colocated versus video review of IPS. Study methodology and bias were judged by Medical Education Research Study Quality Instrument and the Quality Assessment of Diagnostic Accuracy Studies criteria. RESULTS: There were no differences (p ≥ 0.5) in IPS for AA, FA, and FAS, whether evaluators were colocated or reviewed video recordings. Evaluator consistency was 0.29 (BA) to 0.77 (FA). Video and colocated evaluators were in total agreement (p = 1.0) for error recognition. Intraclass correlation coefficient was 0.73 to 0.92, dependent on procedure. Correlations of video versus colocated evaluations were 0.5 to 0.9. Except for BA, blinded video evaluators discriminated (p < 0.002) whether procedures were performed before training versus after training. Study methodology by Medical Education Research Study Quality Instrument criteria scored 15.5/19; Quality Assessment of Diagnostic Accuracy Studies 2 showed low bias risk. CONCLUSION: Video evaluations of AA, FA, and FAS procedures with IPS are unbiased, valid, and have potential for formative assessments of competency.
AB - BACKGROUND: Unbiased evaluation of trauma core competency procedures is necessary to determine if residency and predeployment training courses are useful. We tested whether a previously validated individual procedure score (IPS) for individual procedure vascular exposure and fasciotomy (FAS) performance skills could discriminate training status by comparing IPS of evaluators colocated with surgeons to blind video evaluations. METHODS: Performance of axillary artery (AA), brachial artery (BA), and femoral artery (FA) vascular exposures and lower extremity FAS on fresh cadavers by 40 PGY-2 to PGY-6 residents was video-recorded from head-mounted cameras. Two colocated trained evaluators assessed IPS before and after training. One surgeon in each pretraining tertile of IPS for each procedure was randomly identified for blind video review. The same 12 surgeons were video-recorded repeating the procedures less than 4 weeks after training. Five evaluators independently reviewed all 96 randomly arranged deidentified videos. Inter-rater reliability/consistency, intraclass correlation coefficients, and errors were compared for colocated versus video review of IPS. Study methodology and bias were judged by Medical Education Research Study Quality Instrument and the Quality Assessment of Diagnostic Accuracy Studies criteria. RESULTS: There were no differences (p ≥ 0.5) in IPS for AA, FA, and FAS, whether evaluators were colocated or reviewed video recordings. Evaluator consistency was 0.29 (BA) to 0.77 (FA). Video and colocated evaluators were in total agreement (p = 1.0) for error recognition. Intraclass correlation coefficient was 0.73 to 0.92, dependent on procedure. Correlations of video versus colocated evaluations were 0.5 to 0.9. Except for BA, blinded video evaluators discriminated (p < 0.002) whether procedures were performed before training versus after training. Study methodology by Medical Education Research Study Quality Instrument criteria scored 15.5/19; Quality Assessment of Diagnostic Accuracy Studies 2 showed low bias risk. CONCLUSION: Video evaluations of AA, FA, and FAS procedures with IPS are unbiased, valid, and have potential for formative assessments of competency.
KW - MERSQI and QUADAS-2 Criteria
KW - Surgeon's performance
KW - blind video review
KW - intraclass correlation
KW - trauma core competency procedures
UR - http://www.scopus.com/inward/record.url?scp=85017145670&partnerID=8YFLogxK
U2 - 10.1097/TA.0000000000001467
DO - 10.1097/TA.0000000000001467
M3 - Article
C2 - 28376020
AN - SCOPUS:85017145670
SN - 2163-0755
VL - 83
SP - S124
EP - S129
JO - Journal of Trauma and Acute Care Surgery
JF - Journal of Trauma and Acute Care Surgery
IS - 1
ER -