Simulation modules allow for the safe practice of certain techniques and are becoming increasingly important in the shift toward education for integrated vascular residents. There is an unquestionable need to standardize the evaluation of trainees on these simulation models to ensure their impact and effectiveness. We sought to validate such an assessment tool for a basic open vascular technique.

Vascular fellows, integrated vascular residents, and general surgery residents attending the Society for Clinical Vascular Surgery meeting, the Introduction to Academic Vascular Surgery course, and the Methodist Boot Camp in 2012 were asked to participate in an assessment using multiple anastomotic models, with 20 minutes to complete an end-to-side anastomosis. Trained vascular faculty evaluated subjects using an assessment tool comprising a 25-point checklist and an overall global rating scale (GRS) of 8 parameters, each graded on a 5-point Likert scale. Twenty trainees also performed self-assessment using the GRS. Reliability and construct validity were evaluated.

Ninety-two trainees were assessed. There was excellent agreement between assessors on 21 of the 25 checklist items; 2 items were found not to be relevant for the bench-top model. Graders agreed that the checklist was prohibitively cumbersome to use. Scores on the global assessments correlated with experience and were higher for senior trainees, with median global summary scores increasing by postgraduate year. Reliability was confirmed through interrater correlation and internal consistency. Internal consistency was 0.92 for the GRS. Correlation between grades given by the expert observers and trainee self-assessments was poor, but correlation between scores assigned by faculty was good.
Assessment of appropriate hemostasis was poor, which likely reflects the difficulty of evaluating this parameter in the current inanimate model.

Performance on an open simulation model, evaluated by a standardized global rating scale, correlated with trainee experience level. This initial work confirms the ease and applicability of the grading tool among multiple expert observers and across different platforms, and supports additional research into whether this performance translates into the operating room.
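As an illustrative aside, the internal-consistency figure reported for the GRS (0.92) is the kind of statistic commonly computed as Cronbach's alpha across the scale's items. The sketch below shows one standard way to compute it for 5-point Likert ratings on 8 parameters; the function and all ratings are hypothetical and are not the paper's actual data or analysis.

```python
# Illustrative only: Cronbach's alpha for hypothetical 5-point Likert
# ratings on 8 GRS-style items. These data are invented for the example.
from statistics import pvariance

def cronbach_alpha(scores):
    """scores: one row per trainee, one column per scale item."""
    items = list(zip(*scores))   # transpose: one tuple of ratings per item
    k = len(items)               # number of items on the scale (8 here)
    item_var_sum = sum(pvariance(item) for item in items)
    total_var = pvariance([sum(row) for row in scores])
    # Standard formula: alpha = k/(k-1) * (1 - sum of item variances
    # divided by variance of the total score).
    return k / (k - 1) * (1 - item_var_sum / total_var)

# Hypothetical ratings: each row is one trainee's 8 item scores.
ratings = [
    [4, 5, 4, 4, 5, 4, 5, 4],
    [2, 2, 3, 2, 2, 3, 2, 2],
    [3, 4, 3, 3, 4, 3, 3, 4],
    [5, 5, 5, 4, 5, 5, 4, 5],
    [1, 2, 1, 2, 1, 2, 1, 1],
]
print(round(cronbach_alpha(ratings), 2))
```

Values near 1 indicate that the items move together; a value like the 0.92 reported for the GRS is conventionally read as high internal consistency.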
DOI: 10.1016/j.avsg.2013.07.005
Web of Science ID: 000328646400016
PubMed ID: 24189012