
ENHANCING THE TRAINING OF ANESTHESIOLOGY RESIDENTS TO MANAGE CRITICAL OB EVENTS: A PILOT STUDY UTILIZING SIMULATION AND INNOVATIVE ASSESSMENT TECHNIQUES

Abstract Number: S-21
Abstract Type: Original Research

Bryan P Mahoney MD1; Andrew Miller MD2; Lawrence C Tsen MD3; May Pian-Smith MD4; Meredith Albrecht MD5; Priscilla G Harrell MD6

Introduction: Simulation is widely accepted as an educational modality for anesthesiology residents and is now an ACGME requirement (1,2). As with all emerging educational techniques, meaningful metrics to assess the outcomes of an intervention are critical (3). This prospective, randomized, blinded pilot study uses innovative performance assessments to compare residents who have learned about OB critical events (CEs) via immersive simulation (SIM) vs didactic lectures (DL).

Methods: 24 CA-1 residents are being recruited from 2 hospitals during the 2011-12 academic year. Each receives either SIM training or a DL on the management of amniotic fluid embolus (AFE) and high spinal (HS). Pre-intervention assessments include written-boards-style multiple choice questions (MCQ) on general OB anesthesia topics, AFE, and HS. Subjects also complete a self-assessment of abilities and attitudes (SA). Post-intervention assessments include the MCQ, SA, an oral-boards-style exam (OBE) scored by 2 independent blinded faculty examiners, and actual ABA In-Training Examination (ITE; OB-specific) performance. Appropriate statistical analysis will be applied after data collection is completed from all 24 subjects.

Results: Data collection is ongoing, with completion expected in April 2012. The sample size does not yet allow statistical comparison, but preliminary data from the first 12 subjects suggest (1) strong trainee interest in receiving additional training in CEs (4.5/5); (2) prior to intervention, similar mean baseline MCQ scores on general OB anesthesia knowledge (57.5% vs 56.7%), but a mismatch in baseline knowledge of AFE (SIM 30% vs DL 54%) and HS (SIM 40% vs DL 60%), which may reflect differences in institutional curricular experiences; (3) following intervention, mean correct MCQ scores of SIM 52% vs DL 70% on AFE and SIM 84% vs DL 70% on HS; (4) no discernible difference between the two groups in OBE performance, but lower post-intervention SA confidence in the diagnosis and management of all OB CEs in the SIM group than in the DL group; and (5) greater post-intervention motivation to learn more about the subjects in the SIM than the DL group (4.6/5 vs 4/5).

Discussion: Simulation is a potentially powerful educational tool, but its effectiveness requires rigorous study. We have developed a multi-modal assessment system to provide detailed, granular information about trainee competence in the management of CEs. This assessment system is partly modeled after tools currently utilized by the ABA for assessing competence (MCQ and OBE). SA surveys provide subjective resident impressions, while ITE data may provide objective, third-party feedback. The true utility and results of our innovative assessment system will be borne out with a larger sample size and with analysis of pre- to post-intervention (delta) effects on individual trainees. Ultimately, such assessments can help measure the effectiveness of any educational intervention while informing trainee growth and curricular development.

References: (1) Gaba et al; (2) DeAnda et al; (3) Goodwin et al

SOAP 2012