To Jeffrey Barsuk, MD, FACP, FHM, the concept of simulation-based mastery learning is simple to the point of genius. Give a hospitalist—or any other physician—a physical task and let them practice the procedure until they master it. Don't fall into the decades-old mindset that repetition alone guarantees competence; instead, test competence with a rigorous assessment schedule that objectively determines whether the skill is truly mastered.
It’s standard operating procedure in many technical fields, such as engineering, computer programming, and aviation. For example, professional pilots undergo countless hours of simulation flying to refresh and further hone the skills they need to succeed in an airplane cockpit. But with procedures as delicate as central venous catheter (CVC) insertions, the typical practice most young physicians get is the trial and error of needle passes.
More training, Dr. Barsuk argues, would make everyone involved better off—from the resident nervously placing a line to the patient who wants the procedure completed as quickly and painlessly as possible.
“It’s very common sense,” says Dr. Barsuk, assistant professor of medicine in the division of hospital medicine at Northwestern University’s Feinberg School of Medicine in Chicago. “But no one is doing this. People don’t know that simulators are so effective. At least in the medical profession, we’re probably behind the times in it. … We’re enthusiastic about it because we believe in it so much. We want to see how far it can go. With mastery learning, the sky’s the limit. You can simulate almost anything you want.”
Dr. Barsuk and his colleagues have worked hard to translate “common sense” into empirical literature. Accordingly, the team will publish their latest work, “Use of Simulation-Based Mastery Learning to Improve the Quality of Central Venous Catheter Placement in a Medical Intensive Care Unit,” in September’s Journal of Hospital Medicine. The single-institution cohort study found simulation-based mastery learning increased internal-medicine residents’ skills in simulated CVC insertions, decreased the number of needle passes when performing actual procedures, and increased resident self-confidence.
“It’s always been assumed that experience in and of itself is a proxy for competence,” says William McGaghie, PhD, professor of medical education and preventive medicine at Feinberg, as well as director of evaluation for Northwestern University’s Clinical and Translational Science Institute (NUCATS). “We now know that is not the case … if skill acquisition is the goal. Experience on the wards isn’t enough. We have to have deliberate education interventions to practice in controlled, safe environments.”
—Jeffrey Barsuk, MD, FACP, FHM, Feinberg School of Medicine, Northwestern University, Chicago
The Northwestern study put that theory to the test. It aimed to expand mastery learning to a new skill set and assess quality indicators (number of needle passes, arterial punctures, etc.) and resident confidence before and after training modules. The team studied 41 second- and third-year residents rotating through the medical intensive-care unit (MICU) from October 2006 to February 2007. The university’s Institutional Review Board approved the study, and all of the participants gave informed consent prior to participating. Thirteen of the residents rotated through during a six-week pre-intervention phase, serving as the “traditionally-trained group,” the authors wrote. Twenty-eight residents were trained on Simulab’s CentralLineMan, a model with “ultrasound compatibility, an arterial pulse, and self-sealing veins and skin. Needles, dilators and guidewires can be inserted and realistic venous and arterial pressures demonstrated,” the authors wrote.
The residents who were trained for internal jugular (IJ) and subclavian (SC) CVC insertions received two two-hour education sessions consisting of a lecture, ultrasound training, deliberate practice, and feedback. A 27-item checklist was drafted to measure outcomes; all pre- and post-tests were graded by a single unblinded instructor for consistency. According to the study:
- None of the residents met the minimum passing score (MPS) of 79.1% for CVC insertion at baseline: mean IJ=48.4%, standard deviation=23.1; mean SC=45.2%, standard deviation=26.3;
- All residents met or exceeded the MPS at testing after simulation training: mean IJ=94.8%, standard deviation=10.0; mean SC=91.1%, standard deviation=17.8 (P<0.001);
- In the MICU, simulator-trained residents required fewer needle passes to insert a CVC than traditionally trained residents: mean=1.79, standard deviation=1.0 vs. mean=2.78, standard deviation=1.77 (P=0.04);
- Simulator-trained residents displayed more self-confidence about their procedural skills: mean=81, standard deviation=11 vs. mean=68, standard deviation=20 (P=0.02).
Dr. Barsuk isn’t surprised that confidence increases with training, saying “they hammer this home.” There were several categories for which the authors found no major improvement, though, even with the addition of deliberate training and standardized didactic materials.
Notably, the authors wrote, the resident groups “did not differ in pneumothorax, arterial puncture, or mean number of CVC adjustments.” Some of the lack of disparity was attributed to the small sample size.
In interviews, the authors noted that additional study would help assess such clinical outcomes as reduced CVC-related infections after simulation-based training. Still, Dr. Barsuk says, this pilot report is an important first step to win over skeptics.
“Simulation-based training and deliberate practice in a mastery learning setting improves performance of both simulated and actual CVC insertions by internal medicine residents,” the study reads. “Procedural training remains an important component of internal medicine training although internists are performing fewer invasive procedures now than in years past. Use of a mastery model of CVC insertion requires that trainees demonstrate skill in a simulated environment before independently performing this invasive procedure on patients.”
Another advantage of the training, McGaghie says, is that it helps physicians track their own improvement. He cautions against administrators using the data for punitive purposes, lest the testing become unpopular and less useful to quality improvement programs.
“You don’t use these evaluations as a weapon; you use them as a tool,” McGaghie says. “No one is there to beat up the doctors; no one is there to make them look foolish. The whole idea is to be as rigorous as possible to look for improvement—constant improvement.” TH
Richard Quinn is a freelance writer based in New Jersey.