Page 60 - JSOM Winter 2021
FIGURE 1 Average scores on knowledge exam. Average knowledge exam scores increased from pre-course (56% ± 6.8%) to post-course (80% ± 5.0%, p < .001).

The medics' manual skills improved from baseline to day 4 for image finding (mean baseline score: 1.89, mean day 4 score: 3.93, p = .007), image optimization (mean baseline score: 1.67, mean day 4 score: 3.93, p = .008), image acquisition speed (mean baseline score: 1.78, mean day 4 score: 4, p = .008), final image quality (mean baseline score: 1.56, mean day 4 score: 3.85, p = .008), and global assessment (mean baseline score: 1.74, mean day 4 score: 3.93, p = .008) (Figure 2). Participants' average comfort level with ultrasound applications was 2.63 ± 0.744 before the course and 3.89 ± 0.601 after the course.

FIGURE 2 Average ratings of manual skills. Average ratings for manual skills of the medics improved from baseline to day 4 for image finding (mean baseline score: 1.89, mean day 4 score: 3.93, p = .007), image optimization (mean baseline score: 1.67, mean day 4 score: 3.93, p = .008), image acquisition speed (mean baseline score: 1.78, mean day 4 score: 4, p = .008), final image quality (mean baseline score: 1.56, mean day 4 score: 3.85, p = .008), and global assessment (mean baseline score: 1.74, mean day 4 score: 3.93, p = .008). Ratings were based on a 4-point Likert scale (1 = Needing Attention [Novice], 4 = Almost Expert).

Workflow Understanding
The mean scores of the medics at the OSCE stations were as follows (Table 3). The average score at every station was > 91%.

TABLE 3 Performance of Medics on OSCE* Stations

Station               Maximum Score   Medics' Mean Score ± SD
Regional anesthesia    8              7.3 ± 0.866
Lung ultrasound        8              7.8 ± 0.441
RUSH                  11              10.7 ± 0.707
Vascular access        5              4.7 ± 0.707
TTE                   10              9.8 ± 0.441

*OSCE = Objective Structured Clinical Examination; SD = standard deviation; RUSH = Rapid Ultrasound for Shock and Hypotension; TTE = transthoracic echocardiography.

Discussion
We demonstrated the feasibility of successfully implementing an ultrasound course tailored for military medics that taught multiple ultrasound applications through multimodal educational methods. Additionally, we showed that a short but intensive course can develop ultrasound knowledge, manual skills, and workflow understanding in nonphysician providers such as military medics with varying levels of prior ultrasound experience. Our specific multimodal framework allowed for assessment of knowledge and manual skills at various times. The military medics improved both their knowledge base and hands-on skills, as evidenced by their knowledge exam scores and their scores on the global rating scale, respectively. After the course, all participants achieved knowledge exam scores exceeding 70%, the threshold previously used as a passing score for an ultrasound knowledge exam for anesthesiology trainees.14 Our customized OSCE at the end of the course allowed evaluators to assess the medics' understanding of the workflow for ultrasound applications. Moreover, participants reported feeling more comfortable using ultrasound at the end of the course than at the start.

Earlier studies that involved ultrasound training for military medics are limited to teaching a few specific applications in a short span of time, or they lack a comprehensive assessment to evaluate the efficacy of the course in imparting knowledge, manual skills, and workflow understanding in ultrasound applications.2,6–11,15–18 Our course is unique for its robust training time of approximately 40 hours, incorporating a flipped classroom model that allows more time for numerous repetitions and live feedback. Broad training in multiple ultrasound applications was conducted via multimodal teaching tools in an organized fashion. The online modules, live lectures, and substantial case-based discussions led by experts expanded the medics' knowledge base, while repetitive deliberate practice with personalized feedback allowed them to solidify their workflow understanding and improve their manual skills. Our curriculum also included open lab time to address areas of individual interest or deficiency, which further improved participants' ultrasound proficiency. A comprehensive assessment with individual components for knowledge base, manual skills, and workflow understanding was carried out to confirm the validity and efficacy of the course.

Although our curriculum is time-intensive and appropriately assessed ultrasound knowledge, manual skills, and workflow understanding, there remains a question of skill decay weeks to months after the course ended. This can be evaluated with follow-up knowledge exams, assessment of manual skills on simulators, and OSCEs. The addition of spaced learning, with appropriate interval refresher training exercises, is desirable. There is also potential to incorporate into this curriculum more ultrasound applications that may be helpful for military medics. These include bone fracture detection, assessment
58 | JSOM Volume 21, Edition 4 / Winter 2021

