Published in Vol 7, No 1 (2021): Jan-Mar

Preprints (earlier versions) of this paper are available at https://preprints.jmir.org/preprint/17277.
Evaluating the Instructional Design and Effect on Knowledge, Teamwork, and Skills of Technology-Enhanced Simulation-Based Training in Obstetrics in Uganda: Stepped-Wedge Cluster Randomized Trial


Original Paper

1Department of Obstetrics and Gynecology, Máxima Medical Center, Veldhoven, Netherlands

2Department of Obstetrics and Gynecology, Mulago Hospital, Makerere University College of Health Sciences, Kampala, Uganda

3Department of Electrical Engineering, Eindhoven University of Technology, Eindhoven, Netherlands

4Department of Biomedical Engineering, Eindhoven University of Technology, Eindhoven, Netherlands

Corresponding Author:

Anne Antonia Cornelia van Tetering, MD

Department of Obstetrics and Gynecology

Máxima Medical Center

De Run 4600

Veldhoven, 5500 MB

Netherlands

Phone: 31 614800853

Fax: 31 40 888 62 89

Email: anne_van_tetering@hotmail.com


Background: Simulation-based training is a common strategy for improving the quality of facility-based maternity services and is often evaluated using Kirkpatrick’s theoretical model. The results on the Kirkpatrick levels are closely related to the quality of the instructional design of a training program. The instructional design is generally defined as the “set of prescriptions for teaching methods to improve the quality of instruction with a goal of optimizing learning outcomes.”

Objective: The aim of this study is to evaluate the instructional design of a technology-enhanced simulation-based training in obstetrics, the reaction of participants, and the effect on knowledge, teamwork, and skills in a low-income country.

Methods: A stepped-wedge cluster randomized trial was performed in a university hospital in Kampala, Uganda, with an annual delivery volume of over 31,000. In November 2014, a medical simulation center was installed with a full-body birthing simulator (Noelle S550, Gaumard Scientific), an interactive neonate (Simon S102 Newborn CPR Simulator, Gaumard Scientific), and an audio and video recording system. Twelve local obstetricians were trained and certified as medical simulation trainers. From 2014 to 2016, training was provided to 57 residents in groups of 6 to 9 students. Descriptive statistics were calculated for ten instructional design features of the training course measured by the 42-item ID-SIM (Instructional Design of a Simulation Improved by Monitoring). The Wilcoxon signed rank test was conducted to investigate the differences in scores on knowledge, the Clinical Teamwork Scale, and medical technical skills.

Results: The mean scores on the ten instructional design features ranged from 54.9 (95% CI 48.5-61.3) to 84.3 (95% CI 80.9-87.6) out of 100. The highest mean score was given on the feature feedback and the lowest scores on repetitive practice and controlled environment. The overall score for the training day was 92.8 out of 100 (95% CI 89.5-96.1). Knowledge improved significantly, with a test score of 63.4% (95% CI 60.7-66.1) before and 78.9% (95% CI 76.8-81.1) after the training (P<.001). The overall score on the 10-point Clinical Teamwork Scale was 6.0 (95% CI 4.4-7.6) before and 5.9 (95% CI 4.5-7.2) after the training (P=.78). Medical technical skills were scored at 55.5% (95% CI 47.2-63.8) before and 65.6% (95% CI 56.5-74.7) after training (P=.08).

Conclusions: Most instructional design features of a technology-enhanced simulation-based training in obstetrics in a low-income country were scored high, although confidence intervals were wide. The overall score for the training day was high, and knowledge improved after the training program, but no changes in teamwork or (most) medical technical skills were found. The lowest-scored instructional design features may be improved to achieve further learning aims.

Trial Registration: ISRCTN Registry ISRCTN98617255; http://www.isrctn.com/ISRCTN98617255

International Registered Report Identifier (IRRID): RR2-10.1186/s12884-020-03050-3

JMIR Med Educ 2021;7(1):e17277

doi:10.2196/17277

Keywords



Maternity Care

The improvement of maternal and newborn care is a global priority. The United Nations established the Millennium Development Goals and the Sustainable Development Goals, both of which include the aim of reducing maternal and neonatal mortality [1]. Targets for 2030 are to reduce the global maternal mortality ratio to less than 70 per 100,000 live births and to reduce neonatal mortality to at least as low as 12 per 1000 live births [1]. In Uganda, the maternal mortality ratio was still 343 per 100,000 live births in 2015, and the neonatal mortality rate was 20.2 per 1000 live births in 2017 [2,3]. Shortage of trained staff, poor management of emergency obstetric care provision, poor referral practices, and poor coordination among staff are barriers that hinder or delay access to emergency obstetric services [4]. Simulation-based medical team training may have a positive effect on these barriers.

Simulation-Based Training

Simulation-based training in low-income and middle-income countries usually focuses on improving capacity and providing safe clinical skills to directly reduce maternal and neonatal mortality and morbidity [5]. A review in 2010 about training programs in low-resource environments aimed at improving emergency obstetric care concluded that training programs may improve quality of care, but strong evidence was lacking [6]. Since this review, there have been numerous evaluation studies on the effectiveness of simulation training for obstetric emergencies in low-income and middle-income countries [7-40]. The results of these studies show that obstetric simulation training is associated with improvements in clinical outcomes, mostly neonatal outcomes [7,11,16,18,24,26,28,33,36,38,40]. A later review included 23 studies about the impact of multiprofessional emergency obstetric and neonatal care training in high-income, middle-income, and low-income countries [5]. The conclusion of this review was that this type of training does make a difference [5]. Progress was found not only with regard to individual knowledge, skills, and attitudes, but also with regard to longer-term change in behavior and improvements in maternal and neonatal morbidity and mortality [5]. Sufficient evidence exists to justify the expense and effort of such training [5]. Draycott et al agreed with this, but also mentioned that not all training is clinically effective and results are not entirely consistent [41]. Further research on the evaluation of different training programs is necessary to understand why some training programs improve clinical outcomes, and others show no improvements or even deterioration in outcomes.

Evaluating Simulation-Based Training

Most evaluation studies on simulation-based training in low-income and middle-income countries used Kirkpatrick’s theoretical model. This model is composed of four levels: reaction, learning, behavior, and results. Each successive level of the model represents a more precise measure of the effectiveness of a training program. The results on these Kirkpatrick levels are closely related to the quality of the instructional design of a training program [42]. The instructional design is generally defined as the “set of prescriptions for teaching methods to improve the quality of instruction with a goal of optimizing learning outcomes” [43]. Another name for these prescriptions is affordances with the purpose of maximizing the effect, effectiveness, and usefulness of an educational instrument [44]. The instructional design of the training program may influence the outcomes on the Kirkpatrick levels [45]. Therefore, if the learning aim is not met, this may have to do with an inappropriate design.

A review on postgraduate medical e-learning recommended not only to evaluate the outcomes of an educational intervention, but to start with evaluation of its design [45]. For simulation-based medical education, Issenberg et al and McGaghie et al have described essential instructional design features [42,46]. These include feedback, repetitive practice, ranging difficulty levels, defined outcomes, individualized learning, curriculum integration, multiple learning strategies, clinical variation, controlled environment, and simulator validity [42,46]. These features were integrated by Fransen et al in the ID-SIM (Instructional Design of a Simulation Improved by Monitoring), an evidence-based assessment tool that can be used to aid development and evaluation of the instructional design of a simulation-based team training [47].

Training for Life

A technology-enhanced simulation-based training in emergency obstetrics was developed in Mulago Hospital in Kampala, Uganda (Training for Life). The training focused on both medical technical skills and teamwork. To evaluate the training program, we conducted a stepped-wedge cluster randomized trial. In this paper, we present the results of the evaluation of the instructional design of this training program, the reaction of participants, and the effect on knowledge, teamwork, and medical technical skills (Kirkpatrick levels 1 and 2).


Recruitment

Between October 2014 and April 2016, a stepped-wedge cluster randomized trial was conducted to implement technology-enhanced simulation-based team training in obstetrics. This educational intervention took place at the Makerere University College of Health Sciences, situated in Mulago Hospital in Kampala, Uganda. In November 2014, a medical simulation center was installed with a full-body birthing simulator (Noelle S550, Gaumard Scientific), an interactive neonate (S102 Simon Newborn CPR Simulator, Gaumard Scientific), and an audio and video recording system. Mulago Hospital is a national referral hospital in Kampala with an annual delivery volume of approximately 31,000. Over 23,000 women deliver at a medium-to-high-risk ward, and the staff of this ward consists of 45 gynecologists, 60 residents (first-year, second-year, and third-year senior house officers [SHOs]), and 45 midwives. To be included in the study, SHOs had to work at the medium-to-high-risk maternity ward of Mulago Hospital. As this study was set up as a stepped-wedge cluster randomized trial, clusters of SHOs started in a control period. Therefore, recruitment was done before the official opening of the simulation center and the train-the-trainers course. Seven clusters of first-year, second-year, and third-year SHOs were randomly created by a scheduler. To evaluate clinical outcomes, the SHOs had to work in the hospital in these fixed clusters during the study period.

Training for Life used a train-the-trainer model in which training was cascaded down from master trainers to local facilitators to learners. The group of master trainers consisted of two Dutch obstetricians, one communication expert, and one simulation specialist. They were all certified simulation educators. Twelve local senior obstetricians completed a four-day training program and were certified as facilitators. Course materials were developed in cooperation with staff members in Mulago Hospital and Medsim, a medical simulation center in Eindhoven, the Netherlands. All materials were provided in English.

After the train-the-trainers course, training was cascaded down to the SHOs. Each training session was given by two recently certified local facilitators to one of the 7 clusters, each consisting of 6 to 9 SHOs from different study years. The training comprised a one-day (8-hour) simulation-based acute obstetric training focusing on medical technical skills and teamwork/crew resource management (eg, closed-loop communication, leadership, speaking up). The two facilitators focused alternately on medical technical skills or crew resource management. Scenarios included postpartum hemorrhage, eclampsia, ventouse delivery followed by resuscitation of the newborn, breech delivery, and a repetition of postpartum hemorrhage with a different etiological mechanism. Every scenario was briefly introduced by the medical facilitator, and after each scenario, a debriefing with review of the video recordings was provided with feedback on medical technical skills and crew resource management. All scenarios were performed once, according to a fixed script with realistic clinical progress. At least three SHOs could participate actively in each scenario. After the main training, at least one half-day repetition training session was organized for each group.

As this study was set up as a stepped-wedge cluster randomized trial, all 7 clusters of SHOs started in the control condition. Then, all clusters received the training at consecutive time points, scheduled 7 weeks apart. The order of the switch per cluster was randomized by a computer. Eventually, all clusters switched from the control to the training condition.

Instructional Design

This study evaluates the instructional design of the training and the effect of the training on Kirkpatrick levels 1 and 2. The instructional design was measured using the ID-SIM [47]. This questionnaire is an assessment tool specifically designed for the evaluation of the instructional design of a simulation-based team training [47]. It consists of 42 statements that are answered by placing a mark on a line from "not at all/never" to "completely/always". The statements are grouped into ten instructional design features: feedback, repetitive practice, curriculum integration, difficulty range, learning strategies, clinical variation, controlled environment, defined outcomes, individualized learning, and simulation fidelity.

Kirkpatrick Levels 1 and 2

Kirkpatrick level 1 was measured by asking all participants to give an overall score for the training day by placing a mark on a line. Suggestions for improvement could be made in an open remark at the end of the evaluation questionnaire. Level 2, the effect on knowledge of the participants, was measured by a knowledge test consisting of 30 multiple-choice questions on medical technical skills and teamwork at the beginning and end of the main training (Multimedia Appendix 1). To obtain content validity, a team of Dutch and Ugandan obstetricians developed and evaluated the multiple-choice questions. Construct validity was tested by asking obstetricians and first-year, second-year, and third-year SHOs to complete the knowledge test. A Cronbach α coefficient was calculated to measure the internal consistency of the knowledge test.
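The internal consistency statistic mentioned above can be reproduced with a short calculation. The sketch below is a minimal plain-Python implementation of the standard Cronbach α formula, α = k/(k−1) × (1 − Σ item variances / variance of total scores); the item scores are entirely hypothetical and are not the study's data.

```python
# Hedged sketch: computing Cronbach alpha for a multiple-choice test.
# The item-response matrix below is invented for illustration only.

def cronbach_alpha(items):
    """items: one list per test item, each holding the examinees' scores."""
    k = len(items)            # number of items
    n = len(items[0])         # number of examinees

    def var(xs):              # sample variance (n - 1 denominator)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_variances = sum(var(item) for item in items)
    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - item_variances / var(totals))

# Hypothetical scores of 5 examinees on 4 dichotomous items (1 = correct)
items = [
    [1, 1, 0, 1, 0],
    [1, 0, 0, 1, 1],
    [1, 1, 1, 1, 0],
    [0, 1, 0, 1, 0],
]
alpha = cronbach_alpha(items)
```

With a real 30-item test and 50+ examinees, the same function applies unchanged; values around .67, as reported here, indicate moderate internal consistency.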

The effect on technical skills and teamwork was evaluated by assessing the video-recorded scenarios. Three independent researchers assessed the first and last scenario for medical technical skills and teamwork together until consensus was reached. The topic of both scenarios was postpartum hemorrhage; however, the etiology differed. The assessors were blinded for the day of training and whether the scenario was the first or the last of the day. The assessment consisted of the Clinical Teamwork Scale (CTS) and a checklist of medical technical procedures. The CTS is a validated tool for assessing teamwork [48]. It consists of 15 items about communication, situational awareness, decision-making, and role responsibility, and each can be scored on a 10-point scale. The checklist of medical technical procedures is based on local protocols for postpartum hemorrhage, and it consists of 24 items that can be either scored as “done,” “not done,” or “not applicable.”
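As a minimal illustration of how a "done / not done / not applicable" checklist yields the percentage scores reported in the Results, the sketch below scores only the applicable items. The item labels are hypothetical and are not the actual 24-item local protocol.

```python
# Hedged sketch: scoring a medical technical skills checklist as the
# percentage of applicable items performed. Item labels are invented.

def checklist_score(items):
    """items: dict mapping checklist item -> 'done', 'not done', or 'n/a'.
    Returns the percentage of applicable items done, or None if no item
    was applicable (as for the tamponade item in this study)."""
    applicable = [v for v in items.values() if v != "n/a"]
    if not applicable:
        return None
    return 100 * sum(v == "done" for v in applicable) / len(applicable)

# One hypothetical scenario assessment: 2 of 3 applicable items done
scenario = {
    "ask for help": "done",
    "massage uterus": "done",
    "provide drugs": "not done",
    "tamponade uterus": "n/a",
}
score = checklist_score(scenario)
```

Excluding not-applicable items from the denominator is what allows a stopped scenario to still produce a comparable percentage.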

Statistical Analysis

This paper reports secondary outcome results. A sample size calculation was performed based on the primary outcome of the study (the combined mortality proportion including maternal and neonatal mortality ratios). For a stepped-wedge design, the sample size calculation for a standard randomized clinical trial is performed first [49-51]. To show a reduction in combined mortality proportion of 20% with an α of .05 and a power of 80%, a total of 6398 deliveries were needed for a standard randomized clinical trial design. The design effect was then calculated assuming an intracluster correlation of 0.05, 7 clusters, and a cluster size of 3343 deliveries per year, which resulted in 2367 deliveries per cluster period. This corresponded to a minimum duration of 5 weeks for each cluster period based on local delivery rates. For logistical reasons in staff scheduling, the duration of each step was set at 7 weeks. As exam and holiday periods were excluded from the cluster periods, the total duration of the study was anticipated to be 1.5 years. Data were analyzed using IBM SPSS Statistics, version 21 (IBM Corporation). Descriptive statistics were calculated for participant characteristics and for the results of the ID-SIM. The Wilcoxon signed rank test was conducted to investigate the difference in scores on the knowledge test, the CTS, and the medical technical skills assessment. The difference in scores on the knowledge test between the SHOs in their first, second, and third years of study was analyzed using the Kruskal-Wallis test. Statistical significance was accepted at a 2-sided P value of .05.
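The paired before/after comparisons rely on the Wilcoxon signed rank test. As a hedged sketch of what that test computes (using the large-sample normal approximation and invented paired scores, not the study data), the following self-contained implementation ranks the absolute score differences and sums the ranks of the positive differences:

```python
import math

def wilcoxon_signed_rank(before, after):
    """Two-sided Wilcoxon signed rank test via the normal approximation
    (a reasonable simplification at the sample sizes in this study)."""
    # Paired differences; zeros are dropped by convention
    diffs = [a - b for b, a in zip(before, after) if a != b]
    n = len(diffs)

    # Rank |differences|, averaging ranks across ties
    ordered = sorted(range(n), key=lambda i: abs(diffs[i]))
    ranks = [0.0] * n
    i = 0
    while i < n:
        j = i
        while j + 1 < n and abs(diffs[ordered[j + 1]]) == abs(diffs[ordered[i]]):
            j += 1
        avg = (i + j) / 2 + 1          # average of 1-based positions i..j
        for k in range(i, j + 1):
            ranks[ordered[k]] = avg
        i = j + 1

    w_plus = sum(r for d, r in zip(diffs, ranks) if d > 0)
    mu = n * (n + 1) / 4               # mean of W+ under H0
    sigma = math.sqrt(n * (n + 1) * (2 * n + 1) / 24)
    z = (w_plus - mu) / sigma
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return w_plus, p

# Hypothetical paired knowledge-test scores (% correct) for 8 SHOs
before = [55, 60, 62, 58, 70, 64, 59, 66]
after = [72, 75, 80, 70, 78, 79, 73, 81]
w, p = wilcoxon_signed_rank(before, after)
```

When every participant improves, as in this invented example, W+ equals the maximum rank sum n(n+1)/2 and the test rejects the null hypothesis of no median change.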

Ethical Permission

Ethical permission was obtained from both the Mulago Research and Ethics committee (Protocol MREC: 674) and the Uganda National Council for Science and Technology (number SS 3927). All participants gave written informed consent before the study began, and they acknowledged that they cannot be identified via the paper. Data were fully anonymized.


Learner Characteristics

From 2014 to 2016, 68 SHOs were invited to participate in the training program; 19 (28%) of them were female, and 49 (72%) were male. Of these, 57 SHOs (84%) participated in the main training, with an even distribution over the three years of their obstetric curriculum (20 first-year SHOs, 18 second-year SHOs, and 19 third-year SHOs). Of the 11 SHOs who did not participate in the main training, 3 completed their specialization, 1 discontinued the specialization, and 7 did not give any reason. Almost half of the SHOs (49%, 33/68) took part in at least one repetition training. The total number of trained SHOs exceeded the average number working at any one time because extra main training sessions were organized for departing SHOs and for new first-year SHOs who were added to an already trained cluster.

Instructional Design

All of the 57 SHOs who participated in the main training completed the ID-SIM. The mean scores of the ten instructional design features are shown in Table 1. Mean scores on the features ranged from 54.9 to 84.3 out of 100. The highest mean score of 84.3 (95% CI 80.9-87.6) was given on feedback. The lowest scores of 62.8 (95% CI 55.8-69.8) and 54.9 (95% CI 48.5-61.3) were given on repetitive practice and controlled environment, respectively.

Table 1. Mean scores of senior house officers on the ID-SIM.

Variable | ID-SIM score, mean (95% CI)
Feedback | 84.3 (80.9-87.6)
Repetitive practice | 62.8 (55.8-69.8)
Curriculum integration | 78.7 (74.5-82.9)
Difficulty range | 74.0 (68.5-79.4)
Learning strategies | 83.2 (78.9-87.4)
Clinical variation | 80.0 (74.9-85.1)
Controlled environment | 54.9 (48.5-61.3)
Individualized learning | 81.9 (76.9-86.9)
Defined outcomes | 74.2 (69.2-79.3)
Simulation fidelity | 80.3 (76.9-83.7)

Kirkpatrick Levels 1 and 2

The overall score for the training day rated by the participants was 92.8 out of 100 (95% CI 89.5-96.1). The following suggestions for improvement were made in the open remark at the end of the questionnaire: (1) to incorporate other members of the team, (2) to add other scenarios, (3) to have repetition training more often, (4) to plan more time for the debriefing, especially relating to a real-life setting, and (5) to provide the training materials a day earlier.

Of the 57 participating SHOs, a total of 53 (93%) completed the knowledge test before and after the main training. One SHO completed the knowledge test only after the training. Construct validity was tested using the Kruskal-Wallis test to compare knowledge test results of obstetricians and first-year, second-year, and third-year SHOs and showed a significant result (P=.03). A Cronbach α coefficient of .67 was calculated to measure the internal consistency of the knowledge test. Mean scores of the knowledge test are listed in Table 2. The mean score of the knowledge test increased from the beginning to the end of the training day. This result was also found for each study year separately. The improvement in score on the knowledge test between the three study years was not significantly different (P=.24).

Table 2. Mean scores of senior house officers on the knowledge test.

Year of study | Score before training, mean (95% CI) | Score after training, mean (95% CI) | P value
All | 63.4 (60.7-66.1) | 78.9 (76.8-81.1) | <.001
1st year | 62.3 (58.3-66.4) | 77.7 (72.5-82.8) | <.001
2nd year | 60.9 (56.1-65.7) | 78.9 (76.7-81.1) | <.001
3rd year | 68.1 (62.5-73.7) | 80.7 (77.2-84.1) | .001

To evaluate teamwork and medical technical skills, the recordings of the first and last scenarios of 8 teams were evaluated. Out of 16 recordings, 2 could not be assessed because of recording issues. No differences in scores on the CTS between the first and last sessions were found (Table 3). The scores of the technical skills assessment improved statistically significantly only for the provision of drugs (Table 3). During the first scenario, none of the teams reached the moment to tamponade the uterus. For 5 out of the 8 teams, the last scenario was stopped before they had to tamponade the uterus, hence this item was scored as not applicable. The scenarios were stopped by the local facilitators at the moment when they judged that the SHOs had generated sufficient learning points to discuss in the debriefing sessions.

Table 3. Mean scores of senior house officers in clusters on the Clinical Teamwork Scale and the medical technical skills assessment.

Item | First scenario score, mean (95% CI) | Fifth scenario score, mean (95% CI) | P value
Clinical Teamwork Scale | | |
Overall score | 6.0 (4.4-7.6) | 5.9 (4.5-7.2) | .78
Overall communication | 6.5 (5.5-7.6) | 6.0 (4.5-7.5) | .4
Overall situational awareness | 4.4 (2.8-6.0) | 5.4 (4.5-6.2) | .1
Overall decision making | 4.6 (3.4-5.7) | 6.0 (5.1-6.9) | .07
Overall responsibility | 6.6 (5.6-7.7) | 6.0 (5.3-6.8) | .59
Patient friendliness | 5.6 (4.1-7.1) | 6.0 (4.8-7.2) | .79
Medical technical skills | | |
Overall score | 55.5 (47.2-63.8) | 65.6 (56.5-74.7) | .08
Ask for help | 100 | 100 | >.99
Airway, breathing, circulation | 58.9 (45.9-72.0) | 54.6 (43.0-66.2) | .89
Establish cause | 50.0 (25.2-74.8) | 76.2 (41.9-110.5) | .34
Massage uterus | 57.1 (18.5-95.8) | 66.7 (31.1-102.3) | .59
Provision of drugs | 28.6 (12.6-44.5) | 56.0 (46.3-65.6) | .04
Shift to theatre | 85.7 (63.2-108.3) | 78.6 (53.9-103.3) | .56
Tamponade | N/A(a) | N/A | N/A

(a) N/A: not applicable.


Principal Results

In this article, we investigated the instructional design of a technology-enhanced simulation-based training in obstetrics, the reaction of participants, and the effect on knowledge, teamwork, and medical technical skills of SHOs. Most instructional design features were scored high, although confidence intervals were wide. The highest-rated instructional design feature was feedback, and the lowest-rated were repetitive practice and controlled environment. The overall rating of the SHOs for the training program was high, with a mean score of 92.8 out of 100. Knowledge did increase after the training program, but no changes in teamwork and (most) technical skills were found. Results of the ID-SIM yielded suggestions for improving the instructional design of the training program to achieve learning aims.

Strengths and Limitations

This study evaluates both the instructional design of a technology-enhanced simulation-based training in obstetrics and the effect on Kirkpatrick levels 1 and 2 in a low-income country at one of the biggest maternity wards in the world. The validated ID-SIM was used to evaluate the instructional design of the training program. A limitation of the study may be that the ID-SIM was scored by the SHOs, who may not have much expertise in evaluating an instructional design. However, Fransen et al mentioned that the ID-SIM may be helpful for less-experienced individuals who are challenged with the development or evaluation of a simulation-based team training course [47]. Nevertheless, validation of participants' ratings on the ID-SIM, instead of expert opinion, could be a topic for further research.

Another limitation of this study is the level of expertise and the composition of the training groups. SHOs of different study years were divided into groups with a different team leader in the first and last scenario of the day. This means that the level of knowledge, skills, and teamwork of the team leader could differ between sessions. Other limitations include the ratio of male to female participants (72% male) and missing data due to the dropout of 7 of the 68 SHOs without known reason, 4 SHOs who did not complete the knowledge test, and 2 video recordings lost to technical issues. Moreover, only 33 SHOs participated in at least one repetition training. Information on motivation and reasons for not participating in further training sessions should be included in further evaluation studies to optimize learning results.

Furthermore, it was hard to specifically define the level of knowledge, teamwork, and medical technical skills in advance. This may have resulted in learning objectives that were not challenging enough for all SHOs. Additionally, the item tamponade the uterus in the medical technical skills assessment could not be scored in the way it was originally planned, as most scenarios were stopped before the clusters reached the moment to practice this skill. Hence, evaluation on Kirkpatrick levels 3 and 4 will probably not show any effect for this training subject. Finally, the training teams consisted only of SHOs, as it was not feasible to create working schedules with fixed teams including midwives, interns, SHOs, obstetricians, anesthesiologists, and pediatricians. To measure the effects of the training program using a stepped-wedge cluster randomized trial in one hospital, fixed teams were necessary. As the SHOs are the first responders after the midwives in emergency care at the labor ward, we chose to focus on these care providers. However, we are aware that teamwork across professions is critical to provide safe obstetric care.
All of the previous studies that have reported improvements after training have implemented “in-house” training programs and have trained almost 100% of their staff [52]. These features seem to be two of the active components of effective training [52]. For future training, a multiprofessional training program is recommended.

Comparison With Prior Work

De Leeuw et al have identified and compared the outcomes and methods used to evaluate postgraduate medical e-learning, including simulation [45]. Of the theories, Kirkpatrick’s hierarchy was the most used method [45]. However, many other ways to carry out an evaluation were found, and it is probable that many ways to do so are correct [45]. A recommendation by De Leeuw et al was to evaluate not only the outcomes of an educational intervention but to start with the evaluation of its design [45]. Robust instructional design is required to achieve an effective training course. Moreover, to perform comparisons between simulation-based team training courses, Eppich et al recommended standardized reporting of these instructional designs [42,53]. Issenberg et al translated the literature into ten important design features [46]. Five out of these ten features corresponded to the educational theory of deliberate practice by Ericsson et al [54,55]. Cook et al confirmed the effectiveness of several of Issenberg’s instructional design features [46,56]. The features were incorporated into two guidelines for designing an effective simulation-based training by the Association for Medical Education in Europe [57,58]. Later, Fransen et al developed, based on previous findings, an evidence-based assessment tool for evaluation of the instructional design of a simulation-based team training: the ID-SIM [47]. Table 1 shows the instructional design features of the technology-enhanced simulation-based training in obstetrics evaluated in this study. The table identifies the weaknesses in the instructional design of this training: repetitive practice and controlled environment.

Repetitive Practice

There is increasing evidence of the beneficial effect of repetitive practice. Cook et al analyzed over 600 studies in a systematic review and meta-analysis and reported that the distribution of learning activities over more than one day was consistently associated with larger effect sizes [59]. Bluestone et al also described that repetitive, time-spaced education exposure resulted in better knowledge outcomes, better knowledge retention, and better clinical decisions compared with single interventions and live instruction [60]. Additionally, improvement in skills was demonstrated after various types of refresher courses [61-64]. A study from van de Ven et al reported that the beneficial effect of a one-day, simulation-based, multiprofessional obstetric team training seems to decline after 3 months [65]. Repetitive training sessions every 3 months are therefore recommended. However, in low-income and middle-income countries a conflict may arise, because finding adequate time and support for simulation-based training can be a challenge. Several studies describe challenges of pulling staff, both as learners and educators, out of their workplaces because of staff shortages or complex schedules [14,17,66,67]. In particular, longer courses have struggled with high on-site dropout rates because of night call schedules [67]. More research is necessary to determine the optimal training intervals in low-income and middle-income countries. The effects of training programs with different intervals between repetition sessions on the four Kirkpatrick levels, but also on participants' dropout rates and participants' and trainers' motivation, should be investigated in order to optimize this instructional design feature in low-income and middle-income countries.

Controlled Environment

The other lower-scored item on the ID-SIM was controlled environment. In a controlled clinical environment, learners can make, detect, and correct errors in patient care without adverse consequences. Moreover, instructors can focus on learners instead of patients. The low score in this study on this item may have to do with staff shortages and complex schedules. Training sessions were frequently interrupted by phone calls. Interference with clinical obligations may be a bigger issue in low-income and middle-income countries compared with high-income countries due to a shortage of personnel. Moreover, the educational system of Uganda differs from the system in high-income countries. In low-income and middle-income countries, health professionals may not be as familiar with simulation-based education as in high-income countries [68,69]. Moran et al even described the educators' lack of comfort with leading simulations as one of the key challenges in simulation-based training [69]. To increase the effectiveness of the training program, the controlled environment has to be improved.

Conclusions

Most instructional design features of a technology-enhanced simulation-based training in obstetrics in a low-income country were scored high, although confidence intervals were wide. The highest mean score was given on feedback, and the lowest scores on repetitive practice and controlled environment. The overall score for the training day was high, and knowledge improved after the training program, but no changes in teamwork or (most) medical technical skills were found. The lowest-scored instructional design features, controlled environment and repetitive practice, may be improved to achieve further learning aims. Future studies should also include evaluation of the instructional design of a training program in order to understand why some training programs are effective and others are not.

Conflicts of Interest

None declared.

Multimedia Appendix 1

Knowledge test.

DOCX File , 547 KB

Multimedia Appendix 2

CONSORT checklist.

DOC File , 220 KB

References

  1. United Nations. Sustainable development goals. 2016.   URL: http://www.un.org/sustainabledevelopment/health/ [accessed 2016-06-20] [WebCite Cache]
  2. World Health Organization. Maternal and reproductive health. 2017.   URL: http://www.who.int/gho/maternal_health/en/ [accessed 2012-01-20] [WebCite Cache]
  3. World Health Organization. Neonatal Mortality. 2019.   URL: http://apps.who.int/iris/bitstream/handle/10665/272596/9789241565585-eng.pdf?ua=1 [accessed 2018-10-28] [WebCite Cache]
  4. Geleto A, Chojenta C, Mussa A, Loxton D. Barriers to access and utilization of emergency obstetric care at health facilities in sub-Saharan Africa-a systematic review protocol. Syst Rev 2018 Apr 16;7(1):60 [FREE Full text] [CrossRef] [Medline]
  5. Bergh A, Baloyi S, Pattinson RC. What is the impact of multi-professional emergency obstetric and neonatal care training? Best Pract Res Clin Obstet Gynaecol 2015 Nov;29(8):1028-1043. [CrossRef] [Medline]
  6. van Lonkhuijzen L, Dijkman A, van Roosmalen J, Zeeman G, Scherpbier A. A systematic review of the effectiveness of training in emergency obstetric care in low-resource environments. BJOG 2010 Jun;117(7):777-787 [FREE Full text] [CrossRef] [Medline]
  7. Crofts JF, Mukuli T, Murove BT, Ngwenya S, Mhlanga S, Dube M, et al. Onsite training of doctors, midwives and nurses in obstetric emergencies, Zimbabwe. Bull World Health Organ 2015 May 01;93(5):347-351 [FREE Full text] [CrossRef] [Medline]
  8. Andreatta P, Gans-Larty F, Debpuur D, Ofosu A, Perosky J. Evaluation of simulation-based training on the ability of birth attendants to correctly perform bimanual compression as obstetric first aid. Int J Nurs Stud 2011 Oct;48(10):1275-1280. [CrossRef] [Medline]
  9. Egenberg S, Karlsen B, Massay D, Kimaro H, Bru LE. "No patient should die of PPH just for the lack of training!" Experiences from multi-professional simulation training on postpartum hemorrhage in northern Tanzania: a qualitative study. BMC Med Educ 2017 Jul 14;17(1):119 [FREE Full text] [CrossRef] [Medline]
  10. Bang A, Patel A, Bellad R, Gisore P, Goudar SS, Esamai F, et al. Helping Babies Breathe (HBB) training: What happens to knowledge and skills over time? BMC Pregnancy Childbirth 2016 Nov 22;16(1):364 [FREE Full text] [CrossRef] [Medline]
  11. Dumont A, Fournier P, Abrahamowicz M, Traoré M, Haddad S, Fraser WD, QUARITE research group. Quality of care, risk management, and technology in obstetrics to reduce hospital-based maternal mortality in Senegal and Mali (QUARITE): a cluster-randomised trial. Lancet 2013 Jul 13;382(9887):146-157. [CrossRef] [Medline]
  12. Mildenberger C, Ellis C, Lee K. Neonatal resuscitation training for midwives in Uganda: Strengthening skill and knowledge retention. Midwifery 2017 Jul;50:36-41. [CrossRef] [Medline]
  13. Dettinger JC, Kamau S, Calkins K, Cohen SR, Cranmer J, Kibore M, et al. Measuring movement towards improved emergency obstetric care in rural Kenya with implementation of the PRONTO simulation and team training program. Matern Child Nutr 2018 Feb;14 Suppl 1 [FREE Full text] [CrossRef] [Medline]
  14. Arlington L, Kairuki AK, Isangula KG, Meda RA, Thomas E, Temu A, et al. Implementation of "Helping Babies Breathe": A 3-Year Experience in Tanzania. Pediatrics 2017 May;139(5) [FREE Full text] [CrossRef] [Medline]
  15. Evans CL, Johnson P, Bazant E, Bhatnagar N, Zgambo J, Khamis AR. Competency-based training "Helping Mothers Survive: Bleeding after Birth" for providers from central and remote facilities in three countries. Int J Gynaecol Obstet 2014 Sep;126(3):286-290. [CrossRef] [Medline]
  16. Mduma ER, Ersdal H, Kvaloy JT, Svensen E, Mdoe P, Perlman J, et al. Using statistical process control methods to trace small changes in perinatal mortality after a training program in a low-resource setting. Int J Qual Health Care 2018 May 01;30(4):271-275. [CrossRef] [Medline]
  17. Ersdal HL, Vossius C, Bayo E, Mduma E, Perlman J, Lippert A, et al. A one-day "Helping Babies Breathe" course improves simulated performance but not clinical management of neonates. Resuscitation 2013 Oct;84(10):1422-1427. [CrossRef] [Medline]
  18. Rule ARL, Maina E, Cheruiyot D, Mueri P, Simmons JM, Kamath-Rayne BD. Using quality improvement to decrease birth asphyxia rates after 'Helping Babies Breathe' training in Kenya. Acta Paediatr 2017 Oct;106(10):1666-1673. [CrossRef] [Medline]
  19. Ramaswamy R, Iracane S, Srofenyoh E, Bryce F, Floyd L, Kallam B, et al. Transforming Maternal and Neonatal Outcomes in Tertiary Hospitals in Ghana: An Integrated Approach for Systems Change. J Obstet Gynaecol Can 2015 Oct;37(10):905-914. [CrossRef] [Medline]
  20. Ameh CA, Kerr R, Madaj B, Mdegela M, Kana T, Jones S, et al. Knowledge and Skills of Healthcare Providers in Sub-Saharan Africa and Asia before and after Competency-Based Training in Emergency Obstetric and Early Newborn Care. PLoS One 2016;11(12):e0167270 [FREE Full text] [CrossRef] [Medline]
  21. Mirkuzie AH, Sisay MM, Bedane MM. Standard basic emergency obstetric and neonatal care training in Addis Ababa; trainees reaction and knowledge acquisition. BMC Med Educ 2014 Sep 24;14:201 [FREE Full text] [CrossRef] [Medline]
  22. Arabi AME, Ibrahim SA, Ahmed SE, MacGinnea F, Hawkes G, Dempsey E, et al. Skills retention in Sudanese village midwives 1 year following Helping Babies Breathe training. Arch Dis Child 2016 May;101(5):439-442. [CrossRef] [Medline]
  23. Livingston P, Evans F, Nsereko E, Nyirigira G, Ruhato P, Sargeant J, et al. Safer obstetric anesthesia through education and mentorship: a model for knowledge translation in Rwanda. Can J Anaesth 2014 Nov;61(11):1028-1039. [CrossRef] [Medline]
  24. Eblovi D, Kelly P, Afua G, Agyapong S, Dante S, Pellerite M. Retention and use of newborn resuscitation skills following a series of helping babies breathe trainings for midwives in rural Ghana. Glob Health Action 2017;10(1):1387985 [FREE Full text] [CrossRef] [Medline]
  25. Pattinson RC, Bergh A, Makin J, Pillay Y, Moodley J, Madaj B, et al. Obstetrics knowledge and skills training as a catalyst for change. S Afr Med J 2018 Aug 28;108(9):748-755 [FREE Full text] [CrossRef] [Medline]
  26. Msemo G, Massawe A, Mmbando D, Rusibamayila N, Manji K, Kidanto HL, et al. Newborn mortality and fresh stillbirth rates in Tanzania after helping babies breathe training. Pediatrics 2013 Feb;131(2):e353-e360. [CrossRef] [Medline]
  27. Mistry SC, Lin R, Mumphansha H, Kettley LC, Pearson JA, Akrimi S, et al. Newborn Resuscitation Skills in Health Care Providers at a Zambian Tertiary Center, and Comparison to World Health Organization Standards. Anesth Analg 2018 Jul;127(1):217-223. [CrossRef] [Medline]
  28. Willcox M, Harrison H, Asiedu A, Nelson A, Gomez P, LeFevre A. Incremental cost and cost-effectiveness of low-dose, high-frequency training in basic emergency obstetric and newborn care as compared to status quo: part of a cluster-randomized training intervention evaluation in Ghana. Global Health 2017 Dec 06;13(1):88-44 [FREE Full text] [CrossRef] [Medline]
  29. Ameh C, Adegoke A, Hofman J, Ismail FM, Ahmed FM, van den Broek N. The impact of emergency obstetric care training in Somaliland, Somalia. Int J Gynaecol Obstet 2012 Jun;117(3):283-287. [CrossRef] [Medline]
  30. Skelton T, Nshimyumuremyi I, Mukwesi C, Whynot S, Zolpys L, Livingston P. Low-Cost Simulation to Teach Anesthetists' Non-Technical Skills in Rwanda. Anesth Analg 2016 Aug;123(2):474-480. [CrossRef] [Medline]
  31. Nelissen E, Ersdal H, Ostergaard D, Mduma E, Broerse J, Evjen-Olsen B, et al. Helping mothers survive bleeding after birth: an evaluation of simulation-based training in a low-resource setting. Acta Obstet Gynecol Scand 2014 Mar;93(3):287-295 [FREE Full text] [CrossRef] [Medline]
  32. Nelissen E, Ersdal H, Mduma E, Evjen-Olsen B, Broerse J, van Roosmalen J, et al. Helping Mothers Survive Bleeding After Birth: retention of knowledge, skills, and confidence nine months after obstetric simulation-based training. BMC Pregnancy Childbirth 2015 Aug 25;15:190 [FREE Full text] [CrossRef] [Medline]
  33. Mduma E, Ersdal H, Svensen E, Kidanto H, Auestad B, Perlman J. Frequent brief on-site simulation training and reduction in 24-h neonatal mortality--an educational intervention study. Resuscitation 2015 Aug;93:1-7. [CrossRef] [Medline]
  34. Zongo A, Dumont A, Fournier P, Traore M, Kouanda S, Sondo B. Effect of maternal death reviews and training on maternal mortality among cesarean delivery: post-hoc analysis of a cluster-randomized controlled trial. Eur J Obstet Gynecol Reprod Biol 2015 Feb;185:174-180. [CrossRef] [Medline]
  35. Chaudhury S, Arlington L, Brenan S, Kairuki AK, Meda AR, Isangula KG, et al. Cost analysis of large-scale implementation of the 'Helping Babies Breathe' newborn resuscitation-training program in Tanzania. BMC Health Serv Res 2016 Dec 01;16(1):681 [FREE Full text] [CrossRef] [Medline]
  36. Nelissen E, Ersdal H, Mduma E, Evjen-Olsen B, Twisk J, Broerse J, et al. Clinical performance and patient outcome after simulation-based training in prevention and management of postpartum haemorrhage: an educational intervention study in a low-resource setting. BMC Pregnancy Childbirth 2017 Sep 11;17(1):301 [FREE Full text] [CrossRef] [Medline]
  37. Reynolds A, Zaky A, Moreira-Barros J, Bernardes J. Building a Maternal and Newborn Care Training Programme for Health-Care Professionals in Guinea-Bissau. Acta Med Port 2017 Oct 31;30(10):734-741 [FREE Full text] [CrossRef] [Medline]
  38. Gomez PP, Nelson AR, Asiedu A, Addo E, Agbodza D, Allen C, et al. Accelerating newborn survival in Ghana through a low-dose, high-frequency health worker training approach: a cluster randomized trial. BMC Pregnancy Childbirth 2018 Mar 22;18(1):72 [FREE Full text] [CrossRef] [Medline]
  39. Umar LW, Ahmad HR, Isah A, Idris HW, Hassan L, Abdullahi FL, et al. Evaluation of the cognitive effect of newborn resuscitation training on health-care workers in selected states in Northern Nigeria. Ann Afr Med 2018;17(1):33-39 [FREE Full text] [CrossRef] [Medline]
  40. Sorensen BL, Rasch V, Massawe S, Nyakina J, Elsass P, Nielsen BB. Impact of ALSO training on the management of prolonged labor and neonatal care at Kagera Regional Hospital, Tanzania. Int J Gynaecol Obstet 2010 Oct;111(1):8-12. [CrossRef] [Medline]
  41. Draycott TJ, Collins KJ, Crofts JF, Siassakos D, Winter C, Weiner CP, et al. Myths and realities of training in obstetric emergencies. Best Pract Res Clin Obstet Gynaecol 2015 Nov;29(8):1067-1076. [CrossRef] [Medline]
  42. McGaghie WC, Issenberg SB, Petrusa ER, Scalese RJ. A critical review of simulation-based medical education research: 2003-2009. Med Educ 2010 Jan;44(1):50-63. [CrossRef] [Medline]
  43. Fraser KL, Ayres P, Sweller J. Cognitive Load Theory for the Design of Medical Simulations. Simul Healthc 2015 Oct;10(5):295-307. [CrossRef] [Medline]
  44. Khalil MK, Elkhider IA. Applying learning theories and instructional design models for effective instruction. Adv Physiol Educ 2016 Jun;40(2):147-156 [FREE Full text] [CrossRef] [Medline]
  45. de Leeuw R, de Soet A, van der Horst S, Walsh K, Westerman M, Scheele F. How We Evaluate Postgraduate Medical E-Learning: Systematic Review. JMIR Med Educ 2019 Apr 05;5(1):e13128 [FREE Full text] [CrossRef] [Medline]
  46. Issenberg SB, McGaghie WC, Petrusa ER, Lee GD, Scalese RJ. Features and uses of high-fidelity medical simulations that lead to effective learning: a BEME systematic review. Med Teach 2005 Jan;27(1):10-28. [CrossRef] [Medline]
  47. Fransen AF, van der Hout-van der Jagt MB, Gardner R, Capelle M, Oei SP, van Runnard Heimel PJ, et al. Assessment tool for the instructional design of simulation-based team training courses: the ID-SIM. BMJ STEL 2017 Sep 06;4(2):59-64. [CrossRef]
  48. Guise J, Deering SH, Kanki BG, Osterweil P, Li H, Mori M, et al. Validation of a tool to measure and promote clinical teamwork. Simul Healthc 2008;3(4):217-223. [CrossRef] [Medline]
  49. Hussey MA, Hughes JP. Design and analysis of stepped wedge cluster randomized trials. Contemp Clin Trials 2007 Feb;28(2):182-191. [CrossRef] [Medline]
  50. Woertman W, de Hoop E, Moerbeek M, Zuidema SU, Gerritsen DL, Teerenstra S. Stepped wedge designs could reduce the required sample size in cluster randomized trials. J Clin Epidemiol 2013 Jul;66(7):752-758 [FREE Full text] [CrossRef] [Medline]
  51. de Hoop E, Woertman W, Teerenstra S. The stepped wedge cluster randomized trial always requires fewer clusters but not always fewer measurements, that is, participants than a parallel cluster randomized trial in a cross-sectional design. In reply. J Clin Epidemiol 2013 Dec;66(12):1428. [CrossRef] [Medline]
  52. Siassakos D, Crofts JF, Winter C, Weiner CP, Draycott TJ. The active components of effective training in obstetric emergencies. BJOG 2009 Jul;116(8):1028-1032 [FREE Full text] [CrossRef] [Medline]
  53. Eppich W, Howard V, Vozenilek J, Curran I. Simulation-based team training in healthcare. Simul Healthc 2011 Aug;6 Suppl:S14-S19. [CrossRef] [Medline]
  54. Ericsson KA, Krampe RT, Tesch-Römer C. The role of deliberate practice in the acquisition of expert performance. Psychological Review 1993;100(3):363-406. [CrossRef]
  55. Ericsson KA. Deliberate practice and acquisition of expert performance: a general overview. Acad Emerg Med 2008 Nov;15(11):988-994. [CrossRef] [Medline]
  56. Cook DA, Hamstra SJ, Brydges R, Zendejas B, Szostek JH, Wang AT, et al. Comparative effectiveness of instructional design features in simulation-based education: systematic review and meta-analysis. Med Teach 2013;35(1):e867-e898. [CrossRef] [Medline]
  57. Motola I, Devine LA, Chung HS, Sullivan JE, Issenberg SB. Simulation in healthcare education: a best evidence practical guide. AMEE Guide No. 82. Med Teach 2013 Oct;35(10):e1511-e1530. [CrossRef] [Medline]
  58. Khan K, Tolhurst-Cleaver S, White S. Simulation in healthcare education. Building a simulation programme: A practical guide. AMEE Curriculum Planning Guide 50. Dundee, UK: Association for Medical Education in Europe; 2011.   URL: https://amee.org/shop/publications/amee-guides [accessed 2018-10-28] [WebCite Cache]
  59. Cook DA, Hatala R, Brydges R, Zendejas B, Szostek JH, Wang AT, et al. Technology-enhanced simulation for health professions education: a systematic review and meta-analysis. JAMA 2011 Sep 07;306(9):978-988. [CrossRef] [Medline]
  60. Bluestone J, Johnson P, Fullerton J, Carr C, Alderman J, BonTempo J. Effective in-service training design and delivery: evidence from an integrative literature review. Hum Resour Health 2013 Oct 01;11:51 [FREE Full text] [CrossRef] [Medline]
  61. Close K, Karel M, White M. A pilot program of knowledge translation and implementation for newborn resuscitation using US Peace Corps Volunteers in rural Madagascar. Global Health 2016 Nov 16;12(1):73 [FREE Full text] [CrossRef] [Medline]
  62. Musafili A, Essén B, Baribwira C, Rukundo A, Persson L. Evaluating Helping Babies Breathe: training for healthcare workers at hospitals in Rwanda. Acta Paediatr 2013 Jan;102(1):e34-e38. [CrossRef] [Medline]
  63. Bang A, Patel A, Bellad R, Gisore P, Goudar SS, Esamai F, et al. Helping Babies Breathe (HBB) training: What happens to knowledge and skills over time? BMC Pregnancy Childbirth 2016 Nov 22;16(1):364 [FREE Full text] [CrossRef] [Medline]
  64. Arabi AME, Ibrahim SA, Ahmed SE, MacGinnea F, Hawkes G, Dempsey E, et al. Skills retention in Sudanese village midwives 1 year following Helping Babies Breathe training. Arch Dis Child 2016 May;101(5):439-442. [CrossRef] [Medline]
  65. van de Ven J, Fransen A, Schuit E, van Runnard Heimel P, Mol B, Oei S. Does the effect of one-day simulation team training in obstetric emergencies decline within one year? A post-hoc analysis of a multicentre cluster randomised controlled trial. Eur J Obstet Gynecol Reprod Biol 2017 Sep;216:79-84. [CrossRef] [Medline]
  66. Jaganath D, Gill HK, Cohen AC, Young SD. Harnessing Online Peer Education (HOPE): integrating C-POL and social media to train peer leaders in HIV prevention. AIDS Care 2012;24(5):593-600 [FREE Full text] [CrossRef] [Medline]
  67. Hategeka C, Mwai L, Tuyisenge L. Implementing the Emergency Triage, Assessment and Treatment plus admission care (ETAT+) clinical practice guidelines to improve quality of hospital care in Rwandan district hospitals: healthcare workers' perspectives on relevance and challenges. BMC Health Serv Res 2017 Apr 07;17(1):256 [FREE Full text] [CrossRef] [Medline]
  68. Seto TL, Tabangin ME, Taylor KK, Josyula S, Vasquez JC, Kamath-Rayne BD. Breaking Down the Objective Structured Clinical Examination: An Evaluation of the Helping Babies Breathe OSCEs. Simul Healthc 2017 Aug;12(4):226-232. [CrossRef] [Medline]
  69. Moran NF, Naidoo M, Moodley J. Reducing maternal mortality on a countrywide scale: The role of emergency obstetric training. Best Pract Res Clin Obstet Gynaecol 2015 Nov;29(8):1102-1118. [CrossRef] [Medline]


CTS: Clinical Teamwork Scale
ID-SIM: Instructional Design of a Simulation Improved by Monitoring
SHO: senior house officer


Edited by G Eysenbach; submitted 02.12.19; peer-reviewed by S Jung, M Mostafa; comments to author 19.02.20; revised version received 07.06.20; accepted 13.06.20; published 05.02.21

Copyright

©Anne Antonia Cornelia van Tetering, Maartje Henrica Martine Segers, Peter Ntuyo, Imelda Namagambe, M Beatrijs van der Hout-van der Jagt, Josaphat K Byamugisha, S Guid Oei. Originally published in JMIR Medical Education (http://mededu.jmir.org), 05.02.2021.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Medical Education, is properly cited. The complete bibliographic information, a link to the original publication on http://mededu.jmir.org/, as well as this copyright and license information must be included.