Introduction: Healthcare organizations around the world have embraced simulation to prepare healthcare professionals for the COVID-19 pandemic. However, the pandemic poses additional educational challenges: simulation activities must be designed rapidly while remaining compliant with health and safety measures to prevent the spread of the virus. The effect of simulation-based education in this context remains to be evaluated. Objective: The purpose of this systematic review was to describe the features and evaluate the effect of simulation activities on the preparedness of healthcare professionals and students to safely deliver care during the COVID-19 pandemic. Methods: Databases were searched up to November 2020 using index terms and keywords related to healthcare professions, simulation, and COVID-19. All learning outcomes were considered according to the Kirkpatrick model adapted by Barr et al. (2000). Reference selection, data extraction, and quality assessment were performed independently and in pairs. Results were synthesized using meta-analytical methods and narrative summaries. Results: 22 studies were included, 21 of which were single-group studies; 14 of these included pretest/posttest assessments. Simulation activities were mostly implemented in clinical settings using manikins for training on the use of personal protective equipment, hand hygiene, identification and management of COVID-19 patients, and work processes and patient flow. Large improvements in learning outcomes after simulation activities were reported in all studies. Discussion and conclusion: Results should be interpreted cautiously due to significant threats to the internal validity of the studies and the absence of control groups. However, these findings are consistent with the overall evidence on the positive effect of simulation-based education. Future studies should include control groups where feasible.
- experiential learning,
- continuing education,
- universal precautions,
- infection control
Introduction : Les organisations de santé ont adopté la simulation pour préparer les professionnels à la pandémie de COVID-19. La conception en accéléré de simulations tout en respectant les mesures de prévention de la propagation du virus amène des défis. Dans ces conditions, l’efficacité de la simulation reste à être évaluée. Objectif : Décrire les caractéristiques et évaluer l’effet de simulations sur la préparation des professionnels de la santé et des étudiants pour fournir des soins sécuritaires pendant cette pandémie. Méthodes : Les bases de données ont été consultées jusqu’en novembre 2020 en utilisant des descripteurs et des mots-clés relatifs aux professions de la santé, à la simulation et à la COVID-19. Tous les résultats d’apprentissage ont été considérés. La sélection des articles, l’extraction des données et l’évaluation de la qualité ont été effectuées par paires. Les résultats ont été synthétisés par des méthodes méta-analytiques et des résumés narratifs. Résultats : 22 études ont été incluses ; 21 à groupe unique et, parmi ces 21, 14 évaluations pré-posttest. Les simulations ont principalement été déployées en milieux cliniques avec des mannequins pour la formation à l’utilisation d’équipements de protection individuelle, au lavage de mains, à l’identification et la prise en charge de patients atteints de la COVID-19 et à l’implantation de procédés organisationnels. Toutes les études rapportent des apprentissages importants après les simulations. Discussion et conclusion : Malgré les limites de validité interne et l’absence de groupes de contrôle, ces résultats sont cohérents avec l’état des connaissances sur les effets positifs de la simulation. De futures études devraient inclure des groupes de contrôle si possible.
- apprentissage expérientiel,
- formation continue,
- précautions universelles,
- contrôle des infections
The coronavirus disease (COVID-19) pandemic represents a major challenge for healthcare organizations around the world. New clinical processes, guidelines, and protocols had to be developed promptly to ensure that healthcare professionals were prepared to deliver care effectively while ensuring the safety of both patients and healthcare providers. Indeed, evidence suggests that healthcare professionals are 3.4 times as likely to be diagnosed with COVID-19 compared with the general population, even after controlling for the difference in rates of testing between these groups (Nguyen et al., 2020). Areas of care associated with more risk include endotracheal intubation, cardiopulmonary resuscitation, patient flow, and isolation procedures (Bhimraj et al., 2020; Chaplin et al., 2020; Edelson et al., 2020; Nolan et al., 2020).
An increasing number of educators and decision makers rely on simulation for the education of healthcare professionals regarding new processes and protocols (Dube et al., 2020). Simulation is used to reproduce real clinical situations that healthcare professionals can experience and interact with, without compromising patient safety (Gaba, 2004). Simulation activities generally comprise three components: 1) a briefing to introduce learners to the simulation environment, the learning objectives, and the scenario they are about to experience; 2) a clinical scenario, which refers to the simulated clinical situation; and 3) a debriefing in which learners reflect on their simulation experience and receive feedback (Lopreiato, 2016). In recent decades, simulation-based education has been embraced by healthcare organizations to prepare healthcare professionals to safely deliver care, and its effectiveness has been highlighted in several systematic reviews (Alanazi et al., 2017; Beal et al., 2017; Bracq et al., 2019; Hippe et al., 2020; Marion-Martins & Pinho, 2020). Yet, the COVID-19 pandemic forced educators to quickly redesign and deliver simulation activities that comply with health and safety measures while meeting the needs of healthcare organizations. For example, authors report that using personal protective equipment (PPE) during simulation activities proved difficult due to shortages in many healthcare organizations, which had to prioritize its use for clinical practice (Chaplin et al., 2020). Concerns regarding the risk of contamination between healthcare professionals during simulation activities have also been raised (Chiu et al., 2020). To mitigate this risk, strategies such as reducing group sizes and enforcing physical distancing during simulation activities have been implemented (Chaplin et al.).
Additional concerns included the increased stress of healthcare professionals during the pandemic, which could decrease their receptivity to learning, and the risk of unrealistic simulation scenarios due to the limited time educators had to design them, which could affect the suspension of participants' disbelief, i.e., their ability to accept the simulation scenario as genuine (Chiu et al.). As such, it is unclear whether simulation activities have achieved their purpose of preparing healthcare professionals to deliver care during the COVID-19 pandemic. To our knowledge, and based on a search of the International Prospective Register of Systematic Reviews (PROSPERO), no systematic review has focused on the effect of simulation activities on the preparedness of healthcare professionals to safely deliver care during the COVID-19 pandemic.
Considering that simulation activities are a first-choice educational intervention for many organizations, it is essential to evaluate whether the simulation activities designed and delivered during the COVID-19 pandemic achieved their purpose. As such, the objective of this systematic review was to describe the features and evaluate the effect of simulation activities on the preparedness of healthcare professionals for the COVID-19 pandemic.
This systematic review was conducted according to the Joanna Briggs Institute Reviewer’s Manual: Systematic Reviews of Effectiveness (Tufanaru, 2017). Reporting is based on the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines (Moher et al., 2009). The review protocol was prospectively registered in the PROSPERO database [CRD42020210741].
We considered all experimental (i.e., randomized controlled trial), quasi-experimental (i.e., non-randomized controlled trial, pretest/posttest, and interrupted time-series design), and observational studies (e.g., cross-sectional, case control, cohort study) where the effect of simulation activities on the preparedness of healthcare professionals and students to deliver care during the COVID-19 pandemic was assessed. Preparedness was defined as the achievement of learning outcomes related to the safety and effective delivery of care during the COVID-19 pandemic.
We considered studies with healthcare professionals and students at any level of practice (pre- and post-licensure, undergraduate, and postgraduate) and in any clinical context. Thus, healthcare professionals and students are identified as “participants” in this article, whereas patients are identified as such.
We considered studies assessing the effect of simulation activities. A simulation activity was defined as the entire set of actions and events from the beginning to the end of a simulated event for educational purposes (e.g., briefing, scenario, debriefing). All simulation modalities were considered, including part-task trainers, simulated patients (i.e., standardized patients), manikin-based (low to high-fidelity), computer-based (i.e., screen-based simulation), and hybrid simulations (i.e., combining two or more simulation modalities; Chiniara et al., 2013). To be included in this review, simulation activities had to involve learning objectives related to the delivery of care to a patient with a confirmed or suspected diagnosis of COVID-19, or a change in clinical practice directly related to the COVID-19 pandemic. Studies using simulation solely to identify latent safety threats (e.g., can a gurney be easily transported to a resuscitation room) were excluded because their primary objectives were not educational but aimed at identifying and addressing issues in the healthcare environment (Jee et al., 2020).
When available, comparators included any other educational intervention.
The modified version of Kirkpatrick’s Levels of Evaluation model (Barr et al., 2000), a model frequently used in simulation-based education (Blue et al., 2015; Reeves et al., 2015), was chosen as the framework to categorize outcomes related to the preparedness of healthcare professionals to deliver care during the COVID-19 pandemic. This model includes the following levels of educational outcomes: 1) learners’ views and reaction to simulation-based education, 2a) modification of attitudes/perceptions, 2b) acquisition of knowledge/skills, 3) behavioral change, 4a) change in organizational practice, and 4b) benefits to patients. Immediate acquisition (i.e., right after the simulation) or retention (i.e., after a period without simulation) of these outcomes were both of interest.
On November 18, 2020, we searched four databases—Cumulative Index to Nursing and Allied Health Literature (EBSCO), Excerpta Medica dataBASE (Ovid), Citation Index Expanded Medline (Web of Science), and MEDLINE (Ovid)—using a combination of controlled descriptors and keywords related to the following concepts: healthcare professionals and students, simulation, and COVID-19. A sample of the MEDLINE search strategy is available in Appendix 1. The search was restricted to peer-reviewed papers published in English or French since 2019 considering the first documented COVID-19 cases (Shereen et al., 2020). We also searched the Cochrane Central Register of Controlled Trials (CENTRAL) for additional records.
Titles and abstracts of citations retrieved from the initial search were screened independently by two of the authors (MAMC, AL, TM, or PL) using the Covidence platform (Veritas Health Innovation Ltd, 2021). Full texts of eligible citations were retrieved and assessed independently by two of the authors (MAMC, AL, TM, or PL) based on the inclusion/exclusion criteria mentioned above. Disagreements at any stage of the selection process were resolved with a third author.
Data Extraction and Synthesis
Data were extracted independently by two of the authors (MAMC, AL, or GF) using a form adapted from a previous systematic review (Lapierre et al., 2021). Data items included general study information (e.g., country in which the study was conducted), methods (e.g., study aim and design), simulation features (e.g., briefing, scenario, debriefing), and outcomes (e.g., name and definition, time points measured, descriptive and inferential statistics). For pretest/posttest studies, we used meta-analytical methods to evaluate intragroup changes (i.e., changes in outcomes before and after participation in the simulation activity) using a generic inverse variance approach and random-effects models in RevMan 5.4.1 (The Cochrane Collaboration, 2020). To account for correlations between time point measures, and because we did not have access to individual participant data, we corrected all effect sizes (Cohen's D) by assuming a correlation of 0.6 for all pretest/posttest outcome measures. Results are reported with 95% confidence intervals (CI). The statistical significance level was set at 0.05.
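The intragroup effect-size correction and pooling described above can be sketched in code. This is an illustrative Python implementation under stated assumptions (a standardized mean change whose variance is corrected for an assumed pre/post correlation of 0.6, following the standard formulation by Morris and DeShon, combined with DerSimonian-Laird random-effects pooling), not the exact RevMan computation; the study data below are hypothetical.

```python
import math

R = 0.6  # assumed pretest/posttest correlation

def corrected_effect(mean_pre, mean_post, sd_pre, n, r=R):
    """Standardized mean change (Cohen's d) for a single-group
    pretest/posttest study, with variance corrected for the
    correlation between repeated measures."""
    d = (mean_post - mean_pre) / sd_pre
    var = (2 * (1 - r)) / n + d ** 2 / (2 * n)
    return d, var

def pool_random_effects(effects):
    """Generic inverse-variance pooling under a DerSimonian-Laird
    random-effects model. `effects` is a list of (d, variance) pairs."""
    w = [1 / v for _, v in effects]
    d_fixed = sum(wi * d for wi, (d, _) in zip(w, effects)) / sum(w)
    q = sum(wi * (d - d_fixed) ** 2 for wi, (d, _) in zip(w, effects))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)  # between-study variance
    w_star = [1 / (v + tau2) for _, v in effects]
    pooled = sum(wi * d for wi, (d, _) in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1 / sum(w_star))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Hypothetical studies: (pretest mean, posttest mean, pretest SD, n)
studies = [(3.1, 4.2, 0.6, 40), (2.8, 4.0, 0.7, 55), (3.4, 4.5, 0.5, 30)]
effects = [corrected_effect(*s) for s in studies]
pooled, (lo, hi) = pool_random_effects(effects)
print(f"Pooled d = {pooled:.2f} [95% CI {lo:.2f}, {hi:.2f}]")
```

Note that with a lower assumed correlation, each study's variance grows and confidence intervals widen, which is why the 0.6 assumption is flagged as a limitation in the Discussion.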
For all studies, we synthesized posttest scores using descriptive meta-analytical methods. Although less common than meta-analyses of efficacy and effectiveness, descriptive meta-analyses are used to pool cross-sectional data from similar studies and provide an overview of the distribution of results (Bohannon, 2007; Vakili et al., 2020). All scores were standardized to fit a scale between zero (0) and a hundred (100), with higher scores indicating positive educational outcomes such as favorable reactions to the simulation activity, better attitude/perceptions, and higher levels of knowledge or skills. A narrative description is also provided when quantitative syntheses were not possible due to missing data or unclear reporting.
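The linear rescaling described above can be illustrated as follows; this is a sketch of the standardization principle, not the authors' actual procedure, and the example rating is hypothetical.

```python
def standardize(score, scale_min, scale_max):
    """Linearly rescale a raw score to a 0-100 scale, where higher
    standardized scores indicate more positive educational outcomes."""
    if scale_max <= scale_min:
        raise ValueError("scale_max must exceed scale_min")
    return 100 * (score - scale_min) / (scale_max - scale_min)

# e.g., a satisfaction rating of 4 on a 1-5 Likert scale
print(standardize(4, 1, 5))  # 75.0
```

Instruments on which lower raw scores indicate better outcomes would need to be reverse-scored before rescaling so that all standardized scores point in the same direction.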
Methodological Quality Assessment
The methodological quality of included studies was assessed independently by two of the authors (MAMC, AL, or GF) using the Methodological Index for Non-Randomized Studies (MINORS) tool (Slim et al., 2003). This tool consists of 12 items to assess factors that may affect the methodological quality of two-group non-randomized studies—eight of these items also apply to single-group studies according to the authors of this tool. Although the evaluation of the adequacy of statistical analyses is only suggested for two-group studies, we also included this evaluation for single-group studies. Each item is scored on a three-point scale (0-information not reported, 1-reported but inadequate, or 2-reported and adequate) for a maximum possible score of 18 points for single-group studies, or 24 points for two-group non-randomized studies. The content validity of the MINORS tool was determined by 10 clinical methodologists. The MINORS tool showed a satisfactory level of internal consistency (Cronbach's alpha = 0.73) after independent assessment in pairs of a sample of 80 non-randomized and single-group studies (Slim et al.). Considering that the authors of this tool report that two-group non-randomized studies of good methodological quality reached a mean score of 19.8/24.0 (82.5%), we used the following thresholds to dichotomize the methodological quality (poor or adequate) of included studies: 15/18 for single-group studies and 20/24 for two-group non-randomized studies.
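The scoring and dichotomization rules above can be expressed compactly. This is a sketch under stated assumptions: single-group studies are scored on nine items (the eight single-group MINORS items plus adequacy of statistical analyses, as added in this review), scores at or above the threshold count as adequate, and the example item scores are hypothetical.

```python
def minors_quality(item_scores, two_group=False):
    """Total a set of MINORS item scores (each 0, 1, or 2) and
    dichotomize methodological quality against the review's
    thresholds: 15/18 (single-group) or 20/24 (two-group)."""
    n_items, threshold = (12, 20) if two_group else (9, 15)
    if len(item_scores) != n_items or any(s not in (0, 1, 2) for s in item_scores):
        raise ValueError(f"expected {n_items} item scores of 0, 1, or 2")
    total = sum(item_scores)
    return total, "adequate" if total >= threshold else "poor"

# Hypothetical single-group study scoring 11/18 (the review's median)
print(minors_quality([2, 1, 2, 1, 1, 1, 1, 1, 1]))  # (11, 'poor')
```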
Search Results and Descriptive Results
The initial database search yielded a total of 1,271 unique citations, and 22 studies met the inclusion criteria. Study characteristics are presented in Table 1. Study flowchart is available in Appendix 2.
All studies were judged to be of poor methodological quality (see Table 2). The median MINORS score of single-group studies (n=21) was 11/18 (Q1=10, Q3=12). The sole two-group study scored 15/24. MINORS scores were mostly low because: 1) sample sizes were not prospectively calculated (n=22); 2) study protocols were not published or prospectively registered, which made it impossible to determine whether data collection was prospectively planned and whether all collected data were reported (n=22); 3) there was no long-term follow-up (n=21); 4) study aims were not clearly stated (n=11); and 5) endpoints were inappropriate, i.e., no pretest assessments were included (n=8).
Features of Simulation Activities
Details regarding the features of simulation activities are presented in Table 3 (Appendix 3). All simulation activities included multiple learning objectives concerning COVID-19 preparedness. Learning objectives were related to: 1) infection prevention and control (use of PPE, n=19; hand hygiene, n=4), 2) identification and management of COVID-19 (ventilation and airway management, n=9; care of COVID-19 patients, n=7; nasopharyngeal swabbing and other diagnostic procedures, n=5; triage and early identification of COVID-19 patients, n=4; prone positioning, n=1), and 3) work processes and patient flow (transport of COVID-19 patients, n=4; contamination zones, n=3; biosafety and medical waste disposal, n=2). Additional learning objectives revolved around teamwork and communication (n=9), as well as the pathophysiology and epidemiology of COVID-19 (n=3).
Most simulation activities were deployed in clinical settings, either in situ (n=9) or on-site (n=3); seven studies reported using a simulation lab. One study combined in situ simulations with lab simulations. The most frequent simulation modality was manikin-based (n=9); two studies employed standardized patients, and another used a real patient who had tested negative for SARS-CoV-2 to simulate a COVID-19 case. Two other studies combined two simulation modalities (standardized patients and manikins). In most studies, simulation occurred in groups (n=18); two studies involved simulations with individual participants. Settings, simulators, and group or individual participation were not reported in two, eight, and two studies, respectively.
A 5- to 10-minute briefing consisting of a presentation of the simulation activity and familiarization with the simulator and environment was reported in seven studies. Simulation scenarios were diverse but generally involved caring for a patient with a suspected or confirmed COVID-19 diagnosis. A 15- to 40-minute debriefing or feedback session using various methods and approaches was reported in most studies (n=20). The length of simulation activities, from briefing to debriefing, ranged from 20 to 180 minutes; however, 12 studies did not report on this feature. In most studies (n=17), simulation activities were complemented with additional educational activities such as lectures, video demonstrations, skills stations, or written material.
Effect of Simulation Activities
Pretest/Posttest Differences. Ten out of the 14 pretest/posttest studies provided enough data to compute an effect size and combine their results regarding the improvement of participants’ attitudes/perceptions (level 2a; n=6), knowledge (level 2b; n=3), and skills (level 2b; n=2). Other pretest/posttest results are presented narratively.
In studies assessing improvement of attitudes/perceptions (n=6 studies; 305 participants), the pretest/posttest pooled effect size was 2.0 [95% CI: 1.0, 3.0]. Individual study results are shown in Table 4.
Four other studies reported improvements in participants’ attitudes/perceptions of their preparedness for providing care. Montauban et al. (2020) reported significant improvements in participants’ (n=27) perception of preparedness for all aspects of care delivery to COVID-19 patients (e.g., donning and doffing of PPE, transfer to an intensive care unit, and screening high-risk patients). Trembley et al. (2020) reported improvements in participants’ (n=48) confidence in their role during intubation. Aljahany et al. (2020) reported that although participants (n=54) felt significantly more comfortable providing care to unstable COVID-19 patients, they did not feel significantly more comfortable performing airway procedures, nor did they feel more knowledgeable of the triage process after the simulation activity. Finally, Jensen et al. (2020) reported that participants (n=97) felt more comfortable using PPE and providing care to COVID-19 patients after participating in the simulation activity. In studies assessing improvement in knowledge (n=3 studies; 61 participants), the pretest/posttest pooled effect size was 2.8 [95% CI 1.7, 3.8]. These studies showed significant improvements in participants’ knowledge regarding: 1) prevention, identification, and treatment of COVID-19, as well as referral of COVID-19 patients (Cohen’s D 2.0 [95% CI 1.3, 2.7]; Shi et al., 2020), 2) triage of patients exhibiting COVID-19 symptoms (Cohen’s D 2.4 [95% CI 1.7, 3.1]; Shrestha et al., 2020), and 3) ventilation of COVID-19 patients (Cohen’s D 4.0 [95% CI 3.0, 4.9]; Mouli et al., 2020). Furthermore, Mark et al. (2020) mentioned an improvement in participants’ knowledge (n=45) of the nasopharyngeal swab but did not report data supporting that claim. In studies assessing improvement in skills (n=2 studies; 99 participants), the pretest/posttest pooled effect size was 4.2 [95% CI 0.3, 8.0]. 
These studies showed significant improvements in skills regarding: 1) application of universal precautions (Cohen’s D 2.2 [95% CI 1.6, 2.8]; Tan et al., 2020), and 2) donning and doffing of PPE (Cohen’s D 6.2 [95% CI 5.3, 7.0]; Diaz-Guio et al., 2020).
Posttest scores. Seventeen studies provided enough data to combine their results regarding the posttest scores of participants' reactions (level 1; n=4), attitudes/perceptions (level 2a; n=12), skills (level 2b; n=6), and knowledge (level 2b; n=4). In studies of participants' reactions (level 1; n=4 studies, 245 participants) to the simulation activities (Jensen et al., 2020; Mouli et al., 2020; Sharara-Chami et al., 2020; Shi et al., 2020), satisfaction results normalized to a 0–100 scale ranged from 84.6 to 96.7 with a median of 90.1 (Quartile 1 84.8, Quartile 3 96.3). Two studies found that the vast majority of participants would recommend the simulation activity to their colleagues (Loh et al., 2021; Trembley et al., 2020). In two other studies, 72.6% to 94.0% of participants (n=131) found the simulation activities helpful in preparing them to deliver care (Dharamsi et al., 2020; Doussot et al., 2020). Score distributions in studies of participants' attitudes/perceptions (level 2a; n=12 studies, 1,973 participants) and skills (level 2b; n=6 studies, 423 participants) are illustrated in Figure 1. Median scores were similar for both levels of outcomes; their distributions lie in the upper third of the range of possible scores.
In the sole two-group study (Cheung et al., 2020), no statistically significant differences were found between the effect of lab-based and in situ simulation activities to improve participants’ confidence, control, and motivation to deliver care during the COVID-19 pandemic.
In studies of participants’ knowledge (level 2b; n=4 studies, 192 participants) after simulation activities (Lakissian et al., 2020; Mouli et al., 2020; Shi et al., 2020; Shrestha et al., 2020), results normalized to a 0–100 scale ranged from 74.6 to 96.6 with a median of 87.4 (Quartile 1 77.2, Quartile 3 94.9). Furthermore, Loh et al. (2021) reported that 98.0% of 42 participants obtained at least 16 out of 20 correct answers on a quiz on aerosol-generating procedures, PPE, and airway management. However, only 42.9% and 52.3% of participants remembered the steps for PPE donning and doffing, respectively.
Regarding behaviors and benefits to patients, Doussot et al. (2020) reported that more than half of participants (n=109/212; 51%) eventually performed prone positioning on ICU patients who had developed severe acute respiratory distress syndrome (it was not specified whether this was self-reported or observed). For four patients, prone positioning had to be stopped due to respiratory complications or pressure ulcers. Loh et al. (2021) also reported that participants (n=33) self-reported at least one change in their clinical practice following the simulation activity (e.g., hand hygiene, use of PPE, physical distancing).
This systematic review described the features and evaluated the effect of simulation activities on the preparedness of healthcare professionals to deliver care during the COVID-19 pandemic. We found significant and large improvements in participants' attitudes and perceptions, knowledge, and skills. We also found high posttest scores for participants' reactions, attitudes and perceptions, knowledge, and skills following simulation activities. However, almost all of the studies under review were of poor methodological quality, with significant threats to their internal validity mostly due to the absence of control groups. Furthermore, although healthcare students are increasingly being called upon to provide care in healthcare organizations (Bohsra, 2020; Goshua, 2020; Mensik, 2020), we identified only a single study with this population.
The most frequent purposes for using simulation were to prepare healthcare professionals for infection prevention and control measures (e.g., PPE and hand hygiene), identification and management of COVID-19 patients, and work processes and patient flow. Overall, these efforts were driven by an imperative to protect professionals from contracting COVID-19 and a will to maintain or improve the efficiency of healthcare delivery under rapidly changing circumstances. Although results from this review must be considered with caution, the outcomes of these efforts appear to align with those of simulation-based education in other contexts. Specifically, it is widely acknowledged that healthcare professionals are highly satisfied with simulation and that their attitudes/perceptions, knowledge, and skills tend to increase following simulation-based education (Alanazi et al., 2017; Beal et al., 2017; Bracq et al., 2019; Hippe et al., 2020; Marion-Martins & Pinho, 2020). In this review, we found statistically significant improvements in all learning outcomes following simulation activities. However, effect sizes were large and imprecise due to their intragroup (pretest/posttest) nature and the low number of included studies; pre-posttest effect sizes are known to be inflated by natural changes in outcome variables after their measurement at baseline or by other uncontrolled variables (Cuijpers et al., 2017). They should therefore be interpreted with caution in light of these limitations. Furthermore, evidence about higher-level outcomes, such as changes in behaviors, organizational practice, or benefits to patients, was scarcer (only three studies identified)—notably because of the methodological challenges in measuring such outcomes.
Nevertheless, simulation-based education seems relevant to improve health professionals’ perception of their preparedness for the COVID-19 pandemic, a non-negligible outcome considering the severe impacts of the pandemic on their mental health (Chen et al., 2020; Civantos et al., 2020; Dal’Bosco et al., 2020; Lai et al., 2020; Luceno-Moreno et al., 2020; Wang et al., 2020; Xiao et al., 2020).
In terms of simulation features, wide variations were observed in the methods and approaches to simulation-based education. In addition, the reporting of important components of simulation activities was often incomplete. Nevertheless, most simulation activities were implemented in situ, an increasingly popular practice (Guise & Mladenovic, 2013; Patterson et al., 2013). It is also a cost-effective solution for clinical settings that do not have access to a simulation laboratory and other resources for simulation-based education (Villemure et al., 2016). Despite its advantages, in situ simulation comes with its own challenges, including high cancelation rates because patient care takes precedence over continuing education (Kurup et al., 2017). Moreover, this review did not reveal trends in the methods used for briefings, scenarios, or debriefings. As such, standards of best practice for simulation-based education (Sittner et al., 2015) appear to be the most reliable source to guide practice in that area.
Most studies used single-group designs, and more than a third used a posttest-only design. These research designs are subject to significant internal validity threats. As such, they are indicated when researchers want to explore whether a phenomenon warrants further investigation before undertaking more costly experiments or, in the case of a posttest-only design, when it is irrelevant to assess the outcome before an intervention is implemented (Creswell, 2013). Considering the amount of evidence from randomized trials that already supports the efficacy of simulation for healthcare professional education (Alanazi et al., 2017; Beal et al., 2017; Bracq et al., 2019; Hippe et al., 2020; Marion-Martins & Pinho, 2020), such research designs add very little to our overall understanding of the effect of simulation activities.
However, due to the state of crisis caused by the COVID-19 pandemic in healthcare organizations, it seems hardly defensible from an ethical standpoint to assign participants to a control group. As such, if true experiments cannot be conducted and researchers must rely on a single-group design, they could enhance the internal validity of their results by adopting interrupted time-series or repeated-treatment designs, or by adding nonequivalent outcome variables (i.e., outcome variables that are not expected to be affected by the intervention) to their methods. Such strategies may help address the maturation and history threats that affect the internal validity of single-group studies (Bell, 2010).
Strengths of this review include a literature search in multiple databases, complemented by a search of an online study registry, increasing the likelihood that all potentially relevant studies were identified. However, the search was limited to studies published in English or French and may therefore be subject to language bias, as we did not have the resources to consider other languages. Study selection, data extraction, and quality assessment were performed independently and in pairs, which supports the correct application of eligibility criteria as well as data integrity. Current findings are nonetheless limited by internal validity threats due to the research designs of the studies under review. As such, it cannot be excluded that historical factors, the maturation of study participants, the testing procedure, or the Hawthorne effect (i.e., bias related to participants' awareness that they are part of a study and are being observed) may have affected individual study findings. Finally, as we did not have access to individual participant data, we corrected all pretest/posttest effect sizes by assuming a correlation of 0.6 (Cuijpers et al., 2017); these effect size values may well be positively biased.
Based on single-group pretest/posttest studies, findings from this review suggest that simulation activities have a positive effect on the preparedness of healthcare professionals to deliver care during the COVID-19 pandemic. Importantly, this review contributes a systematic description of the features of simulation activities designed for this purpose. Such a description has the potential to inform the work of clinical educators who wish to use simulation as an educational tool for pandemic preparedness in various care settings. However, the state of the evidence prevents us from making recommendations as to which simulation modalities are more effective than others. In addition, as only a single study was conducted among healthcare students, the extent to which these results can be applied to this population is limited. Furthermore, the validity of these results is impeded by threats such as the absence of control groups and overall poor methodological quality.
Future studies should include a control group if practically and ethically feasible. Where this is not possible, other strategies to improve the internal validity of single-group studies, such as interrupted time series or repeated-treatment designs, should be considered.
Appendix 1. MEDLINE SEARCH STRATEGY
Appendix 2. PRISMA FLOW DIAGRAM
MAMC received scholarships as part of his doctoral studies from the following organizations: Fonds de recherche du Québec – Santé (259241), ministère de l’Éducation et de l’Enseignement supérieur, Université de Montréal, Quebec Network on Nursing Intervention Research, Montreal Heart Institute Foundation, FUTUR Team (Fonds de recherche du Québec – Société et culture), and the Center for Innovation in Nursing Education. PL holds a research scholar award from the Fonds de recherche du Québec – Santé (282306).
- Alanazi, A. A., Nicholson, N. & Thomas, S. (2017). The use of simulation training to improve knowledge, skills, and confidence among healthcare students: A systematic review. Internet Journal of Allied Health Sciences and Practice, 15(3). https://nsuworks.nova.edu/ijahsp/vol15/iss3/2/
- Aljahany, M., Alassaf, W., Alibrahim, A. A., Kentab, O., Alotaibi, A., Alresseeni, A., Algarni, A., Algaeed, H. A., Aljaber, M. I., Alruwaili, B. & Aljohani, K. (2020). Use of in situ simulation to improve emergency department readiness for the COVID-19 pandemic. Prehospital and Disaster Medicine, 36(1), 1-8. https://doi.org/10.1017/S1049023X2000134X
- Barr, H., Freeth, D., Hammick, M., Koppel, I. & Reeves, S. (2000). Evaluations of interprofessional education: a United Kingdom review for health and social care. CAIPE and the British Educational Research Association.
- Beal, M. D., Kinnear, J., Anderson, C. R., Martin, T. D., Wamboldt, R. & Hooper, L. (2017). The effectiveness of medical simulation in teaching medical students critical care medicine: A systematic review and meta-analysis. Simulation in Healthcare, 12(2), 104-116. https://doi.org/10.1097/SIH.0000000000000189
- Bell, B. A. (2010). Pretest–posttest design. In N. J. Salkind (Ed.), Encyclopedia of Research Design. SAGE Publications. https://doi.org/10.4135/9781412961288
- Bhimraj, A., Morgan, R. L., Shumaker, A. H., Lavergne, V., Baden, L., Cheng, V. C., Edwards, K. M., Gandhi, R., Muller, W. J., O'Horo, J. C., Shoham, S., Murad, M. H., Mustafa, R. A., Sultan, S. & Falck-Ytter, Y. (2020). Infectious Diseases Society of America guidelines on the treatment and management of patients with COVID-19. Clinical Infectious Diseases. https://doi.org/10.1093/cid/ciaa478
- Blue, A. V., Chesluk, B. J., Conforti, L. N. & Holmboe, E. S. (2015). Assessment and evaluation in interprofessional education: exploring the field. Journal of Allied Health, 44(2), 73-82.
- Bohannon, R. W. (2007). Number of pedometer-assessed steps taken per day by adults: a descriptive meta-analysis. Physical Therapy, 87(12), 1642-1650. https://doi.org/10.2522/ptj.20060037
- Bohsra, B. (2020). Quebec order allows health-care students near graduation, recent retirees to work during COVID-19 pandemic. https://montreal.ctvnews.ca/quebec-order-allows-health-care-students-near-graduation-recent-retirees-to-work-during-covid-19-pandemic-1.4898614
- Bracq, M. S., Michinov, E. & Jannin, P. (2019). Virtual reality simulation in nontechnical skills training for healthcare professionals: A systematic review. Simulation in Healthcare, 14(3), 188-194. https://doi.org/10.1097/SIH.0000000000000347
- Chaplin, T., McColl, T., Petrosoniak, A. & Hall, A. K. (2020). "Building the plane as you fly": Simulation during the COVID-19 pandemic. Canadian Journal of Emergency Medicine, 22(5), 576-578. https://doi.org/10.1017/cem.2020.398
- Chen, X., Zhang, S. X., Jahanshahi, A. A., Alvarez-Risco, A., Dai, H., Li, J. & Ibarra, V. G. (2020). Belief in a COVID-19 conspiracy theory as a predictor of mental health and well-being of health care workers in Ecuador: cross-sectional survey study. JMIR Public Health Surveillance, 6(3), e20737. https://doi.org/10.2196/20737
- Cheung, V. K. L., So, E. H. K., Ng, G. W. Y., So, S. S., Hung, J. L. K. & Chia, N. H. (2020). Investigating effects of healthcare simulation on personal strengths and organizational impacts for healthcare workers during COVID-19 pandemic: A cross-sectional study. Integrative Medicine Research, 9(3), 100476. https://doi.org/10.1016/j.imr.2020.100476
- Chiniara, G., Cole, G., Brisbin, K., Huffman, D., Cragg, B., Lamacchia, M. & Norman, D. (2013). Simulation in healthcare: a taxonomy and a conceptual framework for instructional design and media selection. Medical Teacher, 35(8), e1380-1395. https://doi.org/10.3109/0142159x.2012.733451
- Chiu, M., Crooks, S., Fraser, A. B., Rao, P. & Boet, S. (2020). Physical health risks during simulation-based COVID-19 pandemic readiness training. Canadian Journal of Anesthesia, 67(11), 1667-1669. https://doi.org/10.1007/s12630-020-01744-y
- Civantos, A. M., Bertelli, A., Gonçalves, A., Getzen, E., Chang, C., Long, Q. & Rajasekaran, K. (2020). Mental health among head and neck surgeons in Brazil during the COVID-19 pandemic: A national study. American Journal of Otolaryngology, 41(6), 102694. https://doi.org/10.1016/j.amjoto.2020.102694
- Creswell, J. W. (2013). Research design: qualitative, quantitative, and mixed methods approaches (4th ed.). Sage Publications.
- Cuijpers, P., Weitz, E., Cristea, I. A. & Twisk, J. (2017). Pre-post effect sizes should be avoided in meta-analyses. Epidemiology and Psychiatric Sciences, 26(4), 364-368. https://doi.org/10.1017/S2045796016000809
- Dal’Bosco, E. B., Floriano, L. S. M., Skupien, S. V., Arcaro, G., Martins, A. R. & Anselmo, A. C. C. (2020). Mental health of nursing in coping with COVID-19 at a regional university hospital. Revista Brasileira De Enfermagem, 73. https://doi.org/10.1590/0034-7167-2020-0434
- Dharamsi, A., Hayman, K., Yi, S., Chow, R., Yee, C., Gaylord, E., Tawadrous, D., Chartier, L. B. & Landes, M. (2020). Enhancing departmental preparedness for COVID-19 using rapid-cycle in-situ simulation. Journal of Hospital Infection, 105(4), 604-607. https://doi.org/10.1016/j.jhin.2020.06.020
- Diaz-Guio, D. A., Ricardo-Zapata, A., Ospina-Velez, J., Gomez-Candamil, G., Mora-Martinez, S. & Rodriguez-Morales, A. J. (2020). Cognitive load and performance of health care professionals in donning and doffing PPE before and after a simulation-based educational intervention and its implications during the covid-19 pandemic for biosafety. Infezioni in Medicina, 28, 111-117.
- Doussot, A., Ciceron, F., Cerutti, E., Salomon du Mont, L., Thines, L., Capellier, G., Pretalli, J. B., Evrard, P., Vettoretti, L., Garbuio, P., Brunel, A. S., Pili-Floury, S. & Lakkis, Z. (2020). Prone positioning for severe acute respiratory distress syndrome in COVID-19 patients by a dedicated team: A safe and pragmatic reallocation of medical and surgical work force in response to the outbreak. Annals of Surgery, 272(6), e311-e315. https://doi.org/10.1097/SLA.0000000000004265
- Dube, M., Kaba, A., Cronin, T., Barnes, S., Fuselli, T. & Grant, V. (2020). COVID-19 pandemic preparation: using simulation for systems-based learning to prepare the largest healthcare workforce and system in Canada. Advances in Simulation, 5, 22. https://doi.org/10.1186/s41077-020-00138-w
- Edelson, D. P., Sasson, C., Chan, P. S., Atkins, D. L., Aziz, K., Becker, L. B., Berg, R. A., Bradley, S. M., Brooks, S. C., Cheng, A., Escobedo, M., Flores, G. E., Girotra, S., Hsu, A., Kamath-Rayne, B. D., Lee, H. C., Lehotsky, R. E., Mancini, M. E., Merchant, R. M., Nadkarni, V. M., ... & American Heart Association Emergency Cardiovascular Care Interim Covid Guidance Authors. (2020). Interim Guidance for Basic and Advanced Life Support in Adults, Children, and Neonates With Suspected or Confirmed COVID-19: From the Emergency Cardiovascular Care Committee and Get With The Guidelines-Resuscitation Adult and Pediatric Task Forces of the American Heart Association. Circulation, 141(25), e933-e943. https://doi.org/10.1161/CIRCULATIONAHA.120.047463
- Gaba, D. M. (2004). The future vision of simulation in health care. Quality and Safety in Health Care, 13, i2-i10. https://doi.org/10.1136/qshc.2004.009878
- Goshua, A. (2020). Medical students called to the COVID-19 fight need support, protection. https://www.statnews.com/2020/04/17/medical-students-called-covid-19-fight-need-support-protection/
- Guise, J. M. & Mladenovic, J. (2013). In situ simulation: identification of systems issues. Seminars in Perinatology, 37(3), 161-165. https://doi.org/10.1053/j.semperi.2013.02.007
- Hippe, D. S., Umoren, R. A., McGee, A., Bucher, S. L. & Bresnahan, B. W. (2020). A targeted systematic review of cost analyses for implementation of simulation-based education in healthcare. SAGE Open Medicine, 8, 2050312120913451. https://doi.org/10.1177/2050312120913451
- Jee, M., Khamoudes, D., Brennan, A. M. & O'Donnell, J. (2020). COVID-19 outbreak response for an emergency department using in situ simulation. Cureus, 12(4), e7876. https://doi.org/10.7759/cureus.7876
- Jensen, R. D., Bie, M., Gundso, A. P., Schmid, J. M., Juelsgaard, J., Gamborg, M. L., Mainz, H. & Rolfing, J. D. (2020). Preparing an orthopedic department for COVID-19. Acta Orthopaedica, 91(6), 644-649. https://doi.org/10.1080/17453674.2020.1817305
- Khan, J. A. & Kiani, M. R. B. (2020). Impact of multi-professional simulation-based training on perceptions of safety and preparedness among health workers caring for coronavirus disease 2019 patients in Pakistan. Journal of Educational Evaluation for Health Professions, 17, 19. https://doi.org/10.3352/jeehp.2020.17.19
- Kurup, V., Matei, V. & Ray, J. (2017). Role of in-situ simulation for training in healthcare: opportunities and challenges. Current Opinion in Anesthesiology, 30(6), 755-760. https://doi.org/10.1097/ACO.0000000000000514
- Lai, J., Ma, S., Wang, Y., Zhongxiang, C., Hu, J., Wei, N., Wu, J., Du, H., Chen, T., Li, R., Tan, H., Kang, L., Yao, L., Huang, M., Wang, H., Wang, G., Liu, Z. & Hu, S. (2020). Factors associated with mental health outcomes among health care workers exposed to coronavirus disease 2019. JAMA Network Open, 3(3), e203976. https://doi.org/10.1001/jamanetworkopen.2020.3976
- Lakissian, Z., Sabouneh, R., Zeineddine, R., Fayad, J., Banat, R. & Sharara-Chami, R. (2020). In-situ simulations for COVID-19: a safety II approach towards resilient performance. Advances in Simulation, 5, 15. https://doi.org/10.1186/s41077-020-00137-x
- Lapierre, A., Arbour, C., Maheu-Cadotte, M. A., Radermaker, M., Fontaine, G. & Lavoie, P. (2021). Effect of simulation on cognitive load in health care professionals and students: protocol for a systematic review and meta-analysis. JBI Evidence Synthesis, Online First. https://doi.org/10.11124/JBIES-20-00213
- Loh, P. S., Chaw, S. H., Shariffuddin, I., Ng, C. C., Yim, C. C. & Hashim, N. H. M. (2021). A developing nation's experience in using simulation-based training as a preparation tool for the coronavirus disease 2019 outbreak. Anesthesia and Analgesia, 132(1), 15-24. https://doi.org/10.1213/ANE.0000000000005264
- Lopreiato, J. O. (2016). Healthcare simulation dictionary. Agency for Healthcare Research and Quality. http://www.ssih.org/dictionary
- LoSavio, P. S., Eggerstedt, M., Tajudeen, B. A., Papagiannopoulos, P., Revenaugh, P. C., Batra, P. S. & Husain, I. (2020). Rapid implementation of COVID-19 tracheostomy simulation training to increase surgeon safety and confidence. American Journal of Otolaryngology, 41(5), 102574. https://doi.org/10.1016/j.amjoto.2020.102574
- Luceno-Moreno, L., Talavera-Velasco, B., Garcia-Albuerne, Y. & Martin-Garcia, J. (2020). Symptoms of posttraumatic stress, anxiety, depression, levels of resilience and burnout in Spanish health personnel during the COVID-19 pandemic. International Journal of Environmental Research and Public Health, 17(15), 5514. https://doi.org/10.3390/ijerph17155514
- Marion-Martins, A. D. & Pinho, D. L. M. (2020). Interprofessional simulation effects for healthcare students: A systematic review and meta-analysis. Nurse Education Today, 94, 104568. https://doi.org/10.1016/j.nedt.2020.104568
- Mark, M. E., LoSavio, P., Husain, I., Papagiannopoulos, P., Batra, P. S. & Tajudeen, B. A. (2020). Effect of implementing simulation education on health care worker comfort with nasopharyngeal swabbing for COVID-19. Otolaryngology – Head and Neck Surgery, 163(2), 271-274. https://doi.org/10.1177/0194599820933168
- Mensik, H. (2020). Retirees, medical students called to help treat COVID-19 patients. https://www.healthcaredive.com/news/medical-students-retirees-coronavirus-help/575066/
- Mileder, L. P., Schuttengruber, G., Prattes, J. & Wegscheider, T. (2020). Simulation-based training and assessment of mobile pre-hospital SARS-CoV-2 diagnostic teams in Styria, Austria. Medicine, 99(29), e21081. https://doi.org/10.1097/MD.0000000000021081
- Moher, D., Liberati, A., Tetzlaff, J. & Altman, D. G. (2009). Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. Annals of Internal Medicine, 151(4), 264-269. https://doi.org/10.7326/0003-4819-151-4-200908180-00135
- Montauban, P., Balakumar, C., Rait, J., Zarsadias, P., Iqbal, S., Aravind, B., Shrestha, A., Fernandes, R., Shah, A. & Collaborators. (2020). The important role of in-situ simulation in preparing surgeons for the COVID-19 pandemic. The Surgeon, 22, 22. https://doi.org/10.1016/j.surge.2020.08.013
- Mouli, T. C., Davuluri, A., Vijaya, S., Priyanka, A. D. Y. & Mishra, S. K. (2020). Effectiveness of simulation-based teaching of ventilatory management among non-anaesthesiology residents to manage the COVID-19 pandemic: A quasi-experimental cross-sectional pilot study. Indian Journal of Anaesthesia, 64, S136-S140. https://doi.org/10.4103/ija.IJA_452_20
- Munzer, B. W., Bassin, B. S., Peterson, W. J., Tucker, R. V., Doan, J., Harvey, C., Sefa, N. & Hsu, C. H. (2020). In-situ simulation use for rapid implementation and process improvement of COVID-19 airway management. Western Journal of Emergency Medicine, 21(6), 99-106. https://doi.org/10.5811/westjem.2020.7.48159
- Nguyen, L. H., Drew, D. A., Graham, M. S., Joshi, A. D., Guo, C.-G., Ma, W., Mehta, R. S., Warner, E. T., Sikavi, D. R., Lo, C.-H., Kwon, S., Song, M., Mucci, L. A., Stampfer, M. J., Willett, W. C., Eliassen, A. H., Hart, J. E., Chavarro, J. E., Rich-Edwards, J. W., Davies, R., ... & Zhang, F. (2020). Risk of COVID-19 among front-line health-care workers and the general community: a prospective cohort study. The Lancet Public Health, 5(9), e475-e483. https://doi.org/10.1016/s2468-2667(20)30164-x
- Nolan, J. P., Monsieurs, K. G., Bossaert, L., Bottiger, B. W., Greif, R., Lott, C., Madar, J., Olasveengen, T. M., Roehr, C. C., Semeraro, F., Soar, J., Van de Voorde, P., Zideman, D. A., Perkins, G. D. & European Resuscitation Council Covid-Guideline Writing Groups. (2020). European Resuscitation Council COVID-19 guidelines executive summary. Resuscitation, 153, 45-55. https://doi.org/10.1016/j.resuscitation.2020.06.001
- Patterson, M. D., Geis, G. L., Falcone, R. A., LeMaster, T. & Wears, R. L. (2013). In situ simulation: detection of safety threats and teamwork training in a high risk emergency department. BMJ Quality and Safety, 22(6), 468-477. https://doi.org/10.1136/bmjqs-2012-000942
- Reeves, S., Boet, S., Zierler, B. & Kitto, S. (2015). Interprofessional Education and Practice Guide No. 3: Evaluating Interprofessional Education. Journal of Interprofessional Care, 29(4), 305-312. https://doi.org/10.3109/13561820.2014.1003637
- Sharara-Chami, R., Sabouneh, R., Zeineddine, R., Banat, R., Fayad, J. & Lakissian, Z. (2020). In situ simulation: An essential tool for safe preparedness for the COVID-19 pandemic. Simulation in Healthcare, 15(5), 303-309. https://doi.org/10.1097/SIH.0000000000000504
- Shereen, M. A., Khan, S., Kazmi, A., Bashir, N. & Siddique, R. (2020). COVID-19 infection: Origin, transmission, and characteristics of human coronaviruses. Journal of Advanced Research, 24, 91-98. https://doi.org/10.1016/j.jare.2020.03.005
- Shi, D., Lu, H., Wang, H., Bao, S., Qian, L., Dong, X., Tao, K. & Xu, Z. (2020). A simulation training course for family medicine residents in China managing COVID-19. Australian Journal of General Practice, 49(6), 364-368. https://doi.org/10.31128/AJGP-04-20-5337
- Shrestha, A., Shrestha, A., Sonnenberg, T. & Shrestha, R. (2020). COVID-19 emergency department protocols: Experience of protocol implementation through in-situ simulation. Open Access Emergency Medicine, 12, 293-303. https://doi.org/10.2147/OAEM.S266702
- Sittner, B. J., Aebersold, M. L., Paige, J. B., Graham, L. L., Schram, A. P., Decker, S. I. & Lioce, L. (2015). INACSL standards of best practice for simulation: Past, present, and future. Nursing Education Perspectives, 36(5), 294-298. https://doi.org/10.5480/15-1670
- Slim, K., Nini, E., Forestier, D., Kwiatkowski, F., Panis, Y. & Chipponi, J. (2003). Methodological index for non-randomized studies (minors): development and validation of a new instrument. ANZ Journal of Surgery, 73(9), 712-716. https://doi.org/10.1046/j.1445-2197.2003.02748.x
- Tan, W., Ye, Y., Yang, Y., Chen, Z., Yang, X., Zhu, C., Chen, D., Tan, J. & Zhen, C. (2020). Whole-process emergency training of personal protective equipment helps healthcare workers against COVID-19: design and effect. Journal of Occupational and Environmental Medicine, 62(6), 420-423. https://doi.org/10.1097/JOM.0000000000001877
- The Cochrane Collaboration. (2020). Review Manager (RevMan) (Version 5.4.1) [Computer program]. The Nordic Cochrane Centre. https://training.cochrane.org/online-learning/core-software-cochrane-reviews/revman
- Trembley, L. L., Tobias, A. Z., Schillo, G., von Foerster, N., Singer, J., Pavelka, S. L. & Phrampus, P. (2020). A multidisciplinary intubation algorithm for suspected COVID-19 patients in the emergency department. Western Journal of Emergency Medicine, 21(4), 764-770. https://doi.org/10.5811/westjem.2020.5.47835
- Tufanaru, C., Munn, Z., Aromataris, E., Campbell, J. & Hopp, L. (2017). Chapter 3: Systematic reviews of effectiveness. In E. Aromataris & Z. Munn (Eds.), Joanna Briggs Institute Reviewer's Manual. The Joanna Briggs Institute. https://reviewersmanual.joannabriggs.org/
- Vakili, K., Fathi, M., Pezeshgi, A., Mohamadkhani, A., Hajiesmaeili, M., Rezaei-Tavirani, M. & Sayehmiri, F. (2020). Critical complications of COVID-19: A descriptive meta-analysis study. Reviews in Cardiovascular Medicine, 21(3), 433-442. https://doi.org/10.31083/j.rcm.2020.03.129
- Veritas Health Innovation Ltd. (2021). Covidence (Version 2577) [Computer software]. https://www.covidence.org/
- Villemure, C., Tanoubi, I., Georgescu, L. M., Dube, J. N. & Houle, J. (2016). An integrative review of in situ simulation training: Implications for critical care nurses. Canadian Journal of Critical Care Nursing, 27(1), 22-31.
- Wang, L. Q., Zhang, M., Liu, G. M., Nan, S. Y., Li, T., Xu, L., Xue, Y., Zhang, M., Wang, L., Qu, Y. D. & Liu, F. (2020). Psychological impact of coronavirus disease (2019) (COVID-19) epidemic on medical staff in different posts in China: A multicenter study. Journal of Psychiatric Research, 129, 198-205. https://doi.org/10.1016/j.jpsychires.2020.07.008
- Wenlock, R. D., Arnold, A., Patel, H. & Kirtchuk, D. (2020). Low-fidelity simulation of medical emergency and cardiac arrest responses in a suspected COVID-19 patient - an interim report. Clinical Medicine, 20(4), e66-e71. https://doi.org/10.7861/clinmed.2020-0142
- Xiao, X., Zhu, X., Fu, S., Hu, Y., Li, X. & Xiao, J. (2020). Psychological impact of healthcare workers in China during COVID-19 pneumonia epidemic: A multi-center cross-sectional survey investigation. Journal of Affective Disorders, 274, 405-410. https://doi.org/10.1016/j.jad.2020.05.081
- Yuriditsky, E., Horowitz, J. M., Nair, S. & Kaufman, B. S. (2021). Simulation-based uptraining improves provider comfort in the management of critically ill patients with COVID-19. Journal of Critical Care, 61, 14-17. https://doi.org/10.1016/j.jcrc.2020.09.035