Background
Owing to the infrequent emergence of disasters and the challenges associated with their management, responders unquestionably need appropriate training. Ensuring the highest standard of disaster management (DM) training is therefore of paramount importance for high-quality DM. However, the literature concerning the monitoring and evaluation (M&E) of DM training is scarce. The primary objective of this review was to document the existing M&E strategies for DM training.
Methods
The authors conducted a systematic literature search on June 28, 2023, on the PubMed, Scopus, Embase and Cochrane databases, including studies that described the learning objectives and the M&E strategy of DM training. The authors categorized the learning objectives and the evaluation methodology according to the revised Bloom’s Taxonomy and the New World Kirkpatrick model, respectively.
Results
Fifty-seven articles met the inclusion and exclusion criteria; they described DM training targeting healthcare and non-healthcare professionals and covering diverse teaching methods and topics. Five studies reported using monitoring, while all reported an evaluation methodology. The learning objectives focused on students’ ability to “Remember” (N = 50) and “Apply” (N = 44). The evaluations centred on the second level of the New World Kirkpatrick model (N = 57), with only 7 articles investigating the third level. Sixteen authors used existing, validated M&E frameworks. When correlating the learning objectives with the evaluation methodology, the authors observed a mismatch, as skills such as the students’ ability to “Apply” and “Create” were evaluated using the second level of the New World Kirkpatrick model.
Conclusions
The great heterogeneity in DM training highlights the particularity of these educational programs. The lack of monitoring and the low uptake of existing M&E frameworks point to a lack of awareness and standardization in the field. The mismatch between the learning objectives and the evaluation process can lead to deceptive evaluations, in which graduates are deemed ready to deploy yet face hardships in real-world settings, potentially resulting in unprepared responders.
Background
Disaster management (DM), as defined by the World Health Organization (WHO), involves the systematic coordination, planning, and execution of strategies designed to prepare for, respond to, and recover from catastrophic events [1]. Due to the infrequent emergence of disasters and the challenges associated with their management, assessing the performance of those responding to these events is very difficult. Therefore, to ensure responders’ preparedness, it is essential to verify that they have been trained effectively. The importance of training in the field of DM has been widely emphasized, with numerous calls for its improvement [2, 3]. Several after-action reports highlighted that effective DM requires efficient training, which not only benefits the affected population but also increases the willingness and confidence of professionals in performing their duties [4, 5].
In 2002, the United Nations recognized monitoring and evaluation (M&E) as a key element for enhancing learning and promoting education [6]. While monitoring represents a continuous process aimed at providing timely data on training progress to allow organizers to correct deficiencies early, evaluation constitutes an episodic, objective examination focused on assessing training efficiency, impact, and effectiveness [7]. Among the most common types of monitoring are process, compliance, context, financial and beneficiary monitoring, while formative, summative, midterm, final or ex-post evaluation represent some of the most widely used evaluation strategies [8].
Regrettably, despite the increasing number of training programs developed over the years, the introduction of M&E approaches has lacked standardization and consistency, remaining rather arbitrary [9,10,11,12,13]. The importance of standardized quality-management systems and tools for evaluating learning is also emphasized by the WHO’s learning strategy [14]. However, in the field of DM education and training, it remains unclear how extensively M&E approaches are implemented and whether they align with the learning objectives set by trainers.
The literature on DM core competencies is highly heterogeneous, ranging from situational awareness, communication, and personal safety measures to surge capacity principles, public health management, and ethical and legal principles [15, 16]. The diversity and complexity inherent in DM education and training pose significant challenges for their M&E [9, 17]. Indeed, DM training programs currently encompass a wide range of topics, from disaster medical response and patient management to planning and coordination strategies, and utilize various teaching methods, such as e-learning, traditional classroom lectures, and multiple types of simulations [10].
Therefore, the primary objective of this review was to identify the existing M&E strategies for DM training. The secondary objective was to analyse the relationships between the learning objectives and the evaluation methods (Table 1).
[IMAGE OMITTED: SEE PDF]
Methods
The present scoping review followed the Preferred Reporting Items for Systematic Reviews and Meta-Analyses extension for Scoping Reviews (PRISMA-ScR) [19]. Prior to the literature search, the authors developed a review protocol according to the recommendations of Peters et al. [20, 21].
A systematic literature search was conducted on June 28, 2023, on PubMed, Scopus, Embase and Cochrane databases. The search strings were created in collaboration with a professional librarian, combining three different search terms: “disaster”, “training”, and “evaluation/monitoring”.
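Although the exact strings are not reproduced here, their three-block structure can be illustrated programmatically. The following sketch is illustrative only: the query terms, field tags, and the use of Biopython’s Entrez interface are assumptions for demonstration and do not reproduce the authors’ actual search strategy.

```python
# Illustrative sketch only: the terms, field tags and retrieval code below
# are assumptions for demonstration, not the review's actual search strings.
from Bio import Entrez  # Biopython

Entrez.email = "reviewer@example.org"  # NCBI requires a contact address

# Three concept blocks combined with AND, mirroring the structure described
# above: "disaster", "training", and "evaluation/monitoring".
query = (
    "(disaster*[Title/Abstract]) AND "
    "(training[Title/Abstract] OR education[Title/Abstract]) AND "
    "(evaluation[Title/Abstract] OR monitoring[Title/Abstract])"
)

handle = Entrez.esearch(db="pubmed", term=query, retmax=100)
record = Entrez.read(handle)
handle.close()

print(f"{record['Count']} records found")
print(record["IdList"][:10])  # first ten PubMed identifiers
```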
The identified articles were uploaded into the Rayyan Intelligent Systematic Review tool, and duplicates were automatically removed [22]. Title and abstract screenings were independently conducted by two researchers in accordance with predefined inclusion criteria. Following this initial phase, articles meeting the inclusion criteria underwent full-text examination. In instances where full-text versions were unavailable, efforts were made to establish communication with corresponding authors. Discrepancies in screening outcomes were reconciled through mutual consensus at the conclusion of each phase.
Inclusion and exclusion criteria
Articles that described a monitoring and/or evaluation approach, tool, or framework used for disaster management courses aimed at disaster management professionals were included. Disaster management professionals were considered to be all skilled workers in any of the fields encompassed by the WHO definition of “Disaster Management”, including those currently in training (e.g. undergraduates) [23]. The trainees were divided into healthcare workers, defined as those categorised by the WHO as health or health associate professionals, and non-healthcare workers, comprising the professions that are involved in disaster response but do not pertain to the aforementioned professional groups [24].
To be included, a learning program had to have clear learning objectives and an M&E framework. The objectives had to specify measurable outcomes, such as the skills or knowledge participants were expected to acquire [25, 26]. Original research articles, whether peer-reviewed or not and whether empirical or theoretical, written in English, Arabic, French, Romanian, or Italian were included. Reviews, commentaries, editorials, conference papers, letters to editors, and anecdotal accounts were excluded.
Records that did not address the subject under investigation or for which the full text was not available (either directly or after contacting the authors) were excluded. Similarly, records were excluded if the courses were addressed to nonprofessional workers, did not present the courses’ learning objectives or did not present the M&E framework.
Data extraction and analysis
The included records were organized into a Microsoft Excel, Version 2408 (Microsoft Corporation, Redmond, Washington, U.S.) spreadsheet, and the data were extracted with a focus on course characteristics (course subject, participants, training methodology), learning objectives, and the M&E process (methodology, development process, evaluated items, questions used for evaluation, outcomes, follow-up duration and the fundamental reason for performing M&E). The training topics were categorized following Sarin et al. [27], whereas the learning objectives were indexed according to the revised Bloom’s taxonomy of Anderson et al., chosen for its strong theoretical background and wide usage [28]. Accordingly, the learning objectives were categorized into those addressing the trainees’ ability to “Remember”, “Understand”, “Apply”, “Analyse”, “Evaluate” and “Create”. According to the revised taxonomy, “Remembering” involves recalling knowledge from memory, while “Understanding” entails interpreting or making meaning from messages. “Applying” refers to using knowledge in practice, such as through simulations, and “Analysing” requires organizing or differentiating message components. “Evaluating” involves making judgments and offering critiques or recommendations, and “Creating” focuses on producing new knowledge or synthesizing information from learned material [28].
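Such indexing is essentially a coding exercise, mapping the action verbs of each stated objective to a taxonomy level. A minimal sketch of one possible operationalization follows; the verb lists are a hypothetical coding scheme, not the categorization rules applied in this review.

```python
# Hypothetical coding scheme: maps action verbs found in learning objectives
# to levels of the revised Bloom's taxonomy. The verb lists are illustrative,
# not the rules used by the reviewers.
BLOOM_LEVELS = {
    "Remember":   {"define", "list", "recall", "recognize", "identify"},
    "Understand": {"describe", "explain", "summarize", "interpret"},
    "Apply":      {"demonstrate", "perform", "use", "implement", "triage"},
    "Analyse":    {"differentiate", "organize", "compare", "examine"},
    "Evaluate":   {"judge", "critique", "assess", "recommend"},
    "Create":     {"design", "develop", "formulate", "produce"},
}

def classify_objective(objective: str) -> set:
    """Return every Bloom level whose verbs appear in the objective text."""
    words = set(objective.lower().split())
    return {level for level, verbs in BLOOM_LEVELS.items() if words & verbs}

# An objective may map to more than one level:
print(classify_objective("Demonstrate triage and describe the incident command system"))
# -> {'Apply', 'Understand'}
```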
The type of training used was categorized into “classroom-based”, “simulation-based” and “technology-based”, with the simulation types being further classified according to the WHO’s simulation exercise manual [29, 30]. The evaluation methodology was indexed according to the Preparedness and Emergency Response Learning Centre’s framework and Cochrane’s recommendations for data collection [31, 32], whereas the evaluation processes were grouped according to the levels indicated by the New World Kirkpatrick framework [33].
As part of the analysis, the reviewers examined how frequently each evaluation methodology was used for the different training topics. The relationships between the learning objectives, categorized according to the revised Bloom’s taxonomy, and the evaluation levels defined by the New World Kirkpatrick framework were also investigated. To visualize these connections and their relative magnitudes, a Sankey plot was used.
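A Sankey diagram of this kind can be generated with standard plotting libraries. The minimal sketch below, assuming the plotly library, uses the subset of counts reported in the Results for three of the six Bloom levels; it illustrates the construction of such a figure rather than reproducing Fig. 4.

```python
# Minimal Sankey sketch linking Bloom levels (left) to New World Kirkpatrick
# levels (right). Flow values are the counts reported in the Results for
# three of the six Bloom levels; the layout is illustrative, not Fig. 4.
import plotly.graph_objects as go

labels = ["Remember", "Understand", "Apply",                     # Bloom (sources)
          "Kirkpatrick L1", "Kirkpatrick L2", "Kirkpatrick L3"]  # Kirkpatrick (targets)

# Each link connects a Bloom node index to a Kirkpatrick node index.
links = dict(
    source=[0, 0, 1, 1, 1, 2, 2],
    target=[4, 5, 3, 4, 5, 3, 4],
    value=[50, 6, 26, 38, 3, 31, 44],
)

fig = go.Figure(go.Sankey(node=dict(label=labels, pad=20), link=links))
fig.update_layout(title_text="Learning objectives vs. evaluation levels (illustrative)")
fig.write_html("sankey.html")  # or fig.show() in an interactive session
```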
Results
A total of 5,721 articles were identified through the initial database search. An additional 23 records were included through other sources, such as cross-referencing and keyword searches. After removing duplicates, the titles and abstracts of 3,853 records were screened. Of these, 271 were selected for full-text review, resulting in 57 articles that met the established inclusion and exclusion criteria. Detailed information on the article selection process is illustrated in the PRISMA diagram (Fig. 1). A comprehensive summary of the extracted data is provided in Additional File 1.
[IMAGE OMITTED: SEE PDF]
Training features
Various healthcare professionals, including undergraduates (N = 20), nurses (N = 21), medical doctors (N = 16), residents (N = 6), emergency medical service professionals (N = 10), public health workers (N = 6), humanitarian professionals (N = 1) and pharmacists (N = 1), were targeted by the identified DM training programs. Additionally, nonhealthcare professionals such as hospital administrators (N = 5), logistics specialists (N = 2), government officials (N = 2), security workers (N = 2), firefighters (N = 2), and personnel involved in military programs (N = 1) were included in the studies. (Additional File 1).
Classroom-based training was used in 41 trainings, including lectures (N = 41), group work (N = 17), videos (N = 5), case studies (N = 4) and serious games (N = 1). Technology-based training was used either as blended learning or as a standalone training method by 15 authors. Distance-based online learning (N = 14) was used by most of these training organizers, whereas computer-based simulation was reported in 7 trainings and virtual reality simulation in 1. Simulations were used in most of the trainings (N = 42), the main type being drills (N = 24), followed by tabletop simulations (N = 20), full-scale exercises (N = 6) and functional exercises (N = 4). One article did not mention the training methodology.
The most recurrent training topics were Incident Command System (N = 29), Triage (N = 22), Communication (N = 22), Hospital Disaster Management (N = 20) and Public health during Disasters (N = 19). Adult teaching, Legal and Ethics, Community disaster preparedness, and Safety and Security were each present in 3 articles, while Mental health and Health consequences of disasters were each taught in 2 trainings.
The trainings were carried out in various economic contexts, ranging from high-income countries in North America and Europe to middle-income countries in the Middle East (Additional File 1).
Monitoring and evaluation
Five studies reported the implementation of a monitoring process throughout the course (Additional File 1). In 2 papers, course organizers reported using questionnaires to monitor the training, whereas interviews were conducted in 2 other studies. Additionally, one author mentioned employing both formal and informal feedback. None of the studies included a validation methodology for their monitoring processes. In contrast, all included articles evaluated the described DM training. According to the New World Kirkpatrick Model, all the studies assessed the learning outcomes of the trainees at the second level of the model (N = 57), focusing on “Knowledge” (N = 50), “Skills” (N = 24), and “Attitude” (N = 20), compared with “Confidence” (N = 15) and “Commitment” (N = 4). The first level was evaluated in 41 articles, with the majority reporting on “Satisfaction” (N = 38) and “Relevance” (N = 20), whereas “Engagement” was examined in 6 articles. Seven articles investigated the third level of the framework (Fig. 2).
[IMAGE OMITTED: SEE PDF]
The predominant evaluation method used was the pre-post-test, defined as an assessment of the participants’ knowledge and skills before and after the training (N = 41). Deploying a questionnaire either at the beginning or the end of training was a commonly employed method for evaluating training outcomes (N = 27). Practical simulations, ranging from tabletops to full-scale exercises, were utilized for evaluation purposes in 7 articles. Conducting interviews with participants and organizing structured discussion groups were reported by 6 studies, while 4 cases utilized formally structured feedback or informal discussions. Finally, observations of participants’ behaviours, interactions, and activities during the course, which were based on a predefined set of criteria, were reported by 3 articles (Additional File 1). However, only one article mentioned using a validated framework for analysing the students’ behaviour, while the other two developed their own set of criteria [34]. None of the articles linked their findings to a behavioural change theory.
A follow-up evaluation, defined as an assessment conducted at a significant interval after the conclusion of the course, was reported by 18 papers. Among them, 3 evaluations were performed within one month after the training ended, 8 between one and six months, 7 between six months and one year and 1 after more than one year, while 1 study did not specify the exact timing of the follow-up evaluation. Notably, theoretical frameworks meant to validate the M&E methodologies were employed by only 16 authors. Pilot studies were described in 10 articles, whereas expert opinion served as the primary validation methodology in 8 cases. Details regarding the frameworks used by the authors can be found in Additional File 1.
Training topics and their relationships with the evaluation methodology
The evaluation methods for different training topics revealed that triage, introduction to disaster management and terminology, and the incident command system were assessed via all available methodologies. Hospital disaster management, prehospital disaster management, chemical, biological, radiological, and nuclear emergencies (CBRNE), communication, and public health during disasters were evaluated via a wide range of methodologies. In contrast, government- and NGO-sponsored response teams and mental health were evaluated via two and one methodologies, respectively (Fig. 3).
[IMAGE OMITTED: SEE PDF]
The correlation between learning objectives and M&E
The learning objectives outlined by the studies focused primarily on the "Remember" level of the revised Bloom's taxonomy (N = 50), followed by "Apply" (N = 44) and "Understand" (N = 38) (Fig. 4). Eight trainings were aimed at "Create," 7 at "Analyse," and 6 at "Evaluate". The "Remember" level was assessed primarily via Level 2 of the New World Kirkpatrick model (N = 50), whereas Level 3 was addressed by 6 studies. The "Apply" objectives were primarily evaluated via Level 2 (N = 44) and Level 1 (N = 31) of the Kirkpatrick model. Similarly, "Understand" objectives were assessed via Level 2 (N = 38) and Level 1 (N = 26), whereas Level 3 was explored in 3 studies. The learning objectives aimed at "Analyse," "Evaluate," and "Create" were investigated primarily via Level 2 (N = 7, N = 6, N = 8) and Level 1 (N = 6, N = 5, N = 5), with Level 3 being the least addressed (N = 2, N = 1, N = 1).
[IMAGE OMITTED: SEE PDF]
Discussion
The present scoping review examined the M&E strategies currently used in DM training. By analysing the main characteristics of DM training, indexing their learning objectives, and studying the M&E processes used, this study highlights the significant diversity in the application of M&E methodologies and frameworks and reveals a mismatch between the stated learning objectives and the M&E process.
In accordance with the literature, there is significant heterogeneity in the typology, topics, and teaching methodologies of DM training, which can be attributed to factors such as the absence of a standardized training protocol and governance in the field [10, 29, 35]. Training participants exhibit a broad spectrum of professional backgrounds, with a notable majority coming from healthcare, highlighting the predominant focus on medical DM over nonmedical sectors [15]. This trend seems to persist despite the literature's call to adopt an interprofessional approach to DM [36, 37]. The training methodologies reported by the studies predominantly favour classical, face-to-face lectures, notwithstanding reports from several authors advocating for the integration of technology into their educational programs. Most of the articles reported the use of simulations, particularly those requiring minimal human or material resources. It is indeed well established that the use of tabletop simulations and drills enhances student engagement and learning while minimizing costs and infrastructure requirements [38,39,40,41]. The scarce use of computer-based or virtual reality simulations, although proven to be useful and cost-efficient in the long term as simulation tools, could be explained by the high initial cost and low availability of high-quality, easy-to-use tools [42,43,44]. Nevertheless, existing evidence suggests that employing multiple learning methods, particularly interactive and simulation-based methods, yields superior results [45,46,47,48]. The training topics reported in our study are consistent with those in the literature, emphasizing the competencies that are considered fundamental for disaster response [10, 16, 49, 50]. Several after-action reports highlight the communication deficiencies observed during disaster responses [51,52,53,54,55]. Consequently, the strong focus on communication skills in DM training aligns with these observations, paving the way for more effective and efficient soft skills training in the field.
Monitoring was reported by a very small fraction of the studies, which could be explained by time constraints in delivering the courses, but could also derive from a reporting bias. Moreover, those who did report it omitted a validation methodology, which could be explained by the lack of guidelines in this domain or by a lack of awareness regarding its importance. While delivering training, there are several methods for monitoring, ranging from formal or informal feedback, discussion groups, and interviews to questionnaires and technology-based solutions [56, 57]. By not properly monitoring the participants throughout the course, training organizers lack the information needed to improve the programme while it is ongoing. The absence of participants’ immediate feedback or engagement tracking can result in trainees’ disinterest and inefficient learning [58, 59]. Furthermore, without a structured monitoring process, training organizers may fail to identify unmet learning objectives, leading to gaps in knowledge and skills that will need to be filled at a later time [60].
Given the urgency and unpredictability of disasters, it is crucial to have responders continuously trained and prepared to integrate their knowledge into practice, making training evaluation of paramount importance. The prevalent use of the second level of the New World Kirkpatrick Model in the evaluation of all included studies underscores the emphasis the training organizers put on assessing skills and knowledge. This level is typically measured via nonparticipatory methods, such as multiple-choice questions or skill tests, which do not provide an explanation of the learning outcome. Additionally, focusing predominantly on skills and knowledge acquisition leaves gaps in understanding whether trainees can effectively apply what they have learned or whether the acquired knowledge will translate into actual competencies and influence their behaviour during disaster response [61, 62]. Moreover, without objectively assessing students’ ability to apply newly acquired knowledge and skills, the included papers rely on self-reported confidence and reported skills, which are widely recognized as not being correlated with actual real-life behaviour [63,64,65].
The limited number of articles evaluating behavioural change following the training activity highlights a significant drawback in the M&E of DM training programmes, a finding that is in accordance with the literature [11, 66, 67]. Existing risk communication studies have consistently shown a weak correlation between being knowledgeable about preparedness strategies and translating that knowledge into actual preparedness actions [68,69,70]. Hence, without assessing behavioural changes, training organizers cannot ascertain the training's impact on participants' DM response performance, leading to uncertainty regarding the effectiveness of the capacity-building programme. One reason that could explain this finding is the absence of validated frameworks tailored to the specific needs of DM training. Measuring behavioural changes during an actual disaster response is indeed challenging owing to the multitude of factors influencing response efforts, compounded by the infrequency of disasters [66]. Another possible reason is the lack of practical guidance on how to adapt existing evaluation methods to the particularities of DM.
A small number of studies reported a longitudinal assessment of the participants after the training’s completion, with the majority reassessing trainees within the first 6 months post training. These findings align with previous research underscoring the importance of longitudinal evaluation to comprehensively grasp the complexity involved in the learning process, the transfer of knowledge into practice, and the long-term effects of training [71].
The evaluation methodologies employed, such as pre-post-tests and questionnaires, are well documented in the literature as being widely used owing to their quantitative, measurable and comparable nature, along with their ease of use, given trainers’ familiarity with these tools [43, 72, 73]. However, these methods have several limitations, including their inability to collect qualitative data on changes in students’ attitudes and their susceptibility to biases, such as response shifts, practice effects, and spontaneous improvement [72, 74, 75].
Formal and informal feedback, alongside qualitative evaluation, were identified in only a small number of papers. There is compelling evidence indicating that these methods of data collection can greatly aid educators in conducting a more comprehensive assessment of a course’s efficacy and efficiency, thus equipping them with the invaluable insights required to improve the course [76, 77].
Despite evidence from the literature that standardized protocols of M&E enhance training quality and improve participant outcomes [78], only a limited number of studies have employed established, validated frameworks or pilot studies for their M&E processes, potentially contributing to the observed variability among reported methodologies. The absence of a framework tailored to DM characteristics and a limited awareness of the subject may also contribute to this situation [13].
The wider range of methodologies used to assess prevalent and recurring subjects in DM underscores the role of expertise and familiarity: frequently studied subjects are evaluated with more varied approaches than less frequently studied ones. Employing multiple evaluation approaches for the same subject enables instructors to gain a comprehensive understanding of the learning process, thereby facilitating the formulation of pertinent conclusions regarding course outcomes. Triage and incident command systems, for example, are extensively documented in the literature, featuring validated key performance indicators and numerous concrete evaluation examples [79,80,81,82].
However, several less common topics were assessed via a limited range of methods, primarily pre-post-tests and questionnaires. For example, evaluating mental health training with these methodologies might prove ineffective due to the complexity of interpersonal skills and the practical nature of mental health interventions, aspects that are inadequately captured by solely quantitative measurement tools [83]. Similarly, training programs for government and NGO-sponsored response teams represent a relatively new concept that is increasingly becoming professionally managed and standardized, with significant efforts from the WHO to support their development [84]. These teams have unique characteristics, including diverse team member backgrounds, limited prior collaboration, and varied professional experiences, all of which complicate the implementation of M&E. This complexity may explain the limited use of diverse evaluation methodologies in assessing these programs [36, 85, 86].
When the learning objectives declared by the authors were analysed and categorized according to the revised Bloom's taxonomy, it became apparent that most of the objectives focused on trainees’ ability to “Remember”, followed by the abilities to “Apply” and “Understand”. The small proportion of training organizers that aimed at improving the trainees’ ability to “Create”, “Analyse” and “Evaluate”, which are at the core of developing critical thinking skills, highlights the need to integrate these goals into future training.
Furthermore, the correlation between the revised Bloom's taxonomy levels (learning objectives) and the New World Kirkpatrick levels (evaluation methodology) reveals that a significant number of the articles have assessed trainees' skills (ability to "Apply" and "Create") using the second level of the New World Kirkpatrick framework as opposed to the third level. The second level of the New World Kirkpatrick Model aims at evaluating trainees’ skills and knowledge by the end of the course, documenting whether they have learned the concepts that were taught. In contrast with the third level, the second level does not analyse whether the students are able to apply or integrate the newly developed skills in their day-to-day life.
Moreover, only a small number of studies evaluated course effectiveness via simulations. While evaluating participants' performance through simulations has merits, it is important to note that simulations do not accurately reflect behavioural change and are not necessarily comparable to real-life improvements, as suggested by existing evidence [62]. The literature encourages the evaluation of behavioural change (Level 3 of the New World Kirkpatrick Model) to ensure knowledge and skill transfer to the job, with a wide variety of tools being available [87, 88]. By not analysing trainees’ behavioural changes and the impact of training on their day-to-day work, the correlation between acquired skills and knowledge and their impact on real-life DM cannot be established with certainty. In addition, dangerous conclusions could be drawn from a deceptive evaluation, given that some of the included articles deem their graduates ready to deploy upon course completion. As a result, trainees might face hardship when dealing with real-life situations, even though the post-training evaluation of their skills or knowledge was satisfactory.
This mismatch between the learning objectives and the evaluated outcomes has the potential to produce unreliable evaluation results, possibly impacting patients.
While documenting this review, the authors observed a gap in the literature and frameworks on linking the learning objectives with the M&E strategies in DM capacity-building programs. Without clear learning objectives and a strong correlation between them and the M&E strategy, false conclusions could be drawn while analysing the outcomes of training. Future research is needed to fill this scientific gap.
Limitations
There are two main limitations to the current review. First, despite screening multiple databases, it is important to note that PubMed and Embase primarily consist of medical and pharmaceutical content; it is therefore understandable that the training topics and participants predominantly pertain to the healthcare sector. Second, it was not possible to establish a direct correlation between each learning objective and its respective evaluation criteria. When constructing Fig. 4, the entire set of learning objectives was correlated with the complete evaluation protocol of each study instead of correlating each objective with its corresponding evaluation criteria.
Data availability
Data is provided within the manuscript or supplementary information files.
World Health Organization. Glossary of health emergency and disaster risk management terminology. Geneva: World Health Organization; 2020. Cited 2023 Apr 13.
Berhanu N, Abrha H, Ejigu Y, Woldemichael K. Knowledge, experiences and training needs of health professionals about disaster preparedness and response in southwest Ethiopia: a cross sectional study. Ethiop J Health Sci. 2016;26(5):415–26. https://doi.org/10.4314/ejhs.v26i5.3.
Li S, Gillani AH, Ibrahim MMIM, Omer S, Fang Y. Should we focus more on teaching and training disaster management in health-care colleges? An insight into the students’ knowledge, attitude, and readiness to practice. J Pharm Bioallied Sci. 2022;14(3):147. https://doi.org/10.4103/jpbs.jpbs_420_21.
Kahn LH, Barondess JA. Preparing for disaster: response matrices in the USA and UK. J Urban Health. 2008;85(6):910–22. https://doi.org/10.1007/s11524-008-9310-y.
Yin H, He H, Arbon P, Zhu J, Tan J, Zhang L. Optimal qualifications, staffing and scope of practice for first responder nurses in disaster. J Clin Nurs. 2012;21(1–2):264–71. https://doi.org/10.1111/j.1365-2702.2011.03790.x.
UNESCO. United Nations Decade of Education for Sustainable Development (2005–2014): international implementation scheme - UNESCO Digital Library. Cited 2024 Jul 16.
UNICEF Evaluation Office. A UNICEF Guide for Monitoring and Evaluation. Humanitarian UNICEF. Cited 2024 Jul 16.
International Federation of Red Cross and Red Crescent Societies. Project/programme monitoring and evaluation (M&E) guide. Cited 2024 Dec 20.
Ingrassia PL, Foletti M, Djalali A, Scarone P, Ragazzoni L, Corte FD, et al. Education and training initiatives for crisis management in the European Union: a web-based analysis of available programs. Prehosp Disaster Med. 2014;29(2):115–26. https://doi.org/10.1017/S1049023X14000235.
Voicescu GT, Valente M, Della Corte F, Becerril M, Ragazzoni L, Caviglia M. Medical students’ education in disaster medicine: A systematic literature review of existing curricula. Int J Disaster Risk Reduct. 2022;1(77):103090. https://doi.org/10.1016/j.ijdrr.2022.103090.
Ashcroft J, Byrne MHV, Brennan PA, Davies RJ. Preparing medical students for a pandemic: a systematic review of student disaster training programmes. Postgrad Med J. 2021;97(1148):368–79. https://doi.org/10.1136/postgradmedj-2020-137906.
Brand M, Kerby D, Elledge B, Johnson D, Magas O. A model for assessing public health emergency preparedness competencies and evaluating training based on the local preparedness plan. J Homeland Secur Emerg Manage. 2006;23:3. https://doi.org/10.2202/1547-7355.1182.
Beerens RJJ, Tehler H, Pelzer B. How can we make disaster management evaluations more useful? An empirical study of dutch exercise evaluations. Int J Disaster Risk Sci. 2020;11(5):578–91. https://doi.org/10.1007/s13753-020-00286-7.
World Health Organisation. WHO health emergency programme learning strategies. Cited 2024 Jul 16.
Ripoll Gallardo A, Djalali A, Foletti M, Ragazzoni L, Della Corte F, Lupescu O, et al. Core competencies in disaster management and humanitarian assistance: a systematic review. Disaster Med Public Health Prep. 2015;5(1):1–10. https://doi.org/10.1017/dmp.2015.24.
Walsh L, Subbarao I, Gebbie K, Schor KW, Lyznicki J, Strauss-Riggs K, et al. Core competencies for disaster medicine and public health. Disaster Med Public Health Prep. 2012;6(1):44–52. https://doi.org/10.1001/dmp.2012.4.
Lennquist S. Education and training in disaster medicine. Scand J Surg. 2005;94(4):300–10. https://doi.org/10.1177/145749690509400409.
Feldner K, Dutka P. Exploring the evidence: generating a research question: using the PICOT framework for clinical inquiry. Nephrol Nurs J. 2024;51(4):393–5. https://doi.org/10.37526/1526-744X.2024.51.4.393.
Tricco AC, Lillie E, Zarin W, O’Brien KK, Colquhoun H, Levac D, et al. PRISMA Extension for Scoping Reviews (PRISMA-ScR): Checklist and Explanation. Ann Intern Med. 2018;169(7):467–73. https://doi.org/10.7326/M18-0850.
Peters MDJ, Marnie C, Tricco AC, Pollock D, Munn Z, Alexander L, et al. Updated methodological guidance for the conduct of scoping reviews. JBI Evid Synth. 2020;18(10):2119–26. https://doi.org/10.11124/JBIES-20-00167.
Peters MDJ, Godfrey CM, Khalil H, McInerney P, Parker D, Soares CB. Guidance for conducting systematic scoping reviews. JBI Evidence Implementation. 2015;13(3):141. https://doi.org/10.1097/XEB.0000000000000050.
Ouzzani M, Hammady H, Fedorowicz Z, Elmagarmid A. Rayyan - a web and mobile app for systematic reviews. Syst Rev. 2016;5:210. https://doi.org/10.1186/s13643-016-0384-4.
United Nations Disaster Risk Reduction. Disaster management. 2017 Cited 2024 Jul 16.
World Health Organisation. Classifying health workers. Cited 2024 Dec 20.
Thomas PA, Kern DE, Hughes MT, Tackett SA, Chen BY. Curriculum Development for Medical Education: A Six-Step Approach. JHU Press; 2022. p. 389.
Chatterjee D, Corral J. How to write well-defined learning objectives. J Educ Perioper Med. 2017;19(4):E610 (PMID:29766034).
Sarin RR, Biddinger P, Brown J, Burstein JL, Burkle FM, Char D, et al. Core Disaster Medicine Education (CDME) for emergency medicine residents in the United States. Prehosp Disaster med. 2019;34(05):473–80. https://doi.org/10.1017/S1049023X19004746.
Anderson L, Krathwohl D. A Taxonomy for Learning, Teaching, and Assessing: A Revision of Bloom’s Taxonomy of Educational Objectives: complete edition. 1st ed. Addison Wesley Longman, Inc.; 2000. 352 p.
Baetzner AS, Wespi R, Hill Y, Gyllencreutz L, Sauter TC, Saveman B-I, et al. Preparing medical first responders for crises: a systematic literature review of disaster training programs and their effectiveness. Scand J Trauma Resusc Emerg Med. 2022;30(1):76. https://doi.org/10.1186/s13049-022-01056-8.
World Health Organisation. Simulation exercise manual: a practical guide and tool for planning, conducting and evaluating simulation exercises for outbreaks and public health emergency preparedness and response. World Health Organization; 2017. p. 69. Cited 2024 Jun 16.
Cochrane Training. Methods for data collection. Cited 2024 Jul 17.
Preparedness and Emergency Response Learning Centers. Kirkpatrick evaluation tool Matrix. Cited 2024 Jul 17.
Kirkpatrick J, Kirkpatrick W. An Introduction to the New World Kirkpatrick® Model. Cited 2024 Jul 17.
Heaslip G, Stuns K-K. Effectiveness of humanitarian logistics training: The Finnish Red Cross (FRC) Emergency Response Unit (ERU). J Humanitarian Logistics Supply Chain Manag. 2019;9(2):196–220. https://doi.org/10.1108/JHLSCM-12-2018-0080.
Sarin RR, Cattamanchi S, Alqahtani A, Aljohani M, Keim M, Ciottone GR. Disaster education: a survey study to analyze disaster medicine training in emergency medicine residency programs in the United States. Prehosp Disaster med. 2017;32(4):368–73. https://doi.org/10.1017/S1049023X17000267.
Camacho NA, Hughes A, Burkle FM Jr, Ingrassia PL, Ragazzoni L, Redmond A, et al. Education and Training of Emergency Medical Teams: Recommendations for a Global Operational Learning Framework. PLoS Curr. 2016. Cited 2024 Jul 17. https://doi.org/10.1371/currents.dis.292033689209611ad5e4a7a3e61520d0
Atack L, Parker K, Rocchi M, Maher J, Dryden T. The impact of an online interprofessional course in disaster management competency and attitude towards interprofessional learning. J Interprof Care. 2009;1(23):586–98. https://doi.org/10.3109/13561820902886238.
Savoia E, Biddinger PD, Fox P, Levin DE, Stone L, Stoto MA. Impact of tabletop exercises on participants’ knowledge of and confidence in legal authorities for infectious disease emergencies. Disaster Med Public Health Prep. 2009;3(2):104–10. https://doi.org/10.1097/DMP.0b013e3181a539bc.
Phan Q, Geller DE, Broughton AS, Swan BA, Wells JS. Evaluating a low-cost disaster preparedness simulation for prelicensure nursing students. Disaster Med Public Health Prep. 2023;17:e343. https://doi.org/10.1017/dmp.2022.280.
Liu J, Huang Y, Li B, Gui L, Zhou L. Development and evaluation of innovative and practical table-top exercises based on a real mass-casualty incident. Disaster Med Public Health Prep. 2023;17:e200. https://doi.org/10.1017/dmp.2022.95.
Thornton M, Iannotti A, Quaranta R, Russo C. The effectiveness of table-top exercises in improving pandemic crisis preparedness. Int J Safety Secur Eng. 2021;11:463–71. https://doi.org/10.18280/ijsse.110420.
Farra SL, Gneuhs M, Hodgson E, Kawosa B, Miller ET, Simon A, et al. Comparative cost of virtual reality training and live exercises for training hospital workers for evacuation. Comput Inform Nurs. 2019;37(9):446–54. https://doi.org/10.1097/CIN.0000000000000540.
Abbas JR, Chu MMH, Jeyarajah C, Isba R, Payton A, McGrath B, et al. Virtual reality in simulation-based emergency skills training: A systematic review with a narrative synthesis. Resuscitation Plus. 2023;1(16):100484. https://doi.org/10.1016/j.resplu.2023.100484.
Alshowair A, Bail J, AlSuwailem F, Mostafa A, Abdel-Azeem A. Use of virtual reality exercises in disaster preparedness training: A scoping review. SAGE Open Med. 2024;14(12):20503121241241936. https://doi.org/10.1177/20503121241241936.
Sharpe R, Benfield G, Roberts G. The undergraduate experience of blended e-learning: a review of UK literature and practice. Higher Education Academy; 2006. p. 103. Cited 2024 Jun 16.
Ingrassia PL, Ragazzoni L, Tengattini M, Carenzo L, Corte FD. Nationwide program of education for undergraduates in the field of disaster medicine: development of a core curriculum centered on blended learning and simulation tools. Prehosp Disaster Med. 2014;29(5):508–15. https://doi.org/10.1017/S1049023X14000831.
Ratiani M, Kitiashvili A, Labartkava N, Sadunishvili P, Tsereteli E, Gvetadze N. Teaching Disaster Risk Reduction with Interactive Methods. Vol. 1. National Curriculum and Assessment Centre. p. 112. Cited 2024 Jul 17.
Khorram-Manesh A, Lupesco O, Friedl T, Arnim G, Kaptan K, Djalali AR, et al. Education in disaster management: what do we offer and what do we need? Proposing a new global program. Disaster Med Public Health Prep. 2016;10(6):854–73. https://doi.org/10.1017/dmp.2016.88.
International Council of Nursing. Core competencies in disaster nursing. Version 2.0. 2019.p.16. Cited 2024 Jul 17.
Al Thobaity A, Plummer V, Williams B. What are the most common domains of the core competencies of disaster nursing? A scoping review. Int Emerg Nurs. 2017;31:64–71. https://doi.org/10.1016/j.ienj.2016.10.003.
National Commission on Terrorist Attacks. The 9/11 Commission Report. United States of America; 2004. p. 585. Cited 2024 Jul 17.
Huang JS, Lien YN. Challenges of emergency communication network for disaster response. In: 2012 IEEE International Conference on Communication Systems (ICCS). 2012. p. 528–32. Cited 2024 Jul 17. https://doi.org/10.1109/ICCS.2012.6406204
Oden RVN, Militello LG, Ross KG, Lopez CE. Four key challenges in disaster response. Proc Human Factors Ergonomics Soc Annual Meet. 2012;56(1):488–92. https://doi.org/10.1177/1071181312561050.
The Kerslake Report: An independent review into the preparedness for, and emergency response to, the Manchester Arena attack on 22nd May 2017. Cited 2024 Jul 17.
Federal Emergency Management Agency National Exercise Division. 1 October After-Action Report. Cited 2024 Jul 17.
Cotton K. Monitoring Student Learning in the Classroom. 1988 Cited 2024 Jul 26.
Yin Albert CC, Sun Y, Li G, Peng J, Ran F, Wang Z, et al. Identifying and monitoring students’ classroom learning behavior based on multisource information. Mob Inf Syst. 2022;2022(1):9903342. https://doi.org/10.1155/2022/9903342.
Kraiger K, Passmore J, Rebelo dos Santos N, Malvezzi S. The Wiley-Blackwell Handbook of the Psychology of Training, Development, and Performance Improvement. John Wiley & Sons; 2015.
Salas E, Tannenbaum SI, Kraiger K, Smith-Jentsch KA. The science of training and development in organizations: what matters in practice. Psychol Sci Public Interest. 2012;13(2):74–101. https://doi.org/10.1177/1529100612436661.
Gasiewski JA, Eagan MK, Garcia GA, Hurtado S, Chang MJ. From Gatekeeping to engagement: A multicontextual, mixed method study of student academic engagement in introductory STEM courses. Res High Educ. 2012;53(2):229–61. https://doi.org/10.1007/s11162-011-9247-y.
Elliott PH. Linking Learning to Performance. Cited 2024 Jul 19.
Weersink K, Hall AK, Rich J, Szulewski A, Dagnone JD. Simulation versus real-world performance: a direct comparison of emergency medicine resident resuscitation entrustment scoring. Adv Simul. 2019;4(1):9. https://doi.org/10.1186/s41077-019-0099-4.
Barnsley L, Lyon PM, Ralston SJ, Hibbert EJ, Cunningham I, Gordon FC, et al. Clinical skills in junior medical officers: a comparison of self-reported confidence and observed competence. Med Educ. 2004;38(4):358–67. https://doi.org/10.1046/j.1365-2923.2004.01773.x.
Rodgers DL, Bhanji F, McKee BR. Written evaluation is not a predictor for skills performance in an Advanced Cardiovascular Life Support course. Resuscitation. 2010;81(4):453–6. https://doi.org/10.1016/j.resuscitation.2009.12.018.
Liaw SY, Scherpbier A, Rethans J-J, Klainin-Yobas P. Assessment for simulation learning outcomes: a comparison of knowledge and self-reported confidence with observed clinical performance. Nurse Educ Today. 2012;32(6):e35-39. https://doi.org/10.1016/j.nedt.2011.10.006.
Noh J, Oh EG, Kim SS, Jang YS, Chung HS, Lee O. Development and evaluation of a multimodality simulation disaster education and training program for hospital nurses. Int J of Nurs Pract. 2020;26(3):e12810. https://doi.org/10.1111/ijn.12810.
Geng C, Luo Y, Pei X, Chen X. Simulation in disaster nursing education: A scoping review. Nurs Educ Today. 2021;1(107):105119. https://doi.org/10.1016/j.nedt.2021.105119.
Paton D, Sagala S, Okada N, Jang L, Buergelt P, Gregg C. Making sense of natural hazard mitigation: Personal, social and cultural influences. Environ Hazards. 2010;1(9):183–96. https://doi.org/10.3763/ehaz.2010.0039.
Becker JS, Paton D, Johnston DM, Ronan KR. A model of household preparedness for earthquakes: how individuals make meaning of earthquake information and how this influences preparedness. Nat Hazards. 2012;64(1):107–37. https://doi.org/10.1007/s11069-012-0238-x.
Shaw R, Shiwaku K, Kobayashi H, Kobayashi M. Linking experience, education, perception and earthquake preparedness. Disaster Prev Manag An Int J. 2004;13(1):39–49. https://doi.org/10.1108/09653560410521689
Skryabina E, Reedy G, Amlôt R, Jaye P, Riley P. What is the value of health emergency preparedness exercises? A scoping review study. Int J Disaster Risk Reduct. 2017;1(21):274–83. https://doi.org/10.1016/j.ijdrr.2016.12.010.
Dimitrov D, Rumrill P. Pretest-posttest designs and measurement of change. Work. 2003;1(20):159–65.
Stratton SJ. Quasi-experimental design (pre-test and post-test studies) in prehospital and disaster research. Prehosp Disaster Med. 2019;34(6):573–4. https://doi.org/10.1017/S1049023X19005053.
Latimier A, Riegert A, Peyre H, Ly ST, Casati R, Ramus F. Does pre-testing promote better retention than post-testing? NPJ Sci Learn. 2019;4(1):1–7. https://doi.org/10.1038/s41539-019-0053-1.
Hartley J. The effect of pre-testing on post-test performance. Instr Sci. 1973;2(2):193–214. https://doi.org/10.1007/BF00139871.
Mertens DM, Hesse-Biber S. Mixed methods and credibility of evidence in evaluation. N Dir Eval. 2013;2013(138):5–13. https://doi.org/10.1002/ev.20053.
Sozer EM, Zeybekoglu Z, Kaya M. Using mid-semester course evaluation as a feedback tool for improving learning and teaching in higher education. Assess Eval High Educ. 2019;44(7):1003–16. https://doi.org/10.1080/02602938.2018.1564810.
Yorke M. Formative assessment in higher education: Moves towards theory and the enhancement of pedagogic practice. High Educ. 2003;45(4):477–501. https://doi.org/10.1023/A:1023967026413.
Butler K, Anderson N, Jull A. Evaluating the effects of triage education on triage accuracy within the emergency department: An integrative review. Int Emerg Nurs. 2023;70:101322. https://doi.org/10.1016/j.ienj.2023.101322.
Rüter A, Ortenwall P, Wikström T. Performance indicators for major incident medical management – a possible tool for quality control? Int J Disaster Med. 2009;13(2):52–5. https://doi.org/10.1080/15031430410023355.
Curnow C, Barney R, Bryson J, Keller-Glaze H, Vowels C. Mass Casualty Triage Performance Assessment Tool. 2015 Cited 2024 Jul 19.
Nilsson H, Vikström T, Jonson C-O. Performance indicators for initial regional medical response to major incidents: a possible quality control tool. Scand J Trauma Resusc Emerg Med. 2012;17(20):81. https://doi.org/10.1186/1757-7241-20-81.
El-Den S, Chen TF, Moles RJ, O’Reilly C. Assessing mental health first aid skills using simulated patients. Am J Pharm Educ. 2018;82(2):6222. https://doi.org/10.5688/ajpe6222.
Balde T, Oyugi B, Mbasha J-J, Kamara R, Martinez-Monterrey LG, Relan P, et al. The emergency medical teams initiative in the WHO African region: a review of the development and progress over the past 7 years. Front Public Health. 2024;12. Cited 2024 Jul 22. https://doi.org/10.3389/fpubh.2024.1387034
Aitken P, Leggat PA, Robertson AG, Harley H, Speare R, Leclercq MG. Education and training of Australian disaster medical assistance team members: results of a national survey. Prehosp Disaster Med. 2011;26(1):41–8. https://doi.org/10.1017/S1049023X10000087.
Kaim A, Bodas M, Camacho NA, Peleg K, Ragazzoni L. Enhancing disaster response of emergency medical teams through “TEAMS 3.0” training package-Does the multidisciplinary intervention make a difference? Front Public Health. 2023;11:1150030. https://doi.org/10.3389/fpubh.2023.1150030
Kardong-Edgren S. Striving for higher levels of evaluation in simulation. Clin Simul Nurs. 2010;6(6):e203–4. https://doi.org/10.1016/j.ecns.2010.07.001.
The Public Health Foundation. Kirkpatrick Level 3 (Behavior) - Evaluation Strategies. Cited 2024 Jul 30.
Alan H, Harmanci Seren AK, Eskin Bacaksiz F, Güngör S, Bilgin O, Baykal Ü. An evaluation of a web-based crisis management training program for nurse managers: the case of the COVID-19 crisis. Disaster Med Public Health Prep. 2023;9(17):e358. https://doi.org/10.1017/dmp.2023.8.
Al-qbelat RM, Subih MM, Malak MZ. Effect of Educational Program on Knowledge, Skills, and Personal Preparedness for Disasters Among Emergency Nurses: A Quasi-Experimental Study. Inquiry. 2022;59:00469580221130881. https://doi.org/10.1177/00469580221130881
Tichy M, Bond A, Beckstrand R, Heise B. NPs’ perceptions of disaster preparedness education: Quantitative survey research. Am J Nurs Pract. 2008;13(1):10–22.
Altillo BSA, Gray M, Avashia SB, Norwood A, Nelson EA, Johnston C, et al. Global health on the front lines: an innovative medical student elective combining education and service during the COVID-19 pandemic. BMC Med Educ. 2021;21(1):186. https://doi.org/10.1186/s12909-021-02616-9.
Back DA, Lembke V, Fellmer F, Kaiser D, Kasselmann N, Bickelmayer J, et al. Deployment and disaster medicine in an undergraduate teaching module. Mil Med. 2019;184(5–6):e284–9. https://doi.org/10.1093/milmed/usy250.
Bajow N, Alkhalil S, Maghraby N, Alesa S, Najjar AA, Aloraifi S. Assessment of the effectiveness of a course in major chemical incidents for front line health care providers: a pilot study from Saudi Arabia. BMC Med Educ. 2022;22(1):350. https://doi.org/10.1186/s12909-022-03427-2.
Kirkpatrick D. Techniques for evaluation training programs. J Am Soc Train Dir. 1959;13:21–6.
Bajow NA, AlAssaf WI, Cluntun AA. Course in prehospital major incidents management for health care providers in Saudi Arabia. Prehosp Disaster Med. 2018;33(6):587–95. https://doi.org/10.1017/S1049023X18000791.
Bajow NA, Alawad YI, Aloraifi SM. A basic course in humanitarian health emergency and relief: a pilot study from Saudi Arabia. Prehosp Disaster Med. 2019;34(6):580–7. https://doi.org/10.1017/S1049023X19004977.
Skolnik R. Global Health 101. Jones & Bartlett Publishers; 2012. 465 p.
Bank I, Khalil E. Are pediatric emergency physicians more knowledgeable and confident to respond to a pediatric disaster after an experiential learning experience? Prehosp Disaster Med. 2016;31(5):551–6. https://doi.org/10.1017/S1049023X16000704.
Beaton RD, Johnson LC. Instrument development and evaluation of domestic preparedness training for first responders. Prehosp Disaster Med. 2002;17(3):119–25. https://doi.org/10.1017/S1049023X00000339.
Bentley S, Iavicoli L, Boehm L, Agriantonis G, Dilos B, LaMonica J, et al. A Simulated Mass Casualty Incident Triage Exercise: SimWars. MedEdPORTAL. 2019;15:10823. https://doi.org/10.15766/mep_2374-8265.10823.
Bodas M, Peleg K, Adini B, Ragazzoni L. Training package for emergency medical teams deployed to disaster stricken areas: has ‘TEAMS’ Achieved its goals? Disaster Med Public Health Prep. 2022;16(2):663–9. https://doi.org/10.1017/dmp.2020.359.
Chen G, Gully SM, Eden D. Validation of a new general self-efficacy scale. Organ Res Methods. 2001;4(1):62–83. https://doi.org/10.1177/109442810141004.
Cooper S, Cant R, Porter J, Sellick K, Somers G, Kinsman L, et al. Rating medical emergency teamwork performance: Development of the Team Emergency Assessment Measure (TEAM). Resuscitation. 2010;81(4):446–52. https://doi.org/10.1016/j.resuscitation.2009.11.027.
Carenzo L, Ingrassia PL, Foti F, Albergoni E, Colombo D, Sechi GM, et al. A Region-Wide All-Hazard Training Program for Prehospital Mass Casualty Incident Management: A Real-World Case Study. Disaster Med Public Health Prep. 2023;17:e184. https://doi.org/10.1017/dmp.2022.84.
Levin H. Waiting for Godot: cost-effectiveness analysis in education. N Dir Eval. 2001;2001(90):55–68. https://doi.org/10.1002/ev.12.
Levin HM, McEwan PJ. Cost-Effectiveness Analysis: Methods and Applications. SAGE; 2001. 332 p.
Chang C-W, Lin C-W, Huang C-Y, Hsu C-W, Sung H-Y, Cheng S-F. Effectiveness of the virtual reality chemical disaster training program in emergency nurses: A quasi experimental study. Nurse Educ Today. 2022;119:105613. https://doi.org/10.1016/j.nedt.2022.105613.
Wisniewski R, Dennik-Champion G, Peltier JW. Emergency preparedness competencies: assessing nurses’ educational needs. J Nurs Admin. 2004;34(10):475.
Zhang JX, Schwarzer R. Measuring optimistic self-beliefs: A Chinese adaptation of the General Self-Efficacy Scale. An Int J Psychol Orient. 1995;38(3):174–81.
Chung S, Gardner AH, Schonfeld DJ, Franks JL, So M, Dziuban EJ, et al. Addressing children’s needs in disasters: a regional pediatric tabletop exercise. Disaster Med Public Health Prep. 2018;12(5):582–6. https://doi.org/10.1017/dmp.2017.137.
Collander B, Green B, Millo Y, Shamloo C, Donnellan J, DeAtley C. Development of an “all-hazards” hospital disaster preparedness training course utilizing multi-modality teaching. Prehosp Disaster Med. 2008;23(1):63–7. https://doi.org/10.1017/S1049023X00005598.
Cranmer H, Chan JL, Kayden S, Musani A, Gasquet PE, Walker P, et al. Development of an evaluation framework suitable for assessing humanitarian workforce competencies during crisis simulation exercises. Prehosp Disaster Med. 2014;29(1):69–74. https://doi.org/10.1017/S1049023X13009217.
Daniel P, Gist R, Grock A, Kohlhoff S, Roblin P, Arquilla B. Disaster olympics: a model for resident education. Prehosp Disaster Med. 2016;31(3):237–41. https://doi.org/10.1017/S1049023X16000212.
Dastyar N, Nazari M, Rafati F. Design, Implement, and Evaluate a Short-term Blended Training Program on Nursing Students’ Disaster Response Self-efficacy in Iran. Disaster Med Public Health Prep. 2023;17:e382. https://doi.org/10.1017/dmp.2022.269.
Harden RM. Ten questions to ask when planning a course or curriculum. Med Educ. 1986;20(4):356–65. https://doi.org/10.1111/j.1365-2923.1986.tb01379.x.
Li H-Y, Bi R-X, Zhong Q-L. The development and psychometric testing of a disaster response Self-Efficacy Scale among undergraduate nursing students. Nurse Educ Today. 2017;59:16–20. https://doi.org/10.1016/j.nedt.2017.07.009.
Farhat H, Laughton J, Joseph A, Abougalala W, Dhiab MB, Alinier G. The educational outcomes of an online pilot workshop in CBRNe emergencies. J Emerg Med Trauma Acute Care. 2022;2022(5):38. https://doi.org/10.5339/jemtac.2022.38.
Franc JM, Nichols D, Dong SL. Increasing emergency medicine residents’ confidence in disaster management: use of an emergency department simulator and an expedited curriculum. Prehosp Disaster Med. 2012;27(1):31–5. https://doi.org/10.1017/S1049023X11006807.
Bartley BH, Stella JB, Walsh LD. What a disaster?! assessing utility of simulated disaster exercise and educational process for improving hospital preparedness. Prehosp Disaster Med. 2006;21(4):249–55. https://doi.org/10.1017/S1049023X00003782.
Gershon RRM, Vandelinde N, Magda LA, Pearson JM, Werner A, Prezant D. Evaluation of a Pandemic Preparedness Training Intervention for Emergency Medical Services Personnel. Prehosp Disaster Med. 2009;24(6):508–11. https://doi.org/10.1017/S1049023X00007421.
Ghiga I, Richardson S, Álvarez AMR, Kato M, Naidoo D, Otsu S, et al. PIPDeploy: Development and implementation of a gamified table top simulation exercise to strengthen national pandemic vaccine preparedness and readiness. Vaccine. 2021;39(2):364–71. https://doi.org/10.1016/j.vaccine.2020.11.047.
Glow SD, Colucci VJ, Allington DR, Noonan CW, Hall EC. Managing Multiple-casualty incidents: a rural medical preparedness training assessment. Prehosp Disaster Med. 2013;28(4):334–41. https://doi.org/10.1017/S1049023X13000423.
Henze S, Fellmer F, Wittenberg S, Höppner S, Märdian S, Willy C, et al. Digital adaptation of teaching disaster and deployment medicine under COVID-19 conditions: a comparative evaluation over 5 years. BMC Med Educ. 2022;22(1):717. https://doi.org/10.1186/s12909-022-03783-z.
Hermann S, Gerstner J, Weiss F, Aichele S, Stricker E, Gorgati E, et al. Presentation and evaluation of a modern course in disaster medicine and humanitarian assistance for medical students. BMC Med Educ. 2021;21(1):610. https://doi.org/10.1186/s12909-021-03043-6.
Horney JA. Evaluation of the Certificate in Community Preparedness and Disaster Management Program at the University of North Carolina Gillings School of Global Public Health. Public Health Rep. 2009;124(4):610–6. https://doi.org/10.1177/003335490912400421.
Hsu C-C, Tsai S-H, Tsai P-J, Chang Y-C, Tsai Y-D, Chen Y-C, et al. An adapted hybrid model for hands-on practice on disaster and military medicine education in undergraduate medical students during the COVID-19 pandemic. J Acute Med. 2022 [cited 2024 Jul 25];12(4). https://doi.org/10.6705/j.jacme.202212_12(4).0003.
Schwarzer R, Jerusalem M. General Self-Efficacy Scale. 2012 [cited 2024 Jul 24]. https://doi.org/10.1037/t00393-000.
Kaji AH, Coates W, Fung C-C. A disaster medicine curriculum for medical students. Teach Learn Med. 2010;22(2):116–22. https://doi.org/10.1080/10401331003656561.
Kaplan BG, Connor A, Ferranti EP, Holmes L, Spencer L. Use of an emergency preparedness disaster simulation with undergraduate nursing students. Public Health Nurs. 2012;29(1):44–51. https://doi.org/10.1111/j.1525-1446.2011.00960.x.
Kennedy B, Carson DS, Garr D. South Carolina Area Health Education Consortium disaster preparedness and response training network: an emerging partner in preparedness training. J Public Health Manag Pract. 2009;15(2 Suppl):S13–9. https://doi.org/10.1097/01.PHH.0000345980.49798.d2.
Guskey TR. Evaluating Professional Development. Corwin Press; 2000. 332 p.
Kesler S, James E, Scheller A, Gray S, Kne L, Hendel-Paterson B. Simulation as a teaching method: evaluation of the University of Minnesota humanitarian crisis simulation. Disaster Med Public Health Prep. 2022;17:e121. https://doi.org/10.1017/dmp.2022.28.
Lennquist Montán K, Hreckovski B, Dobson B, Örtenwall P, Montán C, Khorram-Manesh A, et al. Development and evaluation of a new simulation model for interactive training of the medical response to major incidents and disasters. Eur J Trauma Emerg Surg. 2014;40(4):429–43. https://doi.org/10.1007/s00068-013-0350-y.
Levoy K, DeBastiani SD, McCabe BE. Evaluation of a novel disaster nursing education method. Disaster Med Public Health Prep. 2018;12(6):703–10. https://doi.org/10.1017/dmp.2017.150.
Montana M, Mathias F, Rathelot P, Lacroix J, Vanelle P. Development and evaluation of an elective course on the pharmacist’s role in disaster management in France. J Educ Eval Health Prof. 2019 [cited 2024 Jul 25];16:19. https://doi.org/10.3352/jeehp.2019.16.19.
Morrison AM, Catanzaro AM. High-fidelity simulation and emergency preparedness. Public Health Nurs. 2010;27(2):164–73. https://doi.org/10.1111/j.1525-1446.2010.00838.x.
Johns C. Becoming a Reflective Practitioner. John Wiley & Sons; 2009. 361 p.
Nybo SE, Klepser SA, Klepser M. Design of a disaster preparedness escape room for first and second-year pharmacy students. Curr Pharm Teach Learn. 2020;12(6):716–23. https://doi.org/10.1016/j.cptl.2020.01.037.
Pate A, Bratberg JP, Robertson C, Smith G. Evaluation of a tabletop emergency preparedness exercise for pharmacy students. Am J Pharm Educ. 2016;80(3):50. https://doi.org/10.5688/ajpe80350.
Pesiridis T, Sourtzi P, Galanis P, Kalokairinou A. Development, implementation and evaluation of a disaster training programme for nurses: A Switching Replications randomized controlled trial. Nurse Educ Pract. 2015;15(1):63–7. https://doi.org/10.1016/j.nepr.2014.02.001.
Ajzen I. The theory of planned behavior. Organ Behav Hum Decis Process. 1991;50(2):179–211. https://doi.org/10.1016/0749-5978(91)90020-T.
Ajzen I. Constructing a Theory of Planned Behavior Questionnaire. 2006. 1 p.
Pitts J, Lynch M, Mulholland M, Curtis A, Simpson J, Meacham J. Disaster planning: using an “evolving scenario” approach for pandemic influenza with primary care doctors in training. Educ Prim Care. 2009;20(5):346–52. https://doi.org/10.1080/14739879.2009.11493816.
Pryor E, Heck E, Norman L, Weiner B, Mathews R, Black J, et al. Integrated decision-making in response to weapons of mass destruction incidents: development and initial evaluation of a course for healthcare professionals. Prehosp Disaster Med. 2006;21(1):24–30. https://doi.org/10.1017/S1049023X00003289.
Roberts HV, Sergesketter BF. Quality is Personal: A Foundation for Total Quality Management. Free Press; 1993. 200 p.
Qureshi KA, Gershon RRM, Merrill JA, Calero-Breckheimer A, Murrman M, Gebbie KM, et al. Effectiveness of an emergency preparedness training program for public health nurses in New York City. Fam Community Health. 2004;27(3):242–9. https://doi.org/10.1097/00003727-200407000-00011.
Ragazzoni L, Conti A, Dell’Aringa M, Caviglia M, Maccapani F, Della Corte F. A nationwide peer-assisted learning program in disaster medicine for medical students. Eur J Emerg Med. 2020;27(4):290–7. https://doi.org/10.1097/MEJ.0000000000000668.
Ripoll-Gallardo A, Ragazzoni L, Mazzanti E, Meneghetti G, Franc JM, Costa A, et al. Residents working with Médecins Sans Frontières: training and pilot evaluation. Scand J Trauma Resusc Emerg Med. 2020;28(1):86. https://doi.org/10.1186/s13049-020-00778-x.
Sarpy SA, Warren CR, Kaplan S, Bradley J, Howe R. Simulating public health response to a severe acute respiratory syndrome (SARS) event: a comprehensive and systematic approach to designing, implementing, and evaluating a tabletop exercise. J Public Health Manag Pract. 2005;Suppl:S75–82. https://doi.org/10.1097/00124784-200511001-00013.
Sarpy SA, Chauvin SW, Anderson AC. Evaluation of the effectiveness of the South Central Center for Public Health Preparedness training. Public Health Rep. 2003;118(6):568–72. https://doi.org/10.1093/phr/118.6.568.
Sarpy SA, Chauvin SW, Hites LS, Santacaterina L, Capper S, Cuccia M, et al. The south central center for public health preparedness training system model: a comprehensive approach. Public Health Rep. 2005;120(Suppl 1):52–8. https://doi.org/10.1177/00333549051200S111.
Scott LA, Maddux PT, Schnellmann J, Hayes L, Tolley J, Wahlquist AE. High-fidelity multiactor emergency preparedness training for patient care providers. Am J Disaster Med. 2012;7(3):175–88. https://doi.org/10.5055/ajdm.2012.0093.
Scott LA, Swartzentruber DA, Davis CA, Maddux PT, Schnellman J, Wahlquist AE. Competency in chaos: lifesaving performance of care providers utilizing a competency-based, multi-actor emergency preparedness training curriculum. Prehosp Disaster Med. 2013;28(4):322–33. https://doi.org/10.1017/S1049023X13000368.
Scott LA, Madden LA, Wahlquist AE, Fisher DW. Preparing for the surge: a half-day emergency preparedness training course for the “second front.” Disaster Med Public Health Prep. 2018;12(1):121–6. https://doi.org/10.1017/dmp.2017.30.
Silenas R, Akins R, Parrish A, Edwards J. Developing disaster preparedness competence: an experiential learning exercise for multiprofessional education. Teach Learn Med. 2008;20(1):62–8. https://doi.org/10.1080/10401330701798311.
Tower C, Altman BA, Strauss-Riggs K, Iversen A, Garrity S, Thompson CB, et al. Qualitative assessment of a novel efficacy-focused training intervention for public health workers in disaster recovery. Disaster Med Public Health Prep. 2016;10(4):615–22. https://doi.org/10.1017/dmp.2016.11.
Tsai Y-D, Tsai S-H, Chen S-J, Chen Y-C, Wang J-C, Hsu C-C, et al. Pilot study of a longitudinal integrated disaster and military medicine education program for undergraduate medical students. Medicine (Baltimore). 2020;99(20):e20230. https://doi.org/10.1097/MD.0000000000020230.
Wang C, Wei S, Xiang H, Wu J, Xu Y, Liu L, et al. Development and evaluation of a leadership training program for public health emergency response: results from a Chinese study. BMC Public Health. 2008;8(1):377. https://doi.org/10.1186/1471-2458-8-377.
Wang C, Wei S, Xiang H, Xu Y, Han S, Mkangara OB, et al. Evaluating the effectiveness of an emergency preparedness training programme for public health staff in China. Public Health. 2008;122(5):471–7. https://doi.org/10.1016/j.puhe.2007.08.006.
Wetta-Hall R, Fredrickson DD, Ablah E, Cook DJ, Molgaard CA. Knowing who your partners are: terrorism-preparedness training for nurses. J Contin Educ Nurs. 2006;37(3):106–12. https://doi.org/10.3928/00220124-20060301-03.
Wiesner L, Kappler S, Shuster A, DeLuca M, Ott J, Glasser E. Disaster training in 24 hours: evaluation of a novel medical student curriculum in disaster medicine. J Emerg Med. 2018;54(3):348–53. https://doi.org/10.1016/j.jemermed.2017.12.008.
Wright KS, Thomas MW, Durham DP, Jackson LM, Porth LL, Buxton M. A public health academic-practice partnership to develop capacity for exercise evaluation and improvement planning. Public Health Rep. 2010;125(Suppl 5):107–16. https://doi.org/10.1177/00333549101250S515.