Abstract
Background
Advancements in health and medical education are key to a promising future, but sustaining their pace requires innovative approaches. Gamification offers a powerful tool to foster engagement and enhance the educational journey of medical professionals. By integrating interactive and motivational elements into training, gamification not only boosts knowledge acquisition but also counters the monotony of traditional methods, encouraging deeper cognitive engagement and reducing errors in practice. Critical thinking is essential for accurate and timely medical decisions, particularly when diagnosing new patients in the emergency department. Enhancing these skills can significantly reduce errors and improve patient outcomes, including lowering mortality rates.
Methods
This project introduces a critical thinking game designed to improve the clinical reasoning skills of pediatric students before they enter the field. The game simulates realistic clinical cases: students draw randomized cards presenting patient profiles, symptoms, and test results, then race against the clock to deliver the most relevant diagnoses and actions. The game employs a Case-Based Morning Report format designed to provide hands-on learning experiences. In each round, three cards are presented to the students, each containing a different type of medical information: the patient's description, their symptoms, and their examination results. Based on these data, students must construct a diagnostic scenario and discuss it with a licensed pediatrician, who serves as the moderator and ultimately determines the best scenario based on logical reasoning and medical accuracy. The moderator is always a domain expert and university professor, which ensures a high level of reliability.
In addition, we conducted a single-arm, exploratory pilot study with undergraduate medical students (n = 100) from first to fourth year (Med1–Med4) at the Holy Spirit University of Kaslik (USEK). Participants engaged with the DMRCT (Developing Medical Reasoning and Critical Thinking) platform during scheduled sessions. After gameplay, they completed a structured Likert-scale survey measuring perceived critical thinking improvement, engagement, interface usability, and competitive pressure. Objective gameplay metrics (reaction time, diagnostic accuracy, number of errors) were recorded automatically by the platform's backend.
This problem-based learning approach fosters excitement, interactivity, and competitiveness, transforming critical training into an engaging experience.
Results
Our multiplayer educational game is hosted on a web-based platform, enabling students to engage in critical thinking exercises remotely. The participants in each session consist of two opposing teams of medical students, who analyze and discuss the case information before presenting their scenarios. The moderator evaluates the scenarios and awards points based on the accuracy and reasoning presented. Our critical thinking game was tested with 100 medical students (Med1–Med4) through randomized clinical scenarios. Feedback revealed that students viewed the game as a valuable complement to traditional morning rounds, enhancing diagnostic synthesis, problem-solving, and decision-making, particularly for advanced learners. Students reported increased motivation, teamwork, and the ability to practice decision-making in a low-risk environment. Technical evaluations confirmed the platform’s reliability, real-time scoring, and analytics integration, with pilot sessions showing active participation, multiple valid diagnoses, and meaningful moderator-student interactions that deepened clinical reasoning skills.
Conclusion
By immersing students in dynamic, field-relevant scenarios, our critical thinking game enriches analytical reasoning, problem-solving abilities, and clinical judgment. Transforming education into an interactive, game-based experience cultivates skilled, confident practitioners prepared to meet the complex challenges of real-world pediatric care. The pilot evaluation among medical students revealed high engagement, improved diagnostic accuracy, and a positive perception of clinical reasoning development.
Introduction
Critical thinking enables individuals to assess problems objectively based on reasoning and facts, allowing them to develop more effective solutions. Medical students, in particular, must cultivate critical thinking skills to analyze complex situations, collaborate in groups, discern between reliable and inaccurate information, and maintain an open mind. These abilities are essential for them to become proficient in their field, as they encourage questioning existing opinions, attitudes, and facts rather than passively accepting them.
Medical professionals frequently face challenges in diagnosis and treatment, with nearly one-third of patient issues stemming from incorrect diagnoses. Enhancing doctors' diagnostic and critical thinking skills during medical school and residency training is therefore essential [1]. One way to achieve this is through nonclinical skills training, such as research and quality improvement, rather than relying solely on direct patient care for learning. Integrating critical thinking into medical education can better prepare students for real-world medical scenarios [2].
Additionally, gamified learning environments can support this by managing cognitive load, stimulating learner autonomy, and increasing attention retention. These principles inform the pedagogical design of DMRCT (Developing Medical Reasoning and Critical Thinking).
The use of digital tools in medical education has been increasingly validated by recent studies. For example, a recent review found that blended and digital learning approaches significantly improve knowledge acquisition, confidence, and engagement among medical students [3]. Technologies such as simulations, mobile applications, virtual reality, and video-based learning help learners practice rare or critical events in safe environments, improve decision making and procedural skills, and support skill coordination [4, 5]. E-learning tools further provide flexibility, interactivity, and accessibility, making medical education more cost-efficient and broadly available [6]. Finally, digital feedback mechanisms enable timely, personalized feedback that enhances formative assessment and learner motivation in clinical settings [7].
Against this backdrop, remote learning and digital adaptations have gained prominence, ultimately facilitating the rise of Serious Games: educational applications that combine learning and training with engaging gaming elements.
A serious game is a game designed for a primary purpose other than pure entertainment, typically aimed at education, training, health promotion, or behavior change. Serious games leverage the engaging and motivational features of games such as challenges, rules, feedback, and interactivity while embedding pedagogical or practical objectives to facilitate learning and skill acquisition [8, 9].
Serious Games serve as an effective tool to enhance targeted skills, such as critical thinking, by integrating competition, excitement, and experiential learning [10–16].
Serious Games are increasingly recognized for their ability to bridge gaps in traditional education by combining training with interactive learning experiences. They are defined as an “educational application, whose initial intention is to combine, coherently and at the same time, serious aspects, in a non-exhaustive and non-exclusive way, teaching, learning, communication, or even information with the fun aspects of video games” [12, 17, 18]. These games leverage gaming principles to provide practical training opportunities, especially in medical education, where they help students develop diagnostic and decision-making skills.
In this context, a Serious Game has been developed to enhance the critical thinking skills of medical students, specifically those who have completed three years of study and have acquired foundational medical knowledge.
The primary goal of this project is to motivate medical students by making the learning and training process more engaging while reinforcing critical thinking skills.
We hypothesize that gamified decision-making through DMRCT enhances students' clinical reasoning and critical thinking in a more engaging and effective way than traditional educational methods. This hypothesis is evaluated through a pilot study measuring engagement, diagnostic accuracy, and perceived learning impact across medical students at different training levels.
Materials and methods
Product discovery phase
The product discovery phase aimed to understand the educational and clinical needs related to developing critical thinking in pediatric medical students and to define the pedagogical scope of the game.
Semi-structured interviews and empathy mapping exercises were conducted with pediatricians and undergraduate medical students to explore current gaps in decision-making training and identify desirable game features.
User personas
Three primary user groups were identified:
(1) Pediatric medical students (24–27 years old) needing structured critical thinking training.
(2) Experienced pediatricians acting as game moderators.
(3) Medical educators interested in innovative teaching tools.
User interviews & empathy mapping
Interviews with pediatricians and medical students provided insights into challenges in critical thinking education. Empathy mapping revealed key frustrations and learning preferences, guiding game design choices.
Insights from these sessions revealed a strong demand for a simulation-based, collaborative, and feedback-oriented learning approach aligned with active and experiential learning principles. These findings guided the conceptual design and educational objectives of the DMRCT game.
Features identification
Functional features by user role
Moderator functions
The moderator, typically a senior pediatrician or medical educator, plays a central role in game orchestration. Through a dedicated and secure administrative interface, the moderator is empowered to:
Create and edit diagnostic scenarios by inserting, modifying, or removing medical case cards.
Control gameplay mechanics, including randomized card selection and management of conflicting card logic.
Monitor student performance by observing live scores, tracking historical data, and assessing response accuracy.
Provide scores and feedback using embedded scoring tools and optional communication via chat.
Student functions
Students interact with the game through a streamlined interface that supports both independent participation and collaborative gameplay. They are able to:
Join or create a team by selecting peers from their academic cohort.
Participate in diagnostic challenges, answer timed clinical cases, and view scoring feedback.
Track team progress and performance, with access to previous game results and personal statistics.
The design emphasizes usability and encourages engagement, collaboration, and clinical reasoning within a competitive format.
Product development phase
This project adopted a mixed-method approach that combined the development of a digital serious game with a pilot evaluation of its educational feasibility and impact. The study was conceived as an exploratory, single-arm pilot, designed to test proof-of-concept rather than to conduct a comparative trial. The primary objective was to evaluate the game’s potential to enhance medical students’ critical thinking and diagnostic reasoning while documenting engagement, usability, and preliminary learning outcomes.
The product development phase followed an agile and user-centered design methodology, focusing on accessibility, engagement, and scalability.
The initial version of the game was a physical card game co-designed with pediatricians at CHU-NDS Hospital to validate its clinical relevance and educational utility. Following successful in-person sessions, the game was translated into a web-based platform using a modular architecture (Node.js, Socket.io, PHP, MySQL).
This architecture allowed the game to function both in face-to-face classroom settings and remotely, ensuring pedagogical continuity and data traceability.
System architecture
To achieve scalability and data traceability, the game was converted into a web-based multiplayer platform using a modular, event-driven architecture composed of Node.js, Socket.io, PHP, and MySQL.
The transition preserved the original gameplay mechanics while introducing digital enhancements:
Automated data recording (reaction time, accuracy, team performance).
Standardized rule enforcement reducing moderator subjectivity.
Real-time communication between students and moderators.
Analytics dashboards for formative feedback via Redash integration.
Cross-platform accessibility on any modern browser.
Each user registers as either a student or moderator and is verified via an automated email confirmation system. Moderators can create sessions, manage scoring, and provide immediate debriefing, while students engage in real-time diagnostic challenges requiring teamwork and justification of clinical reasoning.
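To illustrate the real-time flow described above, the following is a minimal sketch of a buzzer round in a Socket.io server, in the spirit of the Node.js/Socket.io stack; the event names ("roundStart", "buzz") and message shapes are illustrative assumptions, not the actual DMRCT API.

```typescript
// Minimal sketch of a buzzer round (assumed event names, not the DMRCT API).
import { Server } from "socket.io";

const io = new Server(3000);
const buzzLocked = new Map<string, boolean>(); // sessionId -> first buzz taken?

io.on("connection", (socket) => {
  // Students and moderators join a session room.
  socket.on("joinSession", (sessionId: string) => socket.join(sessionId));

  // Moderator reveals the three cards and starts the 30 s timer.
  socket.on("roundStart", (sessionId: string) => {
    buzzLocked.set(sessionId, false);
    io.to(sessionId).emit("timerStart", { seconds: 30 });
  });

  // First student to buzz gets the floor; later buzzes are ignored.
  socket.on("buzz", (msg: { sessionId: string; studentId: string }) => {
    if (buzzLocked.get(msg.sessionId)) return;
    buzzLocked.set(msg.sessionId, true);
    io.to(msg.sessionId).emit("buzzAccepted", { studentId: msg.studentId, at: Date.now() });
  });
});
```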
Figure 1 illustrates the architecture, showing the interaction between clients (students/moderators), the web application server, CMS, and database components.
[See PDF for image]
Fig. 1
System architecture of the DMRCT multiplayer web application
The aim was to create an accessible, scalable, and engaging multiplayer educational experience that could be deployed seamlessly in clinical and academic settings.
Database model
A relational database schema (MySQL) was implemented to link users, teams, sessions, and gameplay results. Logic-based constraints prevented incoherent card combinations (e.g., incompatible symptoms and patient demographics). This ensured clinical realism and uniform gameplay across sessions.
The Entity-Relationship (ER) model illustrated in Fig. 2 was developed to map the logical relationships between key components of the system (a minimal sketch of these entities follows the figure), specifically:
User and Role Management: Differentiates between student and moderator access levels.
Team Structure: Associates students with specific game teams.
Gameplay Sessions: Links users to specific rounds, performance data, and card selections.
Conflicting Card Logic: Identifies non-coherent combinations of patient demographics, symptoms, and test results to preserve clinical realism during randomization.
[See PDF for image]
Fig. 2
Entity–relationship model of the DMRCT database
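As an illustration of the entities above, here is a minimal TypeScript mirror of the data model; the type and field names are assumptions for exposition, not the actual MySQL column names.

```typescript
// Illustrative mirror of the ER entities (assumed names, not the real schema).
type Role = "student" | "moderator";

interface User { id: number; email: string; role: Role; verified: boolean; }
interface Team { id: number; name: string; memberIds: number[]; }
interface Card { id: number; kind: "demographics" | "symptoms" | "testResults"; text: string; }

// Conflicting-card logic: pairs that must never be drawn together
// (e.g., a neonatal patient profile with an adult-onset symptom).
interface CardConflict { cardA: number; cardB: number; reason: string; }

interface GameSession {
  id: number;
  moderatorId: number;
  teamIds: [number, number];
  rounds: Array<{
    drawnCardIds: number[];   // one card of each kind
    reactionTimeSec: number;  // recorded automatically by the backend
    correct: boolean;
    points: number;
  }>;
}
```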
Product testing phase
Participants
A total of 100 undergraduate medical students (43 males, 57 females; ages 22–27) from Med1–Med4 participated voluntarily. Recruitment took place within NDS-Hospital, and no exclusion criteria were applied. Informed consent was obtained electronically through a consent notification embedded in the platform login. Sessions were moderated by experienced pediatricians and medical educators to ensure clinical relevance and validity.
Procedure
Students were divided into two balanced teams per session. Each round consisted of randomized card presentation (patient demographics, symptoms, test results) and timed responses (30 s) to propose a diagnosis. Points were awarded for accuracy and reasoning. Pediatricians acted as moderators to score performance and provide immediate feedback during post-round debriefs.
Game rules
Students were given 30 s to buzz in and propose a diagnosis with justification. Points were awarded for correct answers and deducted for incorrect attempts, with additional opportunities for team collaboration and scoring. Each session was moderated by a pediatrician who facilitated gameplay, ensured rule adherence, and conducted a debriefing discussion at the end of each round to reinforce reasoning and provide corrective feedback.
The game is played by two opposing teams of 6 to 7 students each, plus a moderator who evaluates the scenarios presented and must therefore possess substantial knowledge or experience to form appropriate judgments. Each game session consists of several 20–30-minute rounds. To begin a round, one student from each team is selected, with the condition that both have attained the same level of education; they may not receive help from their teammates. Once the moderator reveals the three cards, a 30 s timer starts, during which either student may buzz and propose a reasonable diagnosis. The first student to succeed wins 8 points for their team, and if another student delivers further correct information or an additional diagnosis, they win the remaining 2 points. If a student's scenario is rejected, their team loses 5 points and the turn passes to the opponent, whose team earns 6 points for a successful diagnosis instead; the same procedure applies for the remaining 2 points. If neither student can produce a scenario to counter a failed attempt, a team may trade 5 points from its score to exchange one of the three cards at random. Throughout, the moderator manages the game, judges the diagnoses, and guides both teams to make the most of the session. A sketch of this scoring logic follows.
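To make the point rules above concrete, here is a minimal sketch of the scoring logic; the function and type names are illustrative, not taken from the DMRCT codebase.

```typescript
// Scoring rules sketch (illustrative names, not the DMRCT implementation).
interface Scores { [teamId: string]: number; }

type Outcome = "firstCorrect" | "followUpCorrect" | "incorrect";

// Apply one attempt by `team`; `afterOpponentFailed` switches 8 points to 6.
function applyAttempt(scores: Scores, team: string, outcome: Outcome, afterOpponentFailed: boolean): Scores {
  const next = { ...scores }; // assumes both teams start at 0
  switch (outcome) {
    case "firstCorrect":    // first accepted diagnosis: 8 pts (6 after a failed opposing attempt)
      next[team] += afterOpponentFailed ? 6 : 8;
      break;
    case "followUpCorrect": // further correct information: the remaining 2 pts
      next[team] += 2;
      break;
    case "incorrect":       // rejected scenario: lose 5 pts, turn passes
      next[team] -= 5;
      break;
  }
  return next;
}

// Trading 5 points lets a team swap one of the three cards at random.
function tradeCard(scores: Scores, team: string): Scores {
  return { ...scores, [team]: scores[team] - 5 };
}
```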
Survey instrument
After each session, students completed a validated 5-point Likert questionnaire adapted from Cheiban [19, 20] assessing:
Engagement and motivation.
Usability and interface satisfaction.
Perceived improvement in clinical reasoning and critical thinking.
Value compared to traditional morning rounds.
Data collection and analysis
The platform automatically recorded:
Reaction time (seconds).
Diagnostic accuracy (% correct diagnoses).
Team scoring patterns and improvement trends.
Descriptive statistics (mean ± SD) were used due to the exploratory nature of the study. No inferential tests were applied.
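For reference, the mean ± SD figures reported below can be computed with a generic helper such as the following; this is a standard sample statistic, not the platform's actual analytics code.

```typescript
// Sample mean and standard deviation for a set of Likert responses.
function meanSd(values: number[]): { mean: number; sd: number } {
  const mean = values.reduce((a, b) => a + b, 0) / values.length;
  const variance =
    values.reduce((a, b) => a + (b - mean) ** 2, 0) / (values.length - 1);
  return { mean, sd: Math.sqrt(variance) };
}

// Example: five responses to one survey item.
console.log(meanSd([5, 4, 4, 5, 3])); // { mean: 4.2, sd: ~0.84 }
```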
Qualitative comments from students and moderators were also analyzed to identify patterns in engagement, teamwork, and learning experience.
Results
Strategic impact
The transition from a physical game to a web application was not merely technical; it aligned the tool with modern medical education practices that demand flexible, remote-access solutions capable of data tracking, analytics, and continuous content updates. By moving to a web architecture, the application positioned itself for broad adoption across healthcare education institutions.
Game prototyping
To bring these ideas to life, a web page was first built that displays the teams, scores, timer, and card stacks, backed by a database for the cards, as seen in Fig. 3.
[See PDF for image]
Fig. 3
Three different states of picking cards
After three trial iterations of the game, the interface of the latest beta version is shown in Fig. 4, which presents the fields the moderator must fill in. Once the names are registered, the Submit button starts the game.
[See PDF for image]
Fig. 4
Registration Form
When all three cards are revealed, the 30 s timer begins, waiting for a student to click the buzzer and share their diagnosis. The moderator manages the game by stopping the timer and entering the score for the corresponding teams. These features are shown in Fig. 5, which displays the opened cards, the timer, and the score inputs.
[See PDF for image]
Fig. 5
Moderator enters the score for both teams
The Restart button is used in case of errors during the game, while the End Game button, clicked at the end of each session, leads the user to the Results page (Fig. 6), which summarizes the rounds' statistics.
[See PDF for image]
Fig. 6
Results
Figure 7 displays the card creation interface, while Fig. 8 shows the card management panel (edit/delete). To prevent clinically incoherent case combinations, a PHP-based backend script flags conflicting card combinations and restricts their randomization; a sketch of this check follows the figures.
[See PDF for image]
Fig. 7
Insert Card Page
[See PDF for image]
Fig. 8
Edit & Delete Cards page
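The deployed backend performs this check in a PHP script; the following TypeScript sketch shows equivalent logic under assumed names, purely for illustration.

```typescript
// Conflict-aware random draw (illustrative; the real check is a PHP script).
interface Conflict { cardA: number; cardB: number; }

function isCoherent(drawn: number[], conflicts: Conflict[]): boolean {
  return !conflicts.some(
    (c) => drawn.includes(c.cardA) && drawn.includes(c.cardB)
  );
}

// Draw one card per pool (demographics, symptoms, test results) and
// redraw until the set passes the coherence check.
function drawCoherentSet(pools: number[][], conflicts: Conflict[]): number[] {
  let set: number[];
  do {
    set = pools.map((pool) => pool[Math.floor(Math.random() * pool.length)]);
  } while (!isCoherent(set, conflicts));
  return set;
}
```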
Technical evaluation and platform functionality
The multiplayer platform was deployed and tested with students and moderators using a live web-based environment. Several rounds were conducted involving randomized case cards, buzzer responses, team interaction, and moderator scoring.
Key performance aspects validated:
Buzzer Synchronization and Scoring Logic: Students could respond within the timed 30-second window; moderators scored answers with justifications using a dedicated interface.
Interface Functionality: Both moderator and student interfaces worked as intended, showing randomized card sets, active timers, team scores, and chat functions.
Session Management: Sessions were launched successfully via email verification links, and participation tracking was enforced through PHPMailer-dispatched validation emails.
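In the deployed stack, PHPMailer handles the mail dispatch; the following sketch shows only a hypothetical token issue/verify flow of the kind such links rely on, with all names assumed rather than taken from the implementation.

```typescript
// Hypothetical email-verification token flow (not the actual implementation).
import { randomBytes } from "crypto";

const pendingTokens = new Map<string, string>(); // token -> email

// Generate a single-use link to be delivered by email.
function issueVerificationLink(email: string, baseUrl: string): string {
  const token = randomBytes(16).toString("hex");
  pendingTokens.set(token, email);
  return `${baseUrl}/verify?token=${token}`;
}

// Consume the token when the link is clicked; returns the verified email.
function verifyToken(token: string): string | undefined {
  const email = pendingTokens.get(token);
  pendingTokens.delete(token); // single use
  return email;
}
```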
Figure 9 shows server logs capturing a live session.
[See PDF for image]
Fig. 9
Logged info of moderator in the server application
Statistical tracking and Redash analytics integration
The Redash integration enabled the creation of performance dashboards (Figs. 10 and 11), with graphs displaying individual student improvement and aggregated team results. These insights were accessible to moderators through a web-based interface and were instrumental in identifying students' learning patterns and needs; a sketch of the underlying trend computation follows the figures.
[See PDF for image]
Fig. 10
Average Score Example
[See PDF for image]
Fig. 11
Student Improvement Graph
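As an illustration of the improvement graphs, a per-student trend can be summarized as a least-squares slope over session scores; this is a generic computation under assumed names, not the platform's dashboard query.

```typescript
// Least-squares slope of a student's scores across sessions.
function improvementSlope(scoresBySession: number[]): number {
  const n = scoresBySession.length;
  const xs = scoresBySession.map((_, i) => i + 1); // session index 1..n
  const meanX = xs.reduce((a, b) => a + b, 0) / n;
  const meanY = scoresBySession.reduce((a, b) => a + b, 0) / n;
  let num = 0;
  let den = 0;
  for (let i = 0; i < n; i++) {
    num += (xs[i] - meanX) * (scoresBySession[i] - meanY);
    den += (xs[i] - meanX) ** 2;
  }
  return num / den; // average points gained per session
}

console.log(improvementSlope([10, 14, 13, 18])); // positive slope = improving
```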
Gameplay observations and card logic
During in-person pilot testing, the game was run with 10 pediatric students and a licensed pediatrician moderator using randomized card sets. The following were observed:
High Engagement: Students demonstrated competitiveness and active participation in each round.
Diagnostic Range: Some card sets resulted in multiple valid differential diagnoses (10+ per set), reinforcing the importance of critical thinking and justification.
Moderator Role: The post-round debriefs and justification sessions led by the moderator were found to be key learning moments, helping students refine their reasoning.
User feedback and educational impact
Key findings from the survey include:
Complementary Role: Both DMRCT and morning rounds (MR) were perceived as effective educational formats. Students viewed DMRCT as a supplementary tool, not a replacement for in-person clinical exposure.
Clinical Reasoning Development: Morning rounds were deemed essential for foundational knowledge and diagnostic thinking, particularly among Med1 and Med2 students who preferred MR for its collaborative pace.
Advanced Learner Benefits: Med4 students favored DMRCT, reporting that the real-time gameplay enhanced their problem-solving, diagnostic synthesis, and decision-making under pressure.
Engagement Boost: Students appreciated the gamified learning environment, describing the sessions as more motivating and dynamic, even among those initially less interested in pediatrics.
Training Tool Recognition: DMRCT was acknowledged as a promising tool in pediatric case training, offering authentic scenarios and immediate feedback.
While age and academic level influenced students’ preferences and perceived benefits, gender was not found to be a significant factor in how the game was received.
Student perception of learning value
Based on verbal and written feedback, students:
Viewed the platform as a direct method of teaching critical thinking, unlike most traditional methods that rely on passive observation.
Noted that DMRCT allowed them to “practice decision-making without fear of real-life consequences.”
Emphasized the potential of having such a platform available “on-demand” across courses not limited to pediatrics.
Quantitative survey findings
A total of 100 medical students (Med1 through Med4) participated in the pilot phase and completed the post-game survey. The responses showed strong positive perceptions of the tool (Table 1).
Table 1. Student feedback on DMRCT platform (n = 100)
| Survey Item | Statement | Mean Score (± SD) |
|---|---|---|
| Q1 | The game improved my clinical reasoning skills. | 4.3 ± 0.5 |
| Q2 | I was more engaged than during traditional lectures. | 4.7 ± 0.3 |
| Q3 | The interface was intuitive and easy to use. | 4.5 ± 0.6 |
| Q4 | I enjoyed the competitive aspect of the game. | 4.2 ± 0.7 |
| Q5 | I would recommend this game to peers. | 4.6 ± 0.4 |
Students in Med3 and Med4 tended to give slightly higher engagement and usefulness scores, though no statistical significance was tested due to the exploratory nature of the study.
Gameplay analytics
The DMRCT platform recorded objective gameplay metrics automatically for each session. Overall, students displayed efficient performance across the core diagnostic challenge.
Average completion time: 7.5 min (± 1.2).
Average reaction time per decision screen: 2.8 s (± 0.6).
Diagnostic accuracy (final diagnosis correctness): 88%.
Leaderboard variance: Wide, but no evidence of user frustration or disengagement.
These indicators suggest that students were actively engaged and capable of making timely and accurate clinical decisions under simulated pressure.
Discussion
In the evolving landscape of medical education, digital tools such as serious games are emerging as effective platforms for reinforcing cognitive and clinical reasoning skills. This study introduced a serious game, DMRCT, designed to enhance critical thinking (CT) in pediatric education through interactive, case-based gameplay. Our results indicate that serious gaming can serve as a valuable complement to traditional formats such as bedside teaching and morning rounds.
The pedagogical design of DMRCT is grounded in experiential learning theory [21], where learners actively engage in problem-solving, reflection, and knowledge application. The game also integrates principles of team-based and problem-based learning by immersing students in realistic clinical challenges that require collective reasoning and feedback-driven improvement.
Critical thinking in clinical contexts involves higher-order cognitive processes such as analysis, application, evaluation, and synthesis, which are crucial for safe and effective patient care [2]. Traditional education methods often fall short in directly addressing these cognitive layers. Serious games, by contrast, have shown promise in stimulating these skills by situating learners in simulated, yet authentic, problem-solving environments [11, 22, 23]. In our game, the timed diagnostic challenges and team-based competition fostered reflective reasoning, prioritization, and justification of clinical decisions.
Student feedback collected through structured questionnaires confirmed both the educational value and the motivational appeal of the DMRCT game. Consistent with findings in the literature [12, 13], participants reported increased engagement and a stronger grasp of decision-making logic. Notably, senior students (Med4) found the game particularly helpful in refining their diagnostic reasoning under time pressure, while junior students (Med1–2) valued morning rounds more for building foundational knowledge. This reinforces the need for adaptive, learner-level-specific deployment of gamified tools [24].
Prototype testing affirmed the soundness of the game’s instructional approach. Students demonstrated active participation and collaborative problem-solving even with limited functionality (manual buzzer, basic scoring). Research shows that collaborative gameplay can strengthen interpersonal reasoning, enhance retention, and build confidence in clinical decision-making [25, 26–27].
The transition to a digital format was not intended to alter the face-to-face interaction but to enable automated data collection, consistent rule enforcement, and scalable deployment across institutions.
From a technical standpoint, the web-based platform allowed for seamless real-time interaction via Socket.io and supported robust data analytics through Redash. Tracking metrics such as buzzer response time, scoring accuracy, and session progression gives educators tools for both summative and formative assessment. These capabilities align with current trends in data-informed medical education [1, 28, 29–30].
Furthermore, the transition from Unity to a web-based interface was driven by the need for scalability, accessibility, and cross-platform compatibility, factors emphasized in post-pandemic educational innovation [31]. The open architecture of the game now allows for future integration with LMS systems, cross-specialty case scenarios, and potentially AI-driven feedback systems, directions well supported in recent literature on digital education platforms [32].
This study’s limitations include the absence of a control group, short observation window, and potential selection bias due to voluntary participation.
To address current limitations, future work will adopt a randomized or stepped-wedge, multi-site design with an active control (standard case discussion or morning rounds). We will recruit whole cohorts with stratification by academic level to reduce selection bias, and we will implement pre-registered protocols, a priori power calculations, and blinded rubric-based scoring with inter-rater reliability. Outcomes will include validated pre/post measures of clinical reasoning and critical thinking, OSCE performance, and longitudinal retention at 4–8 weeks. Facilitator influence will be minimized through standardized training, fidelity checklists, and platform-level constraints on rule changes; moderator effects will be modeled statistically. Missing data will be handled using multiple imputation with sensitivity analyses. These steps aim to strengthen internal validity, mitigate bias, and improve the generalizability of findings.
In contrast to established digital platforms such as AMBOSS, Prognosis: Your Diagnosis, and others, which primarily emphasize individual learning through static case libraries or passive quiz formats, DMRCT introduces a real-time, multiplayer decision-making experience. While AMBOSS offers comprehensive reference content and question banks, it lacks active diagnostic simulation. Prognosis focuses on case resolution in a single-player format, often without time pressure or peer interaction. DMRCT distinguishes itself by combining interactive diagnostic scenarios, live peer competition, time-based scoring, and automated performance tracking, thereby creating a dynamic, gamified environment that more closely mirrors the complexity and urgency of real-world clinical reasoning.
Ultimately, the DMRCT game demonstrates that serious gaming is more than just an engaging teaching method; it is a pedagogically sound intervention, grounded in problem-based and experiential learning principles, that fosters active participation, teamwork, and critical reasoning. At the same time, it is a data-rich platform that systematically captures objective metrics such as reaction times, diagnostic accuracy, and scoring trends, as well as subjective learner feedback. These features not only enable continuous monitoring of individual and group performance but also provide educators with actionable insights for formative assessment and curriculum improvement. As healthcare training increasingly embraces technology, tools like these will be critical to bridging cognitive skill gaps and fostering clinical excellence among future practitioners.
Conclusion
This study presented the design, development, and validation of a serious game platform aimed at enhancing critical thinking skills in pediatric medical education. The DMRCT game was found to be a valuable pedagogical tool, offering engaging, case-based learning that complements traditional methods like morning rounds. The combination of clinical realism, structured gameplay, and interactive competition enabled learners to practice decision-making, justify diagnoses, and collaborate in a risk-free digital environment.
The platform’s web-based architecture ensured broad accessibility and scalability, while the integration of real-time feedback mechanisms and analytics enabled moderators to track student progress and engagement. Initial feedback from students and educators underscored the game’s educational value, particularly in fostering confidence, motivation, and structured reasoning.
Looking ahead, the potential applications of the platform extend beyond pediatrics. Its modular structure makes it adaptable to other medical specialties, allowing for the development of unified, discipline-specific training tools. The integration of artificial intelligence (AI) to support moderator decisions, such as automated scoring or real-time feedback on differential diagnoses, represents a promising avenue for future development. Additionally, expanding the platform internationally could contribute to the standardization of critical thinking training across medical curricula.
Ultimately, serious games like DMRCT have the potential to reduce diagnostic errors, improve clinical reasoning, and reshape the way soft skills are taught in healthcare education, offering a scalable, data-driven, and learner-centered solution for the demands of modern medical training.
Planned future developments include the integration of AI-based feedback for individualized learning, the expansion of case libraries to cover diverse specialties (e.g., internal medicine, emergency medicine), the implementation of pre/post testing with validated tools to measure critical thinking gains, multi-institutional trials to validate effectiveness across diverse cohorts, and seamless integration with learning management systems and institutional dashboards.
As education continues to shift toward interactive, learner-centered formats, DMRCT offers a timely and evidence-informed solution to the challenge of cultivating clinical reasoning in the next generation of physicians.
Acknowledgements
Not applicable.
Clinical trial number
Not Applicable.
Authors’ contributions
MK, KC and ZG developed the initial idea; SR and CM designed the research approach. SR designed the product roadmap. MK, SR and CM contributed to the conceptualization and methodology of this research project. CZ and GM developed the software and prepared the original draft of the paper. CM, SR and MK reviewed and edited the original draft. MK, SR and CM supervised the research project. SR, CM and MK validated the results. MK and ZG made the first investigation and contributed with SR and CM on the findings. All authors have read and agreed to the published version of the manuscript.
Funding
Not applicable.
Data availability
The datasets used and/or analyzed during the current study are available from the corresponding author on reasonable request.
Declarations
Ethics approval and consent to participate
This study was reviewed by the Research Ethics Committee of the Holy Spirit University of Kaslik (USEK), Lebanon. It was deemed exempt from formal ethical approval as it did not involve identifiable personal data, clinical intervention, or biological materials. The research was conducted as part of an undergraduate biomedical engineering capstone project and was limited to simulated educational scenarios involving volunteer student participants. The study was carried out in accordance with the ethical principles of the Declaration of Helsinki (2013 revision) and the academic research guidelines of USEK. All participants were informed of the study’s educational purpose and voluntarily gave their verbal informed consent prior to taking part in the testing sessions.
Consent for publication
Not applicable.
Competing interests
The authors declare no competing interests.
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
References
1. Zayapragassarazan Z, Menon V, Kar S, Batmanabane G. Understanding critical thinking to create better doctors. [Online]. Available from: https://files.eric.ed.gov/fulltext/ED572834.pdf
2. Facione PA. Critical thinking: what it is and why it counts. 2015 update. Hermosa Beach (CA): Measured Reasons LLC; 2015. Available from: https://www.insightassessment.com
3. Shiau, S et al. Digital learning of clinical skills and its impact on medical students: A systematic review. BMC Med Educ; 2024; 24, 64. [DOI: https://dx.doi.org/10.1186/s12909-024-06471-2]
4. Ellaway, R; Masters, K. AMEE guide 32: E-Learning in medical education part 1: Learning, teaching and assessment. Med Teach; 2008; 30,
5. Ruiz, JG; Mintzer, MJ; Leipzig, RM. The impact of e-learning in medical education. Acad Med; 2006; 81,
6. Abdelrahman, M; Aljohani, A; Aljohani, A; Alharbi, A; Alnasser, S; Alshehri, A. Factors influencing the effectiveness of e-learning in healthcare: A scoping review. Front Public Health; 2023; 11, 1186140. [DOI: https://dx.doi.org/10.3389/fpubh.2023.1186140]
7. Sukhera, J et al. Exploring the role of digital technology for feedback exchange in clinical education: A systematic review. Syst Reviews; 2024; 13, 25. [DOI: https://dx.doi.org/10.1186/s13643-024-02705-y]
8. Djaouti D, Alvarez J, Jessel J-P, Rampnoux O. Origins of serious games. 2011. [DOI: https://dx.doi.org/10.1007/978-1-4471-2161-9_3]
9. Michael D, Chen S. Serious games: games that educate, train, and inform. Boston (MA): Thomson Course Technology; 2006. Available from: https://books.google.com/books?id=K2pQAAAAMAAJ
10. Montalbano L, Gallo L, Ferrante G, Malizia V, Cilluffo G, Fasola S, Alesi M, La Grutta S. Serious games: a new approach to foster information and practices about COVID-19? Heliyon. 2022;8(5):e09400. Available from: https://doi.org/10.1016/j.heliyon.2022.e09400.
11. Graafland, M; Schraagen, JM; Schijven, MP. Systematic review of serious games for medical education and surgical skills training. Br J Surg; 2012; 99,
12. Kayed, JE; Akl, T; Massoud, C et al. Serious game for radiotherapy training. BMC Med Educ; 2024; 24, 463. [DOI: https://dx.doi.org/10.1186/s12909-024-05430-1]
13. Damaševičius, R; Maskeliūnas, R; Blažauskas, T. Serious games and gamification in healthcare: A Meta-Review. Information; 2023; 14, 105. [DOI: https://dx.doi.org/10.3390/info14020105]
14. Jolly, AK; Selvarajah, D; Micallef, J et al. Adapting the gamified educational networking online learning management system to test a decentralized Simulation-Based education model to instruct Paramedics-in-Training on the emergency intraosseous access and infusion skill. Cureus; 2024; 04,
15. Montalbano L, Gallo L, Ferrante G, Malizia V, Cilluffo G, Fasola S, Alesi M, La Grutta S. Serious games: a new approach to foster information and practices about COVID-19? Front Robot AI. 2022;9:830950. Available from: https://doi.org/10.3389/frobt.2022.830950.
16. Buajeeb, W; Chokpipatkun, J; Achalanan, N et al. The development of an online serious game for oral diagnosis and treatment planning: evaluation of knowledge acquisition and retention. BMC Med Educ; 2023; 23, 830. [DOI: https://dx.doi.org/10.1186/s12909-023-04789-x]
17. Akl, EA; Pretorius, RW; Sackett, K et al. The effect of educational games on medical students’ learning outcomes: a systematic review. BMC Med Educ; 2010; 10, 29. [DOI: https://dx.doi.org/10.1186/1472-6920-10-29]
18. Boeker, M; Andel, P; Vach, W; Frankenschmidt, A. Game-based e-learning is more effective than a conventional instructional method: a randomized controlled trial with third-year medical students. PLoS ONE; 2013; 8,
19. Cheiban K. Developing medical reasoning and critical thinking in medical students. Master’s thesis. Holy Spirit University of Kaslik (USEK); Kaslik, Lebanon; 2019.
20. Cheiban K. Développement de la pensée critique et du raisonnement clinique [Development of critical thinking and clinical reasoning]. Kaslik: Holy Spirit University of Kaslik; 2018.
21. Kolb DA. Experiential learning: experience as the source of learning and development. Englewood Cliffs (NJ): Prentice Hall; 1984. Available from: https://books.google.com/books?id=jpbeAQAAIAAJ
22. Gorbanev, I; Agudelo-Londoño, S; González, RA et al. A systematic review of serious games in medical education: quality of evidence and pedagogical strategy. Med Educ Online; 2018; 23,
23. Dankbaar, MEW; Storm, DJ; Teeuwen, IC et al. A serious game can be a serious tool for training clinical reasoning: a randomized controlled trial. BMJ Simul Technol Enhanc Learn; 2017; 3,
24. Kiili, K. Digital game-based learning: towards an experiential gaming model. Internet High Educ; 2005; 8,
25. Noh S, Zin N, Mohamed H. Serious games requirements for higher-order thinking skills in science education. 2020. [Online]. Available from: https://thesai.org/Downloads/Volume11No6/Paper_27-Serious_Games_Requirements_for_Higher_Order.pdf [Accessed 15 July 2022].
26. Arias-Calderón M, Castro J, Gayol S. Serious games as a method for enhancing learning engagement: student perception on online higher education during COVID-19. Frontiers in Psychology. 2022;13:889975. https://doi.org/10.3389/fpsyg.2022.889975.
27. Wang, R; DeMaria, S; Goldberg, A; Katz, D. A systematic review of serious games in training health care professionals. Simul Healthc; 2016; 11,
28. Ellaway, RH; Pusic, MV; Galbraith, RM; Cameron, T. Developing the role of big data and analytics in health professional education. Med Teach; 2014; 36,
29. Chan, T; Sherbino, J. The role of technology in competency-based medical education. Perspect Med Educ; 2015; 4,
30. Zairi I, Dhiab M, Mzoughi K, Mrad I, Abdessalem I, Kraiem S. Serious game design with medical students as a learning activity for developing the 4Cs skills: communication, collaboration, creativity and critical thinking: a qualitative research. 2021. [Online]. Available from: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8796679/ [Accessed 15 July 2022].
31. Wilcha, RJ. Effectiveness of virtual medical teaching during the COVID-19 crisis: systematic review. JMIR Med Educ; 2020; 6,
32. Cheng, A; Kolbe, M; Grant, V et al. A conceptual framework for the development of virtual simulation in healthcare education. Simul Healthc; 2020; 15,
© The Author(s) 2025. This work is published under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 License (http://creativecommons.org/licenses/by-nc-nd/4.0/).