Background
Digital Problem-Based Learning (DPBL) is increasingly used to facilitate the development of knowledge and skills in medical education, yet student satisfaction and engagement with DPBL remain insufficiently understood.
Methods
This mixed-methods systematic review aimed to examine how medical students perceive and experience DPBL. We searched four databases (Feb 5–Jun 30, 2024) following JBI and PRISMA guidelines, yielding 3459 abstracts and 56 included studies. Studies published at any time and in any language were considered. Two researchers independently conducted screening, selection, quality assessment and analysis. A segregated approach was used to synthesize the data. This method included a thematic synthesis of the qualitative data and a narrative review/meta-analysis for quantitative data where appropriate. The findings of both syntheses were then integrated and validated by stakeholders.
Results
The mixed-methods synthesis demonstrated that the quantitative and qualitative findings complemented each other, offering a comprehensive understanding of medical students’ perceptions of DPBL. Overall, students evaluated DPBL positively, despite some mixed perceptions. Quantitatively, the satisfaction rate was 78.51% (95% CI: 78.07%–78.96%) across 20 studies. Qualitatively, students’ social perceptions varied, with some feeling isolated and others valuing the focused learning environment. DPBL tasks provided ownership, autonomy, and flexibility. Technology was useful, engaging, and motivational, though feedback was occasionally lacking. Visual and auditory features were appreciated, but tactile realism was limited. The study findings were validated by 10 medical students.
Conclusions
Our findings suggest that DPBL design still struggles to reconcile technological innovation with the social principles of traditional PBL. A hybrid model may offer a practical way to bridge this gap.
Background
Digital Problem-Based Learning (DPBL) has emerged as a particularly effective pedagogical and curricular approach in the context of health professions education. It is recognized for its superior ability to cultivate specific knowledge and skills when compared with traditional Problem-Based Learning (PBL) and conventional learning methods. Moreover, DPBL has been noted for its potential to enhance key aspects of traditional PBL [1].
Traditionally, PBL is rooted in constructivist learning theory [2, 3], represents a student-centered pedagogy, and is characterized by collaboration, personal autonomy, generativity, reflectivity, active engagement, personal relevance, and pluralism [4]. As articulated by Walker et al. [5], “PBL is an instructional (and curricular) learner-centered approach that empowers learners to conduct research, integrate theory and practice, and apply knowledge and skills to develop a viable solution to a defined problem. Critical to the success of the approach is the selection of ill-structured problems (often interdisciplinary) and a tutor who guides the learning process and conducts a thorough debriefing at the conclusion of the learning experience”. By incorporating digital tools into PBL, DPBL increases the authenticity of learning experiences [6], making them more reflective of real-world clinical situations. In turn, this authenticity enhances the appeal and accessibility of PBL [7, 8], particularly for a generation of students who are increasingly accustomed to digital interactions in their daily lives [9]. Additionally, the use of digital platforms has previously been found to foster improved communication and collaboration among participants [7, 8, 10], as these tools often include features that support real-time discussion, resource sharing, and teamwork. A core tenet of PBL, self-directed learning, is also strengthened through the use of digital technologies [7, 8, 10], which offer students greater autonomy in accessing resources, managing their learning pace, and exploring topics in greater depth.
Beyond the enhancement of knowledge, skills, and potential competencies, the integration of digital tools within PBL has been shown to significantly boost student engagement and motivation [11]. The interactive and often gamified nature of digital platforms can make the learning process more engaging, which is particularly important in maintaining student interest and participation in rigorous and demanding health professions education.
In medical education, PBL is frequently employed [12,13,14] due to its strong alignment with clinical settings and its effectiveness in promoting practical learning and skill acquisition [15, 16]. Traditionally, PBL has been conducted in conventional classroom settings [17, 18]. However, with recent advancements, PBL is increasingly integrated with digital components, including online learning, podcasts, digital games, educational software, and digital learning objects. The COVID-19 pandemic accelerated the adoption of online learning and digital tools for distance and asynchronous education [19]. Worldwide, medical schools were compelled to transition to online learning modalities [20]. This shift was already underway in many institutions, driven by Generation Z’s preference for off-site and on-demand learning [9] in addition to the value they ascribe to interactive and visually appealing content [11].
Digital PBL is broadly defined as the integration of various digital technologies into the delivery of PBL [1]. This expansive definition underscores the versatility of DPBL, which can encompass any digital method or technology integrated into a PBL curriculum to enhance learning. Digital PBL can be implemented through fully digital platforms, blended PBL environments, or in traditionally delivered face-to-face settings [1].
Despite these developments, the rapid adoption of digital methods has not afforded sufficient opportunity to thoroughly explore how to design digital interventions that are responsive to the perceptions of medical students and that effectively support their learning. There is a lack of clarity regarding how students perceive that digital tools enhance learning and how these tools should be integrated into the curriculum to maximize their benefits. Consequently, it remains unclear how medical students perceive these digital tools and how curriculum design attributes influence the perceived advantages of DPBL over traditional PBL. Therefore, one main challenge lies in designing digital curriculum interventions that align with medical students’ perceptions and support their learning processes. Gaining these insights might enable a more effective integration of DPBL into the design of educational experiences, allowing educators to fully leverage its benefits while addressing potential limitations.
This review aims to synthesise how medical students perceive and evaluate their experiences with DPBL. By integrating satisfaction measures with students’ narrated learning experiences, this study seeks to offer a clearer portrait of the complexities that shape their learning within these environments.
Subsequently, the following primary research question emerged:
How do medical students evaluate their perceptions of and satisfaction with DPBL?
Sub-questions:

* What is the level of satisfaction among medical students with the use of DPBL? (Quantitative)

* How do medical students perceive the human relational aspect, the tasks they are set, and the technical tools available to them in relation to DPBL? (Qualitative)
Methods
The reporting of this mixed-methods systematic review adheres to the Joanna Briggs Institute’s guidance [21] and the Preferred Reporting Items for Systematic reviews and Meta-Analyses (PRISMA) 2020 guidelines [22] to ensure a systematic and transparent approach. We employed the JBI MMSR approach, as detailed in Sect. 8.3, specifically utilizing a convergent segregated approach for synthesis and integration. The protocol was registered with the Open Science Framework.
Inclusion and exclusion criteria
To ensure the relevance of the included studies, a set of inclusion criteria was defined in advance, conceptualized using the Population, Intervention, Phenomena of interest, Outcomes, Context and Types of studies framework, as visualized in Table 1. In general, the included studies were those that reported medical students’ evaluation of the use of DPBL in medical education. Digital PBL was defined according to Tudor Car et al. [1]. The delivery of DPBL was not limited to specific digital technologies or time frames. DPBL was the mechanism through which students’ learning experiences were interpreted. Evaluations were included if they were assessed either qualitatively or quantitatively.
[IMAGE OMITTED: SEE PDF]
Search strategy
The systematic literature search was conducted as a block search with three blocks concerning “Medical education”, “Problem-based learning”, and “Digital technologies” to ensure a comprehensive literature search. It was designed in collaboration with an experienced research librarian. The aim was to include all articles related to medical students’ perspectives on the use of DPBL in medical education.
The literature search was conducted on February 5, 2024, and the inclusion of further studies ended on June 30, 2024. The search was conducted in the following databases with no limitation to language or time period: Medline via PubMed, Embase, ProQuest Eric, and Scopus. To increase literature saturation, reference lists of eligible studies were scanned for potential additional eligible studies. In addition, forward citation tracking of relevant articles was conducted. Relevant medical subject headings (MeSH in PubMed and Emtree in Embase) and text words related to “Medical education”, “Problem-based learning”, and “Digital technologies” were applied to the literature search.
The search strategy in each of the four databases can be seen in Supplemental material 1.
Evidence screening and selection
Following the search, all identified citations were uploaded to a reference management program (RefWorks) with duplicates removed. Subsequently, the titles and abstracts were screened by two independent reviewers (first author and last author) for assessment against the inclusion criteria. Full texts of the studies meeting the inclusion criteria were retrieved. Their details were then imported to the screening tool Rayyan [23] for management, assessment, and review of information.
The full texts of the selected studies were assessed by the same two independent reviewers (first author and last author), this time to determine in detail whether they met the inclusion criteria. Studies that did not meet the inclusion criteria were excluded with reasons documented. Disagreements concerning the assessment and the inclusion or exclusion of particular studies were resolved through discussion with a third reviewer (TR).
Pilot testing of the study selection process was undertaken by the review team to ensure the consistency of the approach.
Assessment of methodological quality
All included studies were critically appraised using standardized JBI critical appraisal instruments by two independent reviewers to assess methodological validity. The qualitative and mixed-methods studies were assessed by the first author and second author, and the quantitative studies by the first author and the last author. Disagreements were resolved through discussion and, when necessary, with a third reviewer (the second or last author). Quantitative studies and quantitative components of mixed-methods studies were appraised according to their study design, using one of the following instruments: the JBI Critical Appraisal Checklist for Randomized Controlled Trials [24], the JBI Critical Appraisal Checklist for Quasi-Experimental Studies [25], or the JBI Checklist for Analytical Cross-Sectional Studies [26]. All checklist items were assessed. However, only data related to our primary outcome of interest, student satisfaction, were evaluated; other outcomes reported under each item were not considered. Qualitative studies and the qualitative components of mixed-methods studies were assessed using the JBI Critical Appraisal Checklist for Qualitative Research [27]. The results of the critical appraisal were reported in narrative form and in tables. All studies, regardless of their methodological quality, underwent data extraction and synthesis (where possible). Studies were not excluded based on the quality of reporting, as this may lead to valuable insights being disregarded [28]. The results of the critical appraisal are incorporated into the Results section and used to refine the nuances of the findings in the Discussion section.
Data extraction
The first author and the last author independently extracted data from the included articles using a data extraction form in Rayyan. A pilot test was conducted prior to the actual data extraction. The extracted data aligned with the research questions and the study’s aim, including details on article characteristics such as authorship, country, location, population, study design, and type of DPBL. Authors of the primary studies were contacted for missing or additional data if required.
For the quantitative component, data were extracted from quantitative and mixed-methods (quantitative component only) studies, focusing on satisfaction outcomes reported as Likert-scale values or percentages. Similarly, for the qualitative component, data were extracted from qualitative and mixed-methods (qualitative component only) studies, including specific quotes and text from qualitative interviews, as well as free-text comments from questionnaires.
Data synthesis
This review followed a convergent segregated approach to synthesis and integration according to the JBI methodology for mixed methods systematic reviews [21]. This involved conducting separate quantitative and qualitative syntheses, followed by integration of the resulting quantitative and qualitative evidence.
To visualize the process of study inclusion, a PRISMA flow diagram was used. Additionally, a basic descriptive analysis of the frequency counts of publication year, study location, and type of DPBL was conducted. These descriptive results are presented visually using tables and graphs.
Quantitative synthesis
Satisfaction values reported as percentages were used directly. For studies using a Likert scale, we converted values to percentages using the formulas in Table 2. In some cases, satisfaction was averaged from the two highest values, a method we adopted to align with the findings of the respective articles. Overall student satisfaction with DPBL in medical education was calculated as a weighted mean percentage, including a 95% confidence interval (CI) of the overall weighted mean. For each study, 95% CIs for proportions were calculated using Wilson’s method, which provides more reliable interval estimates for binomial outcomes and ensures values remain within the 0–100% range.
[IMAGE OMITTED: SEE PDF]
Qualitative synthesis
The qualitative data synthesis and analysis of the qualitative findings were conducted by the first author and the last author using the Activity-Centered Analysis and Design (ACAD) model proposed by Goodyear et al. [29]. The ACAD model provided a metatheoretical framework for understanding complex learning situations by focusing on the activities at the heart of these perceptions. Unlike more prescriptive learning theories, the ACAD model emphasizes an activity-centered approach, which distinguishes it from models that focus on the teacher, content, or technology [29]. This approach was particularly well aligned with the PBL and constructivist theoretical backgrounds of this systematic mixed-methods review. In the context of DPBL, the ACAD model categorizes instructional design options into three distinct categories: social design, epistemic design, and set design. Social design refers to how students engage with tasks (individually, in groups, or as teams) and to the social roles involved. Epistemic design refers to the tasks that students are asked to perform. Lastly, set design refers to the digital tools with which students interact. By categorizing students’ perceptions into these three design options, the authors aimed to efficiently identify key design considerations for educators who design future DPBL interventions [29].
To minimize the risk of researcher confirmation bias and to arrive at a more robust and representative model aligned with the research evidence, medical students reviewed and evaluated the draft model, with its identified design variables, together with the first author. This also supplemented the expertise within the author group with the lived experience of those who are actively engaged in medical education on a regular basis.
The reflexive thematic analysis described by Braun and Clarke [30,31,32] inspired the qualitative analysis process. First, the quotes and comments from the included articles were read thoroughly [31, 32]. Then the data were coded in Nvivo®. Student quotes from the included qualitative studies were sorted into one of the three ACAD categories. Afterwards, the codes from each category were copied into a table in Word®: column 1 contained the quote, column 2 the initial code, and column 3 the proposed themes. We referred to the themes as meta-themes (rather than themes, as suggested by Braun and Clarke) because the quotes initially came from different sources. Based on column 2 of the table, the themes were identified in an iterative process. The themes were reviewed and visualized on a whiteboard to provide an overview of the content. These themes then produced a comprehensive set of synthesized findings that can be used as a basis for evidence-based practice. Where textual pooling was not possible, the findings were presented in narrative form.
Integration of quantitative evidence and qualitative evidence
The findings of each single method synthesis included in this systematic mixed-methods review were configured according to the JBI methodology for mixed methods systematic reviews [21]. This involved quantitative evidence and qualitative evidence being juxtaposed and organized into a line of argument to produce an overall configured analysis. Where configuration was not possible the findings were presented in narrative form.
Results
Study inclusion
The results from the search strategy and selection process can be seen in Fig. 1. Of the 56 studies included in the review (See Additional file 1: Table 1 for a mapping of key information from included studies), twenty studies were included in the quantitative analysis of satisfaction [33,34,35,36,37,38,39,40,41,42,43,44,45,46,47,48,49,50,51,52].
Forty-four studies were included in the qualitative analysis of medical students’ perceptions [6, 17, 20, 34, 35, 38, 39, 41, 45, 47, 50, 53,54,55,56,57,58,59,60,61,62,63,64,65,66,67,68,69,70,71,72,73,74,75,76,77,78,79,80,81,82,83,84,85].
Out of the 20 quantitative and 44 qualitative studies mentioned, eight studies had data included in both the qualitative and quantitative analysis [34, 35, 38, 39, 41, 45, 47, 50].
[IMAGE OMITTED: SEE PDF]
Methodological quality
Quantitative studies
All twenty quantitative studies were evaluated using the appropriate JBI critical appraisal tool for each study design. The overall methodological quality of the included studies within the quantitative analysis was rated 6/8, 7/9 and 8/13, depending on the checklist applied. Specific methodological challenges for each group of studies are described below.
Eight of the twenty quantitative studies included in the quantitative analyses [33, 36, 40,41,42,43, 46, 48] were evaluated with the JBI checklist for analytical cross-sectional studies [26] (Supplemental material 2). In general, a methodological challenge was identified primarily in relation to confounding factors (Q5/Q6).
Three of the twenty quantitative studies included in the quantitative analysis were mixed-methods studies [34, 45, 47], but the quantitative components of these studies were evaluated based on the revised JBI critical appraisal tool for the assessment of risk of bias of quasi-experimental studies [25], alongside 7 other studies with this study design [35, 37,38,39, 44, 49, 52] (Supplemental material 2). Regarding the reported outcomes, we only evaluated satisfaction in questions 5–9. Overall, the studies met several criteria for internal validity, but there were some shortcomings, particularly regarding bias related to participant retention (Q8).
Two of the quantitative studies included in the quantitative analysis [50, 51] were evaluated based on the JBI critical appraisal tool for the assessment of risk of bias for randomized controlled trials [24] (Supplemental material 2). Regarding the reported outcomes, we only evaluated satisfaction in questions 7–12. In general, methodological challenges were identified primarily in relation to allocation concealment, blinding, follow-up, and details on attrition (Q2, Q4, Q5, Q7 and Q10).
Qualitative studies
The 44 studies included in the qualitative analysis had their qualitative components evaluated with the JBI Critical Appraisal Checklist for Qualitative Research [27] (Supplemental material 2). Overall, most of the methodological issues were addressed in the included studies (average of 6/10). However, the primary challenges identified pertained to congruity, as well as researcher positioning and influence.
Characteristics of included studies
Key information from the included studies is presented in Additional file 1: Table 1.
The included studies were published over a long period, from 1997 to 2024 (Fig. 2). Despite a smaller peak in 2012, most publications appeared late in the period, especially after the COVID-19 pandemic. The highest number of publications appeared in 2023 (n = 13).
[IMAGE OMITTED: SEE PDF]
Most of the included publications originate from research conducted in Europe (n = 20). North America, Asia, and the Middle East also contributed frequently. In contrast, no publications from Africa were represented.
From Additional file 1: Table 1 it can be seen that the most prevalent location was within a university (n = 46), whereas ten studies were conducted within a clinical setting, either at a hospital department or within general practice. In total, 10,120 medical students were included in the 56 studies; the quantitative analysis was based on 3,504 medical students and the qualitative analysis on 6,616.
The most prevalent type of technology in the included studies was “Online teaching and learning”, which encompassed various modes of digital instruction, such as attending online lectures, participating in virtual discussions, or engaging with peers through online forums. This was followed by the use of “Simulation/virtual environments and patients” (Table 3). However, these technologies were not represented until 2008. In contrast, video has been used throughout the entire period covered by this review (1997 to 2024), evolving from videodisc-based cases to interactive videoconferencing.
[IMAGE OMITTED: SEE PDF]
Quantitative evidence
What is the level of satisfaction among medical students with the use of DPBL?
Twelve studies [33,34,35,36,37,38,39,40,41,42,43,44] presented overall student satisfaction with the DPBL intervention as percentages directly in their results sections. In three of these studies [35, 43, 44], the authors defined satisfaction as the two or more highest response values.
Six articles presented their results as a mean value on a 5-point Likert scale [45,46,47,48,49,50]. Two articles measured satisfaction using scales with other than five points: Raupach et al. measured satisfaction on a 6-point Likert scale (1 = ‘excellent’, 6 = ‘poor’) [51], whereas Diaz-Perez et al. rated the teaching method from 0.0 (poorly recommended) to 10 (highly recommended) [52].
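Converting Likert means from scales of differing lengths and directions onto a common 0–100% axis can be done with a simple min–max rescaling. The exact formulas used in this review are given in Table 2 (omitted here); the sketch below is a hypothetical illustration of one common approach, not the authors’ verified conversion.

```python
def likert_to_percent(mean_score: float, low: float, high: float,
                      reversed_scale: bool = False) -> float:
    """Min-max rescaling of a Likert mean onto 0-100%.

    `low` and `high` are the scale endpoints. For reversed scales
    (e.g. 1 = 'excellent', 6 = 'poor'), the direction is flipped so
    that 100% always means maximal satisfaction."""
    if reversed_scale:
        return 100 * (high - mean_score) / (high - low)
    return 100 * (mean_score - low) / (high - low)

# Hypothetical examples:
# a mean of 4.2 on a 1-5 scale maps to 80%
pct_5pt = likert_to_percent(4.2, 1, 5)
# a mean of 2.0 on a reversed 1-6 scale (1 = excellent) also maps to 80%
pct_6pt = likert_to_percent(2.0, 1, 6, reversed_scale=True)
```

The design choice worth noting is direction handling: without flipping reversed scales, a low (good) score on a 1 = ‘excellent’ scale would be misread as low satisfaction when pooled with standard scales.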
Overall, student satisfaction with the use of DPBL in medical education was high. As can be seen from Fig. 3, an overall satisfaction rate of 78.51% (95% CI of the overall weighted mean: 78.07%–78.96%) was found across the 20 studies directly reporting satisfaction in their results sections.
[IMAGE OMITTED: SEE PDF]
Qualitative evidence
How do medical students perceive the human relational aspect, the tasks they are set, and the technical tools available to them in relation to DPBL?
From the analysis of the qualitative studies, 453 quotes from interviews and free comments from students regarding their perceptions of DPBL were analyzed. One hundred and ninety-five quotes were placed under epistemic design, 150 quotes were placed under set design and finally, 108 quotes were placed under social design of the ACAD-model. The subsequent thematic analysis of the quotes from the categories gave the following meta-themes under each design modality, as shown in Table 4. The following sections provide a detailed account of the meta-themes identified within the three ACAD-design modalities. Selected codes are included where they most effectively illustrate the core dimensions of each meta-theme, although they are not discussed in detail.
[IMAGE OMITTED: SEE PDF]
Social design (how students engage with each other)
Three meta-themes emerged: Social Relations, Focus, and Communication.
Social Relations captured students’ mixed experiences with interpersonal relations within a distance learning setting. Some felt isolated and missed physical presence and relational contact with other people, while others appreciated the reduced social presence. Social relations were also described in binary terms of either having contact or not, and the quality of the contact was elaborated on.
Some students found that the learning space became increasingly psychologically safe due to the online delivery mode, while others described feeling more insecure in the same learning environment – highlighting significant variation in the perceived quality of interaction.
Focus referred to students’ perception of increased concentration and task orientation in digital environments. The online format was seen as more efficient, with fewer distractions and more professional communication. The lack of physical context (e.g. travel and weather) induced by the technology was also viewed as a benefit due to increased efficiency and equality.
Communication gave rise to a specific human dimension in relation to the need for communication with other people. Not only was it important for the students to have contact with others, they also stressed the importance of qualitative aspects - particularly the ability to interpret body language and engage in more genuine, meaningful conversations.
Together, these themes highlight the nuanced ways in which digital formats reshape social dynamics in PBL, offering both opportunities and challenges.
Epistemic design (the nature of the learning task)
Three meta-themes were identified: Ownership, Contentment, and Digital Pedagogy.
Tasks within the digital PBL context gave the students a sense of Ownership. This was explained by the flexibility in the design of the task, which allowed students greater autonomy in deciding, for example, when and where to engage with the learning activities. They described DPBL tasks as both efficient and convenient. One student highlighted this by stating: “Online learning has saved a great amount of time. Time that I had spent commuting, eating, getting dressed, etc., all had been spared with online learning”. Ownership was also reflected in the opportunity to revisit the course content when needed and to work at their own self-guided pace, reinforcing their autonomy and control over the learning process.
Contentment among the students was generally displayed in the language they used when describing the DPBL experience. They expressed being “pleased” with the tasks and described them as “useful”, “motivational”, “engaging”, “enjoyable”, “relevant”, and even felt “enthusiastic” while working on them. This overall satisfaction captured their positive emotional response to the learning format. However, in contrast to this general sense of contentment, some students also pointed to a lack of feedback in relation to the tasks, which they felt diminished their learning experience.
The final meta-theme of this category was Digital Pedagogy, which described how the students experienced the structure and delivery of the tasks as supportive for their learning. They frequently emphasized that the tasks provided clear guidance and instruction, along with a valued structure and overview that helped them navigate the content. Students also appreciated the level of preparation, the illustrative elements, and the appropriate number of resources. The academic level of the tasks was highlighted as well, particularly in terms of differentiated learning tailored to their individual needs. While some students praised the feedback they received, others pointed to its absence – indicating a degree of variation in how feedback was implemented across different contexts.
These themes suggest that well-designed tasks can foster motivation and engagement, but also underscore the importance of consistent feedback and instructional clarity.
Set design (the digital technologies with which students interact)
Within this category three meta-themes emerged: Representation of reality, Technical Issues, and Value.
One of the three meta-themes arising from this category was Representation of reality, with students’ perspectives centering on the role of the senses in shaping their understanding. Both visual and auditory aspects of the technologies were highlighted as helpful in improving comprehension of the subject matter. In contrast, the tactile dimension was consistently described as lacking, which limited the perceived realism of the digital tools. Students often evaluated the technologies by comparing them to real-life experiences, forming what could be described as a taxonomy of realism. This ranged from the least realistic (such as text or printed pictures) to digital pictures or video, followed by simulations and finally real clinical settings, which students considered the most authentic and desirable learning environments.
The second meta-theme arising from this category was Technical Issues, which, as expected, included a range of challenges commonly associated with online or distance learning. These included problems such as unstable internet connections, poor sound quality, background noise, and issues with muting microphones. Other difficulties were related to accessing websites, switching between applications, and situations in which not all group members could participate. In addition, students expressed a need for more practice in using the technology and feeling sufficiently skilled and experienced. Usability was also a recurring concern, with students preferring tools that were easy to navigate and time-efficient. These technical challenges occasionally hindered participation and required students to develop new digital skills to manage the learning environment effectively.
The theme of Value reflected students’ overall positive attitudes toward the digital tools, which were often seen as supportive elements in their learning process. The motivation for using the technologies was generally high, and students expressed considerable optimism. They highlighted how the technologies made learning more manageable and adaptable to their individual needs. Despite occasional technical challenges, the perceived benefits, such as greater flexibility, time savings, and ease of access, often outweighed the drawbacks, reinforcing the sense that the technologies added meaningful value to their educational experience.
These findings suggest that while technology is generally well-received, its effectiveness depends on both its realism and usability.
Mixed-methods synthesis of findings
The quantitative synthesis revealed that medical students’ satisfaction with the use of DPBL in medical education was generally high. The quantitative data showed a satisfaction rate of 78.51% (95% CI of the overall weighted mean 78.07% − 78.96%) from the 20 studies included in the quantitative analysis of satisfaction.
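The pooled estimate above can be illustrated with the standard weighted-mean formulation. This is a simplified sketch that assumes weights proportional to study sample size; the review’s exact weighting and variance model is specified in the Methods.

```latex
% Weighted mean of study-level satisfaction rates (illustrative sketch;
% weights here assumed proportional to study sample size n_i)
\[
  \bar{p} \;=\; \frac{\sum_{i=1}^{k} n_i\, p_i}{\sum_{i=1}^{k} n_i},
  \qquad
  \text{95\% CI} \;=\; \bar{p} \,\pm\, 1.96\,\widehat{\mathrm{SE}}(\bar{p}),
\]
where $p_i$ is the satisfaction rate and $n_i$ the sample size of study $i$
(over the $k = 20$ included studies), corresponding to the reported
$\bar{p} = 78.51\%$ with 95\% CI $78.07\%$--$78.96\%$.
```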
The qualitative data, which included 453 quotes from interviews and free-text comments from students, were analyzed and categorized within the ACAD model under the three design modalities: social, epistemic, and set design. The qualitative synthesis revealed a highly nuanced variation in perceptions of the DPBL experience. Several themes included both positive and negative perceptions, such as Social Relations and Representation of Reality, revealing a complex relationship with DPBL experiences. Other themes were more clearly positive or negative, such as Ownership and Technical Issues.
Together, the two syntheses supported each other and provided a more comprehensive understanding of medical students’ experiences with DPBL, increasing confidence in the findings. The quantitative synthesis helped clarify the complexity of the qualitative findings, yielding a mixed-methods synthesis indicating that medical students hold both positive and negative perceptions of DPBL educational activities but, overall, evaluate them positively. In conclusion, medical students are generally content with DPBL and appear to value its positive aspects most.
Stakeholder validation
Ten medical students (nine from the 3rd semester and one from the 11th semester) validated the findings. In addition to validating the findings, the students provided nuanced feedback. They generally expressed satisfaction with DPBL, highlighting its flexibility, autonomy, and motivational aspects. However, they emphasized the importance of social interaction, noting that digital formats may contribute to feelings of isolation. Several students also pointed out the need for consistent feedback and mentioned that while visual and auditory elements are well supported in digital formats, tactile learning remains a challenge. Their comments underscored the diversity of preferences and the importance of balancing digital tools with interpersonal interaction in DPBL.
Discussion
The growing integration of digital technologies into PBL has given rise to considerable debate in the field of medical education, particularly with regard to the influence of such innovations on students’ learning [86]. While DPBL offers increased flexibility and access, concerns have been raised about its ability to preserve the collaborative and student-centered principles that underpin traditional PBL. Despite its increasing popularity, the voice of the medical student has been overlooked in this debate. This limitation creates a gap in our understanding of DPBL’s educational quality and possible deficiencies. Without incorporating the student perspective, the debate risks becoming overly technocratic, missing critical insights into how design choices shape the learning experience.
To address this gap, this review synthesized the satisfaction reported by medical students regarding DPBL and analyzed their narrated experiences within such learning environments. By combining quantitative and qualitative findings from 56 studies, the review aimed to clarify how specific elements of curriculum and pedagogy may shape the strengths and limitations of DPBL in the field of medical education. The elements in question included peer interaction, potential isolation, facilitator-student relationships, task design, level of autonomy, feedback mechanisms, digital tools, sensory realism, and technical issues related to the use of digital platforms.
The quantitative synthesis demonstrated that DPBL was generally considered to be satisfactory, with a weighted average of 78.51% across 20 studies. Nevertheless, satisfaction scores exhibited marked variability, ranging from 50 to 100%, and inconsistencies were identified in how satisfaction was measured across studies. These discrepancies underscore the need for caution when interpreting satisfaction as a sole indicator of DPBL educational quality. This variability mirrors the findings reported by Tudor Car et al. [1] in their review of nine randomised controlled trials. The inconsistencies may be partially explained by contextual differences across studies, variations in how satisfaction is defined and measured, and differing student expectations and backgrounds [44, 46]. Our methodological approach helped to harmonise these differences and facilitated a more integrated interpretation. Nevertheless, the identified discrepancies underline the necessity for a more standardised approach to measuring satisfaction in DPBL research, enabling more valid comparisons and informing improvements in DPBL design.
To address this issue, we propose the development of standardized core outcome sets (COS) [87] specifically tailored to evaluating DPBL in medical education. COS have been successfully implemented in fields such as rheumatology [88], pediatrics [89, 90], and oncology [91,92,93] to improve outcome consistency across studies. Establishing a COS for DPBL research could support more rigorous and consistent assessments and promote more evidence-informed decisions in curriculum design [87].
Additionally, to shed light on what underlies the average DPBL satisfaction score, we examined 453 student quotes from 44 studies, analyzed using the ACAD framework [29]. This lens made it possible to identify how nuances of students’ experiences were linked to three specific dimensions of DPBL design: social interaction, task structure, and the digital environment. The experiences of students within these dimensions were diverse and, at times, contradictory. While some students valued the flexibility, autonomy, and concentration provided by digital formats, others described feelings of isolation, insufficient feedback, and a perceived lack of tactile realism. These contrasting accounts suggest that the perceived benefits and challenges of DPBL can coexist, yet favourable satisfaction scores may mask key tensions experienced by students.
These results are consistent with previous studies, which, despite reporting favourable student perceptions of DPBL, also identified crucial tensions in its implementation [6, 11, 14, 15, 94]. However, by using the ACAD framework, our review offers a more structured way to interpret the tensions experienced by students—not as isolated issues but as consequences of specific design decisions related to social, epistemic, and technological aspects of DPBL [29]. This interpretive lens shifts the conversation from identifying problems to understanding how they might be intentionally addressed through curriculum design, including planning the social arrangements, the nature of the learning tasks, and the technological features of the environment.
Despite the favourable average satisfaction, our qualitative synthesis suggests concerns regarding the extent to which DPBL can fulfil the core principles of PBL, particularly those related to social learning. As emphasised by foundational authors such as Barrows [2], Savery and Duffy [3], and Lebow [4], collaboration and interaction are central to the philosophy of PBL. While these authors could not have anticipated the profound changes brought by digital technologies, our findings suggest that DPBL still struggles to reconcile technological innovation with the social aspects of learning that remain pedagogically essential. This includes considering the frequency and quality of student–facilitator interactions, feedback mechanisms, and the opportunities for peer learning. One potential solution to this issue is to integrate a hybrid approach combining digital and in-person PBL activities. This approach may offer a more supportive and balanced learning environment.
In considering task design, several students stated that they appreciated the autonomy offered by DPBL; however, they also experienced difficulties due to unclear expectations and limited guidance. Learning tasks could be structured with progressive difficulty, timely formative feedback, and clearly articulated goals to support independent learning while minimizing ambiguity. While Babitsch et al. [95] posit that DPBL can be implemented fully online with outcomes comparable to traditional PBL, such recommendations may underestimate the challenges students face when learning tasks are insufficiently understood or poorly supported.
Furthermore, within the digital environment, students frequently characterised the tools as lacking sufficient sensory realism. To enhance authenticity, educators may benefit from incorporating virtual patients or interactive dashboards that reflect clinical reality more accurately. Elnaga et al. [96] employed high-fidelity virtual patient simulators and reported promising results in enhancing student performance by simulating full-body patient interaction. Budakoğlu et al. [97] also claim that multimedia formats, such as animated patient scenarios, can improve the perceived realism of DPBL.
Finally, although this review did not apply language restrictions, only a limited number of publications originated from some geographical regions, particularly Africa and South America. Consequently, variations in how students from these regions perceive and experience DPBL may have been underrepresented. Future research should explore a wider spectrum of cultural and educational contexts to ensure a more globally relevant understanding of DPBL in medical education.
Conclusion
This review demonstrates that although DPBL is generally well received by medical students, satisfaction alone is an inadequate indicator of its educational quality. Through a synthesis of quantitative and qualitative data, we provide evidence that students’ learning experiences are influenced by an interplay of curricular and pedagogical design aspects (some valued, others problematic) that are often hidden by aggregated metrics. Therefore, successful implementation of DPBL requires more than technological innovation; it demands careful attention to social dynamics, particularly the interactions between students and facilitators, which significantly influence the learning experience. Our findings indicate that DPBL continues to struggle with reconciling technological innovation and the pedagogically essential social dimensions of learning. A thoughtfully designed hybrid DPBL model may offer a promising path forward by integrating digital flexibility with the collaborative foundations of traditional PBL.
Data availability
The datasets supporting the conclusions of this article are included within the article (and its additional files).
Abbreviations
DPBL:
Digital problem-based learning
PBL:
Problem-based learning
JBI:
Joanna Briggs Institute
PRISMA:
Preferred reporting items for systematic reviews and meta-analyses
MMSR:
Mixed-methods systematic review
ACAD:
Activity-centred analysis and design
Tudor Car L, Kyaw BM, Dunleavy G, Smart NA, Semwal M, Rotgans JI, et al. Digital problem-based learning in health professions: systematic review and meta-analysis by the digital health education collaboration. J Med Internet Res. 2019;21(2):e12945.
Barrows HS. How to design a problem based curriculum for the preclinical years. New York: Springer Publishing Co; 1985.
Savery JR, Duffy TM. Problem based learning: an instructional model and its constructivist framework. Educational Technol. 1995;35(5):31–8.
Lebow D. Constructivist values for instructional systems design: five principles toward a new mindset. Educ Tech Res Dev. 1993;41(3):4–16.
Walker A, Leary H, Hmelo-Silver CE, Ertmer PA, editors. Essential readings in problem-based learning: exploring and extending the legacy of Howard S. Barrows. 1st ed. West Lafayette: Purdue University Press; 2015.
Silén C, Wirell S, Kvist J, Nylander E, Smedby O. Advanced 3D visualization in student-centred medical education. Med Teach. 2008;30(5):115.
Rich SK, Keim RG, Shuler CF. Problem-based learning versus a traditional educational methodology: a comparison of preclinical and clinical periodontics performance. J Dent Educ. 2005;69(6):649–62.
McParland MNL, Livingston G. The effectiveness of problem-based learning compared to traditional teaching in undergraduate psychiatry. Med Educ. 2004;38(8):859–67.
Shorey S, Chan V, Rajendran P, Ang E. Learning styles, preferences and needs of generation Z healthcare students: scoping review. Nurse Educ Pract. 2021;57: 103247.
Clark CE. Problem-based learning: how do the outcomes compare with traditional teaching? Br J Gen Practice: J Royal Coll Gen Practitioners. 2006;56(530):722–3.
Chiu CJ, Lee J. A video game and film based PBL environment for online learning. J Educ Technol Soc. 2013;16(3):73–83.
Hassan SS, Nausheen F, Scali F, Mohsin H, Thomann C. A constructivist approach to teach neuroanatomy lab: students’ perceptions of an active learning environment. Scott Med J. 2022;67(3):80–6.
Burgun A, Darmoni S, Duff FL, Wéber J. Problem-based learning in medical informatics for undergraduate medical students: an experiment in two medical schools. Int J Med Inform. 2006;75(5):396–402.
Ibrahim ME, Al-Shahrani AM, Abdalla ME, Abubaker IM, Mohamed ME. The effectiveness of problem-based learning in acquisition of knowledge, soft skills during basic and preclinical sciences: medical students’ points of view. Acta Inform Med. 2018;26(2):119–24.
Gerzina TM, Worthington R, Byrne S, McMahon C. Student use and perceptions of different learning aids in a problem-based learning (PBL) dentistry course. J Dent Educ. 2003;67(6):641–53.
Beachey WD. A comparison of problem-based learning and traditional curricula in baccalaureate respiratory therapy education. Respir Care. 2007;52(11):1497–506.
Hutchcraft ML, Wallon RC, Fealy SM, Jones D, Galvez R. Evaluation of the road to birth software to support obstetric problem-based learning education with a cohort of pre-clinical medical students. Multimodal Technologies and Interaction. 2023;7(8): 84.
Zahid MA, Varghese R, Mohammed AM, Ayed AK. Comparison of the problem based learning-driven with the traditional didactic-lecture-based curricula. Int J Med Educ. 2016;7:181–7.
Aristovnik A, Keržič D, Ravšelj D, Tomaževič N, Umek L. Impacts of the Covid-19 pandemic on life of higher education students: global survey dataset from the first wave. Data Brief. 2021;39: 107659.
Yamamoto K, Akiyoshi K, Kondo H, Akioka H, Teshima Y, Yufu K, et al. Innovations in online classes introduced during the COVID-19 pandemic and their educational outcomes in Japan. BMC Med Educ. 2023;23(1):894.
Lizarondo L, Stern C, Carrier J, Godfrey C, Rieger K, Salmond S, et al. Chapter 8: Mixed methods systematic reviews. In: Aromataris E, Lockwood C, Porritt K, Pilla B, Jordan Z, editors. JBI manual for evidence synthesis. JBI; 2024. Available from: https://synthesismanual.jbi.global.
Page MJ, McKenzie JE, Bossuyt PM, et al. The PRISMA 2020 statement: an updated guideline for reporting systematic reviews. BMJ. 2021;372:n71.
Ouzzani M, Hammady H, Fedorowicz Z, Elmagarmid A. Rayyan—a web and mobile app for systematic reviews. Syst Rev. 2016;5:210.
Barker TH, Stone JC, Sears K, Klugar M, Tufanaru C, Leonardi-Bee J, et al. The revised JBI critical appraisal tool for the assessment of risk of bias for randomized controlled trials. JBI Evid Synth. 2023;21(3):494–506.
Barker TH, Habibi N, Aromataris E, Stone JC, Leonardi-Bee J, Sears K, et al. The revised JBI critical appraisal tool for the assessment of risk of bias for quasi-experimental studies. JBI Evid Synth. 2024;22(3):378–88.
Moola S, Munn Z, Tufanaru C, Aromataris E, Sears K, Sfetcu R, et al. Chapter 7: Systematic reviews of etiology and risk. In: Aromataris E, Munn Z, editors. JBI manual for evidence synthesis. JBI; 2020.
Lockwood C, Munn Z, Porritt K. Qualitative research synthesis: methodological guidance for systematic reviewers utilizing meta-aggregation. International Journal of Evidence-Based Healthcare. 2015;13(3):179–87.
Hannes K. Chapter 4: critical appraisal of qualitative research. In: Noyes JBA, Hannes K, Harden A, Harris J, Lewin S, Lockwood C, editors. Supplementary guidance for inclusion of qualitative research in Cochrane systematic reviews of interventions. Cochrane Collaboration Qualitative Methods Group; 2011.
Goodyear P, Carvalho L, Yeoman P. Activity-centred analysis and design (ACAD): core purposes, distinctive qualities and current developments. Educ Tech Res Dev. 2021;69(2):445–64.
Braun V, Clarke V. Using thematic analysis in psychology. Qual Res Psychol. 2006;3(2):77–101.
Braun V, Clarke V. Reflecting on reflexive thematic analysis. Qual Res Sport Exerc Health. 2019;11(4):589–97.
Braun V, Clarke V. One size fits all? What counts as quality practice in (reflexive) thematic analysis? Qual Res Psychol. 2021;18(3):328–52.
Gürpinar E, Zayim N, Başarici I, Gündüz F, Asar M, Oǧuz N. E-learning and problem based learning integration in cardiology education. Anadolu Kardiyoloji Derg. 2009;9(3):158–64.
Woltering V, Herrler A, Spitzer K, Spreckelsen C. Blended learning positively affects students’ satisfaction and the role of the tutor in the problem-based learning process: results of a mixed-method evaluation. Adv Health Sci Educ. 2009;14(5):725–38.
Cowan B, Sabri H, Kapralos B, Porte M, Backstein D, Cristancho S, et al. A serious game for total knee arthroplasty procedure, education and training. J Cyber Ther Rehabil. 2010;3(3):285–98.
Alamro AS, Schofield S. Supporting traditional PBL with online discussion forums: a study from Qassim medical school. Med Teach. 2012;34(Suppl 1):20.
Harris DM, Ryan K, Rabuck C. Using a high-fidelity patient simulator with first-year medical students to facilitate learning of cardiovascular function curves. Adv Physiol Educ. 2012;36(3):213–9.
Kim K-, Kee C. Evaluation of an e-PBL model to promote individual reasoning. Med Teach. 2013;35(3):e978–83.
Reis LO, Ikari O, Taha-Neto KA, Gugliotta A, Denardi F. Delivery of a urology online course using moodle versus didactic lectures methods. Int J Med Inform. 2015;84(2):149–54.
Alkhowailed MS, Rasheed Z, Shariq A, Elzainy A, El Sadik A, Alkhamiss A, et al. Digitalization plan in medical education during COVID-19 lockdown. Inf Med Unlock. 2020;20: 100432.
Alduraibi SK, El Sadik A, Elzainy A, Alsolai A, Alduraibi A. Medical imaging in problem-based learning and impact on the students: A cross-sectional study. J Pak Med Assoc. 2022;72(9):1731–5.
Bhattacharya S, Pal N, Ghosh K, Chakrabarty P, Saha S, Das D. Perception and experience of medical students regarding hybrid problem-based learning technique at a medical college in West Bengal, India: a cross-sectional study. J Clin Diagn Res. 2022;16(9):DC24-7.
Alsaif F, Neel L, Almuaiqel S, Almuhanna A, Alanazi JFO, Almansour M, et al. Experience of sudden shift from traditional to virtual Problem-Based learning during COVID-19 pandemic at a medical college in Saudi Arabia. Adv Med Educ Pract. 2023;14:453–61.
Chen J, Gao B, Wang K, Lei Y, Zhang S, Jin S, et al. Wechat as a platform for blending problem/case-based learning and paper review methods in undergraduate paediatric orthopaedics internships: a feasibility and effectiveness study. BMC Med Educ. 2023;23(1):322.
Kasai H, Shikino K, Saito G, Tsukamoto T, Takahashi Y, Kuriyama A, et al. Alternative approaches for clinical clerkship during the COVID-19 pandemic: online simulated clinical practice for inpatients and outpatients-a mixed method. BMC Med Educ. 2021;21(1): 149.
Razzak RA, Hasan Z, Stephen A. Medical student perceptions of integration of a customized cloud based learning operating system into problem based learning tutorials. Electron J e-Learning. 2020;18(1):25–39.
Arain SA, Afsar NA, Rohra DK, Zafar M. Learning clinical skills through audiovisual aids embedded in electronic-PBL sessions in undergraduate medical curriculum: perception and performance. Adv Physiol Educ. 2019;43(3):378–82.
Awan ZA, Awan AA, Alshawwa L, Tekian A, Park YS. Assisting the integration of social media in problem-based learning sessions in the faculty of medicine at King Abdulaziz university. Med Teach. 2018;40:S37–42.
Yan X, Zhu Y, Fang L, Ding P, Fang S, Zhou J, et al. Enhancing medical education in respiratory diseases: efficacy of a 3D printing, problem-based, and case-based learning approach. BMC Med Educ. 2023;23(1):512.
Servos U, Reiß B, Stosch C, Karay Y, Matthes J. A simple approach of applying blended learning to problem-based learning is feasible, accepted and does not affect evaluation and exam results—a just pre-pandemic randomised controlled mixed-method study. Naunyn-Schmiedebergs Arch Pharmacol. 2023;396(1):139–48.
Raupach T, Muenscher C, Anders S, Steinbach R, Pukrop T, Hege I, et al. Web-based collaborative training of clinical reasoning: a randomized trial. Med Teach. 2009;31(9):431.
Diaz-Perez JA, Echeverri-Perico JH, Hurtado-Gomez GA, Aranda-Valderrama P, Vera-Cala LM. Evaluation of a teaching system based on vertical integration of clinical areas, virtual autopsy, pathology museum and digital microscopy for medical students. Lab Invest. 2011;91: 130A.
Kraft SK, Honebein PC, Prince MJ, Marrero DG. The socrates curriculum: an innovative integration of technology and theory in medical education. J Audiov Media Med. 1997;20(4):166–71.
Baer RW, Chamberlain NR. Environmental factors promoting the effective use of a computer-assisted clinical case for second-year osteopathic medical students. J Am Osteopath Assoc. 1998;98(7):380–5.
Carr MM, Hewitt J, Scardamalia M, Reznick RK. Internet-based otolaryngology case discussions for medical students. J Otolaryngol. 2002;31(4):197–201.
Allen M, Sargeant J, Mann K, Fleming M, Premi J. Videoconferencing for practice-based small-group continuing medical education: feasibility, acceptability, effectiveness, and cost. J Contin Educ Health Prof. 2003;23(1):38–47.
Bowdish BE, Chauvin SW, Kreisman N, Britt M. Travels towards problem based learning in medical education (VPBL). Instr Sci. 2003;31(4–5):231–53.
Ryan G, Dolling T, Barnet S. Supporting the problem-based learning process in the clinical years: evaluation of an online clinical reasoning guide. Med Educ. 2004;38(6):638–45.
Balslev T, De Grave WS, Muijtjens AMM, Scherpbier AJJA. Comparison of text and video cases in a postgraduate problem-based learning format. Med Educ. 2005;39(11):1086–92.
Kerfoot BP, Masser BA, Hafler JP. Influence of new educational technology on problem-based learning at Harvard Medical School. Med Educ. 2005;39(4):380–7.
Nathoo AN, Goldhoff P, Quattrochi JJ. Evaluation of an interactive case-based online network (ICON) in a problem based learning environment. Adv Health Sci Educ Theory Pract. 2005;10(3):215–30.
de Leng B, Dolmans D, van de Wiel M, Muijtjens A, van der Vleuten C. How video cases should be used as authentic stimuli in problem-based medical education. Med Educ. 2007;41(2):181–8.
Gonzalez ML, Salmoni AJ. Online problem-based learning in postgraduate medical education - content analysis of reflection comments. Teach High Educ. 2008;13(2):183–92.
Poulton T, Conradi E, Kavia S, Round J, Hilton S. The replacement of ‘paper’ cases by interactive online virtual patients in problem-based learning. Med Teach. 2009;31(8):752–8.
Moeller S, Spitzer K, Spreckelsen C. How to configure blended problem based learning-results of a randomized trial. Med Teach. 2010;32(8):328.
Varga-Atkins T, Dangerfield P, Brigden D. Developing professionalism through the use of wikis: a study with first-year undergraduate medical students. Med Teach. 2010;32(10):824–9.
Wünschel M, Leichtle U, Wülker N, Kluba T. Using a web-based orthopaedic clinic in the curricular teaching of a German university hospital: analysis of learning effect, student usage and reception. Int J Med Inf. 2010;79(10):716–21.
Rampling J, O’Brien A, Hindhaugh K, Woodham L, Kavia S. Use of an online virtual environment in psychiatric problem-based learning. Psychiatr. 2012;36(10):391–6.
Stebbings S, Bagheri N, Perrie K, Blyth P, McDonald J. Blended learning and curriculum renewal across three medical schools: the rheumatology module at the University of Otago. Australas J Educ Technol. 2012;28(7):1176–89.
van Mook WNKA, Muijtjens AMM, Gorter SL, Zwaveling JH, Schuwirth LW, van der Vleuten CPM. Web-assisted assessment of professional behaviour in problem-based learning: more feedback, yet no qualitative improvement? Adv Health Sci Educ Theory Pract. 2012;17(1):81–93.
George P, Dumenco L, Dollase R, Taylor JS, Wald HS, Reis SP. Introducing technology into medical education: two pilot studies. Patient Educ Couns. 2013;93(3):522–4.
Woodham LA, Ellaway RH, Round J, Vaughan S, Poulton T, Zary N. Medical student and tutor perceptions of video versus text in an interactive online virtual patient for problem-based learning: a pilot study. J Med Internet Res. 2015;17(6):e151.
Kang S, Kim S, Kwon Y, Kim T, Park H, Park H, et al. The virtual asthma guideline e-learning program: learning effectiveness and user satisfaction. Korean J Intern Med. 2018;33(3):604–11.
Bintley HL, Bell A, Ashworth R. Remember to breathe: teaching respiratory physiology in a clinical context using simulation. Adv Physiol Educ. 2019;43(1):76–81.
Prochazkova K, Novotny P, Hancarova M, Prchalova D, Sedlacek Z. Teaching a difficult topic using a problem-based concept resembling a computer game: development and evaluation of an e-learning application for medical molecular genetics. BMC Med Educ. 2019;19(1):390–2.
Cohen-Osher M, Davies TA, Flynn DB, Young ME, Hoffman M. Finding information framework: a tool to teach life-long learning skills. PRiMER. 2021;5:16.
McIntyre K. Bridging the gap: implementation of an online induction course to support students’ transition into first year medicine. MedEdPublish (2016) 2021;9:193.
Petek D, Kolšek M, Šter MP, Rifel J, Homar V, Gorup EC, et al. Undergraduate online teaching of family medicine during the epidemic with SARS-CoV-2. Zdravniski Vestn. 2021;90(11–12):575–86.
Bell L, Lemos M, Mottaghy FM, Lindner O, Heinzel A. Teaching of nuclear cardiology in times of pandemic: transfer of a case-based interactive course from classroom to distance learning. Nuklearmedizin Nucl Med. 2022;61(1):6–15.
Koch LK, Correll-Buss A, Chang OH. Implementation and effectiveness of a completely virtual pathology rotation for visiting medical students. Am J Clin Pathol. 2022;157(3):406–12.
Alzayani S, Al-Roomi K, Ahmed J. Two years into digital transformation: the lived experiences of middle Eastern medical students in a problem-based curriculum. Khyber Med Univ J. 2023;15(4):211–7.
Cheung BHH, Foo DCC, Chu KM, Co M, Lee LS. Perception from students regarding online synchronous interactive teaching in the clinical year during COVID-19 pandemic. BMC Med Educ. 2023;23(1):5.
Nugroho D, Hermasari BK. Using online flipped classroom in problem-based learning medical curriculum: a mixed method study. J Educ Learn (EduLearn). 2023;17(2):294–300.
Vidal Villa A, Illesca Pretty M, González Osorio L, Godoy-Pozo J. Problem-Based learning virtual with a peer tutor in pandemic. Rev Med Chil. 2023;151(5):551–9.
Atwa HS, Nasr El-Din WA, Kumar AP, Potu BK, Tayem YI, Al-Ansari A, et al. Online or face-to-face problem-based learning tutorials? Comparing perceptions and preferences of students and tutors. Front Educ. 2024. https://doi.org/10.3389/feduc.2024.1354494.
Divito CB, Katchikian BM, Gruenwald JE, Burgoon JM. The tools of the future are the challenges of today: the use of ChatGPT in problem-based learning medical education. Med Teach. 2024;46(3):320–2.
Kirkham JJ, Gorst S, Altman DG, Blazeby JM, Clarke M, Tunis S, et al. Core outcome set-standardised protocol items: the COS-STAP statement. Trials. 2019;20(1):116.
Tugwell P, Boers M. OMERACT conference on outcome measures in rheumatoid arthritis clinical trials: introduction. J Rheumatol. 1993;20(3):528–30.
Webbe JSI, Gale C. Core outcome sets. Arch Dis Child Educ Pract Ed. 2017. https://doi.org/10.1136/archdischild-2016-312117.
Karas J, Ashkenazi S, Guarino A, Lo Vecchio A, Shamir R, Vandenplas Y, et al. A core outcome set for clinical trials in acute diarrhoea. Arch Dis Child. 2015;100(4):359–63.
McNair AGK, Whistance RN, Forsythe RO, Macefield R, Rees J, Pullyblank AM, et al. Core outcomes for colorectal cancer surgery: a consensus study. PLoS Med. 2016;13(8): e1002071.
Avery KNL, Chalmers KA, Brookes ST, Blencowe NS, Coulman K, Whale K, et al. Development of a core outcome set for clinical effectiveness trials in esophageal cancer resection surgery. Ann Surg. 2018;267(4):700–10.
MacLennan S, Williamson PR, Bekema H, Campbell M, Ramsay C, N’Dow J, et al. A core outcome set for localised prostate cancer effectiveness trials. BJU Int. 2017;120(5B):E64–79.
Trullàs JC, Blay C, Sarri E, Pujol R. Effectiveness of problem-based learning methodology in undergraduate medical education: a scoping review. BMC Med Educ. 2022;22(1):104.
Babitsch B, Pöche-Guckelberger I, Maske D, Egbert N, Hübner U. Concepts and implementation of digital problem-based learning in health-related study programmes: a scoping review. Educ Med J. 2024;16(4):1–17.
Elnaga HHA, Ahmed MB, Fathi MS, Eissa S. Virtual versus paper-based PBL in a pulmonology course for medical undergraduates. BMC Med Educ. 2023;23(1):433.
Budakoğlu Iİ, Coşkun Ö, Özeke V. E-PBL with multimedia animations: a design-based research. BMC Med Educ. 2023. https://doi.org/10.1186/s12909-023-04298-x.