Introduction
As medical knowledge and technology continue to grow, staying abreast of them remains a key challenge in the practice of medicine. To address this challenge in medical education, evidence-based medicine (EBM) and health technology assessment (HTA) have emerged as crucial frameworks [1–3], supported by advancements in health informatics [4]. They inform decision-making in healthcare through systematic evidence syntheses that focus on the effectiveness, safety, and cost-effectiveness of interventions [5,6]. Fundamentally, EBM has been defined as the integration of (i) current best evidence assessed with a systematic approach, (ii) clinical expertise, and (iii) patient values in healthcare delivery [6]. Thus, EBM primarily concerns patient-level decisions. An HTA, by contrast, primarily provides information for healthcare decision-making above the individual patient level, that is, at various organizational levels. Thus, compilations of economic, societal, and ethical implications may constitute important additional components of an HTA [5,7].
Medical programs should include all steps of evidence-based practice in their curricula. This includes translating uncertainty into answerable questions, searching for and retrieving relevant evidence, critically appraising such evidence regarding validity and clinical importance, applying it to practice, and evaluating performance [8]. EBM, which originally focused on critical appraisal, the development of systematic reviews, and clinical practice guidelines, has also been described as essential in the training of young clinicians [9]. However, a recent systematic review reported that evidence regarding effective teaching modalities remains scant [10]. Additionally, research on the transfer of EBM skills to clinical practice is reportedly scarce [11]. Furthermore, a scoping review showed that many studies have focused on the education of other healthcare professionals, such as nursing students, rather than physicians [12].
Medical students in Sweden must meet 23 national outcomes to obtain a medical degree, as outlined in the Swedish Higher Education Ordinance. Of these 23 outcomes, 11 are related to scientific skills and attitudes (hereafter described as scholarly degree outcomes). This national framework plays a crucial role in promoting quality, consistency, and accountability, ensuring that graduates meet the needs of patients and healthcare systems [13]. However, the framework also allows universities to develop their own teaching practices, intended to support students in attaining the learning outcomes. Previous research has shown considerable differences between countries in outcome frameworks related to scientific research, also called scholarly activities, and, consequently, in teaching and assessment [14,15]. In this context, a better understanding of medical students’ experiences of educational content that has contributed to their perceived skills in evidence-based practice could be helpful. Therefore, the present study aimed to explore how final-semester medical students experience the assessment of scholarly degree outcomes and teaching and learning activities related to systematic reviews/HTA. Furthermore, we investigated the association between this experience and the students’ perceptions of having developed sufficient skills to incorporate scientific evidence into patient care.
Methods
This cross-sectional study, which involved six medical programs in Sweden, was based on anonymous questionnaire responses from final-semester medical students and an overall mapping of scientifically related teaching and learning activities within the participating medical programs. The study was assessed by the Swedish Ethical Review Authority, which determined that the Ethical Review Act was not applicable and had no ethical objections to the study (2023-03120-0). Participation was voluntary, and the participants were informed about the study orally and in writing. Providing a completed questionnaire represented consent to participate. The recruitment period extended from September 9, 2023, to January 15, 2024.
Setting
All seven medical programs in Sweden were invited to participate, and all but one accepted the invitation. Thus, the study was conducted within the context of six medical programs (University of Gothenburg, Karolinska Institute, Linköping University, Lund University, Umeå University, and Örebro University). At the time of the study, all universities were in the process of implementing a six-year medical program, but the present study concerned the preceding 5.5-year program, comprising 2 to 2.5 years of primarily preclinical education related to medical science, with the remaining years mainly covering clinical education, including direct patient care. All programs integrated clinical activities early on. They also integrated the teaching of professionalism and scientific education throughout the program. Unlike the six-year medical program, the 5.5-year program required about 1.5 years of internship to obtain a medical certificate. However, medical students who had passed the 9th semester could obtain a temporary appointment to work within the profession.
During the 4th or 5th year, all the programs incorporated a mandatory research project course that spanned approximately 20 weeks and corresponded to 30 European Credit Transfer System (ECTS) credits. During this course, each student conducted an individual research project and presented a research report that was structured akin to a scientific publication. Supervisors were active researchers, holding a minimum of a PhD, who could offer a suitable project within their area of expertise. Optional co-supervisor(s), such as PhD students or other researchers in the same area, could also be appointed.
Swedish medical programs, however, differ from each other in terms of the broader context of scientific education. Since course content and related learning outcomes are developed locally, it is not feasible to comprehensively map teaching and learning activities related to EBM. Nevertheless, two notable differences emerged when we mapped the number of ECTS credits for mandatory research projects as well as the presence and type of education specifically targeting EBM, either as a single course or longitudinally during clinical semesters. First, some programs offered hands-on, credit-bearing, EBM-related learning activities during clinical courses at the later stages of the medical program (Linköping and Lund), whereas others did not. In Linköping, this learning activity spanned the 6th, 7th, 9th, 10th, and 11th semesters, comprising 1 ECTS per semester. In Lund, this learning activity spanned the 6th, 7th, and 8th semesters, comprising 1.5 ECTS per semester. Second, some programs included an additional mandatory research project course (Lund and Örebro, in semesters 5 and 6, respectively), whereas the others did not. This educational element encompassed a scholarly project of 10 weeks (15 ECTS credits) that students carried out under supervision, either alone or in pairs.
Participants
Medical students from the 11th semester in six medical schools were invited to participate. A total of 642 students were registered for this semester at the time of the study. The students were informed orally and in writing about the aims of the study, stating that participation was voluntary and that opting out would not affect their education. Printed copies of the questionnaire were distributed by the authors at a session scheduled adjacent to a routine mandatory learning activity during a curricular course. By answering the questionnaire, the students gave their informed consent to participate.
Data collection
The data were retrieved using an anonymous questionnaire developed specifically for this study (see S1 File). It contained eight sections (45 items) in the following order: (I) experience regarding the adequacy of the assessment during the medical program in terms of the 11 scholarly degree outcomes (from a total of 23 degree outcomes) (11 items: 1a‒k); (II) the comprehensibility of these degree outcomes (11 items: 2a‒k); (III) perceptions regarding the attainment of sufficient skills in how to ground patient work on scientific evidence during the medical program (1 item: 3); (IV) experience of having undergone education on HTA during the program (1 item: 4); (V) experience of training during the medical program in scientific skills related to the procedural steps of a systematic review (5 items: 5a‒e) as well as an HTA based on a systematic review of patient benefits and harms of an intervention (8 items: 5a‒e, 6a‒c); (VI) knowledge-based single best answer (SBA) questions (5 items: 7‒11); (VII) open-ended questions (2 items: 12‒13; not reported herein); and (VIII) responder characteristics (6 items: 14‒19). In Sections I‒V, the item statements were rated on a 5-point Likert scale from 1 (totally disagree) to 5 (totally agree). In Section VI, there were five SBAs with five response alternatives. These questions concerned the interpretation of research results of direct relevance for medical practice, that is, how to interpret (i) a case/control study, (ii) a diagnostic study, (iii) a meta-analysis with certainty of evidence expressed according to Grading of Recommendations, Assessment, Development, and Evaluations (GRADE) [16], (iv) a forest plot including a meta-analysis, and (v) a health economics statement regarding cost-effectiveness. The open-ended questions in Section VII (not reported in this article) asked the students about learning activities that had been useful for their development of skills to incorporate scientific evidence into patient care. Finally, responder characteristics (Section VIII) included gender, age, previous doctoral degree, previous studies in other medical faculties, whether the 30-credit research project comprised a systematic review, and working life experience as an intern/junior doctor. It took approximately 15–20 minutes to complete the questionnaire. Local conditions determined whether the students could be compensated for their participation, for example, with a sandwich lunch.
The questionnaire was tested for face validity by inviting colleagues in the regional HTA center, Sahlgrenska University Hospital in Gothenburg, for comments. Two academically experienced senior consultants provided feedback in writing, and the questionnaire was discussed at an in-house HTA meeting. Furthermore, the questionnaire was tested by a general practitioner, who also provided feedback in writing.
Statistical analysis
The statistical analyses were performed using SPSS (IBM SPSS Statistics for Windows, version 27.0, Armonk, NY). The internal consistency of items within Sections I, II, and V was analyzed using Cronbach’s alpha, with corrected item-total correlation noted for each item. Spearman’s correlation coefficient was calculated to explore the correlation between individual items in Section I and the corresponding items in Section II. We used the Mann–Whitney and chi-square tests for comparisons between groups. The results are presented as numbers (percentages) or medians (interquartile ranges). To facilitate interpretation, the mean ± standard deviation was also provided where relevant, although normal distribution was not assumed in the statistical analyses. In the dichotomized analyses, students whose responses were 4 or 5 on the Likert scale were generally categorized as agreeing with the statement.
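The scale statistics above follow standard formulas: Cronbach’s alpha for k items equals k/(k − 1) × (1 − Σ item variances / variance of the total score), and the corrected item‒total correlation is each item’s correlation with the sum of the remaining items. As an illustration only (the study used SPSS), the following Python sketch shows how these quantities, a Spearman correlation between paired items, and the dichotomization at ratings of 4‒5 could be computed; the simulated data and column names are hypothetical.

```python
import numpy as np
import pandas as pd
from scipy import stats

rng = np.random.default_rng(0)
# Simulated 5-point Likert ratings from 433 respondents on 11 items (placeholder data).
items = pd.DataFrame(rng.integers(1, 6, size=(433, 11)),
                     columns=[f"item_1{c}" for c in "abcdefghijk"])

def cronbachs_alpha(df: pd.DataFrame) -> float:
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of total score)."""
    k = df.shape[1]
    return k / (k - 1) * (1 - df.var(ddof=1).sum() / df.sum(axis=1).var(ddof=1))

def corrected_item_total(df: pd.DataFrame) -> pd.Series:
    """Each item's correlation with the sum of the remaining items."""
    return pd.Series({c: df[c].corr(df.drop(columns=c).sum(axis=1)) for c in df})

print(cronbachs_alpha(items))        # scale reliability
print(corrected_item_total(items))   # per-item diagnostics

# Spearman correlation between a Section I item and a (simulated) Section II counterpart
item_2a = rng.integers(1, 6, size=433)
rho, p = stats.spearmanr(items["item_1a"], item_2a)

# Dichotomization used in the group comparisons: ratings of 4 or 5 count as agreement
agree = items >= 4
```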
To investigate predictors of students’ perception that they had acquired adequate skills to incorporate scientific evidence into patient care, we performed logistic regression analyses to obtain crude and adjusted odds ratios (OR) with 95% confidence intervals (CI). The students’ dichotomized agreement with the statement “During the medical program, I have acquired sufficient skills in how to ground patient work on scientific evidence” was the dependent variable. We included the following individual characteristics as covariates in univariate and multivariate analyses: age (≤25 vs. >25 years), gender (female vs. male), professional experience (yes vs. no), number of correct SBAs (≥4 vs. <4), and having performed a systematic review for the master’s thesis (yes vs. no).
Concerning the students’ experience of the adequacy of the assessment regarding the 11 scholarly degree outcomes, we included variables in the multivariate analyses that statistically significantly predicted agreement with the statement in question in a multivariate model based solely on these outcomes. This approach was also applied to the educational content variables. Finally, we performed sensitivity analyses to investigate the robustness of the results. These analyses were based on a leave-one-out basis, excluding data from one university at a time. In additional sensitivity analyses, we omitted variables for which the OR shifted from > 1 in the crude analysis to < 1 in the main model or in any other sensitivity analysis. We also performed a sensitivity analysis in which the scholarly degree outcomes were omitted.
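For illustration, the minimal Python sketch below outlines the regression workflow described above, assuming simulated data and hypothetical variable names (the actual analyses were performed in SPSS): crude ORs come from single-predictor models, adjusted ORs from a model with all covariates, and the leave-one-out sensitivity analysis refits the model excluding one university at a time.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 433
# Placeholder data; variable names are hypothetical stand-ins for the study's covariates.
data = pd.DataFrame({
    "sufficient_skills": rng.integers(0, 2, n),  # dichotomized dependent variable (agree = 1)
    "age_over_25": rng.integers(0, 2, n),
    "female": rng.integers(0, 2, n),
    "experience": rng.integers(0, 2, n),
    "sba_4_or_more": rng.integers(0, 2, n),
    "university": rng.integers(0, 6, n),         # site indicator for the sensitivity analyses
})
covariates = ["age_over_25", "female", "experience", "sba_4_or_more"]

def odds_ratios(df: pd.DataFrame, predictors: list) -> pd.DataFrame:
    """Fit a logistic regression and return ORs with 95% confidence intervals."""
    X = sm.add_constant(df[predictors])
    fit = sm.Logit(df["sufficient_skills"], X).fit(disp=0)
    ci = np.exp(fit.conf_int())
    return pd.DataFrame({"OR": np.exp(fit.params),
                         "CI_low": ci[0], "CI_high": ci[1]}).drop(index="const")

crude = pd.concat([odds_ratios(data, [p]) for p in covariates])  # one predictor per model
adjusted = odds_ratios(data, covariates)                         # all covariates together

# Leave-one-out sensitivity analysis: refit the main model excluding one site at a time
for site in sorted(data["university"].unique()):
    print(f"Excluding site {site}:")
    print(odds_ratios(data[data["university"] != site], covariates))
```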
Results
In total, 433 of the 641 students returned the questionnaire, resulting in a response rate of 68%. The student characteristics are presented in Table 1. Briefly, the median age was 25 years, and 255 (59%) students were female. The students had a median score of 3 correct answers on the SBAs (range 0‒5) (for details, see S1 Table).
[Table 1 omitted. See PDF. Caption: Student characteristics, overall and according to agreement with the statement “During the medical program, I have acquired sufficient skills in how to ground patient work on scientific evidence.” The results are presented as numbers (percentages) or medians (interquartile ranges); to facilitate interpretation, the mean ± standard deviation is presented in italics in parentheses where applicable.]
Experience regarding degree outcomes and training
Overall, many students experienced that they had been adequately assessed regarding the scholarly degree outcomes (Table 1). For instance, 361 (83%) students reported adequate assessment of the outcome related to self-reflection and professional attitude, and the corresponding proportion for eight other outcomes ranged between 67% and 78%. By contrast, half or less than half of the students reported having been adequately assessed in the area of skills related to two scholarly degree outcomes: improvement work and the use of digital tools. Further, the comprehensibility of the degree outcomes was generally good (see S2 Table). All corresponding items in Sections I and II, except for the item regarding the use of digital tools, were correlated (correlation coefficients: 0.30‒0.35, all P < 0.0001; see S3 Table).
Concerning learning activities in the procedural steps of a systematic review, 250 (58%) students stated that they had been trained to define a specific research question using the PICO (P = patients, I = intervention, C = comparison, O = outcomes) format; 271 (63%) stated that they had been trained to find relevant literature pertinent to a specific research question; 308 (71%) stated that they had been trained to appraise scientific articles using checklists; 172 (40%) stated that they had been trained to synthesize results from multiple studies; and 135 (31%) stated that they had been trained to assess the certainty of evidence according to GRADE (Table 1). By contrast, only 26 (6%) students reported having been trained in the area of HTA, and 87 (20%) students rated their agreement with this statement at 3 or higher. Regarding training in HTA-specific aspects not included in a systematic review, 133 (31%), 93 (21%), and 284 (66%) students reported having been trained in the evaluation of organizational, economic, and ethical aspects, respectively, related to the introduction or withdrawal of health technologies in healthcare.
For items in Sections I, II, and V, the Cronbach’s alpha ranged between 0.81 and 0.86 (see S4 Table). The corrected item‒total correlation ranged between 0.41 and 0.63 (Section I), 0.40 and 0.62 (Section II), and 0.43 and 0.62 (Section V).
Perceived skills in EBM and HTA
Overall, 359 (83%) students perceived that they had developed sufficient skills in how to ground patient work on scientific evidence. In the univariate model, the students’ experience of having been adequately assessed on all scholarly degree outcomes predicted agreement with the statement “During the medical program, I have acquired sufficient skills in how to ground patient work on scientific evidence” (Table 2). However, when the individual items were included in the same model, the statistical significance remained for only two outcomes (see S4 Table). Correspondingly, almost all items concerning the students’ experience of scientifically related educational content predicted agreement with their perception of having been sufficiently prepared to incorporate scientific evidence into patient care. Nevertheless, when these items were included in the same model, the statistical significance remained for only four of them (see S5 Table).
[Table 2 omitted. See PDF.]
In the multivariate model, the students’ experience of the adequacy of the assessment regarding the scholarly degree outcome (i.e., “Demonstrate knowledge about the scientific foundation of medicine, insights into current research as well as knowledge of the link between science and proven experience”) predicted their perception of having developed sufficient skills in incorporating scientific evidence into patient care (Table 2). Additional predictors included the experience of having undergone education on HTA during the program as well as training in scientific skills related to two procedural steps of a systematic review/HTA, that is, appraising scientific articles by using checklists and assessing organizational aspects associated with the implementation or withdrawal of health technologies in healthcare. Finally, the presence of hands-on, credit-bearing, EBM-related learning activities during clinical courses was also a predictor. The sensitivity analyses showed that the results were robust (see S6 Table).
Discussion
This cross-sectional study of six medical programs in Sweden, comprising more than 400 final-semester medical students, shows that most students experienced having been adequately assessed on scientifically related degree outcomes. Reported adequate assessment on the scholarly outcome “Demonstrate knowledge about the scientific foundation of medicine…” predicted students’ perception of having developed sufficient skills in incorporating scientific evidence into patient care. Additional predictors included learning activities related to systematic reviews and HTAs as well as hands-on, credit-bearing, EBM-related learning activities during clinical courses. Surprisingly few students reported having been trained in the area of HTA, and a non-negligible proportion reported not having been trained in important procedural steps of a systematic review.
In medical programs, assessment of scholarly degree outcomes is critical as scientific principles underpin virtually every aspect of medical practice, guiding physicians, for instance, during the process of differential diagnosis and in decision-making regarding treatment alternatives and follow-up strategies. In addition, assessment per se is an essential part of constructive alignment, meaning that learning outcomes, teaching activities, and assessments are aligned with each other [17]. Our finding that the assessment of the scholarly degree outcome (i.e., “Demonstrate knowledge about the scientific foundation of medicine…”) was key to students’ perception of having acquired sufficient skills in incorporating scientific evidence into patient care underscores the crucial role of well-thought-out assessments. Although the importance of this degree outcome may not be surprising, given the inclusion of the broad scope of “the scientific foundation of medicine”, the results can provide valuable insights for constructive alignment.
The findings of this study suggest that hands-on EBM-related learning activities during clinical courses, including assessments that contribute credits toward graduation, have significant educational value. Again, this is in line with constructive alignment. Furthermore, contextual learning acknowledges the importance of embedding learning in an authentic setting, thereby making it more meaningful to students in their future profession [18,19]. Indeed, hands-on, credit-bearing, EBM-related learning activities during clinical courses predicted students’ perceptions of having acquired sufficient skills in applying scientific evidence in patient care. This suggests that integrating EBM teaching into clinical courses may help students understand how scientific findings are used in clinical practice. The benefits of hands-on EBM-related learning activities are also in line with the findings of overviews of systematic reviews concluding that EBM learning strategies should be multifaceted and integrated into the clinical context [20,21]. More recent studies also show benefits of EBM learning activities both early and late in the medical program [22–24]. Moreover, our finding that repeated learning activities across multiple clinical courses may enhance students’ learning of EBM aligns with recent research indicating that EBM habits and knowledge tend to decline after a single instructional program in the 4th or 5th year of the medical program [24].
A graduate physician in modern health care is typically exposed to numerous guidelines, which should be based on a systematic evidence synthesis [25]. Given that the rigor of development, including the approach to evidence synthesis, is a key domain in assessing the quality of a guideline [26] and, consequently, its applicability for a physician as part of EBM, it may be surprising that a non-negligible proportion of the final-semester medical students reported that they had not been trained in the steps required to perform a systematic review. For instance, 70% reported not having been trained to assess evidence using the GRADE system. In this context, it may be of interest to note that the self-perceived competence of medical students and physicians was often low in the EBM domains of analysis and application [27]. Given that GRADE is recommended for determining the certainty of evidence [28], and that interpreting certainty and strength of recommendations are key competencies within evidence-based practice, our findings suggest that this area of education may need greater emphasis in curriculum development [29]. The fact that many guidelines have apparent quality problems regarding the underlying evidence synthesis [30] further underscores the need to include training within the procedural steps of systematic reviews in the curriculum.
Notably, 60% of the medical students indicated that they had not been trained to synthesize results from multiple studies, for example, in a meta-analysis. In the absence of experience, graduate physicians may find it difficult to distinguish useful meta-analyses from flawed ones. Arguably, when this educational content is lacking, EBM in clinical practice may be hampered, especially as only 3% of published meta-analyses are reportedly decent and clinically useful [31]. Again, identifying and critically appraising key elements of systematic reviews and interpreting presentations of pooled results have been described as core competencies of evidence-based practice [29]. Furthermore, these aspects are considered to require practical training during the curriculum [29], illustrating the challenges involved.
Regarding the appraisal of scientific articles using checklists, our findings are more encouraging. About 70% of the medical students reported having received training within this component of a systematic review/HTA. This educational content therefore seems to have been largely implemented in medical programs. However, since these results reflect a core competency, namely the ability to critically evaluate the integrity, reliability, and applicability of health-related research [29], the reverse finding, that 30% of the students reported not having been trained, implies that there is room for curriculum improvement.
Although HTA is not currently described as a core competency for health professionals, it is intricately linked to EBM. From a medical student perspective, knowledge about HTA may contribute insights to future professional life beyond evidence synthesis. Indeed, an HTA often also includes economic, societal, and ethical aspects and informs healthcare decision-making at the organizational level. Undoubtedly, such decisions have implications for physicians’ everyday practice. In the context of medical programs, a deep understanding of other academic disciplines, such as health economics, is not required. However, an understanding of overarching prioritization principles is essential. Our finding that many final-semester medical students misinterpreted the concept of cost-effectiveness may therefore suggest that HTA and prioritization principles deserve increased attention in learning activities during medical school.
A notable strength of this study is that it contributes knowledge regarding predictors of the perception of having developed adequate skills to incorporate scientific evidence into patient care, a core aspect of EBM. Furthermore, the study achieved a relatively high response rate from students across six medical programs. Thus, the findings may be reasonably generalizable. Nevertheless, it must be noted that students who chose not to participate in this study could have reported worse perceptions than those who did. In addition, they could have performed worse on the SBAs. Indeed, an original study with an 88% participation rate, including medical as well as other students, reported that 64% of those participating passed the subsequent exam, whereas only 15% of those not participating did so [32]. Nevertheless, the sensitivity analysis in which the site with a considerably lower response rate was excluded yielded similar results. Furthermore, our results were robust in the other sensitivity analyses. Finally, the results related to internal consistency suggest that the scales used in the questionnaire were reliable.
An important limitation of the present study is that our results reflect students’ perceptions of having developed sufficient skills in incorporating scientific evidence into patient care and not the actual assessment of these skills. Indeed, self-reported data comprising students’ perceptions and experiences are inherently subjective and possibly influenced by factors such as emotions and context, and there is an obvious risk of recall bias as the medical program spans several years. Nevertheless, self-reported data are regularly used in research and allow for quantitative analyses. For future evaluations, using the Fresno test could provide valuable insights into students’ knowledge and skills in EBM as well as the impact of teaching interventions [33]. Other limitations include the cross-sectional design, which allows conclusions regarding associations but not causality. Furthermore, it should be noted that we did not comprehensively map or analyze the assessment methods applied to the scholarly degree outcomes at the universities under study. Indeed, the study was limited to the questionnaire data and the overall mapping of the included programs; therefore, the possibility that other factors may also be of importance cannot be excluded. Finally, a potential limitation is that questions regarding awareness and knowledge of regulatory assessment documentation as a source of scientific evidence were not included in the questionnaire. This may be particularly relevant given the new EU regulation on HTA, which points toward closer collaboration and the use of convergent methodologies [34]. Additionally, the establishment of the Data Analysis and Real World Interrogation Network within the European Union (DARWIN EU) highlights a focus on leveraging evidence from healthcare databases across the EU [35]. Nevertheless, the SBAs addressing the interpretation of a meta-analysis using GRADE to evaluate the certainty of evidence, a forest plot with a meta-analysis, and a health economics statement on cost-effectiveness may partially capture the ability of final-year medical students to interpret regulatory assessment documentation. Furthermore, the SBA regarding the interpretation of a case/control study based on register data may provide some information related to DARWIN EU.
Conclusions
Scientific principles form the foundation of nearly every aspect of medical practice. Our findings suggest that medical programs should strengthen educational efforts that support students in perceiving themselves as prepared to incorporate scientific evidence into patient care: (i) ensuring assessment of key content relating to scholarly outcomes, including the scientific foundation of medicine; (ii) incorporating learning activities on HTA and the systematic review process; and (iii) integrating hands-on, credit-bearing, EBM-related learning activities into clinical courses.
Supporting information
S1 File. Questionnaire.
https://doi.org/10.1371/journal.pone.0321211.s001
(DOCX)
S1 Table. Results SBA.
https://doi.org/10.1371/journal.pone.0321211.s002
(DOCX)
S2 Table. Comprehensibility of scholarly degree outcomes.
https://doi.org/10.1371/journal.pone.0321211.s003
(DOCX)
S3 Table. Correlations.
https://doi.org/10.1371/journal.pone.0321211.s004
(DOCX)
S4 Table. Internal consistency.
https://doi.org/10.1371/journal.pone.0321211.s005
(DOCX)
S5 Table. Regression analyses of variables for the final model.
https://doi.org/10.1371/journal.pone.0321211.s006
(DOCX)
S6 Table. Sensitivity analyses.
https://doi.org/10.1371/journal.pone.0321211.s007
(DOCX)
S1 Data. Details about questionnaire data used in the analyses.
https://doi.org/10.1371/journal.pone.0321211.s008
(DOCX)
Acknowledgments
The authors want to thank the medical students who devoted their time to participating in the study. Furthermore, we are grateful for valuable comments on the questionnaire from colleagues in HTA-centrum, Sahlgrenska University Hospital, particularly those who provided feedback in writing: Christina Bergh, MD, professor and senior consultant in obstetrics and gynecology, and Annika Strandell, MD, associate professor and senior consultant in obstetrics and gynecology. In addition, thanks are due to Staffan Svensson, MD, PhD, for providing feedback from a general practitioner’s perspective. We also thank Max Petzold, professor of biostatistics at the University of Gothenburg, for checking the multivariate model, and librarian Maria Björklund, who set up and carried out data collection at Lund University.
References
1. Allen M, Singh Brar S, Farrell L. Medical education needs to teach health technology assessment. Med Teach. 2010;32(1):62–4. pmid:20095776
2. Goodman RL. Commentary: health care technology and medical education: putting physical diagnosis in its proper place. Acad Med. 2010;85(6):945–6. pmid:20505391
3. Green ML. Evidence-based medicine training in graduate medical education: past, present and future. J Eval Clin Pract. 2000;6(2):121–38. pmid:10970006
4. Lokker C, McKibbon KA, Afzal M, Navarro T, Linkins LA, Haynes RB, et al. The McMaster Health Information Research Unit: over a quarter-century of health informatics supporting evidence-based medicine. J Med Internet Res. 2024;26:e58764. pmid:39083765
5. Drummond MF, Schwartz JS, Jönsson B, Luce BR, Neumann PJ, Siebert U, et al. Key principles for the improved conduct of health technology assessments for resource allocation decisions. Int J Technol Assess Health Care. 2008;24(3):244–58; discussion 362–8. pmid:18601792
6. Sackett DL, Rosenberg WM, Gray JA, Haynes RB, Richardson WS. Evidence based medicine: what it is and what it isn’t. BMJ. 1996;312(7023):71–2. pmid:8555924
7. Lafortune L, Farand L, Mondou I, Sicotte C, Battista R. Assessing the performance of health technology assessment organizations: a framework. Int J Technol Assess Health Care. 2008;24(1):76–86. pmid:18218172
8. Dawes M, Summerskill W, Glasziou P, Cartabellotta A, Martin J, Hopayian K, et al. Sicily statement on evidence-based practice. BMC Med Educ. 2005;5(1):1. pmid:15634359
9. Djulbegovic B, Guyatt GH. Progress in evidence-based medicine: a quarter century on. Lancet. 2017;390(10092):415–23. pmid:28215660
10. Howard B, Diug B, Ilic D. Methods of teaching evidence-based practice: a systematic review. BMC Med Educ. 2022;22(1):742. pmid:36289534
11. Ahmadi S-F, Baradaran HR, Ahmadi E. Effectiveness of teaching evidence-based medicine to undergraduate medical students: a BEME systematic review. Med Teach. 2015;37(1):21–30. pmid:25401408
12. Larsen CM, Terkelsen AS, Carlsen AF, Kristensen HK. Methods for teaching evidence-based practice: a scoping review. BMC Med Educ. 2019;19(1):259.
13. Delany C, Kosta L, Ewen S, Nicholson P, Remedios L, Harms L. Identifying pedagogy and teaching strategies for achieving nationally prescribed learning outcomes. High Educ Res Dev. 2016;35(5):895–909.
14. Global minimum essential requirements in medical education. Med Teach. 2002;24(2):130–5. pmid:12098431
15. Hautz SC, Hautz WE, Keller N, Feufel MA, Spies C. The scholar role in the National Competence Based Catalogues of Learning Objectives for Undergraduate Medical Education (NKLM) compared to other international frameworks. Ger Med Sci. 2015;13:Doc20. pmid:26609287
16. Atkins D, Best D, Briss PA, Eccles M, Falck-Ytter Y, Flottorp S, et al. Grading quality of evidence and strength of recommendations. BMJ. 2004;328(7454):1490. pmid:15205295
17. Biggs J. Enhancing teaching through constructive alignment. High Educ. 1996;32(3):347–64.
18. Billett S. Learning through work: workplace participatory practices. In: Workplace learning in context. Routledge; 2004. p. 125–41.
19. Lave J, Wenger E. Situated learning: legitimate peripheral participation. Cambridge University Press; 1991.
20. Young T, Rohwer A, Volmink J, Clarke M. What are the effects of teaching evidence-based health care (EBHC)? Overview of systematic reviews. PLoS One. 2014;9(1):e86706. pmid:24489771
21. Bala MM, Poklepović Peričić T, Zajac J, Rohwer A, Klugarova J, Välimäki M, et al. What are the effects of teaching Evidence-Based Health Care (EBHC) at different levels of health professions education? An updated overview of systematic reviews. PLoS One. 2021;16(7):e0254191. pmid:34292986
22. Kumaravel B, Jenkins H, Chepkin S, Kirisnathas S, Hearn J, Stocker CJ, et al. A prospective study evaluating the integration of a multifaceted evidence-based medicine curriculum into early years in an undergraduate medical school. BMC Med Educ. 2020;20(1):278. pmid:32838775
23. Menard L, Blevins AE, Trujillo DJ, Lazarus KH. Integrating evidence-based medicine skills into a medical school curriculum: a quantitative outcomes assessment. BMJ Evid Based Med. 2021;26(5):249–50. pmid:33093190
24. Kasai H, Saito G, Takeda K, Tajima H, Kawame C, Hayama N, et al. Effect of a workplace-based learning program on clerkship students’ behaviors and attitudes toward evidence-based medicine practice. Med Educ Online. 2024;29(1):2357411. pmid:38785167
25. Kredo T, Bernhardsson S, Machingaidze S, Young T, Louw Q, Ochodo E, et al. Guide to clinical practice guidelines: the current state of play. Int J Qual Health Care. 2016;28(1):122–8. pmid:26796486
26. Brouwers MC, Kerkvliet K, Spithoff K. The AGREE Reporting Checklist: a tool to improve reporting of clinical practice guidelines. BMJ. 2016;352:i1152. pmid:26957104
27. Romero-Robles MA, Soriano-Moreno DR, García-Gutiérrez FM, Condori-Meza IB, Sing-Sánchez CC, Bulnes Alvarez SP, et al. Self-perceived competencies on evidence-based medicine in medical students and physicians registered in a virtual course: a cross-sectional study. Med Educ Online. 2022;27(1):2010298. pmid:34919030
28. Kolaski K, Logan LR, Ioannidis JPA. Guidance to best tools and practices for systematic reviews. Syst Rev. 2023;12(1):96. pmid:37291658
29. Albarqouni L, Hoffmann T, Straus S, Olsen NR, Young T, Ilic D, et al. Core competencies in evidence-based practice for health professionals: consensus statement based on a systematic review and Delphi survey. JAMA Netw Open. 2018;1(2):e180281. pmid:30646073
30. Lunny C, Ramasubbu C, Puil L, Liu T, Gerrish S, Salzwedel DM, et al. Over half of clinical practice guidelines use non-systematic methods to inform recommendations: a methods study. PLoS One. 2021;16(4):e0250356. pmid:33886670
31. Ioannidis JPA. The mass production of redundant, misleading, and conflicted systematic reviews and meta-analyses. Milbank Q. 2016;94(3):485–514. pmid:27620683
32. Carlsson T, Winder M, Eriksson AL, Wallerstedt SM. Student characteristics associated with passing the exam in undergraduate pharmacology courses: a cross-sectional study in six university degree programs. Med Sci Educ. 2020;30(3):1137–44. pmid:34457776
33. Ramos KD, Schafer S, Tracz SM. Validation of the Fresno test of competence in evidence based medicine. BMJ. 2003;326(7384):319–21. pmid:12574047
34. The European Parliament and the Council of the European Union. Regulation on health technology assessment (2021/2282). 2021. https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX:32021R2282
35. Claire R, Elvidge J, Hanif S, Goovaerts H, Rijnbeek PR, Jónsson P, et al. Advancing the use of real world evidence in health technology assessment: insights from a multi-stakeholder workshop. Front Pharmacol. 2024;14:1289365. pmid:38283835
Citation: Wallerstedt SM, Garwicz M, Henriksson P, Mossberg K, Naumburg E, Wahlberg J, et al. (2025) Preparing medical students to incorporate scientific evidence into patient care: A cross-sectional study. PLoS ONE 20(4): e0321211. https://doi.org/10.1371/journal.pone.0321211
About the Authors:
Susanna M. Wallerstedt
Roles: Conceptualization, Formal analysis, Investigation, Methodology, Project administration, Writing – original draft, Writing – review & editing
E-mail: [email protected]
Affiliations: Department of Pharmacology, Sahlgrenska Academy, University of Gothenburg, Gothenburg, Sweden, HTA-centrum, Sahlgrenska University Hospital, Gothenburg, Sweden
ORCID: https://orcid.org/0000-0001-7238-1680
Martin Garwicz
Roles: Investigation, Methodology, Writing – review & editing
Affiliation: Department of Experimental Medical Science, Lund University, Lund, Sweden
Pontus Henriksson
Roles: Investigation, Methodology, Writing – review & editing
Affiliation: Department of Health, Medicine and Caring Sciences, Linköping University, Linköping, Sweden
ORCID: https://orcid.org/0000-0003-2482-7048
Karin Mossberg
Roles: Investigation, Methodology, Writing – review & editing
Affiliations: Department of Public Health and Community Medicine, Sahlgrenska Academy, University of Gothenburg, Sweden, Närhälsan Herrestad Vårdcentral, Primary Health Care, Region Västra Götaland, Sweden
Estelle Naumburg
Roles: Investigation, Methodology, Writing – review & editing
Affiliation: Department of Clinical Sciences, Unit of Paediatrics, Umeå University, Umeå, Sweden
Jeanette Wahlberg
Roles: Investigation, Methodology, Writing – review & editing
Affiliations: School of Medical Sciences, Faculty of Medicine and Health, Örebro University, Örebro, Sweden, Department of Endocrinology and Diabetes, Örebro University Hospital, Örebro, Sweden
ORCID: https://orcid.org/0000-0003-4061-6830
Riitta Möller
Roles: Investigation, Methodology, Writing – original draft, Writing – review & editing
Affiliations: Department of Medical Epidemiology and Biostatistics, Karolinska Institutet, Stockholm, Sweden, Department of Otolaryngology, Head and Neck Surgery, Karolinska University Hospital, Stockholm, Sweden
© 2025 Wallerstedt et al. This is an open access article distributed under the terms of the Creative Commons Attribution License: http://creativecommons.org/licenses/by/4.0/ (the “License”), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
Abstract
Objective
To explore teaching- and assessment-related factors that predict medical students’ perceived attainment of sufficient skills to incorporate scientific evidence into patient care.
Methods
An anonymous questionnaire was distributed to final-semester students in six medical programs in Sweden. The students were asked to rate statements concerning the extent to which 11 national degree outcomes related to the scientific basis of medicine (scholarly degree outcomes) had been adequately assessed during the program; their perceived preparedness for evidence-based patient care; and training during the program regarding the components of a systematic review/health technology assessment (HTA).
Results
In total, 433 students (median age: 25 years [interquartile range: 24‒28], 59% female) participated in the study (response rate: 68%). A multivariate analysis indicated that the experience of adequate assessment on a single scholarly degree outcome (i.e., “Demonstrate knowledge of the scientific foundation of medicine and insight into current research as well as knowledge of the link between science and proven experience”) predicted the students’ perception of having developed sufficient skills in incorporating scientific evidence into patient care (odds ratio: 6.17 [95% confidence interval: 3.10; 12.3]). The educational content predictors of this perception included the teaching of HTA (11.3 [1.44; 89.5]) and training regarding two components of a systematic review/HTA: appraising scientific articles using checklists (2.46 [1.23; 4.90]) and assessing organizational aspects related to the introduction/withdrawal of a health technology (2.65 [1.05; 6.67]). The presence of hands-on, credit-bearing, evidence-based medicine (EBM)-related learning activities during clinical courses was also predictive (4.68 [1.69; 13.0]).
Conclusions
This study highlights important educational activities that prepare medical students to incorporate scientific evidence into patient care: (i) adequate assessment of key content regarding scholarly outcomes, including the scientific foundation of medicine; (ii) learning activities about HTA and the systematic review process; and (iii) hands-on application of EBM-related learning activities integrated into clinical courses.