Abstract
Background
The Objective Structured Clinical Examination (OSCE) is an important tool for assessing clinical competencies in the health professions. However, in Latin America, a region with limited resources, the implementation and quality of OSCEs remain underexplored despite their increasing use. This study analyses how the OSCE has been applied and how its quality has evolved in Latin America.
Methods
A scoping review methodology was used, searching PubMed, Scopus, Web of Science (WOS), LILACS and SciELO for studies on the implementation of the OSCE in Latin America written in English, French, Portuguese, or Spanish. Study quality was assessed using criteria from AMEE Guides No. 81 and No. 49 and the MMERSQI. Data were extracted regarding OSCE structure, evaluator training, validity, reliability, and the use of simulated patients.
Results
A total of 365 articles were retrieved, of which 69 met the inclusion criteria. The first report on OSCE implementation in the region dates back to 2000. Three countries (Chile, Mexico, Brazil) accounted for 84.06% of the reports, and 68.12% of the OSCEs were applied in undergraduate programs. In this group, implementation was mainly in medicine (69.57%), with lesser use in physiotherapy (7.95%) and nursing (2.9%). The number of stations and the duration of each varied, with 18-station circuits being the most common. Evidence of OSCE validity and reliability was reported in 26.09% of the reports, feedback to students in 33.33%, and simulated patient training in 37.68%. Notable trends in the quinquennial analysis are the increased use of high-fidelity simulations and the shift towards remote OSCEs during the pandemic. The inclusion of inactive stations, inadequate training of simulated patients, and the absence of evidence supporting instrument validation are recurrently reported challenges in OSCE studies. Overall methodological quality has improved, as evidenced by the presence of an OSCE committee and a blueprint in nearly 50% of the studies and by rising MMERSQI scores, especially in recent years.
Conclusion
While there has been progress in OSCE implementation, particularly in medical education, gaps remain in standardization, validation, training, and resource allocation. Further efforts are needed to ensure consistent quality, especially in training simulated patients, addressing inactive stations, and ensuring instrument reliability. Closing these gaps is crucial for enhancing the effectiveness of OSCEs in resource-limited settings and advancing health professional education across the region.