Abstract

Background
Process mining is an emerging discipline that enables the analysis of procedural executions performed in a training context, providing objective information about adherence to a normative procedural model (similarity), the number of repeated steps (rework), and performance metrics. This information can serve as objective feedback that guides trainees’ learning through a process-oriented feedback approach. The aim of this study was to assess whether interventions based on information derived from process mining analysis improve the attainment of procedural proficiency.
Methods
Twenty anaesthesia and emergency medicine residents participated in a training program on ultrasound-guided internal jugular central venous catheter placement that took place in a simulated environment. The participants were randomized into a process-oriented training group (n = 10), which received supplementary interventions during training according to the information obtained with process mining tools, and a control group (n = 10), for whom the simulation-based training program was unchanged. Video recordings of each student were obtained before and after the training. Two blinded observers evaluated each recording using a global rating scale (primary outcome) and checklist. Procedure execution time and process-oriented metrics (rework and similarity) were measured. The pre- and posttraining performance indicators were compared within groups and between groups. The interrater reliability of the global rating scale scores was calculated using the intraclass correlation coefficient. We used the Wilcoxon signed-rank test for intragroup comparisons and the Mann‒Whitney test for intergroup comparisons. Statistical significance was set at P < .05, adjusted for multiple comparisons.
Results
There were no differences between the groups in the pretraining measures. After training, both groups showed improved performance in ultrasound-guided central venous catheter placement compared with their pretraining performance. The global rating scale scores, checklist scores, and execution times did not differ significantly between the control and process-oriented groups. However, the process-oriented group showed a significantly greater improvement in similarity to the expected performance and a greater reduction in rework than the control group.
Conclusion
The process-oriented approach, combined with the procedural training program, decreases rework and increases adherence to the reference process model of ultrasound-guided central venous catheter placement in a simulated environment. Moreover, classic procedural assessment indicators do not capture this effect.
Background
Medical education has undergone a major revolution in the last 20 years. It has transitioned from traditional learning from experience with patients, based on role modelling and a “hands-on” model, to a more comprehensive and integrated approach incorporating student-centred strategies such as workplace-based assessment, simulation, and e-learning instruction [1]. The reasons for this change are the recognition and application of adult learning theories, the reduction in duty hours, the difficulties in accessing clinical training sites, and the ethical imperative to protect patients’ safety. Directly practising on patients is no longer allowed for students who have not previously engaged in simulated training [2].
New strategies developed in the field of medical education are cornerstones for the implementation of appropriate feedback methodologies. Feedback promotes learning by informing students about their progress and difficulties, aiding in the identification of learning needs, and guiding students’ efforts to address deficiencies [3, 4]. Feedback promotes reflection, decision-making, and continuous improvement. To achieve this effect, however, feedback should be provided carefully and instructively, oriented towards specific, observed actions, free of value judgements, and delivered as close in time to the procedure as possible [3, 4]. The main barriers to feedback are a lack of objectivity and time, inappropriate timing or physical space, the absence of teacher training, and unidirectional communication [5, 6]. Several feedback theories and methods of providing feedback have been described, but none has proven to be the best [7].
A bedside medical procedure or surgery can be broken down into several steps that must be performed in a predetermined sequence [8, 9]. The sequence and steps can be used to teach a procedure [10,11,12] and can be analysed to provide feedback on its execution [13, 14]. This is called a “process-oriented” approach, and it is carried out with information derived from process mining tools [15, 16]. Process mining is an emerging engineering discipline aimed at analysing data extracted from event logs to understand how processes are conducted in reality [17, 18]. It allows bedside medical procedures and surgeries to be analysed as processes from a control flow perspective using analytical tools. Thus, it is possible to make explicit how procedures are actually executed, clearly showing activities that are not carried out, activities that are repeated, and activities that are executed in an unexpected order. Moreover, process mining allows objective comparisons between each execution and a reference model, as well as between the executions of different operators or of the same operator at various times. This knowledge can be used to provide feedback to performers in procedural training contexts. A previous study showed that this information is understandable for instructors, helping them characterize students’ performance [19], and that trainees perceive it as useful for their learning [13]. However, the impact of feedback delivered through a process-oriented approach on achieving procedural competence has not yet been studied. Our hypothesis was that interventions based on information derived from a process-oriented analysis of procedure execution in the context of simulation training improve the attainment of procedural proficiency.
Methods
Training program
The study was approved by the Ethics Committee of the Medical Faculty of the Pontificia Universidad Catolica de Chile, Santiago, Chile, on August 8th, 2018. The Ethics Committee waived the requirement for written informed consent. All study procedures were conducted in accordance with the ethical standards outlined in the Declaration of Helsinki. A clinical trial number is not applicable. Postgraduate year one students of the Anesthesia and Emergency Medicine Programs without previous experience in central venous catheter placement were invited to participate voluntarily. The participants were randomly assigned to a control group (CG) or a process-oriented feedback group (POG). Randomization was performed using a randomization list stratified by speciality to balance the number of residents in each group. Residents in both groups participated in an ultrasound-guided internal jugular central venous catheter placement program (for simplicity, referred to as “ultrasound-guided central venous catheter” placement), a simulation-based training program built on deliberate practice that has previously proven helpful in developing this competence [20]. In this program, residents perform repetitive practice with defined learning objectives, receive concurrent feedback during every session, and have their technical performance assessed via replicable methods. The training was structured in three stages for both groups (Fig. 1): (1) web-based cognitive training, involving an online narrative lecture and complementary literature reviews; (2) a central venous catheter workshop, involving an in-person demonstration session of central venous catheter (CVC) placement; and (3) four sessions of deliberate practice at four stations, namely, (3.1) principles of ultrasound, (3.2) procedure preparation (gown, glove, draping), (3.3) ultrasound (US) scanning and puncture under US guidance, and (3.4) catheter insertion under US guidance.
[IMAGE OMITTED: SEE PDF]
All participants performed two complete procedures using the “Blue Phantom Torso” (http://www.bluephantom.com) as part of the study, one after the web-based cognitive training, defined as the pretest, and the other after the four sessions of deliberate practice, defined as the posttest. Both sessions were video recorded for subsequent analysis.
Process-oriented educational interventions
After the demonstration session, the POG underwent three interventions following a process-oriented approach (Fig. 1). All interventions were cognitive tasks that did not include hands-on training.
1. Process-oriented teaching: In a group session, the execution of ultrasound-guided central venous catheter placement was taught and explained using a reference process model represented in business process model notation (BPMN) [21], a de facto standard notation for process representation that has been shown to be easily understood by users in the health care area [22]. This model breaks down ultrasound-guided central venous catheter placement into the steps necessary for successful execution, defined by expert consensus using the Delphi methodology [10].
2. Self-labelling: After this session and before the deliberate practice stage, each resident performed ultrasound-guided central venous catheter placement on a “Blue Phantom Torso”, which was recorded on video. The activities defined in the BPMN model had been taught previously. Using the PomeLog platform [23], each resident labelled the activities performed during their own execution of the procedure, identifying the start and end times of each activity. PomeLog is an observer-based tool [9] used to generate event logs from processes not supported by an information system. It visualizes video recordings and allows users to tag activities manually according to when they start and finish in the video. The list of tagged steps can be exported as an event log file to be analysed with process mining tools. The aim of this intervention was to familiarize residents with the reference model and allow them to observe their own execution from a process perspective.
3. Process-oriented feedback: Applying the methodology described by Lira et al. [13] and using the same platform and videos described above, one researcher prepared a report based on a process mining analysis, which included procedural execution diagrams, visual comparisons with the ideal performance, and nonexecuted and repeated activities. Each POG resident received this report on their pretest (PRE) execution before the deliberate practice sessions, and one of the researchers explained and discussed it with them.
Data capture and control flow analysis
PomeLog was used to generate event logs. An event log is the sequential record of the activities executed (events) in one or more particular executions of a process (cases), collecting information about the student who executed each activity (resource), the start and end timestamps of the activity, and other attributes [18]. Using the PomeLog platform [23], one of the researchers labelled the video recordings of ultrasound-guided central venous catheter placement in the phantom, in the same way as in the residents’ self-labelling intervention. The event log was processed with a discovery algorithm based on the fuzzy algorithm concept [24], combined with characteristics from the family of heuristic algorithms, to obtain a process model representing each executed procedure. This process model representation allowed us to identify the aspects of procedural execution that could be delivered as feedback to residents, namely, (a) rework, i.e., when a student repeats one or more activities during the procedure; (b) activities executed in an incorrect order; and (c) the student’s performance, including the duration of activities and the transition time between them.
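As an illustration of what such an event log looks like and how performance aspects such as activity durations and transition times can be derived from it, consider the following minimal sketch. The field names, activity labels, and timestamps are invented for the example; they are not the actual PomeLog export format.

```python
from datetime import datetime

# Hypothetical event log, one record per executed activity:
# case (one procedure execution), activity name, and start/end timestamps.
event_log = [
    {"case": "resident_01", "activity": "US scanning",
     "start": "2018-09-01 10:00:00", "end": "2018-09-01 10:02:30"},
    {"case": "resident_01", "activity": "Vein puncture",
     "start": "2018-09-01 10:03:00", "end": "2018-09-01 10:04:10"},
]

def parse(ts):
    return datetime.strptime(ts, "%Y-%m-%d %H:%M:%S")

def durations_and_transitions(log):
    """Per-activity duration and transition time from the previous
    activity of the same case, both in seconds."""
    log = sorted(log, key=lambda e: (e["case"], parse(e["start"])))
    rows = []
    for prev, cur in zip([None] + log[:-1], log):
        duration = (parse(cur["end"]) - parse(cur["start"])).total_seconds()
        transition = None
        if prev is not None and prev["case"] == cur["case"]:
            transition = (parse(cur["start"]) - parse(prev["end"])).total_seconds()
        rows.append((cur["case"], cur["activity"], duration, transition))
    return rows

for row in durations_and_transitions(event_log):
    print(row)
```

In this toy log, “US scanning” lasts 150 s, and 30 s elapse before “Vein puncture” begins; real logs would additionally carry the resource attribute and one case per resident.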
Indicators
Two researchers blinded to group allocation independently assessed each resident’s performance using a specific checklist for this procedure [25, 26], a global rating scale [27], and total execution time. Both researchers were trained in the use of these instruments.
Using the event log, two indicators based on the process perspective were calculated: the similarity between each resident’s execution and the reference process model of ultrasound-guided central venous catheter placement, and the rework of each step.
1. Similarity indicator: The Levenshtein edit distance (LED) [28] compares two character sequences by counting the character insertions, deletions, and substitutions required to make them equal; here, this count was divided by the sum of the lengths of both strings. Each activity defined in the reference model [10] was assigned a specific character, which allowed each resident’s execution to be represented as a sequence of characters and compared with the ideal sequence of steps described in the reference model. The indicator for each resident was calculated as one minus the normalized LED and can be interpreted as follows: 1 indicates that the sequence executed by the resident matches that of the reference model, and 0 indicates that it is completely different.
2. Rework indicator: This indicator refers to the number of times a step is repeated, considering that each step should be executed only once. For each step, it was calculated as the total number of times the residents executed the step divided by the number of executions in which the step was present. A value of 1 means the step was executed exactly once per execution, whereas a value greater than 1 means the step was repeated.
Statistical analysis
The data were analysed using Stata Statistical Software, Release 18 (StataCorp LLC, College Station, TX; 2023). The primary outcome of the study was the difference in the global rating scale (GRS) score between the groups. A sample size of 14 subjects (α = 0.05, two-tailed) was calculated to detect a 25% variation in the GRS, with a confidence level of 95% and a margin of error of ± 5%, using data on the effect of a simulation training program from Corvetto et al. [20] and assuming a power of 0.9 for pairwise (pre–post) comparisons. A total of 20 participants were enrolled to allow for possible losses. The interrater reliability of the GRS scores was calculated using the intraclass correlation coefficient (ICC), with the same fixed set of raters rating all subjects and ratings not averaged [29, 30]. We compared the CG and POG before (PRE) and after (POST) the process-oriented educational interventions, both within and between groups. The Wilcoxon signed-rank test was used for intragroup comparisons and the Mann‒Whitney test for intergroup comparisons. Statistical significance was set at P < .05, adjusted for multiple comparisons with the Benjamini‒Hochberg false discovery rate (FDR) correction method. The median, interquartile range, and minimum and maximum values are also reported.
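The Benjamini–Hochberg step can be illustrated with a minimal pure-Python sketch (the analysis itself was performed in Stata; the function name and example p-values here are hypothetical):

```python
def benjamini_hochberg(pvals):
    """Benjamini-Hochberg FDR-adjusted p-values.

    Sort the m p-values, compute p_(k) * m / k for each rank k, enforce
    monotonicity from the largest rank down, and return the adjusted
    values in the original input order.
    """
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    adjusted = [0.0] * m
    running_min = 1.0  # also caps adjusted p-values at 1
    for rank in range(m, 0, -1):
        i = order[rank - 1]
        running_min = min(running_min, pvals[i] * m / rank)
        adjusted[i] = running_min
    return adjusted

# Example: three raw p-values from pre/post comparisons.
print(benjamini_hochberg([0.005, 0.04, 0.03]))  # approx. [0.015, 0.04, 0.04]
```

Statistical packages expose the same adjustment directly (e.g., `statsmodels`’ `multipletests(..., method="fdr_bh")` in Python); the sketch only makes the arithmetic explicit.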
Results
A total of 20 postgraduate year-one students were enrolled, of whom 14 were anaesthesia students and 6 were emergency medicine students, 7 females and 13 males. Ten residents were randomly assigned to each group (seven anaesthesia residents and three emergency medicine residents). One participant in the CG withdrew after recording their pretest execution without completing the deliberate practice stage. The remaining 19 residents completed the training programs and recorded their pre- and posttest performances. The interrater reliability was good, with an ICC (95% CI) of 0.746 (0.562–0.859).
Both groups performed similarly after the web-based cognitive training and before the three process-oriented interventions (PRE): there were no differences in the checklist scores, GRS scores, execution times, or process indicators (Table 1). At the end of the training (POST), both groups showed improved performance in ultrasound-guided central venous catheter placement compared with their pretraining performance (PRE) (Table 1). The checklist scores, GRS scores, and execution times did not differ significantly between the CG and the POG. In the within-group comparison of process-oriented indicators and execution time, only the POG showed significant improvements. The intergroup comparison (Table 1) revealed that the POG had greater similarity with the reference model (Fig. 2) and less activity rework than the CG at the end of the training program (Fig. 3). This was particularly evident for the activities ‘vein puncture with trocar under ultrasound vision’, ‘blood return’, ‘drop probe’, ‘remove the syringe’, ‘guidewire install’, ‘remove trocar’, ‘check wire in short axis’ and ‘check wire in long axis’. In contrast, the CG showed no significant improvement in the process indicators for these activities postintervention.
[IMAGE OMITTED: SEE PDF]
[IMAGE OMITTED: SEE PDF]
[IMAGE OMITTED: SEE PDF]
Discussion
The main finding of this study was that using process mining tools improved the execution of an ultrasound-guided internal jugular central venous catheter from a process-oriented perspective in a simulated environment.
Both groups showed a significant improvement in performance following the training program, as Corvetto et al. previously demonstrated [20]. However, the POG also showed a significant improvement in control flow indicators, expressed as increased similarity with the reference process model of ultrasound-guided central venous catheter placement and reduced rework (repetition of activities during procedure executions). Both improvements matter because they bring the execution of the procedure closer to the ideal and lower the likelihood of causing harm.
One explanation is that the BPMN model, which deconstructs the intervention into all the sequential steps necessary to perform the procedure successfully, makes learning easier because it allows students to clearly identify and practice the steps responsible for failures in procedural execution [31]. Additionally, every student received personal feedback on their first performance, highlighting how close or far they were from the ideal execution and reminding them of the appropriate sequence and flow so that they could improve [32]. In the POG, task performance was closer to the ideal, especially at relevant points such as vein puncture, guidewire insertion, and the use of ultrasound. A previous study [33] identified these activities as imposing a high cognitive load on trainees and demonstrated that strategies that help manage this load allow more effective training, with evidence of retention over time. In line with our previous work [13], breaking down a procedure into its steps, the sequence in which those steps should be performed, and an analysis of its flow enables the procedure to be taught and objective, timely, and useful feedback to be provided to improve students’ execution.
The differences between the two groups from the process perspective were not reflected in the classic indicators used to assess ultrasound-guided central venous catheter competence. Checklists are designed to determine the presence or absence of a step in the execution of the procedure, without capturing whether it has been repeated and, more importantly, without judging the quality of its execution. The use of checklists can trivialize the assessment process [34] and does not reflect higher levels of expertise [35]. Therefore, checklists are probably more useful for teaching simpler procedures than complex procedures with greater clinical implications.
Unlike the checklist, the GRS evaluates task components on a Likert scale according to preestablished criteria. The GRS is more subjective than a checklist but has more robust construct and concurrent validity [36, 37], enabling a deeper assessment of the quality of process execution. Additionally, when the performance of these tools in assessing central venous catheter placement without ultrasound guidance was compared using generalizability analysis, the GRS was more reliable than the checklist [38]. Ma et al. found that one-third of candidates who obtained a passing checklist score were identified as incompetent by the GRS [27]. However, a recent systematic review [39] comparing checklists with the GRS in simulation assessment suggested that checklists had better interrater reliability, whereas interitem and interstation reliabilities were better with the GRS. Ultimately, there is still no consensus on the best tool for assessing competence in CVC placement, and important dimensions of competence are often underrepresented in these tools [40]. Thus, the classic indicators (checklist and GRS) do not assess the procedure from a control flow perspective, which includes the proper sequence in which the steps are executed; nor do they assess the omission or unexpected repetition of steps. The explicit consideration of these aspects is a relevant contribution of the process-oriented methodology to the teaching and learning process.
The development and application of new technologies in the operating room allow the incorporation of a “surgical data science” approach [41] into the understanding and quantification of different levels of competence [42]. This approach has generated growing interest in defining objective indicators based on factual data to understand the differences between novices and experts and to use these indicators as surgical competency assessment tools [43]. Such differences have been described in terms of dissection patterns in robotic surgery [44], the use of cutting devices in laparoscopy [45], and eye movement analysis in endoscopic sinus surgery [46], among others. Bedside procedures, however, lack this technological support, so obtaining data and applying surgical data science to define objective indicators remain significant research challenges. The process-oriented approach allowed us to focus on aspects that current tools cannot address. Delivering this information goes a step further towards procedural competence by reducing rework and bringing execution closer to a reference procedural pattern from the control flow perspective. A limitation of this study is that competence was assessed in a simulated environment without evidence of transfer to a clinical setting. As Sawyer et al. suggest, the translation of procedural skills to a real-world setting is crucial [47]. Moreover, deficits that classic indicators fail to detect when learning is assessed on the simulator may be revealed in clinical scenarios [48]. In this study, the POG exhibited less rework and better compliance with the reference process model than the CG while maintaining good performance on the classical indicators, which suggests that these students may perform better in more demanding environments.
Although these findings may be of clinical importance, the differences found in rework and similarity correspond to secondary outcomes of this research; because the study was not powered for them, their magnitude may be misestimated. Further work is needed to establish the real significance of these differences.
In addition to the potential positive impact of integrating process-based feedback into procedural training outcomes, the process-oriented approach employed in this research makes a dual contribution. First, it introduces a methodology for generating objective data, offering a valuable analytical tool for training procedures that lack support from automated data capture technologies. Second, it reveals facets of learning in simulation environments that traditional evaluation indicators in procedural training fail to capture, enabling comparisons of process-related aspects that have previously been overlooked.
On the basis of these findings, the next step is the implementation of a training program, based on a process-oriented approach, for ultrasound-guided central venous access placement and the subsequent assessment of transferring this approach to real-world clinical settings to determine whether there are differences in relevant clinical outcomes compared with those of traditional teaching methodologies.
Conclusion
The use of a process-oriented approach alongside the procedural training program in a simulated environment decreased rework and increased adherence to the reference process model for ultrasound-guided central venous catheter placement. Moreover, classic procedural assessment indicators did not capture this effect.
Data availability
The data that support the findings of this study are available as a supplementary file (Data_Set_English.xlsx).
Abbreviations
BPMN:
Business process model notation
ICC:
Intraclass correlation coefficient
CG:
Control group
CVC:
Central venous catheter
GRS:
Global rating scale
IRB:
Institutional review board
LED:
Levenshtein edit distance
POG:
Process-oriented feedback group
US:
Ultrasound
FDR:
False discovery rate
References

Cox M, Irby DM, Epstein RM. Assessment in medical education. N Engl J Med. 2007;356:387–96. https://doi.org/10.1056/nejmra054784.
Iglehart JK. Revisiting Duty-Hour Limits — IOM recommendations for patient safety and resident education. N Engl J Med. 2008;359:2633–5. https://doi.org/10.1056/nejmp0808736.
Schartel SA. Giving feedback – an integral part of education. Best Pract Res Clin Anaesthesiol. 2012;26:77–87. https://doi.org/10.1016/j.bpa.2012.02.003.
Wong A. Review article: teaching, learning, and the pursuit of excellence in anesthesia education. Can J Anesth/J Can Anesth. 2012;59:171–81. https://doi.org/10.1007/s12630-011-9636-x.
Fulham NM, Krueger KL, Cohen TR. Honest feedback: barriers to receptivity and discerning the truth in feedback. Curr Opin Psychol. 2022;46: 101405. https://doi.org/10.1016/j.copsyc.2022.101405.
McCutcheon S, Duchemin A-M. Overcoming barriers to effective feedback: a solution-focused faculty development approach. Int J Med Educ. 2020;11:230–2. https://doi.org/10.5116/ijme.5f7c.3157.
Ramani S, Könings KD, Ginsburg S, van der Vleuten CPM. Feedback redefined: principles and practice. J Gen Intern Med. 2019;34:744–9. https://doi.org/10.1007/s11606-019-04874-2.
Neumuth T. Surgical process modeling. Innovative Surgical Sciences. 2017;2:123–37. https://doi.org/10.1515/iss-2017-0005.
Lalys F, Jannin P. Surgical process modelling: a review. Int J Comput Assist Radiol Surg. 2014;9:495–511. https://doi.org/10.1007/s11548-013-0940-5.
de la Fuente R, Fuentes R, Munoz-Gama J, Dagnino J, Sepúlveda M. Delphi method to achieve clinical consensus for a BPMN representation of the central venous access placement for training purposes. Int J Environ Res Public Health. 2020;17: 3889. https://doi.org/10.3390/ijerph17113889.
de la Fuente R, Kattan E, Munoz-Gama J, Puente I, Navarrete M, Kychenthal C, et al. Development of a comprehensive percutaneous dilatational tracheostomy process model for procedural training: A Delphi-based experts consensus. Acta Anaesthesiol Scand. 2021;65:244–56. https://doi.org/10.1111/aas.13716.
Kattan E, de la Fuente R, Putz F, Vera M, Corvetto M, Inzunza O, et al. Simulation-based mastery learning of bronchoscopy-guided percutaneous dilatational tracheostomy: competency acquisition and skills transfer to a cadaveric model. Simul Healthc. 2020;16:157–62. https://doi.org/10.1097/sih.0000000000000491.
Lira R, Salas-Morales J, Leiva L, de la Fuente R, Fuentes R, Delfino A, et al. Process-oriented feedback through process mining for surgical procedures in medical training: the ultrasound-guided central venous catheter placement case. Int J Environ Res Public Health. 2019;16: 1877. https://doi.org/10.3390/ijerph16111877.
Martínez JJ, Galvez-Yanjari V, de la Fuente R, Kychenthal C, Kattan E, Bravo S, et al. Process-oriented metrics to provide feedback and assess the performance of students who are learning surgical procedures: the percutaneous dilatational tracheostomy case. Med Teach. 2022;44:1244–52. https://doi.org/10.1080/0142159x.2022.2073209.
de la Fuente R, Fuentes R, Munoz-Gama J, Riquelme A, Altermatt FR, Pedemonte J, et al. Control-flow analysis of procedural skills competencies in medical training through process mining. Postgrad Med J. 2020;96:250–6. https://doi.org/10.1136/postgradmedj-2019-136802.
Munoz-Gama J, Galvez V, de la Fuente R, Sepúlveda M, Fuentes R. Interactive process mining in healthcare. In: Fernandez-Llatas C, editor. Interactive Process Mining in Healthcare. Springer; 2020. pp. 233–42. https://doi.org/10.1007/978-3-030-53993-1_14
van der Aalst W. Process Mining: Data Science in Action. 2nd ed. Springer; 2016. https://doi.org/10.1007/978-3-662-49851-4.
van der Aalst W, Adriansyah A, van Dongen B. Replaying history on process models for conformance checking and performance analysis. WIREs Data Min Knowl Discov. 2012;2:182–92. https://doi.org/10.1002/widm.1045.
Galvez V, de la Fuente R, Meneses C, Leiva L, Fagalde G, Herskovic V, et al. Process-oriented instrument and taxonomy for teaching surgical procedures in medical training: the ultrasound-guided insertion of central venous catheter. Int J Environ Res Public Health. 2020;17:3849. https://doi.org/10.3390/ijerph17113849.
Corvetto MA, Pedemonte JC, Varas D, Fuentes C, Altermatt FR. Simulation-based training program with deliberate practice for ultrasound-guided jugular central venous catheter placement. Acta Anaesthesiol Scand. 2017;61:1184–91. https://doi.org/10.1111/aas.12937.
Object Management Group. Business Process Model and Notation™ (BPMN™), Version 2.0. n.d. http://www.omg.org/spec/BPMN/2.0/
Pufahl L, Zerbato F, Weber B, Weber I. BPMN in healthcare: challenges and best practices. Inf Syst. 2022;107:102013. https://doi.org/10.1016/j.is.2022.102013.
Leiva L, Munoz-Gama J, Salas-Morales J, Galvez V, Lee WLJ, de la Fuente R, et al. PomeLog: generating event logs from unplugged processes. In: Proceedings of the Dissertation Award, Doctoral Consortium, and Demonstration Track at BPM 2019, co-located with the 17th International Conference on Business Process Management (BPM 2019). Vol. 2420; 2019. pp. 189–93.
Günther CW, van der Aalst WMP. Fuzzy mining – adaptive process simplification based on multi-perspective metrics. In: Business Process Management, 5th International Conference, BPM 2007, Brisbane, Australia, September 24–28, 2007, Proceedings. Vol. 4714. Springer Berlin Heidelberg; 2007. pp. 328–43. https://doi.org/10.1007/978-3-540-75183-0_24
Barsuk JH, Ahya SN, Cohen ER, McGaghie WC, Wayne DB. Mastery learning of temporary hemodialysis catheter insertion by nephrology fellows using simulation technology and deliberate practice. Am J Kidney Dis. 2009;54:70–6. https://doi.org/10.1053/j.ajkd.2008.12.041.
Nguyen B-V, Prat G, Vincent J-L, Nowak E, Bizien N, Tonnelier J-M, et al. Determination of the learning curve for ultrasound-guided jugular central venous catheter placement. Intensive Care Med. 2013;40:66–73. https://doi.org/10.1007/s00134-013-3069-7.
Ma IWY, Zalunardo N, Pachev G, Beran T, Brown M, Hatala R, et al. Comparing the use of global rating scale with checklists for the assessment of central venous catheterization skills using simulation. Adv Health Sci Educ. 2011;17:457–70. https://doi.org/10.1007/s10459-011-9322-3.
Yujian L, Bo L. A normalized Levenshtein distance metric. IEEE Trans Pattern Anal Mach Intell. 2007;29:1091–5. https://doi.org/10.1109/tpami.2007.1078.
Shrout PE, Fleiss JL. Intraclass correlations: uses in assessing rater reliability. Psychol Bull. 1979;86:420–8. https://doi.org/10.1037/0033-2909.86.2.420.
Fleiss JL. The design and analysis of clinical experiments. Wiley; 2011. pp. 91–119. https://doi.org/10.1002/9781118032923.ch4
Zhang Q, Fiorella L. An integrated model of learning from errors. Educ Psychol. 2023;58:18–34. https://doi.org/10.1080/00461520.2022.2149525.
Harris DJ, Vine SJ, Wilson MR, McGrath JS, LeBel M-E, Buckingham G. The effect of observing novice and expert performance on acquisition of surgical skills on a robotic platform. PLoS ONE. 2017;12:e0188233. https://doi.org/10.1371/journal.pone.0188233.
McGraw R, Chaplin T, Rocca N, Rang L, Jaeger M, Holden M, et al. Cognitive load theory as a framework for simulation-based, ultrasound-guided internal jugular catheterization training: once is not enough. CJEM. 2018;21:141–8. https://doi.org/10.1017/cem.2018.456.
van der Vleuten CPM, Schuwirth LWT. Assessing professional competence: from methods to programmes. Med Educ. 2005;39:309–17. https://doi.org/10.1111/j.1365-2929.2005.02094.x.
Hodges B, Regehr G, McNaughton N, Tiberius R, Hanson M. OSCE checklists do not capture increasing levels of expertise. Acad Med. 1999;74:1129–34. https://doi.org/10.1097/00001888-199910000-00017.
Norcini JJ. In: Understanding Medical Education. 2010. pp. 232–45. https://doi.org/10.1002/9781444320282.ch16.
Regehr G, MacRae H, Reznick RK, Szalay D. Comparing the psychometric properties of checklists and global rating scales for assessing performance on an OSCE-format examination. Acad Med. 1998;73:993–7. https://doi.org/10.1097/00001888-199809000-00020.
Lord JA, Zuege DJ, Mackay MP, Roze des Ordons A, Lockyer J. Picking the right tool for the job: a reliability study of 4 assessment tools for central venous catheter insertion. J Grad Med Educ. 2019;11:422–9. https://doi.org/10.4300/jgme-d-19-00107.1.
Ilgen JS, Ma IWY, Hatala R, Cook DA. A systematic review of validity evidence for checklists versus global rating scales in simulation-based assessment. Med Educ. 2015;49:161–73. https://doi.org/10.1111/medu.12621.
Ma IW, Sharma N, Brindle ME, Caird J, McLaughlin K. Measuring competence in central venous catheterization: a systematic review. SpringerPlus. 2014;3:33. https://doi.org/10.1186/2193-1801-3-33.
Maier-Hein L, Vedula SS, Speidel S, Navab N, Kikinis R, Park A, et al. Surgical data science for next-generation interventions. Nat Biomed Eng. 2017;1:691–6. https://doi.org/10.1038/s41551-017-0132-7.
Azari D, Greenberg C, Pugh C, Wiegmann D, Radwin R. In search of characterizing surgical skill. J Surg Educ. 2019;76:1348–63. https://doi.org/10.1016/j.jsurg.2019.02.010.
Vedula SS, Ishii M, Hager GD. Objective assessment of surgical technical skill and competency in the operating room. Annu Rev Biomed Eng. 2016;19:1–25. https://doi.org/10.1146/annurev-bioeng-071516-044435.
Ma R, Vanstrum EB, Nguyen JH, Chen A, Chen J, Hung AJ. A novel dissection gesture classification to characterize robotic dissection technique for renal hilar dissection. J Urol. 2021;205:271–5. https://doi.org/10.1097/ju.0000000000001328.
Hosogi H, Obama K, Tsunoda S, Hisamori S, Nishigori T, Tanaka E, et al. Educational application of intraoperative records from an energy device in laparoscopic gastrectomy: a preliminary report. Surg Today. 2021;51:829–35. https://doi.org/10.1007/s00595-020-02160-x.
Ahmidi N, Ishii M, Fichtinger G, Gallia GL, Hager GD. An objective and automated method for assessing surgical skill in endoscopic sinus surgery using eye-tracking and tool-motion data. Int Forum Allergy Rhinol. 2012;2:507–15. https://doi.org/10.1002/alr.21053.
Sawyer T, White M, Zaveri P, Chang T, Ades A, French H, et al. Learn, see, practice, prove, do, maintain: an evidence-based pedagogical framework for procedural skill training in medicine. Acad Med. 2015;90:1025–33. https://doi.org/10.1097/acm.0000000000000734.
Stefanidis D, Scerbo MW, Montero PN, Acker CE, Smith WD. Simulator training to automaticity leads to improved skill transfer compared with traditional proficiency-based training: a randomized controlled trial. Ann Surg. 2011;255:30–7. https://doi.org/10.1097/sla.0b013e318220ef31.