1. Introduction
Imaging technologies play a pivotal role in surgical interventions. Major applications include preoperative staging to assess local tumor invasion and metastatic spread, as well as planning of the intraoperative procedure, including 3D models for practice. Intraoperatively, orientation based on anatomical landmarks provided by conventional imaging, as well as virtual reality applications for real-time detection of tumors and metastatic spread, may be applied [1,2].
Robot-assisted radical prostatectomy (RARP) is the standard of care for localized prostate cancer [1]. Due to the anatomical conditions of the pelvis, important nerve and vessel structures, and difficulties in detecting lymph node metastases, RARP is a challenging procedure [3]. Imaging technologies are therefore warranted to support the surgeon and to improve patient outcomes. Notably, robotic procedures lend themselves to visual enhancement, as an endoscopic camera is already in use; augmented reality and simulation are therefore obvious additions to implement [4].
During prostatectomy, preoperative planning is paramount both for the primary tumor and its surrounding structures and for lymphatic spread into locoregional lymph nodes (LN). Here, 3D reconstructions and prostate models may help to visualize extraprostatic tumor growth and neurovascular invasion. As 3D models can aid the understanding of anatomical relationships, they may support surgical planning. These reconstructions can then be superimposed on the video console image intraoperatively to guide the surgeon. Furthermore, in conventional surgery, beta and gamma probes are used to detect metastases marked by radioligands, for example; such technical solutions require adaptation for robotic surgery. As RARP already relies on imaging through the video console, combining this approach with modern imaging technologies might become one of the cornerstones of modern RARP.
The present study is a systematic review of new imaging technologies for RARP that focuses on the preoperative planning and intraoperative utilization of technology as well as teaching modalities specific for RARP.
2. Materials and Methods
A systematic literature analysis was conducted on 24 January 2023. The PubMed, Web of Science and Scopus databases were systematically queried with a predefined search string, defined as follows:
((robotic prostatectomy) AND (augmented reality)) OR ((robotic prostatectomy) AND (molecular imaging)) OR ((robotic prostatectomy) AND (neuronal imaging)) OR ((robotic prostatectomy) AND (virtual reality)) OR ((robotic prostatectomy) AND (new imaging technology)). All identified papers were considered for further analysis.
First, all duplicates originating from the three databases were removed. The remaining studies were then assessed for overall eligibility: only original articles were included, while replies, editorials, reviews and book chapters were removed. The resulting articles were then screened against the inclusion criteria, namely original articles covering any aspect of new technologies specific to RARP that either support imaging of the primary tumor or lymph node metastases to improve surgical outcomes, or that focus on imaging and visualization for training modalities. Accordingly, we excluded articles that reported preoperative imaging without direct intraoperative utilization (screening via ultrasound or MRI, staging imaging including PSMA-PET/CT), preclinical models, and technology for salvage lymphadenectomy. The focus was placed on research published within the last 5 years; however, the literature search was conducted without restriction on publication year. The eligibility analysis was performed independently by two researchers. In cases of disagreement, a third researcher was involved to reach consensus. The protocol of this systematic review was not registered prior to initiation of the study.
Analyses were performed according to the PRISMA guideline for systematic reviews [5].
3. Results
The systematic search on PubMed, Scopus and Web of Science with the described search string revealed 511 studies. After removing duplicates, 229 studies were screened. Of these, 95 studies were assessed for eligibility. Next, 49 studies were excluded for not meeting the inclusion criteria. A total of 46 articles were selected after qualitative analysis for this review (see Figure 1).
All 46 studies were analyzed, and a level of evidence was determined based on the 2011 Oxford Centre for Evidence-Based Medicine levels of evidence [6]. All studies were then categorized by the assessed organ (prostate, LN, abdominal wall), the area of application (preoperative planning, visualization of the primary tumor, intraoperative diagnostics, intraoperative detection of LN, education/training and feedback) and the applied imaging modality (see Table 1).
Below, we report our findings stratified by the imaging modalities for the primary tumor and for locoregional LN detection, as well as applications for training purposes.
3.1. Primary Tumor
Of the 46 identified studies, 19 focused on imaging of the primary tumor, either for preoperative planning, intraoperative tumor detection, real-time imaging or to support intraoperative diagnostics.
3.1.1. Preoperative Planning
Four studies focused on preoperative planning. Shirk et al. conducted a randomized trial (n = 92) to evaluate the performance of surgeons who reviewed virtual models prior to RARP. The trial revealed improved oncological outcomes, with significantly lower rates of postoperative detectable PSA (31% vs. 9%, p = 0.036) and a trend towards lower positive margin rates. Surgeons changed their surgical strategy in 32% of cases based on the reviewed model, leading to a trend towards bilateral nerve sparing [8]. In addition, a retrospective analysis by Checcucci et al. identified the use of a 3D model as a protective factor against positive surgical margins [10]. Similar results were shown by Martini et al., who compared patients before and after the introduction of 3D models derived from 3T MRI [9].
Regarding the patient perspective, Wake et al. showed that patients gain a better understanding of their disease when their organ is 3D printed rather than visualized in augmented reality or viewed on a 3D or 2D screen [7].
3.1.2. Intraoperative Tumor Detection and Real Time Imaging
A total of 12 studies focused on intraoperative real-time monitoring or augmented reality regarding the primary tumor.
Samei et al. demonstrated the feasibility of real-time augmented-reality-based motion tracking of the prostate using ultrasound [13]. The working group developed their system further and tested the combination of preoperative MRI and ultrasound guidance in twelve patients undergoing RARP. The surgeon navigates the transducer via the robotic instruments, and the imaging data are then overlaid on the endoscopic image; an accuracy of 3.2 mm was achieved [19].
A phase I study by Kratiras et al. using a tablet-based image guidance system that mapped the preoperative MRI to the patient revealed that such solutions are mainly used during challenging steps of RARP at the bladder neck and apical dissection as well as during nerve sparing [14].
Mehralivand et al. presented a virtual reality imaging technology derived from preoperative MRI that can be overlaid at several time points of RARP. However, according to the authors, this system illustrates a limitation of VR imaging: as it was not integrated into the video console, it was not considered useful in challenging surgical situations [17].
Schiavina et al. used MRI-derived 3D models of the prostate that were superimposed on the video stream of the robotic system, aiming to evaluate the impact of this technological support on intraoperative nerve-sparing planning during RARP. The initial surgical plan was changed in 38.5% of all patients, and 11.5% of all patients presented with positive surgical margins after surgery. The sensitivity of the model was 70%, the specificity 100% and the accuracy 92% [20].
Similarly, Porpiglia et al. demonstrated the feasibility of augmented reality RARP with a model accuracy of 1–5 mm, with 85% of mismatches being less than 3 mm [11]. The group further tested hyperaccuracy 3D reconstruction with similar results [16]. In another study, Porpiglia et al. used an elastic augmented reality model to detect areas of capsular involvement during the nerve-sparing phase of RARP. This model was developed to superimpose images even during the dynamic phases of surgery, when the prostate is deformed. The authors demonstrated superiority over 2D cognitive RARP in terms of detection of capsular involvement [15].
Although most research groups use MRI as input data for augmented or virtual reality models, Canda et al. also incorporated PSMA-PET imaging data into their VR model. The model was used in five RARP cases, demonstrating the clinical feasibility of this approach [18].
Intraoperative real-time augmented reality assistance may be achieved by deep-learning approaches, which require substantial computing power. Tanzi et al. therefore investigated different algorithms for this purpose and demonstrated the superiority of a new convolutional neural network with an intersection over union (IoU) of 0.894 [21]. Similarly, Padovan et al. achieved real-time 3D model alignment by semantic segmentation, using convolutional neural networks and motion analysis to compensate for rotation; IoU scores greater than 0.80 were achieved [22].
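To put these scores in perspective, intersection over union compares a predicted segmentation with its ground-truth annotation. The following is a minimal illustrative sketch on binary pixel masks, not the pipeline used by the cited studies:

```python
def mask_iou(pred, truth):
    """IoU of two binary masks, each given as a set of (row, col) pixel coordinates."""
    union = pred | truth
    if not union:
        return 0.0  # both masks empty: define IoU as 0
    return len(pred & truth) / len(union)

# Toy example: prediction and annotation share 2 of 4 total pixels.
pred = {(0, 0), (0, 1), (1, 0)}
truth = {(0, 1), (1, 0), (1, 1)}
print(mask_iou(pred, truth))  # → 0.5
```

An IoU of 0.894, as reported by Tanzi et al., thus means that roughly 89% of the combined predicted-plus-annotated area is shared by both masks.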
When surgeons were asked for their view on this development, they revealed a strongly positive opinion of augmented reality support for all evaluated critical steps of RARP, including bladder neck dissection, nerve sparing and apex dissection [12].
An example of the potential application of virtual reality superimposing video console real-time imaging is provided in Figure 2.
3.1.3. Intraoperative Diagnostics
Three studies reported the use of new imaging modalities for intraoperative diagnostics.
Lopez et al. used confocal laser endomicroscopy to detect tumors as well as damage to the neurovascular bundle and demonstrated its clinical feasibility with standard robotic instrumentation [23]. The frequency of abdominal wall hematoma caused by trocar insertion for RARP may be decreased by an infrared device that detects veins: in a study of 724 cases, the device led to a change in trocar placement in 65% of all cases and decreased the frequency of abdominal wall hematoma from 8.8% to 2.6% (p = 0.03) [24]. Bianchi et al. demonstrated the application of augmented reality to guide intraoperative frozen sections in 20 patients, propensity score matched against 20 controls; positive surgical margins at the level of the index lesion were significantly reduced in the augmented-reality-guided group (5% vs. 20%, p = 0.01) [25].
3.2. Intraoperative Detection of Lymph Node Metastases
Intraoperative detection of lymph nodes via specialized imaging was analyzed in 12 studies identified by our literature search. Three technologies were used: fluorescence cameras, drop-in beta probes and drop-in gamma probes. In recent studies, PSMA has been used as a target for the fluorescent dye.
Van der Poel et al. demonstrated the feasibility of intraoperative fluorescence imaging to detect sentinel nodes (SN) during RARP. The tracer indocyanine green (ICG)-99mTc was injected into the prostate under ultrasound guidance three hours prior to surgery. Two hours after injection, SPECT-CT was acquired to detect SN. Intraoperatively, a fluorescence laparoscope and a laparoscopic gamma probe were used to identify SN. In total, 11 patients underwent this procedure. Fluorescence imaging improved the detection of SN in this setting, especially in areas with high background radioactivity [26].
De Korne et al. analyzed whether the site of injection has an impact on detection of SN during surgery. In this study, 67 patients received an ICG-99mTc-nanocolloid injection into the prostate. Intratumoral tracer injection increased the chance of visualizing nodal metastases [29].
KleinJan et al. reported a combined approach using indocyanine green-99mTc-nanocolloid as a radioactive and fluorescent tracer. No improvement in detection rates of sentinel lymph nodes was observed, but the procedure was described as safe [27]. This tracer was further evaluated by van den Berg et al., who showed that combining ICG-99mTc-nanocolloid with the lymphangiographic tracer fluorescein improves lymph node detection in patients undergoing RARP [28]. Özkan et al. showed in a cohort of 50 patients that of nine LN-positive patients, eight had fluorescence-positive LN, whereas six were detected by preoperative PSMA-PET/CT [37].
Another study using indocyanine green-99mTc-nanocolloid revealed higher detection rates of positive lymph nodes in patients undergoing sentinel node biopsy during RARP [32]. A recent phase-II trial analyzed indocyanine green-99mTc-nanocolloid further and revealed that intratumoral application improves detection rates compared to intraprostatic application. However, metastatic spread from non-index tumors was not detected by the intratumoral application; the authors therefore propose combining intratumoral and intraprostatic tracer injection to optimize sentinel lymph node detection [33]. In a retrospective study, Hinsenveld et al. showed that the combination of preoperative PSMA PET-CT and 99mTc-nanocolloid for sentinel lymph node detection increased the overall detection in patients with PSMA-negative lymph node metastases [30].
Collamati et al. pursued a different approach, further developing SPECT-isotope techniques by using 68Ga-PSMA-11 and a DROP-IN beta particle detector [31]. A comparable approach was taken by Gondoputro et al., who used a DROP-IN gamma probe and 99mTc-PSMA as a tracer to detect lymph node metastases. This prospective single-arm study (n = 12) revealed a high detection rate of positive lymph nodes outside the resection template; a total of 11 metastatic lymph nodes were detected that were not visible on PSMA-PET imaging [34].
Dell’Oglio et al. compared a DROP-IN gamma probe with traditional laparoscopic gamma probes as well as fluorescence guidance. In the intervention group, 47 sentinel lymph node procedures were conducted with a detection rate of 100%; 91% of the nodes were also identified by fluorescence imaging and 76% by the laparoscopic gamma probe [35].
The sensitivity and specificity of PSMA-guided surgery are currently being tested in a phase II study by Gandaglia et al. In a planned interim analysis, a sensitivity of 67%, specificity of 100%, positive predictive value of 100% and negative predictive value of 90% were observed. Despite an overall good performance, the authors note that the suboptimal sensitivity of the approach may lead to missed micrometastases [36].
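For context, these four measures derive directly from the 2×2 confusion matrix. The sketch below uses hypothetical counts (2 true positives, 0 false positives, 1 false negative, 9 true negatives) chosen only to reproduce the reported interim values; they are not the trial's actual patient numbers:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard diagnostic accuracy measures from confusion-matrix counts."""
    return {
        "sensitivity": tp / (tp + fn),  # share of true metastases detected
        "specificity": tn / (tn + fp),  # share of negatives correctly ruled out
        "ppv": tp / (tp + fp),          # probability a positive signal is a true metastasis
        "npv": tn / (tn + fn),          # probability a negative signal is truly negative
    }

m = diagnostic_metrics(tp=2, fp=0, fn=1, tn=9)
print({k: round(v, 2) for k, v in m.items()})
# → {'sensitivity': 0.67, 'specificity': 1.0, 'ppv': 1.0, 'npv': 0.9}
```

Note that a specificity and PPV of 100% simply reflect the absence of false-positive signals in the interim data; the missed micrometastases show up as false negatives, which lower sensitivity and NPV.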
3.3. Training
A total of 15 studies focused on training of surgeons for RARP.
3.3.1. Virtual Training
Various approaches using imaging for virtual- or simulation-based surgical training have been described in 12 studies.
Hung et al. reported on one of the first simulators, which was still limited to basic skill training [38]. Aghazadeh et al. subsequently reported a positive correlation between simulated robotic performance and clinical robotic performance [39]. Further virtual reality models have since been used and demonstrated to improve the surgical skills of novice surgeons [41].
Shim et al. demonstrated in a study with 45 participants that educational videos are comparable to expert-guided training and superior to unguided training for completing robotic surgical tasks [42]. Shim et al. also investigated procedure-specific training modules in virtual simulators for vesicourethral anastomosis and revealed significant improvements in live surgery after completion of the training module [43].
Papalois et al. discussed a mixed reality application to train surgical decision-making and anatomical knowledge. Multi-rater agreement reached 70.0% for every step of the training, and significant improvement was achieved through the training [48].
According to Almarzouq et al., basic robotic skills acquired in the lab are transferable to the operating room: they observed a positive correlation between Global Evaluative Assessment of Robotic Skills (GEARS) scores for defined practice sessions on a simulator and GEARS scores during urethro-vesical anastomosis and bladder mobilization [44].
The effectiveness of simulation-based training may depend on the trainees' experience level. Hoogenes et al. revealed in a randomized trial that two different training programs led to different outcomes in junior trainees but not in more experienced trainees. The dV-Trainer (dV-T) (Mimic Technologies, Inc., Seattle, WA, USA) used in this trial has hand and foot controls similar to those of a da Vinci console, whereas the da Vinci Surgical Skills Simulator (dVSSS) is software integrated into the console itself, using its normal hand and foot controls [40]. The impact on the learning curve of surgeons was further analyzed by Wang et al.: surgeons with VR training showed shorter learning curves than surgeons without, leading to shorter procedure times and especially shorter anastomosis times (25.1 ± 7.1 min versus 40.0 ± 12.4 min; p = 0.015) [45].
A new development is full-procedure simulation. Ebbing et al. demonstrated the face and content validity of a full-procedure simulation module [47].
Beyond improving surgical skills through simulation training, Olsen et al. addressed the question of when to proceed from simulation-based training to live surgery. This research group developed a simulator score, based on performance during bladder neck dissection, neurovascular bundle dissection and ureterovesical anastomosis, that predicts surgeons' experience levels. According to the authors, this score might be used to define which surgeons can proceed to supervised clinical training [46].
Furthermore, training scores derived from simulation-based training not only correlate with scores in live surgery but can also be associated with clinical outcomes such as continence recovery rates. In a study by Sanford et al., high performance during VR needle driving was associated with a 24-month continence recovery rate of 98.5%, versus 84.9% for surgeons with lower scores (p = 0.028) [49].
3.3.2. Peer Review and Structured Feedback
Video review has been identified as an important element of training by van der Leun et al. In this study, students caused significantly fewer injuries to the urethra and performed sutures with higher accuracy when they reviewed videos of their training [50].
As RARP can be recorded, including via augmented reality platforms, remote mentoring and teaching are possible, potentially improving the diffusion of robotic training beyond individual centers [51].
The future of this process might be artificial-intelligence-based video labeling. A preliminary study by Youssef et al. demonstrated the feasibility of training novices, via self-training, to perform segmentation of RARP videos [52].
4. Discussion
RARP is a challenging procedure that requires precise treatment planning and intraoperative visualization. Various imaging tools have been developed to optimize outcomes of patients with PC. Thereby, the combination of innovative imaging tools and intraoperative guidance on the video console are at the center of the current research. We provide a comprehensive overview of the current literature and insights into future developments.
New imaging technologies can provide assistance at every step of RARP. Treatment of the primary tumor as well as LN dissection can be improved by incorporating these technologies into the surgical workflow, as outlined in several feasibility studies described in the Results section of this manuscript. Ultimately, surgical training can be enhanced by these advances and might be standardized through simulation-based training and standardized performance metrics in order to improve outcomes [53].
Guidelines have not yet incorporated most of the described techniques. The current EAU guideline describes MRI-guided and PSMA-PET-based nomograms that allow LN dissection to be omitted in certain patients. Our review has not covered this topic, as such staging applies to both RARP and conventional prostatectomy. Sentinel lymph node biopsy and the use of indocyanine green are discussed in the current guidelines, but insufficient evidence is seen as an obstacle to broad use of this technology, even though a meta-analysis revealed a sensitivity of 95.2% and an NPV of 98.0% for detecting LN metastases in these patients, regardless of the primary surgical approach [1,54].
Despite currently low uptake in guidelines, the impact of augmented reality models in small case series is dramatic: around one in three procedures is performed differently when superimposed imaging is used, as described by Schiavina et al. [20]. As the input imaging modalities (MRI, PSMA-PET/CT) and robotic systems for RARP become increasingly available [55], the combination of both technologies might substantially change the future management of patients with prostate cancer. This impact is not restricted to the surgical procedure itself, where preoperative imaging can guide surgeons; preoperative counseling of patients and surgical planning can also benefit from virtual reality approaches. However, some adaptations must be made to implement all of these technologies. To address this, several groups have developed novel tools to provide fluorescence imaging or radioactivity detection on robotic consoles, yet further efforts will be required to optimize the interfaces between surgeons and the tools currently in development. Interestingly, the addition of such tools to conventional surgical platforms might change the market leadership of currently dominating companies; platforms that are open for development and quickly integrate new cost-effective tools might provide advantages for urologists.
Currently, urologic curricula have no defined structure for robot-assisted surgery. In the Netherlands, residents already participate in robotic surgery, mostly in their final year of residency; however, no criteria define when residents may take up training or surgery. At some institutions, residents are required to complete online training courses, while at others they must reach a certain threshold in simulator-based training [56]. Combining new imaging technologies with surgical curricula might help to structure the education of future surgeons. As demonstrated in our analysis, such curricula must be designed differently for less experienced and experienced surgeons. New biomarkers can be developed to predict surgical learning curves and to give feedback to surgeons. Interestingly, lifelong training might become possible under these conditions, as experienced surgeons can still receive feedback from algorithms or from peer surgeons despite spatial distance.
Financial toxicity has to be considered when adding new armamentarium to the diagnostic and treatment landscape of prostate cancer [57]. Adding more features and specialized instruments to a robotic system requires more resources in the healthcare system and might add to the direct costs of RARP, which can already be considerable with conventional tools [58]; RARP exceeds the costs of conventional surgery by approximately EUR 2000 [59]. Further equipment, such as gamma probes or preoperative imaging, is required for AR- and VR-augmented approaches. Notably, none of the studies cited in this manuscript report the actual costs of their approach. Beyond the direct costs of the required technology, integrating these technologies into a complex metaverse requires new infrastructure, which might not be affordable and thus might not be available worldwide [60]. However, some technologies might not necessarily increase financial toxicity: virtual reality models in particular rely mostly on software and might therefore profit from low material costs and potentially improved outcomes. Future studies will have to assess the impact of new technologies on the overall costs of RARP.
Ultimately, the emerging imaging tools could contribute to a fundamental change in urologic surgery, as they may allow a step-by-step development towards autonomous surgery. Currently, RARP is performed entirely under the control of a surgeon. The tracking and localization techniques outlined in this manuscript can help to determine organ boundaries or tumor location. As a first step, this information supports the surgeon in detecting important structures or provides feedback for training; further development of these techniques might leverage this information and lead to autonomous surgery [61].
The adoption of technology might be influenced by the age of surgeons. Similar to the use of technology in urological patient cohorts [62], age plays a role in the overall use of technology among physicians, although only very advanced age significantly impacts uptake [63]. Still, the adoption of these new technologies among surgeons needs to be investigated further, beyond the feasibility shown in the studies analyzed in this manuscript. In addition, training and stepwise implementation of the described imaging modalities are paramount [64].
Most of the discussed studies are strongly limited by their study design: mostly, matched-pair analyses, retrospective cohort analyses or exploratory single-arm studies were performed. Therefore, a final conclusion regarding the impact of these new technologies on patient outcomes cannot be drawn and requires further research, especially in light of cost-effectiveness, since the authors do not report the direct and indirect costs associated with introducing these technologies. Furthermore, a variety of different instruments and programs are used across the studies. Especially for AI-based applications, precise reporting of the algorithms' functions will be paramount to facilitate clear explanation [65]. Standardization and use across several consecutive studies might mitigate these issues in the future.
5. Conclusions
New imaging technologies have been increasingly tested to reduce complications and improve surgical outcomes of patients undergoing RARP. The feasibility of combined approaches using preoperative imaging or intraoperatively applied radioactive or fluorescent dyes has been demonstrated, while prospective confirmation of improvements is ongoing. Balancing improvements in surgical outcomes against the financial toxicity of new imaging technologies for RARP will be a cornerstone of broad clinical implementation.
Conceptualization: S.R. and G.E.C.; methodology: S.R.; software: S.R.; validation: S.R.; formal analysis: S.R. and M.A.K.; investigation: S.R., M.A.K. and T.W.; resources: S.R.; data curation: S.R. and M.A.K.; writing—original draft preparation, S.R., M.A.K. and T.W.; writing—review and editing: K.-F.K., I.R.B., M.T., S.P., J.G.R., A.V., P.P., E.C., C.G.S. and G.E.C.; visualization, S.R. and E.C.; supervision: S.R. and G.E.C.; project administration: S.R. and G.E.C.; funding acquisition: S.R. All authors have read and agreed to the published version of the manuscript.
The authors declare no conflict of interest.
Figure 2. Augmented reality application for RARP. (A) 3D model of the prostate with an intraprostatic lesion; (C) 3D model of the prostate with a lesion with capsular contact (bright green). (B,D) AI 3D-AR RARP: the 3D virtual model of the prostate was automatically superimposed on the in vivo anatomy by means of AI; a 3D-guided selective biopsy was then performed (images courtesy of Prof. Porpiglia).
Evidence synthesis.
Author [Ref.] | Year | Study Design | LoE [ |
Organ | Area of Application | Imaging Modality |
---|---|---|---|---|---|---|
Wake et al. [ |
2019 | Prospective | IV | Prostate | Preoperative Planning | MRI—Virtual Reality, 3D models |
Shirk et al. [ |
2022 | Prospective | II | Prostate | Preoperative Planning | MRI—Virtual reality |
Martini et al. [ |
2022 | Retrospective | IV | Prostate | Preoperative Planning | MRI—3D models |
Checcuci et al. [ |
2022 | Retrospective | IV | Prostate | Preoperative Planning | MRI—3D models |
Porpilgia et al. [ |
2018 | Prospective | IV | Prostate | Visualization of PT | MRI—console |
Porpiglia et al. [ |
2018 | Prospective | III | Prostate | Visualization of PT | MRI—console |
Samei et al. [ |
2018 | Prospective | IV | Prostate | Visualization of PT | Ultrasound |
Kratiras et al. [ |
2019 | Prospective | IV | Prostate | Visualization of PT | MRI—tablet |
Porpiglia et al. [ |
2019 | Prospective | III | Prostate | Visualization of PT | MRI—console |
Porpiglia et al. [ |
2019 | Prospective | III | Prostate | Visualization of PT | MRI—console |
Mehralivand et al. [ |
2019 | Prospective | IV | Prostate | Visualization of PT | MRI—separate display
Canda et al. [18] | 2020 | Prospective | IV | Prostate | Visualization of PT | MRI/PSMA-PET—console
Samei et al. [19] | 2020 | Prospective | IV | Prostate | Visualization of PT | MRI/Ultrasound—console
Schiavina et al. [20] | 2021 | Prospective | IV | Prostate | Visualization of PT | MRI—console
Tanzi et al. [21] | 2021 | Retrospective | IV | Prostate | Visualization of PT | Console
Padovan et al. [22] | 2022 | Retrospective | IV | Prostate | Visualization of PT | Console
Lopez et al. [23] | 2016 | Prospective | IV | Prostate | Intraoperative diagnostics | Confocal
Law et al. [24] | 2018 | Retrospective | IV | Abdominal wall | Intraoperative diagnostics | Infrared
Bianchi et al. [25] | 2021 | Prospective | III | Prostate | Intraoperative diagnostics | Augmented reality—console
van der Poel et al. [26] | 2011 | Prospective | III | LN | Intraoperative detection of LN | Fluorescent camera
KleinJan et al. [27] | 2016 | Prospective | IV | LN | Intraoperative detection of LN | Fluorescent camera
van den Berg et al. [28] | 2017 | Prospective | IV | LN | Intraoperative detection of LN | Fluorescent camera
De Korne et al. [29] | 2019 | Retrospective | III | LN | Intraoperative detection of LN | Fluorescent camera/gamma probe
Hinsenveld et al. [30] | 2020 | Retrospective | III | LN | Intraoperative detection of LN | Fluorescent camera
Collamati et al. [31] | 2020 | Prospective | IV | LN | Intraoperative detection of LN | DROP-IN beta particle detector
Mazzone et al. [32] | 2021 | Retrospective | III | LN | Intraoperative detection of LN | Fluorescent camera
Wit et al. [33] | 2022 | Prospective | II | LN | Intraoperative detection of LN | Fluorescent camera
Gondoputro et al. [34] | 2022 | Prospective | IV | LN | Intraoperative detection of LN | DROP-IN gamma detector
Dell'Oglio et al. [35] | 2021 | Prospective | IV | LN | Intraoperative detection of LN | DROP-IN gamma probe, laparoscopic gamma probe, fluorescent camera
Gandaglia et al. [36] | 2022 | Prospective | III | LN | Intraoperative detection of LN | DROP-IN gamma probe
Özkan et al. [37] | 2022 | Retrospective | IV | LN | Intraoperative detection of LN | Fluorescent camera
Hung et al. [38] | 2011 | Prospective | IV | Prostate | Education/training | Simulator—basic skills
Aghazadeh et al. [39] | 2016 | Prospective | IV | Prostate | Education/training | Simulator—clinical skills
Hoogenes et al. [40] | 2018 | Prospective | III | Prostate | Education/training | Simulator
Harrison et al. [41] | 2018 | Prospective | III | Prostate | Education/training | Simulator—clinical skills
Shim et al. [42] | 2018 | Prospective | IV | Prostate | Education/training | Video instruction vs. guided
Shim et al. [43] | 2018 | Prospective | IV | Prostate | Education/training | Simulator
Almarzouq et al. [44] | 2020 | Prospective | III | Prostate | Education/training | Simulator
Wang et al. [45] | 2021 | Prospective | III | Prostate | Education/training | Simulator
Olsen et al. [46] | 2021 | Prospective | III | Prostate | Education/training | Simulator
Ebbing et al. [47] | 2021 | Prospective | IV | Prostate | Education/training | Full-procedure simulator
Papalois et al. [48] | 2022 | Prospective | IV | Prostate | Education/training | Mixed reality/VR glasses
Sanford et al. [49] | 2022 | Prospective | IV | Prostate | Education/training | VR simulator
van der Leun et al. [50] | 2022 | Prospective | III | Prostate | Feedback | Simulator—video
Noël et al. [51] | 2022 | Prospective | IV | Prostate | Feedback | Remote teaching
Cheikh Youssef et al. [52] | 2022 | Retrospective | IV | Prostate | Feedback | Video labeling
Abbreviations: LoE: level of evidence; LN: lymph node; PT: primary tumor; VR: virtual reality.
References
1. Mottet, N.; van den Bergh, R.C.N.; Briers, E.; Van den Broeck, T.; Cumberbatch, M.G.; De Santis, M.; Fanti, S.; Fossati, N.; Gandaglia, G.; Gillessen, S. et al. EAU-EANM-ESTRO-ESUR-SIOG Guidelines on Prostate Cancer-2020 Update. Part 1: Screening, Diagnosis, and Local Treatment with Curative Intent. Eur. Urol.; 2021; 79, pp. 243-262. [DOI: https://dx.doi.org/10.1016/j.eururo.2020.09.042]
2. Esperto, F.; Prata, F.; Autrán-Gómez, A.M.; Rivas, J.G.; Socarras, M.; Marchioni, M.; Albisinni, S.; Cataldo, R.; Scarpa, R.M.; Papalia, R. New Technologies for Kidney Surgery Planning 3D, Impression, Augmented Reality 3D, Reconstruction: Current Realities and Expectations. Curr. Urol. Rep.; 2021; 22, 35. [DOI: https://dx.doi.org/10.1007/s11934-021-01052-y]
3. Tewari, A.; Peabody, J.O.; Fischer, M.; Sarle, R.; Vallancien, G.; Delmas, V.; Hassan, M.; Bansal, A.; Hemal, A.K.; Guillonneau, B. et al. An Operative and Anatomic Study to Help in Nerve Sparing during Laparoscopic and Robotic Radical Prostatectomy. Eur. Urol.; 2003; 43, pp. 444-454. [DOI: https://dx.doi.org/10.1016/S0302-2838(03)00093-9]
4. Amparore, D.; Pecoraro, A.; Checcucci, E.; De Cillis, S.; Piramide, F.; Volpi, G.; Piana, A.; Verri, P.; Granato, S.; Sica, M. et al. 3D imaging technologies in minimally invasive kidney and prostate cancer surgery: Which is the urologists’ perception?. Minerva Urol. Nephrol.; 2022; 74, pp. 178-185. [DOI: https://dx.doi.org/10.23736/S2724-6051.21.04131-X]
5. Page, M.J.; McKenzie, J.E.; Bossuyt, P.M.; Boutron, I.; Hoffmann, T.C.; Mulrow, C.D.; Shamseer, L.; Tetzlaff, J.M.; Akl, E.A.; Brennan, S.E. et al. The PRISMA 2020 statement: An updated guideline for reporting systematic reviews. BMJ; 2021; 372, n71. [DOI: https://dx.doi.org/10.1136/bmj.n71]
6. Howick, J.; Glasziou, P.; Greenhalgh, T.; Heneghan, C.; Liberati, A.; Moschetti, I.; Chalmers, I.; Moschetti, I.; Phillips, B.; Thornton, H. The Oxford 2011 Levels of Evidence. CEBM. Available online: https://www.cebm.ox.ac.uk/resources/levels-of-evidence/ocebm-levels-of-evidence (accessed on 1 March 2023).
7. Wake, N.; Rosenkrantz, A.B.; Huang, R.; Park, K.U.; Wysock, J.S.; Taneja, S.S.; Huang, W.C.; Sodickson, D.K.; Chandarana, H. Patient-specific 3D printed and augmented reality kidney and prostate cancer models: Impact on patient education. 3D Print. Med.; 2019; 5, 4. [DOI: https://dx.doi.org/10.1186/s41205-019-0041-3]
8. Shirk, J.D.; Reiter, R.; Wallen, E.M.; Pak, R.; Ahlering, T.; Badani, K.K.; Porter, J.R. Effect of 3-Dimensional, Virtual Reality Models for Surgical Planning of Robotic Prostatectomy on Trifecta Outcomes: A Randomized Clinical Trial. J. Urol.; 2022; 208, pp. 618-625. [DOI: https://dx.doi.org/10.1097/JU.0000000000002719]
9. Martini, A.; Falagario, U.G.; Cumarasamy, S.; Jambor, I.; Wagaskar, V.G.; Ratnani, P.; Haines, K.G., III; Tewari, A.K. The Role of 3D Models Obtained from Multiparametric Prostate MRI in Performing Robotic Prostatectomy. J. Endourol.; 2022; 36, pp. 387-393. [DOI: https://dx.doi.org/10.1089/end.2021.0541]
10. Checcucci, E.; Pecoraro, A.; Amparore, D.; De Cillis, S.; Granato, S.; Volpi, G.; Sica, M.; Verri, P.; Piana, A.; Piazzolla, P. et al. The impact of 3D models on positive surgical margins after robot-assisted radical prostatectomy. World J. Urol.; 2022; 40, pp. 2221-2229. [DOI: https://dx.doi.org/10.1007/s00345-022-04038-8]
11. Porpiglia, F.; Fiori, C.; Checcucci, E.; Amparore, D.; Bertolo, R. Augmented Reality Robot-assisted Radical Prostatectomy: Preliminary Experience. Urology; 2018; 115, 184. [DOI: https://dx.doi.org/10.1016/j.urology.2018.01.028]
12. Porpiglia, F.; Bertolo, R.; Amparore, D.; Checcucci, E.; Artibani, W.; Dasgupta, P.; Montorsi, F.; Tewari, A.; Fiori, C. Augmented reality during robot-assisted radical prostatectomy: Expert robotic surgeons’ on-the-spot insights after live surgery. Minerva Urol. Nephrol.; 2018; 70, pp. 226-229. [DOI: https://dx.doi.org/10.23736/S0393-2249.18.03143-0] [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/29611674]
13. Samei, G.; Goksel, O.; Lobo, J.; Mohareri, O.; Black, P.; Rohling, R.; Salcudean, S. Real-Time FEM-Based Registration of 3-D to 2.5-D Transrectal Ultrasound Images. IEEE Trans. Med. Imaging; 2018; 37, pp. 1877-1886. [DOI: https://dx.doi.org/10.1109/TMI.2018.2810778] [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/29994583]
14. Kratiras, Z.; Gavazzi, A.; Belba, A.; Willis, B.; Chew, S.; Allen, C.; Amoroso, P.; Dasgupta, P. Phase I study of a new tablet-based image guided surgical system in robot-assisted radical prostatectomy. Minerva Urol. Nephrol.; 2019; 71, pp. 92-95. [DOI: https://dx.doi.org/10.23736/S0393-2249.18.03250-2]
15. Porpiglia, F.; Checcucci, E.; Amparore, D.; Manfredi, M.; Massa, F.; Piazzolla, P.; Manfrin, D.; Piana, A.; Tota, D.; Bollito, E. et al. Three-dimensional Elastic Augmented-reality Robot-assisted Radical Prostatectomy Using Hyperaccuracy Three-dimensional Reconstruction Technology: A Step Further in the Identification of Capsular Involvement. Eur. Urol.; 2019; 76, pp. 505-514. [DOI: https://dx.doi.org/10.1016/j.eururo.2019.03.037] [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/30979636]
16. Porpiglia, F.; Checcucci, E.; Amparore, D.; Autorino, R.; Piana, A.; Bellin, A.; Piazzolla, P.; Massa, F.; Bollito, E.; Gned, D. et al. Augmented-reality robot-assisted radical prostatectomy using hyper-accuracy three-dimensional reconstruction (HA3D™) technology: A radiological and pathological study. BJU Int.; 2019; 123, pp. 834-845. [DOI: https://dx.doi.org/10.1111/bju.14549]
17. Mehralivand, S.; Kolagunda, A.; Hammerich, K.; Sabarwal, V.; Harmon, S.; Sanford, T.; Gold, S.; Hale, G.; Romero, V.V.; Bloom, J. et al. A multiparametric magnetic resonance imaging-based virtual reality surgical navigation tool for robotic-assisted radical prostatectomy. Turk. J. Urol.; 2019; 45, pp. 357-365. [DOI: https://dx.doi.org/10.5152/tud.2019.19133] [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/31509508]
18. Canda, A.E.; Aksoy, S.F.; Altinmakas, E.; Koseoglu, E.; Falay, O.; Kordan, Y.; Çil, B.; Balbay, M.D.; Esen, T. Virtual reality tumor navigated robotic radical prostatectomy by using three-dimensional reconstructed multiparametric prostate MRI and (68)Ga-PSMA PET/CT images: A useful tool to guide the robotic surgery?. BJUI Compass; 2020; 1, pp. 108-115. [DOI: https://dx.doi.org/10.1002/bco2.16]
19. Samei, G.; Tsang, K.; Kesch, C.; Lobo, J.; Hor, S.; Mohareri, O.; Chang, S.; Goldenberg, S.L.; Black, P.C.; Salcudean, S. A partial augmented reality system with live ultrasound and registered preoperative MRI for guiding robot-assisted radical prostatectomy. Med. Image Anal.; 2020; 60, 101588. [DOI: https://dx.doi.org/10.1016/j.media.2019.101588]
20. Schiavina, R.; Bianchi, L.; Lodi, S.; Cercenelli, L.; Chessa, F.; Bortolani, B.; Gaudiano, C.; Casablanca, C.; Droghetti, M.; Porreca, A. et al. Real-time Augmented Reality Three-dimensional Guided Robotic Radical Prostatectomy: Preliminary Experience and Evaluation of the Impact on Surgical Planning. Eur. Urol. Focus; 2021; 7, pp. 1260-1267. [DOI: https://dx.doi.org/10.1016/j.euf.2020.08.004]
21. Tanzi, L.; Piazzolla, P.; Porpiglia, F.; Vezzetti, E. Real-time deep learning semantic segmentation during intra-operative surgery for 3D augmented reality assistance. Int. J. Comput. Assist. Radiol. Surg.; 2021; 16, pp. 1435-1445. [DOI: https://dx.doi.org/10.1007/s11548-021-02432-y]
22. Padovan, E.; Marullo, G.; Tanzi, L.; Piazzolla, P.; Moos, S.; Porpiglia, F.; Vezzetti, E. A deep learning framework for real-time 3D model registration in robot-assisted laparoscopic surgery. Int. J. Med. Robot.; 2022; 18, e2387. [DOI: https://dx.doi.org/10.1002/rcs.2387] [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/35246913]
23. Lopez, A.; Zlatev, D.V.; Mach, K.E.; Bui, D.; Liu, J.J.; Rouse, R.V.; Harris, T.; Leppert, J.T.; Liao, J.C. Intraoperative Optical Biopsy during Robotic Assisted Radical Prostatectomy Using Confocal Endomicroscopy. J. Urol.; 2016; 195, pp. 1110-1117. [DOI: https://dx.doi.org/10.1016/j.juro.2015.10.182] [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/26626214]
24. Law, K.W.; Ajib, K.; Couture, F.; Tholomier, C.; Bondarenko, H.D.; Preisser, F.; Karakiewicz, P.I.; Zorn, K.C. Use of the AccuVein AV400 during RARP: An infrared augmented reality device to help reduce abdominal wall hematoma. Can. J. Urol.; 2018; 25, pp. 9384-9388. [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/30125516]
25. Bianchi, L.; Chessa, F.; Angiolini, A.; Cercenelli, L.; Lodi, S.; Bortolani, B.; Molinaroli, E.; Casablanca, C.; Droghetti, M.; Gaudiano, C. et al. The Use of Augmented Reality to Guide the Intraoperative Frozen Section During Robot-assisted Radical Prostatectomy. Eur. Urol.; 2021; 80, pp. 480-488. [DOI: https://dx.doi.org/10.1016/j.eururo.2021.06.020] [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/34332759]
26. van der Poel, H.G.; Buckle, T.; Brouwer, O.R.; Valdés Olmos, R.A.; van Leeuwen, F.W. Intraoperative laparoscopic fluorescence guidance to the sentinel lymph node in prostate cancer patients: Clinical proof of concept of an integrated functional imaging approach using a multimodal tracer. Eur. Urol.; 2011; 60, pp. 826-833. [DOI: https://dx.doi.org/10.1016/j.eururo.2011.03.024] [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/21458154]
27. KleinJan, G.H.; van den Berg, N.S.; de Jong, J.; Wit, E.M.; Thygessen, H.; Vegt, E.; van der Poel, H.G.; van Leeuwen, F.W. Multimodal hybrid imaging agents for sentinel node mapping as a means to (re)connect nuclear medicine to advances made in robot-assisted surgery. Eur. J. Nucl. Med. Mol. Imaging; 2016; 43, pp. 1278-1287. [DOI: https://dx.doi.org/10.1007/s00259-015-3292-2] [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/26768422]
28. van den Berg, N.S.; Buckle, T.; KleinJan, G.H.; van der Poel, H.G.; van Leeuwen, F.W.B. Multispectral Fluorescence Imaging During Robot-assisted Laparoscopic Sentinel Node Biopsy: A First Step Towards a Fluorescence-based Anatomic Roadmap. Eur. Urol.; 2017; 72, pp. 110-117. [DOI: https://dx.doi.org/10.1016/j.eururo.2016.06.012]
29. de Korne, C.M.; Wit, E.M.; de Jong, J.; Valdés Olmos, R.A.; Buckle, T.; van Leeuwen, F.W.B.; van der Poel, H.G. Anatomical localization of radiocolloid tracer deposition affects outcome of sentinel node procedures in prostate cancer. Eur. J. Nucl. Med. Mol. Imaging; 2019; 46, pp. 2558-2568. [DOI: https://dx.doi.org/10.1007/s00259-019-04443-z]
30. Hinsenveld, F.J.; Wit, E.M.K.; van Leeuwen, P.J.; Brouwer, O.R.; Donswijk, M.L.; Tillier, C.N.; Vegt, E.; van Muilekom, E.; van Oosterom, M.N.; van Leeuwen, F.W.B. et al. Prostate-Specific Membrane Antigen PET/CT Combined with Sentinel Node Biopsy for Primary Lymph Node Staging in Prostate Cancer. J. Nucl. Med.; 2020; 61, pp. 540-545. [DOI: https://dx.doi.org/10.2967/jnumed.119.232199]
31. Collamati, F.; van Oosterom, M.N.; De Simoni, M.; Faccini, R.; Fischetti, M.; Mancini Terracciano, C.; Mirabelli, R.; Moretti, R.; Heuvel, J.O.; Solfaroli Camillocci, E. et al. A DROP-IN beta probe for robot-assisted (68)Ga-PSMA radioguided surgery: First ex vivo technology evaluation using prostate cancer specimens. EJNMMI Res.; 2020; 10, 92. [DOI: https://dx.doi.org/10.1186/s13550-020-00682-6]
32. Mazzone, E.; Dell’Oglio, P.; Grivas, N.; Wit, E.; Donswijk, M.; Briganti, A.; Leeuwen, F.V.; Poel, H.V. Diagnostic Value, Oncologic Outcomes, and Safety Profile of Image-Guided Surgery Technologies During Robot-Assisted Lymph Node Dissection with Sentinel Node Biopsy for Prostate Cancer. J. Nucl. Med.; 2021; 62, pp. 1363-1371. [DOI: https://dx.doi.org/10.2967/jnumed.120.259788] [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/33547208]
33. Wit, E.M.K.; van Beurden, F.; Kleinjan, G.H.; Grivas, N.; de Korne, C.M.; Buckle, T.; Donswijk, M.L.; Bekers, E.M.; van Leeuwen, F.W.B.; van der Poel, H.G. The impact of drainage pathways on the detection of nodal metastases in prostate cancer: A phase II randomized comparison of intratumoral vs intraprostatic tracer injection for sentinel node detection. Eur. J. Nucl. Med. Mol. Imaging; 2022; 49, pp. 1743-1753. [DOI: https://dx.doi.org/10.1007/s00259-021-05580-0] [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/34748059]
34. Gondoputro, W.; Scheltema, M.J.; Blazevski, A.; Doan, P.; Thompson, J.E.; Amin, A.; Geboers, B.; Agrawal, S.; Siriwardana, A.; Van Leeuwen, P.J. et al. Robot-Assisted Prostate-Specific Membrane Antigen-Radioguided Surgery in Primary Diagnosed Prostate Cancer. J. Nucl. Med.; 2022; 63, pp. 1659-1664. [DOI: https://dx.doi.org/10.2967/jnumed.121.263743] [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/35241483]
35. Dell’Oglio, P.; Meershoek, P.; Maurer, T.; Wit, E.M.K.; van Leeuwen, P.J.; van der Poel, H.G.; van Leeuwen, F.W.B.; van Oosterom, M.N. A DROP-IN Gamma Probe for Robot-assisted Radioguided Surgery of Lymph Nodes During Radical Prostatectomy. Eur. Urol.; 2021; 79, pp. 124-132. [DOI: https://dx.doi.org/10.1016/j.eururo.2020.10.031] [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/33203549]
36. Gandaglia, G.; Mazzone, E.; Stabile, A.; Pellegrino, A.; Cucchiara, V.; Barletta, F.; Scuderi, S.; Robesti, D.; Leni, R.; Samanes Gajate, A.M. et al. Prostate-specific membrane antigen Radioguided Surgery to Detect Nodal Metastases in Primary Prostate Cancer Patients Undergoing Robot-assisted Radical Prostatectomy and Extended Pelvic Lymph Node Dissection: Results of a Planned Interim Analysis of a Prospective Phase 2 Study. Eur. Urol.; 2022; 82, pp. 411-418. [DOI: https://dx.doi.org/10.1016/j.eururo.2022.06.002] [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/35879127]
37. Özkan, A.; Köseoğlu, E.; Canda, A.E.; Çil, B.E.; Aykanat, C.İ.; Sarıkaya, A.F.; Tarım, K.; Armutlu, A.; Kulaç, İ.; Barçın, E. et al. Fluorescence-guided extended pelvic lymphadenectomy during robotic radical prostatectomy. J. Robot. Surg.; 2022; 17, pp. 885-890. [DOI: https://dx.doi.org/10.1007/s11701-022-01480-z] [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/36329287]
38. Hung, A.J.; Zehnder, P.; Patil, M.B.; Cai, J.; Ng, C.K.; Aron, M.; Gill, I.S.; Desai, M.M. Face, content and construct validity of a novel robotic surgery simulator. J. Urol.; 2011; 186, pp. 1019-1024. [DOI: https://dx.doi.org/10.1016/j.juro.2011.04.064]
39. Aghazadeh, M.A.; Mercado, M.A.; Pan, M.M.; Miles, B.J.; Goh, A.C. Performance of robotic simulated skills tasks is positively associated with clinical robotic surgical performance. BJU Int.; 2016; 118, pp. 475-481. [DOI: https://dx.doi.org/10.1111/bju.13511]
40. Hoogenes, J.; Wong, N.; Al-Harbi, B.; Kim, K.S.; Vij, S.; Bolognone, E.; Quantz, M.; Guo, Y.; Shayegan, B.; Matsumoto, E.D. A Randomized Comparison of 2 Robotic Virtual Reality Simulators and Evaluation of Trainees’ Skills Transfer to a Simulated Robotic Urethrovesical Anastomosis Task. Urology; 2018; 111, pp. 110-115. [DOI: https://dx.doi.org/10.1016/j.urology.2017.09.023]
41. Harrison, P.; Raison, N.; Abe, T.; Watkinson, W.; Dar, F.; Challacombe, B.; Van Der Poel, H.; Khan, M.S.; Dasgupa, P.; Ahmed, K. The Validation of a Novel Robot-Assisted Radical Prostatectomy Virtual Reality Module. J. Surg. Educ.; 2018; 75, pp. 758-766. [DOI: https://dx.doi.org/10.1016/j.jsurg.2017.09.005]
42. Shim, J.S.; Kim, J.Y.; Pyun, J.H.; Cho, S.; Oh, M.M.; Kang, S.H.; Lee, J.G.; Kim, J.J.; Cheon, J.; Kang, S.G. Comparison of effective teaching methods to achieve skill acquisition using a robotic virtual reality simulator: Expert proctoring versus an educational video versus independent training. Medicine; 2018; 97, e13569. [DOI: https://dx.doi.org/10.1097/MD.0000000000013569] [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/30572458]
43. Shim, J.S.; Noh, T.I.; Kim, J.Y.; Pyun, J.H.; Cho, S.; Oh, M.M.; Kang, S.H.; Cheon, J.; Lee, J.G.; Kim, J.J. et al. Predictive Validation of a Robotic Virtual Reality Simulator: The Tube 3 module for Practicing Vesicourethral Anastomosis in Robot-Assisted Radical Prostatectomy. Urology; 2018; 122, pp. 32-36. [DOI: https://dx.doi.org/10.1016/j.urology.2018.08.013] [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/30144481]
44. Almarzouq, A.; Hu, J.; Noureldin, Y.A.; Yin, A.; Anidjar, M.; Bladou, F.; Tanguay, S.; Kassouf, W.; Aprikian, A.G.; Andonian, S. Are basic robotic surgical skills transferable from the simulator to the operating room? A randomized, prospective, educational study. Can. Urol. Assoc. J.; 2020; 14, pp. 416-422. [DOI: https://dx.doi.org/10.5489/cuaj.6460] [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/32569567]
45. Wang, F.; Zhang, C.; Guo, F.; Sheng, X.; Ji, J.; Xu, Y.; Cao, Z.; Lyu, J.; Lu, X.; Yang, B. The application of virtual reality training for anastomosis during robot-assisted radical prostatectomy. Asian J. Urol.; 2021; 8, pp. 204-208. [DOI: https://dx.doi.org/10.1016/j.ajur.2019.11.005] [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/33996477]
46. Olsen, R.G.; Bjerrum, F.; Konge, L.; Jepsen, J.V.; Azawi, N.H.; Bube, S.H. Validation of a Novel Simulation-Based Test in Robot-Assisted Radical Prostatectomy. J. Endourol.; 2021; 35, pp. 1265-1272. [DOI: https://dx.doi.org/10.1089/end.2020.0986] [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/33530867]
47. Ebbing, J.; Wiklund, P.N.; Akre, O.; Carlsson, S.; Olsson, M.J.; Höijer, J.; Heimer, M.; Collins, J.W. Development and validation of non-guided bladder-neck and neurovascular-bundle dissection modules of the RobotiX-Mentor® full-procedure robotic-assisted radical prostatectomy virtual reality simulation. Int. J. Med. Robot.; 2021; 17, e2195. [DOI: https://dx.doi.org/10.1002/rcs.2195] [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/33124140]
48. Papalois, Z.A.; Aydın, A.; Khan, A.; Mazaris, E.; Rathnasamy Muthusamy, A.S.; Dor, F.; Dasgupta, P.; Ahmed, K. HoloMentor: A Novel Mixed Reality Surgical Anatomy Curriculum for Robot-Assisted Radical Prostatectomy. Eur. Surg. Res.; 2022; 63, pp. 40-45. [DOI: https://dx.doi.org/10.1159/000520386]
49. Sanford, D.I.; Ma, R.; Ghoreifi, A.; Haque, T.F.; Nguyen, J.H.; Hung, A.J. Association of Suturing Technical Skill Assessment Scores between Virtual Reality Simulation and Live Surgery. J. Endourol.; 2022; 36, pp. 1388-1394. [DOI: https://dx.doi.org/10.1089/end.2022.0158]
50. van der Leun, J.A.; Siem, G.; Meijer, R.P.; Brinkman, W.M. Improving Robotic Skills by Video Review. J. Endourol.; 2022; 36, pp. 1126-1135. [DOI: https://dx.doi.org/10.1089/end.2021.0740]
51. Noël, J.; Moschovas, M.C.; Patel, E.; Rogers, T.; Marquinez, J.; Rocco, B.; Mottrie, A.; Patel, V. Step-by-step optimisation of robotic-assisted radical prostatectomy using augmented reality. Int. Braz. J. Urol.; 2022; 48, pp. 600-601. [DOI: https://dx.doi.org/10.1590/s1677-5538.ibju.2022.99.10]
52. Cheikh Youssef, S.; Hachach-Haram, N.; Aydin, A.; Shah, T.T.; Sapre, N.; Nair, R.; Rai, S.; Dasgupta, P. Video labelling robot-assisted radical prostatectomy and the role of artificial intelligence (AI): Training a novice. J. Robot. Surg.; 2022; 17, pp. 695-701. [DOI: https://dx.doi.org/10.1007/s11701-022-01465-y] [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/36309954]
53. Hung, A.J.; Chen, J.; Jarc, A.; Hatcher, D.; Djaladat, H.; Gill, I.S. Development and Validation of Objective Performance Metrics for Robot-Assisted Radical Prostatectomy: A Pilot Study. J. Urol.; 2018; 199, pp. 296-304. [DOI: https://dx.doi.org/10.1016/j.juro.2017.07.081] [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/28765067]
54. Wit, E.M.K.; Acar, C.; Grivas, N.; Yuan, C.; Horenblas, S.; Liedberg, F.; Valdes Olmos, R.A.; van Leeuwen, F.W.B.; van den Berg, N.S.; Winter, A. et al. Sentinel Node Procedure in Prostate Cancer: A Systematic Review to Assess Diagnostic Accuracy. Eur. Urol.; 2017; 71, pp. 596-605. [DOI: https://dx.doi.org/10.1016/j.eururo.2016.09.007] [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/27639533]
55. Bravi, C.A.; Paciotti, M.; Sarchi, L.; Mottaran, A.; Nocera, L.; Farinha, R.; De Backer, P.; Vinckier, M.-H.; De Naeyer, G.; D’Hondt, F. et al. Robot-assisted Radical Prostatectomy with the Novel Hugo Robotic System: Initial Experience and Optimal Surgical Set-up at a Tertiary Referral Robotic Center. Eur. Urol.; 2022; 82, pp. 233-237. [DOI: https://dx.doi.org/10.1016/j.eururo.2022.04.029] [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/35568597]
56. Beulens, A.J.W.; Vaartjes, L.; Tilli, S.; Brinkman, W.M.; Umari, P.; Puliatti, S.; Koldewijn, E.L.; Hendrikx, A.J.M.; van Basten, J.P.; van Merriënboer, J.J.G. et al. Structured robot-assisted surgery training curriculum for residents in Urology and impact on future surgical activity. J. Robot. Surg.; 2021; 15, pp. 497-510. [DOI: https://dx.doi.org/10.1007/s11701-020-01134-y] [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/32772237]
57. Imber, B.S.; Varghese, M.; Ehdaie, B.; Gorovets, D. Financial toxicity associated with treatment of localized prostate cancer. Nat. Rev. Urol.; 2020; 17, pp. 28-40. [DOI: https://dx.doi.org/10.1038/s41585-019-0258-3] [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/31792431]
58. Özman, O.; Tillier, C.N.; van Muilekom, E.; van de Poll-Franse, L.V.; van der Poel, H.G. Financial Toxicity After Robot-Assisted Radical Prostatectomy and Its Relation with Oncologic, Functional Outcomes. J. Urol.; 2022; 208, pp. 978-986. [DOI: https://dx.doi.org/10.1097/JU.0000000000002897]
59. Bolenz, C.; Gupta, A.; Hotze, T.; Ho, R.; Cadeddu, J.A.; Roehrborn, C.G.; Lotan, Y. Cost comparison of robotic, laparoscopic, and open radical prostatectomy for prostate cancer. Eur. Urol.; 2010; 57, pp. 453-458. [DOI: https://dx.doi.org/10.1016/j.eururo.2009.11.008]
60. Checcucci, E.; Verri, P.; Amparore, D.; Cacciamani, G.E.; Rivas, J.G.; Autorino, R.; Mottrie, A.; Breda, A.; Porpiglia, F. The future of robotic surgery in urology: From augmented reality to the advent of metaverse. Ther. Adv. Urol.; 2023; 15, 17562872231151853. [DOI: https://dx.doi.org/10.1177/17562872231151853]
61. Andras, I.; Mazzone, E.; van Leeuwen, F.W.B.; De Naeyer, G.; van Oosterom, M.N.; Beato, S.; Buckle, T.; O’Sullivan, S.; van Leeuwen, P.J.; Beulens, A. et al. Artificial intelligence and robotics: A combination that is changing the operating room. World J. Urol.; 2020; 38, pp. 2359-2366. [DOI: https://dx.doi.org/10.1007/s00345-019-03037-6]
62. Rodler, S.; Buchner, A.; Stief, C.G.; Heinemann, V.; Staehler, M.; Casuscelli, J. Patients’ Perspective on Digital Technologies in Advanced Genitourinary Cancers. Clin. Genitourin. Cancer; 2021; 19, pp. 76-82.e76. [DOI: https://dx.doi.org/10.1016/j.clgc.2020.03.018]
63. Zachrison, K.S.; Yan, Z.; Samuels-Kalow, M.E.; Licurse, A.; Zuccotti, G.; Schwamm, L.H. Association of Physician Characteristics with Early Adoption of Virtual Health Care. JAMA Netw. Open; 2021; 4, e2141625. [DOI: https://dx.doi.org/10.1001/jamanetworkopen.2021.41625]
64. Gandaglia, G.; Schatteman, P.; De Naeyer, G.; D’Hondt, F.; Mottrie, A. Novel Technologies in Urologic Surgery: A Rapidly Changing Scenario. Curr. Urol. Rep.; 2016; 17, 19. [DOI: https://dx.doi.org/10.1007/s11934-016-0577-3]
65. Cacciamani, G.E.; Chu, T.N.; Sanford, D.I.; Abreu, A.; Duddalwar, V.; Oberai, A.; Kuo, C.C.J.; Liu, X.; Denniston, A.K.; Vasey, B. et al. PRISMA AI reporting guidelines for systematic reviews and meta-analyses on AI in healthcare. Nat. Med.; 2023; 29, pp. 14-15. [DOI: https://dx.doi.org/10.1038/s41591-022-02139-w]
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Abstract
New imaging technologies play a pivotal role in the current management of patients with prostate cancer. Robotic assisted radical prostatectomy (RARP) is a standard of care for localized disease and, because the procedure is already performed through an imaging-based console, lends itself to research combining new imaging technologies with RARP and assessing their impact on surgical outcomes. We therefore aimed to provide a comprehensive analysis of the currently available literature on new imaging technologies for RARP. On 24 January 2023, we performed a systematic review of the literature in PubMed, Scopus and Web of Science according to the PRISMA guidelines and the Oxford levels of evidence. A total of 46 studies were identified, of which 19 focus on imaging of the primary tumor, 12 on intraoperative detection of lymph node metastases and 15 on the training of surgeons. While the feasibility of combined approaches using new imaging technologies, including MRI, PSMA-PET/CT and intraoperatively applied radioactive tracers and fluorescent dyes, has been demonstrated, prospective confirmation of improvements in surgical outcomes is still ongoing.
1 Department of Urology, University Hospital of Munich, 81377 Munich, Germany
2 Department of Urology, Klinikum Mannheim, 68167 Mannheim, Germany
3 Urology and Nephrology Department, Virgen del Rocío University Hospital, Manuel Siurot s/n, 41013 Seville, Spain
4 Institute for Urology and Reproductive Health, Sechenov University, 117418 Moscow, Russia
5 Department of Urology, University of Modena and Reggio Emilia, 42122 Modena, Italy
6 Department of Urology, Hospital Clinico San Carlos, 28040 Madrid, Spain
7 Urology Unit, Azienda Ospedaliera Universitaria Integrata Verona, 37126 Verona, Italy
8 Division of Urology, IRCCS Azienda Ospedaliero-Universitaria di Bologna, 40138 Bologna, Italy
9 Department of Surgery, Candiolo Cancer Institute, FPO-IRCCS, Candiolo, 10060 Turin, Italy
10 USC Institute of Urology, University of Southern California, Los Angeles, CA 90007, USA