
Improving augmented reality through deep learning : real-time instrument delineation in robotic renal surgery

(2023) EUROPEAN UROLOGY. 84(1). p.86-91
Abstract
Several barriers prevent the integration and adoption of augmented reality (AR) in robotic renal surgery despite the increased availability of virtual three-dimensional (3D) models. Apart from correct model alignment and deformation, not all instruments are clearly visible in AR. Superimposition of a 3D model on top of the surgical stream, including the instruments, can result in a potentially hazardous surgical situation. We demonstrate real-time instrument detection during AR-guided robot-assisted partial nephrectomy and show the generalization of our algorithm to AR-guided robot-assisted kidney transplantation. We developed an algorithm using deep learning networks to detect all nonorganic items. This algorithm learned to extract this information for 65 927 manually labeled instruments on 15 100 frames. Our setup, which runs on a standalone laptop, was deployed in three different hospitals and used by four different surgeons. Instrument detection is a simple and feasible way to enhance the safety of AR-guided surgery. Future investigations should strive to optimize efficient video processing to minimize the 0.5-s delay.
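A minimal, hypothetical sketch of the overlay idea described above: composite a rendered 3D-model frame onto the live video, but keep every pixel the network classifies as an instrument, so the overlay can never hide an instrument. The abstract does not specify the network architecture or code, so segment_instruments below is only a placeholder for the trained segmenter, and all names, shapes, and the NumPy-based blending are assumptions, not the authors' implementation.

import numpy as np

def segment_instruments(frame: np.ndarray) -> np.ndarray:
    # Placeholder for the deep-learning instrument segmenter described in the
    # paper (architecture not given in this abstract). Returns a boolean mask
    # with True at instrument pixels; here it is a dummy all-background mask.
    return np.zeros(frame.shape[:2], dtype=bool)

def composite_ar(frame: np.ndarray, overlay_rgba: np.ndarray) -> np.ndarray:
    # Alpha-blend the rendered 3D model (RGBA) onto the video frame, except at
    # instrument pixels, where the live video is kept so instruments stay visible.
    mask = segment_instruments(frame)                           # (H, W) bool
    alpha = overlay_rgba[..., 3:4].astype(np.float32) / 255.0   # (H, W, 1)
    alpha[mask] = 0.0                                            # never draw the model over instruments
    blended = frame.astype(np.float32) * (1.0 - alpha) \
        + overlay_rgba[..., :3].astype(np.float32) * alpha
    return blended.astype(np.uint8)

if __name__ == "__main__":
    frame = np.zeros((1080, 1920, 3), dtype=np.uint8)    # stand-in surgical video frame
    overlay = np.zeros((1080, 1920, 4), dtype=np.uint8)  # stand-in rendered 3D-model frame (RGBA)
    print(composite_ar(frame, overlay).shape)             # (1080, 1920, 3)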
Keywords
Real time, Deep learning, Renal cell carcinoma, Robotic surgery, Kidney transplantation, Partial nephrectomy, Instrument segmentation, Augmented reality, Three-dimensional models

Downloads

  • published version.pdf: full text (Published version) | open access | PDF | 2.07 MB

Citation

Please use this url to cite or link to this publication:

MLA
De Backer, Pieter, et al. “Improving Augmented Reality through Deep Learning : Real-Time Instrument Delineation in Robotic Renal Surgery.” EUROPEAN UROLOGY, vol. 84, no. 1, Elsevier, 2023, pp. 86–91, doi:10.1016/j.eururo.2023.02.024.
APA
De Backer, P., Van Praet, C., Simoens, J., Peraire Lores, M., Creemers, H., Mestdagh, K., … Mottrie, A. (2023). Improving augmented reality through deep learning : real-time instrument delineation in robotic renal surgery. EUROPEAN UROLOGY, 84(1), 86–91. https://doi.org/10.1016/j.eururo.2023.02.024
Chicago author-date
De Backer, Pieter, Charles Van Praet, Jente Simoens, Maria Peraire Lores, Heleen Creemers, Kenzo Mestdagh, Charlotte Allaeys, et al. 2023. “Improving Augmented Reality through Deep Learning : Real-Time Instrument Delineation in Robotic Renal Surgery.” EUROPEAN UROLOGY 84 (1): 86–91. https://doi.org/10.1016/j.eururo.2023.02.024.
Chicago author-date (all authors)
De Backer, Pieter, Charles Van Praet, Jente Simoens, Maria Peraire Lores, Heleen Creemers, Kenzo Mestdagh, Charlotte Allaeys, Saar Vermijs, Pietro Piazza, Angelo Mottaran, Carlo Andrea Bravi, Marco Paciotti, Luca Sarchi, Rui Farinha, Stefano Puliatti, Francesco Cisternino, Federica Ferraguti, Charlotte Debbaut, Geert De Naeyer, Karel Decaestecker, and Alexandre Mottrie. 2023. “Improving Augmented Reality through Deep Learning : Real-Time Instrument Delineation in Robotic Renal Surgery.” EUROPEAN UROLOGY 84 (1): 86–91. doi:10.1016/j.eururo.2023.02.024.
Vancouver
1.
De Backer P, Van Praet C, Simoens J, Peraire Lores M, Creemers H, Mestdagh K, et al. Improving augmented reality through deep learning : real-time instrument delineation in robotic renal surgery. EUROPEAN UROLOGY. 2023;84(1):86–91.
IEEE
[1]
P. De Backer et al., “Improving augmented reality through deep learning : real-time instrument delineation in robotic renal surgery,” EUROPEAN UROLOGY, vol. 84, no. 1, pp. 86–91, 2023.
@article{01GTH8RSPG3B5S3P3NE5MQ9KTF,
  abstract     = {{Several barriers prevent the integration and adoption of augmented reality (AR) in robotic renal surgery despite the increased availability of virtual three-dimensional (3D) models. Apart from correct model alignment and deformation, not all instruments are clearly visible in AR. Superimposition of a 3D model on top of the surgical stream, including the instruments, can result in a potentially hazardous surgical situation. We demonstrate real-time instrument detection during AR-guided robot-assisted partial nephrectomy and show the generalization of our algorithm to AR-guided robot-assisted kidney transplantation. We developed an algorithm using deep learning networks to detect all nonorganic items. This algorithm learned to extract this information for 65 927 manually labeled instruments on 15 100 frames. Our setup, which runs on a standalone laptop, was deployed in three different hospitals and used by four different surgeons. Instrument detection is a simple and feasible way to enhance the safety of AR-guided surgery. Future investigations should strive to optimize efficient video processing to minimize the 0.5-s delay.}},
  author       = {{De Backer, Pieter and Van Praet, Charles and Simoens, Jente and Peraire Lores, Maria and Creemers, Heleen and Mestdagh, Kenzo and Allaeys, Charlotte and Vermijs, Saar and Piazza, Pietro and Mottaran, Angelo and Bravi, Carlo Andrea and Paciotti, Marco and Sarchi, Luca and Farinha, Rui and Puliatti, Stefano and Cisternino, Francesco and Ferraguti, Federica and Debbaut, Charlotte and De Naeyer, Geert and Decaestecker, Karel and Mottrie, Alexandre}},
  issn         = {{0302-2838}},
  journal      = {{EUROPEAN UROLOGY}},
  keywords     = {{Real time,Deep learning,Renal cell carcinoma,Robotic surgery,Kidney transplantation,Partial nephrectomy,Instrument segmentation,Augmented reality,Three-dimensional models}},
  language     = {{eng}},
  number       = {{1}},
  pages        = {{86--91}},
  publisher    = {{Elsevier}},
  title        = {{Improving augmented reality through deep learning : real-time instrument delineation in robotic renal surgery}},
  url          = {{https://doi.org/10.1016/j.eururo.2023.02.024}},
  volume       = {{84}},
  year         = {{2023}},
}
