Design and Integration of Electrical Bio-impedance Sensing in Surgical Robotic Tools for Tissue Identification and Display

Dall'Alba, D.; De Momi, E.; Fiorini, P.
2019-01-01

Abstract

The integration of intra-operative sensors into surgical robots is an active research topic, as it can significantly facilitate complex surgical procedures by enhancing surgical awareness with real-time tissue information. However, currently available intra-operative sensing technologies are mainly based on image processing and force feedback, which normally require heavy computation or complicated hardware modifications to existing surgical tools. This paper presents the design and integration of electrical bio-impedance sensing into a commercial surgical robot tool, leading to a novel smart instrument that can identify tissues simply by touching them. In addition, an advanced user interface is designed to provide guidance during the use of the system and to allow augmented-reality visualization of the tissue identification results. The proposed system requires only minor hardware modifications to an existing surgical tool, yet adds the capability to provide a wealth of data about the tissue being manipulated. This has great potential to allow the surgeon (or an autonomous robotic system) to better understand the surgical environment. To evaluate the system, a series of ex-vivo experiments was conducted. The experimental results demonstrate that the proposed sensing system can successfully identify different tissue types with 100% classification accuracy. In addition, the user interface was shown to effectively and intuitively guide the user in measuring the electrical impedance of the target tissue, presenting the identification results as augmented-reality markers for simple and immediate recognition.
electrical bio-impedance; tissue identification; da Vinci Research Kit; user interface; intra-operative sensing; augmented reality
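
To make the sensing-to-identification pipeline summarized in the abstract concrete, the following is a minimal, hypothetical sketch, not the implementation described in the paper. It assumes the instrument reports the impedance magnitude |Z| at a few excitation frequencies when it touches a tissue, and it classifies each measurement against labelled ex-vivo reference spectra with a nearest-neighbour classifier. All frequencies, impedance values, and tissue labels below are illustrative placeholders.

```python
# Hypothetical sketch of bio-impedance-based tissue identification
# (not the paper's implementation). Each sample is the impedance
# magnitude |Z| (ohms) measured at 1 kHz, 10 kHz, and 100 kHz.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Placeholder reference measurements from labelled ex-vivo tissues.
X_train = np.array([
    [1200.0, 950.0, 600.0],    # liver (illustrative values)
    [1150.0, 900.0, 580.0],    # liver
    [2100.0, 1700.0, 1100.0],  # muscle
    [2050.0, 1650.0, 1050.0],  # muscle
    [450.0,  400.0,  350.0],   # fat
    [470.0,  410.0,  360.0],   # fat
])
y_train = ["liver", "liver", "muscle", "muscle", "fat", "fat"]

# Standardize the impedance features, then classify with k-NN (k=1
# is enough for this tiny illustrative dataset).
clf = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=1))
clf.fit(X_train, y_train)

# New measurement acquired when the instrument touches an unknown tissue.
z_measured = np.array([[1180.0, 930.0, 590.0]])
print(clf.predict(z_measured))  # expected output: ['liver']
```

In a real system the reference data would come from calibrated ex-vivo measurements, and richer features (e.g., impedance phase or full spectra over many frequencies) could be fed to the classifier before the result is rendered as an augmented-reality marker.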
Files in this record:
  • File: frobt-06-00055.pdf
  • Access: open access
  • License: Creative Commons
  • Size: 774.98 kB
  • Format: Adobe PDF

Use this identifier to cite or link to this document: https://hdl.handle.net/11562/1020324
Citations
  • PMC: not available
  • Scopus: 18
  • Web of Science (ISI): 16