
Autonomous Needle Manipulation for Robotic Surgical Suturing Based on Skills Learned from Demonstration

Dall'Alba, Diego; Fiorini, Paolo
2021-01-01

Abstract

In the future, surgical robots will grant the option of executing surgical tasks autonomously, supervised by the surgeon. We propose a simple framework for learning surgical action primitives that can be used as building blocks for composing more elaborate surgical tasks. Our method is based on Learning from Demonstration (LfD), as this allows us to exploit existing expert knowledge from recordings of surgical procedures. We demonstrate that we can learn needle manipulation actions from human demonstration, constructing an action library that is used to autonomously execute part of a surgical suturing task. Actions are learned from single demonstrations, and we use Dynamic Movement Primitives (DMPs) to encode low-level Cartesian-space trajectories. Our method is experimentally validated in a non-clinical setting, where we show that learned actions can be generalized to previously unseen conditions. Experiments show an 81% task success rate for moderate variations from the initial conditions of the demonstration, with a mean needle insertion error of 3.8 mm.
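The abstract states that low-level Cartesian trajectories are encoded as Dynamic Movement Primitives learned from a single demonstration. As an illustration of that encoding, the sketch below implements a standard one-dimensional discrete DMP (Ijspeert-style spring-damper system with a phase-driven forcing term) fitted to one demonstrated trajectory and rolled out toward a new goal. This is not the paper's implementation: the class name, gains, and basis-function count are illustrative assumptions, and the paper applies the idea per Cartesian axis.

```python
import numpy as np

class DMP1D:
    """Minimal discrete DMP for one Cartesian axis (illustrative sketch).

    Transformation system: tau*v' = alpha_z*(beta_z*(g - y) - v) + f(x)
    Canonical system:      tau*x' = -alpha_x*x, with x(0) = 1
    """

    def __init__(self, n_bfs=20, alpha_z=25.0, alpha_x=4.0):
        self.n_bfs = n_bfs
        self.alpha_z = alpha_z        # spring gain (assumed value)
        self.beta_z = alpha_z / 4.0   # critically damped choice
        self.alpha_x = alpha_x        # phase decay rate
        # Basis-function centres spaced along the phase variable x
        self.c = np.exp(-alpha_x * np.linspace(0.0, 1.0, n_bfs))
        self.h = np.ones(n_bfs)
        self.h[:-1] = 1.0 / np.diff(self.c) ** 2  # widths from spacing
        self.h[-1] = self.h[-2]

    def _psi(self, x):
        return np.exp(-self.h * (x - self.c) ** 2)

    def fit(self, y_demo, dt):
        """Learn forcing-term weights from a single demonstration."""
        self.y0, self.g = y_demo[0], y_demo[-1]
        self.tau = (len(y_demo) - 1) * dt
        yd = np.gradient(y_demo, dt)
        ydd = np.gradient(yd, dt)
        t = np.arange(len(y_demo)) * dt
        x = np.exp(-self.alpha_x * t / self.tau)
        # Invert the transformation system to get the target forcing term
        f_target = (self.tau ** 2 * ydd
                    - self.alpha_z * (self.beta_z * (self.g - y_demo)
                                      - self.tau * yd))
        scale = x * (self.g - self.y0)
        # Locally weighted regression: one weight per basis function
        self.w = np.empty(self.n_bfs)
        for i in range(self.n_bfs):
            psi = np.exp(-self.h[i] * (x - self.c[i]) ** 2)
            self.w[i] = (np.sum(psi * scale * f_target)
                         / (np.sum(psi * scale ** 2) + 1e-10))

    def rollout(self, g=None, dt=0.01):
        """Integrate the DMP, optionally toward a new goal g."""
        g = self.g if g is None else g
        n = int(self.tau / dt) + 1
        y, v, x = self.y0, 0.0, 1.0
        traj = np.empty(n)
        for k in range(n):
            traj[k] = y
            psi = self._psi(x)
            f = (psi @ self.w) / (psi.sum() + 1e-10) * x * (g - self.y0)
            v += (self.alpha_z * (self.beta_z * (g - y) - v) + f) / self.tau * dt
            y += v / self.tau * dt
            x += -self.alpha_x * x / self.tau * dt
        return traj
```

Because the forcing term is scaled by `(g - y0)` and vanishes as the phase decays, the same learned weights generalize to shifted goals, which is the property the paper relies on to adapt demonstrated needle motions to unseen conditions.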
2021
978-1-6654-1873-7
Medical robotics, Computer aided software engineering, Surgical Instruments, Surgery, Surgical Needles
Files in this product:
File: Schwaner2021_AutonomousNeedleManipulation.pdf
Access: authorized users only
Type: Post-print document
License: Restricted access
Size: 735.45 kB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11562/1066634
Citations
  • PMC: not available
  • Scopus: 14
  • Web of Science: not available