Angular metrics and an effort based metric used as features for an automatic classifying algorithm of surgical gestures

Marco Bombieri;Diego Dall'Alba;Sanat Ramesh;Giovanni Menegozzo;Paolo Fiorini
2020-01-01

Abstract

Automated surgical gesture classification and recognition are important precursors to the objective evaluation of surgical skills. Much work has been done to discover and validate metrics based on instrument motion that can be used as features for the automatic classification of surgical gestures. In this work, we present a series of angular metrics that can be used together with Cartesian-based metrics to better describe different surgical gestures. These metrics can be calculated in both Cartesian and joint space, and they are used in this work as features for the automatic classification of surgical gestures. To evaluate the proposed metrics, we introduce a novel surgical dataset that contains both Cartesian- and joint-space data acquired with the da Vinci Research Kit (dVRK) while a single expert operator performs 40 consecutive suturing exercises. The obtained results confirm that applying metrics in the joint space improves the accuracy of automatic gesture classification.
2020
Robotic surgical gestures
Angular metrics
Joint-space metrics
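The abstract describes angular metrics computed from joint-space trajectories and used, alongside Cartesian-based metrics, as features for gesture classification. The sketch below is a minimal, hypothetical illustration of how such joint-space angular features might be extracted from a segmented trajectory; the function name, the specific metrics chosen, and the data layout are assumptions for illustration, not the authors' implementation or the dVRK data format.

```python
"""Illustrative sketch: simple angular metrics from a joint-space trajectory,
used as a feature vector for gesture classification. All names are hypothetical."""
import numpy as np

def angular_features(q, dt):
    """q: (T, J) array of joint angles sampled every dt seconds.
    Returns a flat feature vector of per-joint angular metrics."""
    dq = np.diff(q, axis=0) / dt                      # angular velocity, shape (T-1, J)
    ddq = np.diff(dq, axis=0) / dt                    # angular acceleration, shape (T-2, J)
    path_length = np.sum(np.abs(dq), axis=0) * dt     # total angle travelled per joint
    mean_speed = np.mean(np.abs(dq), axis=0)          # mean angular speed per joint
    peak_speed = np.max(np.abs(dq), axis=0)           # peak angular speed per joint
    mean_accel = np.mean(np.abs(ddq), axis=0)         # mean angular acceleration per joint
    return np.concatenate([path_length, mean_speed, peak_speed, mean_accel])

if __name__ == "__main__":
    # Toy example: 20 segmented gestures, each a (T, 6) joint trajectory at 100 Hz.
    rng = np.random.default_rng(0)
    segments = [rng.standard_normal((100, 6)).cumsum(axis=0) * 0.01 for _ in range(20)]
    labels = rng.integers(0, 3, size=20)              # hypothetical gesture labels
    X = np.stack([angular_features(q, dt=0.01) for q in segments])
    # A conventional classifier (e.g., from scikit-learn) could then be trained on (X, labels).
```

In a setup like this, the joint-space feature vectors would simply be concatenated with the corresponding Cartesian-based features before training the classifier.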
Files in this record:
There are no files associated with this record.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11562/1027814