
Physics-based deep neural network for real-time lesion tracking in ultrasound-guided breast biopsy

Eleonora Tagliabue;Diego Dall’Alba;Paolo Fiorini;
2019-01-01

Abstract

In the context of ultrasound (US) guided breast biopsy, image fusion techniques can be employed to track the position of US-invisible lesions previously identified on a pre-operative image. Such methods must account for the large anatomical deformations caused by probe pressure during US scanning, while satisfying the real-time constraint. Although biomechanical models based on the finite element (FE) method represent the preferred approach for modeling breast behavior, they cannot achieve real-time performance. In this paper we propose to use deep neural networks to learn the large deformations occurring in ultrasound-guided breast biopsy and thus to provide accurate predictions of lesion displacement in real time. We train a U-Net architecture on a relatively small amount of synthetic data, generated in an offline phase from FE simulations of probe-induced deformations on the breast anatomy of interest. Overall, both training data generation and network training are performed in less than 5 hours, which is clinically acceptable since the biopsy can be performed as early as the day after the pre-operative scan. The method is tested on both synthetic data and real data acquired on a realistic breast phantom. Results show that our method correctly learns the deformable behavior modelled via FE simulations and generalizes to real data, achieving a target registration error comparable to that of FE models while being about a hundred times faster.
Ultrasound-guided Breast Biopsy, Deep Neural Networks, Real-time Simulation
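
The record does not include the implementation, but the pipeline described in the abstract (a U-Net trained on FE-simulated, probe-induced deformations to predict the displacement field, from which lesion motion is derived) can be illustrated with a minimal sketch. The input/output encoding (a voxelized surface-displacement grid in, a dense displacement field out), the grid resolution, channel widths, and the MSE loss below are assumptions for illustration only, not the authors' configuration.

```python
# Minimal sketch (not the authors' released code) of a 3D U-Net that regresses
# a dense displacement field from a voxelized encoding of the probe-induced
# surface displacement. Resolution, widths and loss are illustrative assumptions.
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch):
    """Two 3x3x3 convolutions with ReLU, the basic U-Net building block."""
    return nn.Sequential(
        nn.Conv3d(in_ch, out_ch, kernel_size=3, padding=1), nn.ReLU(inplace=True),
        nn.Conv3d(out_ch, out_ch, kernel_size=3, padding=1), nn.ReLU(inplace=True),
    )

class UNet3D(nn.Module):
    """Encoder-decoder with skip connections: 3 input channels encode the imposed
    surface displacement (x, y, z), 3 output channels are the predicted
    displacement of every voxel of the breast volume."""
    def __init__(self, in_ch=3, out_ch=3, base=16):
        super().__init__()
        self.enc1 = conv_block(in_ch, base)
        self.enc2 = conv_block(base, base * 2)
        self.enc3 = conv_block(base * 2, base * 4)
        self.pool = nn.MaxPool3d(2)
        self.up2 = nn.ConvTranspose3d(base * 4, base * 2, kernel_size=2, stride=2)
        self.dec2 = conv_block(base * 4, base * 2)
        self.up1 = nn.ConvTranspose3d(base * 2, base, kernel_size=2, stride=2)
        self.dec1 = conv_block(base * 2, base)
        self.head = nn.Conv3d(base, out_ch, kernel_size=1)

    def forward(self, x):
        e1 = self.enc1(x)
        e2 = self.enc2(self.pool(e1))
        e3 = self.enc3(self.pool(e2))
        d2 = self.dec2(torch.cat([self.up2(e3), e2], dim=1))
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))
        return self.head(d1)

if __name__ == "__main__":
    # One training step on random tensors standing in for FE-generated samples:
    # inputs are surface-displacement grids, targets are full displacement fields.
    net = UNet3D()
    optim = torch.optim.Adam(net.parameters(), lr=1e-3)
    x = torch.randn(2, 3, 32, 32, 32)   # batch of voxelized probe displacements
    y = torch.randn(2, 3, 32, 32, 32)   # corresponding FE displacement fields
    loss = nn.functional.mse_loss(net(x), y)
    loss.backward()
    optim.step()
    print("loss:", loss.item())
```

At inference time, a network of this kind evaluates in milliseconds on a GPU, which is consistent with the roughly hundred-fold speed-up over FE simulation reported in the abstract.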
Files in this record:
DNNbreast_iris.pdf (open access)
Type: Post-print
License: Public domain
Size: 3.54 MB
Format: Adobe PDF


Use this identifier to cite or link to this document: https://hdl.handle.net/11562/1018553