PhD defense of Tamara DUPUY (TIMC, GMCAO team) on Tuesday, June 13th, at 1:30pm:
« AI Assistance for Real-Time Prostate Biopsy Navigation. »
Place: Amphithéâtre Inférieur Nord, Jean Roget building, Faculties of Medicine & Pharmacy of Université Grenoble Alpes, Santé campus, La Tronche
- Sandrine VOROS, Research Director, INSERM Auvergne-Rhône-Alpes, Supervisor
- Jocelyne TROCCAZ, Research Director, CNRS Alpes, Co-supervisor
- Clément BEITONE, Associate Professor, Université Grenoble Alpes, Co-supervisor
- Alain LALANDE, Associate Professor - Hospital Practitioner, Université de Bourgogne, Reviewer
- Antoine SIMON, Associate Professor (HDR), Université de Rennes, Reviewer
- Gaëlle FIARD, University Professor - Hospital Practitioner, Université Grenoble Alpes, Examiner
- Carole LARTIZIEN, Research Director, CNRS Rhône Auvergne, Examiner
- Ingerid REINERTSEN, Associate Professor, SINTEF Digital, Examiner
Keywords: 2D/3D medical image registration; deep learning; prostate biopsy; ultrasound-guided procedure
Prostate cancer is a major public health problem, and only a prostate biopsy examination can confirm its diagnosis. During this examination, accurate biopsy targeting is limited by the two-dimensional ultrasound (2D US) modality commonly used for guidance, as well as by potential movements or deformations of the prostate. These factors make biopsy localization and targeting difficult and imprecise, which can compromise both the diagnosis and subsequent therapeutic decisions.
In this context, the main objective of this thesis is to design and develop a navigation assistance device providing the urologist with continuous and precise tracking of the targeted biopsy location while accounting for prostate deformation. To this end, we proposed real-time 2D/3D rigid registration methods that localize the current intraoperative 2D US guidance image with respect to a reference US volume acquired at the beginning of navigation. The main contributions of our work are the integration of prior trajectory information through artificial intelligence (deep learning) approaches, while respecting clinical conditions and realism.
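To make the 2D/3D registration problem concrete: finding the pose of a 2D US image inside a reference volume amounts to finding the rigid transform under which a slice resampled from the volume matches the live image. The sketch below (our own illustration, not the thesis implementation; it uses nearest-neighbour sampling and assumed voxel-space conventions) shows such a slice-extraction operator:

```python
import numpy as np

def extract_slice(volume, rotation, translation, size):
    """Resample a 2D plane from a 3D volume under a rigid pose.

    A 2D/3D rigid registration estimates the (rotation, translation)
    under which this extracted slice matches the live 2D US image.
    Nearest-neighbour sampling keeps the sketch short.
    """
    d, h, w = volume.shape
    # In-plane grid (the plane z = 0 in its own frame), centred on the volume.
    ys, xs = np.mgrid[0:size[0], 0:size[1]]
    pts = np.stack(
        [np.zeros_like(xs), ys - size[0] / 2, xs - size[1] / 2], axis=-1
    ).astype(float)
    centre = np.array([d / 2, h / 2, w / 2])
    mapped = pts @ rotation.T + translation + centre  # plane -> volume coords
    idx = np.rint(mapped).astype(int)
    valid = ((idx >= 0) & (idx < np.array([d, h, w]))).all(axis=-1)
    out = np.zeros(size)
    zi, yi, xi = idx[..., 0], idx[..., 1], idx[..., 2]
    out[valid] = volume[zi[valid], yi[valid], xi[valid]]
    return out
```

With the identity rotation and zero translation, the operator simply returns the central axial slice of the volume; a learning-based method would regress the 6-DOF pose that best aligns the extracted slice with the observed image.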
By adding prior trajectory information, in particular previous registration results, relative probe tracking, and segmentation-based multitask learning, we observed a significant improvement in registration quality. These results were obtained on a large clinical database and evaluated through a cumulative protocol that is relevant and adapted to the clinical application. Identifying new types of trajectory information, as well as new ways to integrate them into a deep architecture, remains a perspective for future work.
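One simple way to see how previous registration results and relative probe tracking can be combined (a minimal sketch of the general idea, not the thesis architecture) is to chain rigid transforms: the pose of the current frame is initialized from the previous frame's registration composed with the probe's relative motion, leaving only a small correction to be estimated:

```python
import numpy as np

def rigid(rotation, translation):
    """Build a 4x4 homogeneous rigid transform."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

def initial_pose(prev_registration, relative_tracking):
    """Predict the pose of frame t from frame t-1's registration result
    and the probe's relative motion between the two frames (both 4x4).
    A registration network then only has to regress the residual."""
    return relative_tracking @ prev_registration
```

For example, if the previous registration placed the probe at translation (1, 0, 0) and tracking reports a relative motion of (0, 2, 0), the predicted initial pose sits at (1, 2, 0).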