Targeting Prostate Biopsy: A Straightforward CNN Solution for 3D MR-TRUS Image Registration
Abstract
Automatic fusion of transrectal ultrasound (TRUS) and magnetic resonance (MR) images for targeted prostate biopsies has significantly improved the detection of aggressive cancers. However, achieving robust and accurate automatic MR-TRUS registration is challenging because of the significant differences between the two modalities. This study presents a data-driven deep learning method to predict the rigid transform (6 rotation and translation parameters) between 3D MR and 3D TRUS images. An end-to-end convolutional neural network is proposed to learn voxel-level spatial correspondence for multimodal registration tasks. We focus on model stability and generalization by combining multiple techniques, from preprocessing to training strategy. The results show that the proposed network accurately predicts the rigid parameters, with a surface registration error (SRE) of 1.81 (± 0.45) mm. The experiments also demonstrate that the proposed method achieves competitive performance compared to more complex rigid registration networks. The proposed network does not incorporate complex modules, highlighting the importance of pairing a simpler model design with efficient data preprocessing.
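To make the predicted quantity concrete: a rigid transform parameterized by 6 values (3 Euler rotation angles and 3 translations) can be assembled into a single 4×4 homogeneous matrix that maps points from one volume's space to the other's. The sketch below is illustrative only, not the paper's implementation; the Z-Y-X rotation composition order and the parameter ordering are assumptions, since the abstract does not specify a convention.

```python
import numpy as np

def rigid_matrix(params):
    """Build a 4x4 homogeneous rigid transform from 6 parameters:
    (rx, ry, rz) Euler angles in radians and (tx, ty, tz) translation.
    The intrinsic Z-Y-X composition order is an assumed convention."""
    rx, ry, rz, tx, ty, tz = params
    cx, sx = np.cos(rx), np.sin(rx)
    cy, sy = np.cos(ry), np.sin(ry)
    cz, sz = np.cos(rz), np.sin(rz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx   # combined rotation
    T[:3, 3] = [tx, ty, tz]    # translation column
    return T

def apply_transform(T, points):
    """Apply a 4x4 rigid transform to an (N, 3) array of points."""
    homog = np.hstack([points, np.ones((len(points), 1))])
    return (homog @ T.T)[:, :3]
```

A network predicting these 6 parameters can thus be evaluated by transforming one surface (e.g. a prostate segmentation) and measuring the residual distance to the other, which is the idea behind a surface registration error.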