Abstract
This thesis focuses on the automatic image analysis of transperineal ultrasound (TPUS) data, which is used to investigate female pelvic floor problems. These problems have a high prevalence, but the understanding of pelvic floor (dys)function is limited. TPUS analysis of the pelvic floor is currently done manually, which is time-consuming and observer-dependent. This hinders both research into the interpretation of TPUS data and its clinical use. To overcome these problems, we use automatic image analysis.
Currently, one of the main methods used to analyse TPUS data is manually selecting and segmenting the slice of minimal hiatal dimensions (SMHD). In the first chapter of this thesis, we show that reliable automatic segmentation of the urogenital hiatus and the puborectalis muscle in the SMHD can be implemented using deep learning. Furthermore, we show that this approach can also be used to automate the selection and segmentation of the SMHD.
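To give a concrete impression of this kind of deep-learning segmentation, the following is a minimal sketch only: a toy encoder-decoder CNN with a soft Dice loss, written in PyTorch. The architecture, layer sizes, class count, and names (`TinySegNet`, `dice_loss`) are illustrative assumptions and do not reproduce the network used in the thesis.

```python
# Minimal sketch (not the thesis's actual network): a small encoder-decoder CNN
# that could segment the urogenital hiatus (UH) and puborectalis muscle (PRM)
# in a 2D ultrasound slice. All layer sizes and names are illustrative assumptions.
import torch
import torch.nn as nn

class TinySegNet(nn.Module):
    def __init__(self, in_channels=1, num_classes=3):  # background, UH, PRM
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(in_channels, 16, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False),
            nn.Conv2d(32, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, num_classes, 1),  # per-pixel class logits
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

def dice_loss(logits, target, eps=1e-6):
    """Soft Dice loss over one-hot targets; a common choice for segmentation."""
    probs = torch.softmax(logits, dim=1)
    one_hot = nn.functional.one_hot(target, probs.shape[1]).permute(0, 3, 1, 2).float()
    intersection = (probs * one_hot).sum(dim=(2, 3))
    union = probs.sum(dim=(2, 3)) + one_hot.sum(dim=(2, 3))
    return 1 - ((2 * intersection + eps) / (union + eps)).mean()

# Toy forward/backward pass on random data, just to show the shape of a training step.
model = TinySegNet()
images = torch.randn(2, 1, 128, 128)           # batch of 2D TPUS slices
labels = torch.randint(0, 3, (2, 128, 128))    # per-pixel class labels
loss = dice_loss(model(images), labels)
loss.backward()
```

In practice such a model would be trained on manually segmented SMHD slices and evaluated against expert observers; the sketch only shows the general supervised-segmentation setup.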
4D TPUS is available in clinical practice, but the aforementioned method only provides 1D and 2D parameters. Therefore, the information stored within TPUS data about the 3D appearance and functionality of the pelvic floor muscles is not analysed. In the third chapter of this thesis, we propose a reproducible manual 3D segmentation protocol for the puborectalis muscle. The resulting manual segmentations can be used to train active appearance models and convolutional neural networks, which in turn enable reliable automatic 3D segmentation.
In the fifth chapter, we show that it is possible to identify all subdivisions of the main pelvic floor muscle group, the levator ani muscles, on new TPUS data. In the last chapter, we apply unsupervised deep learning to our data and show that it can be used to classify TPUS data.
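As an illustration of how unsupervised deep learning can support classification without labels, the sketch below trains a toy convolutional autoencoder to reconstruct images and then clusters the learned latent codes. The autoencoder architecture, the 32-dimensional code, and the use of k-means are assumptions made for this example, not the method described in the thesis.

```python
# Minimal sketch (not the thesis's actual pipeline): learn a compact representation
# of TPUS images with a small convolutional autoencoder (no labels), then cluster
# the latent codes to group the images. Shapes and k-means are illustrative choices.
import torch
import torch.nn as nn
from sklearn.cluster import KMeans

class TinyAutoencoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 8, 3, stride=2, padding=1), nn.ReLU(),   # 64 -> 32
            nn.Conv2d(8, 16, 3, stride=2, padding=1), nn.ReLU(),  # 32 -> 16
            nn.Flatten(),
            nn.Linear(16 * 16 * 16, 32),                          # 32-D latent code
        )
        self.decoder = nn.Sequential(
            nn.Linear(32, 16 * 16 * 16), nn.ReLU(),
            nn.Unflatten(1, (16, 16, 16)),
            nn.ConvTranspose2d(16, 8, 4, stride=2, padding=1), nn.ReLU(),  # 16 -> 32
            nn.ConvTranspose2d(8, 1, 4, stride=2, padding=1),              # 32 -> 64
        )

    def forward(self, x):
        z = self.encoder(x)
        return self.decoder(z), z

# Unsupervised training step: reconstruct the input, no labels involved.
model = TinyAutoencoder()
images = torch.randn(8, 1, 64, 64)        # toy batch of TPUS slices
recon, codes = model(images)
loss = nn.functional.mse_loss(recon, images)
loss.backward()

# After training, the latent codes can be grouped (e.g. with k-means)
# to obtain an unsupervised classification of the images.
cluster_ids = KMeans(n_clusters=2, n_init=10).fit_predict(codes.detach().numpy())
```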
The segmentation results presented in this thesis are an important step towards reducing TPUS analysis time and will therefore ease both the study of large populations and clinical TPUS analysis. The 3D identification and segmentation of the levator ani muscle subdivisions helps to determine whether these muscles are still intact, which is an important step towards better-informed clinical decision-making.
Original language | English |
---|---|
Qualification | Doctor of Philosophy |
Awarding Institution | |
Supervisors/Advisors | |
Award date | 7 Jul 2021 |
Place of Publication | Enschede |
Publisher | |
Print ISBNs | 978-94-6419-229-2 |
DOIs | |
Publication status | Published - 7 Jul 2021 |