Deep learning enables automatic quantitative assessment of the puborectalis muscle and the urogenital hiatus in the plane of minimal hiatal dimensions

F. van den Noort (Corresponding Author), C.H. van der Vaart, A.T.M. Bellos-Grob, M.K. van de Waarsenburg, C.H. Slump, Marijn van Stralen

Research output: Contribution to journal › Article › Academic › peer-review

1 Citation (Scopus)
16 Downloads (Pure)

Abstract

OBJECTIVES: To measure the length, width and area of the urogenital hiatus (UH), and the length and mean echo intensity (MEP) of the puborectalis muscle (PRM), automatically and observer-independently, in the plane of minimal hiatal dimensions on transperineal ultrasound (TPUS) images, by automatic segmentation of the UH and the PRM using deep learning.
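The abstract does not describe how the clinical parameters are derived from the segmentations; purely as an illustration, the sketch below shows one plausible way to obtain hiatal area, length and width (via an ellipse fit to the UH mask) and the mean echo intensity of the PRM from binary masks. All names (uh_mask, prm_mask, pixel_spacing_mm) and the ellipse-based length/width definition are assumptions, not the authors' implementation.

```python
# Illustrative sketch only (not the authors' code): derive the clinically
# relevant parameters named in the abstract from binary segmentation masks.
import numpy as np
from skimage.measure import label, regionprops


def hiatal_dimensions(uh_mask: np.ndarray, pixel_spacing_mm: float) -> dict:
    """Approximate UH area, length and width from a binary mask.

    Length and width are taken here as the major/minor axes of an ellipse
    with the same second moments as the mask region; the paper may instead
    define them along anatomical (anteroposterior/transverse) axes.
    """
    region = regionprops(label(uh_mask.astype(np.uint8)))[0]
    return {
        "area_cm2": region.area * pixel_spacing_mm ** 2 / 100.0,
        "length_cm": region.major_axis_length * pixel_spacing_mm / 10.0,
        "width_cm": region.minor_axis_length * pixel_spacing_mm / 10.0,
    }


def mean_echo_intensity(image: np.ndarray, prm_mask: np.ndarray) -> float:
    """Mean grey value of the ultrasound image within the PRM mask (MEP)."""
    return float(image[prm_mask > 0].mean())
```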

METHODS: In 1318 3D/4D TPUS volume datasets, images of the plane of minimal hiatal dimensions were obtained manually and the UH and the PRM were segmented manually. The images came from 253 nulliparae at 12 and 36 weeks of pregnancy, with the PRM at rest, on contraction and on Valsalva. A total of 713 images were used to train a convolutional neural network (CNN) to segment the UH and the PRM automatically in the plane of minimal hiatal dimensions. On the remaining images (test set 1, TS1; 601 images, 4 images excluded), the performance of the CNN was evaluated against the manual segmentations. The performance of the CNN was also tested on 119 images of an independent dataset of 40 nulliparae at 12 weeks of pregnancy, which was acquired and segmented manually by a different observer (TS2; 2 images excluded). Segmentation success was scored manually. Based on the CNN segmentations, the following clinically relevant parameters were measured: the length, width and area of the UH, and the length and mean echo intensity of the PRM. The overlap (Dice similarity index (DSI)) and surface distances (mean absolute distance (MAD) and Hausdorff distance (HDD)) between manual and CNN segmentations were measured to assess their similarity. For the clinically relevant parameters, intraclass correlation coefficients (ICCs) between manual and CNN results were determined.
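For reference, the evaluation measures named above (DSI, MAD and HDD) can be computed from a pair of binary masks as in the minimal sketch below. The symmetric surface-distance convention and the pixel-spacing handling are assumptions; the abstract does not specify the exact definitions used.

```python
# Minimal sketch of the evaluation metrics: Dice similarity index (DSI),
# mean absolute surface distance (MAD) and Hausdorff distance (HDD)
# between two binary 2D segmentation masks.
import numpy as np
from scipy.ndimage import binary_erosion, distance_transform_edt


def dice(a: np.ndarray, b: np.ndarray) -> float:
    a, b = a.astype(bool), b.astype(bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())


def _surface(mask: np.ndarray) -> np.ndarray:
    mask = mask.astype(bool)
    return mask & ~binary_erosion(mask)  # boundary pixels of the mask


def mad_hdd(a: np.ndarray, b: np.ndarray, spacing_mm: float = 1.0):
    """Symmetric mean absolute and maximum (Hausdorff) surface distance."""
    sa, sb = _surface(a), _surface(b)
    dist_to_a = distance_transform_edt(~sa) * spacing_mm  # distance to surface of a
    dist_to_b = distance_transform_edt(~sb) * spacing_mm  # distance to surface of b
    d = np.concatenate([dist_to_b[sa], dist_to_a[sb]])
    return d.mean(), d.max()  # (MAD, HDD)
```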

RESULTS: Fully automatic CNN segmentation was successful in 99.0% and 93.2% of images in TS1 and TS2, respectively. DSI, MAD and HDD showed good overlap and small distances between manual and CNN segmentations in both test sets. This was reflected in the ICC values for the length (0.96 and 0.95, respectively, in TS1 and TS2), width (0.77 and 0.87) and area (0.96 and 0.91) of the UH, the length of the PRM (0.87 and 0.73) and the MEP (0.95 and 0.97), which indicated good to very good agreement.
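The abstract does not state which ICC variant was used; assuming a two-way random-effects model for absolute agreement between the manual and CNN measurements, ICC(2,1), the coefficient over n images and k = 2 "raters" (manual vs CNN) would be

\[
\mathrm{ICC}(2,1) \;=\; \frac{MS_R - MS_E}{MS_R + (k-1)\,MS_E + \tfrac{k}{n}\,(MS_C - MS_E)},
\]

where \(MS_R\), \(MS_C\) and \(MS_E\) are the between-subject, between-rater and residual mean squares from a two-way ANOVA.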

CONCLUSION: Deep learning can be used to segment the PRM and the UH automatically and reliably in 2D, in the plane of minimal hiatal dimensions, of the nulliparous female pelvic floor. These segmentations can be used to measure the following parameters reliably: hiatal dimensions, PRM length and MEP. This article is protected by copyright. All rights reserved.
Original language: English
Pages (from-to): 270-275
Journal: Ultrasound in obstetrics & gynecology
Volume: 54
Issue number: 2
Early online date: 21 Nov 2018
DOI: 10.1002/uog.20181
Publication status: Published - Aug 2019


Keywords

  • UT-Hybrid-D
  • Convolutional Neural Network (CNN)
  • Deep learning
  • Puborectalis muscle
  • Segmentation
  • Transperineal ultrasound
  • Urogenital hiatus

Cite this


Deep learning enables automatic quantitative assessment of the puborectalis muscle and the urogenital hiatus in the plane of minimal hiatal dimensions. / van den Noort, F. (Corresponding Author); van der Vaart, C.H.; Bellos-Grob, A.T.M.; van de Waarsenburg, M.K.; Slump, C.H.; van Stralen, Marijn.

In: Ultrasound in obstetrics & gynecology, Vol. 54, No. 2, 08.2019, p. 270-275.

