Semantic Segmentation of Terrestrial Laser Scans of Railway Catenary Arches: A Use Case Perspective

Bram Ton*, Faizan Ahmed, Jeroen Linssen

*Corresponding author for this work

Research output: Contribution to journal › Article › Academic › peer-review

7 Citations (Scopus)
141 Downloads (Pure)

Abstract

Having access to accurate and recent digital twins of infrastructure assets benefits the renovation, maintenance, condition monitoring, and construction planning of infrastructural projects. There are many cases where such a digital twin does not yet exist, such as for legacy structures. To create such a digital twin, a mobile laser scanner can be used to capture the geometric representation of the structure. With the aid of semantic segmentation, the scene can be decomposed into different object classes. This decomposition can then be used to retrieve CAD models from a CAD library to create an accurate digital twin. This study explores three deep-learning-based models for semantic segmentation of point clouds in a practical real-world setting: PointNet++, SuperPoint Graph, and Point Transformer. This study focuses on the use case of catenary arches of the Dutch railway system in collaboration with Strukton Rail, a major contractor for rail projects. A challenging, varied, high-resolution, and annotated dataset for evaluating point cloud segmentation models in railway settings is presented. The dataset contains 14 individually labelled classes and is the first of its kind to be made publicly available. A modified PointNet++ model achieved the best mean class Intersection over Union (IoU) of 71% for the semantic segmentation task on this new, diverse, and challenging dataset.
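For context, the reported mean class IoU averages the per-class Intersection over Union over the labelled classes. The sketch below is illustrative only and is not the authors' evaluation code; the function name and the 14-class count (taken from the dataset description) are assumptions.

```python
import numpy as np

def mean_class_iou(pred, target, num_classes=14):
    """Mean class IoU for per-point class labels of a point cloud.

    pred, target: 1-D integer arrays of predicted / ground-truth labels.
    Classes absent from both prediction and ground truth are skipped.
    """
    ious = []
    for c in range(num_classes):
        pred_c = pred == c
        target_c = target == c
        union = np.logical_or(pred_c, target_c).sum()
        if union == 0:  # class not present in this scan
            continue
        intersection = np.logical_and(pred_c, target_c).sum()
        ious.append(intersection / union)
    return float(np.mean(ious))

# Illustration on random labels only; real evaluation uses model output.
rng = np.random.default_rng(0)
pred = rng.integers(0, 14, size=100_000)
target = rng.integers(0, 14, size=100_000)
print(f"mean class IoU: {mean_class_iou(pred, target):.3f}")
```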
Original language: English
Article number: 222
Journal: Sensors
Volume: 23
Issue number: 1
Early online date: 26 Dec 2022
DOIs
Publication status: Published - Jan 2023
