Abstract
To reduce the cost of manually annotating training data for supervised classifiers, we propose an automated approach to extract training data for urban objects in six classes: buildings, fences, man-made poles, vegetation, vehicles, and low objects. In this study, two segmentation algorithms are first applied to generate meaningful objects from the non-ground point cloud. We then formulate strict rules to label partial RANSAC (Random Sample Consensus) planes and meaningful objects as training data. The strict rules are built upon semantic knowledge encoded in geometric, eigenvalue, RANSAC-plane, multidimensional-slice, and relative-location features. The accuracy of the strict rule-based (SRB) training data is higher than 98.5 % for buildings, man-made poles, vegetation, and vehicles, and reaches 97.10 % and 94.99 % for low objects and fences, respectively. Finally, we compare the performance of the KPConv and PointNet++ networks trained with SRB and with manually labeled training data to evaluate the effectiveness of our training data. The KPConv overall accuracies using manually labeled and SRB training data are 91.5 % and 86.8 % on the Paris dataset, and 95.6 % and 92.0 % on the Freiburg dataset, respectively. The experiments demonstrate that, when coupled with the two deep learning networks, automatically labeled training data achieve accuracy similar to that of manual labels. SRB training data extraction can therefore alleviate the scarcity of training data and provide a significant advancement in urban point cloud classification, where manual labeling of training data remains a major challenge.
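As an illustration of the eigenvalue features mentioned above, the sketch below (not the authors' implementation) derives the common linearity, planarity, and scattering descriptors from the covariance eigenvalues of each point's local neighborhood; the function name `eigenvalue_features`, the 0.5 m search radius, and the example rule in the closing comment are assumptions for illustration only.

```python
# Minimal sketch of eigenvalue-based point features (illustrative assumptions,
# not the paper's exact feature set or thresholds).
import numpy as np
from scipy.spatial import cKDTree

def eigenvalue_features(points: np.ndarray, radius: float = 0.5) -> np.ndarray:
    """Return per-point (linearity, planarity, scattering) features."""
    tree = cKDTree(points)
    feats = np.zeros((len(points), 3))
    for i, p in enumerate(points):
        idx = tree.query_ball_point(p, r=radius)
        if len(idx) < 4:                       # too few neighbors for a stable covariance
            continue
        nbrs = points[idx] - points[idx].mean(axis=0)
        cov = nbrs.T @ nbrs / len(idx)
        l = np.sort(np.linalg.eigvalsh(cov))[::-1]   # l1 >= l2 >= l3
        l = np.maximum(l, 1e-12)
        feats[i] = [(l[0] - l[1]) / l[0],      # linearity  (e.g. poles, fences)
                    (l[1] - l[2]) / l[0],      # planarity  (e.g. building facades)
                    l[2] / l[0]]               # scattering (e.g. vegetation)
    return feats

# A hypothetical strict rule would then threshold these descriptors, e.g. flag
# highly planar neighborhoods as building-facade candidates (thresholds assumed).
```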
Original language | English |
---|---|
Pages (from-to) | 313-334 |
Number of pages | 22 |
Journal | ISPRS Journal of Photogrammetry and Remote Sensing |
Volume | 195 |
Early online date | 13 Dec 2022 |
DOIs | |
Publication status | Published - Jan 2023 |
Keywords
- Mobile laser scanning
- Urban area
- Meaningful object
- Strict rule-based (SRB)
- Training data
- Deep Learning (DL)