Contour detection for UAV-based cadastral mapping

S.C. Crommelinck, R.H. Bennett, M. Gerke, M.Y. Yang, G. Vosselman

Research output: Contribution to journal › Article › Academic › peer-review

56 Citations (Scopus)
174 Downloads (Pure)

Abstract

Unmanned aerial vehicles (UAVs) provide a flexible and low-cost solution for the acquisition of high-resolution data. The potential of high-resolution UAV imagery to create and update cadastral maps is being increasingly investigated. Existing procedures generally involve substantial fieldwork and many manual processes. Arguably, multiple parts of UAV-based cadastral mapping workflows could be automated. Specifically, as many cadastral boundaries coincide with visible boundaries, they could be extracted automatically using image analysis methods. This study investigates the transferability of gPb contour detection, a state-of-the-art computer vision method, to remotely sensed UAV images and UAV-based cadastral mapping. Results show that the approach is transferable to UAV data and automated cadastral mapping: object contours are comprehensively detected at completeness and correctness rates of up to 80%. The detection quality is optimal when the entire scene is covered with one orthoimage, due to the global optimization of gPb contour detection. However, a balance between high completeness and correctness is hard to achieve, so a combination with area-based segmentation and further object knowledge is proposed. The localization quality exhibits the usual dependency on ground resolution. The approach has the potential to accelerate the process of general boundary delineation during the creation and updating of cadastral maps.
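The completeness and correctness rates mentioned in the abstract are, in boundary-detection studies of this kind, commonly computed with a buffer: a detected contour pixel counts as correct if it lies within a small pixel tolerance of a reference boundary, and a reference boundary pixel counts as found if a detection lies within the same tolerance. The Python sketch below illustrates such a buffer-based evaluation; it is an assumption for illustration only, not the paper's evaluation code, and the array names, the buffer width, and the use of SciPy's Euclidean distance transform are choices made here.

    # Illustrative sketch (not from the paper): buffer-based completeness and
    # correctness of detected boundary pixels against a reference boundary map.
    # `detected` and `reference` are binary numpy arrays of equal shape;
    # `buffer_px` is the pixel tolerance (an assumed parameter).
    import numpy as np
    from scipy.ndimage import distance_transform_edt

    def completeness_correctness(detected, reference, buffer_px=3):
        detected = detected.astype(bool)
        reference = reference.astype(bool)

        # Distance from every pixel to the nearest detected / reference pixel.
        dist_to_detected = distance_transform_edt(~detected)
        dist_to_reference = distance_transform_edt(~reference)

        # Completeness: share of reference boundary pixels matched by a detection.
        matched_ref = dist_to_detected[reference] <= buffer_px
        completeness = matched_ref.mean() if matched_ref.size else 0.0

        # Correctness: share of detected pixels lying near a reference boundary.
        matched_det = dist_to_reference[detected] <= buffer_px
        correctness = matched_det.mean() if matched_det.size else 0.0
        return completeness, correctness

With a ground sampling distance of a few centimetres, a buffer of a few pixels corresponds to a tolerance of roughly a decimetre on the ground, which is why the localization quality reported in the abstract depends on the ground resolution of the orthoimage.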
Original language: English
Article number: 171
Pages (from-to): 1-13
Number of pages: 13
Journal: Remote Sensing
Volume: 9
Issue number: 2
DOIs
Publication status: Published - 2017

Keywords

  • ITC-ISI-JOURNAL-ARTICLE
  • ITC-GOLD
