Detection of radioactive waste sites in the Chornobyl exclusion zone using UAV-based lidar data and multispectral imagery

S. Briechle*, Norbert Molitor, Peter Krzystek, G. Vosselman

*Corresponding author for this work

Research output: Contribution to journal › Article › Academic › peer-review

4 Citations (Scopus)

Abstract

The severe accident at the Chornobyl Nuclear Power Plant (ChNPP) in 1986 resulted in extraordinary contamination of the surrounding territory, which necessitated the creation of the Chornobyl Exclusion Zone (ChEZ). During the accident, liquidation materials contaminated by radioactive fallout (e.g., contaminated soil and trees) were buried in so-called Radioactive Waste Temporary Storage Places (RWTSPs). The exact locations of these burials were not always sufficiently documented. However, for safety management, including eventual remediation works, it is crucial to know their locations and to rely on precise hazard maps. Over the past 34 years, most of these so-called trenches and clamps have been exposed to natural processes. In addition to settlement and erosion, they have become overgrown with dense vegetation. To date, more than 700 burials have been thoroughly investigated, but a large number of burial sites (approximately 300) are still unknown. In the past, numerous burials were identified based on settlement or elevation in the decimeter range and on vegetation anomalies that tend to appear in their immediate vicinity. Nevertheless, conventional detection methods are time-, effort-, and radiation-dose-intensive. Airborne gamma spectrometry and visual ground inspection of morphology and vegetation can provide useful complementary information, but in many cases they are insufficient for precisely localizing unknown burial sites. Therefore, sensor technologies such as UAV-based lidar and multispectral imagery have been identified as potential alternative solutions. This paper presents a novel method to detect radioactive waste sites based on a set of prominent features generated from high-resolution remote sensing data in combination with a random forest (RF) classifier. Initially, we generate a digital terrain model (DTM) and a 3D vegetation map from the data and derive tree-based features, including tree density, tree height, and tree species. Feature subsets compiled from normalized DTM height, fast point feature histograms (FPFH), and lidar metrics are then incorporated. Next, an RF classifier is trained on reference areas defined by visual interpretation of the DTM grid. A backward feature selection strategy reduces the feature space significantly and avoids overfitting. Feature relevance assessment demonstrates that all feature subsets contribute members to the final list of the most prominent features. For three representative study areas, the mean overall accuracy (OA) is 98.2% when using area-wide test data, and Cohen's kappa coefficient ranges from 0.609 to 0.758. Additionally, we demonstrate the transferability of a trained classifier to an adjacent study area (OA = 93.6%, κ = 0.452). As expected, when the classifier is applied to geometrically incorrect and incomplete reference data, which were generated from old maps and orthophotos by visual inspection, the OA decreases significantly to 65.1% (κ = 0.481). Finally, the detection is verified through 38 borings that successfully confirm the existence of previously unknown buried nuclear materials in the classified areas. These results demonstrate that the proposed methodology is applicable to area-wide detection of unknown radioactive biomass burials in the ChEZ.
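The classification pipeline described in the abstract (a random forest classifier trained on a feature set that is pruned by backward feature selection, then evaluated with overall accuracy and Cohen's kappa) can be sketched as follows. This is a minimal, hypothetical illustration using scikit-learn and synthetic data; the feature layout, label generation, and parameter choices are assumptions for demonstration only, not the authors' actual data or implementation.

```python
# Hypothetical sketch: RF classifier + backward feature elimination,
# evaluated with overall accuracy (OA) and Cohen's kappa.
# All data below are synthetic stand-ins, not the paper's data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import RFE
from sklearn.metrics import accuracy_score, cohen_kappa_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Synthetic per-cell feature matrix standing in for normalized DTM
# height, FPFH descriptors, lidar metrics, and tree-based features.
n_cells, n_features = 2000, 20
X = rng.normal(size=(n_cells, n_features))

# Synthetic labels (1 = burial site, 0 = background); the signal is
# concentrated in a few features to mimic a small "prominent" subset.
y = (X[:, 0] + 0.8 * X[:, 3] - 0.5 * X[:, 7]
     + rng.normal(scale=0.5, size=n_cells) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

rf = RandomForestClassifier(n_estimators=200, random_state=0)

# Recursive feature elimination drops the weakest feature per step,
# analogous to the backward selection strategy in the abstract.
selector = RFE(rf, n_features_to_select=5, step=1).fit(X_train, y_train)

y_pred = selector.predict(X_test)
print("OA:", accuracy_score(y_test, y_pred))
print("kappa:", cohen_kappa_score(y_test, y_pred))
print("selected feature indices:", np.flatnonzero(selector.support_))
```

Note that a high OA can coexist with a moderate kappa when the classes are strongly imbalanced, as with sparse burial sites against a large background area, which is why the abstract reports both measures.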
Original language: English
Pages (from-to): 345-362
Number of pages: 18
Journal: ISPRS Journal of Photogrammetry and Remote Sensing
Volume: 167
Early online date: 1 Aug 2020
Publication status: Published - Sep 2020

Keywords

  • UAV
  • Lidar
  • Multispectral imagery
  • Radioactive waste sites
  • 3D vegetation mapping
  • Machine learning
  • ITC-ISI-JOURNAL-ARTICLE

