BTS: a binary tree sampling strategy for object identification based on deep learning

Xianwei Lv, Zhenfeng Shao*, Xiao Huang, Wen Zhou, Dongping Ming, Jiaming Wang, Chengzhuo Tong

*Corresponding author for this work

Research output: Contribution to journal › Article › Academic › peer-review


Object-based convolutional neural networks (OCNNs) have achieved strong performance in land-cover and land-use classification. Studies suggest that the generation of object convolutional positions (OCPs) largely determines the performance of OCNNs: an optimized distribution of OCPs facilitates the identification of segmented objects with irregular shapes. In this study, we propose a morphology-based binary tree sampling (BTS) method that provides a reasonable, effective, and robust strategy for generating evenly distributed OCPs. The proposed BTS algorithm consists of three major steps: 1) calculating the required number of OCPs for each object, 2) dividing a vector object into smaller sub-objects, and 3) generating OCPs from the sub-objects. Taking object identification in land-cover and land-use classification as a case study, we compare the proposed BTS algorithm with competing methods. The results suggest that BTS outperforms all competing methods, as it yields more evenly distributed OCPs that better represent objects and thus lead to higher object identification accuracy. Further experiments suggest that the efficiency of BTS can be improved when multi-threading is applied.
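The three steps above can be illustrated with a minimal sketch. Note this is an assumption-laden reconstruction, not the paper's implementation: the function name `bts_sample`, the pixel-list object representation, and the median-split-along-the-longer-axis criterion are all hypothetical stand-ins, and the paper's formula for the required OCP count per object is not reproduced here (the count is simply passed in).

```python
# Hypothetical sketch of a binary-tree sampling (BTS) strategy.
# All names and the splitting criterion are illustrative assumptions;
# the paper's exact method and OCP-count formula are not reproduced.

def bts_sample(pixels, n_points):
    """Recursively split an object's pixel set into n_points
    sub-objects (a binary tree of median splits along the axis of
    larger spread), then return one OCP per leaf: its centroid."""
    assert 1 <= n_points <= len(pixels)
    leaves = [list(pixels)]
    while len(leaves) < n_points:
        # Always split the largest leaf to keep sub-objects balanced.
        leaves.sort(key=len, reverse=True)
        leaf = leaves.pop(0)
        xs = [p[0] for p in leaf]
        ys = [p[1] for p in leaf]
        # Choose the axis with the larger spatial extent.
        axis = 0 if (max(xs) - min(xs)) >= (max(ys) - min(ys)) else 1
        leaf.sort(key=lambda p: p[axis])
        mid = len(leaf) // 2
        leaves.extend([leaf[:mid], leaf[mid:]])
    # One OCP per sub-object: the leaf centroid.
    return [
        (sum(p[0] for p in leaf) / len(leaf),
         sum(p[1] for p in leaf) / len(leaf))
        for leaf in leaves
    ]

# Example: an irregular (L-shaped) object sampled with 4 OCPs.
obj = [(x, y) for x in range(10) for y in range(3)] + \
      [(x, y) for x in range(3) for y in range(3, 10)]
ocps = bts_sample(obj, 4)
```

Because each split halves the largest remaining sub-object along its longer axis, the resulting OCPs spread across the object's extent rather than clustering near its centroid, which is the property the abstract credits for the improved identification of irregular shapes.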

Original language: English
Pages (from-to): 1-27
Number of pages: 27
Journal: International Journal of Geographical Information Science
Publication status: E-pub ahead of print/First online - 24 Sep 2021


  • convolutional neural network
  • deep learning
  • object convolutional position
  • object identification
  • UT-Hybrid-D


