TY - JOUR
T1 - Semantic-aware unsupervised domain adaptation for height estimation from single-view aerial images
AU - Zhao, Wufan
AU - Persello, C.
AU - Stein, A.
N1 - Funding Information:
We thank the Dutch Cadaster (Kadaster) for their help in data preparation and preprocessing for this study. Wufan Zhao acknowledges the financial support from the China Scholarship Council and the Foundation of Anhui Province Key Laboratory of Physical Geographic Environment, P.R. China (Grant No. 2022PGE012).
Publisher Copyright:
© 2023 The Author(s)
PY - 2023/2
Y1 - 2023/2
N2 - Traditional acquisition of height data to generate normalized digital surface models (nDSMs) of very high spatial resolution is time-consuming and expensive. Height estimation from optical remote sensing images is a more efficient and timely alternative. Recent studies employed supervised learning methods. State-of-the-art computer vision methods, however, overlook semantic consistency during remote sensing image translation and neglect multi-task correlations for specific task learning. To address these problems, this paper proposes a semantic-aware unsupervised domain adaptation method for height estimation. The method consists of image translation and multi-task representation learning for height estimation. We tested the transferability of our method from the ISPRS Potsdam dataset to the Vaihingen dataset and a custom dataset of Enschede. In the image translation task, our method improved the Fréchet Inception Distance (FID) metric by at least 12.8% and 12.1% on the two datasets, respectively. In the height estimation task, our method achieved RMSEs of 3.257 m and 3.875 m, which are at least 0.603 m and 0.072 m lower than the compared unsupervised algorithm, and achieved results competitive with the supervised learning algorithm. Our results show the advantages of the proposed method in height estimation and image translation as compared to alternative strategies. We conclude that adding semantic supervision improves height estimation from single-view orthophotos under unsupervised domain adaptation. It also alleviates the problem of limited access to nDSM data for training the method.
AB - Traditional acquisition of height data to generate normalized digital surface models (nDSMs) of very high spatial resolution is time-consuming and expensive. Height estimation from optical remote sensing images is a more efficient and timely alternative. Recent studies employed supervised learning methods. State-of-the-art computer vision methods, however, overlook semantic consistency during remote sensing image translation and neglect multi-task correlations for specific task learning. To address these problems, this paper proposes a semantic-aware unsupervised domain adaptation method for height estimation. The method consists of image translation and multi-task representation learning for height estimation. We tested the transferability of our method from the ISPRS Potsdam dataset to the Vaihingen dataset and a custom dataset of Enschede. In the image translation task, our method improved the Fréchet Inception Distance (FID) metric by at least 12.8% and 12.1% on the two datasets, respectively. In the height estimation task, our method achieved RMSEs of 3.257 m and 3.875 m, which are at least 0.603 m and 0.072 m lower than the compared unsupervised algorithm, and achieved results competitive with the supervised learning algorithm. Our results show the advantages of the proposed method in height estimation and image translation as compared to alternative strategies. We conclude that adding semantic supervision improves height estimation from single-view orthophotos under unsupervised domain adaptation. It also alleviates the problem of limited access to nDSM data for training the method.
KW - Generative adversarial network
KW - Height estimation
KW - Optical remote sensing imagery
KW - Semantic consistency
KW - Unsupervised domain adaptation
KW - UT-Hybrid-D
KW - ITC-ISI-JOURNAL-ARTICLE
KW - ITC-HYBRID
UR - https://www.scopus.com/pages/publications/85146430031
U2 - 10.1016/j.isprsjprs.2023.01.003
DO - 10.1016/j.isprsjprs.2023.01.003
M3 - Article
AN - SCOPUS:85146430031
SN - 0924-2716
VL - 196
SP - 372
EP - 385
JO - ISPRS Journal of Photogrammetry and Remote Sensing
JF - ISPRS Journal of Photogrammetry and Remote Sensing
ER -