Temporally Consistent Horizon Lines

Florian Kluger, Hanno Ackermann, Michael Yin Yang, Bodo Rosenhahn

Research output: Working paper


Abstract

The horizon line is an important geometric feature for many image processing and scene understanding tasks in computer vision. For instance, in navigation of autonomous vehicles or driver assistance, it can be used to improve 3D reconstruction as well as for semantic interpretation of dynamic environments. While both algorithms and datasets exist for single images, the problem of horizon line estimation from video sequences has received little attention. In this paper, we show how convolutional neural networks are able to utilise the temporal consistency imposed by video sequences in order to increase the accuracy and reduce the variance of horizon line estimates. A novel CNN architecture with an improved residual convolutional LSTM is presented for temporally consistent horizon line estimation. We propose an adaptive loss function that ensures stable training as well as accurate results. Furthermore, we introduce an extension of the KITTI dataset which contains precise horizon line labels for 43,699 images across 72 video sequences. A comprehensive evaluation shows that the proposed approach consistently achieves superior performance compared with existing methods.
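The paper's residual convolutional LSTM is not reproduced here; as a hedged illustration of the underlying idea of temporal consistency, the toy sketch below smooths noisy per-frame horizon estimates (a hypothetical `(offset, angle)` parameterisation: vertical offset in pixels, slope angle in radians) with a simple exponential moving average, which reduces frame-to-frame variance in the same spirit, though far more crudely, than a learned recurrent model.

```python
# Hypothetical illustration only -- NOT the paper's conv-LSTM method.
# A toy exponential moving average over per-frame horizon estimates,
# each given as (offset_px, angle_rad), to show how temporal context
# can reduce the variance of per-frame predictions.

def smooth_horizon(estimates, alpha=0.3):
    """Smooth a sequence of (offset, angle) horizon estimates.

    alpha: weight of the current frame's raw estimate (0 < alpha <= 1);
    smaller alpha trusts the temporal history more.
    """
    smoothed = []
    prev = None
    for offset, angle in estimates:
        if prev is None:
            # First frame: no history yet, take the raw estimate.
            prev = (offset, angle)
        else:
            # Blend the current raw estimate with the running state.
            prev = (alpha * offset + (1 - alpha) * prev[0],
                    alpha * angle + (1 - alpha) * prev[1])
        smoothed.append(prev)
    return smoothed

# Example: noisy per-frame estimates around offset ~100 px, angle ~0 rad.
raw = [(102.0, 0.01), (97.0, -0.02), (103.0, 0.015), (99.0, -0.01)]
print(smooth_horizon(raw))
```

In the paper itself this temporal aggregation is learned end-to-end by the recurrent architecture rather than fixed by a hand-chosen `alpha`.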
Original language: English
Publisher: ArXiv.org
Pages: 1-16
Number of pages: 16
DOIs
Publication status: Published - 9 Jan 2020

Keywords

  • ITC-GOLD
