Autonomous UAV-based 3D-reconstruction of structures for aerial physical interaction

Beril Sirmaçek, Ramy Rashad, Patrick Radl

    Research output: Contribution to conference › Paper › peer-review

    4 Citations (Scopus)

    Abstract

    In most of the scenarios addressed in the literature, there is at least rough prior knowledge about the structure that needs to be investigated. This prior knowledge most often comes as 2D/3D drawings, CAD models, GIS data, or point clouds acquired by laser imaging or photogrammetry. Rough knowledge about the structure makes it possible to pre-segment the 3D environment and to design a flight path before the drone starts the inspection (Mansouri et al., 2018). Even when the drone can re-plan its path during the flight depending on the details inspected on the structure, the algorithms are still not totally unaware of what to expect and approximately where. Herein, we introduce a framework that is able to plan its path on the fly without any prior information about the environment or the structure to be investigated. Stereo cameras placed on the drone enable us to create a rough representation of the 3D environment in real time, in order to determine the object of interest for which a dense and accurate 3D model must eventually be built.
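    As a concrete illustration of the stereo step (this is our sketch, not the authors' implementation), one common way to obtain a rough depth map from a rectified stereo pair is semi-global block matching; the focal length and baseline below are hypothetical placeholders.

        # Illustrative sketch: rough depth from a rectified stereo pair with
        # OpenCV's semi-global block matching. focal_px and baseline_m are
        # hypothetical camera parameters, not values from the paper.
        import cv2
        import numpy as np

        def rough_depth_from_stereo(left_gray, right_gray,
                                    focal_px=700.0, baseline_m=0.12):
            """Return a depth map in metres from a rectified grayscale pair."""
            matcher = cv2.StereoSGBM_create(
                minDisparity=0,
                numDisparities=128,   # must be divisible by 16
                blockSize=5,
            )
            # SGBM returns fixed-point disparities scaled by 16
            disparity = matcher.compute(left_gray, right_gray).astype(np.float32) / 16.0
            disparity[disparity <= 0] = np.nan        # mask invalid matches
            return focal_px * baseline_m / disparity  # Z = f * B / d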

    We have conducted our research and experiments to answer the following main question: in order to generate a complete 3D visual model of an unknown object, is it possible to determine a suitable flight direction on the fly, considering the completeness of the model?

    To generate the rough real-time 3D environment models, we benefit from the robust ORB-SLAM2 approach (Mur-Artal and Tardós, 2017), which extracts visual features (ORBs) and estimates their 3D positions from stereo camera observations. The ORB-SLAM2 algorithm produces a point cloud in which each point corresponds to an ORB feature placed at its correct position within the 3D environment. The initial object segmentation is done by examining the normal vectors of the 3D point cloud and calculating the difference of normals (DoN) (Ioannou et al., 2012) as a cue for separating 3D objects. This 3D segmentation of the ORB-SLAM2 point cloud gives us the approximate 3D appearance of our object of interest. A denser representation of the object can then be generated with photogrammetry while the drone determines the best positions from which to observe the object in order to complete the dense 3D point cloud.
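    To make the DoN cue concrete, the sketch below shows one plausible realisation using Open3D rather than the authors' code; the two support radii and the magnitude threshold are assumed tuning values.

        # Illustrative difference-of-normals (DoN) segmentation with Open3D.
        # r_small, r_large and threshold are hypothetical tuning values.
        import numpy as np
        import open3d as o3d

        def don_segmentation(points, r_small=0.2, r_large=1.0, threshold=0.25):
            """Split points where small- and large-scale surface normals
            disagree, a cue for object boundaries in an unorganised cloud."""
            def normals_at(radius):
                pcd = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(points))
                pcd.estimate_normals(
                    o3d.geometry.KDTreeSearchParamHybrid(radius=radius, max_nn=64))
                pcd.orient_normals_consistent_tangent_plane(k=15)
                return np.asarray(pcd.normals)

            # DoN operator: half the difference of normals at two support radii
            don = 0.5 * (normals_at(r_small) - normals_at(r_large))
            magnitude = np.linalg.norm(don, axis=1)
            # Large DoN magnitude marks scale-salient structure (object edges)
            return points[magnitude > threshold], points[magnitude <= threshold]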
    The best positions for the drone are determined by calculating a 3D entropy map from the point cloud of the structure. The lowest-entropy positions are assumed to be the locations the drone should view from a certain distance and from slightly different angles, in order to add more points to the point cloud. The flight re-planning and point-cloud densification steps are repeated until the lowest-entropy points reach a satisfactory value (a pre-determined threshold). In this way, we guarantee that the 3D model has no holes or locally sparse regions.
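    The paper defers the mathematics of the entropy map to the full manuscript, so the following is only our guess at one plausible realisation: voxelise the cloud, score each occupied voxel by the Shannon entropy of the point-count distribution over its neighbourhood, and return the lowest-scoring voxel centres as candidate re-observation targets. Voxel size and target count are assumptions.

        # Hypothetical sketch of the entropy-map idea (not the authors' method).
        import numpy as np

        def low_entropy_targets(points, voxel=0.5, n_targets=5):
            idx = np.floor(points / voxel).astype(int)
            keys, counts = np.unique(idx, axis=0, return_counts=True)
            occupied = {tuple(k): c for k, c in zip(keys, counts)}

            offsets = [np.array(o) for o in np.ndindex(3, 3, 3)]
            scores = []
            for k in occupied:
                # point-count distribution over the 3x3x3 voxel neighbourhood
                neigh = np.array([occupied.get(tuple(np.array(k) + o - 1), 0)
                                  for o in offsets], dtype=float)
                p = neigh / neigh.sum()
                p = p[p > 0]
                scores.append(-(p * np.log(p)).sum())  # Shannon entropy

            order = np.argsort(scores)                 # lowest entropy first
            return (keys[order[:n_targets]] + 0.5) * voxel  # voxel centres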
    The workflow of our novel, fully automated and active drone path-planning algorithm is shown in Figure 1. The mathematical implementation details of each algorithm step will be given in the full version of this manuscript.
    We have implemented our algorithms within the ROS framework, and image acquisition is done with a simulated stereo camera mounted on a drone in the Gazebo environment. Our experimental results indicate the high potential of our framework for performing fully automated 3D reconstruction and inspection in environments without prior information. The reconstructed 3D visual model will allow the drone to perform interaction tasks with any complex-shaped body with more autonomy and without prior knowledge of its geometry.
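    For orientation, a minimal ROS 1 (rospy) front end for such a pipeline might subscribe to the synchronised stereo stream published by the Gazebo simulation, as sketched below; the topic names follow common stereo_image_proc conventions and are our assumptions, not details from the paper.

        # Minimal rospy sketch: receive a synchronised stereo pair from Gazebo.
        # Topic names are assumed; in the real pipeline the frames would be
        # fed to ORB-SLAM2 instead of merely logged.
        import rospy
        import message_filters
        from sensor_msgs.msg import Image

        def stereo_callback(left_msg, right_msg):
            rospy.loginfo("stereo pair at t=%.3f",
                          left_msg.header.stamp.to_sec())

        if __name__ == "__main__":
            rospy.init_node("uav_reconstruction_frontend")
            left = message_filters.Subscriber("/stereo/left/image_rect", Image)
            right = message_filters.Subscriber("/stereo/right/image_rect", Image)
            # Approximate sync tolerates small timestamp offsets between cameras
            sync = message_filters.ApproximateTimeSynchronizer(
                [left, right], queue_size=10, slop=0.02)
            sync.registerCallback(stereo_callback)
            rospy.spin()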
    Original language: English
    Pages: 601-605
    Number of pages: 5
    Publication status: Published - 4 Jun 2019
    Event: International Conference on Unmanned Aerial Vehicles in Geomatics 2019, Enschede, Netherlands
    Duration: 10 Jun 2019 – 14 Jun 2019

    Conference

    Conference: International Conference on Unmanned Aerial Vehicles in Geomatics 2019
    Abbreviated title: UAV-g 2019
    Country/Territory: Netherlands
    City: Enschede
    Period: 10/06/19 – 14/06/19
