Indoor 3D modeling and flexible space subdivision from point clouds

S. Nikoohemat, A. Diakité, S. Zlatanova, G. Vosselman

Research output: Chapter in Book/Report/Conference proceeding › Chapter › Academic › peer-review


Abstract

Indoor navigation can be a tedious process in a complex and unknown environment. It becomes even more critical when first responders have to intervene in a large building after a disaster has occurred. For such cases, an accurate map of the building is among the best possible supports. Unfortunately, such a map is not always available, or it is often outdated and imprecise, leading to error-prone decisions. Thanks to advances in laser scanning, accurate 3D maps can be built in a relatively short amount of time using all sorts of laser scanners (stationary, mobile, drone-mounted), although the information they provide is generally an unstructured point cloud. While most existing approaches process the point cloud extensively in order to produce an accurate architectural model of the scanned building, similar to a Building Information Model (BIM), we have adopted a space-focused approach. This paper presents our framework, which starts from point clouds of complex indoor environments, performs advanced processing to identify the 3D structures critical to navigation and path planning, and provides fine-grained navigation networks that account for obstacles and the spatial accessibility of the navigating agents. The method involves generating a volumetric-wall vector model from the point cloud, identifying the obstacles and extracting the navigable 3D spaces. Our work contributes a new approach to space subdivision that does not require laser scanner positions or viewpoints. Unlike 2D cell decomposition or binary space partitioning, this work introduces a space enclosure method to deal with 3D space extraction and non-Manhattan World architecture. The results show that more than 90% of the spaces are correctly extracted. The approach is tested on several real buildings and relies on the latest advances in indoor navigation.
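As an illustration only (this is not the authors' volumetric-wall pipeline), the Python sketch below shows one simple way a space-enclosure step can be prototyped: wall points are rasterised into a voxel occupancy grid and the remaining free voxels are split into connected 3D regions, each a candidate space. The function name enclose_spaces, the voxel size, and the toy two-room scene are assumptions made for this example; numpy and scipy are assumed to be available.

```python
# Hypothetical sketch (not the paper's volumetric-wall method): rasterise
# wall points into a voxel occupancy grid and label the remaining free
# voxels as connected 3D regions, i.e. candidate "spaces".
import numpy as np
from scipy import ndimage


def enclose_spaces(wall_points: np.ndarray, voxel_size: float = 0.2):
    """Return a 3D label grid of free-space regions bounded by wall voxels."""
    mins = wall_points.min(axis=0)
    idx = np.floor((wall_points - mins) / voxel_size).astype(int)
    dims = idx.max(axis=0) + 1
    occupied = np.zeros(dims, dtype=bool)
    occupied[tuple(idx.T)] = True           # voxels hit by wall points
    free = ~occupied                        # candidate navigable voxels
    labels, n_spaces = ndimage.label(free)  # 6-connected 3D components
    return labels, n_spaces


if __name__ == "__main__":
    # Toy scene (assumed for the example): a floor plus one interior wall at
    # x = 2 m splitting a 4 m x 4 m footprint into two rooms.
    yy, zz = np.meshgrid(np.arange(0, 4, 0.1), np.arange(0, 3, 0.1), indexing="ij")
    wall = np.column_stack([np.full(yy.size, 2.0), yy.ravel(), zz.ravel()])
    xx, yy2 = np.meshgrid(np.arange(0, 4, 0.1), np.arange(0, 4, 0.1), indexing="ij")
    floor = np.column_stack([xx.ravel(), yy2.ravel(), np.zeros(xx.size)])

    labels, n = enclose_spaces(np.vstack([wall, floor]))
    print(f"extracted {n} candidate spaces")  # two rooms -> 2 spaces
```

In line with the abstract, the subdivision in this sketch is driven purely by the geometry that bounds the spaces, with no use of scanner positions or viewpoints; the paper's framework instead derives the enclosure from a volumetric-wall vector model rather than a coarse voxel grid.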
Original language: English
Title of host publication: ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences
Subtitle of host publication: ISPRS Geospatial Week 2019
Editors: G. Vosselman, S.J. Oude Elberink, M.Y. Yang
Place of publication: Enschede
Publisher: International Society for Photogrammetry and Remote Sensing (ISPRS)
Pages: 285-292
Number of pages: 8
Volume: IV
Edition: 2/W5
DOI: 10.5194/isprs-annals-IV-2-W5-285-2019
Publication status: Published - 29 May 2019
Event: 4th ISPRS Geospatial Week 2019 - University of Twente, Enschede, Netherlands
Duration: 10 Jun 2019 - 14 Jun 2019
Conference number: 4
https://www.gsw2019.org/

Publication series

Name: ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences
Publisher: Copernicus
ISSN (Print): 2194-9042

Conference

Conference: 4th ISPRS Geospatial Week 2019
Country: Netherlands
City: Enschede
Period: 10/06/19 - 14/06/19
Internet address: https://www.gsw2019.org/

Keywords

  • ITC-GOLD

Cite this

Nikoohemat, S., Diakité, A., Zlatanova, S., & Vosselman, G. (2019). Indoor 3D modeling and flexible space subdivision from point clouds. In G. Vosselman, S. J. Oude Elberink, & M. Y. Yang (Eds.), ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences: ISPRS Geospatial Week 2019 (2/W5 ed., Vol. IV, pp. 285-292). (ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences). Enschede: International Society for Photogrammetry and Remote Sensing (ISPRS). https://doi.org/10.5194/isprs-annals-IV-2-W5-285-2019