Please use this identifier to cite or link to this item: https://hdl.handle.net/2440/105568
Type: Conference paper
Title: Geometrically consistent plane extraction for dense indoor 3D maps segmentation
Author: Pham, T.
Eich, M.
Reid, I.
Wyeth, G.
Citation: Proceedings of the ... IEEE/RSJ International Conference on Intelligent Robots and Systems. IEEE/RSJ International Conference on Intelligent Robots and Systems, 2016, vol.2016-November, pp.4199-4204
Publisher: IEEE
Issue Date: 2016
ISBN: 9781509037629
ISSN: 2153-0858
2153-0866
Conference Name: IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2016) (9 Oct 2016 - 14 Oct 2016 : Daejeon, SOUTH KOREA)
Statement of Responsibility: Trung T. Pham, Markus Eich, Ian Reid, Gordon Wyeth
Abstract: Modern SLAM systems with a depth sensor are able to reliably reconstruct dense 3D geometric maps of indoor scenes. Representing these maps in terms of meaningful entities is a step towards building semantic maps for autonomous robots. One approach is to segment the 3D maps into semantic objects using Conditional Random Fields (CRF), which requires large 3D ground-truth datasets to train the classification model. Additionally, CRF inference is often computationally expensive. In this paper, we present an unsupervised geometry-based approach for segmenting 3D point clouds into objects and meaningful scene structures. We approximate an input point cloud by an adjacency graph over surface patches, whose edges are then classified as being either on or off. We devise an effective classifier which utilises both global planar surfaces and local surface convexities for edge classification. More importantly, we propose a novel global plane extraction algorithm for robustly discovering the underlying planes in the scene. Our algorithm is able to constrain the extracted planes to be mutually orthogonal or parallel, which usually conforms to human-made indoor environments. We reconstruct 654 3D indoor scenes from NYUv2 sequences to validate the efficiency and effectiveness of our segmentation method.
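The abstract's core idea (an adjacency graph over surface patches whose edges are switched on or off, with connected components forming segments) can be illustrated with a minimal sketch. All names, the dictionary-based patch representation, and the sine-tolerance convexity test below are assumptions for illustration; they are not the authors' classifier, which also uses the globally extracted planes.

```python
# Illustrative sketch: classify edges of a patch adjacency graph by local
# convexity, then take connected components of the "on" edges as segments.
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def is_convex(p1, p2, angle_tol_deg=10.0):
    """Local convexity test between two adjacent patches (a common
    heuristic, not the paper's exact classifier): the junction is convex
    when the normals open away from each other along the centroid line."""
    d = sub(p2["centroid"], p1["centroid"])
    tol = math.sin(math.radians(angle_tol_deg))
    return dot(p1["normal"], d) - dot(p2["normal"], d) < tol

def segment(patches, edges):
    """Union-find over the edges classified 'on' by the convexity test;
    the returned root labels identify the segments."""
    parent = list(range(len(patches)))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i
    for i, j in edges:
        if is_convex(patches[i], patches[j]):
            parent[find(i)] = find(j)
    return [find(i) for i in range(len(patches))]

# Toy scene: two coplanar floor patches and a wall patch that meets the
# floor at a concave (interior) corner, so the wall stays separate.
patches = [
    {"centroid": (0.0, 0.0, 0.0), "normal": (0.0, 0.0, 1.0)},  # floor
    {"centroid": (1.0, 0.0, 0.0), "normal": (0.0, 0.0, 1.0)},  # floor
    {"centroid": (2.0, 0.0, 0.5), "normal": (-1.0, 0.0, 0.0)}, # wall
]
labels = segment(patches, [(0, 1), (1, 2)])
print(labels[0] == labels[1], labels[1] == labels[2])  # True False
```

The coplanar floor patches pass the convexity test (the term difference is zero, within tolerance) and merge, while the concave floor-wall junction fails it, leaving two segments.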
Rights: ©2016 IEEE
DOI: 10.1109/IROS.2016.7759618
Grant ID: http://purl.org/au-research/grants/arc/CE140100016
http://purl.org/au-research/grants/arc/FL130100102
Published version: http://dx.doi.org/10.1109/iros.2016.7759618
Appears in Collections:Aurora harvest 8
Computer Science publications

Files in This Item:
File: RA_hdl_105568.pdf (Restricted Access)
Size: 5.36 MB
Format: Adobe PDF


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.