Publication Type

Conference Proceeding Article

Version

publishedVersion

Publication Date

8-2017

Abstract

The parsing of building facades is a key component of 3D street scene reconstruction, a long-standing goal in computer vision. In this paper, we propose a deep learning based method for segmenting a facade into semantic categories. Man-made structures often exhibit symmetry. Based on this observation, we propose a symmetric regularizer for training the neural network. Our method can exploit both the power of deep neural networks and the structure of man-made architectures. We also propose a method to refine the segmentation results using bounding boxes generated by a Region Proposal Network. We test our method by training an FCN-8s network with the novel loss function. Experimental results show that our method significantly outperforms previous state-of-the-art methods on both the ECP and eTRIMS datasets. To the best of our knowledge, we are the first to apply an end-to-end deep convolutional neural network at full image scale to the task of building facade parsing.
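
The abstract does not give the exact form of the symmetric regularizer, so the following is only a minimal sketch of the general idea, assuming (hypothetically) that the regularizer penalizes disagreement between the predicted class probabilities of a facade and their horizontally mirrored counterparts, added on top of the usual per-pixel cross-entropy used to train an FCN-style segmentation network. All names, the weighting factor lam, and the class count are illustrative assumptions, not the paper's formulation.

```python
# Hypothetical sketch, not the paper's exact loss: cross-entropy plus a
# symmetry term that encourages the predicted probability map to agree
# with its horizontal mirror image, reflecting the left-right symmetry
# common in building facades.
import torch
import torch.nn.functional as F

def symmetry_regularized_loss(logits, labels, lam=0.1):
    """logits: (N, C, H, W) raw scores; labels: (N, H, W) class indices."""
    # Standard per-pixel cross-entropy on the segmentation output.
    ce = F.cross_entropy(logits, labels)
    # Predicted class probabilities and their horizontally flipped copy.
    probs = F.softmax(logits, dim=1)
    mirrored = torch.flip(probs, dims=[3])  # flip along the width axis
    # Penalize disagreement between the prediction and its mirror image.
    sym = F.mse_loss(probs, mirrored)
    return ce + lam * sym

# Usage example with random tensors standing in for an FCN-8s output.
if __name__ == "__main__":
    logits = torch.randn(2, 9, 64, 64, requires_grad=True)  # 9 classes: illustrative only
    labels = torch.randint(0, 9, (2, 64, 64))
    loss = symmetry_regularized_loss(logits, labels)
    loss.backward()
    print(float(loss))
```

The weight lam trades off fidelity to the ground-truth labels against the symmetry prior; in practice it would be tuned on a validation set.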

Keywords

Artificial intelligence, Deep neural networks, Facades, Formal languages, Image segmentation, Neural networks, Semantics, Building facades, Convolutional neural network, Learning approach, Learning-based methods, Man-made structures, Segmentation results, Semantic category, State-of-the-art methods, Deep learning

Discipline

Artificial Intelligence and Robotics | Databases and Information Systems | Numerical Analysis and Scientific Computing

Research Areas

Data Science and Engineering

Publication

Proceedings of the 26th International Joint Conference on Artificial Intelligence, IJCAI 2017: Melbourne, Australia, August 19-25

First Page

2301

Last Page

2307

ISBN

9780999241103

Identifier

10.24963/ijcai.2017/320

Publisher

IJCAI

City or Country

San Francisco, CA

Copyright Owner and License

Authors

Additional URL

https://doi.org/10.24963/ijcai.2017/320
