Publication Type

Conference Proceeding Article

Version

acceptedVersion

Publication Date

2-2020

Abstract

Automatic map extraction is of great importance to urban computing and location-based services. Aerial images and GPS trajectories are two different data sources that can be leveraged to generate maps, although they carry different types of information. Most previous works on data fusion between aerial images and data from auxiliary sensors do not fully utilize the information of both modalities and hence suffer from information loss. We propose a deep convolutional neural network called DeepDualMapper, which fuses aerial image and trajectory data in a more seamless manner to extract the digital map. We design a gated fusion module to explicitly control the information flows from both modalities in a complementary-aware manner. Moreover, we propose a novel densely supervised refinement decoder to generate the prediction in a coarse-to-fine way. Our comprehensive experiments demonstrate that DeepDualMapper fuses the information of images and trajectories much more effectively than existing approaches and generates maps with higher accuracy.
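To illustrate the gated fusion idea described in the abstract, below is a minimal sketch, assuming a standard PyTorch setup. It is not the authors' released code; the class name GatedFusion, the 1x1 convolution for gate prediction, and the complementary weighting g * image + (1 - g) * trajectory are illustrative assumptions about how per-location gates could arbitrate between the two modalities' feature maps.

```python
# Hypothetical sketch of a gated fusion block (not the paper's implementation):
# a learned gate decides, per location and channel, how much to trust the image
# branch; the trajectory branch contributes the complement.
import torch
import torch.nn as nn


class GatedFusion(nn.Module):
    def __init__(self, channels: int):
        super().__init__()
        # Gates are predicted from the concatenated features of both modalities.
        self.gate_conv = nn.Conv2d(2 * channels, channels, kernel_size=1)
        self.sigmoid = nn.Sigmoid()

    def forward(self, img_feat: torch.Tensor, traj_feat: torch.Tensor) -> torch.Tensor:
        # g in (0, 1): weight for the image features; (1 - g) weights the trajectory features.
        g = self.sigmoid(self.gate_conv(torch.cat([img_feat, traj_feat], dim=1)))
        return g * img_feat + (1.0 - g) * traj_feat


# Example usage: fuse two 64-channel feature maps of spatial size 128x128.
if __name__ == "__main__":
    fuse = GatedFusion(channels=64)
    img = torch.randn(1, 64, 128, 128)
    traj = torch.randn(1, 64, 128, 128)
    out = fuse(img, traj)
    print(out.shape)  # torch.Size([1, 64, 128, 128])
```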

Keywords

Aerial images, Coarse to fine, Fusion modules, GPS trajectories, Information flows, Information loss, Trajectory data, Urban computing

Discipline

Artificial Intelligence and Robotics | Databases and Information Systems

Research Areas

Data Science and Engineering

Publication

Proceedings of the 34th AAAI Conference on Artificial Intelligence, New York, February 7-12, 2020

First Page

1037

Last Page

1045

ISBN

9781577358350

Identifier

10.1609/aaai.v34i01.5453

Publisher

AAAI Press

City or Country

New York

Copyright Owner and License

Authors

Additional URL

https://doi.org/10.1609/aaai.v34i01.5453
