Publication Type

Conference Proceeding Article

Version

publishedVersion

Publication Date

2-2016

Abstract

Cross-modal hashing (CMH) is an efficient technique for the fast retrieval of web image data, and it has gained a lot of attention recently. However, traditional CMH methods usually apply batch learning to generate hash functions and codes, which makes them inefficient for web images that typically arrive in a streaming fashion. Online learning can be exploited for CMH, but existing online hashing methods still cannot solve two essential problems: efficient updating of hash codes and analysis of cross-modal correlation. In this paper, we propose Online Cross-modal Hashing (OCMH), which effectively addresses both problems by learning shared latent codes (SLC). In OCMH, hash codes can be represented by the permanent SLC and a dynamic transfer matrix. Therefore, the inefficient updating of hash codes is transformed into the efficient updating of the SLC and transfer matrix, and the time complexity is independent of the database size. Moreover, the SLC is shared by all modalities, so it can encode the latent cross-modal correlation, which further improves the overall cross-modal correlation between heterogeneous data. Experimental results on two real-world multi-modal web image datasets, MIR Flickr and NUS-WIDE, demonstrate the effectiveness and efficiency of OCMH for online cross-modal web image retrieval.
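The decomposition the abstract describes can be sketched in a few lines. This is a hypothetical illustration only, assuming a simple `sign(W @ S)` code-generation rule: the matrix names, shapes, and update step are illustrative, not the paper's actual OCMH algorithm. The point it demonstrates is that codes are derived from a permanent shared-latent-code matrix and a small per-modality transfer matrix, so an update touches only the small factors rather than every stored code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (not from the paper):
n, d, c = 1000, 32, 16   # database size, latent dimension, code length

S = rng.standard_normal((d, n))   # permanent shared latent codes, one column per item
W = rng.standard_normal((c, d))   # dynamic transfer matrix for one modality

def hash_codes(W, S):
    # Binary codes are recovered on the fly from the compact factors,
    # rather than stored and rewritten item by item.
    return np.sign(W @ S)

B_old = hash_codes(W, S)

# When streaming data arrives, only the small (c x d) transfer matrix
# is revised (here: a dummy perturbation standing in for learning),
# so the cost of refreshing all n codes does not grow with n.
W_new = W + 0.01 * rng.standard_normal((c, d))
B_new = hash_codes(W_new, S)

print(B_new.shape)  # (16, 1000)
```

Under this sketch, the update cost depends on `c` and `d` but not on the database size `n`, which is the efficiency property the abstract claims for OCMH.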

Keywords

Cross-modal hashing, image retrieval, hash codes, transfer matrix

Discipline

Computer Sciences | Databases and Information Systems

Publication

Proceedings of the Thirtieth AAAI Conference on Artificial Intelligence (AAAI-16): Phoenix, AZ, February 12-17, 2016

First Page

294

Last Page

300

ISBN

9781577357605

Publisher

AAAI Press

City or Country

Palo Alto, CA

Additional URL

http://www.aaai.org/ocs/index.php/AAAI/AAAI16/paper/view/12125