Publication Type
Conference Proceeding Article
Version
publishedVersion
Publication Date
10-2013
Abstract
Our objective is to estimate the relevance of an image to a query for image search purposes. We address two limitations of existing image search engines in this paper. First, there is no straightforward way of bridging the gap between semantic textual queries, together with the users' search intents behind them, and image visual content. Image search engines therefore rely primarily on static and textual features. Visual features are mainly used to identify potentially useful recurrent patterns or relevant training examples for complementing search by image reranking. Second, image rankers are trained on query-image pairs labeled by human experts, making the annotation intellectually expensive and time-consuming. Furthermore, the labels may be subjective when the queries are ambiguous, making it difficult to predict the search intention. We demonstrate that both problems can be mitigated by exploring the use of click-through data, which can be viewed as the footprints of user search behavior, as an effective means of understanding queries. The correspondence between an image and a query is determined by whether the image was searched and clicked by users under that query in a commercial image search engine. We therefore hypothesize that an image's click counts in response to a query serve as an indication of its relevance. For each new image, our proposed graph-based label propagation algorithm employs neighborhood graph search to find the nearest neighbors on an image similarity graph built with visual representations from deep neural networks, and then aggregates their clicked queries and click counts to obtain labels for the new image. We conduct experiments on the MSR-Bing Grand Challenge and the results show consistent performance gains over various baselines. In addition, the proposed approach is very efficient, completing annotation of each query-image pair within just 15 milliseconds on a regular PC.
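To make the propagation step described in the abstract concrete, below is a minimal Python sketch, not the authors' implementation: it assumes toy DNN feature vectors and click logs, approximates the neighborhood graph search with a brute-force top-k cosine-similarity lookup, and uses a similarity-weighted sum of click counts as the aggregation rule. The names (propagate_labels, clicked_features, click_logs) and the parameter k=10 are hypothetical.

import numpy as np
from collections import defaultdict

# Hypothetical toy data: DNN feature vectors for previously clicked images and
# their click-through records {query: click count}. Values are illustrative only.
clicked_features = np.random.rand(1000, 128).astype(np.float32)          # 1000 images, 128-D features
click_logs = [{f"query_{i % 50}": np.random.randint(1, 20)} for i in range(1000)]

def propagate_labels(new_feature, features, logs, k=10):
    """Label a new image by aggregating the clicked queries of its k nearest
    neighbors in DNN feature space, weighting each neighbor's click counts
    by its cosine similarity to the new image."""
    # Cosine similarity between the new image and all clicked images.
    f = new_feature / (np.linalg.norm(new_feature) + 1e-12)
    F = features / (np.linalg.norm(features, axis=1, keepdims=True) + 1e-12)
    sims = F @ f

    # Take the k most similar images (a stand-in for the paper's neighborhood
    # graph search over the image similarity graph).
    nn_idx = np.argsort(-sims)[:k]

    # Aggregate clicked queries / click counts, weighted by similarity.
    scores = defaultdict(float)
    for i in nn_idx:
        for query, count in logs[i].items():
            scores[query] += sims[i] * count

    # Return queries ranked by aggregated score as relevance labels.
    return sorted(scores.items(), key=lambda kv: -kv[1])

new_image_feature = np.random.rand(128).astype(np.float32)
print(propagate_labels(new_image_feature, clicked_features, click_logs)[:5])

The paper's approach performs the neighbor lookup via a precomputed neighborhood graph, which is what keeps per-image annotation within the reported 15 milliseconds; the brute-force similarity scan above is only for illustration.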
Keywords
Click-through data, Deep neural networks, Image search, Neighborhood graph search
Discipline
Databases and Information Systems | Data Storage Systems | Graphics and Human Computer Interfaces
Research Areas
Intelligent Systems and Optimization
Publication
MM '13: Proceedings of the 21st ACM International Conference on Multimedia: October 21-25, Barcelona, Spain
First Page
397
Last Page
400
ISBN
9781450324045
Identifier
10.1145/2502081.2508128
Publisher
ACM
City or Country
Barcelona, Spain
Citation
PAN, Yingwei; YAO, Ting; YANG, Kuiyuan; LI, Houqiang; NGO, Chong-wah; WANG, Jingdong; and MEI, Tao.
Image search by graph-based label propagation with image representation from DNN. (2013). MM '13: Proceedings of the 21st ACM International Conference on Multimedia: October 21-25, Barcelona, Spain. 397-400.
Available at: https://ink.library.smu.edu.sg/sis_research/6459
Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
Included in
Databases and Information Systems Commons, Data Storage Systems Commons, Graphics and Human Computer Interfaces Commons