Publication Type
Conference Proceeding Article
Version
publishedVersion
Publication Date
12-2011
Abstract
Tag-based social image search has attracted great interest, and how to rank the search results by relevance level remains a research problem. Both the tags and the visual content of images have been investigated for this purpose. However, existing methods usually employ tags and visual content separately or sequentially to learn image relevance. This paper proposes a tag-based image search approach with visual-text joint hypergraph learning. We simultaneously investigate the bag-of-words and bag-of-visual-words representations of images and accomplish the relevance estimation with a hypergraph learning approach. Each textual or visual word generates a hyperedge in the constructed hypergraph. We conduct experiments on a real-world dataset, and the results demonstrate the effectiveness of our approach.
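To illustrate the idea described in the abstract, the following is a minimal sketch (not taken from the paper) of joint hypergraph construction and relevance estimation. It assumes the standard transductive hypergraph learning formulation with uniform hyperedge weights and binary word occurrence; the authors' actual weighting scheme and optimization may differ. All function and variable names here are hypothetical.

import numpy as np

def build_incidence(bow, bovw):
    """Stack textual-word and visual-word incidences: images are vertices,
    and every textual or visual word defines one hyperedge."""
    # bow:  (n_images, n_text_words)   binary tag/word occurrence matrix
    # bovw: (n_images, n_visual_words) binary visual-word occurrence matrix
    return np.hstack([bow, bovw]).astype(float)

def hypergraph_relevance(H, y, alpha=0.9):
    """Estimate image relevance scores given an initial label vector y."""
    n_v, n_e = H.shape
    w = np.ones(n_e)                      # uniform hyperedge weights (assumption)
    d_v = H @ w                           # vertex degrees
    d_e = H.sum(axis=0)                   # hyperedge degrees
    Dv_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(d_v, 1e-12)))
    De_inv = np.diag(1.0 / np.maximum(d_e, 1e-12))
    Theta = Dv_inv_sqrt @ H @ np.diag(w) @ De_inv @ H.T @ Dv_inv_sqrt
    # Closed-form solution of the regularized hypergraph learning objective.
    return np.linalg.solve(np.eye(n_v) - alpha * Theta, (1 - alpha) * y)

# Toy usage: 4 images, 3 textual words, 2 visual words; image 0 is the seed.
bow  = np.array([[1, 0, 1], [1, 1, 0], [0, 1, 0], [0, 0, 1]])
bovw = np.array([[1, 0], [1, 0], [0, 1], [0, 1]])
H = build_incidence(bow, bovw)
y = np.array([1.0, 0.0, 0.0, 0.0])
print(hypergraph_relevance(H, y))

Because textual and visual hyperedges are stacked into one incidence matrix, the two modalities are exploited jointly rather than separately or sequentially, which is the point the abstract emphasizes.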
Keywords
hypergraph learning, tag-based image search, visual-text
Discipline
Databases and Information Systems
Publication
MM '11: Proceedings of the 2011 ACM Multimedia Conference: November 28 - December 1, 2011, Scottsdale, AZ, USA
First Page
1517
Last Page
1520
ISBN
9781450306164
Identifier
10.1145/2072298.2072054
Publisher
ACM
City or Country
New York
Citation
GAO, Yue; WANG, Meng; LUAN, Huanbo; SHEN, Jialie; YAN, Shuicheng; and TAO, Dacheng.
Tag-based social image search with visual-text joint hypergraph learning. (2011). MM '11: Proceedings of the 2011 ACM Multimedia Conference: November 28 - December 1, 2011, Scottsdale, AZ, USA. 1517-1520.
Available at: https://ink.library.smu.edu.sg/sis_research/1447
Copyright Owner and License
Authors
Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial-No Derivative Works 4.0 International License.
Additional URL
http://doi.org/10.1145/2072298.2072054