Publication Type

Journal Article

Version

acceptedVersion

Publication Date

8-2010

Abstract

With the proliferation of Web 2.0 applications, user-supplied social tags are commonly available in social media as a means to bridge the semantic gap. At the same time, the explosive expansion of the social web has made an overwhelming number of web videos available, among which there exists a large number of near-duplicates. In this paper, we investigate techniques for the effective annotation of web videos from a data-driven perspective. A novel classifier-free video annotation framework is proposed, which first retrieves visual duplicates and then suggests representative tags. The significance of this paper lies in addressing two timely issues for annotating query videos. First, we provide a novel solution for fast near-duplicate video retrieval. Second, based on the outcome of near-duplicate search, we explore the potential for data-driven annotation to succeed when a huge volume of tagged web videos is freely accessible online. Experiments across sources (annotating Google and Yahoo! videos using YouTube videos) and across time periods (annotating YouTube videos using historical data) show the effectiveness and efficiency of the proposed classifier-free approach to web video tag annotation.
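
To make the pipeline described in the abstract concrete, the following is a minimal sketch of classifier-free annotation by tag voting over near-duplicates. The feature vectors, the cosine-similarity measure, the threshold value, and the `annotate`/`cosine_sim` function names are illustrative assumptions; the paper's actual near-duplicate retrieval and tag-suggestion methods are more sophisticated.

```python
from collections import Counter

def cosine_sim(a, b):
    # Cosine similarity between two equal-length feature vectors
    # (a stand-in for whatever visual signature the system uses).
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(y * y for y in b) ** 0.5
    return dot / (na * nb) if na and nb else 0.0

def annotate(query_feat, corpus, sim_threshold=0.9, top_k=5):
    """Suggest tags for a query video by majority vote over its
    near-duplicates in a tagged corpus.

    corpus: list of (feature_vector, tag_list) pairs, e.g. tagged
    YouTube videos crawled from the web.
    """
    votes = Counter()
    for feat, tags in corpus:
        # Treat any sufficiently similar corpus video as a near-duplicate
        # and let its user-supplied tags vote for the query video.
        if cosine_sim(query_feat, feat) >= sim_threshold:
            votes.update(tags)
    # Return the most frequently voted tags as the suggested annotation.
    return [tag for tag, _ in votes.most_common(top_k)]

# Toy usage: the first two corpus videos are near-duplicates of the
# query, so their shared tags win the vote.
corpus = [
    ([0.90, 0.10, 0.00], ["soccer", "goal", "sports"]),
    ([0.88, 0.12, 0.02], ["soccer", "goal"]),
    ([0.00, 1.00, 0.30], ["cat", "funny"]),
]
print(annotate([0.90, 0.10, 0.01], corpus))  # ['soccer', 'goal', 'sports']
```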

Keywords

Data-driven, near-duplicate video search, video annotation, web video

Discipline

Data Storage Systems | Graphics and Human Computer Interfaces

Research Areas

Intelligent Systems and Optimization

Publication

IEEE Transactions on Multimedia

Volume

12

Issue

5

First Page

448

Last Page

461

ISSN

1520-9210

Identifier

10.1109/TMM.2010.2050651

Publisher

Institute of Electrical and Electronics Engineers
