Publication Type

Conference Proceeding Article

Version

Accepted Version

Publication Date

November 2011

Abstract

Traditional multiple kernel learning (MKL) algorithms are essentially supervised, in the sense that the kernel learning task requires the class labels of the training data. However, class labels may not always be available prior to kernel learning in some real-world scenarios, e.g., as an early preprocessing step of a classification task or in an unsupervised learning task such as dimension reduction. In this paper, we investigate the problem of Unsupervised Multiple Kernel Learning (UMKL), which does not require the class labels of the training data, as a conventional multiple kernel learning task does. Since a kernel essentially defines pairwise similarity between any two examples, our unsupervised kernel learning method follows two intuitive principles: (1) a good kernel should allow every example to be well reconstructed from its localized bases weighted by the kernel values; (2) a good kernel should induce kernel values that coincide with the local geometry of the data. We formulate unsupervised multiple kernel learning as an optimization task and propose an efficient alternating optimization algorithm to solve it. Empirical results on both classification and dimension reduction tasks validate the efficacy of the proposed UMKL algorithm.
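
The abstract's two principles admit a compact formulation. The following is a minimal sketch only, not the paper's verbatim notation; the symbols B_i, D, and the trade-off parameters \gamma_1, \gamma_2 are assumptions introduced for illustration. Given base kernels k_1, ..., k_m combined as k = \sum_t \mu_t k_t with \mu on the probability simplex \Delta, let B_i denote a set of localized bases for example x_i and D = [d_{ij}] the reconstruction coefficients. Principle (1) penalizes reconstruction error, and principle (2) penalizes large kernel values between geometrically distant points, suggesting an objective of roughly the form

\min_{\mu \in \Delta,\, D} \; \frac{1}{2} \sum_{i=1}^{n} \Big\| \mathbf{x}_i - \sum_{j \in B_i} k(\mathbf{x}_i, \mathbf{x}_j)\, d_{ij}\, \mathbf{x}_j \Big\|^2 + \gamma_1 \sum_{i=1}^{n} \sum_{j \in B_i} k(\mathbf{x}_i, \mathbf{x}_j)\, \| \mathbf{x}_i - \mathbf{x}_j \|^2 + \gamma_2 \sum_{i,j} |d_{ij}|

where \gamma_1, \gamma_2 \ge 0 trade off locality and sparsity. The alternating optimization the abstract mentions would then fix D and solve for \mu, fix \mu and solve for D, and repeat until convergence; under this sketch, each subproblem is convex in the variable being optimized.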

Discipline

Computer Sciences | Databases and Information Systems

Publication

JMLR: Workshop and Conference Proceedings: 3rd Asian Conference on Machine Learning (ACML 2011), November 13-15, 2011, Taoyuan, Taiwan

Volume

20

First Page

129

Last Page

144

ISSN

1532-4435

Publisher

JMLR

City or Country

Cambridge, MA

Additional URL

http://jmlr.csail.mit.edu/proceedings/papers/v20/zhuang11/zhuang11.pdf
