Publication Type

Journal Article

Version

publishedVersion

Publication Date

1-2018

Abstract

Recently, many dictionary learning methods have been proposed and successfully applied. However, most of them assume that the noise in data is drawn from a Gaussian or Laplacian distribution and therefore typically adopt the ℓ2 or ℓ1 norm to characterize these two kinds of noise, respectively. Since this assumption is often inconsistent with real-world data, the performance of these methods is limited. In this paper, we propose a novel dictionary learning with structured noise (DLSN) method for handling noisy data. We decompose the original data into three parts: clean data, structured noise, and Gaussian noise, and then characterize each part separately. We utilize the low-rank technique to preserve the inherent subspace structure of the clean data. Instead of fitting the real noise distribution with a predefined distribution alone, we learn an adaptive dictionary to characterize the structured noise and employ the ℓ2 norm to model the Gaussian noise. Such a mechanism characterizes noise more precisely. We also prove that the proposed optimization method converges to a critical point and that the convergence rate is at least sublinear. Experimental results on the data clustering task demonstrate the effectiveness and robustness of our method.
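
A plausible sketch of the three-part decomposition described in the abstract is given below. The notation (X for the observed data, A for the low-rank clean component, D and S for the structured-noise dictionary and its codes, E for Gaussian noise, and the regularization weights) is an illustrative assumption, not the paper's exact formulation:

% Illustrative formulation only; variable names and penalties are assumed.
% X : observed data matrix
% A : clean data, encouraged to be low rank via the nuclear norm
% D : adaptive dictionary for structured noise, S : its (sparse) codes
% E : Gaussian noise, penalized with the squared Frobenius (ℓ2) norm
\min_{A,\, D,\, S,\, E}\;
  \lVert A \rVert_* \;+\; \lambda_1 \lVert S \rVert_1 \;+\; \lambda_2 \lVert E \rVert_F^2
\quad \text{s.t.} \quad X = A + D S + E

In this kind of model, the nuclear norm preserves the subspace structure of the clean data, while the learned dictionary D absorbs structured noise that a single predefined norm could not capture.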

Keywords

Dictionary learning, Structured noise, Low rank representation, Sparse representation

Discipline

Artificial Intelligence and Robotics | Graphics and Human Computer Interfaces

Areas of Excellence

Digital transformation

Publication

Neurocomputing

Volume

273

First Page

414

Last Page

423

ISSN

0925-2312

Identifier

10.1016/J.NEUCOM.2017.07.041

Publisher

Elsevier

Additional URL

https://doi.org/10.1016/J.NEUCOM.2017.07.041
