K-SVD meets transform learning: Transform K-SVD

Ender M. Eksioglu, Ozden Bayir

Research output: Contribution to journal › Article › peer-review

45 Citations (Scopus)

Abstract

Recently there has been increasing attention directed towards analysis sparsity models. Consequently, there is a quest for learning operators which enable analysis sparse representations for the signals at hand. Analysis operator learning algorithms such as Analysis K-SVD have been proposed. Sparsifying transform learning is a paradigm similar to analysis operator learning, but the two differ in some subtle points. In this paper, we propose a novel transform operator learning algorithm called the Transform K-SVD, which brings together transform learning and K-SVD based analysis dictionary learning. The proposed Transform K-SVD has the important advantage that the sparse coding step of Analysis K-SVD is replaced with the simple thresholding step of the transform learning framework. We show that the Transform K-SVD learns operators which are similar in both appearance and performance to those learned by Analysis K-SVD, while its computational complexity remains much lower than that of Analysis K-SVD.
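The computational saving highlighted in the abstract comes from replacing an explicit sparse coding solve with a simple thresholding of the transformed signals. The sketch below illustrates that generic thresholding step as used in sparsifying transform learning; the function name, variables (W, X, s), and the keep-s-largest rule are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

def transform_sparse_code(W, X, s):
    """Sketch of the transform-domain sparse coding step: apply the learned
    transform W to the signals X and keep only the s largest-magnitude
    coefficients in each column. Illustrative only, not the authors' code."""
    Z = W @ X                                   # transform every signal (column of X)
    # indices of all but the s largest-magnitude entries per column
    idx = np.argsort(np.abs(Z), axis=0)[:-s, :]
    np.put_along_axis(Z, idx, 0.0, axis=0)      # zero the small coefficients
    return Z

# Toy usage: random square transform and signals, sparsity level 3 per signal.
rng = np.random.default_rng(0)
W = rng.standard_normal((64, 64))
X = rng.standard_normal((64, 100))
Z = transform_sparse_code(W, X, s=3)
```

Because this step is a matrix product followed by per-column hard thresholding, it avoids the iterative pursuit required in the analysis sparse coding step, which is where the reduced complexity comes from.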

Original language: English
Article number: 6727427
Pages (from-to): 347-351
Number of pages: 5
Journal: IEEE Signal Processing Letters
Volume: 21
Issue number: 3
DOIs
Publication status: Published - Mar 2014

Keywords

  • Analysis operator learning
  • dictionary learning
  • sparse representation
  • sparsifying transform learning
