Full Text

Abstract

Sparse representation has recently attracted enormous interest in the field of image super-resolution. Sparsity-based methods usually train a single pair of global dictionaries. A single global pair, however, cannot sparsely represent every kind of image patch well, because it neglects two of the most important image features: edge and direction. In this paper, we propose to train two novel pairs of Direction and Edge dictionaries for super-resolution. For single-image super-resolution, the training image patches are divided into two clusters by two new templates that capture direction and edge features, respectively. For each cluster, a pair of Direction and Edge dictionaries is learned, and sparse coding over these dictionaries realizes super-resolution. This single-image method restores faithful high-frequency details, while projection onto convex sets (POCS) conveniently incorporates additional constraints and priors; we therefore combine the two methods to realize multiframe super-resolution. Extensive super-resolution experiments validate the generality, effectiveness, and robustness of the proposed method, and the results demonstrate that it recovers better edge structure and finer details.
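To make the pipeline concrete, below is a minimal sketch of the reconstruction step the abstract outlines: each low-resolution (LR) patch is assigned to the edge or direction cluster and then sparse-coded over that cluster's coupled LR/HR dictionary pair. The cluster test, all names, and the random dictionaries are illustrative assumptions, not the paper's actual templates or learned dictionaries.

```python
# A minimal illustrative sketch (not the authors' code) of the patch-wise
# pipeline described in the abstract. classify_patch, its gradient threshold,
# and the random dictionaries below are assumptions; the paper's actual
# direction/edge templates and learned dictionaries differ.
import numpy as np
from sklearn.linear_model import orthogonal_mp

def classify_patch(patch, threshold=10.0):
    # Hypothetical stand-in for the paper's two templates: send patches with
    # strong gradient magnitude to the "edge" cluster, the rest to "direction".
    gy, gx = np.gradient(patch.astype(float))
    return "edge" if np.hypot(gy, gx).mean() > threshold else "direction"

def reconstruct_patch(lr_patch, dictionaries, scale=2, sparsity=3):
    # Sparse-code the LR patch over its cluster's LR dictionary Dl, then
    # synthesize the HR patch from the coupled HR dictionary Dh. Each entry of
    # `dictionaries` maps a cluster name to a (Dl, Dh) pair whose columns are
    # vectorized LR / HR patch atoms, assumed learned offline (e.g., by K-SVD).
    Dl, Dh = dictionaries[classify_patch(lr_patch)]
    alpha = orthogonal_mp(Dl, lr_patch.ravel(), n_nonzero_coefs=sparsity)
    n = lr_patch.shape[0] * scale
    return (Dh @ alpha).reshape(n, n)

# Toy usage with random dictionaries standing in for learned ones:
# 5x5 LR patches (25-dim atoms), 2x upscaling (10x10 HR patches, 100-dim atoms).
rng = np.random.default_rng(0)
dicts = {c: (rng.standard_normal((25, 64)), rng.standard_normal((100, 64)))
         for c in ("edge", "direction")}
hr_patch = reconstruct_patch(rng.standard_normal((5, 5)), dicts)
print(hr_patch.shape)  # (10, 10)
```

In the multiframe case, the abstract pairs this with POCS, which would project each estimate onto constraint sets derived from the observed frames; that step is omitted from the sketch above.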

Details

Title
Image Super-Resolution Based on Sparse Representation via Direction and Edge Dictionaries
Author
Zhu, Xuan; Wang, Xianxian; Wang, Jun; Peng, Jin; Liu, Li; Mei, Dongfeng
Publication year
2017
Publication date
2017
Publisher
John Wiley & Sons, Inc.
ISSN
1024-123X
e-ISSN
1563-5147
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
1917271474
Copyright
Copyright © 2017 Xuan Zhu et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.