Title:
PSCSC-Net: A Deep Coupled Convolutional Sparse Coding Network for Pansharpening.
Source:
IEEE Transactions on Geoscience & Remote Sensing, Jan 2022, Vol. 60, Issue 1, p1-16. 16p.
Database:
Business Source Premier

Given a low-resolution multispectral (MS) image and a high-resolution panchromatic image, the task of pansharpening is to generate a high-resolution MS image. Deep learning (DL)-based methods have received extensive attention recently. Unlike existing DL-based methods, this article proposes a novel deep neural network for pansharpening inspired by the learned iterative soft-thresholding algorithm (LISTA). First, a coupled convolutional sparse coding-based pansharpening (PSCSC) model and a corresponding traditional optimization algorithm are proposed. Then, following the procedure of the traditional algorithm for solving PSCSC, an interpretable end-to-end deep pansharpening network is developed using a deep unfolding strategy. The designed deep architecture can also be interpreted as a details injection (DI)-based scheme. This work thus offers a solution that integrates the DL-, DI-, and variational optimization-based schemes into a single framework. Experimental results on reduced- and full-scale datasets demonstrate that the proposed deep pansharpening network outperforms popular traditional methods and some current DL-based methods. [ABSTRACT FROM AUTHOR]
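
The deep unfolding idea the abstract refers to turns each iteration of the soft-thresholding algorithm into a trainable layer, in the spirit of LISTA. The following is a minimal, single-branch PyTorch sketch of one such unfolded stage, not the authors' coupled PSCSC architecture: the module name, channel width, and learned per-channel threshold are assumptions made here purely for illustration.

    import torch
    import torch.nn as nn

    def soft_threshold(x, theta):
        # Elementwise soft-thresholding: sign(x) * max(|x| - theta, 0)
        return torch.sign(x) * torch.clamp(torch.abs(x) - theta, min=0.0)

    class UnfoldedISTAStage(nn.Module):
        # One unfolded ISTA iteration with learnable convolutional operators
        # (hypothetical illustration; a full unfolded network stacks several
        # such stages and trains them end to end).
        def __init__(self, channels=32):
            super().__init__()
            # Learnable stand-ins for the convolutional dictionary and its
            # (scaled) adjoint used in the gradient step
            self.synthesis = nn.Conv2d(channels, channels, 3, padding=1)
            self.analysis = nn.Conv2d(channels, channels, 3, padding=1)
            # Learned per-channel threshold, analogous to a sparsity weight
            self.theta = nn.Parameter(torch.full((1, channels, 1, 1), 0.01))

        def forward(self, z, y):
            # z <- soft(z - analysis(synthesis(z) - y), theta):
            # a gradient step on the data term followed by shrinkage
            grad = self.analysis(self.synthesis(z) - y)
            return soft_threshold(z - grad, self.theta)

Stacking several such stages, initializing z to zeros, and training the whole chain end to end yields the kind of interpretable unfolded network the abstract describes; the coupled PSCSC model additionally ties the panchromatic and MS branches together, which this single-branch sketch omits.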