
A variational method for multisource remote-sensing image fusion


With the increasing availability of multisource image data from Earth observation satellites, image fusion, a technique that produces a single image preserving the major salient features of a set of different inputs, has become an important tool in remote sensing, since complete information usually cannot be obtained by a single sensor. In this article, we develop a new pixel-based variational model for image fusion using gradient features. The basic assumption is that the fused image should have a gradient close to the most salient gradient among the multisource inputs. Meanwhile, we integrate the inputs with an average quadratic local dispersion measure to obtain a uniform and natural appearance. Furthermore, we introduce a split Bregman algorithm to minimize the proposed functional more efficiently. To verify the effectiveness of the proposed method, we compare it visually and quantitatively with conventional image fusion schemes such as the Laplacian pyramid, morphological pyramid, and geometry-based enhancement fusion methods. The results demonstrate the effectiveness and stability of the proposed method on standard fusion evaluation benchmarks. In particular, the proposed method is also markedly more computationally efficient than other variational methods.
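
The following is a minimal sketch, in NumPy, of the salient-gradient idea stated in the abstract, under simplifying assumptions not taken from the article: periodic boundary conditions, forward-difference gradients, and a least-squares (FFT Poisson) reconstruction in place of the full variational functional. The dispersion term and the split Bregman minimization are not reproduced, and the function names (salient_gradient, poisson_reconstruct, fuse) are hypothetical.

import numpy as np

def salient_gradient(images):
    # Forward-difference gradients of each co-registered input (periodic wrap).
    gx = np.stack([np.roll(im, -1, axis=1) - im for im in images])
    gy = np.stack([np.roll(im, -1, axis=0) - im for im in images])
    # At every pixel keep the gradient of the source with the largest magnitude.
    idx = np.argmax(gx ** 2 + gy ** 2, axis=0)[None]
    return (np.take_along_axis(gx, idx, axis=0)[0],
            np.take_along_axis(gy, idx, axis=0)[0])

def poisson_reconstruct(gx, gy, mean_level=0.0):
    # Least-squares integration of the target field (gx, gy): solve the
    # discrete Poisson equation in the Fourier domain (periodic boundaries).
    h, w = gx.shape
    div = gx - np.roll(gx, 1, axis=1) + gy - np.roll(gy, 1, axis=0)
    fy = np.fft.fftfreq(h).reshape(-1, 1)
    fx = np.fft.fftfreq(w).reshape(1, -1)
    denom = 2.0 * np.cos(2 * np.pi * fx) + 2.0 * np.cos(2 * np.pi * fy) - 4.0
    denom[0, 0] = 1.0                      # avoid division by zero at the DC term
    u_hat = np.fft.fft2(div) / denom
    u_hat[0, 0] = 0.0                      # mean level is fixed separately below
    u = np.real(np.fft.ifft2(u_hat))
    return u - u.mean() + mean_level

def fuse(images):
    # images: list of co-registered 2-D float arrays on the same grid.
    gx, gy = salient_gradient(images)
    return poisson_reconstruct(gx, gy, mean_level=float(np.mean(images)))

As a usage sketch, two or more co-registered bands scaled to [0, 1] could be fused with fused = fuse([band_a, band_b]); the article's functional additionally balances this gradient term against the local dispersion term, which this snippet omits.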

Document Type: Research Article

Affiliations: 1: Department of Computer Science, East China Normal University, Shanghai, China; 2: Department of Mathematics, East China Normal University, Shanghai, China

Publication date: 10 April 2013
