
Recovery of Soil Moisture Active Passive (SMAP) Instrument's Active Measurements via Coupled Dictionary Learning

NASA's Soil Moisture Active Passive (SMAP) satellite mission combines a passive L-band radiometer with an active Synthetic Aperture Radar (SAR) instrument to monitor near-surface soil moisture and freeze-thaw states globally, with a revisit frequency of 2-3 days. SMAP provides three soil moisture products: a high-resolution product from the radar, a low-resolution product from the radiometer, and an intermediate-resolution product from the fusion of the radar and radiometer measurements. Unfortunately, SMAP's SAR instrument halted its transmissions after a short operating period. To address this limitation, we introduce a novel post-acquisition computational technique that synthesizes the active measurements of SMAP by exploiting the mathematical frameworks of sparse representations and dictionary learning. We propose a coupled dictionary learning model that operates on joint feature spaces composed of active and passive images in order to recover the missing active measurements, and we formulate the resulting optimization problem within the context of the Alternating Direction Method of Multipliers. Our experimental results demonstrate that the proposed approach reconstructs the active measurements with better performance than state-of-the-art coupled dictionary learning techniques.
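The core idea in the abstract can be illustrated with a minimal sketch: learn a single dictionary over stacked passive+active feature vectors so that one shared sparse code explains both modalities, then at synthesis time sparse-code a passive observation against the passive sub-dictionary and multiply the code by the active sub-dictionary. This is only a toy illustration, not the paper's method: it uses greedy orthogonal matching pursuit and a method-of-optimal-directions dictionary update instead of the paper's ADMM formulation, and all function names, dimensions, and parameters below are illustrative assumptions.

```python
import numpy as np

def omp(D, x, k):
    """Greedy orthogonal matching pursuit: sparse code of x over D with <= k nonzeros."""
    residual = x.copy()
    idx = []
    code = np.zeros(D.shape[1])
    for _ in range(k):
        j = int(np.argmax(np.abs(D.T @ residual)))
        if j in idx:            # no new atom improves the fit; stop early
            break
        idx.append(j)
        coef, *_ = np.linalg.lstsq(D[:, idx], x, rcond=None)
        residual = x - D[:, idx] @ coef
    code[idx] = coef
    return code

def train_coupled(Xp, Xa, n_atoms=16, k=3, iters=10, seed=0):
    """Toy coupled dictionary learning: stack passive (Xp) and active (Xa)
    training features column-wise so one sparse code explains both modalities."""
    rng = np.random.default_rng(seed)
    X = np.vstack([Xp, Xa])                       # joint feature space
    D = rng.standard_normal((X.shape[0], n_atoms))
    D /= np.linalg.norm(D, axis=0)
    for _ in range(iters):
        # sparse coding step over the joint dictionary
        Z = np.column_stack([omp(D, x, k) for x in X.T])
        # dictionary update (method of optimal directions), then renormalize atoms
        D = X @ np.linalg.pinv(Z)
        D /= np.linalg.norm(D, axis=0) + 1e-12
    dp = Xp.shape[0]
    return D[:dp], D[dp:]                         # passive and active sub-dictionaries

def synthesize_active(Dp, Da, xp, k=3):
    """Recover a missing active measurement: code the passive observation
    over Dp, then apply the shared code to the active sub-dictionary Da."""
    z = omp(Dp, xp, k)
    return Da @ z
```

The coupling assumption is that the two sub-dictionaries share one code per sample, so a code estimated from the passive modality alone can transfer to the active one; the paper realizes this idea with an ADMM-based formulation rather than the OMP/MOD loop sketched here.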



Document Type: Research Article

Publication date: January 1, 2018

More about this publication?
  • For more than 30 years, the Electronic Imaging Symposium has served the broad community, from academia and industry, that works on imaging science and digital technologies. The Symposium covers the entire imaging science ecosystem, from capture (sensors, cameras) through image processing (image quality, color and appearance) to how we and our surrogate machines see and interpret images. Applications covered include augmented reality, autonomous vehicles, machine vision, data analysis, digital and mobile photography, security, virtual reality, and human vision. IS&T began sole sponsorship of the meeting in 2016. All papers presented at EI's 20+ conferences are open access.

    Please note: For purposes of its Digital Library content, IS&T defines Open Access as papers that will be downloadable in their entirety for free in perpetuity. Copyright restrictions on papers vary; see individual paper for details.
