
Stereo rendering of photorealistic precipitation (Open Access)

The stereoscopic rendering of rain has been previously studied. We extend the behavior and distribution of rainfall to include photorealistic stereo rendering of rain and snow precipitation at video frame rates. We ignore stereo rendering optimization and concentrate on the visual factors necessary to produce photorealistic output. The experimental method uses a series of controlled human experiments in which participants are presented with video clips and still photos of real precipitation. The stimuli vary along three visual factors: particle count, particle size, and motion. The goal is to determine the statistical ranking and importance of these visual factors for producing photorealistic output. The experiments are extended to investigate whether stereo improves photorealism. Additionally, experimental stimuli include post-processing of rendered output to produce variable lighting, glow, and fog effects, to study their impact on photorealism as the stereo camera moves through the scene. The results demonstrate that photorealism is more sensitive to particle count and motion than to particle size. Varying light, glow, and fog effects contribute to photorealism independent of stereo. Future research will exploit the geometric symmetry of the stereoscopic image pairs to render precipitation while maintaining real-time frame rates.
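The three visual factors the study varies — particle count, particle size, and per-frame motion — map naturally onto a particle-field simulation, and the closing remark about stereo symmetry suggests rendering each particle twice with a horizontal disparity. The sketch below is a hypothetical illustration of those ideas, not the authors' renderer; all names, the inverse-depth disparity model, and the wrap-around respawn rule are assumptions for illustration.

```python
import random

class PrecipitationField:
    """Minimal particle-field sketch (hypothetical; not the paper's implementation).

    Exposes the three visual factors studied: particle count, particle
    size, and per-frame motion (fall speed).
    """

    def __init__(self, count, size, fall_speed, bounds=(1.0, 1.0, 1.0), seed=0):
        rng = random.Random(seed)
        self.size = size              # rendered particle size (arbitrary units)
        self.fall_speed = fall_speed  # vertical motion per video frame
        self.bounds = bounds          # (width, height, depth) of the view volume
        # Uniformly scatter `count` particles in the view volume.
        self.particles = [
            [rng.uniform(0, bounds[0]),
             rng.uniform(0, bounds[1]),
             rng.uniform(0, bounds[2])]
            for _ in range(count)
        ]

    def step(self):
        """Advance one frame: particles fall; those leaving the bottom
        respawn at the top so the particle count (and density) stays fixed."""
        for p in self.particles:
            p[1] -= self.fall_speed
            if p[1] < 0.0:
                p[1] += self.bounds[1]


def stereo_offsets(particles, eye_separation=0.06):
    """Screen positions of each particle for a left/right eye pair.

    Illustrates the geometric symmetry of the stereo pair: both views share
    one particle set, shifted horizontally by +/- a disparity that shrinks
    with depth (a simple inverse-depth model, assumed for illustration).
    """
    half = eye_separation / 2.0
    pairs = []
    for x, y, z in particles:
        shift = half / (z + 1.0)  # nearer particles get larger disparity
        pairs.append(((x - shift, y), (x + shift, y)))
    return pairs
```

Because the two views differ only by a symmetric horizontal shift of the same particle positions, the per-frame simulation cost is paid once for both eyes — the symmetry the abstract proposes to exploit for real-time rates.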

Keywords: Photorealistic; Precipitation; Stereo

Document Type: Research Article

Publication date: January 1, 2017

More about this publication?
  • For more than 30 years, the Electronic Imaging Symposium has been serving those in the broad community - from academia and industry - who work on imaging science and digital technologies. The breadth of the Symposium covers the entire imaging science ecosystem, from capture (sensors, cameras) through image processing (image quality, color and appearance) to how we and our surrogate machines see and interpret images. Applications covered include augmented reality, autonomous vehicles, machine vision, data analysis, digital and mobile photography, security, virtual reality, and human vision. IS&T began sole sponsorship of the meeting in 2016. All papers presented at EI's 20+ conferences are open access.

    Please note: For purposes of its Digital Library content, IS&T defines Open Access as papers that will be downloadable in their entirety for free in perpetuity. Copyright restrictions on papers vary; see individual paper for details.

