
Reducing Worst-Case Illumination Estimates for Better Automatic White Balance


Automatic white balancing works quite well on average, but fails badly some of the time, and these failures lead to completely unacceptable images. Can the number, or severity, of such failures be reduced, perhaps at the expense of slightly poorer white balancing on average, so as to increase the overall acceptability of a collection of images? Since the main source of error in automatic white balancing is misidentification of the overall scene illuminant, a new illumination-estimation algorithm is presented that minimizes the high-percentile error of its estimates. The algorithm combines illumination estimates from standard existing algorithms and chromaticity-gamut characteristics of the image as features in a feature space. Illuminant chromaticities are quantized into chromaticity bins. Given a test image of a real scene, its feature vector is computed, and for each chromaticity bin the probability of the illuminant chromaticity falling into that bin, given the feature vector, is estimated. The probability estimation is based on Loftsgaarden-Quesenberry multivariate density function estimation over the feature vectors derived from a set of synthetic training images. Once the probability distribution for a given chromaticity channel has been estimated, the smallest interval likely to contain the right answer with a desired probability (i.e., the smallest chromaticity interval whose probabilities sum to a value greater than or equal to the desired probability) is chosen, and the midpoint of that interval is reported as the chromaticity of the illuminant. Testing on a dataset of real images shows that the error at the 90th and 98th percentiles can be reduced by roughly half, with minimal impact on the mean error.
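The interval-selection step the abstract describes, choosing the smallest run of chromaticity bins whose probabilities sum to at least a desired level and reporting its midpoint, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name, the plain sliding-window search, and the bin layout are assumptions made for the example.

```python
def smallest_credible_interval(bin_centers, probs, target=0.9):
    """Return the midpoint of the smallest contiguous run of chromaticity
    bins whose probability mass sums to at least `target`.

    bin_centers -- sorted list of bin-center chromaticities
    probs       -- matching per-bin probability masses (summing to 1)
    """
    n = len(probs)
    best = None  # (interval width in bins, start index)
    for start in range(n):              # try every interval start
        total = 0.0
        for end in range(start, n):     # grow until the target mass is covered
            total += probs[end]
            if total >= target:
                if best is None or (end - start + 1) < best[0]:
                    best = (end - start + 1, start)
                break                   # wider intervals from this start can't win
    if best is None:                    # target unreachable: fall back to full range
        best = (n, 0)
    width, start = best
    # Midpoint of the chosen interval is reported as the illuminant chromaticity.
    return 0.5 * (bin_centers[start] + bin_centers[start + width - 1])
```

When the estimated distribution is concentrated around a single bin, the interval collapses to that bin and the estimate is simply its center; when the distribution is spread out, the midpoint of the narrowest covering interval acts as a robust point estimate.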

Document Type: Research Article

Publication date: January 1, 2012

More about this publication?
  • CIC is the premier annual technical gathering for scientists, technologists, and engineers working in the areas of color science and systems and their application to color imaging. Participants represent disciplines ranging from psychophysics, optical physics, image processing, and color science to graphic arts, systems engineering, and hardware and software development. While a broad mix of professional interests is the hallmark of these conferences, the focus is color. CICs traditionally offer two days of short courses followed by three days of technical sessions that include three keynotes, an evening lecture, a vibrant interactive (poster) papers session, and workshops. An endearing symbol of the meeting is the Cactus Award, given each year to the author(s) of the best interactive paper; there are also Best Paper and Best Student Paper awards.

    Please note: for purposes of its Digital Library content, IS&T defines Open Access as papers that are downloadable in their entirety for free in perpetuity. Copyright restrictions on papers vary; see individual papers for details.
