Project Title: AutoLum: Precise Automatic Camera and Display Calibration Algorithm by Optical Feedback
Track Code: 2014-028
Short Description

A software algorithm that improves the accuracy of camera light measurements by recovering and correcting the values in a camera's photometric calibration table


Even inexpensive camera phones can sense finer changes in light than the human visual system can perceive, but noise and the commonly used over-smoothed approximate calibrations limit their ability to accurately measure small but significant changes in light. We developed a software algorithm, AutoLum, that finds a camera's photometric calibration table (its list of camera numbers mapped to display light amounts) with errors less than a small fraction (e.g. 1/32) of the light-quantization step size of either the camera or the display used to perform the measurements. This table individually captures the light levels where the camera's output changes from one digital value to the next.

The method's accuracy arises from statistical methods applied to millions of individual pixel values chosen adaptively. These exhaustively characterize camera flaws that, when corrected mathematically, yield large improvements (e.g. 10X) in the results of almost any graphics or computer vision application that relies on light measurements from cameras, including: HDR imaging and light probes; photometric stereo and other shape-from-shading methods; color matching for textiles, printing, film, and electronic media; automatic quality assessment and flaw detection for manufacturing; and estimators for material reflectance (BRDFs), transparency, and translucency.

Users of our system simply aim an out-of-focus, computer-controlled camera at a computer-controlled display and allow the algorithm to photograph a series of test signals created adaptively on the display. Because the average light power of the test signal shown on the display is known and controllable in fine increments, the algorithm can closely bracket every unknown camera quantization boundary with known light powers. We then estimate the value of every camera quantization boundary as a weighted average of the two bracketing display values.
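The bracketing and weighted-average step described above can be sketched in a few lines. This is a minimal illustration under stated assumptions, not the actual AutoLum implementation: the function name, the use of the fraction of pixels reporting a code above k as the bracketing statistic, the 50% crossing target, and the assumption that this fraction rises monotonically with display level are all choices made for the example.

```python
import numpy as np

def estimate_boundary(display_levels, fraction_above, target=0.5):
    """Estimate the light level where a camera's output flips from
    code k to code k+1 (hypothetical, simplified sketch).

    display_levels : known display light powers, in increasing order
    fraction_above : for each level, the measured fraction of pixels
                     that reported a code above k
    """
    levels = np.asarray(display_levels, dtype=float)
    frac = np.asarray(fraction_above, dtype=float)
    above = frac >= target
    if not above.any() or above.all() or above[0]:
        raise ValueError("display levels do not bracket the boundary")
    i = int(np.argmax(above))           # first level at/past the crossing
    lo, hi = levels[i - 1], levels[i]   # the two bracketing display values
    f_lo, f_hi = frac[i - 1], frac[i]
    # Weighted average of the two bracketing display values, with weights
    # chosen so the interpolated fraction equals the crossing target.
    w = (target - f_lo) / (f_hi - f_lo)
    return (1.0 - w) * lo + w * hi
```

In a real run the fractions would come from averaging millions of adaptively chosen pixel samples per display level, which is what drives the error below a small fraction of a quantization step.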

Tags: electronics, SENSOR: optical, software, software: graphics, software: optimization
Posted Date: May 19, 2015 1:14 PM


Inventors

Paul Olczak

John Tumblin


Applications

  • Improved mobile camera calibration during production of integrated systems, for better-looking camera-phone pictures
  • Wider tolerance and user-updated corrections for digital camera manufacturing variations – e.g. can compensate for inaccuracies in other sensor components (ADC, noise, power stability, amplifiers), and enthusiasts can repeat or revise the calibration
  • Use of cameras and displays as precise measurement instruments and references, e.g. phone-based health-and-safety measurements
  • Improved dynamic response of cameras and displays by using each to measure the other
  • Matching and calibration of multiple displays/projectors/printers to ensure uniform color image reproduction and appearance
  • Improved performance of any computer vision application that requires precise pixel-to-luminance measurements
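For the pixel-to-luminance use case above, a recovered table of per-code quantization boundaries can be applied with a simple lookup. This is an illustrative sketch only; the function name and the choice of mapping each code to the midpoint of its quantization bin are assumptions for the example, not part of the original disclosure.

```python
import numpy as np

def codes_to_light(codes, boundaries):
    """Map integer camera codes to estimated light amounts.

    boundaries : calibration table; boundaries[k] is the light level
                 where the camera's output changes from code k to k+1.
    Each code k is mapped to the midpoint of its quantization bin
    [boundaries[k-1], boundaries[k]] (code 0's bin starts at zero).
    """
    b = np.asarray(boundaries, dtype=float)
    lower = np.concatenate(([0.0], b[:-1]))   # lower edge of each bin
    mid = 0.5 * (lower + b)                   # bin midpoints
    return mid[np.asarray(codes, dtype=int)]
```

Replacing a factory-smoothed response curve with such a per-boundary table is what enables precise pixel-to-luminance conversion.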


Advantages

  • Robust – uses millions of samples and is resilient to noise
  • Self-refining – longer run times further improve error detection as the algorithm reduces errors and noise
  • Flexible – finds a non-parametric camera response, so unlike prior approaches it neither overfits nor smooths away important response features, and it works on 'interesting' cameras
  • Easy to use – requires no human intervention: start the software and let it handle the rest, allowing anyone to make accurate measurements that might otherwise seem too complex, impractical, expensive, or time-consuming
  • Inexpensive – approaches laboratory precision without special laboratory equipment, using existing, unmodified computer/display/camera combinations such as smartphones, tablets, digital SLRs, projectors, and desktop computers


IP Status

A provisional application has been filed.

Contact Information

Arjan Quist, PhD

Invention Manager

(p) 847-467-0305