
Lighting calibration for Digital Mike 1.0 (in progress...)

Hey there,

Here is a first attempt to calibrate the assembled HDR and validate it by rendering Mike to match photographed references. Below is a quick rundown of the steps I took.

Part 1. RAW (.CR2) to 16-bit TIFF

dcraw is used to extract the raw color information from the CR2 files and save it directly as 16-bit-per-channel TIFF.
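A minimal sketch of batching that conversion from Python. The flag set here (`-w` camera white balance, `-4` 16-bit linear, `-T` TIFF output) is my guess at a reasonable baseline, not necessarily the exact one used for this post, and the `refs` folder name is made up for illustration:

```python
# Sketch: convert CR2 raws to 16-bit linear TIFFs with dcraw.
# -w: use the camera's as-shot white balance
# -4: 16-bit linear output (no gamma, no auto-brighten)
# -T: write a TIFF instead of PPM
# Assumes dcraw is on PATH; it writes <name>.tiff next to each CR2.
import subprocess
from pathlib import Path

def cr2_to_tiff_cmd(cr2_path: str) -> list[str]:
    """Build the dcraw command line for one CR2 file."""
    return ["dcraw", "-w", "-4", "-T", cr2_path]

for raw in sorted(Path("refs").glob("*.CR2")):
    cmd = cr2_to_tiff_cmd(str(raw))
    # subprocess.run(cmd, check=True)  # uncomment to actually convert
    print(" ".join(cmd))
```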

JPG saved by Canon

TIFF converted via Lightroom

TIFF converted via RawDrop (a dcraw GUI): the image appears much flatter

The converted TIFFs are organized to match the series of reference photos taken on set, along with their corresponding camera settings.

Note: Last time I used the ColorChecker app to generate a DNG profile for color-correcting the CR2s. It actually gave me worse results in terms of matching. Based on earlier discussions with Jay, Graham, and Oleg, it seems they process CR2s with dcraw as well and save directly as EXR and JPG. I will look into this again to verify the accuracy.

Part 2. Color-Balance the HDR

In Nuke, a 3x3 matrix is calculated between the color checker reference TIFF and the Macbeth chart captured in the HDR. At this stage, the out-of-the-box match between rendered spheres and photos is fairly good.
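The matrix solve itself can be sketched outside Nuke as an ordinary least-squares fit. Here `hdr_patches` and `ref_patches` are assumed to be (24, 3) arrays of the average linear RGB of each Macbeth patch, sampled from the HDR and from the reference TIFF respectively (both names are mine, for illustration):

```python
# Sketch: least-squares 3x3 color matrix between two sampled Macbeth charts.
import numpy as np

def solve_color_matrix(hdr_patches, ref_patches):
    """Find M (3x3) such that hdr_patches @ M best approximates ref_patches."""
    M, *_ = np.linalg.lstsq(np.asarray(hdr_patches, float),
                            np.asarray(ref_patches, float),
                            rcond=None)
    return M  # apply as: corrected = pixels @ M (a ColorMatrix node in Nuke)
```

Working in unclipped linear values matters here, per the comment about not clipping color during the matrix calculation.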

For this part, I cross-reference Scott Metzger's workflow (he made a video specifically covering this while working on Rise). I do not have access to HDR Shop, so I use a Nuke gizmo (mmColorTarget) instead. Oleg is aware of this gizmo; I read his comment on the author's website about not clipping the color during the matrix calculation.

Part 3. Fine-Tune the Calibration by Matching the Gray/Chrome Balls

Mostly manual work: sampling and comparing the lit and shaded areas of the rendered and photographed spheres to fine-tune the HDR. I only color-correct with white balance (whitepoint in the Grade node) and exposure adjustments. Usually within 1 or 2 iterations, the match is really decent.
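The white-balance-plus-exposure correction can be sketched as splitting the per-channel gain between the two samples into an overall exposure offset and a hue-only whitepoint. The sample names (`render_gray`, `photo_gray`) are mine, assuming linear RGB picks from the same lit patch on each gray ball:

```python
# Sketch: derive a Grade-style whitepoint + exposure tweak from gray-ball samples.
import numpy as np

def grade_from_gray(render_gray, photo_gray):
    """Split the render->photo gray-ball gain into exposure (stops) + whitepoint."""
    render_gray = np.asarray(render_gray, float)
    photo_gray = np.asarray(photo_gray, float)
    gain = photo_gray / render_gray          # per-channel multiplier
    exposure_stops = np.log2(gain.mean())    # overall exposure offset, in stops
    whitepoint = gain / gain.mean()          # hue-only remainder, exposure removed
    return whitepoint, exposure_stops
```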

Validation (rendered with the physical camera).

Note: for exterior scenes directly lit by the sun, a V-Ray Disc light is created to represent the sun.

All the renders are done with a physical camera using settings that match the corresponding reference photos.
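For reasoning about those per-photo settings, the standard EV100 formula is a useful sanity check: it reduces aperture, shutter, and ISO to a single exposure value, so two reference photos can be compared directly. A small sketch (the function names are mine):

```python
# Sketch: exposure value math for matching physical-camera settings.
import math

def ev100(aperture_f, shutter_s, iso):
    """Exposure value normalized to ISO 100: EV100 = log2(N^2 / t) - log2(ISO / 100)."""
    return math.log2(aperture_f ** 2 / shutter_s) - math.log2(iso / 100)

def exposure_scale(settings_a, settings_b):
    """Linear factor by which setup B exposes the same scene brighter than setup A."""
    return 2.0 ** (ev100(*settings_a) - ev100(*settings_b))
```

For example, doubling only the ISO lowers EV100 by one stop, so the image exposes twice as bright, which is exactly the kind of offset that otherwise ends up baked into a per-image dome light intensity.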

The whole process is also based on discussions I had with Arkell Rasiah (a former ILM engineer). I follow what he typically does in terms of extracting TIFFs from CR2s and color-balancing in Nuke. I recently came across Sébastien's SIGGRAPH 2016 course on capturing and calibrating HDRs. Too late for this round, but I would definitely like to try out some of its ideas next time, to calibrate in a "more absolute" fashion as opposed to the "relative-gray-ball" style.

One quick question I have: since I lock the physical camera settings to match the Canon 5D and change the dome light intensity on a per-image basis, would that present a potential issue when someone uses the HDR in another renderer with its own physical camera, or without one?

I am excited to see where this is heading.

Cheers and till next time.
