Saturday, May 26, 2012

First Images with Dispersion

I recently implemented dispersion in Photorealizer using some of the ideas I described in a previous post. I'm using the Sellmeier equation along with material-specific Sellmeier coefficients to determine the refractive index of the material at any given wavelength. Right now I'm only tracing three specific wavelengths, one for each of red, green, and blue. Because the sRGB primaries are not spectral colors, I simply chose nearby spectral colors, keeping things simple for now.
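As a sketch of that lookup (the function name and structure are mine, not Photorealizer's; the coefficients mentioned below are the standard published set for BK7 glass, used purely as an illustration since the diamond coefficients aren't listed here):

```cpp
#include <cmath>

// Sellmeier equation: n^2(lambda) = 1 + sum_i B_i * lambda^2 / (lambda^2 - C_i),
// with lambda in micrometers and three material-specific (B_i, C_i) pairs.
double sellmeierIOR(double lambdaMicrons, const double B[3], const double C[3]) {
    const double l2 = lambdaMicrons * lambdaMicrons;
    double n2 = 1.0;
    for (int i = 0; i < 3; ++i) {
        n2 += B[i] * l2 / (l2 - C[i]);
    }
    return std::sqrt(n2); // refractive index n at this wavelength
}
```

For example, with the published BK7 coefficients (B = 1.03961212, 0.231792344, 1.01046945; C = 0.00600069867, 0.0200179144, 103.560653 µm²), this gives n ≈ 1.5168 at 587.6 nm, with shorter wavelengths refracting more strongly, which is exactly what produces the colored fringes in the renders below.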

Here is a render of diamonds with dispersion. The color in this image comes entirely from dispersion.

Diamonds with dispersion. Click to view full size.

Here's the same render, before bloom was applied (my bloom is a post-process; the renderer applies it to the completed image):

Same render as above, but without bloom.

Here is a similar scene rendered with and without dispersion, for comparison. The color in these images comes entirely from dispersion as well. The effect of dispersion is a little more subtle in these images than in the ones above.

Dispersion. Bloom.
No dispersion. Bloom.
Dispersion. No bloom.
No dispersion. No bloom.

At some point I'd like to develop a more robust spectral rendering framework, so I can more accurately render dispersion, as well as interference, diffraction, and other wavelength-dependent effects for which three wavelengths (R, G, and B) are insufficient.

Here are a few earlier dispersion renders:


  1. Hey Peter, nice work!

    Are you planning on doing some sort of spectrum-based (as opposed to just RGB) dispersion eventually?

    1. Hey, thanks Karl! To answer your question, I would like to implement some sort of spectrum-based dispersion at some point, but I haven't decided on the details yet.

      I mainly did it the RGB way for now because it was faster and simpler to implement. On the physics side, all of the pieces are in place for spectrum-based dispersion, notably the Sellmeier equation for computing the refractive index at any wavelength. However, doing full spectral dispersion would be somewhat tricky because my material colors, lights, absorption coefficients, etc. are all currently in RGB, and the transformation from RGB color to spectral power distribution is not unique (because of metamerism, many different spectra map to the same RGB color). For best results all of those other things would need to be spectrum-based, too, in which case a straightforward way to do spectral rendering would be to assign a single wavelength to each ray, using importance sampling to select that wavelength from the spectral response curves of the R, G, and B components of the camera's image sensor (or from the entire visible spectrum).

      With almost everything else strictly in RGB though, it probably wouldn't be practical to move to a full spectral rendering system just for dispersion. Instead, I'll probably do something like this: when dispersion happens, trace separate rays for R, G, and B, and choose the exact wavelength for each ray from the respective spectral response curves of the camera (regular Gaussian curves would probably look good). This would allow me to leave everything else in RGB terms, and it would result in rays that are associated with only one of R, G, and B, which would then be compatible with other effects for which R, G, and B need to be traced separately, such as subsurface scattering (in the resources I've looked at, scattering coefficients are usually only provided for R, G, and B).

  2. Impressive work! Are you planning to publish a GUI-based renderer for 3D software packages (Maya)?
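The per-channel Gaussian wavelength sampling proposed in the reply above might look something like this (a minimal sketch; the peak wavelengths and spreads are illustrative assumptions, not measured sensor data):

```cpp
#include <random>

// When a dispersive event splits a ray into R, G, and B, assign each channel's
// ray a concrete wavelength drawn from a Gaussian approximation of that
// channel's sensor response curve. Channel indices: 0 = R, 1 = G, 2 = B.
double sampleChannelWavelength(int channel, std::mt19937 &rng) {
    // Rough peak wavelengths (nm) and spreads for R, G, B (assumed values).
    static const double mean[3]  = {610.0, 550.0, 465.0};
    static const double sigma[3] = {30.0, 30.0, 25.0};
    std::normal_distribution<double> dist(mean[channel], sigma[channel]);
    double lambda = dist(rng);
    // Clamp to the visible range so the Sellmeier fit stays valid.
    if (lambda < 380.0) lambda = 380.0;
    if (lambda > 740.0) lambda = 740.0;
    return lambda;
}
```

Each sampled wavelength would then be fed through the Sellmeier equation to get that ray's refractive index, while the rest of the renderer continues to treat the ray as purely R, G, or B.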