Here's a hacked simulation of light passing through a sheet of rippled glass and hitting a surface. (In this case, the sheet of glass has been embossed with a picture.) Notice how the light separates into little rainbows as the sheet of glass moves farther away from the surface.
How does it work? First, you have to turn the original image (in this case, a face) into a vector field. You can do this by calculating the gradient of the luminance of a blurred version of the image. Then, for each pixel in the vector field, you scatter light into the quadrilateral marked out by the endpoints of the vectors at its four corners. You can increase or decrease the total distortion by multiplying the entire vector field by a scaling factor.

The kind of sampling you use matters a lot. To minimize aliasing, I used Halton point sampling. (There's free source code if you follow that link.) These images were rendered with 100 samples per pixel, and took about 30 seconds each on an R5000.

To get the rainbows you see above, you assign each sample (i.e., each "photon") a different "frequency", attenuate the distortion by a factor based on that frequency, and then have it deposit an appropriate color from the spectrum.
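The vector-field and per-photon dispersion steps might be sketched like this in Python with NumPy. This is my own reconstruction, not the original code: the blur radius, luma weights, scale factor, and the linear dispersion curve in `photon_distortion` are all assumptions.

```python
import numpy as np

def luminance(rgb):
    # Rec. 601 luma weights (an assumption; any reasonable weighting works)
    return rgb @ np.array([0.299, 0.587, 0.114])

def box_blur(img, radius):
    # Simple separable box blur, standing in for whatever blur was used
    k = np.ones(2 * radius + 1) / (2 * radius + 1)
    img = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 0, img)
    img = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, img)
    return img

def distortion_field(rgb, radius=4, scale=10.0):
    # Gradient of the luminance of a blurred image, times a scaling factor.
    # Scaling the whole field up or down controls the total distortion.
    gy, gx = np.gradient(box_blur(luminance(rgb), radius))
    return scale * np.dstack([gx, gy])

def photon_distortion(field, x, y, t):
    # t in [0, 1] indexes the photon's "frequency" across the spectrum.
    # Attenuating the distortion per photon is what separates the rainbows;
    # this linear curve is a hypothetical stand-in for the real attenuation.
    return (0.8 + 0.4 * t) * field[y, x]
```

Each photon then deposits a color looked up from the spectrum at `t` into the quadrilateral its four displaced corner vectors mark out.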
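For the Halton point sampling mentioned above, a minimal generator looks like this (the radical-inverse construction is standard; bases 2 and 3 for the two pixel axes is the usual choice, though I don't know what the original source code used):

```python
def halton(index, base):
    # Radical inverse: mirror the base-`base` digits of `index`
    # about the decimal point to get a well-distributed value in [0, 1).
    f, result = 1.0, 0.0
    while index > 0:
        f /= base
        result += f * (index % base)
        index //= base
    return result

# 100 low-discrepancy sample points in the unit square, one per photon
samples = [(halton(i, 2), halton(i, 3)) for i in range(1, 101)]
```

Unlike uniform random samples, these points never clump, which is what keeps the aliasing down at 100 samples per pixel.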

5:04 pm, 30 January 1998