hubbahubba

This might be a bad question, but what if we use the same sampling directions for all of the points? I predict we would have less noise but all of our results would be biased.

keenan

@hubbahubba No, that's an excellent question. You could indeed use a fixed quadrature rule rather than a random (Monte Carlo) one. E.g., just split the hemisphere up into some fixed grid, and take a sample at each grid point. For the scene above, you may indeed get smoother results. This approach is also perhaps appropriate for this example because the integrals you want are low-dimensional: you're just integrating over a 2D hemisphere (or a 2D light source, depending on how you set things up).
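For concreteness, here's a minimal sketch of what the two strategies can look like when estimating irradiance at a single point. This is just an illustration (not code from the lecture): `L(theta, phi)` is a hypothetical stand-in for whatever incoming radiance you'd get by tracing a ray in that direction, and the Monte Carlo version uses uniform hemisphere sampling with pdf $1/2\pi$.

```cpp
#include <cmath>
#include <cstdio>
#include <random>

static const double PI = 3.14159265358979323846;

// Hypothetical incoming radiance from direction (theta, phi): a small bright
// "light" near the pole, dim everywhere else. In a real renderer this would
// trace a ray into the scene instead.
double L(double theta, double /*phi*/) {
    return theta < 0.2 ? 10.0 : 0.1;
}

// (a) Fixed quadrature: split the hemisphere into an n x n grid in
// (theta, phi) and take one sample at each cell center, weighting by the
// cell's solid angle (sin(theta) dTheta dPhi) and the cosine term.
double irradianceFixedGrid(int n) {
    const double dTheta = (PI / 2.0) / n;
    const double dPhi   = (2.0 * PI) / n;
    double sum = 0.0;
    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++) {
            double theta = (i + 0.5) * dTheta;
            double phi   = (j + 0.5) * dPhi;
            sum += L(theta, phi) * std::cos(theta) * std::sin(theta) * dTheta * dPhi;
        }
    return sum;
}

// (b) Monte Carlo: N directions drawn uniformly over the hemisphere
// (pdf = 1/(2*pi) w.r.t. solid angle); each sample is f/pdf, and we average.
double irradianceMonteCarlo(int N, std::mt19937 &rng) {
    std::uniform_real_distribution<double> u(0.0, 1.0);
    double sum = 0.0;
    for (int k = 0; k < N; k++) {
        double theta = std::acos(u(rng));   // cos(theta) uniform on [0,1]
        double phi   = 2.0 * PI * u(rng);
        sum += L(theta, phi) * std::cos(theta) * (2.0 * PI);
    }
    return sum / N;
}

int main() {
    std::mt19937 rng(42);
    std::printf("fixed grid (16 x 16): %f\n", irradianceFixedGrid(16));
    std::printf("Monte Carlo (256):    %f\n", irradianceMonteCarlo(256, rng));
}
```

Notice that the fixed-grid version bakes the resolution $n$ into the loop structure, while the Monte Carlo version takes an arbitrary sample count $N$; that flexibility is part of why it scales to the higher-dimensional integrals discussed below.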

However, there are some downsides to this approach. The biggest one is that fixed quadrature rules don't scale well as your integrand goes up in dimension. If you're picking $n$ samples along each axis, then a $d$-dimensional integral will need $n^d$ sample points. Pretty bad! And rendering is all about high-dimensional integrals: at each bounce of your path you have to pick (say) two angles to decide the next outgoing direction. So if you have $k$ bounces, then you'd need something like $n^{2k}$ samples.
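To put numbers on it: with $n = 16$ samples per axis and $k = 3$ bounces, that's $16^6 \approx 1.7 \times 10^7$ directions per camera ray, and each additional bounce multiplies the count by another $n^2 = 256$. A Monte Carlo estimator, on the other hand, lets you pick whatever total sample budget you can afford, with error that shrinks like $1/\sqrt{N}$ regardless of the dimension.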

The other downside is that you can and will get artifacts of a different kind: rather than noise, you'll get aliasing along certain directions. For instance, if the light has just the wrong shape relative to your fixed sampling pattern, all of your samples might miss it entirely, even in cases where it should be quite bright. With Monte Carlo, you may still get unlucky and miss it, but there's always a nonzero probability that you hit it. In this sense, Monte Carlo is not unlike many randomized algorithms in that it provides robustness to adversarial examples. (If you're not familiar with this idea, randomized quicksort is a great example.)
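To make the comparison quantitative: if the light subtends a fraction $p$ of the sampled domain, then $N$ independent random samples all miss it with probability $(1-p)^N$, which shrinks to zero as $N$ grows. A fixed grid at a given resolution, by contrast, either hits the light or misses it deterministically, and when it misses, every pixel with a similar geometric configuration misses in the same way, which is exactly what produces coherent aliasing rather than incoherent noise.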

In short, Monte Carlo:

  • helps to avoid the "curse of dimensionality", and
  • provides robustness across many different sampling scenarios,

at the cost of high-frequency noise and slow convergence. (There are plenty of other pros and cons one could discuss as well, such as the cost of generating pseudorandom numbers...)