jerrypiglet

Shouldn't this be a BIASED Monte Carlo estimator, since we do not require $p(x)$ to be a uniform distribution?

kayvonf

@jerrypiglet: Good question, but the estimator is unbiased in that its expectation is exactly the value of the integral it is trying to estimate, and thus it converges to the "correct answer" as the number of samples used in the estimate increases. A biased estimator's expectation differs from the true value, so in general it does not converge to the correct answer.

There is a difference between biasing the choice of samples used in the estimator (what you were referring to here) and a biased estimator. See the note at the bottom of the previous slide. ;-)
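
For concreteness, here is a minimal sketch of why sampling from a non-uniform $p(x)$ keeps the estimator unbiased (the notation here may differ slightly from the slide). Assuming the samples $x_i$ are drawn i.i.d. from $p$, and $p(x) > 0$ wherever $f(x) \neq 0$:

$$\hat{I}_N = \frac{1}{N}\sum_{i=1}^{N} \frac{f(x_i)}{p(x_i)}, \qquad \mathbb{E}\big[\hat{I}_N\big] = \frac{1}{N}\sum_{i=1}^{N} \int \frac{f(x)}{p(x)}\, p(x)\, dx = \int f(x)\, dx.$$

The division by $p(x_i)$ is exactly what compensates for the non-uniform choice of samples, which is why biasing where the samples land does not bias the estimator itself.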

kayvonf

Question: Now that you've seen the global illumination lecture, and in light of the discussion above, here is a question for you: imagine you write a path tracer that always terminates paths after 4 bounces. Is this estimator biased? (A hint is here.)

jerrypiglet

@kayvonf: I guess it will be biased if we always discard low-contribution rays after 4 bounces, but unbiased if we randomly discard them. However, I am still not sure why the Russian roulette estimator has the same expectation as the original estimator (on this slide). Can you explain the equation on that slide, please?

kayvonf

I added an explanation on the Russian Roulette slide.
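
For later readers, here is the gist of that argument (a sketch only; the slide's notation may differ). Russian roulette replaces an estimator $X$ with a new estimator that, with some probability $q$, evaluates $X$ and reweights it by $1/q$, and otherwise returns 0:

$$X_{\mathrm{rr}} = \begin{cases} X / q & \text{with probability } q, \\ 0 & \text{with probability } 1 - q, \end{cases} \qquad \mathbb{E}\big[X_{\mathrm{rr}}\big] = q \cdot \frac{\mathbb{E}[X]}{q} + (1 - q) \cdot 0 = \mathbb{E}[X].$$

So randomly terminating paths, with the corresponding reweighting, leaves the expectation unchanged (it only increases variance), whereas deterministically cutting every path off after 4 bounces has no such correction and therefore biases the estimate.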