Physically Based vs Unbiased: Are they the same or is there an important difference?

Rashad Carter Posts: 1,803
edited December 2014 in The Commons

Physically Based and Unbiased are not the same thing... or are they? A rendering engine can be physically based and still have biases, or at least that is what my research so far has found. Many sites, like those of Octane and LuxRender, state that their engines are Physically Based and Unbiased renderers, but the sites for some other render engines only claim to be physically based and do not also claim to be unbiased. There must be a reason for this.

So my question for those in the know is, are these two terms the same, or are they indeed different? What is the difference if indeed there is one?

It matters to me because I have seen several renders from physically based render engines that do not appear to be unbiased renders. I want to be certain that people are not rendering with biases when they think they are not. When we compare render speeds and outputs from different engines, it helps a great deal to know whether it is even fair to compare renderers that are playing by different rules, as biases will often give reduced render times compared to fully unbiased processing.

Sometimes vendors are willing to mislead customers in their advertising to encourage sales, so one cannot always trust vendors to be completely honest; we sometimes need to consult our peers to see beyond the sales hype. Untrained persons like myself would probably assume physically based to be equal to unbiased, and I'd probably be wrong, but I want to be sure.

So there it is. If anyone can explain the difference I'd be very glad to know.
Thanks in advance.

Fun fun!!!!


Comments

  • Rashad Carter Posts: 1,803
    edited December 2014

    Okay, how about this? Could one say that Physically Based is in reference to the way Materials are designed, such as Diffuse, Specular, Glossy, etc? And that Biased/Unbiased rendering refers to the way the light interaction is handled? Because if so, then I think I finally understand how an engine can be physically based and yet still have biases in the way the light is handled. Blind leading the blind here, any help is greatly appreciated.

  • wowie Posts: 2,029
    edited December 2014

    From what I understand, Physically Based or Physically Plausible materials generally means you're using shaders/materials that employ analytical models as close as possible to real-life materials. Unbiased generally means the renderer does not restrict itself in sampling or bounces, while a biased renderer does. So they're very different concepts: physically based is generally related to materials, while biased/unbiased is generally related to light transport (how light is processed/calculated for the entire 'scene').

    So you can have a biased renderer with physically based materials (energy conservation, accurate analytical BRDFs). PRMan, Mental Ray, Vray and 3delight are some prime examples of this. I have not seen an unbiased renderer provide materials that aren't physically based, though.

    Unbiased renderers generally also offer a biased mode - usually that means direct lighting with ambient occlusion. Photon mapping with irradiance caching is also considered biased. Generally, unbiased renderers employ more sophisticated methods like bidirectional path tracing (tracing rays from both the light and the camera) without restricting the number of bounces calculated.
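    To make the bounce-restriction point concrete, here is a rough Python sketch. It isn't taken from any real renderer - the scene and hit objects are made-up placeholders - but it shows how hard-capping bounces introduces bias, while Russian roulette termination keeps the estimate unbiased on average:

        import random

        # Illustrative only: 'scene.trace', 'scene.background' and 'hit.sample_hemisphere'
        # are hypothetical placeholders, not any real renderer's API.

        MAX_BOUNCES = 3  # a biased engine might hard-cap indirect bounces like this

        def radiance_biased(ray, scene, depth=0):
            hit = scene.trace(ray)
            if hit is None:
                return scene.background(ray)
            if depth >= MAX_BOUNCES:
                return hit.emission  # truncating the path here is a source of bias
            next_ray = hit.sample_hemisphere()  # pick one indirect direction
            return hit.emission + hit.reflectance * radiance_biased(next_ray, scene, depth + 1)

        def radiance_unbiased(ray, scene):
            hit = scene.trace(ray)
            if hit is None:
                return scene.background(ray)
            # Russian roulette: paths may terminate at any bounce, but survivors are
            # re-weighted by 1/p_continue so the expected value stays correct.
            p_continue = 0.8
            if random.random() > p_continue:
                return hit.emission
            next_ray = hit.sample_hemisphere()
            return hit.emission + (hit.reflectance / p_continue) * radiance_unbiased(next_ray, scene)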

  • sebmaster

    That is right Rashad. Wowie has it covered well, but I'll just add something about the difference between biased and unbiased.

    As you say, rendering is performed by simulating light interaction (transport) between surfaces. Obviously we cannot simulate the interaction of every photon transported between every surface and every other surface in an environment, as this would take far too long. Instead we approximate it by simulating only a few of the possible interactions and averaging the result of these interactions, weighted by the likelihood of similar interactions occurring - that is, by the 'proportion' of the colour at the point we are sampling that is likely to be made up by interactions similar to those we just simulated. But how do we choose which interactions to simulate? That is the difference between biased and unbiased. Unbiased renderers pick randomly from a range of all possible interactions (i.e. simulate transport of light from all directions), without any preference, or bias. Biased renderers use hints and scene information to choose a narrower range of possible interactions/directions to simulate.

    The implication of this is that unbiased renderers will always eventually converge to the physically correct result, because all interactions have an equal chance of being simulated and thus included in the result. Biased renderers, because they select which interactions are simulated, may run forever yet still exclude some interaction that could make a contribution, and so never converge to the correct result.
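    A toy example may help. In the sketch below (a made-up one-dimensional "hemisphere", not any real renderer's code) the unbiased estimator samples every direction with equal probability and converges to the true answer, while the restricted estimator ignores most directions and converges to the wrong one:

        import random

        # Hypothetical incoming light over a 1D "direction" in [0, 1]: a bright,
        # narrow light near 0.2 plus a dim bounce contribution from everywhere.
        def incoming_light(direction):
            spike = 5.0 if 0.18 < direction < 0.22 else 0.0
            return spike + 0.1

        # True total = 5.0 * 0.04 + 0.1 = 0.3

        def estimate_unbiased(n):
            # Sample all directions uniformly (pdf = 1). The average converges
            # to the true total as n grows, because nothing is ever excluded.
            return sum(incoming_light(random.random()) for _ in range(n)) / n

        def estimate_restricted(n):
            # Only sample directions near the known light (0.15..0.25), weighted
            # for that narrower pdf. Directions outside the cone are never sampled,
            # so their 0.09 contribution is missing forever: this converges to 0.21,
            # not 0.3, no matter how many samples are taken.
            pdf = 1.0 / 0.1
            return sum(incoming_light(random.uniform(0.15, 0.25)) / pdf for _ in range(n)) / n

        print(estimate_unbiased(100000))    # approaches 0.3
        print(estimate_restricted(100000))  # approaches 0.21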

    I found this to be a good read, if you would like to know more, or have someone explain it better than me!: http://www.cs.columbia.edu/~keenan/Projects/Other/BiasInRendering.pdf

  • Rashad Carter Posts: 1,803

    sebmaster said:
    That is right Rashad. Wowie has it covered well, but I'll just add something about the difference between biased and unbiased.

    As you say, rendering is performed by simulating light interaction (transport) between surfaces. Obviously we cannot simulate the interaction of every photon transported between every surface and every other surface in an environment, as this would take far too long. Instead we approximate it by simulating only a few of the possible interactions and averaging the result of these interactions, weighted by the likelihood of similar interactions occurring - that is, by the 'proportion' of the colour at the point we are sampling that is likely to be made up by interactions similar to those we just simulated. But how do we choose which interactions to simulate? That is the difference between biased and unbiased. Unbiased renderers pick randomly from a range of all possible interactions (i.e. simulate transport of light from all directions), without any preference, or bias. Biased renderers use hints and scene information to choose a narrower range of possible interactions/directions to simulate.

    The implication of this is that unbiased renderers will always eventually converge to the physically correct result, because all interactions have an equal chance of being simulated and thus included in the result. Biased renderers, because they select which interactions are simulated, may run forever yet still exclude some interaction that could make a contribution, and so never converge to the correct result.

    I found this to be a good read, if you would like to know more, or have someone explain it better than me!: http://www.cs.columbia.edu/~keenan/Projects/Other/BiasInRendering.pdf

    Wowie, thanks a ton for clarifying! I felt I was on the right track, but I always like to be certain.

    Sebmaster,
    Yep, that is a great pdf and I have often suggested that people read it. I was still unsure, however, exactly what Physically Based meant, but now I understand that it relates to the materials themselves and not to the lighting in any way.

    Most renderers only calculate diffuse indirect bounces between surfaces, skipping the specular bounces, because most people don't think of specular as an indirect lighting influence; we tend to think of specular only as it relates to key light sources. Unbiased rendering, however, doesn't skip the specular step. I'm sure there are a million other examples as well that demonstrate differences between biased radiosity renders and unbiased renders.

    I really love all of this stuff.

  • wowiewowie Posts: 2,029
    edited December 2014

    sebmaster said:
    Unbiased renderers pick randomly from a range of all possible interactions (i.e. simulate transport of light from all directions), without any preference, or bias. Biased renderers use hints and scene information to choose a narrower range of possible interactions/directions to simulate.

    I wouldn't put it that way. Both biased and unbiased renderers can and do depend on multiple importance sampling when calculating lights.
    http://www.fxguide.com/featured/the-state-of-rendering/
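    For reference, multiple importance sampling just blends two (or more) sampling strategies with weights like the balance heuristic below - a generic sketch, not code from any particular engine:

        # Balance-heuristic weight for a sample drawn by strategy A (e.g. light
        # sampling), given the pdfs both strategies assign to the same direction.
        def balance_heuristic(pdf_a, pdf_b):
            return pdf_a / (pdf_a + pdf_b)

        # A sample from strategy A contributes f(x) * balance_heuristic(pdf_a, pdf_b) / pdf_a;
        # a sample from strategy B contributes f(x) * balance_heuristic(pdf_b, pdf_a) / pdf_b.
        # Summing both keeps the combined estimator correct without double counting.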


    Rashad Carter said:
    Most renderers only calculate diffuse indirect bounces between surfaces, skipping the specular bounces, because most people don't think of specular as an indirect lighting influence; we tend to think of specular only as it relates to key light sources. Unbiased rendering, however, doesn't skip the specular step. I'm sure there are a million other examples as well that demonstrate differences between biased radiosity renders and unbiased renders.

    Not exactly. Starting with Renderman Pro Server 18, indirect specular can also be computed.
    http://www.fxguide.com/featured/the-state-of-rendering-part-2/
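    Roughly speaking, the difference is just whether the gather at a shading point also samples the glossy lobe. A hedged sketch (the helper names are invented, not a real engine's API):

        # Sketch of gathering indirect light at a shading point. An engine that only
        # runs the diffuse gather misses glossy inter-reflections (e.g. a mirror
        # lighting a wall); adding the glossy gather captures indirect specular too.
        # 'sample_diffuse_direction', 'sample_glossy_direction' and
        # 'incoming_radiance' are hypothetical placeholders.
        def indirect_light(hit, scene, samples=16):
            total = 0.0
            for _ in range(samples):
                d = hit.sample_diffuse_direction()   # cosine-weighted over the hemisphere
                total += hit.albedo * scene.incoming_radiance(hit.position, d)
                g = hit.sample_glossy_direction()    # concentrated around the mirror direction
                total += hit.specular * scene.incoming_radiance(hit.position, g)
            return total / samples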

  • evilproducer Posts: 9,050

    Another great source is Jeremy Birn's Digital Lighting and Rendering. He discusses both in an easy to understand manner with nice examples. It's a great book with lots of good information on how to light for both types of renderers. He even gets into physically based textures/shaders.

  • throttlekitty Posts: 173
    edited December 2014

    Un/Biased got explained pretty well, so here's some info on physically based rendering. It is a slight change to the rendering equation to include conservation of energy: an object can't reflect more light than it receives. Many shaders using PBR also include a microsurface term, which blurs the cube map based on the gloss level of the material. This is more of a big-news thing in the realtime rendering world (games), but not all renderers have been doing physically accurate renders (3Delight isn't physically based - EDIT: maybe it is, so maybe I'm rendering in DS out of the box incorrectly). It does change the way we author gloss and specular maps, so it's good to advertise that fact.

    Using it, artists get better control over the look and feel of their surfaces, and nobody has to worry about specular highlights getting blown out to white. :D
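    As a rough illustration of the energy-conservation idea, here is a tiny Python sketch (the names and the simple normalized Blinn-Phong lobe are just illustrative, not any particular engine's shader):

        import math

        # Diffuse and specular must share the incoming energy, and sharpening the
        # highlight (higher gloss) makes it brighter but not more energetic overall.
        def shade(n_dot_l, n_dot_h, albedo, specular_strength, gloss):
            specular_strength = min(specular_strength, 1.0)
            diffuse = albedo * (1.0 - specular_strength)   # leave energy for the specular lobe
            norm = (gloss + 2.0) / (2.0 * math.pi)         # normalized Blinn-Phong factor
            specular = specular_strength * norm * (n_dot_h ** gloss)
            return (diffuse / math.pi + specular) * max(n_dot_l, 0.0)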

    The fine folk at Marmoset have a nice writeup on the topic.
    http://www.marmoset.co/toolbag/learn/pbr-theory

  • wowie Posts: 2,029
    edited December 2014

    throttlekitty said:
    (3Delight isn't physically based - EDIT: maybe it is, so maybe I'm rendering in DS out of the box incorrectly). It does change the way we author gloss and specular maps, so it's good to advertise that fact.

    I would agree that the default shaders with 3delight, and specifically DAZ Studio, aren't physically plausible. But the beauty of Renderman shaders is that there's nothing stopping you from writing your own shaders that are physically plausible. Of course, writing RSL code isn't exactly a walk in the park. :)

    What I generally see is that most people seem to forget the precursor to these methods - making sure you're working in linear space and targeting output with the 'correct' gamma. Even plain Lambert, Blinn and Phong can look good if you're using a linear workflow. Physically accurate or plausible shaders help you get consistent results between lighting scenarios, so that's another piece of the puzzle.
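    In sketch form (using the common 2.2 approximation rather than the exact piecewise sRGB curve), a linear workflow just means doing the lighting math on decoded values:

        def srgb_to_linear(c):
            return c ** 2.2

        def linear_to_srgb(c):
            return c ** (1.0 / 2.2)

        texture_value = 0.5                        # mid grey as stored in an sRGB texture
        lit = srgb_to_linear(texture_value) * 2.0  # lighting math happens in linear space
        display = linear_to_srgb(min(lit, 1.0))    # re-encode only for the monitor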

    In short, a renderer can only do what you tell it to do.
