Render Times Increased exponentially moving to 4K?

So, this is the first time this has happened to me, and I am trying to figure out what the problem could be.  I have a scene that I am rendering; I did a test render at HD (1080) just before trying this one.  I stopped it after about 5,000 iterations, when it was ~80% converged, and it only took about 8 hours.  I saved the scene file, restarted my PC (because for some reason, if I don't do that, most of the time it won't render using the GPU on a second render), loaded the scene, set the resolution to 4K instead of 1080, and started the render.  Here I am ~28,000 iterations later, it's been 1 day 3 hours, and it's only 2% converged... Can anyone tell me what might be going on that it's taking so much longer to converge?  I expected it to act like normal and only take about 4 times as long, which would have put me at a day or so... But at this rate, it's gonna be a month or more!? lol

Any help would be appreciated... 

Comments

  • wolf359 Posts: 3,834

    Be prepared to endure the long render times at 4K if this project really requires content to be viewed on a 4K-resolution device, even though human eyesight and brain function begin to discard detail past a certain point.

  • JonnyRay Posts: 1,744

    Full HD is considered 1920 x 1080 = 2,073,600 pixels. At 24 bits (3 bytes) per pixel, that would be about 6 MB for the current frame.

    4K UHD is 3840 x 2160 = 8,294,400 pixels. That's about 24 MB just to store the rendered image frame. And (I believe) Iray has to store two of them to be able to calculate convergence.

    So that's a 4X increase in pixels. There's definitely more data that needs to be generated and sampled to determine convergence. I wouldn't expect a simple linear 4X increase in render times, though, because the increase in data probably doesn't scale linearly throughout the whole rendering pipeline. For instance, if you're using the 4.11 noise filter during the whole render, it has to run that filter against those larger images for each iteration.

    You also run the risk of tripping over some "soft limit" in data size that sends your render down an alternative route through the pipeline. For instance, check your log file and see if, at some point, storing all of the geometry and textures of the scene PLUS the size of the rendering images pushed it out of your GPU and started CPU-based rendering. That could easily result in an exponential jump in the time it took to render.
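
    A quick back-of-the-envelope sketch of that pixel/memory arithmetic (an illustration only; the 3-bytes-per-pixel figure is the simple case from above, and Iray's internal buffers are likely larger than this):

    # Rough frame-buffer arithmetic for 1080p vs. 4K UHD (3 bytes/pixel as above).
    def frame_bytes(width, height, bytes_per_pixel=3):
        return width * height * bytes_per_pixel

    hd = frame_bytes(1920, 1080)   # 2,073,600 pixels -> ~6.2 MB
    uhd = frame_bytes(3840, 2160)  # 8,294,400 pixels -> ~24.9 MB
    print(hd / 1e6, uhd / 1e6, uhd / hd)  # ~6.2, ~24.9, exactly 4x the data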

  • prixat Posts: 1,590

    If that's a still image, I would render at 1080p and enlarge to 4K with a graphics program.
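
    For instance, a minimal version of that with Pillow (a sketch only; the filenames are placeholders, and Lanczos is just one reasonable resampling filter):

    from PIL import Image  # pip install Pillow

    # Enlarge a finished 1080p render to 4K UHD in post instead of re-rendering.
    img = Image.open("render_1080p.png")              # placeholder filename
    img_4k = img.resize((3840, 2160), Image.LANCZOS)  # Lanczos resampling
    img_4k.save("render_4k_upscaled.png")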

  • JasonWNalley Posts: 122
    wolf359 said:

    Be prepared to endure the long render times at 4K if this project really requires content to be viewed on a 4K-resolution device, even though human eyesight and brain function begin to discard detail past a certain point.

    If it wasn't clear, this isn't my first time rendering in 4K; this is a normal part of my workflow. As far as requirements go, I'd rather it be larger, like 8K, and then downsampled to 4K.  However, that starts to get to the limits of where my hardware just isn't enough.  I won't comment on the rest of your post, nor that guy's video, because I've seen it before, and he's an idiot... 

    JonnyRay said:

    Full HD is considered 1920 x 1080 = 2,073,600 pixels. At 24 bits (3 bytes) per pixel, that would be about 6 MB for the current frame.

    4K UHD is 3840 x 2160 = 8,294,400 pixels. That's about 24 MB just to store the rendered image frame. And (I believe) Iray has to store two of them to be able to calculate convergence.

    So that's a 4X increase in pixels. There's definitely more data that needs to be generated and sampled to determine convergence. I wouldn't expect a simple linear 4X increase in render times, though, because the increase in data probably doesn't scale linearly throughout the whole rendering pipeline. For instance, if you're using the 4.11 noise filter during the whole render, it has to run that filter against those larger images for each iteration.

    You also run the risk of tripping over some "soft limit" in data size that sends your render down an alternative route through the pipeline. For instance, check your log file and see if, at some point, storing all of the geometry and textures of the scene PLUS the size of the rendering images pushed it out of your GPU and started CPU-based rendering. That could easily result in an exponential jump in the time it took to render.

    Yeah, I know it doesn't scale linearly due to everything else it takes to make the image.  However, in previous renders done in this same fashion, it's been ballpark close to a linear change in time...  This is the first time I've ever run into it just blowing through that timeline.  I mean, currently I am at 1 day 12 hours, 37,000 iterations, and it's only 3.7% converged; that puts me at around 1,000,000 iterations needed to converge this image...  wtf?  And yeah, I already checked: it's GPU rendering, using both of my GPUs and not using my CPU at all...  So... I am lost... It should not have jumped like this on a simple resolution change...
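
    (For what it's worth, that ~1,000,000 figure is just a straight-line projection from those numbers; convergence isn't actually linear in iterations, so it's only a ballpark.)

    # Naive extrapolation: iterations so far divided by the fraction converged.
    iterations, converged = 37_000, 0.037
    print(iterations / converged)  # ~1,000,000 iterations at this rate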

     

    prixat said:

    If that's a still image, I would render at 1080p and enlarge to 4K with a graphics program.

    Resizing is not an option when you're looking at fine detail.  Scaling algorithms work pretty well for shrinking, but like anything else, they can't accurately recreate what isn't there when enlarging...  Look at a Blu-ray upscaled to 4K, then look at a native 4K UHD Blu-ray; you will see a difference, especially in the fine details.  A 1080 render upscaled to 4K is not a good solution for what I'm trying to accomplish...

  • Can you break it into smaller parts? Does everything in the scene need to interact with everything else?
    ----
    I was doing something that was taking ages (by my standards), so I just did a bunch of pieces and assembled them in Photoshop.

     

  • wolf359 Posts: 3,834

    So, this is the first time this has happened to me... If it wasn't clear, this isn't my first time rendering in 4K; this is a normal part of my workflow...

    As you have rendered scenes at this resolution on the exact same hardware in less time in the past, the only variable here is the scene complexity itself, or some specific element within it, that is choking up your machine.

    You are going to have to decide whether being patient and waiting it out is a viable option, or whether stopping and dissecting the scene for a possible culprit is the best course.

  • Paradigm Posts: 421

    I do exclusively 4k renders... What kind of scene are you rendering that takes over a day to converge? Lots of glass / caustics? Two characters + clothes + fibermesh hair(!) + apartment scene takes like 3 hours on my 2080TI to completely converge. I can get decent quality with the denoiser at like 30 min though. Even an older card shouldn't take a day... That sounds like CPU rendering.

  • JasonWNalley Posts: 122
    Paradigm said:

    I do exclusively 4k renders... What kind of scene are you rendering that takes over a day to converge? Lots of glass / caustics? Two characters + clothes + fibermesh hair(!) + apartment scene takes like 3 hours on my 2080TI to completely converge. I can get decent quality with the denoiser at like 30 min though. Even an older card shouldn't take a day... That sounds like CPU rendering.

    That's exactly why I was asking, lol...  I can confirm it's definitely not CPU rendering; I've double- and triple-checked.   It's a single character in a room, on a couch, with an HDRI dome because you can see the sky through the window behind her. The hair pieces are fibermesh eyebrows, Exnem body hair on the lower part of her arms, and a normal non-fibermesh hairpiece...  I'm using a 1070 Ti and a 960...

    Last night, since I had max iterations set to 50k, it stopped once it hit 50k, which gave me the opportunity to look at things.  I noticed the environment intensity was set to 30 (no idea how it got there), which I switched down to 2, and I changed the spectral rendering from CIE 1931 to CIE 1964.  I started the render again, and it's way, way down on render times; in the last 9 hours it's gotten to 5%.  So there's some progress. I am guessing it was the environment intensity that did it... but I have no clue, honestly... we'll just see how it turns out from here...

  • JonnyRay Posts: 1,744

    According to the video I'm linking below on Iray render settings, spectral rendering can definitely change the time it takes to render. And, unless you're trying to color-match something else, it seems like it's only marginally useful; it basically changes the spread of the spectrum in the render, which is something easily done in post work.

  • hobhobuk Posts: 27

    You do see in 4K.
    That video is utter crap... I am sorry... but 4K is resolution.

    A gigantic screen in 4K will show you the difference between 1080 and 4K.

    I found that generally having a 2080 (Nvidia) made 4K renders just as fast as 1080 in some cases.  But you are literally rendering at 4 times the size, so it's expected to get bigger.
    Maybe try 2K; it still looks great on a 4K screen, although not quite as big.

  • JasonWNalley Posts: 122
    hobhobuk said:

    You do see in 4K.
    That video is utter crap... I am sorry... but 4K is resolution.

    A gigantic screen in 4K will show you the difference between 1080 and 4K.

    I found that generally having a 2080 (Nvidia) made 4K renders just as fast as 1080 in some cases.  But you are literally rendering at 4 times the size, so it's expected to get bigger.
    Maybe try 2K; it still looks great on a 4K screen, although not quite as big.

    Yeah, the thing is, we see in analog, which is beyond 4K or 8K or 16K...  The argument (one which I don't subscribe to) being made by the other guy is that your brain gets rid of details at a given point, which I think depends more on someone's focus and viewing distance than anything else.  I can instantly tell the difference between a 1080p Blu-ray and a 4K UHD Blu-ray on a screen all the way down to 27" when sitting at my computer; below that (on a phone screen, for instance), I admittedly have a difficult time discerning which is which.

  • JasonWNalley Posts: 122
    JonnyRay said:

    According to the video I'm linking below on Iray render settings, spectral rendering can definitely change the time it takes to render. And, unless you're trying to color-match something else, it seems like it's only marginally useful; it basically changes the spread of the spectrum in the render, which is something easily done in post work.

    Yes, spectral rendering on or off can definitely increase or decrease render time dramatically; however, between the two profiles I wouldn't expect the difference to be 1 day vs. 30 days...  I could be mistaken, but it just seems logical...

  • ebergerly Posts: 3,255

    For those who are convinced that the guy in the "You Don't See in 4k" video is an idiot, or the video is "utter crap", or "we see in Analog, which is beyond 4K or 8K or 16K", you might want to study the science of what's called "visual acuity". As he touches on, it's very complex, as is our ability to perceive things. I suspect most who have studied the eye might consider the video fairly well done. 

    You also might want to check out some of the info on how close someone needs to sit in front of a big TV monitor to actually discern the difference between different resolutions. For example, at best, with a 65" monitor at 4K, if you sit further away than around 4 feet even someone with great vision can't distinguish the difference between 4K and HD pixels. And if you're like me with crummy vision the only time I'd benefit from 4K is from a big computer screen that's maybe a foot away.  
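
    Those distance figures line up with the usual one-arcminute visual-acuity rule of thumb. A minimal sketch of that calculation (the 1-arcminute threshold and the 16:9 geometry are assumptions; real perception is more complicated, as noted above):

    import math

    def max_resolving_distance_ft(diagonal_in, horizontal_px, arcmin=1.0):
        """Distance beyond which one pixel subtends less than `arcmin` arcminutes."""
        width_in = diagonal_in * 16 / math.hypot(16, 9)  # width of a 16:9 screen
        pixel_pitch_in = width_in / horizontal_px
        distance_in = pixel_pitch_in / math.tan(math.radians(arcmin / 60))
        return distance_in / 12

    print(max_resolving_distance_ft(65, 3840))  # ~4.2 ft for 4K on a 65" screen
    print(max_resolving_distance_ft(65, 1920))  # ~8.5 ft for 1080p on the same screen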

    Of course, if anyone has scientific data that disputes any part of the video, I'd be interested to see it.

     

  • wolf359 Posts: 3,834
    ebergerly said:

    For those who are convinced that the guy in the "You Don't See in 4k" video is an idiot, or the video is "utter crap", or "we see in Analog, which is beyond 4K or 8K or 16K", you might want to study the science of what's called "visual acuity". As he touches on, it's very complex, as is our ability to perceive things. I suspect most who have studied the eye might consider the video fairly well done. 

    You also might want to check out some of the info on how close someone needs to sit in front of a big TV monitor to actually discern the difference between different resolutions. For example, at best, with a 65" monitor at 4K, if you sit further away than around 4 feet even someone with great vision can't distinguish the difference between 4K and HD pixels. And if you're like me with crummy vision the only time I'd benefit from 4K is from a big computer screen that's maybe a foot away.  

    Of course, if anyone has scientific data that disputes any part of the video, I'd be interested to see it.

     

    This^
    For some bizarre reason (as he stated about the Reddit community), this is an emotional issue for people (very nearly like attacking or debunking someone's religious beliefs).

    The main point is to get people to consider whether the data overhead of that resolution is justified within most visual content delivery infrastructures, and for most general usage at this time... In most cases, no.

    977 gigabytes for 90 minutes of true 4K video??

    The last five years of my 19-year career in the printing industry were in outdoor large-format printing, so I know how viewing distance works.

    The sad thing is that Daz PAs are putting 4K textures on every part of a vehicle prop (including the undercarriage), not realizing that 4K textures rendered in 4K are wasted on anyone not using a 4K-resolution viewing device.

    I personally knew a person who only watched DVDs of old 1970s sitcoms (Fred Sanford, Good Times, The Jeffersons, etc.).

    He bought a 4K TV and was amazed at how much "better" that old interlaced content was.

    Confirmation bias at its worst.

    Like the guy said, 4K is already here to stay... relax, that 4K TV you bought may actually get to display some true 4K content from Hollywood... someday.

    But why burden yourself and your hardware with that resolution from a brute-force path tracer, unless you are getting $$paid$$ to do so by someone who is demanding it?

  • JasonWNalley Posts: 122

    For those who are convinced that the guy in the "You Don't See in 4k" video is an idiot, or the video is "utter crap", or "we see in Analog, which is beyond 4K or 8K or 16K", you might want to study the science of what's called "visual acuity". As he touches on, it's very complex, as is our ability to perceive things. I suspect most who have studied the eye might consider the video fairly well done. 

    You also might want to check out some of the info on how close someone needs to sit in front of a big TV monitor to actually discern the difference between different resolutions. For example, at best, with a 65" monitor at 4K, if you sit further away than around 4 feet even someone with great vision can't distinguish the difference between 4K and HD pixels. And if you're like me with crummy vision the only time I'd benefit from 4K is from a big computer screen that's maybe a foot away.  

    Of course, if anyone has scientific data that disputes any part of the video, I'd be interested to see it.

    I am aware of visual acuity; however, it differs between people, mostly because eyesight varies greatly between people, and to say that all people have the same perception and "don't see in 4K" is a garbage statement. Our eyes don't see in pixels, and 4K is a term used to define how many pixels something has; we see analog, and it's greater than 4K, 8K, 16K, etc., as I stated previously.  What your brain is able to resolve, especially in a fraction of a second, is another matter entirely, but it's still not done in pixels.  This is especially obvious with letters and numbers in high-contrast situations.  While skin tones and organic textures can't be resolved as well, when it's something inorganic that has a clear pattern to it, especially things that are digital in nature to begin with, you are definitely able to tell the difference, and for some people, myself included, organic textures can be discerned between the two as well.

    I don't need science to explain to me what I can actually see with my own two eyes; whether you believe me or not is entirely up to you.  All I can really say is that when I watch the same news broadcast on a 55" 1080p TV and a 55" 4K TV, even from 10 feet back I can see a clear difference in the way text is rendered on the screen, especially marquee text, and most of the time I can discern different elements of skin textures, rock textures, etc.  One could argue it's a difference between the TVs, but the same thing happens when we watch movies on the 4K TV...  I can be sitting 10 feet away and say, "Which disc did you put in?  Something looks off," and sure enough, someone accidentally put in the Blu-ray disc instead of the UHD disc. It's happened more than once, and it's usually not an HDR cue; it's typically the FBI warning screen or the logo screens that give it away.

    This^
    For some bizarre reason (as he stated about the Reddit community), this is an emotional issue for people (very nearly like attacking or debunking someone's religious beliefs).

    The main point is to get people to consider whether the data overhead of that resolution is justified within most visual content delivery infrastructures, and for most general usage at this time... In most cases, no.

    977 gigabytes for 90 minutes of true 4K video??

    The last five years of my 19-year career in the printing industry were in outdoor large-format printing, so I know how viewing distance works.

    The sad thing is that Daz PAs are putting 4K textures on every part of a vehicle prop (including the undercarriage), not realizing that 4K textures rendered in 4K are wasted on anyone not using a 4K-resolution viewing device.

    I personally knew a person who only watched DVDs of old 1970s sitcoms (Fred Sanford, Good Times, The Jeffersons, etc.).

    He bought a 4K TV and was amazed at how much "better" that old interlaced content was.

    Confirmation bias at its worst.

    Like the guy said, 4K is already here to stay... relax, that 4K TV you bought may actually get to display some true 4K content from Hollywood... someday.

    But why burden yourself and your hardware with that resolution from a brute-force path tracer, unless you are getting $$paid$$ to do so by someone who is demanding it?

    Well, data overhead aside, a viewable difference for ME is a viewable difference, and worth the extra time.  But one reason the Daz PAs are doing what they do is because (a) there's a demand for 4K content, and (b) working from a higher resolution and then resolving down to a lower one yields better results.  Have you ever seen those "Mastered in 4K" Blu-rays?  The madness, or rather the science, behind them is that if you create in, and then work from, higher-resolution content in post, and then shrink it, you're left with more of the information and detail than if you had worked entirely in your needed/stated resolution. Gradients become smoother, contrasty areas stay crisper, etc.  While that's not exactly what I am doing here (the content I am creating is for viewing on a 4K screen), it may well be a deciding factor in the PAs' decision to include 4K textures, despite the size of the files...  
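
    As a rough illustration of why that render-big-then-shrink approach helps (an idealized sketch with synthetic noise, not an actual Iray render): averaging 2x2 blocks of pixels during the downscale cuts random per-pixel noise roughly in half.

    import numpy as np

    rng = np.random.default_rng(0)
    # Synthetic "noisy render": a flat 0.5 value plus per-pixel noise (a small crop for speed;
    # the principle is the same when resolving a full 8K render down to 4K).
    noisy = 0.5 + rng.normal(0.0, 0.1, size=(1080, 1920))

    # Downsample 2x in each direction: each output pixel is the mean of a 2x2 block.
    downsampled = noisy.reshape(540, 2, 960, 2).mean(axis=(1, 3))

    print(noisy.std(), downsampled.std())  # the noise std drops by roughly half (sqrt of 4)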

    Also, in reference to the guy who watches Sanford and Son from a DVD source: it's POSSIBLE that he perceived a difference due to the upscaler being used by the television, and the color palette his new TV uses compared to his old TV may also have had a hand in it. There may have been, to him, a perceivable difference in the content, since he's familiar with it; I wouldn't be able to say one way or the other without watching his old TV and then his new one.  I am not one to simply dismiss someone's personal, first-hand experiences with the content they consume regularly.  On the surface it does sound dubious, but if you think about it logically for longer than a few seconds, you can certainly start to piece together ways in which he may actually be telling the truth, and not just letting his "confirmation bias" show...

  • ebergerly Posts: 3,255

    Our eyes don't see in pixels, and 4K is a term used to define how many pixels something has; we see analog, and it's greater than 4K, 8K, 16K, etc., as I stated previously.  What your brain is able to resolve, especially in a fraction of a second, is another matter entirely, but it's still not done in pixels.

    Again, it's up to you what you choose to believe, but I think you're misunderstanding the science of how the eye works and its limitations. Statements like "our eyes see in analog, not pixels, and it's greater than 16K" reflect a basic misunderstanding of the issues.

    And if something seems to work for you personally, that doesn't mean it's garbage for the rest of the universe.

  • Ati Posts: 9,139

    If I want a print image to fit my entire wall, surely a simple HD resolution won't be enough and my eyes will see the difference.
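
    Rough numbers for that case (the 3 m wall width is just an assumed example): spread across a wall, even 4K leaves very few pixels per inch, which is why source resolution and viewing distance both matter so much for large prints.

    # Pixels per inch when an image is stretched across an assumed 3 m (~118 in) wide wall.
    wall_width_in = 300 / 2.54  # 3 m in inches (assumption for illustration)
    for horizontal_px in (1920, 3840):
        print(horizontal_px, round(horizontal_px / wall_width_in, 1), "ppi")
    # ~16 ppi for HD vs. ~33 ppi for 4K -- both far below typical print resolutions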

  • wolf359 Posts: 3,834

    "And if something seems to work for you personally, doesn't mean that it's garbage for the rest of the universe. "

    Nor does it mean that a person who provides facts based on medically sound human physiology is an "idiot" talking "utter crap" for potentially invalidating one's personal, subjective, and emotional choices.


    If I want a print image to fit my entire wall, surely a simple HD resolution won't be enough and my eyes will see the difference.

     

    Unless you live in a sprawling mansion or castle, that is, because you will likely be within one meter of the walls of your house.

    Typically, wall murals have to be "taken in" at a distance to appreciate their scope, narrative, etc.
    The greater that distance, the less the resolution will matter to the human eye.

  • ebergerly Posts: 3,255

    As far as the OP's original rendering problem....

    How can you be 80% converged at 5,000 iterations with the 1080 setting, but only 2% converged at 28,000 iterations with the 4k setting?? Sounds like you have two completely different render settings for the two renders. Unless I'm missing something...

    As an example, I did a simple scene that took 17.5 minutes at 1080, and 66 minutes at 4k (3.8 times as long), and both ended at 5,000 iterations. Did the 4k scene somehow get some goofy render settings?

  • JasonWNalley Posts: 122
    ebergerly said:

    As far as the OP's original rendering problem....

    How can you be 80% converged at 5,000 iterations with the 1080 setting, but only 2% converged at 28,000 iterations with the 4k setting?? Sounds like you have two completely different render settings for the two renders. Unless I'm missing something...

    As an example, I did a simple scene that took 17.5 minutes at 1080, and 66 minutes at 4k (3.8 times as long), and both ended at 5,000 iterations. Did the 4k scene somehow get some goofy render settings?

    That is precisely why I came here asking...  There's no way it should have increased the render times that much.  The only render setting that I could see that may have gotten changed, possibly by a fat finger or a misclick, is the environment intensity, because I have never put environment intensity at 30, but that's where it was when I looked...  And while going through it, I realized it was on CIE 1931 instead of 1964, so I switched that. The only other thing I did was click reset on all of the tone mapping settings, despite them already being at their default values (with the exception of exposure, which was set at 11; that did change the shutter speed on its own, but it was the same value present in the 1080 render I had attempted earlier)... Since then, I started the render again, and we're at 80%, 16,000 iterations, in only 19 hours... So... that to me seems much more like what it "should" have been... Even though the number of iterations has gone up dramatically, time-wise it's more along the lines of what I expected to see...  I am unsure why it did what it did. I am glad it's "fixed" now, but I don't think environment intensity being at 30 should have changed things by that much... so who knows?

  • ebergerly Posts: 3,255

    What is your Max Samples setting (Render Settings > Progressive Rendering)? That will determine how many iterations will happen (unless the Max Time or Convergence goals are reached first). It sounds like you have some crazy high settings. By default, I think Max Samples/Iterations is at 5,000, which IMO is crazy high in the first place.

  • JasonWNalley Posts: 122
    ebergerly said:

    What is your Max Samples setting (Render Settings > Progressive Rendering)? That will determine how many iterations will happen (unless the Max Time or Convergence goals are reached first). It sounds like you have some crazy high settings. By default, I think Max Samples/Iterations is at 5,000, which IMO is crazy high in the first place.

    I have both time and iterations set to maximum without lock; it's something like 14 million samples and 14 million seconds, I think... because I would rather it converge at 100% than end early... But that did not get changed between 1080 and 4K.

  • ebergerly Posts: 3,255

    Well that explains it...

    14 million samples per pixel. Wow.

    My only suggestion is that you do some investigation into what exactly "samples" means, and I think you'll reconsider such a high number. Or perhaps see for yourself... change the Max Samples to around 100 (rather than 14 million) and see if you can discern any difference under normal viewing.

     

  • JasonWNalley Posts: 122
    ebergerly said:

    Well that explains it...

    14 million samples per pixel. Wow.

    My only suggestion is that you do some investigation into what exactly "samples" means, and I think you'll reconsider such a high number. Or perhaps see for yourself... change the Max Samples to around 100 (rather than 14 million) and see if you can discern any difference under normal viewing.

     

    It took 30,000 samples/iterations just to get rid of the RGB fireflies in the hair in the scene (the model's hair is black, if that makes a difference)... so setting it to 100 would have yielded terrible results...  The way I understand the render settings to work is that by setting the max converged ratio to 100%, the max samples to maximum, and the max time in seconds to maximum, it will continue rendering until it hits the max of one of those two settings, or 100% convergence, whichever comes first.  It doesn't mean it will do 14 million samples per pixel (by the way, I checked, and it's actually 1.4 billion) unless it actually requires that many to get to 100% convergence...  Is that wrong?
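
    A loose sketch of that stopping rule as described (the names and numbers are simplifications for illustration, treating one iteration as one "sample"; this is not Iray's actual code):

    import time

    # Stop when ANY limit is reached: the convergence target, the max samples, or the max time.
    def should_stop(converged_ratio, samples, start_time,
                    target_convergence=1.00,    # 100% converged
                    max_samples=1_400_000_000,  # effectively "render until converged"
                    max_seconds=1_400_000_000):
        return (converged_ratio >= target_convergence
                or samples >= max_samples
                or time.time() - start_time >= max_seconds)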

  • Richard Haseltine Posts: 102,334
    ebergerly said:

    Well that explains it...

    14 million samples per pixel. Wow.

    My only suggestion is that you do some investigation into what exactly "samples" means, and I think you'll reconsider such a high number. Or perhaps see for yourself... change the Max Samples to around 100 (rather than 14 million) and see if you can discern any difference under normal viewing.

    If you mean the settings I assume you do, then it measures the maximum number of samples allowed, not a target/minimum number of samples or the number of samples per pixel.

  • ebergerly Posts: 3,255

    Yes, the OP has effectively told the renderer "I want you to achieve the highest convergence you can, and I don't care how long (or how many samples) it takes", which is why it seems hard to complain when the renderer takes forever to render.

    So you're then relying on the internal algorithm, which determines the % improvement between samples/iterations, to decide when enough is enough. You're basically at the mercy of that algorithm and the complexities of your scene, rather than deciding for yourself what is acceptable. So maybe this particular scene had some difficult sample results that were very tough to converge to such a high degree.

    Which is why, I suppose, they provide the user with the option of deciding how long is enough, and how much convergence is enough, and how many samples are enough. Personally, if it was that important to me (and the defaults weren't enough), I'd do some sample renders to see where to set the limits. And maybe do some post production stuff to clear up any challenging areas, rather than wait for days for a render to finish. 

    BTW, when talking about a high-resolution 4K image, it seems that a high number of samples is far less necessary than with a lower resolution. Basically, the samples are additional rays that get sent through each pixel to get a better estimate of the pixel color, but as the pixels get smaller, you'd assume that the variation in colors per pixel, and therefore the need for many samples, would drop.

  • JasonWNalley Posts: 122
    ebergerly said:

    Yes, the OP has effectively told the renderer "I want you to achieve the highest convergence you can, and I don't care how long (or how many samples) it takes", which is why it seems hard to complain when the renderer takes forever to render.

    So you're then relying on the internal algorithm, which determines the % improvement between samples/iterations, to decide when enough is enough. You're basically at the mercy of that algorithm and the complexities of your scene, rather than deciding for yourself what is acceptable. So maybe this particular scene had some difficult sample results that were very tough to converge to such a high degree.

    Which is why, I suppose, they provide the user with the option of deciding how long is enough, and how much convergence is enough, and how many samples are enough. Personally, if it was that important to me (and the defaults weren't enough), I'd do some sample renders to see where to set the limits. And maybe do some post production stuff to clear up any challenging areas, rather than wait for days for a render to finish.

    BTW, when talking about a high-resolution 4K image, it seems that a high number of samples is far less necessary than with a lower resolution. Basically, the samples are additional rays that get sent through each pixel to get a better estimate of the pixel color, but as the pixels get smaller, you'd assume that the variation in colors per pixel, and therefore the need for many samples, would drop.

     

    I don't understand the last couple of sentences; pixels don't change in size, so I assume you mean as the resolution gets smaller?  Also, the first sentence seems to indicate 4K should require fewer samples than 1080?

    Also, the original "complaint" was not that it was taking longer, but that it was taking longer by an unforeseen and rather exorbitant amount, due to something unknown.  4K renders will always take longer; I know this, as, again, this is a normal part of my workflow.  I always render a 1080 version to around 80% just to make sure all the lighting and interactions in the scene are acceptable and I don't wish to change anything.  Once I approve the 1080 render, I will change the resolution up to 4K and render out the 4K image.  I have always done 100% convergence, and I have always had the max samples and max time settings at a point that allows the image to hit 100% convergence.  None of that is new.  What IS new is the fact that it was taking the 4K image 28,000 iterations to get to 2% converged when it took only 5,000 to get to 80% converged on the 1080 image.  There has never been that kind of disparity in my renders before, and so I came here asking if anyone could articulate what COULD be happening.

    It was not a complaint that it was taking too long, nor that it was taking longer; it was a question, because it was something I had never previously run into, and if there is a way to avoid it, I would very much like to know, as 28,000 samples for 2% seemed a bit much.   Now that it's back down somewhere in the realm of needing only 50,000-75,000 samples to be 100% converged, things are falling more into the realm of normalcy for me.

  • ebergerly Posts: 3,255

    The problem is that it could be any of 100 things, and nobody can be sure of the cause except for you after you've done some testing. 

    Heck, maybe your GPU was thermal throttling during that particular render for some reason. Or maybe there was a problem with that scene that only you would be able to diagnose. Or maybe some default render settings were loaded with the 1080 render scene that weren't with the 4K scene. Or maybe it's a 4.11 or driver thing. Or maybe with such a big scene the GPU was doing some memory caching or something that slowed it down. Or maybe the GTX 960 (2GB VRAM??) dropped out. And so on...

    But removing all limits on samples, convergence, and render time introduces more unknowns and complexities that make it doubly difficult to diagnose, since, as I said, you're basically leaving it up to a convergence algorithm to determine how long to continue rendering.

    All I can suggest, if you really want to know why the 4K rendered so much longer than the 1080, is to do some true apples-to-apples tests with absolutely identical settings for the exact same scene, and give specific (not unlimited) render settings (convergence, samples, render time, etc.), and monitor your GPUs, and anything else you can think of.  

     

  • JasonWNalley Posts: 122
    ebergerly said:

    The problem is that it could be any of 100 things, and nobody can be sure of the cause except for you after you've done some testing. 

    Heck, maybe your GPU was thermal throttling during that particular render for some reason. Or maybe there was a problem with that scene that only you would be able to diagnose. Or maybe some default render settings were loaded with the 1080 render scene that weren't with the 4K scene. Or maybe it's a 4.11 or driver thing. Or maybe with such a big scene the GPU was doing some memory caching or something that slowed it down. Or maybe the GTX 960 (2GB VRAM??) dropped out. And so on...

    But removing all limits on samples, convergence, and render time introduces more unknowns and complexities that make it doubly difficult to diagnose, since, as I said, you're basically leaving it up to a convergence algorithm to determine how long to continue rendering.

    All I can suggest, if you really want to know why the 4K rendered so much longer than the 1080, is to do some true apples-to-apples tests with absolutely identical settings for the exact same scene, and give specific (not unlimited) render settings (convergence, samples, render time, etc.), and monitor your GPUs, and anything else you can think of.  

     

    Maybe we have a breakdown in communication here.  So I'll lay everything out for you.

    First render, done at 1920x1080, 1.4 billion samples, 1.4 billion seconds, convergence ratio set to 100% (5,000 or so iterations = 80% convergence).

    Second render, done at 3840x2160, 1.4 billion samples, 1.4 billion seconds, convergence ratio set to 100% (28,000 or so iterations = 2% convergence).

    Upon killing this render, I noticed that the environment intensity was at 30 and switched it back down to 2 (normally for me it's 1-2, so I don't know how it got to 30).  I also noticed that it was at CIE 1931 instead of my normal CIE 1964, so I changed it to 1964, and I also set the exposure to the default of 13 instead of the 11 I had chosen manually before rendering out the initial 1080p image.

    Third render, done at 3840x2160 with the above corrections, 1.4 billion samples, 1.4 billion seconds, convergence ratio set to 100% (30,000 or so iterations = 80% convergence).

    GPU throttling would limit the speed at which an iteration is done, not the number of iterations required, right?
    Of the settings I've mentioned changing, should any of them have caused that large of a disparity between the last two renders?

  • ebergerly Posts: 3,255
    ebergerly said:

    The problem is that it could be any of 100 things, and nobody can be sure of the cause except for you after you've done some testing. 

    Heck, maybe your GPU was thermal throttling during that particular render for some reason. Or maybe there was a problem with that scene that only you would be able to diagnose. Or maybe some default render settings were loaded with the 1080 render scene that weren't with the 4K scene. Or maybe it's a 4.11 or driver thing. Or maybe with such a big scene the GPU was doing some memory caching or something that slowed it down. Or maybe the GTX 960 (2GB VRAM??) dropped out. And so on...

    But removing all limits on samples, convergence, and render time introduces more unknowns and complexities that make it doubly difficult to diagnose, since, as I said, you're basically leaving it up to a convergence algorithm to determine how long to continue rendering.

    All I can suggest, if you really want to know why the 4K rendered so much longer than the 1080, is to do some true apples-to-apples tests with absolutely identical settings for the exact same scene, and give specific (not unlimited) render settings (convergence, samples, render time, etc.), and monitor your GPUs, and anything else you can think of.  

     

    JasonWNalley said:

    Maybe we have a breakdown in communication here.  So I'll lay everything out for you.

    And that's my point. Clearly you're NOT laying everything out; you're assuming that there are only two possible culprits, the change in color space and the change in exposure. Since it's clearly not obvious how those would have any effect whatsoever (and I doubt anyone here has done any benchmarking of 1080 vs. 4K render times when changing those values), we're back to my suggestion that you do your own simple tests on your system to see if you can find the culprit. Especially after you mention things like "the environment somehow got to 30 and you're not sure why," it seems far more likely that there's something going on with your system, or with that particular 4K render, that you're not aware of or mentioning, and not considering as a factor, which is probably the true culprit. And none of us know those details.

    Rule #1 of figuring out computer stuff: 

    1. Simplify and isolate

     
