DAZ IRay Render Speeds


Comments

  • kaotkbliss Posts: 2,914

    You might be able to find some on Amazon that ship internationally, and some of the prices are pretty cheap too.

    If you decide to pick up some more RAM, I would also look to see if you can find a cheap 4-core CPU there as well. That will help speed up CPU Iray renders until you can get a good Nvidia card.

  • Novica Posts: 23,905

     

    Did you turn on OptiX acceleration in your render settings?

    Where the heck is it?  

  • Richard Haseltine Posts: 102,464

    Render Settings > Advanced tab

  • maclean Posts: 2,438

    Since Spooky is popping into this thread, I'll repost something I posted in the tech discussion, but have never managed to solve.

    I see this message in the DS 4.8 log file for every render, and I have no idea what this part means - 'Device optimized for interactive usage; performance could be sacrificed'

    Iray INFO - module:category(IRAY:RENDER):   1.0   IRAY   rend info : Rendering...
    Iray INFO - module:category(IRAY:RENDER):   1.3   IRAY   rend info : CPU (7 threads): Scene processed in 0.054s
    Iray INFO - module:category(IRAY:RENDER):   1.3   IRAY   rend info : CPU (7 threads): Allocated 15 MB for frame buffer
    Iray INFO - module:category(IRAY:RENDER):   1.2   IRAY   rend info : CUDA device 0 (GeForce GTX 970): Scene processed in 0.115s
    Iray INFO - module:category(IRAY:RENDER):   1.2   IRAY   rend info : CUDA device 0 (GeForce GTX 970): Allocated 15 MB for frame buffer
    Iray INFO - module:category(IRAY:RENDER):   1.2   IRAY   rend info : CUDA device 0 (GeForce GTX 970): Allocated 804 MB of work space (1024k active samples)
    Iray INFO - module:category(IRAY:RENDER):   1.2   IRAY   rend info : CUDA device 0 (GeForce GTX 970): Device optimized for interactive usage; performance could be sacrificed
    Iray INFO - module:category(IRAY:RENDER):   1.0   IRAY   rend info : Received update to 00001 iterations after 0.667s.

    I have a GTX 970 and in Render Settings> Advanced I have Photoreal> CPU / GeForce GTX 970 both checked. I also have OptiX Prime Acceleration checked, plus CPU / GeForce GTX 970. Unchecking the Interactive options makes no difference to render time or to the log message.

    mac
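
    (As a rough way to check this sort of thing without scrolling the whole log: the short Python sketch below simply filters a DAZ Studio log for the Iray device lines, including the "interactive usage" notice discussed here. The log path is a placeholder to replace with your own log.txt, which DAZ Studio's Help menu can locate for you.)

        from pathlib import Path

        # Placeholder path -- point this at your own DAZ Studio log.txt.
        LOG_PATH = Path(r"C:\path\to\log.txt")

        # Keep only the Iray lines that name a render device or the
        # "interactive usage" notice quoted above.
        KEYWORDS = ("CUDA device", "CPU (", "Device optimized for interactive usage")

        for line in LOG_PATH.read_text(errors="ignore").splitlines():
            if "IRAY" in line and any(key in line for key in KEYWORDS):
                print(line.strip())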

  • DAZ_Spooky Posts: 3,100
    maclean said:

    Since Spooky is popping into this thread, I'll repost something I posted in the tech discussion, but have never managed to solve.

    I see this message in the DS 4.8 log file for every render, and I have no idea what this part means - 'Device optimized for interactive usage; performance could be sacrificed'

    [Iray log excerpt quoted above]

    I have a GTX 970 and in Render Settings> Advanced I have Photoreal> CPU / GeForce GTX 970 both checked. I also have OptiX Prime Acceleration checked, plus CPU / GeForce GTX 970. Unchecking the Interactive options makes no difference to render time or to the log message.

    mac

    Unchecking Interactive where? :) That has nothing to do with the render settings or draw style in DS. 

     

    What that means is that your card is set to do both CUDA computing and graphics (i.e. drawing to your screen). You can turn that off in the NVIDIA Control Panel, though if the card is driving your monitor you won't see anything on your monitor afterwards (as I understand it). I am not sure you can do that at all if the card is your only one, or is the one connected to your monitor. (I wouldn't recommend trying it, as getting it back could be problematic if you can't read the screen. :) )

    On my test machine with Quadro cards (one K2200 connected to the monitors and one K6000 used only for CUDA) I did change the setting on the K6000 to see the difference. I don't believe it was more than a 1% change; it was very difficult to tell there was any difference at all.
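
    (For anyone wanting to check which card is actually driving a display, here is a minimal sketch using the nvidia-smi tool that ships with the Nvidia driver. It assumes nvidia-smi is on your PATH; the query fields used below can be confirmed with nvidia-smi --help-query-gpu.)

        import subprocess

        # Ask the driver which GPUs are attached to / actively driving a display.
        # A card showing "Enabled" is doing double duty (graphics + CUDA compute).
        result = subprocess.run(
            ["nvidia-smi",
             "--query-gpu=index,name,display_mode,display_active",
             "--format=csv,noheader"],
            capture_output=True, text=True, check=True)
        print(result.stdout.strip())
        # Example output (values will differ):
        # 0, GeForce GTX 970, Enabled, Enabled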

  • maclean Posts: 2,438

    Thanks Spooky! An answer at last!

    I meant unchecking Interactive Devices (CPU/GTX970) in the lower OptiX section.

    So it seems like if I try to correct this, I might not be able to see a thing? I think I'll stick with having a monitor that works. LOL.

  • DAZ_Spooky Posts: 3,100
    maclean said:

    Thanks Spooky! An answer at last!

    I meant unchecking Interactive Devices (CPU/GTX970) in the lower OptiX section.

    So, seems like if I try to correct this, I might not be able to see a thing? I think I'll stick with having a monitor that works. LOL.

    Technically it isn't something that needs to be fixed.
  • Given how many people (including myself) are going to have to buy new graphics cards to render in DS now, I think Nvidia should be paying Daz ;-).

  • @spearcarrier and some others: I have read here and there that render times inside rooms are vastly longer than if you use infinite space. Try an Iray scene that is not in a room or boxed in and see if that is your glitch.

  • mjc1016 Posts: 15,001

    Given how many people (including myself) are going to have to buy new graphics cards to render in DS now, I think Nvidia should be paying Daz ;-).

    It is absolutely FALSE that you NEED any sort of video card to render in Studio. Iray will render just fine without being run on an Nvidia card...

  • Fishtales Posts: 6,162

    I have a load of renders all done in Iray without an Nvidia video card. I even use the Aux Viewport in Iray mode to see what the scene will look like as I work on it. I tried it with the full Viewport, but it was like watching an old black-and-white movie as it jerked about. :)

  • Of course it works without a GPU.

    GPU rendering is much faster than CPU rendering, but remember that there are limitations that make GPU rendering useless in many cases.

    1: The scene (including textures!) must fit in GPU memory, otherwise it will not work. Note: two video cards with 4 GB each means you have 4 GB available for rendering, not 8!

    2: If you have only one video card, the UI has to share it with Iray, so there is less memory for rendering (and a slower UI while rendering).

    Each GPU needs its own copy of the scene, so having two video cards does not mean you have double the memory available for rendering.
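
    (To put rough numbers on point 1, here is a back-of-the-envelope sketch; the 4 bytes per pixel and the example map sizes are assumptions for illustration, not figures Iray reports.)

        # Back-of-the-envelope estimate of GPU memory for textures alone.
        def texture_vram_mb(sizes, bytes_per_pixel=4):
            """sizes: iterable of (width, height) tuples, one per texture map."""
            total_bytes = sum(w * h * bytes_per_pixel for w, h in sizes)
            return total_bytes / (1024 * 1024)

        # Hypothetical example: nine 4096x4096 maps (diffuse/bump/specular
        # spread across a figure's surfaces).
        maps = [(4096, 4096)] * 9
        print(f"~{texture_vram_mb(maps):.0f} MB for textures alone")
        # Geometry, the frame buffer and Iray's working space come on top of
        # this, and the whole total must fit on EACH card used for the render.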

     

  • jodycour Posts: 137

    I've just switched back to 3Delight; it works much faster!

     

  • talihawk Posts: 86

    Have any of you found that tree/leaf products cause Iray to slow to a crawl? I've got some old Cyclorama trees that render just fine (of course, they are just planes with transparencies), but an actual 3D tree with transparency-based leaves seems to kill my render times, possibly even hanging the render. I've not messed with the base settings of the default Iray render, I've turned off the dome and ground, and I've used the Iray shaders. I have an Nvidia GTX 680 and 12 GB of RAM.

  • Another way to speed up Iray. It depends on the size of the scene and on the size of the GPU memory.

    Before the scene starts rendering, it is first loaded into the memory of the GPU, which takes 2-4 minutes. You can skip these 2-4 minutes and start rendering immediately: just don't close the first render window. While that render window is open, the scene remains loaded in the memory of the GPU. With the render window open, when you make changes to the scene, the scene is just updated, not re-loaded into GPU memory again. So keep the first render window open all the time, and use a second render window for saving images. This requires at least 4 GB of GPU memory for medium scenes (2-3 figures plus props, clothes, etc.).

    This method also works for the viewport in Iray mode.

  • mjc1016 Posts: 15,001
    talihawk said:

    Have any of you found that tree/leaf products cause Iray to slow to a crawl? I've got some old Cyclorama trees that render just fine (of course, they are just planes with transparencies), but an actual 3D tree with transparency-based leaves seems to kill my render times, possibly even hanging the render. I've not messed with the base settings of the default Iray render, I've turned off the dome and ground, and I've used the Iray shaders. I have an Nvidia GTX 680 and 12 GB of RAM.

    What version of Studio?

     

  • Liquid Rock Posts: 9
    edited September 2015

    All the information here about OptiX is incorrect! In this version of DAZ Studio and its embedded Iray engine, nothing happens or changes if you turn this on! OptiX is a small tool that takes a really big effort to work with, but first of all, read this: https://developer.nvidia.com/gameworks-optix-overview. If you think something renders faster with this option, then try the same render with OptiX turned off and compare.

    XoechZ said:

    SickleYield said:

    Did you turn on OptiX acceleration in your render settings?

     

    No. What is this setting good for?

     

    It turns on the Magic Go Faster button.

    Not really, it's the OptiX very fast ray tracer. It speeds up most renders. I'm not sure why they made it optional.

     

    Post edited by Liquid Rock on
  • stormqq Posts: 76
    edited September 2015

    Iray seems to need a mid to high end card such as a GTX 560+, 660+, 760+, or 960+. If you own anything below that, don't use Iray rendering because it will waste your time. Anything from Nvidia is demanding (more like purposely unoptimized for low-end and old-gen cards), such as GameWorks etc., so you need a good graphics card to use Nvidia technologies.

     

    Post edited by stormqq on
  • AndyGrimm Posts: 910
    edited September 2015

    That's not true... Iray renders physical effects (PBR) faster than 3Delight does by default, even using just the CPU.

    The correct approach is to KNOW which effects you prefer for a final render. If, for example, a simple eye reflection map (a cheat) is enough instead of a real-world light source reflection (or none, depending on the reflection ray's angle to the camera), then 3Delight is faster :-).

    If you aim for realism and not just fast (cheated) texture effects, then you will use Iray. If you are happy with what 3Delight can do with cheated skins and painted reflections, well, it is faster, because the ambient light and shadows are already in the textures.


     

    Post edited by AndyGrimm on
  • mjc1016 Posts: 15,001
    AndyGrimm said:

    If you aim for realism and not just fast (cheated) texture effects, then you will use Iray. If you are happy with what 3Delight can do with cheated skins and painted reflections, well, it is faster, because the ambient light and shadows are already in the textures.
     

    They don't have to be 'baked' in, in 3Delight... it's just that most of the Studio shaders (not presets, but shaders) either don't have what is needed or render very slowly without the 'cheats', so people would rather use the cheats than wait. Modern, physically plausible shaders (no, 3Delight does not do 'real', because it is 'biased') don't need all those old-school cheats... and are much faster. But... since there are next to none of those in Studio...

    But even then, in GPU mode, Iray will be faster...in CPU mode it MAY be faster, but with all the advances and modern shaders, I'm not too sure about that...Studio is only harnessing a fraction of the power of the 3Delight renderer, and that makes direct comparisons even harder.

  • AndyGrimm Posts: 910
    edited September 2015

    @mjc1016 Well, technically we mean the same thing... I could have said it more clearly... Using cheat textures with baked-in ambient and specular, a biased renderer is always faster... But Iray is fast even on the CPU for an unbiased renderer. I do renders in reasonable time using an entry-level Nvidia laptop card with only 96 CUDA cores.

    My "that's not true" was directed at:

     

    gmlsx90 said:

    Iray seems to need a mid to high end card such as a GTX 560+, 660+, 760+, or 960+. If you own anything below that, don't use Iray rendering because it will waste your time. Anything from Nvidia is demanding (more like purposely unoptimized for low-end and old-gen cards), such as GameWorks etc., so you need a good graphics card to use Nvidia technologies.

     

     

    Post edited by AndyGrimm on
  • From all that I've read, my two computers must be freaks or something, because I can render Iray scenes on both of them (though they take 2 hours to 'fully bake') without anything melting or blowing up. My laptop is an old Dell Precision M6400 with a 2.4 GHz Core 2 Duo processor, 8 GB of RAM and an Nvidia Quadro FX 2700M graphics card with 512 MB of memory. My desktop is a Dell Vostro 420 with a 2.4 GHz Core 2 Quad processor, 8 GB of RAM and a Radeon HD 4800 graphics card with 512 MB of memory (which I'm wanting to replace with an EVGA GeForce GT 730 with 2 GB of memory).

    Both systems render Iray scenes very well, as long as I'm willing to wait a few hours for them to finish. Most of my renderings have a couple of figures and props in them, though I usually use image backgrounds. The only problem I've had was the 2-hour time limit setting ending larger renders on the Radeon system before they were complete. There were little unrendered pixels on some of the renderings (looking as if you had superimposed a star chart on the rendering) until I began reading here about the Iray shader settings (which I had neglected to install). Now the Radeon-based desktop system renders very well, though it is obviously only the four cores of the CPU doing the work. Maybe when I buy the 730 it will speed up. The power supply limits me to the GT 730.

    The Precision laptop renders slowly but will eventually finish; the downside is that while it is rendering, you can't do anything else but let it go. Still, for an older laptop, it isn't bad if you are patient enough.

    I want to thank those who posted suggestions to this forum as I learned a lot of tricks today.

  • mjc1016 Posts: 15,001

    Actually, that is not 'slow', ec21davis, for a CPU render (because that is what you are doing); that's about average to slightly better, especially with that hardware. The Quadro card is most likely NOT being used because of its limited memory size, and even if there were more memory on the Radeon, it would never be used either (Iray only runs on Nvidia/CUDA hardware).

  • edited September 2015
    mjc1016 said:

    Actually, that is not 'slow', ec21davis, for a CPU render (because that is what you are doing); that's about average to slightly better, especially with that hardware. The Quadro card is most likely NOT being used because of its limited memory size, and even if there were more memory on the Radeon, it would never be used either (Iray only runs on Nvidia/CUDA hardware).

    I've set up the Quadro card in the Iray settings so that Iray claims it will use both of them, but the card gets rather hot when running some large videos or games and has tripped the temperature shut-down feature on my laptop. The thing is, I like using the laptop for most of my work (which is mostly fantasy illustrations), so I save most of my stuff on a 2 TB portable hard drive and then render the scene with the desktop that has the Radeon card.

     

    I've spent most of the day looking online for a replacement for the Radeon, and it is a real mess when you have an older desktop. I can only select a PCIe 2.0 card, and the power supply is only 350 watts, so no high-end cards. After reviewing cards and reading reviews, I'll likely move away from selecting a GeForce and go with a Quadro K620, which is rated (by gamers) at about twice the speed of my current Radeon. It is PCIe 2.0 and low profile, only uses about 45 watts of power (according to the NVIDIA site), and it can handle DirectX 12 (I run Windows 10 on all my computers). It has only 2 GB of memory, though, so it probably won't help much on renders, but at least it will be an NVIDIA card, which should play well with Iray.
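
    (The power question boils down to simple arithmetic; here is a hedged sketch, where the base system draw and the headroom factor are assumed values rather than measurements, and the TDP figures are the commonly quoted ones for these cards.)

        # Rough power-budget arithmetic: does a card fit under an older PSU?
        PSU_WATTS = 350           # rated output of the Vostro's supply
        SYSTEM_BASE_WATTS = 200   # assumed CPU/drives/board/fans draw under load
        HEADROOM = 0.80           # keep ~20% margin on an aging supply

        def card_fits(card_tdp_watts):
            return SYSTEM_BASE_WATTS + card_tdp_watts <= PSU_WATTS * HEADROOM

        for name, tdp in [("Quadro K620", 45), ("Quadro K1200", 45), ("GTX 780", 250)]:
            print(f"{name}: {'fits' if card_fits(tdp) else 'too much for this PSU'}")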

    Post edited by ec21davis_e7f4b03c52 on
  • You can tell Iray to use the card, but if the scene exceeds its memory the card will not be used or will drop out partway through the render.

  • fastbike1 Posts: 4,078

    The K1200 uses the same power, has 4 GB, and almost twice the CUDA cores. Twice the money, but the extra VRAM should be handy.

  • felis Posts: 4,633

    Maybe a stupid question, but I haven't been able to figure it out. I have checked the log, but no info there.

    How do you determine the size of the scene, i.e. if it can fit in memory?
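
    (One practical check, assuming an Nvidia card and the nvidia-smi utility installed with the driver: watch the card's memory use as the render starts. If memory.used climbs close to memory.total, or the GPU drops out of the Iray log, the scene most likely doesn't fit. A minimal sketch:)

        import subprocess
        import time

        QUERY = ["nvidia-smi",
                 "--query-gpu=name,memory.used,memory.total",
                 "--format=csv,noheader"]

        def poll(duration_s=120, interval_s=5):
            """Print GPU memory use every few seconds while a render starts."""
            for _ in range(duration_s // interval_s):
                out = subprocess.run(QUERY, capture_output=True, text=True)
                print(out.stdout.strip())  # e.g. "GeForce GTX 970, 2650 MiB, 4096 MiB"
                time.sleep(interval_s)

        if __name__ == "__main__":
            poll()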

  • fastbike1 said:

    K1200 uses same power, has 4GB, almost twice the cuda cores. Twice the money, but the extra vram should be handy.

    Yeah, the K1200 would be awesome, but I don't have $350 to spend. I looked at them on eBay and they are not much cheaper used.

  • AndyGrimm Posts: 910
    edited September 2015

     

    mjc1016 said:

    Actually, that is not 'slow', ec21davis, for a CPU render (because that is what you are doing); that's about average to slightly better, especially with that hardware. The Quadro card is most likely NOT being used because of its limited memory size, and even if there were more memory on the Radeon, it would never be used either (Iray only runs on Nvidia/CUDA hardware).

    I've set up the Quadro card in the Iray settings so that Iray claims it will use both of them, but the card gets rather hot when running some large videos or games and has tripped the temperature shut-down feature on my laptop. The thing is, I like using the laptop for most of my work (which is mostly fantasy illustrations), so I save most of my stuff on a 2 TB portable hard drive and then render the scene with the desktop that has the Radeon card.

     

    I've spent most of the day looking online for a replacement for the Radeon, and it is a real mess when you have an older desktop. I can only select a PCIe 2.0 card, and the power supply is only 350 watts, so no high-end cards. After reviewing cards and reading reviews, I'll likely move away from selecting a GeForce and go with a Quadro K620, which is rated (by gamers) at about twice the speed of my current Radeon. It is PCIe 2.0 and low profile, only uses about 45 watts of power (according to the NVIDIA site), and it can handle DirectX 12 (I run Windows 10 on all my computers). It has only 2 GB of memory, though, so it probably won't help much on renders, but at least it will be an NVIDIA card, which should play well with Iray.

    PCIe 3.0 cards are backward compatible with PCIe 2.0 slots... only your scenes will load a bit slower (the render itself then runs 100% on the card at normal speed). Replace your power supply as well and you should be able to use any graphics card you like.

    I am in the exact same situation: working mostly on my laptop, but I have an older, unused desktop with a 350-watt power supply and PCIe 2.0... I don't want to invest a lot right now because I'm waiting until LAN GPU rendering is ready and saving my money for an external GPU host case.
    For the moment I will grab a second-hand GTX 780 (Ti)... and buy a stronger power supply... With patience I can do this for less than 300 USD, and my old desktop will be fine as a render machine for smaller scenes. The 780 (and/or 780 Ti) has 3 GB of RAM but is FAST. (I hope to catch a 6 GB 780 Ti, but they are rare.)

    Post edited by AndyGrimm on
  • I've been watching eBay for a few days for Quadro cards and found an F620 for $100, but I had to transfer funds and my bank was closed today, so someone else got it. Rather frustrating, but I'm going to keep looking. I'd love to get a 4 GB card, but the prices of those are a wee bit too high for me, and you never know what you're getting when you buy a used card. A Quadro K1200 would be ideal with 4 GB. As far as swapping out power supplies, I've thought of that too and I might have to try that. Dell is notorious for underrating their power supply ratings, but most of the good GeForce cards require at least 400 watts and many need 500. That gets a bit expensive after you add it all up. Maybe I ought to just be satisfied with what I've got. Most of the computer stuff I own was bought used, and 3D graphics can be a rich man's hobby if you want the very best stuff. I guess a slow rendering speed is not as bad as not being able to render at all.

    I'm mainly using Studio for fun and to render characters and scenes from some of my books (I'm a fantasy writer), so I can't really justify spending $400 or more on a video card and power supply.
