GPU to CPU (it's not going away anytime soon)


Comments

  • Taoz said:
    248sales said:
    Padone said:
    248sales said:

    The CPU can crank a decent noiseless 10,000 x 5,625 rendering in 2 hours with 10 characters and props. Is there any GPU on the market that can do that?!

    I agree that for still pictures the CPU may be an option, especially if you are fine with rendering overnight. Personally I can't wait more than a few minutes for a render, so the GPU with the denoiser is the only way. Also, exporting to a real-time engine is a very good option for animations. Then we have Iray Interactive, which may be good for limited-quality animation too. And I do agree that the VRAM usage of most Daz assets is too high, mostly because of 4K textures; that's why Iray is unusable without the Scene Optimizer add-on, and why having some optimization add-ons as standard inside Daz Studio would be good.

    Back on topic, just for the sake of completeness I'd add that my system seems to work fine, with none of the issues described in this discussion, apart from the extra VRAM for OptiX, which I used to turn on anyway, so no difference for me. Specs in the signature.

    I tried the Scene Optimizer and that in itself takes a very long time to run as well! I wish Nvidia offered an affordable graphics card with 48 or 64 GB of RAM. I am sure many people want to render more than one character in a scene, and just two characters appear to max out the 11 GB of my 1080 Ti, so I don't know if even 64 GB would be enough. Sure, I can hide some characters and then render them later, but if all the characters will be casting shadows on each other you can't do that, because the missing shadows would make the image look fake.

    I'm not sure how affordable a card like that could be - the memory used in GPUs is pricier than standard memory, as far as I know, and 64 GB of regular memory for this machine was somewhere around £350 on its own, as I recall.

    AMD has a 32 GB GDDR5 card that costs less than most RTX 2080 Ti cards, so NVidia could probably have offered a 32 GB GTX for a similar price if they had wanted to:

    https://www.amazon.co.uk/AMD-100-506048-Radeon-GDDR5-Video/dp/B072HTFZM4/ref=sr_1_4?keywords=amd+radeon+32gb&qid=1580728797&sr=8-4

    But still not cheap, which was my point - that the saving from a lesser GPU chip might not lead to a low enough price to make the low-end chip/high-memory boards really affordable.

  • Taoz Posts: 9,940
    Taoz said:
    248sales said:
    Padone said:
    248sales said:

    The CPU can crank a decent noiseless 10,000 x 5,625 rendering in 2 hours with 10 characters and props. Is there any GPU on the market that can do that?!

    I agree that for still pictures the CPU may be an option, especially if you are fine with rendering overnight. Personally I can't wait more than a few minutes for a render, so the GPU with the denoiser is the only way. Also, exporting to a real-time engine is a very good option for animations. Then we have Iray Interactive, which may be good for limited-quality animation too. And I do agree that the VRAM usage of most Daz assets is too high, mostly because of 4K textures; that's why Iray is unusable without the Scene Optimizer add-on, and why having some optimization add-ons as standard inside Daz Studio would be good.

    Back on topic, just for the sake of completeness I'd add that my system seems to work fine, with none of the issues described in this discussion, apart from the extra VRAM for OptiX, which I used to turn on anyway, so no difference for me. Specs in the signature.

    I tried the Scene Optimizer and that in itself takes a very long time to run as well! I wish Nvidia offered an affordable graphics card with 48 or 64 GB of RAM. I am sure many people want to render more than one character in a scene, and just two characters appear to max out the 11 GB of my 1080 Ti, so I don't know if even 64 GB would be enough. Sure, I can hide some characters and then render them later, but if all the characters will be casting shadows on each other you can't do that, because the missing shadows would make the image look fake.

    I'm not sure how affordable a card like that could be - the memory used in GPUs is pricier than standard memory, as far as I know, and 64 GB of regular memory for this machine was somewhere around £350 on its own, as I recall.

    AMD has a 32 GB GDDR5 card that costs less than most RTX 2080 Ti cards, so NVidia could probably have offered a 32 GB GTX for a similar price if they had wanted to:

    https://www.amazon.co.uk/AMD-100-506048-Radeon-GDDR5-Video/dp/B072HTFZM4/ref=sr_1_4?keywords=amd+radeon+32gb&qid=1580728797&sr=8-4

    But still not cheap, which was my point - that the saving from a lesser GPU chip might not lead to a low enough price to make the low-end chip/high-memory boards really affordable.

    Well, my point was that it doesn't seem to be VRAM prices that keep NVidia from adding more VRAM to their cards. The Titan X, for example, has 12 GB GDDR5 and costs almost £300 more than the AMD Radeon card with 32 GB GDDR5. Apart from 20 GB more RAM, the AMD card also seems to have better specs than the NVidia card in most contexts, according to this page:

    https://www.game-debate.com/gpu/index.php?gid=3930&gid2=2492&compare=radeon-pro-duo-32gb-vs-geforce-gtx-titan-x

  • fdsphoto Posts: 62
    edited February 2020

    So, diving a little deeper into this. The RTX 2080 Ti has done a great job of reducing my render times, and I'm happy with the results.

    However, there are a few things that I've observed along the way.

    #1 While my render times have decreased dramatically from 8+ hours to ~45 minutes, I believe the system was operating my graphics card's bus at x8 bandwidth (meaning half its capacity) rather than its full x16 mode, as I had other devices (Wi-Fi & SATA controller) also installed on the PCIe bus. On top of that, as I already knew would happen, the Intel i7 3770 quad-core (3rd gen) was bottlenecking performance, along with the DDR3 1666 MHz RAM.

    If you're experiencing performance issues, please check to make sure other devices are not causing your PCIe lanes to switch modes like mine were.
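
If it helps anyone check this, the current link width can be read straight from the NVIDIA driver without rebooting into the BIOS. A small sketch (my own illustration, not part of Daz Studio; it assumes `nvidia-smi` is on the PATH, which it normally is once the driver is installed):

```python
import subprocess

def parse_pcie_width(csv_line):
    """Parse a 'current, max' CSV line such as '8, 16' into two ints."""
    current, maximum = (int(field) for field in csv_line.strip().split(","))
    return current, maximum

def query_pcie_width():
    """Ask the driver for the GPU's current and maximum PCIe link width."""
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=pcie.link.width.current,pcie.link.width.max",
         "--format=csv,noheader"],
        text=True)
    return parse_pcie_width(out)

# On the affected machine:
#   current, maximum = query_pcie_width()
#   if current < maximum:
#       print(f"GPU at x{current} of x{maximum} - another device may be taking lanes")
```

Note that some boards also downshift the link at idle to save power, so take the reading while a render is actually running.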

    Tomorrow, I will be experimenting with a new rig:

    AMD Ryzen 3900X, 12 cores / 24 threads, PCIe 4.0 support (dedicated full-time x16 PCIe interface)

    DDR4 3200 MHz - 32GB

    MSI AM4 X570, PCIe 4.0 - dedicated full-time x16 PCIe slot, plus additional x16 / x8 / x4 and x1 PCIe slots, which will be used for my Wi-Fi & SATA controller.

    If anyone is interested in the results, let me know, as (with the exception of the 2080 ti) this build is pretty cheap, but the performance should be outstanding.

    Post edited by fdsphoto on
  • fdsphoto Posts: 62
    edited February 2020

    So, Nvidia 442.19 Studio Driver is out. Getting anxiety over it already. Not looking forward to having to install it if it f's everything up. lol

    Will update on the results.

    Post edited by fdsphoto on
  • Taoz Posts: 9,940
    fdsphoto said:

    So, Nvidia 442.19 Studio Driver is out. Getting anxiety over it already. Not looking forward to having to install it if it f's everything up. lol

    Will update on the results.

    I've just installed it, no problems so far (GTX 1070, DS 4.12.1.55 public beta).

  • marble Posts: 7,500
    Taoz said:
    fdsphoto said:

    So, Nvidia 442.19 Studio Driver is out. Getting anxiety over it already. Not looking forward to having to install it if it f's everything up. lol

    Will update on the results.

    I've just installed it, no problems so far (GTX 1070, DS 4.12.1.55 public beta).

    Thank you for that information. Same configuration that I have.

  • Taoz Posts: 9,940
    edited February 2020
    Taoz said:
    fdsphoto said:

    So, Nvidia 442.19 Studio Driver is out. Getting anxiety over it already. Not looking forward to having to install it if it f's everything up. lol

    Will update on the results.

    I've just installed it, no problems so far (GTX 1070, DS 4.12.1.55 public beta).

    Well, still dropping to CPU sometimes if modifying textures/shaders in Iray preview mode (small scene, Allow CPU Fallback unchecked (what does that do, actually?)), but I don't know if that's a DS or a driver problem.  

    Post edited by Taoz on
  • fdsphoto Posts: 62
    edited February 2020

    No problems so far. (RTX 2080 Ti, DS 4.12.1.55)

    Yeah, the texture thing is still there for me too. I've just come to terms with that... lol

    Post edited by fdsphoto on
  • Taoz said:
    Taoz said:
    fdsphoto said:

    So, Nvidia 442.19 Studio Driver is out. Getting anxiety over it already. Not looking forward to having to install it if it f's everything up. lol

    Will update on the results.

    I've just installed it, no problems so far (GTX 1070, DS 4.12.1.55 public beta).

    Well, still dropping to CPU sometimes if modifying textures/shaders in Iray preview mode (small scene, Allow CPU Fallback unchecked (what does that do, actually?)), but I don't know if that's a DS or a driver problem.  

    The CPU Fallback option determines what happens when the GPU(s) fail - see https://www.daz3d.com/forums/discussion/comment/5274781/#Comment_5274781 (note that changing the setting requires a restart).

  • Taoz Posts: 9,940
    Taoz said:
    Taoz said:
    fdsphoto said:

    So, Nvidia 442.19 Studio Driver is out. Getting anxiety over it already. Not looking forward to having to install it if it f's everything up. lol

    Will update on the results.

    I've just installed it, no problems so far (GTX 1070, DS 4.12.1.55 public beta).

    Well, still dropping to CPU sometimes if modifying textures/shaders in Iray preview mode (small scene, Allow CPU Fallback unchecked (what does that do, actually?)), but I don't know if that's a DS or a driver problem.  

    The CPU Fallback option determines what happens when the GPU(s) fail - see https://www.daz3d.com/forums/discussion/comment/5274781/#Comment_5274781 (note that changing the setting requires a restart).

    OK, thanks!

  • scorpio Posts: 8,415
    248sales said:
    Padone said:
    248sales said:

    The CPU can crank a decent noiseless 10,000 x 5,625 rendering in 2 hours with 10 characters and props. Is there any GPU on the market that can do that?!

    I agree that for still pictures the CPU may be an option, especially if you are fine with rendering overnight. Personally I can't wait more than a few minutes for a render, so the GPU with the denoiser is the only way. Also, exporting to a real-time engine is a very good option for animations. Then we have Iray Interactive, which may be good for limited-quality animation too. And I do agree that the VRAM usage of most Daz assets is too high, mostly because of 4K textures; that's why Iray is unusable without the Scene Optimizer add-on, and why having some optimization add-ons as standard inside Daz Studio would be good.

    Back on topic, just for the sake of completeness I'd add that my system seems to work fine, with none of the issues described in this discussion, apart from the extra VRAM for OptiX, which I used to turn on anyway, so no difference for me. Specs in the signature.

    I tried the Scene Optimizer and that in itself takes a very long time to run as well! I wish Nvidia offered an affordable graphics card with 48 or 64 GB of RAM. I am sure many people want to render more than one character in a scene, and just two characters appear to max out the 11 GB of my 1080 Ti, so I don't know if even 64 GB would be enough. Sure, I can hide some characters and then render them later, but if all the characters will be casting shadows on each other you can't do that, because the missing shadows would make the image look fake.

    I have a 1080 Ti and often render more than two people plus scenery in one render.

    If you do the separate renders correctly the shadows shouldn't be a problem - it does require some pre-planning - but you can also do spot renders rather than render the whole image.

  • Serene Night Posts: 17,641

    I am trying to digest this thread since I have had issues with Studio since I bought a new computer with an RTX 2080 Super. It has 8 GB of VRAM, and my computer has 32 GB of RAM, but it seems rendering with the Super does not work nearly as well as it should.

    Scene Optimizer does solve the problem, but I am at a loss as to why this is all needed to render stuff I could easily render on my previous computer with less hardware.
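
For a rough sense of why 8 GB disappears so fast, here is some back-of-the-envelope arithmetic (my own illustration, not Iray's actual accounting - Iray also needs geometry and working space on top of textures, and the per-figure map count is a guess):

```python
def texture_vram_bytes(width, height, channels=4, mip_overhead=4 / 3):
    """Rough VRAM footprint of one uncompressed 8-bit-per-channel texture.

    The 4/3 factor approximates the extra cost of a full mipmap chain.
    """
    return int(width * height * channels * mip_overhead)

# A Daz figure can easily carry a dozen 4K maps (diffuse, normal,
# roughness, ... spread over face/torso/limbs) - a hypothetical count:
maps_per_figure = 12
per_figure = maps_per_figure * texture_vram_bytes(4096, 4096)
print(f"~{per_figure / 2**30:.1f} GiB of texture VRAM per figure")

# Halving texture resolution to 2048x2048 cuts the footprint by 4x,
# which is essentially the trade Scene Optimizer makes.
```

On those numbers, two fully textured figures plus an environment plausibly crowd an 8 GB card before the render buffers are even allocated.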

  • Phoenix1966 Posts: 1,670
    Taoz said:
    Taoz said:
    fdsphoto said:

    So, Nvidia 442.19 Studio Driver is out. Getting anxiety over it already. Not looking forward to having to install it if it f's everything up. lol

    Will update on the results.

    I've just installed it, no problems so far (GTX 1070, DS 4.12.1.55 public beta).

    Well, still dropping to CPU sometimes if modifying textures/shaders in Iray preview mode (small scene, Allow CPU Fallback unchecked (what does that do, actually?)), but I don't know if that's a DS or a driver problem.  

    I continue to have the same issue with D|S dropping to CPU for the same reason. This has been going on for me for some time now. I keep hoping a new Nvidia driver will rectify the issue, but no joy.

  • Taoz said:
    Taoz said:
    fdsphoto said:

    So, Nvidia 442.19 Studio Driver is out. Getting anxiety over it already. Not looking forward to having to install it if it f's everything up. lol

    Will update on the results.

    I've just installed it, no problems so far (GTX 1070, DS 4.12.1.55 public beta).

    Well, still dropping to CPU sometimes if modifying textures/shaders in Iray preview mode (small scene, Allow CPU Fallback unchecked (what does that do, actually?)), but I don't know if that's a DS or a driver problem.  

    I continue to have the same issue with D|S dropping to CPU for the same reason. This has been going on for me for some time now. I keep hoping a new Nvidia driver will rectify the issue, but no joy.

    Since not everyone has the issue, a driver update is very unlikely to fix it. You need to identify why it is happening. More than likely you're running out of VRAM and need to optimize.

  • I'm personally fed up with Iray and RTX. I have a 2080 Ti, and I have the latest beta with the switch that prevents dropping to CPU. Yes, it does not drop to CPU anymore, it just stops rendering. Wow... great fix.

    I'm one of the lucky ones in that I still have Daz 4.11; I think that was the last version that came without RTX acceleration for the 20 series of cards. That version of Studio is rock solid: I can be working for an hour, with multiple renders, and it will not crash. With the latest beta, I will render one scene and it renders fine; I then rotate the model 90 degrees and it will crash. I cannot get more than two renders out of Studio without having to restart it. My renders are super simple: one figure, no clothes, no hair, no props or scenery, render size 2500 x 2500. And it's not running out of memory, it stops rendering because of an illegal access to memory.

    I've tested Octane, which also has RTX, and it's also rock solid.

  • Paintbox Posts: 1,633
    edited February 2020

    Just adding to the rest of the voices: I also encounter this behaviour since I updated my Nvidia driver and moved to the latest 4.12 (not beta) on the 1060 6GB... There is definitely something weird going on. It tends to drop to CPU much, much quicker than previously, and the noise filter just plain stops working after one render.

    Post edited by Paintbox on
  • I'm personally fed up with Iray and RTX. I have a 2080 Ti, and I have the latest beta with the switch that prevents dropping to CPU. Yes, it does not drop to CPU anymore, it just stops rendering. Wow... great fix.

    The switch has nothing to do with stopping the GPU from running out of memory; it is there because some users wanted it so that a failed GPU render would stop, allowing them to review the scene or take other steps, rather than launching a CPU render that they didn't want.

    I'm one of the lucky ones in that I still have Daz 4.11; I think that was the last version that came without RTX acceleration for the 20 series of cards. That version of Studio is rock solid: I can be working for an hour, with multiple renders, and it will not crash. With the latest beta, I will render one scene and it renders fine; I then rotate the model 90 degrees and it will crash. I cannot get more than two renders out of Studio without having to restart it. My renders are super simple: one figure, no clothes, no hair, no props or scenery, render size 2500 x 2500. And it's not running out of memory, it stops rendering because of an illegal access to memory.

    You are closing the previous render windows? Dropping the GPU (it isn't correct to describe that as a crash) for such a scene on a 2080 Ti is surprising - I generally do only 1,000-pixel-square images right now and am able to do multiple renders in succession. If you have a scene that reliably does this, please attach it to a Technical Support ticket.

    I've tested Octane which also has RTX and it's also rock solid.

  • Taoz said:

    AMD has a 32 GB GDDR5 card that costs less than most RTX 2080 Ti cards, so NVidia could probably have offered a 32 GB GTX for a similar price if they had wanted to:

    https://www.amazon.co.uk/AMD-100-506048-Radeon-GDDR5-Video/dp/B072HTFZM4/ref=sr_1_4?keywords=amd+radeon+32gb&qid=1580728797&sr=8-4

    The Radeon Pro Duo you link to is a dual-GPU solution. The usable RAM is only 16 GB per GPU.

    It's also GDDR5 as opposed to GDDR6.

    It's slower RAM, 1.7 GHz compared to 14 GHz.

    Apples and oranges.

  • ParallaxCreates Posts: 450
    edited February 2020

    Hi everyone, I render a lot. I mean hours upon hours on a daily basis and would like to contribute to this thread with my scenario.

    Firstly, my rig.

    Nvidia driver 418.81 or 430.86 - either one gives me the same results.

    i5-8600K @ 4.30 GHZ / 100% Usage Temp 52c (Liquid Cooled)

    32 GB Trident Z RGB, 4 x 8 GB sticks (2400 MHz)

    EVGA GTX 1080 Ti SC2 (No changes made to EVGA's settings) 100% Usage Temp 55c

    EVGA GTX 1080 SC (No changes made to EVGA's settings) 100% Usage Temp 47c

    EVO SSD 1TB Windows 8.1 Pro

    EVO SSD 250GB (Daz Studio 4.11.0.383 PRO runs off of here)

    HDD 7500 RPM 2TB (Daz content here)

    Scenario A: Five fully clothed G8s with hair - Iray preview window via Daz 4.11.0.383 with an HDRI 4096 x 2048 and one linear light.

    I can sit here and run that scene for hours in iray preview and nothing goes wrong. Of course rendering normally at 5k x 5k at 95% convergence is fine as well. No fallbacks, nothing. 

    Scenario B: Five fully clothed G8s with hair - Iray preview window via Daz 4.12.1.55 or Daz 4.12.1.40 with an HDRI 4096 X 2048 and one linear light. 

    The moment I move that camera or perspective or change anything BOOM - fall back to CPU. That of course is when I am iray previewing so to speak. Normal rendering at 5k x 5k 95% convergence does its job via the GPUs just fine - that is if I didn't fall back first. 

    Hopefully any of this helps. As a PA here I just keep to working via Daz 4.11.0.383 and have a happy life doing so. The moment I step foot in any other Daz above that version...game over.

    Post edited by ParallaxCreates on
  • Sergoyeles Posts: 11
    edited February 2020

    I'm personally fed up with Iray and RTX. I have a 2080 Ti, and I have the latest beta with the switch that prevents dropping to CPU. Yes, it does not drop to CPU anymore, it just stops rendering. Wow... great fix.

    The switch has nothing to do with stopping the GPU from running out of memory; it is there because some users wanted it so that a failed GPU render would stop, allowing them to review the scene or take other steps, rather than launching a CPU render that they didn't want.

    Anyway even with the switch turned on, you need to restart Daz Studio because you can't render again, so this fix is not helpful.

    Post edited by Sergoyeles on
  • I'm personally fed up with Iray and RTX. I have a 2080 Ti, and I have the latest beta with the switch that prevents dropping to CPU. Yes, it does not drop to CPU anymore, it just stops rendering. Wow... great fix.

    The switch has nothing to do with stopping the GPU from running out of memory; it is there because some users wanted it so that a failed GPU render would stop, allowing them to review the scene or take other steps, rather than launching a CPU render that they didn't want.

    Anyway even with the switch turned on, you need to restart Daz Studio because you can't render again, so this fix is not helpful.

    To whom? The people who wanted the feature will presumably run with it always on. You apparently aren't one of them, so it is not a useful feature to you but it is to others.

  • I'm personally fed up with Iray and RTX. I have a 2080 Ti, and I have the latest beta with the switch that prevents dropping to CPU. Yes, it does not drop to CPU anymore, it just stops rendering. Wow... great fix.

    The switch has nothing to do with stopping the GPU from running out of memory; it is there because some users wanted it so that a failed GPU render would stop, allowing them to review the scene or take other steps, rather than launching a CPU render that they didn't want.

    Anyway even with the switch turned on, you need to restart Daz Studio because you can't render again, so this fix is not helpful.

    To whom? The people who wanted the feature will presumably run with it always on. You apparently aren't one of them, so it is not a useful feature to you but it is to others.

    I will definitely run it on 100% of the time. I never want the CPU used in renders; if the scene is too big for my 1080 Ti, I don't want to struggle through the lag of a CPU render to shut things down.

  • Hi everyone, I render a lot. I mean hours upon hours on a daily basis and would like to contribute to this thread with my scenario.

    Firstly, my rig.

    Nvidia driver 418.81 or 430.86 - either one gives me the same results.

    i5-8600K @ 4.30 GHZ / 100% Usage Temp 52c (Liquid Cooled)

    32 GB Trident Z RGB, 4 x 8 GB sticks (2400 MHz)

    EVGA GTX 1080 Ti SC2 (No changes made to EVGA's settings) 100% Usage Temp 55c

    EVGA GTX 1080 SC (No changes made to EVGA's settings) 100% Usage Temp 47c

    EVO SSD 1TB Windows 8.1 Pro

    EVO SSD 250GB (Daz Studio 4.11.0.383 PRO runs off of here)

    HDD 7500 RPM 2TB (Daz content here)

    Scenario A: Five fully clothed G8s with hair - Iray preview window via Daz 4.11.0.383 with an HDRI 4096 x 2048 and one linear light.

    I can sit here and run that scene for hours in iray preview and nothing goes wrong. Of course rendering normally at 5k x 5k at 95% convergence is fine as well. No fallbacks, nothing. 

    Scenario B: Five fully clothed G8s with hair - Iray preview window via Daz 4.12.1.55 or Daz 4.12.1.40 with an HDRI 4096 X 2048 and one linear light. 

    The moment I move that camera or perspective or change anything BOOM - fall back to CPU. That of course is when I am iray previewing so to speak. Normal rendering at 5k x 5k 95% convergence does its job via the GPUs just fine - that is if I didn't fall back first. 

    Hopefully any of this helps. As a PA here I just keep to working via Daz 4.11.0.383 and have a happy life doing so. The moment I step foot in any other Daz above that version...game over.

    Exact same problems as the person above me describes. I was having success with Daz Studio 4.12.0, and the moment I updated to 4.12.1 all hell broke loose and GPU-to-CPU fallback happens near instantly on a SINGLE FIGURE with nothing else in the scene. GTX 1080 Ti here.

    For me with the 2080 Ti it happens on the second or third render. As stated a few posts above: one single G8 figure, no hair, no clothes, but the image size is 2500 x 2500 pixels, and I do close the render window when complete. Not sure what else to do. It's not running out of memory, because I have more than enough, and I can't email my scene because it does not always happen. It is not consistent: it might happen on the second render or it might take place on the fifth. But it's always an illegal access to memory.

    2020-02-04 12:42:38.366 Iray [VERBOSE] - IRAY:RENDER ::   1.11  IRAY   rend progr: CUDA device 0 (GeForce RTX 2080 Ti): Processing scene...
    2020-02-04 12:42:38.370 Iray [VERBOSE] - IRAY:RENDER ::   1.7   IRAY   rend stat : Geometry memory consumption: 3.621 MiB (device 0), 0.000 B (host)
    2020-02-04 12:42:38.408 Iray [VERBOSE] - IRAY:RENDER ::   1.7   IRAY   rend stat : Texture memory consumption: 267.250 MiB for 23 bitmaps (device 0)
    2020-02-04 12:42:38.408 Iray [INFO] - IRAY:RENDER ::   1.7   IRAY   rend info : Importing lights for motion time 0
    2020-02-04 12:42:38.408 Iray [INFO] - IRAY:RENDER ::   1.7   IRAY   rend info : Initializing light hierarchy.
    2020-02-04 12:42:38.412 Iray [INFO] - IRAY:RENDER ::   1.7   IRAY   rend info : Light hierarchy initialization took 0.000s
    2020-02-04 12:42:38.412 Iray [VERBOSE] - IRAY:RENDER ::   1.7   IRAY   rend stat : Lights memory consumption: 196.000 B (device 0)
    2020-02-04 12:42:38.415 Iray [VERBOSE] - IRAY:RENDER ::   1.7   IRAY   rend stat : Material measurement memory consumption: 0.000 B (GPU)
    2020-02-04 12:42:38.416 Iray [VERBOSE] - IRAY:RENDER ::   1.7   IRAY   rend stat : PTX code (111.042 KiB) for SM 7.5 generated in 0.000s
    2020-02-04 12:42:38.420 Iray [VERBOSE] - IRAY:RENDER ::   1.7   IRAY   rend stat : Materials memory consumption: 67.578 KiB (GPU)
    2020-02-04 12:42:38.446 Iray [INFO] - IRAY:RENDER ::   1.7   IRAY   rend info : JIT-linking wavefront kernel in 0.030s
    2020-02-04 12:42:38.448 Iray [INFO] - IRAY:RENDER ::   1.7   IRAY   rend info : JIT-linking mega kernel in 0.001s
    2020-02-04 12:42:38.448 Iray [INFO] - IRAY:RENDER ::   1.11  IRAY   rend info : CUDA device 0 (GeForce RTX 2080 Ti): Scene processed in 0.116s
    2020-02-04 12:42:38.450 Iray [INFO] - IRAY:RENDER ::   1.7   IRAY   rend info : CUDA device 0 (GeForce RTX 2080 Ti): Allocated 47.550 MiB for frame buffer
    2020-02-04 12:42:38.454 Iray [INFO] - IRAY:RENDER ::   1.11  IRAY   rend info : CUDA device 0 (GeForce RTX 2080 Ti): Allocated 1.688 GiB of work space (2048k active samples in 0.000s)
    2020-02-04 12:42:38.454 Iray [INFO] - IRAY:RENDER ::   1.11  IRAY   rend info : CUDA device 0 (GeForce RTX 2080 Ti): Used for display, optimizing for interactive usage (performance could be sacrificed)
    2020-02-04 12:42:38.479 WARNING: ..\..\..\..\..\src\pluginsource\DzIrayRender\dzneuraymgr.cpp(332): Iray [ERROR] - IRAY:RENDER ::   1.7   IRAY   rend error: CUDA device 0 (GeForce RTX 2080 Ti): an illegal memory access was encountered (while launching CUDA renderer)
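
For anyone combing their own logs, the Iray error lines above follow a fixed shape, so they are easy to pull out of a long Daz Studio log file. A small illustrative helper (the regex and function name are mine, not part of Daz Studio):

```python
import re

# Matches Iray render-error lines like the last line of the log above.
ERROR_RE = re.compile(r"IRAY\s+rend error: (.+)$")

def iray_errors(log_lines):
    """Collect the message part of every Iray 'rend error' line."""
    found = []
    for line in log_lines:
        match = ERROR_RE.search(line)
        if match:
            found.append(match.group(1))
    return found

# Usage: iray_errors(open("daz_log.txt", encoding="utf-8", errors="replace"))
```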

  • That looks like the memory leak. Are you tracking your VRAM usage in GPU-Z?
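
The same numbers GPU-Z shows can also be sampled from the command line, and watching whether usage keeps climbing after each render window is closed is the quickest leak test. A sketch assuming `nvidia-smi` is available; `used_vram_mib` and `looks_like_leak` are names I made up:

```python
import subprocess

def used_vram_mib(gpu_index=0):
    """Current VRAM usage in MiB for one GPU, queried via nvidia-smi."""
    out = subprocess.check_output(
        ["nvidia-smi", f"--id={gpu_index}",
         "--query-gpu=memory.used", "--format=csv,noheader,nounits"],
        text=True)
    return int(out.strip())

def looks_like_leak(readings, tolerance_mib=100):
    """True if usage climbed by more than the tolerance across the samples."""
    return readings[-1] - readings[0] > tolerance_mib

# On the affected machine, take a reading after closing each render window:
#   readings = [used_vram_mib()]   # ...append one reading per render...
#   print(looks_like_leak(readings))
```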

  • JP Posts: 60
    scorpio said:
    248sales said:
    Padone said:
    248sales said:

    The CPU can crank a decent noiseless 10,000 x 5,625 rendering in 2 hours with 10 characters and props. Is there any GPU on the market that can do that?!

    I agree that for still pictures the CPU may be an option, especially if you are fine with rendering overnight. Personally I can't wait more than a few minutes for a render, so the GPU with the denoiser is the only way. Also, exporting to a real-time engine is a very good option for animations. Then we have Iray Interactive, which may be good for limited-quality animation too. And I do agree that the VRAM usage of most Daz assets is too high, mostly because of 4K textures; that's why Iray is unusable without the Scene Optimizer add-on, and why having some optimization add-ons as standard inside Daz Studio would be good.

    Back on topic, just for the sake of completeness I'd add that my system seems to work fine, with none of the issues described in this discussion, apart from the extra VRAM for OptiX, which I used to turn on anyway, so no difference for me. Specs in the signature.

    I tried the Scene Optimizer and that in itself takes a very long time to run as well! I wish Nvidia offered an affordable graphics card with 48 or 64 GB of RAM. I am sure many people want to render more than one character in a scene, and just two characters appear to max out the 11 GB of my 1080 Ti, so I don't know if even 64 GB would be enough. Sure, I can hide some characters and then render them later, but if all the characters will be casting shadows on each other you can't do that, because the missing shadows would make the image look fake.

    I have a 1080 Ti and often render more than two people plus scenery in one render.

    If you do the separate renders correctly the shadows shouldn't be a problem - it does require some pre-planning - but you can also do spot renders rather than render the whole image.

    Yep, I have done that. However, if you have reflections then the missing characters will affect the look of the rendering. Ultra HDRI renders appear to be a lot faster than ones with a sun light.

  • JP Posts: 60

    I'm personally fed up with Iray and RTX. I have a 2080 Ti, and I have the latest beta with the switch that prevents dropping to CPU. Yes, it does not drop to CPU anymore, it just stops rendering. Wow... great fix.

    The switch has nothing to do with stopping the GPU from running out of memory; it is there because some users wanted it so that a failed GPU render would stop, allowing them to review the scene or take other steps, rather than launching a CPU render that they didn't want.

    Anyway even with the switch turned on, you need to restart Daz Studio because you can't render again, so this fix is not helpful.

    To whom? The people who wanted the feature will presumably run with it always on. You apparently aren't one of them, so it is not a useful feature to you but it is to others.

    I will definitely run it on 100% of the time. I never want the CPU used in renders; if the scene is too big for my 1080 Ti, I don't want to struggle through the lag of a CPU render to shut things down.

    You can set the affinity of your CPU cores via Task Manager. I have a 10-core / 20-thread i9 CPU and usually remove 2 threads from DAZ when rendering. I actually prefer CPU rendering with Iray, because I can free up 2 threads and use the computer while it is rendering in the background. With GPU rendering the computer is not as usable while it is rendering.
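
For what it's worth, the same "reserve a couple of threads" idea can be scripted instead of clicked through Task Manager each time: Windows' `start /affinity` accepts a hex bitmask of allowed logical CPUs. A sketch; the mask helper and the Daz Studio path are my own illustration:

```python
def affinity_mask(total_threads, reserve=2):
    """Bitmask selecting the lowest (total_threads - reserve) logical CPUs.

    One bit per logical CPU, so 18 of 20 threads -> 0x3ffff.
    """
    usable = total_threads - reserve
    return (1 << usable) - 1

# Launching pinned to 18 of 20 threads (path is hypothetical):
#   start /affinity 3ffff "" "C:\Program Files\DAZ 3D\DAZStudio4\DAZStudio.exe"
print(hex(affinity_mask(20, reserve=2)))  # 0x3ffff
```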

    I had this problem as well. The Asus Nvidia 2070 I have was set with a bit of an overclock, which caused it to drop to CPU. I downclocked it using the MSI Afterburner software and now it works great; I just had to turn the GPU speed down and set it to default. You may want to play with the memory clock as well to fine-tune; I haven't had time to do that yet.

  • Silver Dolphin Posts: 1,608

    Iray looks great, but it depends on what you are trying to create. I have a new computer, an Alienware Area 51 R7 with an AMD Threadripper 2950X (16 cores / 32 threads), and it just kills 3Delight renders. I do Iray just for the main character and composite inside of GIMP or Photoshop. I also found that if you render the main character in both 3Delight and Iray and merge them, you don't have to worry about the fireflies that seem to be a fact of life with Iray. If you render with different degrees of light and exposure you can also create a fake HDRI in Photoshop, which looks even better! On a side note: if you have one of these Alienware Area 51 R7 Threadripper systems you need to get rid of the crappy small 120 mm water cooler and get the be quiet! Dark Rock Pro TR4 air cooler; it keeps my render temps under control, unlike the water cooler used by Alienware. I highly recommend this air cooling solution.

    I have used Daz Studio since its inception (and paid for it too). It's always been a dog when rendering animations. The makers have always insisted that Daz was not designed for animation, but when you include something as a feature you expect it to work (and be improved as users demand it). There seems to be an inherent problem with the program releasing memory, or clearing its cache. More and more people are into animation these days, if only to get multiple shots from different angles and use the stills. I have used many other programs for rendering animations that have no problems with large or complex files, and when you see a program like Unreal 5, which can render 3D in real time at 60+ fps, it makes you wonder what the Daz programmers are missing.

    I am moving from Daz to Unreal 5 as fast as I can to do my rendering from now on.
