Daz 4.12 GPU Rendering Issues

Been having issues with certain scenes that render fine in 4.10 and 4.11 but suddenly hit memory allocation problems in 4.12, falling back to CPU rendering. I'm using a GTX 1080 on the latest drivers, so I'm unsure why this is happening. Initially I thought it might be a driver problem, but a clean reinstall changed nothing. Reinstalled Daz 4.10: the exact same scenes rendered without any GPU problems. Reinstalled 4.12: renders hit GPU memory problems again.

 

2019-11-09 22:38:49.797 Iray [INFO] - IRAY:RENDER ::   1.4   IRAY   rend info : CUDA device 0 (GeForce GTX 1080): Scene processed in 36.766s
2019-11-09 22:38:49.826 WARNING: ..\..\..\..\..\src\pluginsource\DzIrayRender\dzneuraymgr.cpp(305): Iray [ERROR] - IRAY:RENDER ::   1.4   IRAY   rend error: CUDA device 0 (GeForce GTX 1080): out of memory (while allocating memory)
2019-11-09 22:38:49.829 Iray [INFO] - IRAY:RENDER ::   1.0   IRAY   rend info : Received update to 00001 iterations after 36.794s.
2019-11-09 22:38:49.837 WARNING: ..\..\..\..\..\src\pluginsource\DzIrayRender\dzneuraymgr.cpp(305): Iray [ERROR] - IRAY:RENDER ::   1.4   IRAY   rend error: CUDA device 0 (GeForce GTX 1080): Failed to allocate 447.7 MiB
2019-11-09 22:38:49.842 Iray [INFO] - IRAY:RENDER ::   1.4   IRAY   rend info : CUDA device 0 (GeForce GTX 1080): Allocated 7.99328 MiB for frame buffer
2019-11-09 22:38:49.845 Iray [INFO] - IRAY:RENDER ::   1.4   IRAY   rend info : CUDA device 0 (GeForce GTX 1080): Used for display, optimizing for interactive usage (performance could be sacrificed)
2019-11-09 22:38:50.895 WARNING: ..\..\..\..\..\src\pluginsource\DzIrayRender\dzneuraymgr.cpp(305): Iray [ERROR] - IRAY:RENDER ::   1.4   IRAY   rend error: CUDA device 0 (GeForce GTX 1080): out of memory (while allocating memory)
2019-11-09 22:38:50.899 WARNING: ..\..\..\..\..\src\pluginsource\DzIrayRender\dzneuraymgr.cpp(305): Iray [ERROR] - IRAY:RENDER ::   1.4   IRAY   rend error: CUDA device 0 (GeForce GTX 1080): Failed to allocate 170.625 MiB
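The failed-allocation sizes are all spelled out in those log lines. A quick hypothetical Python sketch can total them up; the log excerpt is hard-coded for illustration, and the regex assumes the `Failed to allocate N MiB` wording shown above:

```python
import re

# Matches Iray's "Failed to allocate 447.7 MiB" lines from the log above.
OOM_RE = re.compile(r"Failed to allocate ([\d.]+) MiB")

def summarize_oom(log_text: str):
    """Return (count, total MiB) of failed CUDA allocations in a Daz log."""
    sizes = [float(m.group(1)) for m in OOM_RE.finditer(log_text)]
    return len(sizes), sum(sizes)

log = (
    "rend error: CUDA device 0 (GeForce GTX 1080): Failed to allocate 447.7 MiB\n"
    "rend error: CUDA device 0 (GeForce GTX 1080): Failed to allocate 170.625 MiB\n"
)
count, total_mib = summarize_oom(log)
print(count, total_mib)  # 2 failed allocations, ~618.3 MiB total
```

Two failed allocations totalling roughly 618 MiB on an 8 GB card suggests the card was already nearly full when the render started, not that the scene itself is enormous.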


Comments

  • I think the version of Iray used in 4.12 uses a bit more VRAM than previous ones. Can you get a simpler scene to render?

  • Vertigo789 Posts: 79
    edited November 2019

    I think the version of Iray used in 4.12 uses a bit more VRAM than previous ones. Can you get a simpler scene to render?

    I tested a more complex scene that was created in 4.12, and it rendered fine. Three characters instead of two.

    Tested another scene with memory issues; the log would suggest that OptiX rendering was enabled, even though I had it off. I wonder if that might be contributing to the issues.

     

    2019-11-10 00:50:35.418 Iray [INFO] - IRAY:RENDER ::   1.2   IRAY   rend info : Initializing OptiX Prime for CUDA device 0
    2019-11-10 00:50:35.868 WARNING: ..\..\..\..\..\src\pluginsource\DzIrayRender\dzneuraymgr.cpp(305): Iray [ERROR] - IRAY:RENDER ::   1.2   IRAY   rend error: OptiX Prime error (Device rtpModelUpdate BL): Memory allocation failed (Function "_rtpModelUpdate" caught exception: Encountered a CUDA error: cudaMalloc(&ptr, size) returned (2): out of memory)
    2019-11-10 00:50:36.297 WARNING: ..\..\..\..\..\src\pluginsource\DzIrayRender\dzneuraymgr.cpp(305): Iray [ERROR] - IRAY:RENDER ::   1.7   IRAY   rend error: CUDA device 0 (GeForce GTX 1080): Scene setup failed
    2019-11-10 00:50:36.301 WARNING: ..\..\..\..\..\src\pluginsource\DzIrayRender\dzneuraymgr.cpp(305): Iray [ERROR] - IRAY:RENDER ::   1.7   IRAY   rend error: CUDA device 0 (GeForce GTX 1080): Device failed while rendering

     

    With a 1080, I have 8GB of VRAM available, and I've rendered far more complex scenes without issue, so clearly there's something wrong with 4.12 here.

     

    Post edited by Vertigo789 on
  • 4.12 has Optix on no matter what. The issue with it randomly using way more memory in some scenes is supposed to be fixed but I guess not.

  • Have the beta versions addressed this problem?

  • Have the beta versions addressed this problem?

    I haven't had any problem. So if you are, it's the first I've heard of it.

  • JD_Mortal Posts: 760
    edited November 2019

    The latest version uses individual "render cores", based on the type of card and/or CPU you are using, whereas 4.10 and 4.11 used a single render core for rendering everything. If you look at the logs across various cards and CPUs, you will see that there are at least five individual rendering cores now. (A fancy way of saying "specific optimized code" for the various flavors of cards and CPUs.)

    I am also having cards randomly drop out or fail, without warning. The only cure is letting Daz shut down completely, as observed in Task Manager, then reloading it and crossing your fingers that it uses all your resources the next time. I use MSI Afterburner to check the status of my cards, to see if Daz has suddenly dropped one, or all of them, from a render. Though I believe CPU-Z can also indicate the same, showing card activity.

    I would chalk this up as growing pains for the new Iray card-isolated system. It would be nice if it indicated which cards were actually being used while rendering, instead of just at the beginning - the cards seem to drop out just after that point.

    As for the "virtual RAM"... Windows is bugged out at the moment, which may be messing with Daz. It reports that we have 128 TB of virtual memory available. (Check your logs, no joke.) Though that is actually the limit of total potential virtual memory the OS can have on Windows 10, not the actual available memory. (Yes, that is 128 terabytes! Try allocating that, or limiting your system based on "what is available", and things start to break.)

    There are a couple of posts about this for other programs where Windows is reporting the same thing. I am sure it is an unhandled error or a changed API call that is leading to the issue. How it alters Daz's operation is unclear; they are asking for "available virtual memory" for a reason, I am sure, to use internally.

     

    2019-11-10 01:14:50.285 Physical Memory:
    2019-11-10 01:14:50.285 	Total: 63.7 GB (68400033792)
    2019-11-10 01:14:50.285 	Avail: 59.9 GB (64363298816)
    2019-11-10 01:14:50.285 Virtual Memory:
    2019-11-10 01:14:50.285 	Total: 127.9 TB (140737488224256)
    2019-11-10 01:14:50.285 	Avail: 127.9 TB (140732863463424)
    2019-11-10 01:14:50.286 Current Memory Usage: 5%
    2019-11-10 01:14:50.286 Current DateTime:
    2019-11-10 01:14:50.286 	Loc: Sun Nov 10 01:14:50 2019
    2019-11-10 01:14:50.286 	UTC: Sun Nov 10 06:14:50 2019
    2019-11-10 01:14:50.286 Temp Data:
    2019-11-10 01:14:50.286 	Location = C:/Users/xxxxx/AppData/Roaming/DAZ 3D/Studio4/temp
    2019-11-10 01:14:50.286 	Disk Total: 1.8 TB (2047761027072)
    2019-11-10 01:14:50.286 	Disk Avail: 1.1 TB (1236528267264)
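For what it's worth, the 127.9 TB "Virtual Memory: Total" figure is consistent with the 64-bit Windows per-process address-space ceiling of 2^47 bytes, rather than any real memory. A quick sanity check in Python, using the byte counts copied from the log above:

```python
# "Virtual Memory: Total" from the log is the x64 user-mode address-space
# limit (~2**47 bytes) that Windows reports, not installed RAM.
reported_total = 140_737_488_224_256  # bytes, from the log line above
limit = 2**47                         # 64-bit Windows user-mode ceiling

print(limit - reported_total)   # 131072: only 128 KiB below the 2**47 ceiling
print(reported_total // 2**40)  # 127: whole TiB, matching the "127.9 TB" line
```

So the number is an address-space limit faithfully reported by the OS, which fits the later observation in this thread that it has no bearing on how much memory is actually usable.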

     

    Post edited by JD_Mortal on
  • Been stuggling with this issue as well. Not sure if it's Daz, Nvidia, or Windows that needs to implement a fix, but I hope it happens soon.

  • JD_Mortal said:

    The latest version uses individual "render cores", based on the type of card and/or CPU you are using, whereas 4.10 and 4.11 used a single render core for rendering everything. If you look at the logs across various cards and CPUs, you will see that there are at least five individual rendering cores now. (A fancy way of saying "specific optimized code" for the various flavors of cards and CPUs.)

    I think this is a misunderstanding of the way GPUs work: they may have different types of processing units, referred to as cores, on each card - the latest to be supported being the RT cores used for ray-tracing calculations on the 20x0 cards.

    JD_Mortal said:

    I am also having cards randomly drop out or fail, without warning. The only cure is letting Daz shut down completely, as observed in Task Manager, then reloading it and crossing your fingers that it uses all your resources the next time. I use MSI Afterburner to check the status of my cards, to see if Daz has suddenly dropped one, or all of them, from a render. Though I believe CPU-Z can also indicate the same, showing card activity.

    I would chalk this up as growing pains for the new Iray card-isolated system. It would be nice if it indicated which cards were actually being used while rendering, instead of just at the beginning - the cards seem to drop out just after that point.

    As for the "virtual RAM"... Windows is bugged out at the moment, which may be messing with Daz. It reports that we have 128 TB of virtual memory available. (Check your logs, no joke.) Though that is actually the limit of total potential virtual memory the OS can have on Windows 10, not the actual available memory. (Yes, that is 128 terabytes! Try allocating that, or limiting your system based on "what is available", and things start to break.)

    The available memory figures are reported to DS; they are not actually used for anything (DS doesn't count down the memory used against the reported memory available to work out how much it has left), so they have no functional significance.

    JD_Mortal said:

    There are a couple of posts about this for other programs where Windows is reporting the same thing. I am sure it is an unhandled error or a changed API call that is leading to the issue. How it alters Daz's operation is unclear; they are asking for "available virtual memory" for a reason, I am sure, to use internally.

     

    2019-11-10 01:14:50.285 Physical Memory:
    2019-11-10 01:14:50.285 	Total: 63.7 GB (68400033792)
    2019-11-10 01:14:50.285 	Avail: 59.9 GB (64363298816)
    2019-11-10 01:14:50.285 Virtual Memory:
    2019-11-10 01:14:50.285 	Total: 127.9 TB (140737488224256)
    2019-11-10 01:14:50.285 	Avail: 127.9 TB (140732863463424)
    2019-11-10 01:14:50.286 Current Memory Usage: 5%
    2019-11-10 01:14:50.286 Current DateTime:
    2019-11-10 01:14:50.286 	Loc: Sun Nov 10 01:14:50 2019
    2019-11-10 01:14:50.286 	UTC: Sun Nov 10 06:14:50 2019
    2019-11-10 01:14:50.286 Temp Data:
    2019-11-10 01:14:50.286 	Location = C:/Users/xxxxx/AppData/Roaming/DAZ 3D/Studio4/temp
    2019-11-10 01:14:50.286 	Disk Total: 1.8 TB (2047761027072)
    2019-11-10 01:14:50.286 	Disk Avail: 1.1 TB (1236528267264)

     

     

  • Vertigo789 Posts: 79
    edited November 2019

    Have the beta versions addressed this problem?

    I haven't had any problem. So if you are, it's the first I've heard of it.

    You literally just said this in your previous post:
    "The issue with it randomly using way more memory in some scenes is supposed to be fixed"

    Anyway, I've raised a ticket with support; hopefully they can provide me with an installer for 4.11, because 4.12 is utterly broken for me when it comes to GPU rendering, even though 8GB of VRAM and 16GB of RAM is more than enough for the scenes I'm doing, which again worked fine in prior Daz versions.

    Post edited by Vertigo789 on
  • Have the beta versions addressed this problem?

    I haven't had any problem. So if you are, it's the first I've heard of it.

    You literally just said this in your previous post:
    "The issue with it randomly using way more memory in some scenes is supposed to be fixed"

    Anyway, I've raised a ticket with support; hopefully they can provide me with an installer for 4.11, because 4.12 is utterly broken for me when it comes to GPU rendering, even though 8GB of VRAM and 16GB of RAM is more than enough for the scenes I'm doing, which again worked fine in prior Daz versions.

    Supposed to be fixed in the transition from 4.11 to 4.12. If you're having an issue with OptiX in 4.12, that is new.

  • 4.12 has Optix on no matter what. The issue with it randomly using way more memory in some scenes is supposed to be fixed but I guess not.

    I have the just-released 4.12 and can turn OptiX on or off with the checkbox.

  • 4.12 has Optix on no matter what. The issue with it randomly using way more memory in some scenes is supposed to be fixed but I guess not.

    I have the just-released 4.12 and can turn OptiX on or off with the checkbox.

    The checkbox no longer does anything. In 4.12 OptiX is always used.

  • JD_Mortal said:

    The latest version uses individual "render cores", based on the type of card and/or CPU you are using, whereas 4.10 and 4.11 used a single render core for rendering everything. If you look at the logs across various cards and CPUs, you will see that there are at least five individual rendering cores now. (A fancy way of saying "specific optimized code" for the various flavors of cards and CPUs.)

    I think this is a misunderstanding of the way GPUs work: they may have different types of processing units, referred to as cores, on each card - the latest to be supported being the RT cores used for ray-tracing calculations on the 20x0 cards.

    Not a misunderstanding... Intel CPUs use a new DLL, Titans use another, AMD uses another, and RTX uses another. I'm not talking about cores in the cards; I am talking about Iray's "render cores", the code used to process the scenes. There are five individual variations that I have counted now. Previously, it was all just one "core" of code doing the rendering.

    Intel CPUs use "Embree", RTX cards use the RTX-specific drivers, Volta/Tensor non-RTX cards use another driver core, Pascal-only cards use another, and the rest seem to all use the Maxwell driver cores. Not sure what AMD processors use, but I know it isn't "Embree" or any CUDA-based rendering core drivers.

  • Padone Posts: 3,778
    JD_Mortal said:

    Not sure what AMD processors use, but I know it isn't "Embree"

    Unless they changed something, AFAIK Embree should work fine on AMD CPUs. I looked at the Embree page and didn't find anything stating that it only works on Intel CPUs, though it is probably more optimized for them.

  • On consumer-grade CPUs there is nearly no difference between CPU brands. You will find software that is optimized for single-core performance, which will perform better on Intel, and multi-threaded software, which will perform better on AMD. But to create an entire render library that relies on some Intel-only feature? Not only am I not sure what could be leveraged to do that, it would have to be written in something like assembly, since all the C/C++ compilers I'm aware of don't even provide a switch that would generate such code. Even icc (Intel's own compiler) produces code that runs on both by default, although I'm guessing it might have some way to create Intel-only binaries (I know it used to, for Xeons vs. Opterons).

  • fastbike1 Posts: 4,078

    @david_23523141 "I have 4.12 just released and can turn optiX ON or OFF with the check box. "

    That option is no longer available in Beta 4.12.1.16

  • edited May 2020

    I know this is an older thread, but I'd like to chime in with the same issue as the OP. I have both a GTX 1080 and a 1080 Ti in this machine, with 64GB of RAM on an i7... Renders were fine in 4.11. After upgrading to 4.12 everything has gone [off]. I started noticing renders slowing down during multiple-frame renders, which apparently is a VRAM issue as per some other threads; oddly there was no VRAM issue on the same scenes in 4.11! I also tried the latest beta, and now I can't even get a single render to complete without it failing over to CPU-only rendering 0_0

    I have since updated my Nvidia drivers and run GPU/CPU/RAM benchmarks; all is well with the machine and it is performing as expected... except with Daz 3D...

    Post edited by Richard Haseltine on
  • Richard Haseltine Posts: 102,344

    Both of your cards are non-RTX, which does mean Iray is now passing more code across to them, so you are more likely to have the render drop to CPU. Still, exactly which drivers are you using?

  • Vertigo789 Posts: 79
    edited May 2020

    Both of your cards are non-RTX, which does mean Iray is now passing more code across to them, so you are more likely to have the render drop to CPU. Still, exactly which drivers are you using?

    I'm currently using 445.75. Decided to test the latest beta build of 4.12 to see if things have improved. But no, same problem as before. Scenes that work perfectly in 4.11 consistently encounter memory allocation errors, which severely impacts render times. My card literally has 8GB of VRAM, and my system has 16GB. It's an absolute joke that the exact same scenes with no render setting changes struggle to perform at the same level as in 4.11. With regards to the RTX situation, I guess that means Daz/Nvidia decided to give the middle finger to anyone who hasn't immediately upgraded to the very latest architecture?

    4.12
    https://pastebin.com/2cJNx4qg

    4.11

    https://pastebin.com/RaVMNvaJ

    Post edited by Vertigo789 on
  • Richard Haseltine Posts: 102,344

    Iray is nVidia's code, not something Daz has control over.

    There is a Ray Tracing Low Memory option in Render Settings, in the Optimization group - it may be worth seeing if that helps.

  • Dondec Posts: 243

    I was shocked when my Iray renders came out empty with the GPU-only rendering option enabled on 4.12. It starts and ends immediately, the render window showing only the checkerboard background.

    I have a Titan X w 12G of VRAM. 

    I tried the latest Nvidia drivers and an older one I knew worked.  No difference. 

    Enabling CPU rendering worked fine, though slow.

    My immediate need is to get back to fast IRAY on my Titan X.  Others seemed to indicate that 4.11 worked fine.  Is there a place where I can find that older version and reload it.

    Thank you very much... a little scared by this.

        Don

  • Richard Haseltine Posts: 102,344
    Dondec said:

    I was shocked when my Iray renders came out empty with the GPU-only rendering option enabled on 4.12. It starts and ends immediately, the render window showing only the checkerboard background.

    I have a Titan X w 12G of VRAM. 

    I tried the latest Nvidia drivers and an older one I knew worked.  No difference. 

    Enabling CPU rendering worked fine, though slow.

    My immediate need is to get back to fast IRAY on my Titan X.  Others seemed to indicate that 4.11 worked fine.  Is there a place where I can find that older version and reload it.

    Thank you very much... a little scared by this.

        Don

    You can open a support ticket to request an older version of DS https://www.daz3d.com/help/help-contact-us

  • Wicked Whomp Posts: 220
    edited May 2020

    Not to bump an old thread, but I've been having memory allocation errors myself since upgrading to 4.12 as well, the latest issue being Daz unable to allocate 16 MB out of 4 GB of available memory. Strange, too, since my card (RTX 2060) is a 6GB card.

    Post edited by Wicked Whomp on
  • Wallace3D Posts: 170

    Not to bump an old thread, but I've been having memory allocation errors myself since upgrading to 4.12 as well, the latest issue being Daz unable to allocate 16 MB out of 4 GB of available memory. Strange, too, since my card (RTX 2060) is a 6GB card.

    If you have a display plugged into that card, it takes up VRAM.

    I have no displays on my 11GB card, and it sees the whole 11GB when allocating.

    Also, for all the renders I had no issues with before, I had unchecked OptiX in order to render on my GPU (in past versions it would not render on GPU with it selected and would always revert back to CPU).

    This newest version forcing you to use OptiX is why I was having the issue of it using the CPU only during all my renders. I have since optimized my scenes to use less memory on the textures (they use 90% of the memory).

    Since I made that difficult adjustment, I now use my GPUs 100% of the time when rendering, versus maybe 50% before, when a memory-leak failure to clear the VRAM during a render meant it either finished off using the CPU or, in some cases, dropped one of my GPUs and continued with my primary GPU.

  • kenshaw011267 Posts: 3,805
    Wallace3D said:

    Not to bump an old thread, but I've been having memory allocation errors myself since upgrading to 4.12 as well, the latest issue being Daz unable to allocate 16mb out of 4gb of available memory. Strange too, since my card (RTX 2060) is a 6GB card.

    If you have a display plugged into that card, it takes up VRAM.

    No, it doesn't. Research WDDM.

  • Braincells Posts: 42

    Same frustrating issue. On a 1080 card with 11GB, DS 4.12.117, latest Studio driver. My first render with 3 figures worked on GPU; my current one with 1 figure returns only transparent or black renders and falls back to CPU. I unchecked fallback to CPU. It renders, very slowly, on CPU only if the dome is off. The error log mentioned that it cannot find a render entity; I suppose that means it can't find the GPU card?

  • kenshaw011267 Posts: 3,805

    Not the same issue. Post the log. 

  • Wallace3D Posts: 170

    I see 3 figures vs. one figure, but what else is in the scene? Just the figures, or do you have props and other stuff that could take up memory just from textures alone? I have learned it is all about the textures, and the fact that they take up the most VRAM.

    Does your single figure use more textures than the 3 figures? i.e. diffuse, tessellation, normal, and bump maps at 4096x4096 for each skin surface, vs. just a diffuse map for each skin surface at 1024x1024?
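A rough back-of-the-envelope illustrates why map count and resolution dominate. This is a hypothetical model (uncompressed RGBA8 plus ~33% mipmap overhead), not Iray's actual allocator, and the map counts are illustrative:

```python
# Hypothetical VRAM estimate: width * height * 4 bytes (uncompressed RGBA8),
# times ~4/3 for mipmaps. Map counts and resolutions are illustrative.
def texture_mib(width: int, height: int, maps: int, mip_overhead: float = 4 / 3) -> float:
    return width * height * 4 * maps * mip_overhead / 2**20

# Diffuse + normal + bump + translucency at 4096x4096 for one skin surface:
heavy = texture_mib(4096, 4096, maps=4)
# A single 1024x1024 diffuse map per surface, across three figures:
light = 3 * texture_mib(1024, 1024, maps=1)
print(round(heavy), round(light))  # 341 16
```

On that model, one heavily mapped surface costs roughly twenty times what three lightly mapped ones do, which matches the intuition in this comment that textures, not figure count, fill VRAM first.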

  • Bobvan Posts: 2,652
    edited May 2020

    For the most part my 2080 Ti has been faster overall, but I ran into my first really long render, approx 6 to 8 hours: 3 characters, HDRI background, taking forever for the lines to clear off.

    b.png
    1860 x 950 - 3M
    Post edited by Bobvan on
  • Wallace3D Posts: 170
    Bobvan said:

    For the most part my 2080 Ti has been faster overall, but I ran into my first really long render, approx 6 to 8 hours: 3 characters, HDRI background, taking forever for the lines to clear off.

    That scene would have taken an hour to render on my setup at a resolution of 4096 x 4096. If I did not have my 1080 Ti and rendered it on my i7-5930K, it would probably take 2 to 3 hours with a CPU limit of 10 of the 12 threads.
