DazStudio 4.21 won't use GPU to render
Hello,
DazStudio 4.21 will no longer use the GPU to render, causing extremely long render times/loads. This was not an issue with any pre-4.20 version of DazStudio.
I have an NVIDIA GeForce GTX 1080 Ti, which is listed as having 11 GB of onboard memory, and I'm using the current drivers (531.18). My PC has an additional 64 GB of system memory.
Looking through older suggestions, I went into render settings and tried disabling the CPU as an available device in both Photoreal and Interactive (Biased) modes. This just resulted in a blank picture being rendered in under a minute. The pertinent lines from the error log are below.
I attached an error log for one of these failed renders.
Any help would be greatly appreciated. This is extremely frustrating, and downgrading to an older version of DazStudio is something I don't really know how to do and seems like an extremely time-consuming task.
Comments
What is in the scene you are trying to render? Near the beginning of your log file it says:
"Geometry import (1 triangle object with 65 M triangles"
65 million triangles is a lot of geometry. You are probably running out of memory, just as the log file says further down.
Open a new scene and create a primitive cube (on the Create menu) with 1 division. Does that render?
Running out of VRAM is most likely the reason. From the log, the 65 million triangle geometry count is driven by the 'Render SubD Level' parameter, and each subdivision level roughly quadruples the triangle count. For example, two G8.1 figures in a scene at the default Render SubD Level of 3 come to about 4 million triangles. Raise the level to 5 for both figures and the count becomes 65+ million (x 4 x 4), which consumes approximately 3 GB more VRAM. Check whether something similar is happening in your scene... figures, props, etc. There is no need to set a very high Render SubD Level.
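To make the scaling concrete, here is a minimal back-of-envelope sketch (Python, purely for illustration). It assumes the quadrupling-per-level rule above and a rough per-triangle VRAM cost of ~50 bytes inferred from the "~3 GB more for ~60 M more triangles" figure in this post; actual Iray memory use will vary.

```python
# Assumptions (not from Daz documentation): each SubD level multiplies the
# face count by 4, and extra geometry costs ~50 bytes per triangle, a figure
# back-calculated from "~3 GB more VRAM for ~60 M more triangles".
BASE_TRIANGLES = 4_000_000   # two G8.1 figures at Render SubD Level 3
BASE_LEVEL = 3
BYTES_PER_TRI = 50           # rough estimate, see comment above

def triangles_at_level(level: int) -> int:
    """Triangle count after raising Render SubD Level from BASE_LEVEL."""
    return BASE_TRIANGLES * 4 ** (level - BASE_LEVEL)

for level in range(3, 6):
    tris = triangles_at_level(level)
    extra_gb = (tris - BASE_TRIANGLES) * BYTES_PER_TRI / 1024**3
    print(f"SubD {level}: {tris / 1e6:>4.0f} M triangles, ~{extra_gb:.1f} GB extra VRAM")
```

At SubD 5 this prints roughly 64 M triangles and ~2.8 GB of extra VRAM, matching the numbers above.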
Besides, there are 733 textures in your scene. I believe they will consume a lot of VRAM as well, particularly if many of them are 2K/4K maps. You can use Scene Optimizer to check the relevant stats and GPU-Z to observe the VRAM consumption (Memory Used).
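For a rough sense of why 733 maps matter, here is an illustrative sketch assuming uncompressed RGBA8 textures with a full mipmap chain. This is a worst-case estimate; Iray's actual texture compression and channel handling will use less.

```python
# Worst-case texture memory: uncompressed RGBA8 (4 bytes/pixel) with a full
# mipmap chain (~33% overhead). Real renderer usage is typically lower.
MIP_OVERHEAD = 4 / 3

def texture_mib(width: int, height: int, bytes_per_pixel: int = 4) -> float:
    return width * height * bytes_per_pixel * MIP_OVERHEAD / 1024**2

for size in (1024, 2048, 4096):
    print(f"{size}x{size}: ~{texture_mib(size, size):.0f} MiB per map")

# If the 733 maps averaged 2K, the uncompressed total would be on the order of:
print(f"733 x 2K maps: ~{733 * texture_mib(2048, 2048) / 1024:.1f} GiB")
```

Even at an average of 2K, that comes to well over the 11 GB on a 1080 Ti, which is why reducing map sizes with Scene Optimizer can make the difference between a GPU and CPU render.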
Thank you for both the responses.
I did what you suggested, @Barbuit, and it worked fine, so I am overloading the card's memory.
And I will look into the Scene Optimizer script, @Crosswind. I came across a video from TheWPGuru about it while looking up information on this issue, and it looks like an extremely helpful tool.
Thanks again!