CUDA & GPU Rendering - Status?
Alien Alloy
Hello good people.
I've been digging around the forum trying to find information about the Iray engine combined with GPU rendering, and everywhere I read it seems that it works once and then never again.
I've been experiencing that same issue for a while myself.
Loading Daz now with a single figure, no lights, and just hair: if I hit "Render" without touching anything, the process flies via GPU and takes mere seconds to push the render up to like 89%, then it gives up and falls back to CPU to finish.
If I render again in the same window, it won't even bother to look for the GPU; it goes directly into CPU mode and uses all 8 cores to do the job.
Granted, my video card is an old 1080 Strix with merely 8 GB on board, while my system has 32 *shrugs*.
I've been reading all around trying to find a solution for this plague; it's like having "Ludicrous Speed" but not being able to use it properly, not even once.
I've tried the Nvidia Control Panel PhysX-to-CPU trick (no difference). <- Is this actually worth doing? I game on this machine sometimes :D
I've changed the Texture Compression Thresholds to 512 (Medium) and 2048 (High) (see the quick texture math after this list).
Unchecked the OptiX Prime Acceleration.
Unchecked the CPU boxes.
I am running the latest drivers, of course.
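(Quick illustration of why those thresholds matter - a back-of-envelope sketch that compiles with nvcc or any C++ compiler; the 4-bytes-per-texel RGBA figure is my own assumption for illustration, not anything Daz or Nvidia publish:)

```cpp
// Rough uncompressed texture sizes at common resolutions.
// Assumption: 4 bytes per texel (RGBA8), no mipmaps.
#include <cstdio>

int main() {
    const int sizes[] = {512, 1024, 2048, 4096};
    for (int s : sizes) {
        double mib = (double)s * s * 4 / (1024.0 * 1024.0);
        printf("%4dx%-4d texture: ~%5.1f MiB uncompressed\n", s, s, mib);
    }
    // With the Medium threshold at 512, anything bigger than 512px becomes a
    // compression candidate, which is why lowering the thresholds can shave
    // hundreds of MB off a scene full of 4K skin and clothing maps.
    return 0;
}
```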
Is there anything else that can be done to try and make this work, or is the CUDA thing just your "average" premium badge that isn't even worth the time to fiddle with?
I've also noticed using GPU-Z that DAZ apparently doesn't "dump" memory: during the first render, which it barely seems to complete with a single figure, the video RAM usage goes from 400 MB up to 4 GB, then when I close the preview window it drops back down to 2-3 GB.
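(For anyone who wants to cross-check the GPU-Z readings with a standalone tool: a minimal CUDA sketch like the one below, assuming the CUDA toolkit is installed, polls the same used/total numbers. The vramwatch.cu name is just my choice.)

```cpp
// vramwatch.cu - print used/total VRAM once a second, similar to GPU-Z's
// "Memory Used" sensor. Build with: nvcc vramwatch.cu -o vramwatch
// Note: creating the CUDA context itself pins a small amount of VRAM.
#include <cstdio>
#include <cuda_runtime.h>
#ifdef _WIN32
#  include <windows.h>
#  define SLEEP_MS(ms) Sleep(ms)
#else
#  include <unistd.h>
#  define SLEEP_MS(ms) usleep((ms) * 1000)
#endif

int main() {
    for (;;) {
        size_t free_b = 0, total_b = 0;
        if (cudaMemGetInfo(&free_b, &total_b) != cudaSuccess) return 1;
        printf("VRAM used: %5zu MiB / %5zu MiB\n",
               (total_b - free_b) >> 20, total_b >> 20);
        SLEEP_MS(1000);
    }
}
```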
Indeed, this is partly a rant about a free product, I know.
But I would like to hear from users with more experience than me on the matter, whether they've found some tricks to optimize the thingy, or whether there's some hidden know-how.
Thanks in advance guys!
Comments
I'm the creator/maintainer of this thread. For my next project I'm actually thinking about assembling a guide for people on best practices for using Iray in Daz Studio.
Honestly, it's mostly about knowing what to avoid doing at the same time, based on which GPU/Iray/Daz Studio combination you're working with. In your case, it sounds to me like you might want to avoid using Iray preview mode in the same DS instance that you do your full Iray rendering in. Another thing you could try is reducing the total dimensions of your final render since that seems to tie directly into the memory consumption hit raytracing has on non-RTX cards in 4.12 (the most common cause of spikes in RAM usage/CPU fallback during a render).
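For a rough sense of the scale involved, here's a back-of-envelope sketch; the per-pixel byte count and buffer count are illustrative assumptions on my part, not documented Iray internals:

```cpp
// Approximate render-buffer memory at a given output resolution.
// Assumptions: 4 float channels (16 bytes) per pixel per buffer, and the
// renderer holding a handful of such buffers (beauty, albedo, depth, ...).
#include <cstdio>

int main() {
    const long long w = 3000, h = 3000;
    const int buffers = 4;  // assumed buffer count, for illustration only
    double per_buffer_mib = (double)(w * h) * 16 / (1024.0 * 1024.0);
    printf("%lldx%lld: ~%.0f MiB per buffer, ~%.0f MiB for %d buffers\n",
           w, h, per_buffer_mib, per_buffer_mib * buffers, buffers);
    return 0;
}
```

Halving each render dimension cuts those numbers to roughly a quarter, which is why shrinking the output size is one of the quickest ways to claw back VRAM.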
I will gladly read that guide when you put it together, then :)
While I work in Daz, I use either Texture Shaded or Cartoon Shaded for the preview (if we're talking about that little ball thing near the camera view selection tab).
I usually try to render at between 3000 and 3500 pixels, to give myself enough room in Photoshop for corrections and postwork.
I suppose that knowing a good balance between test render settings and final render settings will help, right?
Double post, sorry.
@RayDAnt "Another thing you could try is reducing the total dimensions of your final render since that seems to tie directly into the memory consumption hit raytracing has on non-RTX cards in 4.12"
I have not noticed anything like this. I routinely render at 3000x3000 to 4000x6000 on a 980TI (Nvidia driver 430.86) on both 4.12.0.86 and 4.12.1.16 without dropping to CPU.
@Alien Alloy "the video RAM usage goes from 400 MB up to 4 GB, then when I close the preview window it drops back down to 2-3 GB."
You need to be more patient. The GPU will clear back down to the base memory for your Studio scene, but it may take 15 to 30 seconds (it's not immediate).
I'm not sure what's going on with your settings; I don't have these problems with a 980TI. Here are some examples. The originals are all over 3000x3000.
Which version of Windows are you running?
Also, how many displays do you run, and at what resolutions (assuming the GPU you use for rendering is the same one you use for display output)? Those can also be important factors in this.
Also also nice renders. :)
?? "this thread" was meant to lead us where please?
Oops, fixed it. Thanks!
Maybe it's time to do a fresh format and reinstall the system as a whole.
Last time was 6 months ago after all.
Oh right I forgot.
Windows 10
I drive two displays on the same video card; the secondary one is just there to hold all the tools while I work on the main monitor.
The video card is a 1080 Strix OC.
CPU is a Ryzen 1700, with 3200 MHz G.Skill RAM.
The system is stable and rock solid, water cooled and all, so I've already ruled out anything being wrong with the hardware itself.
I might want to do a fresh install soon, just to be on the safe side of things.
Buddy, can I bother you further and ask what your final render settings are?
You know: Max Samples, Render Quality if you use it, etc.?
Also, do you render with both GPU and CPU? OptiX?
Win 7 64-bit, 32 GB RAM, 2 different-sized monitors: main runs @ 1920x1080, 2nd @ 1600x900. Single 980TI. Thanks for the compliment.
If you were asking me, see attached for my settings. I don't check CPU or OptiX Prime; in my experience both are slower. Plus, not using the CPU allows me to do other things while the render runs.
I have a 980Ti, and since the last DS updates I've run into a similar problem. I used to be able to render huge scenes at large dimensions on the GPU (980Ti); I actually didn't notice dimensions changing the speed or memory at all.
But now I can't render a scene with even 3-4 GB of memory consumption (my GPU has 6 GB); it always falls back to CPU. I went looking for the OptiX button and noticed there is no on/off toggle for OptiX anymore. And in the log file I read that OptiX failed:
IRAY rend error: OptiX Prime error (Device rtpModelUpdate BL): Memory allocation failed (Function "_rtpModelUpdate" caught exception: Encountered a CUDA error: cudaMalloc(&ptr, size) returned (2): out of memory)
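(For context: that "(2)" is the raw CUDA error code for a failed VRAM allocation, cudaErrorMemoryAllocation. A minimal standalone repro, assuming the CUDA toolkit is installed and deliberately over-asking so the allocation must fail, prints the same message:)

```cpp
// Ask cudaMalloc for more VRAM than is currently free to reproduce the
// "(2): out of memory" result seen in the Iray log.
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    size_t free_b = 0, total_b = 0;
    cudaMemGetInfo(&free_b, &total_b);

    void* ptr = nullptr;
    cudaError_t err = cudaMalloc(&ptr, free_b + (1ull << 30));  // free + 1 GiB
    printf("cudaMalloc returned (%d): %s\n", (int)err, cudaGetErrorString(err));
    if (err == cudaSuccess) cudaFree(ptr);  // shouldn't happen here
    return 0;
}
```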
And as I read in some threads, 4.12 ALWAYS uses OptiX because of the RTX users????? Really? I am sure only 5-10% of DS users have RTX cards, or even fewer; those are expensive cards. I spent a lot of money putting my PC together, and I could use it well until now. I am really glad Daz is thinking about people with RTX cards, but the remaining 90% of us can't work well with this OptiX thing. I don't care if my render takes 10 minutes longer, or even 30, but memory is always important!
So why did Daz take this option away from us?
I use Windows 10 Pro
32 GB RAM
CPU: i7-4790K @ 4 GHz
2 NVIDIA 980Tis - one is only for DS renders, and still...
It was an Nvidia decision - they write Iray; Daz integrates it into DS but has no control over the engine itself.
Oh, thank you... I wish there were a way to turn it off :(
OptiX appears to be much improved; it clearly uses less VRAM than older iterations. With or without my 2070 enabled for rendering, renders in 4.12 are substantially faster.
@3D-GHDesign "But now I can't render a scene well with 3-4 GB memory consumption (my GPU has 6 GB)."
I suspect Windows 10 is part of your problem. I believe it reserves a chunk of VRAM for system operation. Although, since you have two 980TIs, I would think only one would be affected.
Nevertheless, I have a 980TI on Windows 7 and just rendered a scene that used 5789 MB of VRAM on both 4.12.0.86 and 4.12.1.16.
I believe you have something else going on. I also have the same experience as @kenshaw011267: my 4.12 renders are ~20% faster than 4.11.
Actually, I updated the NVIDIA driver today, and now I can't render even a smaller scene without Genesis on the GPU :(
And I've had Windows 10 for years and could render without any problem... even with 3-4 Genesis figures in the scene... only since the last DS update, and now the latest NVIDIA update... I think I will put back an older NVIDIA driver... maybe...
Edit: I just noticed there is a Studio Driver for 3D content creators; however, I can't find one for the 980Ti... maybe this can cause problems?
I just started having problems rendering with the GPU (1080TI), and the first thing I did was update to the newest NVIDIA drivers (dropped on 10/22). Still no dice: old scenes that previously rendered fine on the GPU wouldn't. The next test was to try a different version of DAZ, since I still have the July version of the 4.12 beta. That did it, so something is up with the current build of DAZ. Thankfully I can use the beta I have until whatever it is gets fixed.
@melissastjames
Nothing wrong with 4.12.0.86. The "latest" Nvidia driver isn't always the best choice. 4.12.0.86 is stable with Nvidia 430.86, Win 7, GTX 980TI. Same for Beta 4.12.1.16.
I started having issues before I updated to the newest driver. Updating is just my first step in trying to resolve the issue, as that is usually what fixes it. At this point, I cannot render on my GPU with the current build of DAZ 4.12. However, the July 2019 version of the beta works fine.
Thanks :)
And sorry for the late reply!