New RTX 2070 Super, little to no render time improvement over CPU. Why?
Hello All,
For Christmas my wife bought me an RTX 2070 Super GPU, and I was hoping to see a blazing improvement over the CPU, which I have been using for a year now. So I installed the card per the instructions, downloaded and installed the drivers from the Nvidia site, AND updated Windows. In Daz Studio (I have version 4.12), I made sure that under the Render tab's Advanced settings the CPU was unchecked and the GPU was checked. For a test I loaded up a render that I had done previously using just the CPU; that render took 10 hours to get to 80%. I was hoping the RTX 2070 could get the job done in one hour. So I set up my machine, left for an hour, and came back to find the render progress at only 400 iterations and still at 0%. The render it produced in that time still looks grainy.
Another test I did was to compare Iray preview mode with Texture Shaded mode. I expected panning and zooming to be pretty smooth in Iray mode, and again, I don't see much improvement at all.
So before I take the card out and send it back, I am wondering if there are other settings that I am overlooking?
For $500+ for the RTX 2070, and given the benchmark tests I looked at before I bought the card, I was expecting my render times to be cut to 1/10th of what they were. I would even be happy with 1/5th. But if this is all I am getting out of this card, it is going back to the store. I would be better off getting a faster CPU.
Any assistance would be appreciated.
Thank you,
Geo
Comments
First you need to try a simple scene: a single primitive with a shader applied, or a single prop. If that doesn't render very fast, then you need to check the Nvidia Control Panel under Manage 3D Settings, select Daz Studio, and make sure it can use the GPU.
If it does render the primitive but not your test scene, the scene was likely too big for the card's VRAM.
If the simple scene doesn't render and the control panel is set properly, you should plug a monitor into the card, if one isn't already connected, and make sure you're getting video from it.
It's most likely switching to CPU rendering for some reason. Check your log file. It will tell you why it's switching to CPU.
To view the log file go to Help->Troubleshooting->View Log File...
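If you'd rather not eyeball the whole file, here's a minimal Python sketch that pulls out the lines that signal a GPU failure or CPU fallback. It assumes the default log location for Daz Studio 4.x on Windows; adjust the path for your install.

```python
# cpu_fallback_check.py -- scan the Daz Studio log for GPU-failure / CPU-fallback lines.
# LOG assumes the default Daz Studio 4.x location on Windows; change it if yours differs.
import os

LOG = os.path.expandvars(r"%APPDATA%\DAZ 3D\Studio4\log.txt")

# Phrases Iray typically prints when the GPU drops out of the render.
MARKERS = ("out of memory", "Device failed while rendering",
           "All available GPUs failed", "Enabling CPU fallback")

with open(LOG, encoding="utf-8", errors="replace") as f:
    for line in f:
        if any(marker in line for marker in MARKERS):
            print(line.rstrip())
```

If that prints nothing after a slow render, the GPU at least stayed active the whole time.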
Definitely something is not set up correctly there; 2,304 CUDA cores should blow the doors off Iray rendering on a CPU. As Cinus said, check the log file first, and post the appropriate part of it here if you need further help.
Steve.
Sounds like it is dropping back to CPU rendering. The option to turn off CPU rendering only removes it from the normal chain of rendering; if the GPU fails, it will STILL render in CPU mode. Which I wish it wouldn't, because then you honestly don't know it failed unless you happen to see that line in the log or notice that it's going slowly.
In any event... there are a few reasons it may fall back to CPU rendering:
1: The things you are trying to render will not fit into the card's memory.
2: Something you are rendering may, for some reason, trigger a failure when loading into the card, causing it to fall back to rendering on the CPU even though it would have fit into the video card's memory.
3: Your video card has not yet released memory from a previous render, for some reason, and the new scene will not fit into what is left.
4: A general communication failure between Daz Studio and Iray, or between Iray and the video card, leading to an inability to put anything into GPU rendering.
It seems that #2, #3 and #4 are sort of hitting a few of us at the moment: a combination of driver issues from using the latest drivers, as well as Iray and Daz Studio issues with the latest releases of 4.12. They seem to be most noted on the Titan series and the 10xx and 20xx cards, the newer-generation hardware; essentially, anything with Volta and/or RTX components.
Quick things you can try to reduce the potential issue:
1: Save your scene before you render. (Not because Daz Studio will crash, but because if it stops rendering on the GPU, it will not recover; you will have to restart Daz Studio to re-establish the links to the card and free up memory.) Restart Daz Studio, after checking Task Manager to see that it has actually shut down completely and freed up your memory in RAM and VRAM. (Hit [ctrl]+[alt]+[del] to bring up the option to open Task Manager.)
2: Don't use the Iray preview, at least when you plan to actually render something. Use it while playing around, if needed, then save and try rendering. See step #1 if that drops back to CPU again. You may get one or a few good renders, then it will fail again.
3: Check the logs to see if there is any mention of the total scene size. Though it is not 100% accurate relative to what ends up inside the actual card, it gives you an idea of a potential problem if you are trying to stuff an 8GB scene into an 8GB card. You don't have 8GB of available memory in an 8GB card; you still need room to manipulate and calculate the final rendered image too. 6-7GB should fit decently into an 8GB card. (Windows can consume up to 1.5GB of your VRAM for itself. Normally it should only be about 0.3-0.7GB if the card is not driving your primary display.)
4: Check your video card's memory use and CUDA use with Task Manager. (Hit [ctrl]+[alt]+[del], select Task Manager, and go to the Performance tab. You will have to select CUDA as a data set to monitor for your GPU; the memory shows at the bottom. If you see it max out, then drop down to almost nothing, you have just dropped back to CPU rendering: Daz tried to stuff too much into the video card's memory, it peaked, ran out of room, and crashed. See the sketch after this list for a command-line alternative.)
5: Wait... like the rest of us. We are at the mercy of Daz Studio, Iray and Nvidia drivers. Maybe Daz will make a "safe mode render" and create an alert to let us know we have just dropped into CPU mode, or actually stop rendering if we select NOT to render in CPU mode. Other than that, the only thing you can do is render smaller or older items that stay far away from topping out VRAM and don't have the complex materials or shapes that seem to be tripping up our renders.
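For the command-line alternative mentioned in #4: here is a minimal Python sketch that polls VRAM usage every couple of seconds via nvidia-smi (which ships with the Nvidia driver; this assumes it is on your PATH). If you watch used memory climb toward the card's total and then collapse, Iray has most likely just fallen back to CPU.

```python
# vram_watch.py -- poll GPU memory and utilization via nvidia-smi.
# A rough sketch: if used memory climbs toward the total and then collapses,
# Iray has probably just dropped back to CPU rendering.
import subprocess
import time

QUERY = ["nvidia-smi",
         "--query-gpu=memory.used,memory.total,utilization.gpu",
         "--format=csv,noheader"]

while True:
    out = subprocess.run(QUERY, capture_output=True, text=True).stdout.strip()
    print(time.strftime("%H:%M:%S"), out)  # e.g. "14:20:10 6841 MiB, 8192 MiB, 97 %"
    time.sleep(2)
```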
If you have Windows 10 and you updated it after loading the Nvidia driver, Windows probably overwrote the driver.
Correction: I have the SUPER, so the CUDA core count is 2560, and yes, that was the expected performance. I did some calculations before getting the card and estimated that I should be cutting my render times down by a factor of 10, so a 10-hour render should take 1 hour. At least if it works that way. Apparently not.
Moving up a step from that, I ran some tests using a single Gen 8 figure set to 1900 x 1900 pixels square and 95%, and that took 20 seconds! THAT was impressive. I was expecting that. The log file shows that Daz IS using the card.
Then I tried the single figure with a floor, a wall with windows, and a tree landscape seen through the windows. Again I checked the log file; that one took 15 minutes. Not as impressive as above, but still a noted difference.
It seems that with simpler scenes the render times improve dramatically, but the more stuff you add, the less dramatic an effect the card seems to have. In both cases I noticed that the CPU fan ran much slower than when I was using the CPU alone. The fans on the GPU sped up a bit, but not by a lot. So it DOES seem like the card is doing something.
I also cleared everything out and loaded the scene above (figure, wall, floor, and forest backdrop) and put the viewport in Iray mode. Unlike yesterday, the panning is now MUCH smoother! So something must have initially gone wrong. I am going to try the one large scene again and see how long it takes. But for certain with the simple tests Daz Studio is definitely using the GPU.
There is definitely something happening, as I noticed that my system sounds and operates differently. First off, the CPU fan doesn't scream at high speed any longer.
No, I have Windows 7 64 bit. I didn't have a prior video card in the computer as I was using the CPU graphics up until now.
So the memory doesn't work on a FIFO principle, where once what is in memory is freed up it loads more? So if the scene doesn't fit, it drops to CPU only for the WHOLE render? If so, that's not cool.
How do I check to make sure the memory is fully available? How do I clear the memory out?
This might have happened as it seems like the simple renders are working better today. So I am going to try the big one again.
I hope not, I just spent over $500 for this card and I am not looking forward to problems.
Ok
So Daz Studio has to be shut down completely and that is what clears out the memory in the GPU?
Well, one of the reasons for getting the card is so I don't have to constantly switch between Texture Shaded mode and Iray. I know it isn't a good thing to have turned on during a render, so I understand switching it off then. But for positioning, posing, and lighting work, I would like to have it on.
Where do I check this?
Wow! Kind of stupid that it does that. I figured it would load as much as it could into memory and THEN put the remainder on the CPU. I don't like that it just completely drops the GPU out.
That kind of sucks and defeats the purpose of buying an expensive card if you ask me.
At any rate, I went back and tried the scene I started with at first: three people in a large room with nearly 200 props. That chungus 10-hour file. Again I noted the computer sounds. First it took almost 8 minutes for the pre-render, and then I heard the CPU fan ramp up. I let it go for about 10 to 15 minutes and did something else in the meantime. When I came back the CPU fan was still racing, so I shut down the render and took a look at the log.
Here is the last bit of that file:
2020-01-04 14:17:32.400 Iray [INFO] - MATCNV:RENDER :: 1.0 MATCNV rend info : found 1377 textures, 198 lambdas (65 unique)
2020-01-04 14:17:32.446 Iray [INFO] - IRAY:RENDER :: 1.0 IRAY rend info : Emitter geometry import (16 light sources with 566k triangles, 1 instance) took 0.023s
2020-01-04 14:17:32.446 Iray [INFO] - IRAY:RENDER :: 1.0 IRAY rend info : Updating environment.
2020-01-04 14:17:37.470 Iray [INFO] - IRAY:RENDER :: 1.0 IRAY rend info : Updating backplate.
2020-01-04 14:17:37.470 Iray [INFO] - IRAY:RENDER :: 1.0 IRAY rend info : Updating lens.
2020-01-04 14:17:37.470 Iray [INFO] - IRAY:RENDER :: 1.0 IRAY rend info : Updating lights.
2020-01-04 14:17:37.470 Iray [INFO] - IRAY:RENDER :: 1.0 IRAY rend info : Updating object flags.
2020-01-04 14:17:37.470 Iray [INFO] - IRAY:RENDER :: 1.0 IRAY rend info : Updating caustic portals.
2020-01-04 14:17:37.470 Iray [INFO] - IRAY:RENDER :: 1.0 IRAY rend info : Updating decals.
2020-01-04 14:17:37.470 WARNING: ..\..\..\..\..\src\pluginsource\DzIrayRender\dzneuraymgr.cpp(305): Iray [WARNING] - IRAY:RENDER :: 1.0 IRAY rend warn : The 'iray_optix_prime' scene option is no longer supported.
2020-01-04 14:17:37.485 Iray [INFO] - IRAY:RENDER :: 1.0 IRAY rend info : Allocating 1-layer frame buffer
2020-01-04 14:17:37.501 Iray [INFO] - IRAY:RENDER :: 1.0 IRAY rend info : Using batch scheduling, caustic sampler disabled
2020-01-04 14:17:37.501 Iray [INFO] - IRAY:RENDER :: 1.0 IRAY rend info : Initializing local rendering.
2020-01-04 14:17:37.532 Iray [INFO] - IRAY:RENDER :: 1.0 IRAY rend info : Rendering with 1 device(s):
2020-01-04 14:17:37.532 Iray [INFO] - IRAY:RENDER :: 1.0 IRAY rend info : CUDA device 0 (GeForce RTX 2070 SUPER)
2020-01-04 14:17:37.532 Iray [INFO] - IRAY:RENDER :: 1.0 IRAY rend info : Rendering...
2020-01-04 14:17:37.532 Iray [VERBOSE] - IRAY:RENDER :: 1.3 IRAY rend progr: CUDA device 0 (GeForce RTX 2070 SUPER): Processing scene...
2020-01-04 14:17:37.828 Iray [INFO] - IRAY:RENDER :: 1.4 IRAY rend info : Using OptiX version 6.1.2
2020-01-04 14:17:37.828 Iray [VERBOSE] - IRAY:RENDER :: 1.4 IRAY rend stat : Geometry memory consumption: 1.28015 GiB (device 0), 0 B (host)
2020-01-04 14:17:37.860 Iray [INFO] - IRAY:RENDER :: 1.4 IRAY rend info : Initializing OptiX for CUDA device 0
2020-01-04 14:20:10.180 Iray [INFO] - IRAY:RENDER :: 1.4 IRAY rend info : Importing lights for motion time 0
2020-01-04 14:20:10.180 Iray [VERBOSE] - IRAY:RENDER :: 1.4 IRAY rend stat : Texture memory consumption: 4.28777 GiB for 394 bitmaps (device 0)
2020-01-04 14:20:11.147 Iray [INFO] - IRAY:RENDER :: 1.4 IRAY rend info : Initializing light hierarchy.
2020-01-04 14:20:16.982 Iray [INFO] - IRAY:RENDER :: 1.4 IRAY rend info : Light hierarchy initialization took 5.837s
2020-01-04 14:20:16.997 Iray [VERBOSE] - IRAY:RENDER :: 1.4 IRAY rend stat : Lights memory consumption: 33.8823 MiB (device 0)
2020-01-04 14:20:17.013 Iray [VERBOSE] - IRAY:RENDER :: 1.4 IRAY rend stat : Material measurement memory consumption: 0 B (GPU)
2020-01-04 14:20:17.559 Iray [VERBOSE] - IRAY:RENDER :: 1.4 IRAY rend stat : Materials memory consumption: 1.2794 MiB (GPU)
2020-01-04 14:20:17.559 Iray [VERBOSE] - IRAY:RENDER :: 1.4 IRAY rend stat : PTX code (450 KiB) for SM 7.5 generated in 0.546s
2020-01-04 14:20:34.220 Iray [INFO] - IRAY:RENDER :: 1.4 IRAY rend info : JIT-linking wavefront kernel in 0.050s
2020-01-04 14:20:34.235 Iray [INFO] - IRAY:RENDER :: 1.4 IRAY rend info : JIT-linking mega kernel in 0.014s
2020-01-04 14:20:34.282 Iray [INFO] - IRAY:RENDER :: 1.3 IRAY rend info : CUDA device 0 (GeForce RTX 2070 SUPER): Scene processed in 176.716s
2020-01-04 14:20:34.360 WARNING: ..\..\..\..\..\src\pluginsource\DzIrayRender\dzneuraymgr.cpp(305): Iray [ERROR] - IRAY:RENDER :: 1.3 IRAY rend error: CUDA device 0 (GeForce RTX 2070 SUPER): out of memory (while allocating memory)
2020-01-04 14:20:34.360 WARNING: ..\..\..\..\..\src\pluginsource\DzIrayRender\dzneuraymgr.cpp(305): Iray [ERROR] - IRAY:RENDER :: 1.3 IRAY rend error: CUDA device 0 (GeForce RTX 2070 SUPER): Failed to allocate 832 MiB
2020-01-04 14:20:34.688 Iray [INFO] - IRAY:RENDER :: 1.3 IRAY rend info : CUDA device 0 (GeForce RTX 2070 SUPER): Allocated 66.102 MiB for frame buffer
2020-01-04 14:20:34.750 Iray [INFO] - IRAY:RENDER :: 1.3 IRAY rend info : CUDA device 0 (GeForce RTX 2070 SUPER): Allocated 11.0742 MiB of work space (13k active samples in 0.057s)
2020-01-04 14:20:34.781 Iray [INFO] - IRAY:RENDER :: 1.3 IRAY rend info : CUDA device 0 (GeForce RTX 2070 SUPER): Used for display, optimizing for interactive usage (performance could be sacrificed)
2020-01-04 14:20:35.358 WARNING: ..\..\..\..\..\src\pluginsource\DzIrayRender\dzneuraymgr.cpp(305): Iray [ERROR] - IRAY:RENDER :: 1.4 IRAY rend error: CUDA device 0 (GeForce RTX 2070 SUPER): out of memory (while launching CUDA renderer)
2020-01-04 14:20:35.374 WARNING: ..\..\..\..\..\src\pluginsource\DzIrayRender\dzneuraymgr.cpp(305): Iray [ERROR] - IRAY:RENDER :: 1.4 IRAY rend error: CUDA device 0 (GeForce RTX 2070 SUPER): Failed to launch renderer
2020-01-04 14:20:35.405 WARNING: ..\..\..\..\..\src\pluginsource\DzIrayRender\dzneuraymgr.cpp(305): Iray [ERROR] - IRAY:RENDER :: 1.3 IRAY rend error: CUDA device 0 (GeForce RTX 2070 SUPER): Device failed while rendering
2020-01-04 14:20:35.405 WARNING: ..\..\..\..\..\src\pluginsource\DzIrayRender\dzneuraymgr.cpp(305): Iray [WARNING] - IRAY:RENDER :: 1.3 IRAY rend warn : All available GPUs failed.
2020-01-04 14:20:35.452 WARNING: ..\..\..\..\..\src\pluginsource\DzIrayRender\dzneuraymgr.cpp(305): Iray [WARNING] - IRAY:RENDER :: 1.3 IRAY rend warn : No devices activated. Enabling CPU fallback.
2020-01-04 14:20:35.530 WARNING: ..\..\..\..\..\src\pluginsource\DzIrayRender\dzneuraymgr.cpp(305): Iray [ERROR] - IRAY:RENDER :: 1.3 IRAY rend error: All workers failed: aborting render
2020-01-04 14:20:35.920 Iray [INFO] - IRAY:RENDER :: 1.0 IRAY rend info : CPU: using 8 cores for rendering
2020-01-04 14:20:35.936 Iray [INFO] - IRAY:RENDER :: 1.0 IRAY rend info : Rendering with 1 device(s):
2020-01-04 14:20:35.936 Iray [INFO] - IRAY:RENDER :: 1.0 IRAY rend info : CPU
2020-01-04 14:20:35.936 Iray [INFO] - IRAY:RENDER :: 1.0 IRAY rend info : Rendering...
2020-01-04 14:20:35.936 Iray [VERBOSE] - IRAY:RENDER :: 1.3 IRAY rend progr: CPU: Processing scene...
2020-01-04 14:20:35.936 Iray [INFO] - IRAY:RENDER :: 1.4 IRAY rend info : Using Embree 2.8.0
2020-01-04 14:20:35.936 Iray [INFO] - IRAY:RENDER :: 1.4 IRAY rend info : Initializing Embree
2020-01-04 14:21:28.492 Iray [VERBOSE] - IRAY:RENDER :: 1.4 IRAY rend stat : Native CPU code generated in 0.545s
2020-01-04 14:21:28.492 Iray [INFO] - IRAY:RENDER :: 1.3 IRAY rend info : CPU: Scene processed in 52.553s
2020-01-04 14:21:28.523 Iray [INFO] - IRAY:RENDER :: 1.3 IRAY rend info : CPU: Allocated 66.102 MiB for frame buffer
2020-01-04 14:21:41.097 Iray [INFO] - IRAY:RENDER :: 1.3 IRAY rend info : Allocating 1-layer frame buffer
2020-01-04 14:21:41.440 WARNING: ..\..\..\..\..\src\pluginsource\DzIrayRender\dzneuraymgr.cpp(305): Iray [WARNING] - POST:RENDER :: 1.0 POST rend warn : renderer iray has no more devices available. Postprocessing falling back to CPU.
2020-01-04 14:21:41.799 Iray [INFO] - IRAY:RENDER :: 1.0 IRAY rend info : Received update to 00001 iterations after 243.853s.
2020-01-04 14:21:41.939 WARNING: ..\..\..\..\..\src\pluginsource\DzIrayRender\dzneuraymgr.cpp(305): Iray [WARNING] - IRAY:RENDER :: 1.0 IRAY rend warn : The 'iray_optix_prime' scene option is no longer supported.
2020-01-04 14:21:53.717 Iray [INFO] - IRAY:RENDER :: 1.0 IRAY rend info : Received update to 00002 iterations after 255.771s.
2020-01-04 14:21:53.764 WARNING: ..\..\..\..\..\src\pluginsource\DzIrayRender\dzneuraymgr.cpp(305): Iray [WARNING] - IRAY:RENDER :: 1.0 IRAY rend warn : The 'iray_optix_prime' scene option is no longer supported.
2020-01-04 14:22:06.166 Iray [INFO] - IRAY:RENDER :: 1.0 IRAY rend info : Received update to 00003 iterations after 268.220s.
2020-01-04 14:22:06.197 WARNING: ..\..\..\..\..\src\pluginsource\DzIrayRender\dzneuraymgr.cpp(305): Iray [WARNING] - IRAY:RENDER :: 1.0 IRAY rend warn : The 'iray_optix_prime' scene option is no longer supported.
2020-01-04 14:22:18.677 Iray [INFO] - IRAY:RENDER :: 1.0 IRAY rend info : Received update to 00004 iterations after 280.733s.
2020-01-04 14:22:18.708 WARNING: ..\..\..\..\..\src\pluginsource\DzIrayRender\dzneuraymgr.cpp(305): Iray [WARNING] - IRAY:RENDER :: 1.0 IRAY rend warn : The 'iray_optix_prime' scene option is no longer supported.
2020-01-04 14:22:31.032 Iray [INFO] - IRAY:RENDER :: 1.0 IRAY rend info : Received update to 00005 iterations after 293.086s.
2020-01-04 14:22:31.064 WARNING: ..\..\..\..\..\src\pluginsource\DzIrayRender\dzneuraymgr.cpp(305): Iray [WARNING] - IRAY:RENDER :: 1.0 IRAY rend warn : The 'iray_optix_prime' scene option is no longer supported.
2020-01-04 14:22:43.408 Iray [INFO] - IRAY:RENDER :: 1.0 IRAY rend info : Received update to 00006 iterations after 305.462s.
2020-01-04 14:22:43.486 WARNING: ..\..\..\..\..\src\pluginsource\DzIrayRender\dzneuraymgr.cpp(305): Iray [WARNING] - IRAY:RENDER :: 1.0 IRAY rend warn : The 'iray_optix_prime' scene option is no longer supported.
2020-01-04 14:22:55.829 Iray [INFO] - IRAY:RENDER :: 1.0 IRAY rend info : Received update to 00007 iterations after 317.884s.
2020-01-04 14:22:55.954 WARNING: ..\..\..\..\..\src\pluginsource\DzIrayRender\dzneuraymgr.cpp(305): Iray [WARNING] - IRAY:RENDER :: 1.0 IRAY rend warn : The 'iray_optix_prime' scene option is no longer supported.
2020-01-04 14:23:08.309 Iray [INFO] - IRAY:RENDER :: 1.0 IRAY rend info : Received update to 00008 iterations after 330.363s.
2020-01-04 14:23:08.356 WARNING: ..\..\..\..\..\src\pluginsource\DzIrayRender\dzneuraymgr.cpp(305): Iray [WARNING] - IRAY:RENDER :: 1.0 IRAY rend warn : The 'iray_optix_prime' scene option is no longer supported.
2020-01-04 14:23:20.774 Iray [INFO] - IRAY:RENDER :: 1.0 IRAY rend info : Received update to 00009 iterations after 342.827s.
2020-01-04 14:23:20.805 WARNING: ..\..\..\..\..\src\pluginsource\DzIrayRender\dzneuraymgr.cpp(305): Iray [WARNING] - IRAY:RENDER :: 1.0 IRAY rend warn : The 'iray_optix_prime' scene option is no longer supported.
2020-01-04 14:23:26.530 Saved image: C:\Users\Geo\AppData\Roaming\DAZ 3D\Studio4\temp\render\r.png
2020-01-04 14:23:26.530 Finished Rendering
2020-01-04 14:23:27.123 Total Rendering Time: 7 minutes 36.83 seconds
About midway through all of that I can see where it decided to switch to the CPU (I highlighted the changes in red). So with the large file, it is definitely going to the CPU. Now, is it a problem with this particular file? Or am I going to go through this with every file about this size?
Here I am using a 1900-pixel, 1:1 square render setting. The scene has 3 Gen 8 characters in it; it is a living room setting with a Christmas tree, and there are easily 200-300 props in the scene. I wouldn't think such a small area would fill up the 8 gigs of memory on the video card, but something is going wrong with this file, as it is clear that Daz Studio wants to use the CPU over the GPU here.
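As a rough sanity check, the per-device "memory consumption" lines in the log above can be totaled. This is just a sketch in Python; the figures are the ones Iray printed, and the real footprint is higher once OptiX acceleration structures, the frame buffer, and working space are added on top.

```python
# Rough VRAM tally from the Iray log lines above (values as printed by Iray).
geometry  = 1.28015          # GiB, "Geometry memory consumption" line
textures  = 4.28777          # GiB, "Texture memory consumption ... 394 bitmaps"
lights    = 33.8823 / 1024   # MiB -> GiB, "Lights memory consumption"
materials = 1.2794 / 1024    # MiB -> GiB, "Materials memory consumption"

scene = geometry + textures + lights + materials
print(f"Scene data alone: {scene:.2f} GiB")                  # about 5.60 GiB

failed_alloc = 832 / 1024    # the allocation the log shows failing
print(f"Plus the failed 832 MiB: {scene + failed_alloc:.2f} GiB")  # about 6.41 GiB
# With acceleration structures and whatever Windows and the display hold back,
# that evidently overran the 8 GiB card.
```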
My intention was to do even larger scenes than this one, and I am dismayed if I am already running out of GPU memory. As it is, I doubt there is much more to be had than the 11 gigs of GPU memory the 2080 Ti has. I am certainly not paying almost the cost of a new computer just for a GPU, and that is what the 2080 Ti goes for.
So it seems like the card is working, but apparently I wasn't noticing the difference between the GPU and CPU on the large file simply because Daz doesn't want to use the GPU for that file. I don't know if it is too big or not. Is there anywhere I can see that information?
I do find this information a bit disappointing, as it seems that when it comes to larger files I am no better off and Daz still ends up using the CPU anyway. So would I be better off just returning the video card and getting the fastest processor that will work on my motherboard?
Anyway, thank you all for the information and help. I did pick up quite a bit of information here.
Geo
If the whole scene does not fit in the GPU memory, Iray will only use CPU.
The main culprit is most likely the textures on the 200-300 props in the scene. You can try this free script to reduce the size of the textures : https://www.daz3d.com/forums/discussion/137161/reduce-texture-sizes-easily-with-this-script/p1
You can also buy Scene Optimizer (https://www.daz3d.com/scene-optimizer) from the store to reduce texture sizes, etc.
You are not that far over the 8GB limit, so either of the approaches above should get the scene to render on the 2070.
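If you'd rather not buy anything, the same downscaling idea is easy to script yourself. Here's a minimal sketch using the Pillow library; the folder name is hypothetical, and you should point it at copies of your textures, never at the originals in your content library.

```python
# downrez.py -- shrink any texture larger than 2048px on its long edge.
# Work on COPIES: SRC is a hypothetical folder of copied textures,
# never your actual Daz content library.
from pathlib import Path
from PIL import Image  # pip install Pillow

SRC = Path(r"C:\texture_copies")   # hypothetical folder; change to your copy location
LIMIT = 2048                       # target long-edge size in pixels

for path in SRC.rglob("*"):
    if path.suffix.lower() not in (".jpg", ".jpeg", ".png", ".tif", ".tiff"):
        continue
    img = Image.open(path)
    if max(img.size) <= LIMIT:
        continue                   # already small enough
    scale = LIMIT / max(img.size)
    new_size = (round(img.width * scale), round(img.height * scale))
    img.resize(new_size, Image.LANCZOS).save(path)
    print(f"{path.name}: {img.size} -> {new_size}")
```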
This is VERY disappointing news. I am getting the gist that Daz Studio is more CPU-based than GPU. After doing more tests I am finding that I can't use almost half of my scenes with the new GPU. I think I am better off packing it up, sending it back, and just getting the most powerful i7 available. Then I don't have to worry about "running out" of anything. As it is, there are only a few cards more powerful than the one I have, the best being the 2080 Ti, which has 3 more gigs of RAM, but at what cost? Around $1,400 for that card? I think I am better off putting my money into a more powerful CPU. As it is, ZBrush, another program I am looking into, only uses the CPU as well.
On the other hand, the developers of Daz should recognize this issue, so that the program uses what it can of the GPU memory and then puts the rest on the CPU, or they develop a FIFO system in which, once memory is freed up on the video card, the CPU can refresh the GPU memory.
I mainly bought the RTX 2070 to have a boost of speed all around. I don't want to be restricted to a "box" where I am stuck with only a single Gen 8 character and a simple background all the time. Now I have to constantly be conscious of how much I can put into the GPU, or the render will drop to CPU? Naw, not for me. I have to think on this, but most likely the GPU is going back to the store.
Anyway, thank you and everyone else for helping.
Geo
You had unrealistic expectations, sadly. Now you have a better understanding. The issue is Iray, which is provided to Daz by Nvidia (whose business is selling ever more powerful and expensive graphics cards, you know). If you want the quality of Iray renders, these are the limitations you live with. There are things you can do to help, but if your scene is just way too big and complicated, you won't ever get it to fit in the GPU.
Watch out that you do not also have unrealistic expectations of a faster CPU! I don't think you will see a tremendous speed improvement after that kind of upgrade, unless your current CPU is extremely underpowered and old.
Hello All,
Well, I have decided: the card is going to go back. This scene below is a render of two Gen 8 and one Gen 3 characters with a handful of props, a ground, and a background. Believe it or not, this simple scene already dropped to CPU. I can't deal with that. It seems to me a GPU is pretty much useless in Daz Studio.
Understood. I was looking at the largest CPU that would fit into my machine. I have a Kaby Lake motherboard, and the biggest processor that will fit is an Intel i7-7700K. The thing is, I already have an i7-6700K, and comparing the benchmarks there is only an 8% increase of the 7700K over the 6700K; the former is 4.2GHz and the latter 4.0. So that may not be significant enough. It is just that I can't stand these 10-hour renders of late. I was initially under the impression that the GPU would handle the bulk of the render and the CPU would take over whatever was left. That I could deal with. But having to be conscious of RAM space on the GPU all the time, that isn't cool. The scene I just posted above I don't even consider complex; even on the CPU, that one rendered in about 2 to 3 hours. So what is my time savings if I only save on renders that normally took under 2 hours on the processor? That isn't much at all. I think the answer is none of the above: the card is going back. It just isn't worth the money. Perhaps in a year or so I can get a 1080 Ti for a good price (less than $300); then it might be worth it, since that card is an older generation and has 11 gigs of RAM. As it is, I am having trouble with my Linux partition right now as well, since to update the driver I am forced to do a whole OS upgrade, and I don't want to do that right now.
I will take the scene optimization into consideration as I am sure it will help with the CPU power too.
Thanks again for the info and help.
Geo
Maybe it is not time to give up yet. You will never get GPU Iray render rates with any CPU upgrade.
All of your characters and props are at a distance from the camera. Most of their body parts are covered with clothing. You could hide all those hidden body parts and remove the skin textures from those hidden parts. You could reduce the clothing and face textures by a lot; you don't need 4K textures on full body size characters. I bet you could make this fit with a little work. Maybe you don't want to invest that time and effort in tweaking your scene, and that is fine. It is certainly your choice.
I have a graphics card with only 6GB of video RAM, and using this kind of optimization technique, I can render similar scenes in GPU. I'll attach a recent example with 4 G8 characters, props, background, etc.
We were typing at the same time. I don't mean to continually pester you after you have decided what is best for you.
If you want good results, unfortunately it takes some elbow grease. I render large, complicated scenes in Iray on two GTX 1080s, which have the same memory limit. In order to do that, I have to use Scene Optimizer to check the sizes on everything, identify assets that are unreasonably expensive, downrez those textures to smaller sizes, make sure background elements use lower detail, etc. That is just the limitation of the technology.
It isn't just us; the film industry has limitations too. I heard a story about the first Matrix movie: on the first attempt, they bought a bunch of powerful hardware and decided to leave all the squids at full resolution, with 40GB of data per model. After they tried rendering a dozen on screen at once, their system shit itself, so they went back, created lower-res versions for all the squids in the background, and optimized their assets to fit within their hardware limits. That's just how CG art works.
Yes, I was doing the math on that, and I guess it is just best that I save my money now; in the future I will eventually get an i9 machine, or perhaps someday Nvidia will make a GPU with 16 gigs or more of RAM on it, as that is the only way I will see improvements. Yes, I am familiar with hiding things that are not in view and also reducing the SubD levels for rendering. In the render I posted, the characters are naked underneath and I have reduced the SubD levels to 1 (I think). I considered that a simple scene, and if it couldn't fit into the 8 gigs on the card, then the card doesn't justify its value to me.

Further, I have many, many more scenes that are more involved than that one, such as the one I attached below. That render was CPU only; it took over 10 hours to reach about 85%, and I have many more like it. So as for making the scene fit, I don't think there is any force out there that could make that one fit if the one I posted above didn't. But as you pointed out, there is only so far I will go, as I crank out many scenes in a week's time and I don't want to go too nuts crunching things down when most of the time my scenes are going to be far too detailed anyway.

Maybe, maybe a Ti card could work (as it has 3 gigs more RAM), but then I would go with the 1080 Ti when it comes down more in price. For $300 (or less) I can justify the cost. But the RTX 2070 cost me $550, and I don't feel it is justified. If I were a hard-core PC gamer and needed the card for games, I would keep it, but my computer is mainly for productivity and I rarely play games on it. As I mentioned either earlier or in another post, I am interested in ZBrush as well, and that program doesn't really use the GPU at all. So I think I am making a good decision in sending the card back, as it doesn't meet my expectations.
Oh no, you are not pestering at all. You are helping out and I appreciate that. If I am coming off as sounding disappointed it is because of the reality of the situation as I expected much more out of this GPU than I got.
I don't mind doing some optimization to cut down the time, but for the most part I can put up a render when I go to bed, and when I wake up the next morning it is done. Yes, there are some that go longer, but that is rare. Overdoing the optimization means putting time in on that end when I could be doing something else while the computer renders, so in a way it is robbing Peter to pay Paul. I don't mind putting some time into optimization, but I am not going to sit there and crunch the numbers to fit the render to the memory size every time, when I already know most will not fit on that card. Today's tests proved that. If it were once in a blue moon I could see it, but I have already determined that 75% of my renders will NOT fit, and that number is only going to get bigger and the times longer in the future. What will happen when Genesis 9 is released? I am sure any new character will be even more complex than Genesis 8 and will take longer to render. I think I might be better off setting up a "render machine" that I can push files to, and thus just have it continuously running, knocking out one render after another 24 hours a day. To me that would be a better investment than a video card that can't even hold 3 Gen 8 characters with a background, as it would free up my main machine.
Interesting bit of info. But I think I am still better off setting up a dedicated rendering machine so it frees up my main machine. Saving the $550 from this card and putting it towards such a machine would be a wiser investment.
Thanks for the info,
Geo
First, you clearly have some settings somewhere set too high. I render scenes with 3 figures and an environment all the time on an 8GB card. The most likely culprits are SubD or your render size. You need to make sure nothing has SubD set over 3 (SubD multiplies the polygon count geometrically at each step).
Perhaps. In the scene above, the only figure that might be over 3 is the girl in front; the two in the back are at either 1 or 2. Granted, I know that particular render is taxing, and it was one of my longest to render on a CPU. I think another issue is lighting; using one light source always seems to go faster. Here there is light coming from the window, overhead, the fireplace, the table lamp and a front lamp, so that is quite a few emissive surfaces. I always try to use as few light sources as possible, and after this render I decided to shut the shutters and turn off the outside light source. Working around it by not including a table lamp or the fireplace in the scene helps too, when I can do it. I admit there is a lot going on in that scene, and more than likely it isn't going to fit on any single video card. But I am not upset about that. It was the other three-figure scene outside with one light source that was the nail in the coffin for me.
Probably what I am going to do is what I said above and just make myself up a dedicated i9 computer that I can run an instance of Daz Studio on JUST for rendering purposes and I will just push files over to it and it can have a render running while my computer is free for me to work on.
I certainly will keep in mind everything said here, as I am sure that information will help to speed up processor renders.
But I have already decided I am not keeping the GPU. When the 1080 Ti comes down in price... and I mean REALLY down (or I get one used), then perhaps I will give it a go again. For now, keeping the RTX 2070 just to do small renders is a waste of money.
Thanks again for the info.
Geo
Nvidia does make graphics cards with 24 GB of memory, but they are expensive: the Titan RTX.
Also, there is another option. Jack Tomalin has an Iray server that you can rent time on. I've never tried it, but you might want to look into it. I believe he has 4 2080Ti cards, unless he has upgraded again.
Yeah, but at $2,500 I could go with a dual-processor i9 system. However, I read the Titan series has issues with the Linux driver, so it probably wouldn't work out.
Four 2080 Ti's. So that would be 44 gigs of RAM. Pretty impressive. Still, it is a bit counterproductive to pay for time when in the long run it would be more cost-effective to have your own hardware.
So far it seems like having a dedicated machine to do all the rendering is the best bet for me.
Unfortunately nope, it's only 11 gigs of memory. GPU memory doesn't add up; it is lowest common denominator. So if you have an 11GB card and a 4GB card, it will hit the 4GB limit and fall back to CPU, or in Jack's case you're limited to 11GB. There were rumors that new RTX technology would allow cards to create a pool that adds them together, but as far as I know that was only in theory and the drivers/tech aren't close to supporting it yet. That'll be my next big GPU upgrade if it ever happens.
No: if it hits 4GB, the 4GB card will be dropped; the 11GB card will continue to work unless/until it hits 11GB.
Texture maps are set up to accommodate people who will be doing large-size closeups. If you are not doing large-size closeups, 4K and larger maps are overkill. For the render you showed, you could use Scene Optimizer to reduce the largest maps to 2K or maybe even 1K and you should not see a difference, but it should fit in the GPU. CPU renders, even if you get the best CPU available, or even a few of them, are going to take hours. For example, an art scene I do generally takes 30 minutes to 3 hours on GPU. On CPU, I have let one of my scenes cook all day while I was at work, and it still wasn't ready when I got home.
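To put rough numbers on why map size matters so much, here's a sketch assuming uncompressed RGBA textures at 4 bytes per pixel in VRAM (the actual footprint varies with channel count and mip levels, so treat these as ballpark figures):

```python
# Rough per-texture VRAM cost, assuming uncompressed RGBA (4 bytes per pixel).
def texture_mib(side_px: int, bytes_per_px: int = 4) -> float:
    """Memory in MiB for a square texture of the given edge length."""
    return side_px * side_px * bytes_per_px / 2**20

for side in (4096, 2048, 1024):
    print(f"{side}px square: {texture_mib(side):.0f} MiB")
# 4096px square: 64 MiB
# 2048px square: 16 MiB   <- dropping 4K to 2K cuts each map to a quarter
# 1024px square:  4 MiB
```

A character with a dozen 4K maps is pushing toward a gigabyte in textures alone, which is why downrezzing frees so much room.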
https://www.daz3d.com/forums/discussion/341041/daz-studio-iray-rendering-hardware-benchmarking/p1#Section 2
You can check this thread to get an idea of times. The fastest Intel CPU was over an hour; a Threadripper was 47 minutes. Keep in mind, this is a very simple scene. On my 2080 Super it takes about 5 minutes; on my Ryzen it takes an hour. In real-world scene renders it's up to a few hours on my GPU, versus a day or more on my CPU.
IME lighting consumes functionally no VRAM. It certainly makes renders take longer but your issue is a scene not fitting in VRAM.
You never need SubD above 2 except for very tight closeups. SubD 3 uses 4 times as many polygons as SubD 2. If your scene would fit in 8GB, but barely, a SubD 3 object would drop you to CPU.
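To see why, here's a sketch of the growth, assuming a base figure of roughly 16k quads (a ballpark figure for a Genesis 8 base mesh, not an exact count); each subdivision level roughly quadruples the face count:

```python
# Catmull-Clark subdivision roughly quadruples the quad count at every level.
BASE_QUADS = 16_000  # ballpark for a Genesis 8 base mesh (an assumption)

for level in range(5):
    print(f"SubD {level}: {BASE_QUADS * 4**level:,} quads")
# SubD 0:    16,000
# SubD 1:    64,000
# SubD 2:   256,000
# SubD 3: 1,024,000   <- 4x the polygons of SubD 2
# SubD 4: 4,096,000
```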
I have a 2070 and do large renders on it all the time. With some effort, and tools like Scene Optimizer, I've gotten 8 characters plus props and an environment to render.
I ran a test render on the dual EPYC 7742 rack we have last week. Iray couldn't use all of those cores; I think it was a NUMA issue. But even that, the highest-core-count single-rack system out there, wouldn't have finished the benchmark scene as fast as my 1080 Ti.
I have a question about optimising GPU memory: If you have things in your scene hidden, do they still contribute to the GPU limit or no? (I assume no?)
Also another question: Do you need high subD to take advantage of some fur shaders that use displacement?
Items with Visible or Visible in Render turned off are not passed to the renderer (items that are made invisible through surface properties, however, are).
For Iray, yes as displacement needs vertices.
That totally sucks! I was under the impression that the purpose of linking cards was to add the memory together. So then what is the purpose of linking cards? Now, on the flip side: if you have a dual-processor motherboard and two i9 processors, would that speed up render times over one CPU?
It seems GPU memory is the big problem, and if you want details, then one is forking over $2,500 for a Titan. I have been thinking about getting a 1080 Ti second-hand, as that card is a bit better than the 2070 Super for this: the extra 3 gigs of RAM should make a difference, and it also has more CUDA cores.