dzneuraymgr.cpp 369 cannot allocate framebuffer during GPU render - Now what do I do?
I'm trying to use DAZ Studio again after roughly a nine-month hiatus, so I'm familiar with it, but some things are rusty.
Earlier today, I set up my first new scene in a while and then I went to render it. It's my first time using Genesis 9, I think.
If I want to do GPU renders, what do I do now?
DAZ Studio 4.21.0.5 (64-bit)
NVIDIA Drivers 536.23
CUDA 12.1
NVIDIA Control Panel says my card has:
43999MB Total available graphics memory
11264MB Dedicated video memory
Here's the trimmed log file showing the details:
+++
2023-06-24 13:22:44.837 [INFO] :: Rendering image
[loading images]
2023-06-24 13:23:27.735 [WARNING] :: ..\..\..\..\..\src\pluginsource\DzIrayRender\dzneuraymgr.cpp(369): Iray [ERROR] - IRAY:RENDER :: 1.3 IRAY rend error: CUDA device 0 (NVIDIA GeForce RTX 2080 Ti): Not enough memory for kernel launches (260.153 MiB (398.278 MiB) required, 0.000 B available). Cannot allocate framebuffer.
2023-06-24 13:23:27.735 [WARNING] :: ..\..\..\..\..\src\pluginsource\DzIrayRender\dzneuraymgr.cpp(369): Iray [ERROR] - IRAY:RENDER :: 1.3 IRAY rend error: CUDA device 0 (NVIDIA GeForce RTX 2080 Ti): Failed to setup device frame buffer
2023-06-24 13:23:27.735 [WARNING] :: ..\..\..\..\..\src\pluginsource\DzIrayRender\dzneuraymgr.cpp(369): Iray [ERROR] - IRAY:RENDER :: 1.3 IRAY rend error: CUDA device 0 (NVIDIA GeForce RTX 2080 Ti): Device failed while rendering
2023-06-24 13:23:27.735 [WARNING] :: ..\..\..\..\..\src\pluginsource\DzIrayRender\dzneuraymgr.cpp(369): Iray [WARNING] - IRAY:RENDER :: 1.3 IRAY rend warn : CUDA device 0 (NVIDIA GeForce RTX 2080 Ti) ran out of memory and is temporarily unavailable for rendering.
2023-06-24 13:23:28.157 [WARNING] :: ..\..\..\..\..\src\pluginsource\DzIrayRender\dzneuraymgr.cpp(369): Iray [WARNING] - IRAY:RENDER :: 1.3 IRAY rend warn : All available GPUs failed.
2023-06-24 13:23:28.157 [WARNING] :: ..\..\..\..\..\src\pluginsource\DzIrayRender\dzneuraymgr.cpp(369): Iray [ERROR] - IRAY:RENDER :: 1.3 IRAY rend error: Fallback to CPU not allowed.
2023-06-24 13:23:28.157 [WARNING] :: ..\..\..\..\..\src\pluginsource\DzIrayRender\dzneuraymgr.cpp(369): Iray [ERROR] - IRAY:RENDER :: 1.3 IRAY rend error: All workers failed: aborting render
2023-06-24 13:23:28.157 [WARNING] :: ..\..\..\..\..\src\pluginsource\DzIrayRender\dzneuraymgr.cpp(369): Iray [ERROR] - IRAY:RENDER :: 1.3 IRAY rend error: CUDA device 0 (NVIDIA GeForce RTX 2080 Ti): [Guidance sync] Failed slave device (remaining 0, done 0).
2023-06-24 13:23:28.157 [ERROR] Iray :: Internal rendering error.
2023-06-24 13:23:28.247 [INFO] :: Saved image: ..\AppData\Roaming\DAZ 3D\Studio4\temp\render\r.png
2023-06-24 13:23:28.258 [INFO] :: Finished Rendering
2023-06-24 13:23:28.293 [INFO] :: Total Rendering Time: 43.53 seconds
2023-06-24 13:23:34.535 Iray [INFO] - IRAY:RENDER :: 1.0 IRAY rend info : Device statistics:
2023-06-24 13:23:34.535 Iray [INFO] - IRAY:RENDER :: 1.0 IRAY rend info : CUDA device 0 (NVIDIA GeForce RTX 2080 Ti): 0 iterations, 18.235s init, 0.422s render
+++
Comments
Here is the file I was unable to attach to the initial post
The image that DAZ 3D Forums will not let me attach is merely the results of Help > Troubleshooting > About Your Video Card:
Current Hardware Features
Platform OpenGL
Vendor NVIDIA Corporation
Version 4.6.0 NVIDIA 536.23
Hardware NVIDIA GeForce RTX 2080 Ti/PCIe/SSE2
Features
Multi-Texturing Supported
Shadow Map Supported
Hardware Anti-Aliasing Supported
OpenGL Shading Language Supported
Pixel Buffer Supported
Pixel Buffer Size Not Enabled
Maximum Number of Lights 8
Number of Texture Units 4
Maximum Texture Size 32768x32768
I still don't know how to tell how much memory in my scene is too much, but I used
Scene Optimizer
https://www.daz3d.com/scene-optimizer
And "reduced the maps" of a whole bunch of items that I knew weren't visible. I think I also discovered that the purchased Environment I used in the render was 8K! I don't know about the Genesis 9 figure, but I tried to use the script to leave the subject untouched and just reduce parts of the environment that I knew weren't visible (but I would need to get the lighting right).
I also discovered that I had:
Camera View Optimizer
https://www.daz3d.com/camera-view-optimizer
which I didn't use in this case at all, but it does seem that with my current rig, and especially if I move into more 8K content, I'm going to have to start paying attention to the memory my scenes use.
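In the meantime, the closest I've come to "paying attention" is just watching VRAM while the scene loads. Here's a minimal sketch using the nvidia-ml-py package (installed with pip install nvidia-ml-py; as far as I can tell it reads the same counters that nvidia-smi reports):

```python
# Poll used/free VRAM on CUDA device 0 (the card from the log) every 2 s.
import time
import pynvml  # from the nvidia-ml-py package

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)  # values in bytes
        print(f"used: {mem.used / 2**20:8.1f} MiB   "
              f"free: {mem.free / 2**20:8.1f} MiB")
        time.sleep(2)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```

Running that in a console while DAZ Studio loads the scene at least shows how close to the 11264MB ceiling things get before Iray gives up.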
Does anyone know, apart from hitting the "framebuffer" error in the log, how you can check a scene in advance to see whether it will be able to GPU render?
Short answer: there is no way to accurately calculate the VRAM usage of a scene in advance.
There are too many variables. You could, theoretically, go through each element of the scene, adding up the base memory usage of each and every map, but that tedious task wouldn't take into account the texture compression settings in the Advanced section of the Render tab, the resolution you're rendering at, and so on. That's all moot anyway: the Iray renderer itself doesn't know what the VRAM usage will be in advance.
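If you did want to attempt that tedious tally anyway, it would look something like this rough Python sketch (Pillow assumed installed; the folder path is hypothetical). Note that it only gives you the raw, uncompressed map total, before Iray's compression and before any geometry is counted:

```python
# Sum the uncompressed size of every image map under a folder.
# Assumes 8-bit RGBA (4 bytes/pixel); ignores compression, geometry,
# instancing and render resolution, which is exactly why the real
# number can't be known in advance.
from pathlib import Path
from PIL import Image  # pip install Pillow

IMAGE_EXTS = {".jpg", ".jpeg", ".png", ".tif", ".tiff"}

def tally_maps(folder):
    total = 0
    for path in Path(folder).rglob("*"):
        if path.suffix.lower() in IMAGE_EXTS:
            with Image.open(path) as img:
                total += img.width * img.height * 4
    return total / 2**20  # MiB

print(f"~{tally_maps('C:/hypothetical/texture/folder'):.0f} MiB of raw maps")
```

Even with a number like that in hand, it tells you nothing definitive about whether the render will fit.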
That said, 11GB of VRAM isn't too shabby and should be more than enough for a single G9 with hair, clothing and a scene. You'd have to try hard not to fit all that into 11GB, and you've optimized your scene as well...
What environment are you using? If it's one of the big forest/landscape scenes or a large interior scene, try setting Instancing Optimization to Memory in the Render Settings. You could also get hold of Instancify from the Daz store, but if your scene has no duplicate objects (or already uses instances) then that won't help at all.
What hair are you using? dForce strand hairs can be murderous on system RAM and VRAM.
You've told us everything about your system (which is helpful) but nothing about the actual products you're trying to render. There have been a few environments and props released over the last few months or so which use ridiculous amounts of VRAM because they are abysmally optimized. Not just 'could be better'. Appallingly bad. Maybe you bought one of them...
More info please!