Is it possible to go back from version 4.12 to version 4.10? It was a dumb idea to update Daz 3D - now my projects are corrupted, the shaders look different, etc.
Comments
You would need to open a support ticket if you didn't back up the installers, but going from 4.10 to 4.12 should not have such a dramatic effect, so perhaps if you give more information we will be able to help you get 4.12 working correctly.
OK, for example: here is a render made on version 4.10 and the same render on 4.12 - look at the face.
G2F + Beautiful Skin Iray
That looks, at least in part, to be a lack of convergence - what is your GPU, and what is the driver version (right-click on the desktop > NVIDIA Control Panel)?
GTX 1050 Ti, driver version 436.48. I had to install the latest because Daz wouldn't render (black screen).
It's kind of weird, because I have a second character in the scene and she looks normal.
How much memory does that have? There does seem to be an issue with OptiX Prime always being on for non-RTX cards, which increases memory use compared to OptiX Prime off (if that's how you were rendering). If it's dropping to CPU, then it may be stopping on time (two hours by default) instead of on convergence. The other character, the one that is converging OK, may have less demanding materials or may be getting more direct light. The simplest way to check whether I am along the right lines would be to look at the log file (Help > Troubleshooting > View Log) after rendering and check the report on how many samples were handled by which device.
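If you don't fancy scanning the whole log by eye, a few lines of Python will pull out the relevant entries. This is only a rough sketch - the default log location and the exact wording Iray uses for its device report vary between versions, so treat both the path and the search terms below as assumptions to adjust:

    # Pull Iray device/sample lines out of the Daz Studio log.
    # Assumption: default Windows log location - Help > Troubleshooting >
    # View Log will show you the real path if this one is wrong.
    import os
    import re

    log_path = os.path.expandvars(r"%APPDATA%\DAZ 3D\Studio4\log.txt")

    with open(log_path, errors="ignore") as log:
        for line in log:
            # "device" and "iteration" are guesses at the report wording
            if re.search(r"device|iteration|fall.?back", line, re.IGNORECASE):
                print(line.rstrip())

If the per-device report shows all (or nearly all) samples handled by the CPU, the card dropped out of the render.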
A lot of memory - this is a heavy project. With 12 GB of memory, the project consumes roughly 11 GB. But thanks for the help, I just fixed it: I loaded a new character, applied the shaders, etc., and it looks good.
btw, is it normal that the processor is also used during rendering? It sits at 50% despite being unchecked.
A 1050 Ti only has 4 GB of VRAM. If your scene is using 11 GB of memory, Iray won't use the card to render - it will fall back to the CPU, which is much slower.
11 GB is overstating it - that is the memory consumption during rendering. When the scene is loaded but the render has not started, memory consumption is around 5.6 GB with the browser open.
Since I am stuck with 4.12 for now, and cannot be bothered with the rigmarole involved in contacting customer service for a program I mostly just export from, is there some trick I can use to disable OptiX Prime? I have a 980 Ti and do animation when I do use it.
Basically, is there a process somewhere, say in Task Manager, that I can kill on the first frame so it won't go to CPU on the next one?
Or is there something I can modify in the NVIDIA settings?
As far as I am aware, no - OptiX Prime is always on for non-RTX cards. You can of course reduce texture sizes, usually without any ill effect.
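If you have a lot of maps to shrink, you can batch it rather than resaving each one by hand. A minimal sketch using Pillow, run against a copy of the texture folder (it overwrites in place, and the folder path is a placeholder):

    # Halve the resolution of every JPEG in a folder of texture COPIES.
    from pathlib import Path
    from PIL import Image

    folder = Path(r"C:\texture_copies")  # hypothetical path - point at your copies

    for img_path in folder.glob("*.jpg"):
        img = Image.open(img_path)
        half = (img.width // 2, img.height // 2)
        img.resize(half, Image.LANCZOS).save(img_path, quality=90)

Halving each dimension cuts the memory for that map to roughly a quarter, which adds up quickly across a character's texture set.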
Which, if your card only has 4 GB of VRAM, still means the scene won't fit on it and it will fall back to CPU rendering.
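One quick way to see whether a scene even has a chance of fitting before you commit to a long render: ask the driver how much VRAM the card has and how much is already taken. A small sketch that shells out to nvidia-smi, which installs alongside the NVIDIA driver:

    # Report GPU name, total VRAM, and VRAM already in use.
    import subprocess

    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=name,memory.total,memory.used",
         "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout
    print(out.strip())  # e.g. "GeForce GTX 1050 Ti, 4096 MiB, 512 MiB"

Anything Windows and other applications are already holding counts against the total, so the scene has to fit in what is left.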