OOT hair styles can't render on GPU. Daz falls back to PTX CPU?

Hi, does anyone know why the OOT hair styles have such a big impact and never render on the GPU on my system, but only fall back to PTX CPU? I have a 1080 Ti and a 980 Ti installed. Even when I have nothing in the scene but the model, it goes back to CPU. Thanks

Comments

  • SixDs Posts: 2,384

    Typically, the main reason for Iray renders defaulting to the CPU is that the total memory required by all of the assets in a scene, which must be loaded into your video cards' VRAM to complete the render, exceeds the VRAM available. The memory needed for texture resources is often the culprit. You may know this already, but here is how it works: when the render starts, all the resources needed to complete it begin loading into each video card's RAM. With two cards installed, one with 6 GB of VRAM and one with 11 GB, the scene will be loaded into memory and rendering will begin on both GPUs, unless the scene requires more than 6 GB. If it does, the 980 Ti will drop out, since it cannot load all the required data. If the data exceeds 11 GB, the 1080 Ti will drop out as well, and the render will default to the CPU. In the example you have given, with a single figure loaded, it seems unlikely that the required video RAM would exceed 11 GB, but it is not beyond the realm of possibility, depending on the textures used on the figure itself, on any clothing, and, of course, on the hair. Still, it would be an unlikely scenario.
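    The per-card drop-out rule described above can be sketched as a toy model (the function name and numbers below are made up for illustration; this is not an Iray API): each GPU either fits the whole scene in its own VRAM or drops out, and the CPU is used only when every GPU has dropped out.

    ```python
    # Toy model of Iray's per-device fallback: each GPU must hold the
    # entire scene in its own VRAM; any card that can't, drops out.
    # Only when all GPUs drop out does the render fall back to the CPU.
    def pick_devices(scene_gib, gpus):
        """gpus maps card name -> VRAM in GiB; returns devices that render."""
        usable = [name for name, vram in gpus.items() if scene_gib <= vram]
        return usable or ["CPU"]

    cards = {"980 Ti": 6, "1080 Ti": 11}
    print(pick_devices(5.0, cards))   # ['980 Ti', '1080 Ti']
    print(pick_devices(8.0, cards))   # ['1080 Ti']
    print(pick_devices(12.0, cards))  # ['CPU']
    ```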

    If you are using verbose mode, so that the process of loading textures into memory is displayed, what sort of numbers are you seeing? Alternatively, you can check the DAZ Studio log file for the results of previous attempts. I would be interested to know whether the amount of VRAM is being exceeded, or whether the log file indicates some other reason the render is defaulting to the CPU. I know some users have reported problems with certain Nvidia drivers, but the log file should give clues as to what is really going on.
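    For anyone digging through the log by hand, a minimal filter like the sketch below pulls out the Iray lines that usually explain a CPU fallback. The log path is an assumption (the usual Windows location); Help > Troubleshooting > View Log File in Daz Studio shows the real location.

    ```python
    # Minimal sketch: filter the Daz Studio log for the Iray lines that
    # usually explain a CPU fallback. LOG_PATH is an assumption (typical
    # Windows location); check Help > Troubleshooting > View Log File.
    import os

    LOG_PATH = os.path.expandvars(r"%AppData%\DAZ 3D\Studio4\log.txt")
    KEYWORDS = ("rend error", "out of memory", "Falling back to CPU", "Allocated")

    def interesting_lines(log_text):
        """Return log lines that hint at why a render fell back to CPU."""
        return [line for line in log_text.splitlines()
                if "IRAY" in line and any(k in line for k in KEYWORDS)]

    # Usage:
    # with open(LOG_PATH, errors="ignore") as f:
    #     print("\n".join(interesting_lines(f.read())))
    ```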

  • xXQuatroXx Posts: 173
    Damn, so stupid of me not to try deselecting the 980 Ti and rendering only on the 1080 Ti to see if that helps. If it does, I'll spot-render the face and hair with the 1080 Ti only, then deselect the hair, render the full image with both cards, and blend the two together in Photoshop. I'll try that first and get back to you if the 980 Ti's VRAM turns out to be the issue. I knew all that, just forgot: Daz works only within the VRAM of the lowest card, and if the scene doesn't fit there it goes to CPU; sadly it doesn't switch to just the 1080 Ti. Besides verbose mode, are any other modes available?
  • SixDs Posts: 2,384

    Actually, my understanding is that the CUDA software will be capable of using only one card, if necessary. Again, I still think that it is important to have a look at that log file to see what is actually happening, before jumping to conclusions.

  • xXQuatroXx Posts: 173

    Hi All,

    So I deselected the 980 Ti and worked only with the 1080 Ti.

    Added a standard Genesis 8 Female. The OOT hair used is: https://www.daz3d.com/rose-hair-for-genesis-3-and-8-females

    Same result, it falls back to CPU... and as far as I know this happens only with OOT hair styles.

    Other hair: no issue.

    Working with Daz Studio 4.10.0.123

    Nvidia Driver: 418.81

    My Rig screen shot: https://gyazo.com/54e6e1b5b560b9c904d5e4fcf1c50385

    The Log file is attached.

    Thanks for any help you can give.

     

  • xXQuatroXx Posts: 173
    edited March 2019

    And yes, for VRAM, Daz will effectively work only within the VRAM of the lowest card.

    But Daz does combine the CUDA cores of all installed cards for the rendering process. So if everything in your scene fits within the VRAM of the lowest card, you can use both cards' CUDA cores (not their VRAM) to render quicker.

    Post edited by xXQuatroXx on
  • SixDs Posts: 2,384

    2019-03-03 12:44:36.544 Iray INFO - module:category(IRAY:RENDER):   1.2   IRAY   rend info : CUDA device 0 (GeForce GTX 1080 Ti): Scene processed in 12.732s
    2019-03-03 12:44:36.549 Iray INFO - module:category(IRAY:RENDER):   1.5   IRAY   rend info : CUDA device 0 (GeForce GTX 1080 Ti): Allocated 7.43592 MiB for frame buffer
    2019-03-03 12:44:36.550 Iray INFO - module:category(IRAY:RENDER):   1.2   IRAY   rend info : CUDA device 0 (GeForce GTX 1080 Ti): Allocated 1.65625 GiB of work space (2048k active samples in 0.000s)
    2019-03-03 12:44:36.550 Iray INFO - module:category(IRAY:RENDER):   1.2   IRAY   rend info : CUDA device 0 (GeForce GTX 1080 Ti): Used for display, optimizing for interactive usage (performance could be sacrificed)
    2019-03-03 12:44:36.565 WARNING: dzneuraymgr.cpp(307): Iray ERROR - module:category(IRAY:RENDER):   1.5   IRAY   rend error: CUDA device 0 (GeForce GTX 1080 Ti): Kernel [0] failed after 0.009s
    2019-03-03 12:44:36.565 WARNING: dzneuraymgr.cpp(307): Iray ERROR - module:category(IRAY:RENDER):   1.5   IRAY   rend error: CUDA device 0 (GeForce GTX 1080 Ti): an illegal instruction was encountered (while launching CUDA renderer in core_renderer_wf.cpp:807)
    2019-03-03 12:44:36.565 WARNING: dzneuraymgr.cpp(307): Iray ERROR - module:category(IRAY:RENDER):   1.5   IRAY   rend error: CUDA device 0 (GeForce GTX 1080 Ti): Failed to launch renderer
    2019-03-03 12:44:36.571 WARNING: dzneuraymgr.cpp(307): Iray ERROR - module:category(IRAY:RENDER):   1.2   IRAY   rend error: CUDA device 0 (GeForce GTX 1080 Ti): Device failed while rendering
    2019-03-03 12:44:36.571 WARNING: dzneuraymgr.cpp(307): Iray WARNING - module:category(IRAY:RENDER):   1.2   IRAY   rend warn : All available GPUs failed.
    2019-03-03 12:44:36.571 Iray INFO - module:category(IRAY:RENDER):   1.2   IRAY   rend info : Falling back to CPU rendering.

    This is where the problem lies. I expect a similar message would be generated for both GPUs if both were enabled. As you can see in the excerpt, the problem is not the amount of memory being consumed; it is far less than the VRAM the 1080 Ti has (or the 980 Ti, for that matter).

    Here is a complete shot in the dark, but I see from your log file that you have OptiX Prime enabled. Try disabling that to see if that has any effect. I know that some users have had difficulties with that in the past.

    At this point, I am not inclined to believe that the issue is necessarily limited to the OOT product - it may simply be triggering something else.

    I wonder if you could post a screenshot of your render settings for us?

  • There have been several threads recently about problems with the 418 Nvidia drivers. That may or may not be your problem. Version 417 seems to be fine; a few people are still having issues with the latest 419 drivers. It might be worth going back to the 417 series for now.

  • FSMCDesigns Posts: 12,754

    Also, OOT hairs tend to use insanely large texture sizes, so reducing the resolution might help as well.
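    The arithmetic behind that suggestion: uncompressed texture memory grows with the square of the side length, so halving the resolution quarters the memory. A back-of-envelope sketch (illustrative only; Iray's actual footprint also includes mipmaps, geometry, and working buffers):

    ```python
    # Rough memory for one uncompressed RGBA texture, in MiB.
    # Illustrative only; real renderer memory use will be higher.
    def texture_mib(side, channels=4, bytes_per_channel=1):
        return side * side * channels * bytes_per_channel / 1024 ** 2

    print(texture_mib(8192))                       # 256.0  (one 8K map)
    print(texture_mib(2048))                       # 16.0   (same map at 2K)
    print(texture_mib(8192) / texture_mib(2048))   # 16.0   (savings factor)
    ```

    A figure wearing a hair product with several 8K maps can therefore eat gigabytes of VRAM from textures alone, which is why products like Scene Optimizer shrink them.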

  • xXQuatroXx Posts: 173
    OptiX off did nothing. Neither did Scene Optimizer reducing the hair texture: it was 8000x8000, saved back to 20xx x 20xx, and I tried again. Still falling back to CPU. Going to try driver 417 next.
  • xXQuatroXx Posts: 173

    Well, that fixed it ;) I uninstalled the drivers completely with DDU,

    then downloaded Nvidia 391.35. @sickleyield thanks for the advice on the driver (same goes for Fab3x, who advised 417.71); both drivers work.

    Where it shows PTX: it previously showed PTX with the CPU node; now it doesn't, and it renders insanely fast, with just G8F and some OOT hair, in 4K in just under 1 minute.

    Finally!!!! Thanks to everyone who chipped in.

    Very much appreciated.
