GPUs not used during rendering

I am encountering a rendering issue. I have a Titan X (Maxwell) and two GTX 1080 Tis. When I have a scene loaded, GPU-Z tells me each GPU is using between 5.5 and 7.5 GB of VRAM. It seems strange to me that they are all using such different amounts of VRAM, but that's still well within the 11 GB and 12 GB capacities of the 1080 Tis and the Titan X, respectively.

While in Iray preview mode in the viewport, all 3 GPUs are being utilized, as measured by GPU-Z. However, when I actually try to render the scene, only the Titan X is used.
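
For reference, here is a rough Python sketch of how the per-GPU numbers could be logged through the NVIDIA NVML bindings instead of watching GPU-Z by hand; the pynvml package and the one-second polling interval are my own assumptions, nothing DS-specific:

    # Poll every GPU's VRAM use and load once per second while a render runs.
    # Assumes `pip install pynvml`; this only reads what the driver reports.
    import time
    import pynvml

    pynvml.nvmlInit()
    count = pynvml.nvmlDeviceGetCount()
    try:
        while True:
            for i in range(count):
                handle = pynvml.nvmlDeviceGetHandleByIndex(i)
                name = pynvml.nvmlDeviceGetName(handle)
                if isinstance(name, bytes):
                    name = name.decode()
                mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
                util = pynvml.nvmlDeviceGetUtilizationRates(handle)
                print(f"GPU {i} {name}: "
                      f"{mem.used / 2**30:.1f}/{mem.total / 2**30:.1f} GB VRAM, "
                      f"{util.gpu}% load")
            print("-" * 40)
            time.sleep(1.0)
    finally:
        pynvml.nvmlShutdown()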

All GPU drivers are up to date, and all BIOS and chipset drivers are up to date as well. This happens in both the public and beta builds of DS.

Any ideas on how I can fix this? Or what the problem is? Or perhaps someone has encountered the same issue before? Thanks.

Comments

  • JCThomas Posts: 254

    Oh yeah, and I'm on an X470 board with a Ryzen 7 2700X.

  • JCThomas Posts: 254

    And the same scene rendered in Octane does not have these problems. All 3 GPUs are used in Octane, so it's an Iray issue, I guess?

  • kenshaw011267 Posts: 3,805

    Check that all GPUs are selected in the DS render settings.

  • JCThomas Posts: 254

    Sigh, I wish it had been just a silly mistake like that, but alas, all the GPUs are enabled in my DS render settings. To be honest, I'm completely dumbfounded here.

  • I'd first suggest dropping the Titan X from rendering and seeing if the issue persists. A couple of questions: is the Titan X a Maxwell or a Pascal? Which GPU is driving your monitor?
  • kenshaw011267 Posts: 3,805
    JCThomas said:

    Sigh, I wish it had been just a silly mistake like that, but alas, all the GPUs are enabled in my DS render settings. To be honest, I'm completely dumbfounded here.

    Could you post the log of a start of a render?

  • JCThomas Posts: 254

    Thanks for the tip. Yeah, I had already tried dropping the Titan X, which is a Maxwell, from the render. It then just rendered on the CPU even though I had the CPU unselected in render settings. The Titan X is driving the monitors currently, but I've tried both 1080 Tis and the problem persists.

    But if I use Scene Optimizer to divide the textures in half, all 3 GPUs are utilized in the render. So I guess it's a VRAM thing, but I don't see why all 3 would be used in the viewport but not in the final render.
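
    For anyone curious why halving the textures makes that much difference, here's the back-of-the-envelope version; the 4K RGBA figures are just an assumed example, not measurements from my scene:

        # Halving both texture dimensions cuts the uncompressed footprint to
        # roughly a quarter. The 4K RGBA map at 1 byte per channel is an
        # assumed example, not a value measured from the scene.
        def texture_mb(width, height, channels=4, bytes_per_channel=1):
            return width * height * channels * bytes_per_channel / 2**20

        full = texture_mb(4096, 4096)   # ~64 MB per 4K map
        half = texture_mb(2048, 2048)   # ~16 MB per 2K map
        print(f"4K map: {full:.0f} MB, 2K map: {half:.0f} MB "
              f"({full / half:.0f}x saving per texture)")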

  • Dim Reaper Posts: 687

    I'm running a 2080 Ti and a 1080 Ti, both with 11 GB of VRAM. If I'm working on a scene and using the Iray preview, I sometimes find that one of the GPUs does not take part in the render - and strangely, it can be either one. I've found that the best thing to do is save often and, before doing the final render, save the scene, close DS, then go into Task Manager and kill the DS process if it is still there. Reload DS, load the scene and hit render. It seems a bit of a faff, but I find it is better than starting a render which then drops to the CPU and takes ages to cancel.
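
    If it helps, the "kill the leftover DS process" step can be scripted; here is a rough Python sketch using psutil. The process name is an assumption on my part, so check Task Manager for what it's actually called on your install:

        # Kill any leftover DAZ Studio process that may still be holding the GPUs.
        # Assumes `pip install psutil`; the process name "DAZStudio.exe" is an
        # assumption - confirm it in Task Manager before relying on this.
        import psutil

        TARGET = "dazstudio.exe"

        for proc in psutil.process_iter(["name"]):
            name = (proc.info["name"] or "").lower()
            if name == TARGET:
                print(f"Killing leftover process {proc.pid} ({proc.info['name']})")
                proc.kill()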

  • kenshaw011267 Posts: 3,805
    JCThomas said:

    Thanks for the tip. Yeah, I had already tried dropping the Titan X, which is a Maxwell, from the render. It then just rendered on the CPU even though I had the CPU unselected in render settings. The Titan X is driving the monitors currently, but I've tried both 1080 Tis and the problem persists.

    But if I use Scene Optimizer to divide the textures in half, all 3 GPUs are utilized in the render. So I guess it's a VRAM thing, but I don't see why all 3 would be used in the viewport but not in the final render.

    The viewport uses the draw settings and a render uses the render settings, so you could have some difference between the two affecting memory consumption. If there are any instances in the render, then switching instancing optimization from Speed to Memory might get all 3 going. Also check your texture compression settings.
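
    If you want a rough idea of whether the texture set alone is what pushes the render past 11 GB, something like the sketch below walks a folder of maps and sums their uncompressed size; the folder path and the 1-byte-per-channel figure are placeholders, and Iray's own compression will change the real numbers:

        # Very rough estimate of the uncompressed footprint of a folder of
        # texture maps. The path is a placeholder and 1 byte per channel is an
        # assumption; Iray's texture compression will alter the real usage.
        from pathlib import Path
        from PIL import Image  # pip install Pillow

        TEXTURE_DIR = Path(r"C:\path\to\scene\textures")  # placeholder path
        total_bytes = 0

        for path in TEXTURE_DIR.rglob("*"):
            if path.suffix.lower() not in {".jpg", ".jpeg", ".png", ".tif", ".tiff"}:
                continue
            with Image.open(path) as img:
                width, height = img.size
                channels = len(img.getbands())
            total_bytes += width * height * channels  # 1 byte per channel assumed

        print(f"Estimated uncompressed texture footprint: {total_bytes / 2**30:.2f} GB")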
