Daz Studio rendering slowing (using native GPU, not NVIDIA card)

To level set: a smaller version of this same scene rendered with the NVIDIA card; this one is not. It spits out a single iteration every 5 minutes or so (I DID make the scene a 4K render, so I'm not TOO surprised). Does anyone know why it would force the use of the native GPU and not use the NVIDIA card? I tried unchecking the CPU as an option, but that did not help.

System: Dell Inspiron 7555, GPU: NVIDIA GeForce GTX 960M, Windows 10, 16 GB RAM

Comments

  • kenshaw011267 Posts: 3,805

    More than likely the scene is too big for the amount of memory dedicated to the GPU, or Iray doesn't like the slow-as-molasses DDR memory instead of the GDDR that GPUs have.

  • rrward Posts: 556

    This uses shared memory, correct? Try assigning more memory to the GPU. That mobile GPU is going to be a very limiting factor when it comes to image complexity and size.

  • Coryllon Posts: 284

    It SHOULD be, and it has always used my GPU before this, even for this same scene on a smaller scale with a different angle and more of the same props.

  • kenshaw011267 Posts: 3,805

    So? Rendering at 4K takes more RAM, and you need to give it more, at a minimum.

  • Coryllon Posts: 284

    It uses all available RAM; I'm not limiting anything. I am not saying it's taking my video card's memory and then going to get more from my CPU and system RAM. I'm saying it's not taking advantage of my GPU at all, or at least not logging that it is. It usually says something about the NVIDIA card in its render initialization and then something about the CPU. Now it just says something about the native CPU. "Native" suggests the integrated card, not the dedicated one.

    If you are telling me that because I am rendering in 4K it will only use RAM because that's a bigger "bucket", something is wrong on the programming end.

  • rrward Posts: 556
    Coryllon said:

    It uses all available RAM; I'm not limiting anything. I am not saying it's taking my video card's memory and then going to get more from my CPU and system RAM. I'm saying it's not taking advantage of my GPU at all, or at least not logging that it is. It usually says something about the NVIDIA card in its render initialization and then something about the CPU. Now it just says something about the native CPU. "Native" suggests the integrated card, not the dedicated one.

    If you are telling me that because I am rendering in 4K it will only use RAM because that's a bigger "bucket", something is wrong on the programming end.

    Okay, looking at Dell's site, the 960M should have 4 GB of dedicated VRAM. That being said, your 960M has to store your screen interface, the asset data for your render, and the final render all at the same time. And if you've got Photoshop open at the same time, it will be grabbing a chunk of VRAM as well. Try rendering the scene at smaller sizes, working your way back up, to see at what resolution things go south on you.
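
    For a rough sense of how the output size alone scales, here is a small back-of-the-envelope sketch (Python, purely illustrative). The bytes-per-pixel figure is an assumption, and this only counts the render canvas itself, not the geometry, textures, or Iray's working buffers:

        # Rough estimate of GPU memory needed just for the render canvas at
        # different output sizes. Scene assets and renderer buffers are NOT
        # included, so treat these numbers as a lower bound only.
        RESOLUTIONS = {
            "720p":  (1280, 720),
            "1080p": (1920, 1080),
            "1440p": (2560, 1440),
            "4K":    (3840, 2160),
        }

        BYTES_PER_PIXEL = 16  # assumption: 4 channels (RGBA) at 32-bit float each

        for name, (w, h) in RESOLUTIONS.items():
            mb = w * h * BYTES_PER_PIXEL / (1024 ** 2)
            print(f"{name:>6}: {w}x{h} ~= {mb:7.1f} MB per canvas buffer")

    Even at 4K the canvas itself is modest next to 4 GB of VRAM; it's the scene assets plus the renderer's own buffers that usually tip a mobile GPU over, which is why stepping the render size down helps locate the breaking point.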

  • Coryllon Posts: 284
    edited August 2019

    No Photoshop or any other application, and it does it even when I am test-rendering a small section. Also, it used to say it was using the GPU AND the CPU; now it's just saying native GPU. I have also done larger scenes than this.

  • kenshaw011267 Posts: 3,805

    The only GPU DS can use is an NVIDIA one. The 960M is the only one of those on that laptop, unless you have connected an external one.

  • Coryllon Posts: 284
    edited August 2019

    There is always the Intel video card (the integrated one that comes on most PCs). However, I do think I found the problem... the logging might have switched to a more verbose mode, so "native GPU" might just be whatever GPU it's configured to use, without telling me that it's using both. It does seem to take longer than expected, but that might be because I have a ton of grass, and that's a lot of extra vertices and ray tracing.
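
    One way to see which devices Iray actually reported is to skim the Daz Studio log (Help > Troubleshooting > View Log File). Here is a minimal sketch that assumes the usual Windows log location and just filters for likely keywords rather than any exact log format:

        import os

        # Assumed default location of the Daz Studio log on Windows; use
        # Help > Troubleshooting > View Log File if yours lives elsewhere.
        log_path = os.path.join(os.environ.get("APPDATA", ""), "DAZ 3D", "Studio4", "log.txt")

        keywords = ("cuda", "device", "cpu")  # heuristic filter, not an exact log format
        with open(log_path, encoding="utf-8", errors="replace") as log:
            for line in log:
                lowered = line.lower()
                if "iray" in lowered and any(k in lowered for k in keywords):
                    print(line.rstrip())

    If the GeForce shows up in those lines at render start and then drops out, that points to the render falling back to CPU partway through rather than never using the card at all.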

  • DustRider Posts: 2,797

    Unfortunately, Iray can't use your integrated Intel GPU; it can only use your NVIDIA GPU and/or your CPU. It sounds like your render is dropping to CPU because it is using more memory than your GPU can handle. Keep in mind that different figures/props consume different amounts of GPU memory, and a 4K image will consume 4X the memory of a 1080p image (the render also has to fit in GPU memory).
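
    The 4X figure comes straight from the pixel counts; a quick illustrative check in Python:

        pixels_4k    = 3840 * 2160       # 8,294,400 pixels
        pixels_1080p = 1920 * 1080       # 2,073,600 pixels
        print(pixels_4k / pixels_1080p)  # 4.0 -- the output buffers scale by the same factor

    The scene assets themselves (geometry, textures) don't grow with output size, but the extra buffer space can still be enough to push a nearly full card into CPU fallback.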
